Web/Tech

  • Batteries Not Included

    Mercedes-Benz might have a sense of humor.

    They lent Saturday Night Live a C-Class for a parody TV commercial that makes fun of battery-powered cars.

    The 2-minute video, starring Julia Louis-Dreyfus, shows a car that runs on AA batteries (which is why the car is called the Mercedes AA Class). 

    Here is the video.


    via SNL

    One of the highlighted features, the "Auto-Dump," helps you discharge worn-out batteries … all 9,648 of them.

    The video ends with the disclaimer: "Batteries not included."

    I hope you had a fun April Fool's Day.

  • Top 20 Internet Giants

    For most of my life, I've been a tech early adopter. 

    Here are some snippets from that journey. I fell in love with the Mac 128 in the 1980s. My frustration with the limitations of floppies caused me to fly across the country to get one of the earliest 20 MB hard drives (which I didn't know how I would ever fill up). Much to the consternation of those who thought only secretaries should be seen typing, I was one of the first lawyers to use a computer to do work. I waited in lines to grab Palm Pilots and cool phones before smartphones became a thing. And, somehow, I don't enjoy setting up my computer anymore (OK, I do – but not like I did before). 

    A lot has changed, while much stays the same.

    In the late 90s, I was obsessed with the early web scene. I spoke at computer events like Comdex and MacWorld, and I was able to see and identify many of the companies that would become major players. Many of those "major players" expanded into the dot-com bubble, then disappeared.

    I've watched that cycle play itself out several times as the landscape and players changed and evolved.

There is a chart that captures a lot of those changes by listing the 20 Internet Giants that have ruled the web since 1998. 

    Take a look. 

The 20 Internet Giants That Rule the Web, via visualcapitalist

Humans are very good at recognizing major turning points. They are, however, often much worse than they believe at understanding the implications of the changes they so easily predicted.

Who would have guessed that AOL would become almost wholly irrelevant? Or that Yahoo would make so many horrible decisions and still survive into 2022?

    In the early days of the internet, most of the leaders were aggregators and search engines. Now we have a much broader set of influencers. The top 20 players in the space are also playing much larger games than their 1998 predecessors. Most of the leaders are platforms that help other products succeed as well. 

    I'm curious to see what names are added to the list in 5 years. 

    Who do you believe we will see there in 2027?

  • Changing the Course of History

A little over a week ago, a deepfake of Ukrainian President Volodymyr Zelenskyy was used to try to convince Ukraine's soldiers to lay down their arms and surrender to Russia.  Beyond being shared on social media, hackers also got it onto news sites and a TV ticker. 

While it's not known for certain that Russia was behind it, there's a long history of Russian cyberwarfare, including many instances of media manipulation. 

    Luckily, while the lip-sync was okay in this video, several cues helped us know it was fake. 

    Unfortunately, this is only the tip of the iceberg.  Many deepfakes aren't as easy to discern.  Consequently, as we fight wars (both physical and cultural), manipulated videos will increasingly alter both perceptions and reality. 

    Even when proven to be fake, the damage can persist.  Some people might believe it anyway … while others may begin distrusting all videos from leaders as potential misinformation. 

That being said, not all deepfakes are malicious, and the potential of the technology is attractive.  Production companies are already using it to splice actors who have aged or died into movie scenes.  Deepfake technology can also let a celebrity sell their likeness without spending the time to film the finished product themselves. 

Deepfake technology also allows us to create glimpses into potential pasts or futures.  For example, on July 20th, 1969, Neil Armstrong and Buzz Aldrin landed safely on the moon, and then returned safely to Earth.  What if they hadn't?  MIT recently created a deepfake of a speech Nixon's speechwriter William Safire wrote during the Apollo 11 mission in case of disaster.  The whole video is worth watching, but the "fake history" speech starts around the 4:20 mark. 

    "Fate has ordained that the men who went to the moon to explore in peace will stay on the moon to rest in peace." – Nixon's Apollo 11 Disaster Speech

     

    MIT via In Event Of Moon Disaster

    Conclusion

“Every record has been destroyed or falsified, every book has been rewritten, every picture has been repainted, every statue and street and building has been renamed, every date has been altered. And that process is continuing day by day and minute by minute. History has stopped.” – Orwell, 1984

In an ideal world, history would be objective: facts about what happened, unencumbered by the bias of society, the victor, or the narrator.  On some level, however, history is written by the winners.  Think about it … perceived "truth" is shaped by the bias and perspective of the chronicler.

    Consequently, history (as we know it) is subjective.  The narrative shifts to support the needs of the society that's reporting it. 

    The Cold War with the Soviet Union was a great example.  During the war, immediately thereafter, and even today, the interpretation of what transpired has repeatedly changed (both here and there).  The truth is that we are uncertain about what we are certain about.

    But while that was one example, to a certain degree, we can see this type of phenomenon everywhere.  Yes, we're even seeing it again with Russia.

But it runs deeper than cyber-warfare.  News stations color the stories they tell based on whether they're red or blue, and the internet is quick to jump on a bandwagon even if the information is hearsay.  The goal is attention rather than truth.

    Media disinformation is more dangerous than ever.  Alternative history can only be called that when it's discernible from the truth … and unfortunately, we're prone to look for information that already fits our biases. 

    As deepfakes get better, we'll likely get better at detecting them.  But it's a cat-and-mouse game with no end in sight.  Signaling Theory posits that signalers evolve to become better at manipulating receivers, while receivers become more resistant to manipulation. 

    I'm excited about the possibilities of technology, even though new capabilities present us with both promise and peril. 

    Meanwhile, "Change" and "Human Nature" remain constant.

    And so we go.

  • A Look at Codebases

    When I was a child, NASA got to the Moon with computers much less sophisticated than those we now keep in our pockets.

    At that time, when somebody used the term "computers," they were probably referring to people doing math.  The idea that businesses or individuals would use computing devices, as we do now, was far-fetched science fiction.

Recently, I shared an article on the growing amount of "compute" used in machine learning.  It showed that the amount of compute used in machine learning has doubled every six months since 2010, with today's largest models trained on datasets of up to 1.9 trillion data points.

    This week, I want to take a look at lines of code.  Think of that as a loose proxy showing how sophisticated software is becoming.

     

via informationisbeautiful

As you go through the chart, you'll see that in the early 2000s, the largest software ran to approximately twenty-five million lines of code.  Meanwhile, today, the average car uses one hundred million lines, and Google uses two billion lines of code across its internet services. 

For context, if you count DNA as code, the human genome contains roughly 3.3 billion base pairs of it.  So, while technology has advanced massively, we're still not close to emulating the complexity of humanity. 

    Another thing to consider is that when computers had tighter memory constraints, coders had to be deliberate about how they used each line of code or variable.  They found hacks and workarounds to make a lot out of a little.

    However, with an abundance of memory and processing power, software can get bloated as lazy (or lesser) programmers get by with inefficient code.  Consequently, not all the increase in size results from increasing complexity – some of it is the result of lackadaisical programming or more forgiving development platforms.
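As a toy illustration of that old "make a lot out of a little" mindset (my own example, not from the chart), here is the kind of bit-packing trick memory-constrained coders leaned on, next to the roomier style we can get away with today:

```python
# Toy illustration: packing eight boolean flags into a single byte,
# the kind of trick memory-constrained coders relied on.

flags = [True, False, True, True, False, False, True, False]

# Old-school approach: one bit per flag, eight flags per byte.
packed = 0
for i, flag in enumerate(flags):
    if flag:
        packed |= 1 << i                      # set bit i

print(f"packed byte: {packed:#010b}")          # -> 0b01001101
print(f"flag 2 set? {bool(packed & (1 << 2))}")  # read one flag back

# Modern, "bloated" approach: a list of Python bools costs dozens of bytes
# per flag once object overhead is counted ... but nobody notices anymore.
```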

Better-managed products consider whether the code works as intended and whether its resource usage is reasonable. 

    In our internal development, we look to build modular code that allows us to re-use equations, techniques, and resources.  We look at our platform as a collection of evolving components. 

    As the cost and practicality of bigger systems become more manageable, we can use our intellectual property assets differently than before. 

    For example, a "trading system" doesn't have to trade profitably to be valuable anymore.  It can be used as a "sensor" that generates useful information for other parts of the system. It helps us move closer to what I like to call, digital omniscience. 

    As a result of increased capabilities and capacities, we can use older and less capable components to inform better decision-making.

In the past, computing constraints limited us to using only our most recent system at the highest layer of our framework.

    We now have more ways to win.

    But, bigger isn't always better – and applying constraints can encourage creativity.

    Nonetheless, as technology continues to skyrocket, so will the applications and our expectations about what they can do for us.

    We live in exciting times … Onwards!

  • Artificial Intelligence Is Great, Artificial Stupidity Is Scary

    When I first got out of Law School in the 1980s, "professionals" didn't type … that was your assistant's job (or the "typing pool," which was a real thing too).

    At that point, most people couldn't have imagined what computers and software are capable of now.  And if you tried to tell people how pervasive computers and 'typing' would be … they would have thought that you were delusional.

    My career has spanned a series of cycles where I was able to imagine what advanced tech would enable (and how businesses would have to change to best leverage those new capabilities).

Malcolm Gladwell suggests that it takes 10,000 hours of focus and effort to become an expert at something.  While that figure isn't necessarily accurate, it's still a helpful heuristic.

Today, we can do research that took humans 10,000 hours in the time it took you to read this sentence.  Moreover, technology doesn't forget what it's learned.  As a result, technological memory is much better than yours or mine.  Consequently, the type and quality of decisions, inferences, and actions are better as well.  Ultimately, we will leverage the increased speed, capacity, and capabilities of autonomous platforms.  While that is easy to anticipate, the consequences of these discontinuous innovations are hard to predict.  Things often take longer to happen than you would think.  But, when they do, the consequences are often more significant and far-reaching than anticipated.

    Still, technology isn't a cure-all.  Many people miss out on the benefits of A.I. and technology for the same reasons they didn't master the hobbies they picked up as an adolescent. 

    I shot a video discussing how to use technology to create a sustainable creative advantage.  Check it out

     

     

    Many people recognize a "cool" new technology (like A.I.), but they underestimate the level of commitment and effort that mastery takes. 

    When using A.I. and high-performance computing, you need to ask the same questions you ask yourself about your ultimate purpose. 

    • What's my goal?
    • What do I (or my systems) need to learn to accomplish my goal?
    • What are the best ways to achieve that goal (or something better)?

    Too many companies are focused on A.I. as if that is the goal.  A.I. is simply a tool.  As I mentioned in the video, you must define the problem the right way in order to find an optimal solution. 

    Artificial Intelligence is a game-changer – so you have to approach it as such. 

    Know your mission and your strategy, recognize what you're committing to, set it as a compass heading and make deliberate movement in that direction. 

    I end the video by saying, "Wisdom comes from making finer distinctions.  So, it is an iterative and recursive process… but it is also evolutionary.  And frankly, that is extraordinarily exciting!"

    I hope you agree.

    Onwards!

  • Compute Trends In Machine Learning

    I often talk about Machine Learning and Artificial Intelligence in broad strokes.  Part of that is based on me – and part of that is a result of my audience.  I tend to speak with entrepreneurs (rather than data scientists or serious techies).  So talking about training FLOPs, parameters, and the actual benchmarks of ML is probably outside of their interest range. 

    But, every once in a while, it's worth taking a look into the real tangible progress computers have been making. 

Less Wrong put together a great dataset on the growth of machine learning systems between 1952 and 2021.  While many variables matter when judging the performance and intelligence of systems, their dataset focuses on parameter count, because it's easy to find and is a reasonable proxy for model complexity. 


    Giuliano Giacaglia and Less Wrong (click here for an interactive version)

One of the simplest takeaways is that ML training compute has been doubling roughly every six months since 2010.  Compared to Moore's Law, under which compute power doubled about every two years, that is a radical acceleration – especially as we've entered a new era of technology. 
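As a quick back-of-the-envelope comparison (my own arithmetic, not from the dataset), here is what those two doubling rates compound to over the 2010–2022 window:

```python
# Rough arithmetic: growth from doubling every 6 months vs. every 2 years.
years = 12                                  # 2010 -> 2022

ml_doublings = years / 0.5                  # one doubling every 6 months
moore_doublings = years / 2                 # one doubling every 2 years

ml_growth = 2 ** ml_doublings               # ~16.8 million x
moore_growth = 2 ** moore_doublings         # 64 x

print(f"ML-style growth:    {ml_growth:,.0f}x")
print(f"Moore's-law growth: {moore_growth:,.0f}x")
```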

Now, to balance this out, we have to ask what actually makes AI intelligent.  Model size is important, but so are training compute and training dataset size.  You also must consider the actual results that these systems produce.  And model size isn't a one-to-one proxy for model complexity, since different architectures and domains have different inputs and needs (yet can end up with similar sizes). 

A few other brief takeaways: language models have seen the most growth, while gaming models have the fewest trainable parameters.  That is somewhat counterintuitive at first glance, but it makes sense because the complexity of games imposes constraints in other dimensions.  If you really dig into the data, there are plenty more questions and insights to be had.  You can learn more from either Giuliano or Less Wrong.

And a question to leave with: will the scaling laws of machine learning change as deep learning becomes more prevalent?  Right now, model size comparisons suggest not, but there are many other metrics to consider. 

    What do you think is going to happen?

  • The Future of Spaceflight

    When I talk about exponential technologies, I almost always end up discussing Tesla and SpaceX. 

    Elon Musk is an interesting guy.


Whether or not they end up doing everything they say they will, his companies massively accelerate the rate at which capabilities turn into products and platforms for future growth.

    I recently shared the Elon quote: "Stop being patient and start asking yourself, how do I accomplish my 10-year plan in 6 months?  You'll probably fail, but you'll be a lot further along than the person who simply accepted it was going to take 10 years!"

    I don't know if he really said it.  Nonetheless, it sounds like him … and I agree with the sentiment.

    The New Space Race.

When I was young, the Space Race captured the hearts and souls of Americans.  But, for the past few decades, it faded into the background.  Recently, that has changed.  The space race is getting hot again.  Resources are pouring into this area, and SpaceX is leading the pack. 

In 2018, I shared my excitement that SpaceX's boosters were reusable.  Today, people are talking about how its newest ship, Starship, could render other rocket programs obsolete. 

     

Cost of space flight, via visualcapitalist

     

    While there's always room for competition, I can see many programs falling far behind if they haven't been focusing on reusability.  Assuming Starship delivers on its promises (keeping in mind that Elon is often over-confident about his timeline), it will be cheaper and more versatile than anything out there. 

    I think it's naive to assume that other companies aren't doing interesting things … but by the time they release anything comparable, it's possible that SpaceX will already dominate the market. 

Reusable rocketry isn't yet cost-effective for most potential customers, but Musk is undoubtedly moving the needle in the right direction. 

Hopefully, he can continue to raise the expectations of both consumers and producers.  The results could be out-of-this-world.

Right now, suborbital trips from Virgin Galactic and Blue Origin cost between $250K and $500K per trip – and trips to actual orbit cost over $50 million.

    However, I believe the cost of space travel – and space tourism – will drop radically within my lifetime. 

    It's hard to comprehend the scale of the universe and the scale of our potential … but that's what makes it worth exploring!

    Even though we've only been talking about space travel, there are so many other exponential technologies that this applies to just as well.

    Onwards!

  • Functional Mapping: Nature’s Desired Path

    There's a concept in design and transportation called Desire Paths

A desire path is the path users actually take, despite the path intended by the builder of a community or application. 

    Here's a great example

Reddit via itstartswithani

    And, here's a whole community forum focused on desire paths

    It's often easier to take advantage of human nature … or just nature … than fight against it. 

    To that effect, I shot a short video on how this relates to your business and tech adoption. I call it functional mapping. Check it out

     

Understanding the natural path of both the technology and your clients makes it easier to anticipate the capabilities, constraints, and milestones that define your path forward.  That means you actually have to understand the different types of users and what they expect to do.


    Each stage is really about the opportunity to scale desired capabilities and automation.

    It isn't really about building the technology, rather, it is about supporting the desire.

You don’t have to get it right.  You just have to create momentum in the right direction.  Meaning, if you understand what is coming, you don't have to build it … but you should figure out where you want to build something that moves things forward.

    You’ve probably heard me talk about how Capabilities become Prototypes. Then Prototypes become Products.  And, ultimately, Products become Platforms.

    This model is fractal.  That means it works on many levels of magnification or iteration.

    What first looks like a product is later seen as a prototype for something bigger.

    SpaceX's goal to get to Mars feels like their North Star right now … but once it's achieved, it becomes the foundation for new goals.

    This Framework helps you validate capabilities before sinking resources into them. 

    It helps you anticipate which potential outcomes you want to accelerate.  Rather than simply figuring out what the easiest next step is … you have to figure out which path is the best next step to your desired outcome.

    The world is changing fast! Hope you're riding the wave instead of getting caught in the riptide!

  • The Future of the Blockchain

    Last week, I talked about market performance in 2021.  A decent portion of that article talked about cryptocurrency and the recent downturn (after a stellar 2021).  I'm a skeptic by nature, so it's hard for me to get behind any specific coin (even Bitcoin) at this point in time.

    This week, I had a conversation with good friends, including John Raymonds, about the topic as well.  John is much more active in the space and brought up some good points.  Something I noticed was how the level of discussion is starting to elevate and mature. People are beginning to think about secondary and tertiary value propositions. The conversation even made me think about repurposing some of our underutilized hardware in our server room for some crypto-related purposes.

    So, today, I want to focus on a different aspect of the equation … the potential value propositions of cryptocurrency as a technology – and the blockchain. 

    To start, I want to talk about Industrial Revolutions.  In part because we're at another inflection point. 

    A Look at Industrial Revolutions

"The Industrial Revolution has two phases: one material, the other social; one concerning the making of things, the other concerning the making of men." – Charles A. Beard

    There are several turning points in our history where the world changed forever.  Former paradigms and realities became relics of a bygone era. 

    Tomorrow's workforce will require different skills and face different challenges than we do today.  You can consider this the Fourth Industrial Revolution.  Compare today's changes to our previous industrial revolutions. 

Each revolution shared multiple similarities.  They were disruptive.  They were centered on technological innovation.  They created cascading socio-cultural impacts.

    Since most of us remember the third revolution, let's spend some time on that. 

    Here's a map of the entire "internet" in 1973. 

     


    Reddit via @WorkerGnome.

Most of us didn't use the internet at this point, but you probably remember Web1 (static HTML pages, a 5-minute download to view a 3 MB picture, and of course … waiting for a website to load over a dial-up connection before you could read it).  It was still amazing.

Then Web 2.0 came, and with it everything else we now associate with the internet: Facebook, YouTube, ubiquitous porn sites, Google.  But Web 2.0 also brought user tracking and advertising, and we became the "product."  Remember, you're not the customer of those platforms – advertisers are.  And if you're not the customer, you're the product … which means there's little reason for the platforms not to censor your thoughts to control the narrative. 

    Putting You In Control

Web3 (and the blockchain and its related technologies) brings the power back to the people. 

That's primarily due to decentralized access with equal treatment for everyone.  Governments are already being pressured by Bitcoin and other cryptocurrencies.  So are banks and brokerages, thanks to smart contracts and Ethereum.  Soon, even VCs will be impacted by OHM fork treasuries and initial DEX offerings. 

As Web3 matures, so will decentralized finance.  That means big banks, governments, ISPs, and others will have less control over the applications and uses of the technology. 

    If handled correctly, that means competition can discourage the productization of your digital presence.

    Removing Barriers

    There are many practical ways this would impact your life – but let's look at one that's already happening. 

    El Salvador recently made Bitcoin legal tender.  Talking about all the reasons this happened is beyond the scope of this article, but it does make El Salvador an excellent case study for the possibilities. 

To start, it's now easier and quicker to buy a beer there (with Bitcoin) than it is in the US with cash.  Bitcoin can also help stabilize pricing during upheaval like a civil war, because value is easy to move both into and out of the country. 

Consequently, it also means that their money holds its value as they travel to other places. 

    Let's take this to the extreme.  Let's say someone was to convert all their net worth to Bitcoin, and put it in a hardware wallet.  They could conceivably memorize their seed phrase, throw the wallet in a fire, and fly to El Salvador with only the clothes on their back.  After finding a way to scrounge up the money to buy another hardware wallet through random acts of labor … they would be in a completely new country with their entire net worth and no other footprint.  It's scary – especially for governments and their taxing authority.  But, it creates a new set of potentials and freedom.
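To make the "memorize a phrase, recover your funds anywhere" idea concrete, here is a minimal sketch of how a BIP-39 mnemonic turns into a wallet seed, using only the Python standard library.  The phrase below is a well-known illustrative test vector, not a real funded wallet, and real wallet software adds hierarchical key derivation (BIP-32) on top of this step.

```python
# Minimal sketch of BIP-39 seed derivation (illustrative phrase, not a real wallet).
# The seed is just PBKDF2-HMAC-SHA512 over the mnemonic, so anyone who can
# remember the words can regenerate the same keys on new hardware, anywhere.

import hashlib
import unicodedata

def bip39_seed(mnemonic: str, passphrase: str = "") -> bytes:
    """Derive the 64-byte seed a hardware wallet would rebuild from a mnemonic."""
    mnemonic_bytes = unicodedata.normalize("NFKD", mnemonic).encode("utf-8")
    salt = ("mnemonic" + unicodedata.normalize("NFKD", passphrase)).encode("utf-8")
    return hashlib.pbkdf2_hmac("sha512", mnemonic_bytes, salt, 2048)

# Example phrase (a published test vector, NOT a wallet anyone should use):
phrase = ("abandon abandon abandon abandon abandon abandon "
          "abandon abandon abandon abandon abandon about")

seed = bip39_seed(phrase)
print(seed.hex()[:32], "...")   # same words -> same seed, on any device
```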

    Now, take it a step further.  What would the world look like if you had all your health data, insurance, etc., with you anywhere you travel?  The world becomes your oyster in a way that was almost impossible before. 

    And that's only the beginning. 

But, to bring it back to my skepticism, there are a lot of roadblocks, open questions, and time between today and the decentralization of the internet and finance.  And for now, that thought experiment only really works if you're willing to move to El Salvador.  The larger countries seem to be doing everything they can to discourage the adoption of cryptocurrencies, though I think the smaller countries view this as a chance to become one of the new hubs of the world.

    However, maybe it's time for this quote by Elon Musk: 

    "Stop being patient and start asking yourself, how do I accomplish my 10 year plan in 6 months? You will probably fail but you will be a lot further ahead than the person who simply accepted it was going to take 10 years."