I often talk about Machine Learning and Artificial Intelligence in broad strokes. Part of that is based on me – and part of that is a result of my audience. I tend to speak with entrepreneurs (rather than data scientists or serious techies). So talking about training FLOPs, parameters, and the actual benchmarks of ML is probably outside their range of interest.
But, every once in a while, it's worth taking a look into the real tangible progress computers have been making.
Less Wrong put together a great dataset on the growth of machine learning systems between 1952 and 2021. While many variables matter when judging the performance and intelligence of systems, their dataset focuses on parameter count – because parameter counts are easy to find and serve as a reasonable proxy for model complexity.
Giuliano Giacaglia and Less Wrong (click here for an interactive version)
One of the simplest takeaways is that ML training compute has been doubling roughly every six months since 2010. Compared to Moore's Law, under which compute power doubled every two years, we're radically outpacing that – especially as we've entered a new era of technology.
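To make that difference concrete, here's a quick back-of-the-envelope sketch (my own illustration, not from the Less Wrong dataset) comparing how much growth each doubling schedule compounds to over a decade:

```python
# Compare two doubling schedules over the same 10-year window:
# ML training compute doubling every 6 months vs. Moore's Law
# doubling every 24 months.

def growth_factor(years: float, doubling_time_years: float) -> float:
    """Total multiplicative growth after `years`, given a fixed doubling time."""
    return 2 ** (years / doubling_time_years)

ml_growth = growth_factor(10, 0.5)      # 2^20 doublings
moore_growth = growth_factor(10, 2.0)   # 2^5 doublings

print(f"ML compute over 10 years:  {ml_growth:,.0f}x")
print(f"Moore's Law over 10 years: {moore_growth:,.0f}x")
```

At a six-month doubling time, a decade compounds to roughly a million-fold increase, versus about 32x under the classic Moore's Law pace – which is why "doubling every six months" is such a striking claim.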
Now, to balance this out, we have to ask the question: what actually makes AI intelligent? Model size is important, but you also have factors like training compute and training dataset size. You also must consider the actual results that these systems produce. As well, model size isn't a 1-to-1 proxy for model complexity, as different architectures and domains have different inputs and needs (but can have similar sizes).
A few other brief takeaways are that language models have seen the most growth, while gaming models have the fewest trainable parameters. This is somewhat counterintuitive at first glance, but makes sense because the complexity of games imposes constraints in other domains. If you really get into the data, there are plenty more questions and insights to be had. But, you can learn more from either Giuliano Giacaglia or Less Wrong.
And, a question to leave with is whether the scaling laws of machine learning will differ as deep learning becomes more prevalent. Right now, model size comparisons suggest not, but there are so many other metrics to consider.
What do you think is going to happen?
The Future of Spaceflight
When I talk about exponential technologies, I almost always end up discussing Tesla and SpaceX.
Elon Musk is an interesting guy.
Whether or not they end up doing everything they say they're going to, his companies massively accelerate the rate at which capabilities turn into products and platforms for future growth.
I recently shared the Elon quote: "Stop being patient and start asking yourself, how do I accomplish my 10-year plan in 6 months? You'll probably fail, but you'll be a lot further along than the person who simply accepted it was going to take 10 years!"
I don't know if he really said it. Nonetheless, it sounds like him ... and I agree with the sentiment.
The New Space Race.
When I was young, the Space Race captured the hearts and souls of Americans. But, for the past few decades, it faded into the background. Recently, that has changed. The space race is getting hot again. Resources are pouring into this area, and SpaceX is leading the pack.
In 2018, I shared excitement that SpaceX's boosters were reusable. Today, people are talking about how its newest ship, Starship, could render other rocket programs obsolete.
While there's always room for competition, I can see many programs falling far behind if they haven't been focusing on reusability. Assuming Starship delivers on its promises (keeping in mind that Elon is often over-confident about his timeline), it will be cheaper and more versatile than anything out there.
I think it's naive to assume that other companies aren't doing interesting things ... but by the time they release anything comparable, it's possible that SpaceX will already dominate the market.
Reusable rocketry isn't yet cost-effective for most potential customers, but Musk is undoubtedly moving the needle in the right direction.
Hopefully, he can continue to raise the expectations of both consumers and producers. The results could be out of this world.
Right now, suborbital trips from Virgin Galactic and Blue Origin cost between $250K and $500K per trip - and trips to actual orbit cost over $50 million.
However, I believe the cost of space travel - and space tourism - will drop radically within my lifetime.
It's hard to comprehend the scale of the universe and the scale of our potential ... but that's what makes it worth exploring!
Even though we've only been talking about space travel, there are so many other exponential technologies that this applies to just as well.
Onwards!