“Nobody phrases it this way, but I think that artificial intelligence is almost a humanities discipline. It's really an attempt to understand human intelligence and human cognition.” —Sebastian Thrun
We often use human consciousness as the ultimate benchmark for our explorations in artificial intelligence.
The human brain is ridiculously intricate. Despite weighing only three pounds, it contains about 100 billion neurons and 100 trillion connections between them. On top of the sheer complexity, the organization of those connections and the sequences of actions the brain performs effortlessly make it even harder to replicate. The human brain is also constantly reorganizing and adapting. It's a beautiful piece of machinery.
Evolution had millions of years to create this powerhouse of a computer, and now we're trying to do the same with neural networks and machines in a fraction of that time. While deep learning algorithms have been around for a while, we're only now developing enough data and computing power to turn deep learning from a thought experiment into a real edge.
Think of it this way: when talking about the human brain, we talk about left-brain and right-brain activities. The theory is that left-brain activities are analytical and methodical, while right-brain activities are creative, free-form, and artistic. We're great at training AI for left-brain activities (with obvious exceptions). In fact, AI is beating us at these left-brain activities because computers have a much higher input bandwidth than we do, are less biased, and can perform 10,000 hours of research in the time it takes you to finish this article.
It's tougher to train AI for right-brain tasks. That's where deep learning comes in.
Deep learning is a subset of machine learning that can learn from unstructured or unlabeled data without explicit supervision. Instead of asking AI a question, giving it metrics, and letting it chug away, you're letting AI be intuitive. Deep learning is a much more faithful representation of the human brain. It uses a hierarchy of neural network layers, stacks of linear and non-linear operations, so it can think creatively and problem-solve across varied data sets and in unseen environments.
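For the technically curious, here is a minimal sketch of that "stacked linear plus non-linear layers" idea in plain NumPy. The layer sizes, weights, and input below are arbitrary placeholders for illustration, not any particular model.

```python
# A toy forward pass through a few "deep" layers: each layer is a linear
# transform (matrix multiply + bias) followed by a non-linear activation.
import numpy as np

rng = np.random.default_rng(0)

def layer(x, in_dim, out_dim):
    """One layer: a linear transform followed by a non-linear ReLU."""
    W = rng.normal(size=(in_dim, out_dim))  # linear part: weights (would be learned)
    b = np.zeros(out_dim)                   # linear part: bias (would be learned)
    return np.maximum(0.0, x @ W + b)       # non-linear part: ReLU

x = rng.normal(size=(1, 16))        # a toy input vector (e.g., a few pixel features)
h1 = layer(x, 16, 32)               # early layers pick up simple patterns
h2 = layer(h1, 32, 32)              # deeper layers combine them into richer features
out = h2 @ rng.normal(size=(32, 1)) # final linear read-out (e.g., a score)
print(out.shape)                    # (1, 1)
```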
When a baby is first learning to walk, it might stand up and fall down. It might then take a small stutter step, or maybe a step that's much too far for its little baby body to handle. It will fall, fail, and learn. Fall, fail, and learn. That's very similar to the goal of deep learning or reinforcement learning.
What's missing is the intrinsic reward that keeps humans moving when the extrinsic rewards aren't coming fast enough. AI can beat humans at many games but has struggled with puzzle platformers because there's often no clear reward signal until the level is cleared.
A relatively new (in practice, not in theory) approach is to train AI around "curiosity" [1]. Curiosity helps it overcome that barrier. Curiosity lets humans explore and learn for long stretches with no reward in sight, and it looks like it can do the same for machines.
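Here's a toy sketch of how that curiosity bonus works: the agent rewards itself for reaching states its own model fails to predict, so surprising places are worth visiting even when the game gives no points. The tiny "forward model," the made-up environment step, and the 0.1 weighting below are illustrative placeholders, not the authors' actual setup.

```python
# Curiosity as an intrinsic reward: reward = extrinsic + weight * prediction error.
import numpy as np

rng = np.random.default_rng(0)

def predict_next_state(state, action):
    # Stand-in for a learned forward model; here it just guesses "no change".
    return state

def curiosity_reward(state, action, next_state):
    prediction = predict_next_state(state, action)
    return float(np.mean((prediction - next_state) ** 2))  # prediction error

state = rng.normal(size=4)
action = 1
next_state = state + rng.normal(scale=0.5, size=4)  # the world changed a bit

extrinsic = 0.0                                 # the game gave no points here
intrinsic = curiosity_reward(state, action, next_state)
total = extrinsic + 0.1 * intrinsic             # 0.1 is an arbitrary weight
print(round(total, 4))
```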
Soon, I expect to see AI learn to forgive and forget, be altruistic, follow and break rules, learn to resolve disputes, and even value something that resembles "love" to us.
Exciting stuff!
_______
[1] Yuri Burda, Harri Edwards, Deepak Pathak, Amos Storkey, Trevor Darrell, and Alexei A. Efros. Large-Scale Study of Curiosity-Driven Learning. In ICLR 2019.
Billion ... With A B
Humans are notoriously bad with large numbers; it's hard to wrap our minds around anything at that scale. We're wired to think locally and linearly, not exponentially (it's one of the reasons I love AI so much).
Here are a couple of ways to help you understand a billion dollars.
[Infographic via AskOpinion]
Next, let's look at spending over time. If you were to spend a dollar every second for an entire day, you would spend $86,400 per day. You can do that for approximately twelve days if you have a million dollars. With a billion dollars, you can do that for over 31 years. Ignoring the difference between net worth and cash, Jeff Bezos could spend $9M per day for over 31 years.
If you make $100K a year, you can earn $1 million in 10 years. At the same rate, it would take you 10,000 years to make $1 billion.
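If you'd rather let a computer do the arithmetic, here's a quick back-of-the-envelope check of those numbers (the $105B figure for Bezos's net worth is a rough assumption for the time of writing):

```python
# Sanity-checking the "dollar per second" and "salary" math above.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 -- one dollar per second for a day

million, billion = 1_000_000, 1_000_000_000

print(million / SECONDS_PER_DAY)          # ~11.6 days of spending $1/second
print(billion / SECONDS_PER_DAY / 365)    # ~31.7 years of spending $1/second
print(105_000_000_000 / 9_000_000 / 365)  # ~32 years of $9M/day (assumed net worth)

salary = 100_000
print(million / salary)                   # 10 years to earn $1 million
print(billion / salary)                   # 10,000 years to earn $1 billion
```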
Here's an example framed around spending money. Imagine that someone making $50K a year decides to buy a laptop, a car, and a house. Now let's make a relative comparison of what those items cost people worth a lot more than that. To do this, we'll shrink the price of each item to match the buyer's cost-to-wealth ratio. For a millionaire, a laptop might cost the equivalent of $100, a Porsche would cost $3,000, and a house would cost $25,000. Now, let's say you're Mike Bloomberg and you're worth $60B. A laptop's relative cost would be pennies, a Porsche's relative cost would be less than 60 cents, and a mansion's relative cost would be around $500. You could have everything you ever wanted for a minute fraction of your wealth.
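Here's the same scaling trick as a few lines of code: shrink each price by the ratio of the $50K baseline to the buyer's wealth. The item prices ($2,000 laptop, $60,000 Porsche, $500,000 house) are assumptions chosen to line up with the millionaire figures above.

```python
# Relative cost = sticker price scaled by (baseline / buyer's wealth).
baseline = 50_000  # the $50K-a-year frame of reference
items = {"laptop": 2_000, "Porsche": 60_000, "house": 500_000}

def relative_cost(price, wealth):
    return price * baseline / wealth

for name, price in items.items():
    # For a millionaire: laptop -> $100, Porsche -> $3,000, house -> $25,000
    print(name, relative_cost(price, 1_000_000))
```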
For a different perspective, here's an interesting visualization from informationisbeautiful. It shows various examples of things worth billions of dollars – including the personal wealth of several billionaires.
Okay, last one before I show a video ...
Let's try explaining the concept of a billion through time. Fifty thousand seconds is just under 14 hours. A million seconds ago was about 11 days ago. A billion seconds ago? Back in 1990. Pretty crazy.
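You can check those conversions yourself; the exact year a billion seconds ago lands in depends on when you run this.

```python
# Converting seconds into human-sized units of time.
from datetime import datetime, timedelta

print(50_000 / 3600)                                      # ~13.9 hours
print(timedelta(seconds=1_000_000))                       # ~11.6 days
print(datetime.now() - timedelta(seconds=1_000_000_000))  # roughly 31.7 years ago
```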
Here's a video from the 1970s that helps you understand scale through powers of ten – and an exploration of our universe.
Hope you enjoyed this. Let me know what you think.