Science

  • Can AI Be Curious?

    “Nobody phrases it this way, but I think that artificial intelligence is almost a humanities discipline. It's really an attempt to understand human intelligence and human cognition.” —Sebastian Thrun

    We often use human consciousness as the ultimate benchmark for artificial exploration. 

    The human brain is ridiculously intricate.  While weighing only about three pounds, it contains roughly 100 billion neurons and 100 trillion connections between them.  Beyond the sheer complexity, the ordering of those connections – and of the actions the brain performs naturally – makes it even harder to replicate.  The human brain is also constantly reorganizing and adapting.  It's a beautiful piece of machinery.

    Evolution had millions of years to create this powerhouse of a computer, and now we're trying to do the same with neural networks and machines in a truncated time period.  While deep learning algorithms have been around for a while, we're just now developing enough data and computing power to change deep learning from a thought experiment to a real edge.

    Think of it this way: when talking about the human brain, we talk about the left brain and the right brain.  The theory is that left-brain activities are analytical and methodical, while right-brain activities are creative, free-form, and artistic.  We're great at training AI for left-brain activities (obviously with exceptions).  In fact, AI is beating us at these left-brain activities because computers have much higher input bandwidth than we do, they're less biased, and they can perform 10,000 hours of research by the time you finish this article.

    It's tougher to train AI for right-brain tasks.  That's where deep learning comes in. 

    Deep learning is a subset of machine learning that's often associated with unsupervised learning from unstructured or unlabeled data.  Instead of asking AI a question, giving it metrics, and letting it chug away, you're letting AI be intuitive.  Deep learning is a more faithful representation of the human brain.  It stacks layers of artificial neurons (for images, often convolutional layers) that combine linear and non-linear operations, so it can generalize to varied data sets and unseen environments.
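    To make "stacking linear and non-linear operations" concrete, here's a toy sketch in plain Python – made-up layer sizes, no particular framework's API – of what a deep network's forward pass boils down to:

```python
import random

def relu(v):
    # Non-linear activation, applied elementwise
    return [max(0.0, x) for x in v]

def linear(v, W):
    # Linear operation: multiply the input vector by a weight matrix
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def tiny_deep_net(v, layers):
    # "Deep" just means stacking these operations layer after layer
    for W in layers[:-1]:
        v = relu(linear(v, W))
    return linear(v, layers[-1])  # final linear layer, no activation

# Hypothetical sizes: 3 inputs -> 4 hidden units -> 2 outputs
random.seed(0)
make = lambda rows, cols: [[random.uniform(-1, 1) for _ in range(cols)]
                           for _ in range(rows)]
layers = [make(4, 3), make(2, 4)]
out = tiny_deep_net([0.5, -0.2, 0.9], layers)
print(len(out))  # 2
```

    A real system would learn those weights from data (and use convolutional layers for images); the point here is just that depth comes from repeating a linear step and a non-linear step.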

    When a baby is first learning to walk, it might stand up and fall down.  It might then take a small stutter step, or maybe a step that's much too far for its little baby body to handle.  It will fall, fail, and learn.  Fall, fail, and learn.  That's very similar to the trial-and-error loop at the heart of reinforcement learning.

    What's missing is the intrinsic reward that keeps humans moving when the extrinsic rewards aren't coming fast enough.  AI can beat humans at many games but has struggled with puzzle/platformers because there's not always a clear objective outside of clearing the level. 

    A relatively new (in practice, not in theory) approach is to train AI around "curiosity"[1].  Curiosity helps it overcome that boundary.  Curiosity lets humans explore and learn for vast periods of time with no reward in sight, and it looks like it can do that for computers too! 
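    The mechanics behind "curiosity" are surprisingly simple to sketch: the agent gives itself an intrinsic reward proportional to how badly it predicted what would happen next, so novel situations pay out even when no external reward is in sight.  Here's a deliberately toy version in Python (the cited paper learns a neural forward model; this one just predicts "the next state equals the current one"):

```python
class CuriosityBonus:
    """Intrinsic reward = error of a simple forward model.

    Toy version: the "model" predicts that the next state will equal
    the last state it saw.  A real curious agent would learn a neural
    forward model instead, but the reward logic is the same.
    """

    def __init__(self):
        self.predicted_next = None

    def reward(self, state):
        if self.predicted_next is None:
            bonus = 0.0  # nothing to be surprised about yet
        else:
            # Surprise: how far off was our prediction?
            bonus = abs(state - self.predicted_next)
        self.predicted_next = state  # naive forward-model update
        return bonus

curiosity = CuriosityBonus()
# A familiar, repetitive state sequence yields no bonus...
boring = [curiosity.reward(s) for s in [1.0, 1.0, 1.0]]
# ...while a surprising jump yields a large one.
surprise = curiosity.reward(5.0)
print(boring, surprise)  # [0.0, 0.0, 0.0] 4.0
```

    A repetitive stream of states earns nothing, while a surprise pays out – which is exactly the pressure that keeps a curious agent exploring a level even when the extrinsic score isn't moving.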

    OpenAI via Two Minute Papers

    Soon, I expect to see AI learn to forgive and forget, be altruistic, follow and break rules, learn to resolve disputes, and even value something that resembles "love" to us.

    Exciting stuff!

    _______

    [1] Yuri Burda, Harri Edwards, Deepak Pathak, Amos Storkey, Trevor Darrell, and Alexei A. Efros.  Large-Scale Study of Curiosity-Driven Learning.  In ICLR 2019.

  • A Few Notes from a Trip to Israel

    My wife and I just got back from Israel.  We were there to see my son Zachary play Rugby for Team USA in an International Tournament.
     
    Rugby Tournament in Israel
     
    I feel like I need a vacation after this trip.  There were so many things to do and see.
     
    Israel is tiny – roughly the size of New Jersey … Yet, consider its importance in the modern world (for example, by looking at the density of its holy sites, historical attractions, technological innovations, Nobel Prize winners, hostile borders, and military presence).
     
    It was fascinating how so many religions consider this the Holy Land.  Here is a photo I took of the Wailing Wall and the Dome of the Rock in the Old City of Jerusalem.
     
    The Old City of Jerusalem
     
    It’s easy to feel closer to “something” while here.
     
    Almost everything we saw in Israel is a testament to determination, ingenuity, and faith!
     
    With that said, I started to think about how audacious it was to conceive of many of the things they built (considering how difficult it would be to actually build them in the desert, without electricity, etc.).  Many of the sites we visited took decades to build … but have lasted for thousands of years.  Examples include the Fortress at Masada, the Wailing Wall, and the Port of Caesarea.  In my mind, I compare these moonshots to many of our current big, hairy, audacious goals (like reading and writing our DNA, autonomous artificial intelligence, or space exploration).
     
    Technologies might change, but human nature has remained surprisingly consistent throughout time.
     
    Onwards!
  • First Photos From the Webb Telescope

    The Hubble Telescope was conceived in the 1940s but not launched until 1990.  It revolutionized our ability to see the complexities of the universe.

    Now, the Webb Telescope is taking it to the next level. 

    via NASA

    The picture above shows the "Cosmic Cliffs" – the edge of a young star-forming region in the Carina Nebula.

    Below is a picture of a cluster of galaxies called Stephan's Quintet.

    via NASA

    Not only does this help us see far away systems that we've never seen before, but it also provides detail to the things we have seen.

    First, bring order to chaos … Then, wisdom comes from making finer distinctions.  With that in mind, I'm excited to see how this drives the future of science.

    Here's a brief video from Neil deGrasse Tyson on the new telescope.

     

    via NBC News

  • Reinventing The Wheel

    When I think about the invention of the wheel, I think about cavemen (even though I know that cavemen did not invent the wheel).

    Lots of significant inventions predated the wheel by thousands of years.  For example, woven cloth, rope, baskets, boats, and even the flute were all invented before the wheel.

    While simple, the wheel worked well (and still does).  Consequently, the phrase "reinventing the wheel" is often used derogatorily to describe needless or inefficient effort.

    But how does that compare to sliced bread (which was also a pretty significant invention)?

    Despite being a hallmark of innovation, it still took more than 300 years after its invention for the wheel to be used for travel.  With a bit more analysis, that makes sense: to use a wheel for travel, you need an axle, and the assembly has to be durable and load-bearing – which requires relatively advanced woodworking and engineering.


    All the aforementioned products created before the wheel (except for the flute) were necessary for survival.  That's why they came first.

    As new problems arose, so did new solutions.

    "Necessity is the mother of invention."

    Unpacking that phrase is a good reminder that inventions (and innovation) are often solution-centric. 

    Too many entrepreneurs are attracted to an idea because it sounds cool.  They fall in love with their ideas and neglect their ideal customer's actual needs.  You see it often with people slapping "AI" onto their product and pretending that makes it more helpful.

    If you want to be disruptive, cool isn't enough.  Your invention has to be functional, and it has to fix a problem people have (even if they don't know they have it).  The more central the complaint is to their daily lives, the better.


    Henry Ford famously said: “If I had asked people what they wanted, they would have said faster horses.”

    Innovation means thinking about and anticipating wants and future needs.

    Your customers may not even need something radically new. Your innovation may be a better application of existing technology or a reframe of best practices. 

    Uber didn't create a new car; they created a new way to get where you want to go, using existing infrastructure with less friction.  Netflix didn't reinvent the movie; they made it easier for you to watch one.

    As an entrepreneur, the trick is to build for human nature (meaning, give people what they crave or eliminate the constraint they're trying to avoid) rather than for the cool new tech that you're excited about.

    Human nature doesn’t seem to change much … Meanwhile, the pace of innovation continues to accelerate. 

    The challenge is to focus on what people want rather than the distraction of possibility.

    It gets harder as more things become possible.

    We certainly live in interesting times!

  • Americans’ Top Financial Concern

    It's no secret that the economy is slowing – with high inflation rates and rising interest rates.

    According to Mohamed El-Erian, president of Queens' College at Cambridge University, we're experiencing stagflation – inflation is high while growth slows significantly.  Theoretically, that leads to recession.

    via Visual Capitalist

    The Consumer Price Index has risen over 8% in the past year, so the American household is facing financial threats from many angles.

    Many feel that the Fed's recent response has been disappointing, and its response (or lack thereof) will be a major determinant of whether we enter a recession.

    I believe that emotions play a role too. When people are afraid, they spend less and hoard what they can to save themselves from an unknown future. They feel anticipatory grief.  And their fear, uncertainty, and doubt ripple through society and our lives. 

    Personally, I've weathered my heaviest storms by sailing toward the future regardless of the threats. An abundance mindset is a powerful tool, and as more people feel confident it becomes a macroeconomic trend with real influence. 

    I'd encourage you to think about what opportunities there are and will be. There are always seasons of change … Winter eventually comes – and goes. Nevertheless, winter can be a great opportunity to plan your next moves and build the infrastructure to sow more seeds in the coming spring. 

    Also, unlike nature, you can personally have springtime while the majority are in winter.  We're currently in an A.I. springtime – and I believe that will continue regardless of economic trends.

    Happy to talk about this … Let me know what you are thinking and feeling!

  • Companies With The Most Patents in 2021

    Intellectual Property is an important asset class in exponential industries.

    Why?  Because I.P. is both a property right (that increases the owner's tangible and intangible value) and a form of protection.

    They say good fences make good neighbors.  But you are also more willing to work to build an asset if you know that your right to use and profit from it is protected.

    As a result of that thinking, Capitalogix has numerous patents – and we're developing a patent strategy that goes far into the future.  So, it's a topic that's front of mind for me.

    Consequently, this visualization of which companies got the most patents last year caught my eye.  In 2021, the U.S. granted over 327,000 patents.  Here is who got them.

     

    Raul Amoros via Visual Capitalist

    While IBM isn't the public-facing industry leader it once was, it has topped the list for most patents for the past three decades.  Its patents this past year cover everything from climate change to energy, high-performance computing, and A.I.

    What ideas and processes do you have that are worth patenting?  And, what processes are worth not patenting – to keep from prying eyes?

    Food for thought … Onwards!

  • Thoughtful Entrepreneur Podcast

    Recently I had a chance to talk with Josh Elledge on his Thoughtful Entrepreneur podcast. We talked about AI's inevitable influence on trading as well as my experience as an entrepreneur. 


    Despite the misspelling ("Capital Logix" … it's Capitalogix), the conversation we had is worth a listen.

    Check it out.  

  • Dall-E … Not Wall-E: AI-Generated Art

    Neural networks creating images from text isn't new.  I wrote about it in 2019 when AI self-portraits were going viral. 

     


    Mauro Martino via YouTube

    Just like VR is getting a new lease on life despite its age, AI-generated art is getting another 15 minutes of fame.

    This past week, a new model called Dall-E Mini went viral.  It creates images based on the text prompts you give it – and it's surprisingly good.  You can even give Dall-E absurd prompts, and it will do its best to hybridize them (for example, a kangaroo made of cheese).

    Unfortunately, like our current reality, Dall-E may not be able to produce cheap gas prices.  Nonetheless, it is fun to try.  Click the image to enter the concepts you want Dall-E to attempt to represent.

    via Dall-E Mini

    While the images themselves aren't fantastic, the tool's goal is to understand and translate text into a coherent graphic response.  The capabilities of tools like this are growing exponentially (and reflect a massive improvement since I last talked about AI-generated images).

    Part of the improvement is organic (better hardware, software, algorithmic evolution, etc.), while another part comes from stacking.  For example, Dall-E's use of GPT-3 has vastly increased its ability to process language. 

    However, the algorithms still don't "understand" the meaning of the images the way we do … they're guessing based on what they've "seen" before.  That means they're biased by the data they were fed and can easily get stumped.  The Dall-E website's "Bias and Limitations" section acknowledges that it was trained on unfiltered internet data, which means it can produce offensive or stereotypical outputs about minority groups – a known, but unintended, bias.

    It's not the first time, and it won't be the last, that an internet-trained AI will be offensive. 

    Currently, most AI is essentially a brute force application of math masquerading as intelligence and computer science.  Fortunately, it provides a lot of value even in that regard. 

    The uses continue to get more elegant and complex as time passes … but we're still coding the elegance. 

    "An Elegant Use of Brute Force" via GapingVoid

     

    Onwards!