Web/Tech

  • Gartner’s 2019 Hype Cycle For Emerging Technologies

    Technology is a massive differentiator in today's competitive landscape. 

    Sorting through predictions of which new technologies are going to impact the world and which are going to fizzle out can be an overwhelming task. I look forward to Gartner's report each year as a benchmark to compare reality against. 

    Last year, Gartner reported Deep Learning and Biochips were at the top of the hype cycle – in the "peak of inflated expectations." While I'm excited about both industries, there was certainly more buzz than actual improvement in those spaces last year. Excitement almost always exceeds realistic expectations when technologies gain mainstream appeal. 

    What's a "Hype Cycle"?

    As technology advances, it's human nature to get excited about the possibilities and disappointed when reality doesn't live up to those expectations. 

    At its core, the Hype Cycle tells us where a technology sits in its lifecycle and how long it will take to reach maturity. It attempts to tell us which technologies will survive the hype and have the potential to become a part of our daily lives. 

    Gartner's Hype Cycle Report is a careful analysis of the market excitement, maturity, and benefit of various technologies.  It aggregates data and distills more than 2,000 technologies into a succinct and contextually understandable snapshot of where each emerging technology sits in its hype cycle.

    Here are the five regions of Gartner's Hype Cycle framework:

    1. Innovation Trigger (a potential technology breakthrough kicks off),
    2. Peak of Inflated Expectations (success stories through early publicity),
    3. Trough of Disillusionment (waning interest),
    4. Slope of Enlightenment (2nd- and 3rd-generation products appear), and
    5. Plateau of Productivity (mainstream adoption starts). 

    Understanding this hype cycle framework enables you to ask important questions like "How will these technologies impact my business?" and  "Which technologies can I trust to stay relevant in 5 years?"

    That being said – it's worth acknowledging that the hype cycle can't predict which technologies will survive the trough of disillusionment and which ones will fade into obscurity. 

    What's exciting this year?

    It's worth noting that in this edition of the hype cycle, Gartner shifted towards introducing new technologies at the expense of technologies that would normally persist through multiple iterations of the cycle; 21 new technologies were added to the list. For comparison, here's my article from last year, and here's my article from 2015. Click on the chart below to see a larger version of this year's Hype Cycle.

    via Gartner

    This year's ~30 key technologies were selected from more than 2,000 candidates and bucketed into five major trends:

    • Sensing and Mobility represents technologies that are gaining a more detailed awareness of the world around them, like 3D sensing cameras, the next iteration of autonomous driving, and drones. Improvements in sensor technology, and the communication of that data through the IoT, are leading to more data and more insight. 
    • Augmented Human builds on the "Do It Yourself Biohacking" trend from last year. It represents technologies that improve both the cognitive and physical abilities of humans – technologies like biochips, augmented intelligence, and robotic skin. The future is bringing implants that extend humans past their perceived limits and increase our understanding of our bodies: biochips with the potential to detect diseases, synthetic muscles, and neural implants. Many of my friends believe this realm will extend human lifespans. 
    • Postclassical Compute and Comms represents new architectures built on classical computing technologies, like 5G and nanotech, resulting in faster CPUs, denser memory, and increased throughput. Innovation is commonly thought of as new technologies, but better versions of existing technologies can provide just as much value – and disrupt industries in much the same way. 
    • Digital Ecosystems are platforms that connect various types of "actors." They create seamless communication between companies, people, and APIs, which enables more efficient decentralized organizations and allows for the constant adoption of new evolutions in technology. Examples include the decentralized web, synthetic data, and decentralized autonomous organizations. 
    • My wheelhouse, Advanced AI and Analytics, is an acknowledgment of new classes of algorithms and data science that are leading to new capabilities, deeper insights, and adaptive AI. The future of this space involves more accurate predictions and recommendations on smaller data sets. More signal. Less noise.  

     

    Looking past the overarching trends of this year, it's also fun to look at what technologies are just starting their hype cycle. 

    • Artificial Tissue (Biotech) could be used to repair or replace portions of tissues, or whole tissues (cartilage, skin, muscle, etc.).
    • Flying Autonomous Vehicles can be used as taxis, but also to transport other things, such as medical supplies and food deliveries. Amazon and Uber are likely excited about this development – and you can expect it in the next couple of years. 
    • Decentralized Web builds on the same arguments blockchain makes against traditional currencies. Because the mainstream, centralized web is dominated by massive, corporate-controlled platforms like Facebook and Google, the decentralized web movement strives to enable free speech and to increase access for users whose internet access is strictly regulated. 
    • Transfer Learning refers to the ability of an AI to solve one problem and apply that "lesson" to a different but related problem (see the sketch after this list). When AI becomes able to generalize knowledge more abstractly, you will see a massive spike in utilization. 
    • Augmented Intelligence complements humans instead of replacing them with robots. To be clear – augmented intelligence is a subset of AI, but a different perspective on, and approach to, its adoption. 
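
    To make the transfer learning idea above concrete, here's a minimal sketch using PyTorch and a ResNet pretrained on ImageNet. The 10-class target task and the hyperparameters are stand-ins for illustration, not anything from Gartner's report.

        # Transfer learning sketch: reuse the features a network learned on
        # ImageNet and retrain only a small new head for a different problem.
        import torch
        import torch.nn as nn
        from torchvision import models

        model = models.resnet18(pretrained=True)     # the "lesson" from ImageNet

        for param in model.parameters():             # freeze the learned features
            param.requires_grad = False

        model.fc = nn.Linear(model.fc.in_features, 10)   # new head, new problem

        optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()
        # A normal training loop over the new task's data goes here; only the
        # small new head is updated, so far less data and compute are needed.

    Because only the final layer trains, the network can often reach useful accuracy with thousands of labeled examples instead of millions.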

    AI has been around since the '60s, but technological advancement and increased data mean we are now in an AI spring after decades of stagnation. 

    Many of these technologies have been hyped for years – but the hype cycle is different from the adoption cycle. We often overestimate what can change in a year and underestimate what can change in ten. 

    Which technologies do you think will survive the hype?

  • Prophetic Predictions

    New technologies fascinate me … As we approach the Singularity, I guess that is becoming human nature. 

    Second Thought has put together a video that looks at various predictions from the early 1900s. It is a fun watch – check it out. 

     

    via Second Thought

    It's interesting to look at what they strategically got right compared to what was tactically different. 

    While not all predictions are created equal, it seems that we have a better idea of what we want than of how to accomplish it. 

    The farther the horizon, the more guesswork is involved. Compared to the previous video's mid-1900s predictions, this 1995 video about the internet seems downright prophetic. 

     

    via YouTube

  • Trade Shows & The Evolution Of Trading

    I recently participated in several panel discussions about AI and trading. This picture was taken at The Trading Show in New York.

    The Trading Show – New York, 2019

    I speak at a number of events every year because I enjoy meeting people pushing the envelope and shaking things up.  It is also a great opportunity to feel the pulse of the industry (by paying attention to the titles of the sessions, the types of sponsors and vendors attracted, and of course, the makeup of the audiences).

    Big changes are coming!  Technical innovations and data science insights continue to impress, but the use of alternative data and advanced AI is at a tipping point.  I describe these shifts in the book I’m finishing up, called “Next On Wall Street – Understanding AI’s Inevitable Impact on Trading”.  Let me know if you want to know more about the book.

    In the '90s, when I'd go to conferences, I would pay attention to the speakers.  Now, when I go to conferences, I pay attention to the audience.  The players are changing so fast that the game itself is changing.

     

    Evolution of Trading

     

    There have been various generations of trading built on different innovations. When computerized data became available, simply understanding how to download and use it generated Alpha. The same could be said for each later evolution: the adoption of complex algorithms, access to massive amounts of clean data, and the adoption of AI strategies.

    Each time a new shift happens, traders pivot or fail. The scale of innovation increases, but the pattern remains.

    At this most recent conference, I was excited to see people recognizing the pivot toward AI, Big Data and high-speed computing.

    Change happens slowly, and then all at once, and we’re getting close to that inflection point.

    Onwards!

  • Will A Robot Take Your Job? The War on Automation

    Dread of a robot-dominated future is mounting. Is there a basis for it?

    Michael Osborne and Carl Frey, from Oxford University, calculated how susceptible various jobs are to automation. They based their results on nine key skills (a toy scoring sketch follows the list):

    • social perceptiveness
    • negotiation
    • persuasion
    • assisting and caring for others
    • originality
    • fine arts
    • finger dexterity
    • manual dexterity
    • and the need to work in a cramped area
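
    Their actual model is a classifier trained on O*NET job data; as a toy illustration of the intuition only, here's a sketch where jobs that lean heavily on these nine "bottleneck" skills score a lower automation risk. Every score and the scoring rule itself are hypothetical.

        # Toy illustration only: the real study trains a classifier on job
        # data. This sketch just captures the intuition that jobs relying on
        # the nine "bottleneck" skills are harder to automate.

        BOTTLENECK_SKILLS = [
            "social_perceptiveness", "negotiation", "persuasion",
            "assisting_and_caring", "originality", "fine_arts",
            "finger_dexterity", "manual_dexterity", "cramped_work_space",
        ]

        def automation_risk(skill_scores):
            """The more a job relies on bottleneck skills (each 0..1),
            the lower its automation risk in this toy model."""
            reliance = sum(skill_scores.get(s, 0.0) for s in BOTTLENECK_SKILLS)
            return max(0.0, 1.0 - reliance / len(BOTTLENECK_SKILLS))

        # Hypothetical example: a care-heavy, creative role scores lower risk
        print(automation_risk({"assisting_and_caring": 0.9, "originality": 0.8}))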

    via Michael Osborne & Carl Frey (Click For A Comprehensive Infographic)

    There are various statistics about the rate at which robots are taking jobs. Many expect that ~50% of current jobs will be automated by 2035.  It turns out that statistic is from Michael and Carl, and the actual numbers were 47% by 2034.[1]

    The quote actually refers to the risk of those jobs being automated. That 47% figure doesn't take into account cost, regulatory, political, or social pressures – so it's unlikely the full 47% will be realized. 

    via The Economist

    Many use that statistic to fear-monger about future joblessness and an increasing lack of middle-class mobility, but Mr. Frey isn't a proponent of that belief, and neither am I. 

    Industrialization created short-term strife but vastly increased the economic pie over the long term. It's likely that future automation will have similar effects if managed correctly. It's possible to truncate the pain if we learn from previous iterations of this cycle.  The fact that we're so much farther along technologically than in previous revolutions means we're in a better position to proactively handle the transitory period. 

    If we fail to manage the short-term consequences of the new tech, it will lead to unrest. And if unrest and opposition to automation persist, the situation will likely be exacerbated. It's only by embracing innovation that we can make sure automation is a boon to the middle class and not the bane of its existence. 

    Throughout history, technology has created more jobs than it has destroyed – and while that isn't currently the case, it doesn't mean it won't be. I often compare the AI revolution to the introduction of electricity. Electricity was a massive disruptor that put many people out of work, but it was a fantastic benefit to society. 

    Doom and gloom sell. It's much easier to convince people something's going to be painful than amazing because we're creatures of habit and our monkey brains fear pain much more than they enjoy pleasure. 

    Our attitudes and actions play a pivotal role in how the world impacts us. Pragmatically, we have various institutions in place to make the transition as painless as possible – note that I wouldn't say painless, but as painless as possible. 

    Onwards!

    _________________

    [1] Frey, Carl & Osborne, Michael. (2013). The Future of Employment: How Susceptible Are Jobs to Computerisation?

  • Country Comparisons: Harvard’s Economic Dynamics Data-Viz

    Harvard's Center for International Development put together a tool that I think is pretty cool: the Atlas of Economic Complexity.  Its goal is to get you to think differently about economic strategy, policy, and investment opportunities for individual countries. 

    Each country's profile analyzes its economic dynamics and future growth prospects, including which industries are burgeoning. They made it look pretty as well.  If you're curious about specific questions, you can use their exploration function instead. 

    via Atlas of Economic Complexity

    Interesting stuff.  Play around. Compare. Enjoy!

  • Terminator or Skynet

    Boston Dynamics just released a video of their Atlas robot doing an impressive gymnastics routine. Comparing it to their videos from 2009 shows how insane the progress is. 

    We often hear fears of Skynet-esque advanced AI … but Terminator-style robots may be a more immediate threat. 

    via Boston Dynamics

    Boston Dynamics makes robotics look cute, but there's both promise and peril here. For example, Turkey is reportedly deploying autonomous killer drones in Syria.

    Any tool can be used for good or evil; there's no inherent morality in a tool. But we're certainly good at finding ways to push the boundaries of its uses.

  • Training AI to Be Curious

    “Nobody phrases it this way, but I think that artificial intelligence is almost a humanities discipline. It's really an attempt to understand human intelligence and human cognition.” —Sebastian Thrun

    We often use human consciousness as the ultimate benchmark for artificial exploration. 

    The human brain is ridiculously intricate. While weighing only three pounds, it contains about 100 billion neurons and 100 trillion connections between them. Beyond the sheer scale, the complexity of the connections and the sequences of actions the brain performs naturally make it even harder to replicate. The human brain is also constantly reorganizing and adapting. It's a beautiful piece of machinery.  

    We've had millions of years for this powerhouse of a computer to evolve, and now we're trying to do the same with neural networks and machines in a truncated time period.  While deep learning algorithms have been around for a while, we're only just now developing enough data and enough compute power to change deep learning from a thought experiment into a real edge. 

    Think of it this way: when talking about the human brain, we talk about left-brain and right-brain. The theory is that left-brain activities are analytical and methodical, while right-brain activities are creative, free-form, and artistic. We're great at training AI for left-brain activities (obviously with exceptions). In fact, AI is beating us at these left-brain activities because computers have a much higher input bandwidth than we do, they're less biased, and they can perform 10,000 hours of research by the time you finish this article.


    It's tougher to train AI for right-brain tasks. That's where deep learning comes in. 

    Deep learning is a subset of machine learning that can learn from unstructured or unlabeled data. Instead of asking AI a question, giving it metrics, and letting it chug away, you're letting AI be intuitive. Deep learning is a much more faithful representation of the human brain: it uses a hierarchy of neural network layers (often convolutional) that stack linear and non-linear operations, so it can problem-solve more creatively across varied data sets and in unseen environments. 
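
    As a rough sketch of that "hierarchy of layers" idea, here's a tiny PyTorch network; the layer sizes and shapes are arbitrary choices for illustration.

        # A tiny deep network: each layer pairs a linear operation (convolution
        # or matrix multiply) with a non-linearity, and stacking layers lets
        # the network build increasingly abstract features. Sizes are arbitrary.
        import torch
        import torch.nn as nn

        model = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # learn local filters
            nn.ReLU(),                                    # non-linear activation
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level features
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 10),                            # 10 output classes
        )

        x = torch.randn(1, 3, 64, 64)   # one fake 64x64 RGB image
        print(model(x).shape)           # torch.Size([1, 10])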

    When a baby is first learning to walk, it might stand up and fall down. It might then take a small stutter step, or maybe a step that's much too far for its little baby body to handle. It will fall, fail, and learn. Fall, fail, and learn. That's very similar to the goal of deep learning and reinforcement learning.
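
    That fall-fail-learn loop is easy to see in code. Here's a minimal tabular Q-learning sketch on a made-up one-dimensional "walking" task; the environment and hyperparameters are mine, purely for illustration.

        # "Fall, fail, and learn": tabular Q-learning on a toy 1-D walk where
        # the agent starts at position 0 and learns to reach position 4.
        import random

        n_states, actions = 5, [-1, +1]      # step left or step right
        Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
        alpha, gamma, epsilon = 0.5, 0.9, 0.1

        for episode in range(500):
            s = 0
            while s != n_states - 1:
                if random.random() < epsilon:
                    a = random.choice(actions)                 # explore (and fall)
                else:
                    a = max(actions, key=lambda b: Q[(s, b)])  # exploit lessons
                s2 = min(max(s + a, 0), n_states - 1)
                r = 1.0 if s2 == n_states - 1 else -0.01       # extrinsic reward
                best_next = max(Q[(s2, b)] for b in actions)
                Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
                s = s2

        print(max(actions, key=lambda b: Q[(0, b)]))   # learned first move: +1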

    What's missing is the intrinsic reward that keeps humans moving when the extrinsic rewards aren't coming fast enough. AI can beat humans at a lot of games but has struggled with puzzle/platformers because there's not always a clear objective outside of clearing the level. 

    A relatively new (in practice, not in theory) approach is to train AI around "curiosity."[1] Curiosity helps it overcome that boundary. Curiosity lets humans explore and learn for vast periods of time with no reward in sight, and it looks like it can do the same for computers too! 
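
    The core trick in the cited paper is to treat prediction error as its own reward: states the agent can't predict yet are "interesting." Here's a minimal sketch of that idea in PyTorch; the network size and the reward mix are my own illustrative choices.

        # Curiosity as intrinsic reward (after Burda et al.): a forward model
        # predicts the next state, and its prediction error becomes a bonus
        # reward, so the agent seeks out states it can't yet predict.
        import torch
        import torch.nn as nn

        state_dim, action_dim = 8, 2
        forward_model = nn.Sequential(
            nn.Linear(state_dim + action_dim, 64), nn.ReLU(),
            nn.Linear(64, state_dim),
        )
        optimizer = torch.optim.Adam(forward_model.parameters(), lr=1e-3)

        def curiosity_bonus(state, action, next_state):
            predicted = forward_model(torch.cat([state, action]))
            error = (predicted - next_state).pow(2).mean()     # surprise
            optimizer.zero_grad(); error.backward(); optimizer.step()
            return error.item()   # big error = novel state = big bonus

        # Hypothetical transition with zero extrinsic reward: curiosity alone
        # still gives the agent a reason to keep exploring.
        s, a, s2 = torch.randn(8), torch.randn(2), torch.randn(8)
        total_reward = 0.0 + 0.1 * curiosity_bonus(s, a, s2)
        print(total_reward)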

     

    OpenAI via Two Minute Papers

    Exciting stuff! 

    _______

    [1] Burda, Yuri; Edwards, Harri; Pathak, Deepak; Storkey, Amos; Darrell, Trevor & Efros, Alexei A. (2019). Large-Scale Study of Curiosity-Driven Learning. In ICLR 2019.