# Buzzword peaks and valleys

February 14, 2023

At the turn of 2017, everyone was talking about AI. The rise of deep learning and new transformer architectures seemed ready to thrust us into an age of innovation. Every company wanted to be an AI-first company: rebranding, adding copy to their promotional pages, etc. Most didn't change their underlying tech. They had a logistic regression model powering a particular feature and suddenly they were AI Everywhere All At Once. Many companies I researched during this time didn't have a single ML engineer or data scientist.

At the turn of 2020, everyone wanted to go into Web3. A proliferation of startups looked to reinvent the core stack of the financial ecosystem (instant clearance, payments, trading platforms) alongside the backbone of the Internet (peer discovery, file sharing, identity). Every company suddenly wanted to become a Blockchain company. R&D groups were spun up to investigate how blockchains could be slotted into and leveraged by existing business practices. AI took a temporary backseat.

Now it's 2023 and once again, we are all in on AI. This is thanks in part to the cultural phenomenon that is ChatGPT - based largely on the same foundational model introduced nearly two years ago¹. Many companies are racing to deploy AI models (generative where possible) just to put them on their slide decks. Like clockwork, three years later, we've reverted to AI.

It's not necessarily bad to capitalize on trends. If a company with a few SVMs wants to label themselves as an AI company, they're not technically lying. And in some ways SVMs can be better than even the most modern deep learning methods. They're more interpretable, require more intentional thought about their input features, and have clearer bounds on behavior. I'm still bullish on SVMs.
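
To make the interpretability point concrete, here's a minimal sketch - using scikit-learn and a stock dataset purely as an illustration, not anything from the original post - of what a linear SVM gives you for free: one weight per input feature that you can read off directly.

```python
# A minimal sketch of why linear SVMs stay interpretable: after training,
# each input feature gets a single weight you can inspect directly.
# The dataset and pipeline here are illustrative assumptions, not from the post.
from sklearn.datasets import load_breast_cancer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

data = load_breast_cancer()
model = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=10000))
model.fit(data.data, data.target)

# One coefficient per feature: the sign and magnitude tell you how each
# input pushes the decision, something a deep net won't hand you for free.
svm = model.named_steps["linearsvc"]
weights = sorted(
    zip(data.feature_names, svm.coef_[0]),
    key=lambda kv: abs(kv[1]),
    reverse=True,
)
for name, weight in weights[:5]:
    print(f"{name:30s} {weight:+.3f}")
```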

But expectations are important. And I do fear that, once again, pseudo-AI companies are going to drown out the real-AI ones. When a buzzword is everywhere, it just leads to confusion. Does AI become another checkbox on the RFP?

The AI pitch can be particularly pernicious in this way. "Give us your data and the system will get better" only works for a limited time. Eventually users will expect to see more - perhaps way more - personalization and delightful predictive experiences. Without a step change in data or model architectures, which is typically outside the organization's wheelhouse, that's not going to happen.

Eventually people become immune to the pitch - to the flashy buzzwords, to the unrealized expectations. At the end of the day it comes down to what the product does versus how it does it. Does it actually help people get their job done, or help them better enjoy life? My barometer remains to buy a product for what it can do today, not what it can do tomorrow.

Speaking of tomorrow, I'm convinced Blockchain is going to be a comeback story in 2026. I'll grab my popcorn.


  1. Technically speaking, GPT3.5 was released last November and included RL finetuning on human feedback to better align its output with the text people expect. But the model architecture is almost identical to GPT3. Companies building on the GPT3 API have also been around for some time, which makes me think this recent hype cycle has more to do with ChatGPT and sociology than the tech itself.
