You’ll have noticed how previously normal people start acting like they’re addicted to their favourite generative AI, shouting at you like you’re trying to take their cocaine away.
Matthias Döpmann is a software developer. He’s been trying out AI autocomplete for coding. He was very impressed by how good the tools were for the first 80% of the code. But that last 20% is the hard bit, where you have to stare into space and think, work out the structure, and understand your problem properly: [Revontulet.dev]
For a good 12 hours, over the course of 1 1/2 days, I tried to prompt it such that it yields what we needed. Eventually, I noticed that my prompts converged more and more to be almost the code I wanted. After still not getting a working result, I ended up implementing it myself in less than 30 minutes.
So he used the chatbot as a vastly more wasteful rubber duck: he explained the problem to it until he understood it well enough to solve it himself. He adds:
… This experience is shared among peers, where AI traps you into the thinking “I am only one prompt away”, whilst clearly it just does not know the answer.
That is: generative AI hooks its users the same way gambling does. Spin the gacha! Just one more spin, bro, I can feel it!
Large language models work the same way as a carnival psychic. Chatbots look smart through the Barnum Effect: you read what’s actually a generic statement about people and take it as being personally about you. The only intelligence there is yours. [Out Of The Software Crisis, 2023; blog post, 2023]
This is how people fall for chatbot girlfriends. They know it’s a bot, but they fall in love with the personality they’ve projected onto the generic statement generator.
There’s a book on this — Hooked: How to Build Habit-Forming Products by Nir Eyal, from 2014. This is the how-to on getting people addicted to your mobile app. [Amazon UK, Amazon US]
Here’s Eyal’s “Hook Model”:
- First, the trigger is what gets you in. Say, you see a chatbot prompt and it suggests you type in a question.
- Second is the action — say, you do ask the bot a question.
- Third is the reward — and it’s got to be a variable reward. Sometimes the chatbot comes up with a mediocre answer — but sometimes you love the answer! Eyal says: “Feedback loops are all around us, but predictable ones don’t create desire.” Intermittent rewards are the key tool to create an addiction.
- Fourth is the investment — the user puts time, effort, or money into the process to get a better result next time. That skin in the game leaves the user with a sunk cost.
- Then the user loops back to the beginning. Now they’re more likely to respond to the next external trigger — or they’ll come to your site themselves, chasing the dopamine hit from that variable reward. (There’s a toy sketch of this loop in code below.)
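If you want to see the shape of that loop, here’s a toy Python sketch of a variable-reward loop of the sort Eyal describes. Every number in it is made up for illustration. The one thing to notice is that a single win resets the urge to quit:

```python
import random

# Toy run of the Hook Model loop. All numbers are invented for
# illustration; the point is the shape of the loop, not the values.

HIT_RATE = 0.15  # assumed chance a prompt gives you an answer you love


def run_prompt() -> bool:
    """The action: run a prompt. The reward is variable."""
    return random.random() < HIT_RATE


def hooked_user(patience: int = 10) -> None:
    prompts = 0           # the investment: every prompt is sunk cost
    misses_in_a_row = 0
    while misses_in_a_row < patience:
        prompts += 1      # trigger leads to action
        if run_prompt():  # variable reward
            print(f"Great answer on prompt {prompts}. One more spin!")
            misses_in_a_row = 0  # a win resets the urge to quit
        else:
            misses_in_a_row += 1
    print(f"Walked away after {prompts} prompts of sunk cost.")


random.seed(1)
hooked_user()
```

That’s a variable-ratio reinforcement schedule: the payoff arrives after an unpredictable number of pulls. It’s the schedule poker machines run on, and it’s the one that keeps lab animals pulling the lever longest.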
Hooked builds on research papers about the mechanisms of gambling addiction — a life-destroying problem — to explain how you can use the same psychological manipulation techniques in your app. This book is on the bookshelf at every startup.
Eyal said he wrote Hooked to promote healthy habits, not addiction — but from the outside, you’ll be hard pressed to tell the difference. Because the model is, literally, how to design a poker machine. Keep the lab rats pulling the lever.
You’ll see the Hook Model in every mobile game. You’ll see it in dating apps, where the business model is to string you along: not to connect you with someone nice, but to keep you spending money on the app. Before Duolingo went nuts with AI slop, it used the same techniques to get you to do your language practice every day!
I sort of recommend you read Hooked, even though the book’s a bit disingenuous about what it’s doing, and it will make you angry. Because it explains so much about everything we deal with.
With ChatGPT, Sam Altman hit upon a way to use the Hook Model with a text generator. The unreliability and hallucinations themselves are the hook — the intermittent reward, to keep the user running prompts and hoping they’ll get a win this time.
This is why you see previously normal techies start evangelising AI coding on LinkedIn or Hacker News like they saw a glimpse of God, and they’ll keep paying for the chatbot tokens until they can see a glimpse of Him again. And they insist you have to as well. This is why they act like they joined a cult. Send ’em a copy of this post.
On the addictive potential, Sergio Visinoni asked a few months ago: “Is GenAI digital cocaine?” He works through the analogy. [blog post]
Coincidentally, Natalie Ponte posted on LinkedIn today: [LinkedIn]
try replacing “ai” with “cocaine” in all the posts you read about it. it’s pretty funny
Let’s try it!
- “My cocaine skeptic friends are all nuts, they’ll be left behind.”
- “Cocaine isn’t going to replace you. Someone using cocaine is going to replace you.” Checks out.
Cocaine doesn’t make you a business genius — it just makes you think you’re a business genius. Same for AI.