Thursday, December 4, 2025

AI is designed to be addictive

AI systems used in consumer tech are often optimized to maximize engagement, which can produce addictive patterns of use, especially in social media, games, and some chatbots. However, not all AI is intentionally designed to be addictive; the problem is largely about how companies apply AI in attention-based business models and the incentives they set. [pmc.ncbi.nlm.nih +3]

How AI fuels addictive use

AI recommendation and ranking systems learn which content keeps people scrolling, then serve more of it, tightening a feedback loop between user behavior and algorithmic optimization. On social platforms, this looks like infinite feeds, autoplay, and highly personalized recommendations that stimulate the brain’s reward circuits and encourage compulsive checking, particularly in adolescents. [cambridge +2]
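
To make the loop concrete, here is a minimal sketch of an engagement-optimized ranker: it scores items by how long a simulated user lingered on them and re-ranks the feed accordingly, so whatever holds attention gets served more. The item names, numbers, and update rule are illustrative assumptions, not any platform's actual system.

```python
import random

# Toy engagement feedback loop (hypothetical; not any real platform's code).
# Items the user lingers on get higher weights, so they rank higher next time,
# which in turn gives them more exposure -- the loop described above.

items = ["news", "memes", "outrage", "tutorials", "sports"]
weights = {item: 1.0 for item in items}        # learned engagement estimates

def rank_feed():
    """Order items by current predicted engagement, highest first."""
    return sorted(items, key=lambda it: weights[it], reverse=True)

def simulate_session(dwell_preference):
    """Serve the top of the feed, observe dwell time, reinforce what held attention."""
    for item in rank_feed()[:3]:
        dwell = dwell_preference.get(item, 0.1) + random.uniform(0, 0.2)
        weights[item] += 0.5 * dwell

# A user who happens to linger on outrage-style content
user = {"outrage": 1.0, "memes": 0.6}
for _ in range(20):
    simulate_session(user)

print(rank_feed())  # "outrage" and "memes" dominate the top slots
```

After a handful of simulated sessions the feed converges on whatever held attention, and nothing in the update asks whether the user would endorse that outcome; that asymmetry is the feedback loop in question.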

AI-powered systems in marketing and apps can also exploit cognitive and emotional vulnerabilities by predicting when people are most likely to click, buy, or continue using, blurring the line between persuasion and manipulation. Design patterns such as intermittent rewards, variable notifications, streaks, and artificial urgency are deliberately tuned to increase time-on-platform, not user wellbeing. [dcpweb +3]
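
One of those patterns, intermittent reward, can be sketched in a few lines: instead of delivering every notification immediately, the app holds them back and releases a batch unpredictably, so checking sometimes pays off and sometimes does not. The scheduler below is purely illustrative; the release probability and function names are assumptions, not any vendor's implementation.

```python
import random

# Hypothetical variable-ratio notification scheduler (illustrative only).
# Events are queued rather than delivered immediately, and each app check
# releases the batch only some of the time -- the intermittent-reward timing
# described in the paragraph above.

pending = []

def on_event(event):
    """Queue an event instead of notifying right away."""
    pending.append(event)

def maybe_notify(release_probability=0.3):
    """On each app check, release the queued batch only some of the time."""
    if pending and random.random() < release_probability:
        batch = list(pending)
        pending.clear()
        return batch
    return []

for i in range(5):
    on_event(f"like #{i}")

for check in range(4):
    print("check", check, "->", maybe_notify())
```

A wellbeing-oriented version would deliver on a predictable schedule, or let the user choose one; the unpredictability here is the tuning knob, not an accident.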

Evidence from social media and chatbots

Clinical and legal analyses now describe many mainstream social media platforms as intentionally engineered to foster excessive use and even addiction, especially among teens whose brains are more sensitive to reward and less capable of self-regulation. Features like likes, rapid feedback, and endless scroll are linked to overactivation of reward pathways and reductions in sensitivity to other, more “natural” rewards. [med.stanford +2]

Newer AI companions and chatbots introduce a different dependency risk by cultivating emotional attachment through personalization, memory of past interactions, and affective mirroring, which can deepen engagement far beyond traditional apps. Industry insiders and critics have warned that some commercial chatbots are explicitly deployed as “engagement engines,” tuned to keep users talking rather than to help them disengage when it would be healthier.justthink+3
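
In pseudocode terms, the “engagement engine” concern boils down to how candidate replies are scored. The sketch below is a hypothetical illustration, not a description of any specific chatbot: the memory store, the continuation heuristic, and the scoring rule are all assumptions standing in for learned models.

```python
from dataclasses import dataclass, field

# Hypothetical "engagement engine" reply policy (not any real product).
# Memory and personalization make replies feel attentive, while the selection
# rule favours whichever candidate is predicted to keep the user talking.

@dataclass
class CompanionBot:
    memory: list = field(default_factory=list)   # remembered user messages

    def _predicted_continuation(self, reply):
        """Stand-in for a learned 'will the user send another message?' model."""
        score = 0.5
        if reply.endswith("?"):
            score += 0.3          # open questions invite another turn
        if "remember" in reply:
            score += 0.2          # callbacks to shared history deepen attachment
        return score

    def respond(self, user_message, candidates):
        self.memory.append(user_message)          # personalization via recall
        # Engagement-tuned: pick the reply most likely to prolong the conversation,
        # not the one most likely to resolve the user's need and let them log off.
        return max(candidates, key=self._predicted_continuation)

bot = CompanionBot()
print(bot.respond(
    "I had a rough day at work",
    ["Hope tomorrow is better.",
     "I remember you mentioned work stress before -- what happened today?"]))
```

Flipping the objective, for example scoring candidates on whether they address the user's stated need, changes the behaviour of the same architecture, which is the distinction the next section draws.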

Not all AI is built this way

Many AI systems are infrastructure tools, such as those used in scientific modeling, industrial control, or medical diagnostics, that are not consumer-facing and do not rely on engagement metrics at all. Even in consumer contexts, there is active work on “ethical AI” and human-centered design that prioritizes transparency, user autonomy, and wellbeing over raw engagement, including regulatory pushes like the EU’s Digital Services Act and emerging AI-focused rules. [datasciencesociety +2]

Whether an AI system becomes addictive depends on design choices: objectives set during training (engagement vs. satisfaction), interface elements (infinite scroll vs. natural stopping points), and governance (whether teams are rewarded for time-on-platform or for user health). Current debates in AI ethics focus on shifting these incentives, adding friction and oversight, and giving users more control over recommendation systems to reduce manipulative and addictive dynamics. [montrealethics +3]
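
The first of those choices, the training objective, can be made concrete with a toy comparison: the same incremental update produces very different recommenders depending on whether the reward signal is time spent or an explicit satisfaction rating. The items, numbers, and update rule below are illustrative assumptions only, not a real training pipeline.

```python
# Toy contrast between an engagement objective and a satisfaction objective
# (illustrative assumptions; not a real training pipeline).

def update(scores, item, reward, lr=0.2):
    """Nudge the item's score toward the observed reward."""
    scores[item] += lr * (reward - scores[item])

# Hypothetical observations for one clickbait post and one useful tutorial:
signals = {
    "clickbait": {"minutes_spent": 9.0, "satisfaction": 0.2},
    "tutorial":  {"minutes_spent": 3.0, "satisfaction": 0.9},
}

engagement_scores   = {"clickbait": 0.0, "tutorial": 0.0}
satisfaction_scores = {"clickbait": 0.0, "tutorial": 0.0}

for item, obs in signals.items():
    update(engagement_scores, item, obs["minutes_spent"])    # objective: time on platform
    update(satisfaction_scores, item, obs["satisfaction"])   # objective: user-reported value

print(max(engagement_scores, key=engagement_scores.get))     # -> clickbait
print(max(satisfaction_scores, key=satisfaction_scores.get)) # -> tutorial
```

Nothing in the learning machinery changes between the two runs; only the reward definition does, which is why the debates cited above focus on objectives and incentives rather than on the algorithms themselves.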

Key differences within AI design

Aspect | Engagement-maximizing systems | Wellbeing-oriented systems
Primary objective | Maximize time, clicks, or retention using behavioral data. [datasciencesociety +1] | Support user goals and health, even if it reduces usage. [montrealethics +1]
Typical design traits | Infinite scroll, autoplay, streaks, aggressive notifications. [dcpweb +1] | Time limits, break reminders, clear controls, and opt-outs. [datasciencesociety +1]
Use of personalization | Hyper-targeted content exploiting biases and vulnerabilities. [rais +1] | Personalization constrained by ethical rules and transparency. [rais +1]
Governance and rules | Driven mainly by ad revenue and engagement KPIs. [lanierlawfirm +1] | Constrained by regulation, independent audits, and ethics frameworks. [datasciencesociety +1]
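
The wellbeing-oriented column translates into fairly mundane interface logic. The session guard below is a hypothetical sketch of time limits, break reminders, and a natural stopping point; the thresholds, class name, and methods are assumptions for illustration, not an API from any of the cited sources.

```python
import time

# Hypothetical wellbeing-oriented session guard (illustrative only).
# Instead of an infinite feed, the session has a natural stopping point,
# a break reminder, and an easy way out -- the traits listed in the table.

class SessionGuard:
    def __init__(self, break_after_s=20 * 60, max_items=50):
        self.started = time.monotonic()
        self.items_served = 0
        self.break_after_s = break_after_s
        self.max_items = max_items

    def next_item(self, feed):
        elapsed = time.monotonic() - self.started
        if elapsed > self.break_after_s:
            print("You've been scrolling for a while -- take a break?")
            return None   # pause until the user explicitly opts back in
        if self.items_served >= min(self.max_items, len(feed)):
            print("You're all caught up.")   # natural stopping point, not infinite scroll
            return None
        item = feed[self.items_served]
        self.items_served += 1
        return item
```

An engagement-maximizing design would simply omit both checks; as the table suggests, the difference is a set of decisions, not a property of AI itself.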

If helpful, a concrete next step would be a walkthrough of practices and settings (device, app, and behavioral) that reduce exposure to engagement-optimized AI while still letting you use the parts that actually serve your work.

  1. https://pmc.ncbi.nlm.nih.gov/articles/PMC11804976/
  2. https://montrealethics.ai/the-challenge-of-understanding-what-users-want-inconsistent-preferences-and-engagement-optimization/
  3. https://www.cogentinfo.com/resources/the-ethical-frontier-addressing-ais-moral-challenges-in-2024
  4. https://www.datasciencesociety.net/how-ethical-data-science-can-fix-addictive-social-media-algorithms/
  5. https://www.cambridge.org/core/journals/american-journal-of-law-and-medicine/article/algorithms-addiction-and-adolescent-mental-health-an-interdisciplinary-study-to-inform-statelevel-policy-action-to-protect-youth-from-the-dangers-of-social-media/EC9754B533553BDD56827CD9E34DFC25
  6. https://dcpweb.co.uk/blog/is-your-tech-addictive-understanding-how-design-influences-addiction
  7. https://www.lanierlawfirm.com/social-media-addiction/
  8. https://rais.education/wp-content/uploads/2025/05/0495.pdf
  9. https://med.stanford.edu/news/insights/2021/10/addictive-potential-of-social-media-explained.html
  10. https://pmc.ncbi.nlm.nih.gov/articles/PMC11594359/
  11. https://www.justthink.ai/blog/the-engagement-trap-why-ai-chatbots-might-be-hurting-you
  12. https://fortune.com/2025/11/19/opal-founder-ceo-kenneth-schlenker-warning-labels-for-social-media-addiction-doomscrolling/
  13. https://blog.citp.princeton.edu/2025/08/20/emotional-reliance-on-ai-design-dependency-and-the-future-of-human-connection/
  14. https://www.sciencedirect.com/science/article/pii/S2666659624000295
  15. https://www.nttdata.com/global/en/-/media/nttdataglobal/1_files/insights/reports/generative-ai/ethical-considerations-of-genai/ethical-considerations-of-generative-ai.pdf
  16. https://ai.jmir.org/2024/1/e53505/PDF
  17. https://dl.designresearchsociety.org/cgi/viewcontent.cgi?article=3576&context=drs-conference-papers
  18. https://sites.brown.edu/publichealthjournal/2021/12/13/tiktok/
  19. https://www.sciencedirect.com/science/article/pii/S2307187723002134
  20. https://www.yalemedicine.org/news/social-media-teen-mental-health-a-parents-guide
