
Worried about AI? Too late.

I would like to tell you a story, a story of a dystopian future. A future where there is a technology that surreptitiously infuses you with an addictive drug. Not continuously or randomly, but in a patterned way designed to condition you to perform certain actions. You know, like those much-maligned and misinterpreted rat studies conducted in the '70s. In this way, the technology influences what you read (and, as a result, what emotions you feel), what you buy, who you interact with, and how. The technology does not use any sort of Matrix-style implant or physical coercion (too many implementation barriers). Rather, the technology achieves this mind control through a small device you carry around with you all the time.

Credit: created by the author with Midjourney

How did the technology get people to carry these manipulative devices? This is one of the cleverest aspects of the technology. The drug it uses is dopamine, produced by your own brain. The technology has figured out how to trigger dopamine release with information displayed on the device. As a result, people are slowly conditioned to always want the device near them. Twisted, isn't it?


It all started innocently enough, long before the technology took over. Humans created the devices with noble intent: to facilitate communication and entertainment. But, humans being humans, they also wanted to monetize the devices (well, not the devices themselves, but the information displayed on them). Initially, the humans just made their best guesses as to how to maximize their profits through the services provided by the devices.


But as more and more people used the devices, more and more data became available about how people were using them, and this ever-growing data set was a ripe target for machine learning. Machine learning algorithms were used to optimize the parameters that would maximize use of the devices and the income they generated. The humans set up the analysis, but they didn't really know the exact details of how the algorithm weighted the different parameters or how the use of the devices was being optimized for profit. What may not have occurred to them was that the interactions with the devices were being driven by dopamine produced by the users' own brains. No one really asked the hard questions about what was going on. Profits rose, the users were happy, the investors were happy. All was well.


All was well, except that human beings were now compulsively carrying small digital devices that were controlling their brains, more or less dictating what people read (and, as a result, what emotions they feel), what they buy, who they interact with, and how.


By now you can probably see through my parable. Robust discussions of the promises and perils of generative AI tools like ChatGPT are hard to avoid these days, and rightly so. I take a mainly sanguine view of these tools (the primary threats, in my view, relate more to the elimination of certain job sectors and less to an impending robot apocalypse), but it does behoove us to think carefully about possible unintended consequences and possible safeguards as the technology continues to, um, evolve.


But what these discussions often miss is that our worst fears have already been realized—and realized so completely and insidiously we didn't even really notice, let alone care. Machines have invaded our pockets and our brains and are controlling our thoughts and actions minute by minute. Don't believe me? Leave your phone in the kitchen tonight when you go to bed. If you can.


Pause right now and write down one way you can start to put a little healthy distance between you and technology: enough distance to be able to carefully observe the role it plays in your life and how that compares to the role you want it to play. Now that you have written it down, go ahead and do it. Day by day, shape your relationship with technology into what you really want it to be. And keep coming back here as we explore what mindfulness has to offer all aspects of our lives.

