What are the trade-offs of AOAI’s?
Without being too much of a downer, let’s dwell on some problems.
In our last post, we defined a new term—AOAI (Attention Optimizing AI’s).
AOAI’s are algorithms, developed using machine learning, with the goal of optimizing your attention on a tool or platform. The term includes screens and social media, but is intentionally broader, to draw attention to how every digitally enabled experience is converging on the same goal and method. They are so ubiquitous that they have become the single most influential experience in our lives, beyond the influence of any other experience, relationship or institution. And while they have a huge range of remarkable and compelling benefits, there are also some trade-offs.
These trade-offs will be the focus of this post.
A basis in optimism
The problem with talking about problems is that you can come across like a downer. We fell into this trap in our first attempts at articulating our mission. It was described as “dark” by several team members, and rightly so upon reflection. We also recognize that pessimism is too often used as a shortcut for critical thinking. This is something we really want to avoid.
We believe our mission is rooted in optimism. Optimism in human potential. Optimism that our problems are solvable. Optimism that technology can be used in better alignment with human needs. Optimism for the profession of teaching. And without irony, optimism in institutions—our system of education and democratic systems.
Yet in our search to find solutions, we found we needed better explanations of the problems themselves. Countless studies and concerns about technology have been raised to date. Yet none of them seemed to help us define a path toward a solution. They are too narrow in scope, don’t form solid enough connections to societal trends, are overly inflammatory (e.g., The Social Dilemma), or don’t reflect what the latest developments in AI imply about the future.
So in the interest of finding better solutions, and at the risk of sounding like a downer, we will be dwelling on problems in this post. To keep things focused, we’ll dwell on the present challenges of AOAI’s here and cover our concerns about the latest developments in AI in a subsequent post.
As we dwell, let’s just not lose sight of our basis in optimism.
The trade-offs of AOAI’s
Let’s start with a deep dive into the trade-off that is the most scientifically established and the most commonly understood.
AOAI’s unbalance our brain reward systems. Social media, screens, and by extension AOAI’s, leverage the reward systems of our brains. They trigger a dopamine release in response to the gratifying experiences they provide. Dopamine itself is a positive thing; it contributes strongly to how humans learn. But you can get too much of a good thing. Like addictive substances, AOAI’s start to throw our reward systems out of balance with repeated use. Positive reward responses get diminished and anti-reward responses (e.g., norepinephrine) get strengthened. And greater and greater stimulation is required to return to a balanced reward-system state. AOAI’s know how to stimulate us artfully and keep the click-starved party rolling.
The problem comes in when we have a non-gratifying experience. This could be encountered digitally, like an article or post you don’t agree with. Or it could simply be whenever you shut down your technology and engage in the world around you. The real world is not organized to deliver constantly gratifying experiences. And these non-gratifying experiences now provoke something akin to a hangover.
Our lived experience of this hangover goes something like this. Instead of provoking our curiosity to learn something new, we skip past disconfirming information and seek more gratification, never really expanding our understanding. Boredom is now associated with suffering and pain, because that is what it feels like when you have a hangover. And the idea that boredom is a useful, if not profound, signal gets completely lost. A conversation with family, friends or co-workers becomes a patience-testing experience (you can’t accelerate playback past 1x), leading to diminished deep listening, a loss of connection and an inability to collaborate. And if that conversation includes something you don’t agree with or someone you don’t like, there is a complete inability to find common ground or understanding.
We feel the connection between this reward-system imbalance from AOAI’s and some important societal trends, like the loneliness epidemic, the growing political divide and growing mistrust in institutions (i.e., decentralization), isn’t emphasized strongly enough.
That said, this explanation isn’t good enough on its own. If the problem is framed as one of addiction, the obvious solution is to stop usage, or at least severely limit it. But this negates all the enormous benefits technology does bring to our lives. And it ignores that AOAI’s are unavoidably embedded into everything, everywhere. So it’s simply not a workable idea.
This means we have to go deeper into the problem if we want to find solutions.
AOAI’s diminish our emotional regulation. Our attention has always been captured by provoking our emotions. “If it bleeds, it leads” is a saying that predates the internet. AOAI’s recognize this and dial up the provocation to weapons grade. Our emotions are provoked more often than ever before, by many orders of magnitude. This point alone is worth dwelling on.
In addition, the changes to our brain reward systems from addiction come with a related outcome: they provoke a stress response, making us even more emotionally reactive.
Taken together, it isn’t much of a leap to say that our emotional regulation has been diminished. It is harder than ever to contain our emotional responses, especially to things meant to provoke or that we don’t immediately like. And the connection between limited emotional regulation from AOAI’s and societal trends like outrage, cancel culture, helicopter parenting, safetyism, etc. simply can’t be emphasized strongly enough, in our opinion.
It is worth noting that we weren’t exactly paragons of emotional regulation before AOAI’s. This is likely one of the most challenging and defining struggles for humanity. We believe progress over the last several thousand years has been highly dependent not just on our ability to build knowledge and skills, but also on our ability to regulate our emotions, to prioritize values over our feelings, in order to participate in creating meaningful progress.
Which brings us to our third trade-off.
AOAI’s change our values. Humans have developed a wide and diverse range of values to create progress and meaning, and to make sense of the inevitable suffering of life. Some examples of these positive values include honesty, flexibility, curiosity, delay of gratification, humility, gratitude, courage, forgiveness and many more.
AOAI’s, particularly the social media variety, have elevated what was previously thought of as an undesirable behavior (seeking attention) into a value. We are rewarded for followings, likes and other forms of engagement, so we seek them.
The reason seeking attention wasn’t originally thought of positively is that it often conflicts with developing these historically important values. And you can see this conflict in our online behaviors every day. We take risks we wouldn’t normally take. We provoke others just to get a response. We take offense to more things. We focus on form over substance, novel experiences over responsibility, etc.
The contribution of this change in values from AOAI’s to things like online abuse, political polarization (again), and even events like the Capitol insurrection and gun violence simply can’t be emphasized enough in our minds.
New pathways for solutions
To quickly summarize, the trade-offs of AOAI’s start with imbalanced brain reward systems (i.e., digital addiction) but also include diminished emotional regulation and changed values. While this isn’t a change to our fundamental human capacities, it is a major change to how we react to the world around us. And we feel it is a problem that needs solutions.
We found this expanded explanation of the trade-offs of AOAI’s worth articulating. It makes better connections with concerning societal trends, and it opens up some new pathways for solutions. Specifically, it opens up the idea of treating this as a problem of values.
And it led us to ask ourselves a question: what if we could find a way to strengthen the values that combat the trade-offs of AOAI’s?
It is worth noting that there are some interesting ideas emerging to help individuals with this problem, like Dopamine Fasting and the methods in Indistractable (written by the same author as Hooked—someone who knows a thing or two about AOAI’s). It is also worth noting that our own value system (i.e., being coachable) was conceived to help deal with these same challenges. But these ideas can’t reach people at the scale of AOAI’s on their own.
This is where our system of education came back into focus for us, along with the need for educational technology with a better goal than attention: the goal of building better relationships.
These points will be explored further in coming posts.