By Marco Annunziata and Mickey McManus
We have seen a dramatic increase in the amount of complexity that exists in the world. Mickey McManus's book Trillions noted that as early as 2010, the semiconductor industry was producing more transistors each year than the world grows grains of rice, and at a lower cost per unit. Connectivity has amplified the global amount of aggregate complexity by enabling it to break out of any given domain and spread across the world. The rise of the so-called "Internet of Things"—starting with mobile devices and now extending to connected products, vehicles, and platforms—is flooding every corner of our homes, factories, and communities. Everything becomes connected—to everything else and to us.
The global economy has also become inextricably interconnected; our society is more and more interdependent. Across multiple fields, our knowledge gets deeper and more detailed; we solve old problems and create new ones at accelerating speed. No matter our walk of life, today we are asked to grasp a widening range of increasingly complex issues: climate change, energy policy, advances in health care, the likely impact of robotics and Artificial Intelligence.
All these new sources of complexity are increasing the frequency and amplitude of positive and negative feedback loops into crashing waves and a torrential flood. There are no signs of this complexity leveling out; quite the opposite—the waves are getting more erratic and larger and larger. We are standing on the shores of a trillion-node-network tsunami unlike anything seen before. Worse, this isn't just a rise of passive information, but also a deluge of active machine agents. When trillions of things not only collect billions of bits of information but also demand our attention and change our environments dynamically on the fly, our ability to think, make decisions, and take action may be on the verge of collapse.
The coming together of digital and physical technologies has turned business models upside down and made it even harder for economic analysis to keep up. The "prosumer" concept of the 1980s is back with a vengeance as new technologies allow households to produce electricity and sell it back into the grid, and give them access to manufacturing power with affordable 3D printers. Economists struggle to explain the collapse in productivity that accompanied the latest surge in innovation; their cacophony of explanations ranges from the charge that new digital innovations have no economic value to the claim that they create massive value delivered for free, and hence not recorded in the official statistics.
Our ability to think and make smart decisions is eroding just as our environment gets more complex and harder to grasp with our traditional tools.
But wait, this is not the first time we have faced a rise in complexity and had to contend with multiple disruptions. We've faced tough challenges before and built structures that allow us to manage and make decisions at vast scales. Corporations, cities, markets, and governments are all technologies we've devised to manage complexity and make rational, actionable decisions in a hostile world. Steven Johnson—in his new book Farsighted—points out that we've evolved decision and scenario sciences to cope with increasingly complex issues: from Darwin's era, when he used a simple "pro/con" list to decide whether he should get married (a non-trivial decision), to today's advanced scenario-planning war games, science-fiction foresight tools, and other scalable management techniques.
This time, however, seems different—for a troublingly simple reason. This time we face the rise of powerful new forces that undermine our very ability to react to these challenges and disruptions: our cognition itself is under attack. These toxic new forces leverage digital technology to exploit our behavioral biases, pushed by powerful financial incentives.
What if the structures we have built to protect us against irrational decisions turn out to be rickety breakwaters laid down on the shore of a once-placid sea, providing no protection from a 100-year flood? When the art and science of decision-making itself collapses, might we face a Great Cognitive Depression?
The early warning signs are troubling, to say the least. Authoritarian governments and despots are enjoying a resurgence. In many democracies, voters faced with complex issues turn to simple answers and slogans, to the siren call of populism. They dismiss the experts (Brexit is a case in point), and they look for scapegoats and easy fixes.
Could these be examples of human cognition reverting to evolutionary shortcuts to cope with complex threats? Authority bias is a quick way for us to decide things when we are faced with tough choices. If something is too ambiguous or non-deterministic we follow the authority figure with the most compelling and simple story, instead of doing the thinking for ourselves.
Social scientists have documented more than 200 cognitive shortcuts and biases that evolved to help us cope with danger, make decisions fast, and conserve our precious cognitive resources to fight another day. But sometimes those shortcuts have lived on far past their "sell by" date. Sometimes our brains lie to us. Buying behavior in our simian ancestors seems oddly similar to the ways humans make choices in markets. We believe we are rational actors, but time and again we find out that it is very hard to think about our own thinking. And now it's getting harder.
A powerful combination of new technologies and financial incentives is fast overwhelming our old protective barriers.
Digital innovations are creating value. But this value is not given away for free, as some economists contend. There is no free lunch.
We all know that digital platforms are after our data. Sometimes they use it to our advantage, with more personalized offerings; often they sell it to advertisers. For them we are a different kind of “prosumer”: not a producer-consumer, but rather a product-consumer. We are more a commodity than a true customer. You might argue that well, almost everyone realizes this, and we still enter these transactions of our own free will, so what’s the problem?
But digital platforms are not just after our data—they crave our unwavering attention. Higher ratings command higher advertising rates—and the ratings are determined by how much time we spend with our eyeballs glued to the screen, our attention absorbed by the apps.
Therefore, these platforms have a financial incentive to hold our attention, and to grab it back whenever it drifts away—a powerful financial incentive. Hence the game of incessant notifications, of addictive updates on likes and shares, of instigations to chase followers, friends and connections.
See, the fact that digital platforms grab our data in exchange for their “free” services strikes us as a lesser distortion. The digital platform, be it Google, Amazon, Twitter or Facebook, most likely gets more value from my individual data than it gives back to me in services. But the truth is, my data is much less valuable to me than it is to them, because they can aggregate it with others’, whereas I cannot. And unless I find a way to get together with millions of other users, in a sort of modern trade union of the digital sheep, I will never have enough bargaining power to extract more of that value. Because as long as everybody else gives their data away, the marginal value of my data is close to zero. But as I said, my data is of little value to me, in isolation. Little ventured, little lost in this case.
Our attention, our cognition, is a very precious resource. We need it to study, to work, to run our daily lives, to make small, big, and life-changing decisions. And it is a limited resource. We can fool ourselves that we can multitask; that we have become a lot more productive as we track our Twitter feed and social media messages while we work, or answer emails during conference calls.
Except that we can't, and we don't. We become less productive, not more. The statistics—as we discussed earlier—bear this out. It should be no surprise. In this more complex world, we have a lot to study and understand—and we cannot do it in 20-second bursts. When we get distracted, we need over 20 minutes to refocus on the task at hand. In this more complex and high-tech world, knowledge and understanding have enormous value. The time and cognition we invest in acquiring knowledge, mastering skills, and earning credentials yield a very high rate of return in terms of career opportunities, earnings, and personal fulfillment.
Which means that the opportunity cost of every minute we spend looking at a digital ad, “catching up” on various messaging platforms, or watching a viral video is extremely high.
And the digital drugs we take on a daily basis not only absorb precious time today—they also erode our ability to concentrate. By pushing us into an obsessive-compulsive habit of constantly checking for something new online, they gradually destroy our slow-thinking ability (à la Kahneman), our power of concentration. Our attention spans are shortening, undermining our future productivity as well.
This could easily become a vicious spiral: powerful financial incentives will keep pushing digital platforms to grab more and more of our attention. And as the Internet of Things becomes more pervasive, they will have more and more tools at their disposal: soon the mirror in your bathroom and smart dust around you as you walk down the street will also compete for your attention. At the same time, these companies’ tactics exploit deep-rooted cognitive biases: we are programmed to pay attention to anything referring to us, to look for news and new things, and to crave the approval of our community. Left to itself, this is only going to get worse.
So just as we enter the most harrowing straits for ourselves and our planet, as we race to rebalance ever-widening gaps between the powerful and the powerless, as we come to grips with extinction-level threats to our way of life, the structures we've erected to make rational decisions are collapsing. While we have new decision-making and scenario-planning methodologies at our disposal, we may not have much actual brainpower left to notice, care, or bring our best thinking to the table. The Great Cognitive Depression is racing towards us; we don't appear to be taking the early warning signs seriously, and we may not even notice before it's too late. The counterfeit attention-based currency flooding our markets may soon bankrupt our cognitive reserves. Bad money (attention) drives out good, as Gresham's Law predicts.
We've fostered the rise of industries that are rewarded for de-cognition attacks, and we have put no incentives or taxes in place to do what markets can't or won't do themselves. It is as if our human odyssey has been blown off course, pushed by the rising tide toward the land of the sirens, seduced by deceptive songs, hypnotized and driven towards madness. If we do nothing, we may ultimately end up in a watery grave.
Ulysses survived the fabled sirens by having his crew bind him to the mast while they stopped their own ears with wax. He wanted to hear the sirens' call but knew that in that future state he'd be driven mad. While still rational, he constrained the actions of his future self, because he knew his future self would be irrational. These forms of contracts, known as commitment strategies, come from the world of behavioral economics, where experts have studied cognitive biases and shortcuts and have come up with ways of mitigating their negative effects.
Daniel Kahneman and Amos Tversky taught us that we have a slow part of our brain—one that consumes a great deal of energy but aids us in making ethical, rational decisions—and a fast part that handles decisions on the fly, instinctively, to help us survive: the part of the brain that lets you drive for an hour on the highway and not remember any of the details. Fast thinking helps us build the muscles and habits to deal with fast-breaking challenges, or those best resolved through intuition. Slow thinking helps us solve the tougher, more complex problems.
Slow and fast thinking are both essential. But slow thinking is most at risk from the sirens' song of our digital lives. Like Ulysses, we want to enjoy the song of the digital sirens. But like Ulysses, we should worry that once we hear it, it will cause us to throw our slow thinking overboard, never to be seen again. Is there a way we can tie ourselves to the mast?
What would happen if we invested in the hardest sciences of all, the soft ones? What if we developed complexity-age tools to stave off and avert the cognitive collapse?
To get there, we have to set the right incentives. We noted earlier that digital platform companies have enormous financial incentives to strip our cognition, paying no heed to the short-term and long-term damage. It’s a clear case of negative externalities: digital companies get all the benefits of capturing our attention, but the substantial damage they cause by destroying our cognition is not priced into their cost of doing business—it is borne by all of us, spread over the entire community and pushed out into the future.
We need better pricing and better rules to correct for this.
Imagine if we had an EPA for cognition. Industries that strip-mined human cognition would be penalized for creating cognitive “superfund” sites. They’d be required to provide transparency in the ways that they used the world’s cognitive resources and pay a de-cognition tax on any activities that destroy cognitive ability, or divert cognition from other uses—the equivalent of a carbon tax.
Organizations that were early developers of products that cherished and built their customers' cognitive capacity would be rewarded with cognitive subsidies, the same way we encourage the switch from fossil fuels to renewables and work to spin up new industries focused on electric and hybrid vehicles.
This would recognize that the current rate of cognition erosion is unsustainable, that we need to protect and rebuild our collective cognition stock.
Imagine what an FDA-like agency would look like, one designed to give individuals agency in building their cognitive muscles. A bag of potato chips or a can of beer comes with adequate warnings and a fairly reliable measure of how many calories you are ingesting. Packs of cigarettes wear a fair warning on their sleeve. When we visit a website, we might see the boilerplate notice about cookies tracking our activity—but that is not the biggest issue. There should be a warning telling us how much time we are wasting (or investing) online, and the price we are paying in the minutes and hours we will lose as we slowly try to refocus on work.
We should be given the information we need to better estimate the true price we are paying for the free services we consume. Something like: "This product or environment consumes 20% of your daily ethical-decision reserves per hour, and 60% of your attention reserves. It contains known de-cognitive agents that have been shown to habituate users to reduced attention." Imagine if pioneering cities and states declared cognition a cherished human right for their citizens. If Google, for instance, can modify its services based on geolocation (see how Google serves China), then the state of California could require that citizens served by Google within the geo-fenced domain of the state be shown the Cognitive Nutrition Label at start-up.
None of this will be easy. We need better ways of measuring the de-cognition damage wreaked by digital technologies; we should think through the undesired side effects that any system of new taxes and incentives could create; and we have to safeguard the benefits of hyperconnectivity.
But we need to act. Cognition is our most precious resource, and it is coming under attack just when we need it most. The sirens’ song is getting louder. We need to tie ourselves to the mast while we still can.
Marco Annunziata is Co-Founder of Annunziata + Desai Advisors and former Chief Economist and Head of Business Innovation Strategy at GE.