

My second book, Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities, will be published by PublicAffairs this Tuesday, March 14. I’m very proud of it, and I hope you’ll give it a read.
Some advance praise
Meganets will forever change the way you think about the digital world. David Auerbach has written both a warning and a blueprint for a better future.
—Amy Webb, CEO, the Future Today Institute, and author of The Big Nine and The Genesis Machine
A fascinating, mind-expanding book that is not about the future of technology but about the future of society.
—Alan Murray, CEO, Fortune Media, and author of Tomorrow’s Capitalist
A necessary book, bracing in places, but not without hope.
—Jordan Ellenberg, author of How Not to Be Wrong and Shape
Auerbach is the opposite of a conspiracy theorist: he explains how there’s often ultimately no person, institution, or discernible group behind big systems and the events they shape.
—Jonathan Zittrain, professor of law and computer science, Harvard University
Auerbach knows better than anyone that the very act of writing about technology is an assertion of the preeminence of the human. His signature command of conscience and fact make his work that rarest thing: indispensable.
—Joshua Cohen, Pulitzer Prize–winning author of The Netanyahus
In stunning, lucid prose and with unsparing analytical acumen, Meganets reveals just how profoundly our world has been transformed over the past generation of technological innovation.
—Justin E.H. Smith, author of The Internet Is Not What You Think It Is
A Q&A on the main themes of the book
In MEGANETS you argue that no person, company, or government has the ability to control our digital world—it’s now an unbelievably complex autonomous entity authored by hundreds of millions of people connected by algorithms. That seems terrifying. How worried should we be?
We should definitely be concerned about and pay close attention to the effects that these systems are having; they have the potential to wreak far more havoc than we've yet seen. And we should be worried about a resistance to acknowledging the true nature of these systems and about the temptation to apply quick fixes that have already been shown not to work—to hold to the delusion that we can actually stomp out bad things online rather than just send them scattering elsewhere, for example, or to believe that we can actually stop the flow of what Frank Pasquale calls "runaway data." We have the tools to mitigate the problems, even if we can't fix them. We should be worried that we won't use them.
How do you define a meganet?
A meganet is a persistent, evolving, and opaque data network that controls (or at least heavily influences) how we see the world. It contains both algorithm- and AI-driven servers and the millions upon millions of users who are frequently or always active on those servers. The result is an ongoing feedback loop that causes meganets to change rapidly and unpredictably, in proportion to the three qualities of meganet content: massive volume, high velocity, and explosive virality.
When did computer engineers and tech companies first start losing control? When did you notice the change in the way technology interacts with massive amounts of human data?
The turning point came when sufficient numbers of users were able to influence algorithms without the direct mediation of programmers. I observed the first manifestations of these problems during the first decade of the 2000s, while working as a software engineer at Microsoft and then at Google. The computer science I'd studied and even the software engineering I'd practiced seemed to be taking a backseat to coping with systems going out of control rather than creating them. As the internet became more dynamic and user-driven, this loss of control accelerated faster than most people anticipated.
Could our online world have developed differently, or does the hyperconnectivity we rely on bring this risk along with it?
My opinion is that while a meganet-driven world could have come about in a variety of ways, it was nonetheless heading toward the same end result: this loss of control and increasingly outsized, incomprehensible networks. One can't necessarily predict how evolution will play out in an ecosystem, but there are certain principles of fitness and natural selection that will inevitably make themselves felt in the end result. I think something similar holds here.
Are there steps we can take to control, or at least tame, meganets?
Control, no, but tame, absolutely. I stress that these systems are simply too big and too fast for us to monitor them with sufficient scrutiny, even with the aid of AI (which, paradoxically, can actually exacerbate the lack of control). But if we abandon the search for targeted, perfect fixes and instead pursue larger-scale, softer policies that arrest the overall speed and development of meganets, we can lessen the negative effects even if we cannot eliminate them. Sometimes it may be as simple as setting a cap or delay on how content propagates on meganets; in other cases it will be more difficult. Facebook and TikTok are already trying out some such promising measures. I am confident that with the right mindset and the right approach, we will be able to reduce the negative effects of meganets while still benefiting from their positive qualities.
Do you think the misguided belief that someone—Mark Zuckerberg, perhaps—can fix the digital world has impeded our ability to effectively confront today’s informational and political crises? Could understanding the nature of meganets help us do better?
For any problem, it's tempting to think that a solution exists that we could implement if only we had the will. Admitting the sheer insolubility of a problem is scary. But clinging to that belief after we have seen, time and time again, the inability of any authority to fix the problems we're considering (whether disinformation, abuse, or economic chaos) is just perverse. A glance at Facebook's leaked internal communications quickly reveals that it is not a lack of will but rather a lack of ability that leaves the network populated with hate speech and false information. This in no way absolves the company—quite the opposite—but it does force us to reconsider how we might tame these systems more effectively. The first step, no matter what, is to understand them.