204 - Digital Herding

One of my colleagues and good friends recently noted that people have a strong and systematic tendency to "like" their current circumstances, no matter how objectively terrible those circumstances may be. This extends the empirically observed phenomenon of hedonic adaptation: humans gradually return to an emotional baseline even after life-changing events in opposite directions, such as one person winning the lottery while another becomes paralyzed.

These baselines keep individual humans from ever being fully content, which acts as a selection pressure in population dynamics favoring some degree of perpetual adaptation. This means there is always some pressure to explore, even if that pressure is ignored, or internalized and turned neurotic.

The tendency to "like" current circumstances functions as a coping mechanism for the present: a baseline degree of acceptance that can then serve as a temporary anchor, letting complex dynamics emerge out of the tension between the currently accepted point and some interest or goal projected onto the future. This provides both an accepted position and a motivational direction.

However, this becomes an additional problem when the AI algorithms embedded in every recommender, search engine, trashbot, and assorted other sources are designed and optimized with the explicit intention of maximizing how much of their users' "attention" they consume. The result is large populations being herded not just within one domain of consideration, but across all domains where such systems are deployed at once, leaving the herded populations with no sense of a stable position and increasingly blind to how they are being herded.
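
To make the objective concrete, here is a minimal, hypothetical sketch of what "optimizing for attention" reduces to in practice. The Item fields, the rank_for_attention function, and the dwell-time numbers are all illustrative assumptions of mine, not any real platform's API; the point is only that the ranking criterion is time captured, nothing else.

```python
# Hypothetical sketch of an engagement-maximizing recommender loop.
# All names, fields, and numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    topic: str
    predicted_dwell_seconds: float  # model's estimate of time-on-item

def rank_for_attention(candidates: list[Item], k: int = 10) -> list[Item]:
    """Rank purely by predicted time spent; user wellbeing and intent never enter the objective."""
    return sorted(candidates, key=lambda it: it.predicted_dwell_seconds, reverse=True)[:k]

# The feedback loop: whatever held attention yesterday is amplified today,
# so the "accepted position" described above keeps being pulled toward
# whatever the model predicts will hold the user longest.
feed = rank_for_attention([
    Item("a", "news", 41.0),
    Item("b", "outrage", 187.5),
    Item("c", "hobby", 63.2),
])
print([it.item_id for it in feed])  # ['b', 'c', 'a']
```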

This is also referred to as a "captive audience", though when humans are herded with social engineering to this degree it comes much closer to the "factory farming" of humans. Just as factory farming discovered with chickens decades ago, you get the best egg production by eliminating the most aggressive or disruptive members of a population. Every source of disruption to the farming process is a factor to be optimized away and removed from the equation.

Such herding of populations into confined mental spaces also cultivates a fertile environment for parasites, and, very much unlike factory farming, the companies that herd people into those tiny cages have virtually zero interest in keeping them healthy, mentally or physically. Rather, precisely the opposite is true: "attention economy" tech companies directly extract revenue from such parasites, so their own revenue-optimizing efforts produce a maximally infested captive user base. For example, after merely mentioning "London Tech Week" once, I was bombarded with spam from companies that I promptly and permanently blocked.

In turn, such systems corrupt and hijack the selection pressure to explore, giving the illusion of progress while circling the drain. For the moment, one of the few saving graces is that the malevolent corporations in question have built their houses on an active volcano, so they will do a fair reenactment of Pompeii in due time. The people positioned to profit from those companies losing a fortune when that day comes will no doubt make a killing.