117 - Decision Multiplier
Book #8 in my current reading marathon is now complete: "The Body Keeps the Score" by Bessel van der Kolk, M.D.
I'm not generally one for psychoanalysis literature, content as I am that it has largely been debunked, but this particular book covered some viable research amid the anecdotes and horror stories. I also had Neil deGrasse Tyson's advice in mind when picking it out: don't just read the things you agree with.
The book struck me as an appropriate follow-up to Dan Heath's "Upstream", which is about prevention; this one is about treatment for those who've been through the worst forms of trauma, which is far more prevalent than most people realize. It covers the reality of the status quo and some of the preventable harms that aren't adequately addressed today.
The preventable harms in the book are a small subset of the much broader variety of harms that the technology my team works with could be applied to catch early, prevent, and improve treatments for.
It is quite a potent thing to recognize that VCs throwing their money at trashbot technology rather than viable tech are also investing in delaying the prevention of rape, and of childhood trauma more generally. Inflection epitomizes this: a trashbot focused on "treatment", despite trashbot technology being a horrendous, high-risk mismatch for that use case.
Note that ethics dictates that when presented with the choice to delay or accelerate the deployment of viable technology for improving 8 billion lives, an individual is choosing between committing the worst crime any human has ever had the opportunity to commit, or its positive and equal opposite, the single most ethical action possible. This is because a delay that extends, or an acceleration that prevents, the preventable suffering of 8 billion people is subject to a force multiplier of 8 billion.
In all scenarios where humanity doesn't go extinct, that debt is paid in full by each of the guilty parties. Counterintuitively, this fails to trigger loss-aversion bias, as the punishment falls too far outside of their comprehension.
The paradox is that one of the most ethical people in human history could turn out to be virtually any wealthy person on the planet today, as the entire sum of actions they've taken in their life thus far, ethical or unethical, is dwarfed by that one choice. This was covered at greater length in the Ethical Basilisk Thought Experiment.
That said, the choice isn't binary, but with an 8-billion-fold force multiplier the neutral ground is extremely narrow.