◼ Thread · Systemic Harm

The Algorithm That Decides Who Lives

Four corporations. Four software systems. Four decisions to deploy knowing people would die. The algorithm is not the cause. The algorithm is the alibi.

346 · Boeing MAX deaths
22.7% · UHC denial rate, 2022
6,700 · Rohingya killed in the first month

In each case below, a corporation deployed software to automate a decision that previously required human judgment. The software was optimized not for accuracy or safety but for throughput, cost reduction, or liability minimization. Engineers and executives were presented with evidence that the system would harm people. They deployed it anyway. When people died, the company called it a bug, an edge case, or pilot error.

These are not bugs. The algorithm is the company's values, compiled.


UnitedHealth / nH Predict: Denying Care to Medicare Patients Until They Die

naviHealth's nH Predict is a machine-learning system trained on 6 million patient records. UnitedHealthcare deployed it to set hard ceilings on how many days of post-acute care — skilled nursing facilities, inpatient rehabilitation — Medicare Advantage beneficiaries would receive. The algorithm generated a predicted discharge date. The company used it to override physician recommendations.

The results were documented in a U.S. Senate Permanent Subcommittee on Investigations report (October 2024): UnitedHealth's post-acute care denial rate rose from 8.7% in 2019 to 22.7% in 2022 — a near-tripling — coinciding with naviHealth's expanded role. For skilled nursing facilities specifically, the denial rate rose from 1.4% to 12.6% — a ninefold increase — in the first year naviHealth managed SNF pre-authorizations.

The Lokken class action (D. Minn., Nov. 2023) alleges nH Predict carries a 90% error rate, calculated from the percentage of payment denials reversed on internal appeal or by administrative law judges. Plaintiff Gene Lokken, 91, fractured his leg, was denied coverage after 19 days despite paralyzed muscles, was left to pay $150,000 out of pocket, and died in July 2023. Dale Tetzloff, 74, survived a stroke, was denied after 20 days, faced $70,000 in bills, and died in October 2023. Both were named plaintiffs. Both died before the case was even filed.

STAT News reported that UnitedHealth set an internal goal of keeping patient stays within 1% of the algorithm's predicted length, treating the prediction not merely as a ceiling but as a target to hit.
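
The mechanism the Senate report and STAT describe reduces to two lines of logic: the model's predicted stay caps the physician's recommendation, and staff are measured on how closely actual stays track the prediction. Here is a minimal sketch of that pattern; the class, field names, and numbers are hypothetical illustrations, not naviHealth's actual system:

```python
from dataclasses import dataclass

@dataclass
class PostAcuteCase:
    physician_recommended_days: int  # the treating clinician's judgment
    model_predicted_days: int        # an nH Predict-style output (hypothetical value)

def authorized_days(case: PostAcuteCase) -> int:
    # The pattern the Senate report describes: the prediction is not weighed
    # against clinical judgment, it caps it.
    return min(case.physician_recommended_days, case.model_predicted_days)

def hits_internal_target(actual_days: int, predicted_days: int) -> bool:
    # STAT's reporting: keep actual stays within 1% of the predicted length,
    # which turns the model's estimate into a target to hit.
    return abs(actual_days - predicted_days) / predicted_days <= 0.01

case = PostAcuteCase(physician_recommended_days=40, model_predicted_days=19)
print(authorized_days(case))          # 19: coverage ends; appeal or pay out of pocket
print(hits_internal_target(19, 19))   # True: the case manager meets the metric
```

The shape of the logic is the point: the lower number wins, and the metric rewards matching the model rather than the patient.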

A 2023 ProPublica investigation documented a recorded 2021 phone call in which a UnitedHealthcare employee, told that a physician review had denied a chronically ill patient's treatment, laughed and said: "I knew that was coming... We're still gonna say no."

In March 2026, a federal magistrate ordered broad discovery, requiring UnitedHealth to produce documents on nH Predict's design, cost-savings projections from the naviHealth acquisition, and government investigation records. No criminal charges. No regulatory penalties specifically tied to nH Predict's deployment.


Boeing 737 MAX / MCAS: Software Substituted for an Airframe Redesign

The Maneuvering Characteristics Augmentation System (MCAS) was software added to the 737 MAX to compensate for aerodynamic instability created by mounting larger, more fuel-efficient engines farther forward on the wing. The new engine placement caused the nose to pitch up under certain conditions. Rather than redesign the airframe — which would require new FAA certification and potentially trigger mandatory simulator training for pilots, a costly requirement that made the MAX less attractive to airline customers — Boeing added software to push the nose down automatically.

The angle-of-attack (AOA) sensor type MCAS depended on had been flagged in more than 216 incident reports submitted to the FAA before the 737 MAX entered service. Boeing never flight-tested a scenario in which that sensor failed.

Boeing's chief technical pilot Mark Forkner wrote in a 2017 internal message: "I want to stress the importance of holding firm that there will not be any type of simulator training required to transition from NG to MAX. Boeing will not allow that to happen." In other messages, Forkner described "Jedi mind tricking" airline customers into believing no simulator training was required. Boeing received an internal award for keeping pilot training to a non-simulator level.

Boeing omitted MCAS from the crew operating manual entirely. The company first publicly acknowledged MCAS's existence on November 10, 2018 — twelve days after Lion Air Flight 610 crashed. Engineers had expanded MCAS's activation range mid-development from extreme flight conditions to normal g-forces and lower speeds. An internal engineer noted: "Inadvertently, the door was now opened to serious system misbehavior during the busy and stressful moments right after takeoff."
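
The structural problem the engineer's note points at is easy to state in code: MCAS read a single angle-of-attack value and commanded nose-down trim whenever that one reading crossed a threshold, activation after activation, with no cross-check against the second sensor. Below is a deliberately simplified sketch of that single-point-of-failure shape; the threshold and trim increment are invented for illustration, and this is not Boeing's implementation:

```python
def mcas_step(aoa_reading_deg: float, trim_units: float) -> float:
    """One simplified control cycle driven by a single AOA reading.

    The numbers are invented; the structure is the point: one sensor,
    no cross-check, repeated activation.
    """
    AOA_TRIGGER_DEG = 15.0   # hypothetical "nose too high" threshold
    TRIM_INCREMENT = 0.5     # hypothetical nose-down trim per activation

    if aoa_reading_deg > AOA_TRIGGER_DEG:
        trim_units -= TRIM_INCREMENT  # push the nose down
    return trim_units

# A faulty sensor stuck at an impossibly high angle keeps re-triggering the
# same nose-down command, cycle after cycle, regardless of the real attitude.
trim = 0.0
for _ in range(10):
    trim = mcas_step(aoa_reading_deg=74.5, trim_units=trim)
print(trim)  # -5.0 units of nose-down trim after ten cycles, and still counting
```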

189 people died when Lion Air Flight 610 crashed on October 29, 2018. 157 people died when Ethiopian Airlines Flight 302 crashed on March 10, 2019. 346 total deaths. In both crashes, a single malfunctioning AOA sensor repeatedly triggered MCAS nose-down commands that the crews could not overcome.

Boeing agreed to pay $2.5 billion under a 2021 deferred prosecution agreement. In May 2025, the DOJ reached a non-prosecution agreement that effectively dropped the criminal charges and required an additional $444.5 million. A federal judge dismissed the case in November 2025. Crash victims' families called it "the deadliest corporate crime in U.S. history," and no one will face criminal prosecution for it.


Meta / Facebook: Algorithmic Amplification and the Rohingya Genocide

Facebook's engagement-optimization algorithm — the system that determines what content appears in users' feeds — amplified anti-Rohingya hate speech and incitement to violence in Myanmar beginning as early as 2012. The algorithm did not distinguish between content that drove engagement through information and content that drove engagement through hatred.
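
"Optimizing for engagement" has a concrete meaning: score each post by its predicted interaction volume and show the highest scorers first. Nothing in that objective can see why people interact. A minimal illustrative ranker follows, with made-up weights and numbers; it is not Meta's actual model, only the shape of the objective:

```python
def engagement_score(reactions: float, comments: float, reshares: float) -> float:
    # Hypothetical weights. The objective measures predicted interaction volume;
    # it cannot distinguish interaction driven by information from interaction
    # driven by hatred.
    return 1.0 * reactions + 4.0 * comments + 8.0 * reshares

feed = {
    "local flood warning":      engagement_score(reactions=120, comments=15, reshares=30),
    "dehumanizing viral rumor": engagement_score(reactions=300, comments=90, reshares=200),
}

# Rank purely by score: the rumor wins, gets shown more, earns more engagement,
# and ranks higher still on the next pass. That loop is the amplification.
for post in sorted(feed, key=feed.get, reverse=True):
    print(post, feed[post])
```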

Amnesty International's 2022 report, drawing on internal Meta documents released by whistleblower Frances Haugen, found that Meta's own research from as early as 2012 indicated the company's algorithms could produce serious real-world harms. A Meta employee document dated August 2019 stated explicitly that "our core product mechanics, such as virality, recommendations, and optimizing for engagement, are a significant part of why these types of speech flourish on the platform."

In August 2017, Myanmar's military launched a campaign of killings, rape, and the burning of Rohingya villages. More than 700,000 Rohingya fled. At least 6,700 Rohingya were killed in the first month alone. The UN Human Rights Council's fact-finding mission (2018) determined that Facebook had been "a useful instrument" for those inciting violence against the Rohingya.

No FTC action. No congressional legislation passed. Meta has not admitted liability. Section 230 of the Communications Decency Act provides broad immunity from liability for third-party content in U.S. courts. Lawsuits seeking $150 billion in compensation for Rohingya refugees remain pending.


Amazon: Productivity Algorithms and the Engineered Injury Rate

Amazon's warehouse management systems track workers' "time off task" and use that data to generate discipline and termination decisions automatically. Workers are subject to algorithmic "rate" quotas — minimum units processed per hour — monitored in real time. The Senate HELP Committee's December 2024 investigation documented the mechanism:

  • Amazon's internal study Project Soteria (2020) found a direct causal relationship between speed requirements and injury rates, and recommended eliminating speed-based discipline. Amazon did not implement the recommendations.
  • A second internal study, Project Elderwand, concluded that exceeding 1,940 repetitive movements per 10-hour shift elevated injury risk. Amazon's actual quotas routinely required workers to exceed this threshold (see the sketch after this list).
  • The Senate report found Amazon manipulated its injury data — using misleading industry comparisons and recordkeeping gaps — to conceal its actual injury rate from regulators.
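
To make the mechanism above concrete, here is a purely illustrative sketch. Only the 1,940-movements-per-10-hour-shift figure comes from the Senate report; the quota, movements per unit, and time-off-task allowance are invented for the example and are not Amazon's actual parameters:

```python
# Project Elderwand's injury-risk threshold, per the Senate report.
ELDERWAND_LIMIT_PER_10H_SHIFT = 1_940
ELDERWAND_LIMIT_PER_HOUR = ELDERWAND_LIMIT_PER_10H_SHIFT / 10  # 194 movements/hour

# Everything below is hypothetical, for illustration only.
QUOTA_UNITS_PER_HOUR = 300        # invented rate quota
MOVEMENTS_PER_UNIT = 2            # invented handling motions per unit
TIME_OFF_TASK_LIMIT_MIN = 30      # invented daily "time off task" allowance

def flagged_for_discipline(units_this_hour: int, time_off_task_min: float) -> bool:
    # The automated part of the mechanism: fall below rate or exceed the
    # time-off-task allowance and the system generates the write-up itself.
    return (units_this_hour < QUOTA_UNITS_PER_HOUR
            or time_off_task_min > TIME_OFF_TASK_LIMIT_MIN)

# Meeting the invented quota implies 600 repetitive movements an hour,
# roughly three times the Elderwand injury-risk threshold.
implied_movements_per_hour = QUOTA_UNITS_PER_HOUR * MOVEMENTS_PER_UNIT
print(implied_movements_per_hour, ELDERWAND_LIMIT_PER_HOUR)                # 600 vs 194.0
print(flagged_for_discipline(units_this_hour=250, time_off_task_min=10))   # True: below rate
```

In the sketch, as in the pattern the report describes, meeting the quota and staying under the injury threshold cannot both be done.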

Amazon warehouses recorded an injury rate more than 30% above the industry average in 2023. A 2023 Strategic Organizing Center study found Amazon workers suffered serious injuries at more than double the rate of other warehouse workers. OSHA proposed fines totaling $100,000, against a company generating hundreds of billions of dollars in annual revenue.


The Pattern

In each case, the company possessed, before deployment or shortly after it, internal documentation showing its system caused harm. In each case, the decision to continue operating the system was made by executives who weighed cost and throughput against human welfare and chose profit. In each case, legal and regulatory consequences, when they came at all, were structured to avoid admissions of wrongdoing or precedent-setting criminal liability.

This is not algorithmic bias. This is not an unintended consequence. This is the deliberate weaponization of automation to launder corporate decisions through the appearance of objective computation.

The algorithm did not make these decisions. The algorithm is the alibi.