In a classic thought experiment of modern ethical philosophy, a person must choose between allowing a trolley to run over five people or pulling a lever that would divert it to another track, where it would kill only one. To its critics, this highly contrived dilemma has little practical bearing. Yet this might not always be so, writes Shmuel Reichman: if, as many expect, self-driving cars become a reality, they will be equipped with algorithms for dealing with such situations. Reichman, with this in mind, explores the halakhic ramifications of the famous “trolley problem,” beginning with a similar scenario addressed by the Talmud:
A man comes before [the sage] Rabbah with the following case: the ruler of a city has commanded him to kill another person or else forfeit his own life. May he kill the other man to save himself? Rabbah answers that he must give up his own life rather than kill his fellow man, since “who are you to say that your blood is redder? Perhaps the blood of that person is redder than yours.”
Trying to parse Rabbah’s cryptic statement, Reichman notes that some commentators read it to mean that human lives indeed differ in worth, but it is not for other humans to determine which are more worthy. He also cites a different possibility:
God created all people equal, and in His eyes, everyone possesses the same right to life. A person is not judged based on past or future actions; a human being always retains his or her innate, infinite value. Furthermore, even if it was thought that human value was determined based on the amount of future time a person possesses, [in which case, for instance, it would be better to save a young and healthy person than an old or ill one], each moment of time is of infinite value. Therefore, one minute and one year are each valued at the same nonaggregatable infinity.
Hundreds of years later, the great 20th-century halakhist Abraham Yeshayah Karelitz (known popularly as the Ḥazon Ish) formulated his own version of the trolley problem, and could not come to a definitive answer. But, notes Reichman, the problem has very real halakhic ramifications, since it could be forbidden to drive a car programmed to make faulty moral decisions.