by dorkstar » Tue Sep 01, 2020 7:54 pm
The thing about self-driving cars is that they follow the speed limits, which usually allows them to slow to a stop when they encounter an obstacle in the road. The problems people come up with tend to sound like: a couch falls off the back of a truck, should the car veer left and cut off a semi or right and cut off a motorcycle? In reality, the self-driving car does not swerve in either direction. It simply stops as quickly as it safely can, which minimizes risk.
This brings me to my second point on trolley problems. Most people find my personal approach to them very problematic: since I am neither a trolley conductor nor an executioner, I am very hesitant to touch a trolley lever at all. I can't blame myself for deaths in situations I did not construct.
These two conclusions bring me to a final point of existential contradiction. Humanity seems absolutely bent on its own destruction: displacing the natural environment, filling the oceans with trash, burning down forests to create farmland, and changing the climate will all eventually bring this massive population growth to a halt. I think the Titanic is the perfect analogy, because many people, even engineers, thought the ship was literally unsinkable. They had no physical model for an iceberg striking the hull with enough momentum to break it. Some circles even still speculate that the ship was sunk with terrorist intent, despite the evidence.
I think the reason people are so afraid of sentient computers or aliens like Thanos is that they are aware of the logical answer to humanity's conundrum. The only way to save countless lives in the future is to deny current humans some amount of resources or space. We don't actually have to kill anyone; we just have to eat less meat, create less trash, and have fewer children. But we are selfish, and few of us would throw ourselves (or even our life's savings) on the trolley tracks to save any number of babies. If anything wanted to kill off humanity, aliens or AI, it wouldn't have much of a task; it would just have to wait. The self-driving car's suicide may seem like a beautiful gesture, but I think it's a ruse.