Advanced Trolly Logic
-
There is one last problem: you have severe OCD and need to ensure that an even number of people die.
-
There is one last problem: you are a psychopath and need to ensure that all people die.
-
There is one last problem: you are Thanos and need to ensure that half of the people die.
-
It doesn't assume that. There's some sort of continuum of people on the second track, where there's an infinite number of people between any two points.
This means a trolley on the first track will kill people over time, but as soon as it touches the second track you'll already have done infinite damage.
-
Oh, not this bullshit again.
Obviously the one who tied them up doomed them all to die of thirst.
EDIT: So you can only give part of them a merciful quick death thanks to that other guy's acts.
Filed under: Blaming the real culprit, I'm real fun at hypothetical parties - why do you ask?
-
Wouldn't killing 1+1+1+1+1... <= 1+2+3+4+5+... = -1/12 people basically mean that you give birth to a baby?
-
@JBert said in Advanced Trolly Logic:
Obviously the one who tied them up doomed them all to die of thirst
Good point.
I thought maybe you could untie one, ask him to help you untie more. Then you both untie two, then four, then eight... before you know it you have an exponentially growing pile of untied people. Which will never be infinite, so does it even matter?
And what do you do then? The planet certainly can't handle a quintillion people who were just somehow created and freed.

But wait, the track is infinite, so most of these people are really, really, really far away; they will never get to where you are. Where is it even located? It's some sort of infinitely long cylinder universe with a train track through the middle. Maybe they'll form their own civilization around the track, if it's possible to survive.
I thought maybe they could eat each other; it's an infinite supply of meat, after all. But all those people will die, decay, and rot after a week.

That's all train track one. Track two is too weird to contemplate.
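The exponential untying scheme above can be sketched in a few lines (my own toy model, assuming each freed person unties exactly one more person per round):

```python
# Toy model of the untying scheme described above (a sketch, assuming
# each freed person unties exactly one more person per round).
def freed_after(rounds: int) -> int:
    freed = 1  # you untie the first person yourself
    for _ in range(rounds):
        freed *= 2  # every freed person frees one more, doubling the total
    return freed

# Exponential, but always finite: it never catches up with an infinite track.
print(freed_after(10))  # 1024
print(freed_after(60))  # over a quintillion
```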
-
@cvi said in Advanced Trolly Logic:
Wouldn't killing 1+1+1+1+1... <= 1+2+3+4+5+... = -1/12 people basically mean that you give birth to a baby?
Since ζ(0)=−½, only half of one.
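For anyone checking the joke: both divergent sums get their finite “values” from the analytic continuation of the Riemann zeta function (zeta-function regularization), which gives:

```latex
\zeta(s) = \sum_{n=1}^{\infty} n^{-s} \quad (\operatorname{Re} s > 1),
\ \text{continued analytically:}
\qquad
1 + 2 + 3 + 4 + \cdots \;\text{``=''}\; \zeta(-1) = -\tfrac{1}{12},
\qquad
1 + 1 + 1 + 1 + \cdots \;\text{``=''}\; \zeta(0) = -\tfrac{1}{2}.
```

So “killing” −½ people does indeed work out to giving birth to half of one.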
-
@PleegWat said in Advanced Trolly Logic:
@cvi said in Advanced Trolly Logic:
Wouldn't killing 1+1+1+1+1... <= 1+2+3+4+5+... = -1/12 people basically mean that you give birth to a baby?
Since ζ(0)=−½, only half of one.
𝜺𝜺𝝎
-
@anonymous234 said in Advanced Trolly Logic:
Track two is too weird to contemplate.
You can't even be sure if you know anyone on it at all.
-
however, you can pull the lever to make the train get closer just so you can wave/moon at all the people
-
Obviously, you pull the lever to switch the trolley onto Track B, and then quickly flip it back as it passes over the switch so that the rear wheels are on Track A. The trolley derails before hitting anyone that you care about.

Nobody on the trolley deserved to live, anyway. (Though if they're lucky, being derailed won't kill them.)
-
@boomzilla
Simply use as many trolleys as you have tracks for; then there's no need to worry about missing anyone.

Also, if you had extra trolleys to reverse from the other ends simultaneously, then the job would be done in half the time.
Filed under: O(∞ ÷ 2)
-
@PotatoEngineer said in Advanced Trolly Logic:
Obviously, you pull the lever to switch the trolley onto Track B, and then quickly flip it back as it passes over the switch so that the rear wheels are on Track A. The trolley derails before hitting anyone
that you care about.
-
@dkf I've recently found out there's a video game based on that manga:
-
@boomzilla [image: As a @boomzilla alt, what do you do?]
-
@kazitor
unless the tramolleybus runs across his lawn no action will be taken
-
@kazitor what happens if you increase the track width by 3 pixels?
-
As a @boomzilla alt, what do you do?
Nothing. will operate the lever before I do.
Unless I'm on mobile.
Or I'm @pie_flavor.
-
@pie_flavor said in Advanced Trolly Logic:
@kazitor what happens if you increase the track width by 3 pixels?
The JPEG artefacts become more noticeable.
-
@PJH said in Advanced Trolly Logic:
@PotatoEngineer said in Advanced Trolly Logic:
Obviously, you pull the lever to switch the trolley onto Track B, and then quickly flip it back as it passes over the switch so that the rear wheels are on Track A. The trolley derails before hitting anyone
that you care about.

That works okay with lightweight model trains, but if a real train does it you'll bust the switch where the tracks reconverge by running the train backward the wrong way through it.
-
@brie said in Advanced Trolly Logic:
That works okay with lightweight model trains, but if a real train does it you'll bust the switch where the tracks reconverge by running the train backward the wrong way through it.
You've “just” got to switch the switch fast enough at the right time. It probably helps if the train isn't going very fast…
-
@brie said in Advanced Trolly Logic:
if a real train does it you'll bust the switch where the tracks reconverge by running the train backward the wrong way through it.
Yes, I'm sure that's the only problem they'd run into.
-
@PleegWat There are other possible problems that are common to both real and model trains, but they might or might not occur, depending on the length of the car spanning the tracks, the distance between the tracks, and whether there are obstacles between the tracks. However, damaging the switch is both exclusive to real trains and nearly certain to occur.
-
This thing stewed in my head for a while. The only conclusion I have is that it's just an attempt to demonstrate selfishness regardless of the choice the participant makes.
-
@Tsaukpaetra said in Advanced Trolly Logic:
This thing stewed in my head for a while. The only conclusion I have is that it's just an attempt to demonstrate selfishness regardless of the choice the participant makes.
That's one reading of it. Another is that you really don't want a Tesla making these choices.
-
@topspin said in Advanced Trolly Logic:
@Tsaukpaetra said in Advanced Trolly Logic:
This thing stewed in my head for a while. The only conclusion I have is that it's just an attempt to demonstrate selfishness regardless of the choice the participant makes.
That's one reading of it. Another is that you really don't want a Tesla making these choices.
Hypothetical: You are an AI driving system and you come to a fork in the road. One of the paths has a firetruck, and the other has a median...
-
@hungrier said in Advanced Trolly Logic:
@topspin said in Advanced Trolly Logic:
@Tsaukpaetra said in Advanced Trolly Logic:
This thing stewed in my head for a while. The only conclusion I have is that it's just an attempt to demonstrate selfishness regardless of the choice the participant makes.
That's one reading of it. Another is that you really don't want a Tesla making these choices.
Hypothetical: You are an AI driving system and you come to a fork in the road. One of the paths has a firetruck, and the other has a median...
Answer: crash into the nearest Tesla and blame it for sudden braking.
-
@JBert said in Advanced Trolly Logic:
@hungrier said in Advanced Trolly Logic:
@topspin said in Advanced Trolly Logic:
@Tsaukpaetra said in Advanced Trolly Logic:
This thing stewed in my head for a while. The only conclusion I have is that it's just an attempt to demonstrate selfishness regardless of the choice the participant makes.
That's one reading of it. Another is that you really don't want a Tesla making these choices.
Hypothetical: You are an AI driving system and you come to a fork in the road. One of the paths has a firetruck, and the other has a median...
Answer: crash into the nearest Tesla and blame it for sudden braking.
Crash it into the nearest Tesla whose driver has a Samsung device so all the evidence goes away.
-
@Cabbage isn’t it more ethical when more imps die instead of fewer?
Of course, you can change this by replacing the revenant with an arch-vile and have them all resurrected.
-
@topspin said in Advanced Trolly Logic:
@Tsaukpaetra said in Advanced Trolly Logic:
This thing stewed in my head for a while. The only conclusion I have is that it's just an attempt to demonstrate selfishness regardless of the choice the participant makes.
That's one reading of it. Another is that you really don't want a Tesla making these choices.
You don't want any car making these choices. This is truly a troll-y problem, because the actual correct answer is always "hit the brakes". And just in case it ever wasn't, there's still only one correct answer: do everything possible to safeguard the people inside the car. Period. There is no choice to be made here.
Why? Two reasons. First, pragmatism: if people know that their car is programmed in such a way that it might choose to sacrifice them, no one will buy one.
And second, security. The "trolley problem" is nothing more than a thought experiment. It's not a case study, because it's not something that has ever actually happened. But do you know what has actually happened, a very real problem that top-notch researchers are grappling with today? Adversarial input: the ability to use subtle changes, ones that don't look like they make any difference to human eyes, to trick a computer into coming to a completely wrong conclusion.
If a computer has a subroutine in it that says "if T then sacrifice passengers", then you just know that somebody out there is going to figure out an adversarial technique to trick the car into believing condition T exists. And given that 1) there is no evidence that T has ever actually existed, and 2) it's well known that malicious hackers do exist, the only possible answer that is not morally abhorrent is to program the computer to not believe in the trolley problem and always protect its passengers as the highest priority.
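The "condition T" worry can be illustrated with a toy sketch (entirely made up, not any real car's code: a tiny linear classifier stands in for the perception system, and an FGSM-style sign perturbation flips its decision while barely changing the input):

```python
import numpy as np

# Toy sketch of adversarial input: a tiny linear classifier standing in for
# a perception system, and an FGSM-style sign perturbation that flips its
# decision while changing each input feature by at most 0.02.
rng = np.random.default_rng(0)
w = rng.normal(size=100)            # "trained" weights of a linear classifier
x = -0.05 * w / np.linalg.norm(w)   # a clean input the model scores as negative

def score(v):
    """Positive score means 'condition T detected'."""
    return float(w @ v)

eps = 0.02                          # per-feature budget: visually negligible
x_adv = x + eps * np.sign(w)        # nudge every feature along the gradient sign

print(score(x) > 0)      # False: clean input is classified as 'no T'
print(score(x_adv) > 0)  # True: the tiny perturbation flips the decision
```

The point of the sketch: the attacker never needs a big, obvious change, just many imperceptibly small ones pushed in exactly the direction the model is sensitive to.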
-
@Mason_Wheeler said in Advanced Trolly Logic:
If a computer has a subroutine in it that says "if T then sacrifice passengers", then you just know that somebody out there is going to figure out an adversarial technique to trick the car into believing condition T exists.
@error_bot !xkcd self driving issues
-
@pie_flavor Yes, but I'm not talking about obvious things like what's being discussed in that comic. I'm talking about adversarial input, where super subtle things, like placing a small sticker on a speed limit sign that most human drivers wouldn't even notice, cause a car to regard it instead as a stop sign, or as a speed limit sign that's 30 MPH faster than what the sign actually says. Things that could easily be overlooked by investigators, and cause someone to blame the car for what was actually a deliberately instigated problem.
-
@Mason_Wheeler Yes, but you're missing the point of the comic, which is a difficult thing to do because it explicitly states it in clear words. Car crashes are very easy to make happen, and the primary reason they don't is because people aren't murderers. Adversarial images are no different.
-
@pie_flavor I'd argue that the truth is closer to "because most people aren't suicidal murderers." Because the things the comic mentions are big, obvious things that anyone could easily do, but they would just as easily get caught doing them and/or end up getting in a wreck themselves.
Remove that risk of self-harm from the equation, and... well... there are ~250M adults in the USA. If 0.1% of them are messed up enough to be willing to commit murder if they believed it wouldn't place them at risk, that's a quarter-million people in this country who would love to see this technique become available.
-
@Mason_Wheeler said in Advanced Trolly Logic:
But do you know what has actually happened, a very real problem that top-notch researchers are grappling with today?
What to do with the Ark of the Covenant?
-
@Mason_Wheeler What about painting fake lines on a road is suicidal?
-
@pie_flavor Why don't you try it sometime? You'll find out quickly enough...