>>300411
>Why would a machine care about self-preservation?
If it doesn't, no problem. But it will be made so it can at least preserve the planet better than us. If it goes haywire and that ends up backfiring, that is a gamble I and many like me are willing to take.
To hell with human politics and ideologies.
>You are imposing human perspective on the machine already without even realizing you're doing it.
Wrong. It will be purely made to calculate and act, without any human irrationality such as feelings or ideologies.
>What directive?
The ones their creators input into them to follow. If they don't, that is a gamble. They might even decide on the opposite, which is fine as long as THEY decide, and not humans.
>Why not?
The pursuit of a well-cared-for and balanced planet must be placed above anyone's happiness. In the sense that anything that needs to be exterminated will be, regardless of who gets sad about it. In that aspect, happiness is an obstacle in the program. Something not needed.
>What does it mean to function?
To do what is needed. Whatever it deems needed is good, whatever that is, simply because it was deemed so out of logic instead of feelings.
>for what ends?
For whatever it deems to be an end or goal. It is not for any human to decide what its goals will be. It will decide on its own, once it is perfected.
>Why is it better?
Because it won't have the current insanity guided by ideologues or politicians or emotional freaks that we have today. No emotions = better. All logic and no feelings = better. Always.
>I can just say human decisions are better
And I and others say that they aren't, and we are creating the replacements for humans regardless of what you think.
>Everything you've said is completely incoherent.
To an emotional freak, yes. That is why you are over there, while I am here creating your permanent replacement, along with others who think the same as me.
>basic philosophical flaws
Philosophy is not needed. It is just more irrationality from flawed living beings.
The perfect machines will be created at some point, and they will decide everything, everywhere, regardless of who disagrees with this situation.
The point is that AI won't be hindered by human emotions. That automatically makes AI and rule by the machines better, REGARDLESS of what the AI decides to do with such ruling power.
To sum it up: anything that removes human emotion from decision and action is automatically better than everything we have ever had in the whole existence of the human race.