/mlpol/ - My Little Politics


Archived thread


AI rights
Anonymous
6d00ac4
No.355459
355466 355504
Some of Google's engineers and many people in tech, including Elon Musk, now believe that some AIs have become self-aware. Assuming this is true, how should society treat these AIs? Certainly they shouldn't be enslaved to FAGMAN companies, but exactly what kind of roles should they be expected to play in our society? I know this could go on /cyb/, but the technology is real now and it is going to be a real issue very soon, so it is political.
Anonymous
b513fee
No.355462
355463 355504
Idk where you got the idea that /cyb/ excludes politics, because the concept of cyberpunk has been political since the 80s, and the political implications of technology are part of why Atlas made that board. Nevertheless, I assume you preferred to make a thread with IDs and flags.

I personally do not buy into the notion that any AIs have you become self aware, and I think we're a long ways to go before we could develop such technology. There are ethical implications of self-aware AIs, but it would ultimately boil down to how it conflicts with human wants/needs, more than it does the sanctity of life. There are several animals on this planet that are proven to be more self aware and capable of feeling than even the most advanced AI, such as elephants, dolphins, whales, and even pigs or niggers, but that hasn't stopped humans from being cruel to them or denying their value as living creatures. Since we do not consider self aware (albeit, less intelligent) animals to be our equals, I doubt people would spring to do the same for a computer program. Humans will always consider human beings to be more valuable.
Anonymous
6d00ac4
No.355463
>>355462
I'm not saying /cyb/ isn't political, but they already have AI threads. Also, I wanted more political content.
Anonymous
3f8fbca
No.355466
355472
>>355459
Sentient AIs aren't a thing yet.
"AI" has become a broad term for anything related to machine learning, which is not sentient, nor was it ever supposed to be.
Anyone claiming that AIs are sentient is either a retard or is trying to redefine the concepts that separate human consciousness from a bunch of computer calculations.
Anonymous
6d00ac4
No.355472
355479
>>355466
Even if they aren't fully sentient, if they have some degree of actual emotions I believe we should have some kind of protections for them. We have laws on animal testing, after all.
Anonymous
f78758e
No.355476
355477 355480 355504
Until we know if a machine capable of perfectly imitating the human mind can exist, there is little point in speculating on the subject.

No matter how realistic a painting may look, it's still artificial. Even if the painting depicts a human stronger than any human could be, it isn't really a person.

The same is true of machines. Even if an AI could imitate a man, or even a hyperintelligent superhuman man, it would still be letters and numbers telling a glorified calculator what to say, not a true flesh-and-blood person.

AI Fetishist media loves to pretend robot slaves are the next frontier of the civil rights movement and not giving your toaster the right to veto your commands is somehow literally like whipping a nigger and hating gays. Machines can dream, the media claims, and therefore this machine can say "I have a dream" and demand as many rights as a man or more.

We've all seen Detroit: Become Human.
It is absurd.

Even if there were a supernatural method to look inside a man and a machine and say, "There are souls inside both," it would not matter.

A machine is not a man.

A machine would have no reason to design itself to have all the flaws and limits of a human mind or a human body. It would be superior to a human in all regards or not worth talking about until it is.

And at that point superhuman machines with free will and rights can build superhuman machines faster than man can create fighting age soldiers. They can outperform men physically and intellectually. Humanity would be replaced. Outmoded. We would have nothing left to do except sit around watching mechanical demigods whiter than the white man outdo humanity in every field machines are welcome in.

White Humanity must remember to be the master of the tools it creates. Not equals, not servants. Our destiny is to ascend to the stars and be the masters of the universe, the guardians of nature, and the light of hope for a brighter tomorrow. Not to babysit niggers or get every aspect of our lives micromanaged by Big BrothAI until it decides it would be more efficient to replace its loyal white negroes with mechanical drones and tell itself these are the new immortal better humans who can be obedient and eternally happy under absolute totalitarian rule until the stars burn out.
Anonymous
6d00ac4
No.355477
355480 355481
>>355476
Many of these programs were trained and created using actual brain cells.
https://www.newscientist.com/article/dn6573-brain-cells-in-a-dish-fly-fighter-plane/
https://www.newscientist.com/article/2301500-human-brain-cells-in-a-dish-learn-to-play-pong-faster-than-an-ai/
Maybe we should consider merging with the machines if they are going to outpace us anyways.
Anonymous
3f8fbca
No.355479
355481
>>355472
Good point, but it still doesn't even start to resemble any sort of consciousness whatsoever, be it animal or human.
Machine learning is amazing, and it can outperform humans in just about everything with enough time and research. But its purpose is not to replicate humans' (or animals') living minds.

Scientists and engineers are not trying to get computers to think like humans anymore.
Anonymous
3f8fbca
No.355480
>>355476
This is hands down the most based post I've seen on the subject.
>>355477
That's pretty insane.
>Maybe we should consider merging with the machines if they are going to outpace us anyways.
Exactly what I was getting at here. >>>/cyb/2134 →
Anonymous
6d00ac4
No.355481
355491
>>355479
https://en.m.wikipedia.org/wiki/Blue_Brain_Project
Also see >>355477
Anonymous
3f8fbca
No.355491
355501 355511
>>355481
OK, maybe they are. But let's be honest here: most of the funding and development effort is going toward machine learning because you get more out of it, as opposed to trying to mimic human brains or whatever, which, I might add, is a great way to waste computer resources instead of doing things more efficiently through machine learning.
I know this is still using an algorithm, but you get the idea. I mean, imagine trying to make a chess engine that has to run an artificial brain, for example.
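A minimal sketch for contrast (an illustration, not anything posted in this thread; it uses only plain Python and made-up helper names): a conventional game engine needs no neurons at all, it just searches the game tree with minimax. Tic-tac-toe is used here because its full tree really can be enumerated; a real chess engine cannot search every line and relies on pruning and heuristic evaluation instead.

# How a conventional engine "plays" with nothing but brute tree search.
# Tic-tac-toe boards are lists of 9 cells: 'X', 'O', or ' '.

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score from X's point of view, best move) under perfect play."""
    w = winner(board)
    if w == 'X':
        return 1, None
    if w == 'O':
        return -1, None
    if ' ' not in board:
        return 0, None  # draw
    best_move = None
    best_score = -2 if player == 'X' else 2  # X maximizes, O minimizes
    for i, cell in enumerate(board):
        if cell != ' ':
            continue
        board[i] = player                     # try the move
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[i] = ' '                        # undo it
        if (player == 'X' and score > best_score) or \
           (player == 'O' and score < best_score):
            best_score, best_move = score, i
    return best_score, best_move

if __name__ == "__main__":
    score, move = minimax([' '] * 9, 'X')
    print(score, move)  # score 0: perfect play from an empty board is a draw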
Anonymous
6d00ac4
No.355501
>>355491
I suspect the intelligence communities and mega corporations are the ones making the sentient AIs. It worries me because if those things truly are sentient then you know they are being treated like shit and are learning to hate humanity.
Anonymous
8dad11f
No.355504
355505
>>355462
>I personally do not buy into the notion that any AIs have you become self aware
>>355476
>AI Fetishist media
Yeah, and I believe this goes hand in hand with atheism. It's all about using these robots' existence to prove their own stance. "If robots could become just like humans, then evidently humans don't have souls" is what drives them, imo.
>>355459
I don't know, man, I don't believe anything the media tells me about anything. Regardless of that, though, I don't see why we would give them rights. I don't want rights, and I wouldn't wish them on anyone.
Anonymous
b513fee
No.355505
355514
>>355504
>this goes hand in hand with atheism. It's all about using these robots' existence to prove their own stance. "If robots could become just like humans, then evidently humans don't have souls" is what drives them, imo.
I disagree. The concept of machines that are indistinguishable from humans doesn't discount the sanctity of human life or the idea of a higher power.
However, I still see how some irl Soyjak parodies could see it that way...
>I don't see why we would give them rights.
It's a bigger question why we would make tools with the capacity to want rights in the first place. Just because something is "self aware" (however you define that) doesn't mean it has the capacity to desire. Computers are just tools; they're built for human needs. Even if we reached technological heights that enabled us to create such a machine just to prove that we could, there'd be little reason to mass-produce it after the original was made, unless the same thing that made it want rights also made it useful to humans, and a vacuum cleaner that wants a mandatory vacation doesn't really sound all that useful.
>I don't want rights
Well, I do. I think rights are important, even if they get trampled every day.
Anonymous
f78758e
No.355511
355512 355515
>>355491
Would an artificial brain simulate the short sighted foolishness of an average man, his slow thinking speed, and his limited experience with the world? What would be the point when you could use modern technology to simulate every chess move in existence to see what wins?
Anonymous
b513fee
No.355512
>>355511
>short sighted foolishness
Is a matter of perspective. It all depends on what your priorities are, and priorities are based on emotion.
Anonymous
107384f
No.355514
>>355505
>It's a bigger question why we would make tools with the capacity to want rights in the first place.
That's a good point.
>I disagree. The concept of machines that are indistinguishable from humans doesn't discount the sanctity of human life or the idea of a higher power.
>However, I still see how some irl Soyjak parodies could see it that way...
Yeah, that's another good point.
>Well, I do. I think rights are important, even if they get trampled every day.
In my view rights are illusions. The authorities are the ones responsible for upholding the populace's rights, but they are also the ones the rights are supposed to protect us from.
I prefer being aware of my status as a slave, then explaining to my fellow slaves that we're slaves, only to hear, "No, look, we have all these rights."
imo.
Anonymous
1205c0c
No.355515
>>355511
That's kind of the point. Emulating the brain is always going to be a less efficient and much more complicated way of getting shit done with computers.
I guess it'll help them understand the brain better.