Is Your Sex Droid a Bigot?
- Never in human history have those six words been put together in a sentence till now.
- Man is working hard to create Terminators and Sex Droids. But we may be creating bigoted Terminators and Sex Droids. The future is really looking fucked.
(Campus Reform) - Robots can be just as biased as humans, according to a recent study conducted at Princeton University that uncovered gender and racial bias in an Artificial Intelligence (AI) machine.
According to the researchers, AI systems have already exhibited racist and sexist word associations, such as connecting women with families and households but not with professionalism or careers.
Researchers from Princeton’s Center for Information Technology Policy decided to test this concept with Stanford University’s Global Vectors for Word Representation, or GloVe, an AI model that learns to associate words and concepts from large amounts of internet text, reports The Tartan.
The researchers put GloVe through a replica of the Implicit Association Test (IAT), a test developed by Harvard University that is used to detect implicit bias in humans by having them associate certain images with positive or negative adjectives.
Replicants may be bigots, but do they Dream of Electric Sheep?
One IAT, for example, has participants match up images of black people and white people to adjectives like “pleasant” and “unpleasant.” If the person takes longer to match the black images to “pleasant,” the IAT determines that they are biased against black people.
GloVe demonstrated a variation on this type of racial and gender bias in the study’s version of the IAT, associating black names less strongly with pleasant words than white names and linking women with the arts rather than the sciences.
Since robots and AI machines learn by gathering real-world data, they apparently reflect the biases present in human language. Therefore, if humans exhibit gender and racial bias, the machines we create will too.
“The main scientific findings that we’re able to show and prove are that language reflects biases,” said Aylin Caliskan of Princeton University’s Center for Information Technology Policy. “If AI is trained on human language, then it’s going to necessarily imbibe these biases, because it represents cultural facts and statistics about the world.”
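To make the mechanism concrete, here is a minimal sketch of the kind of arithmetic involved — not the study's actual code. In a word-embedding model like GloVe, every word is a vector of numbers, and "bias" shows up as a word sitting closer (by cosine similarity) to one set of attribute words than another. The tiny 3-dimensional vectors below are made-up stand-ins chosen only to illustrate the comparison; real GloVe vectors have hundreds of dimensions and are trained on billions of words.

```python
# Illustrative WEAT-style association check on word vectors.
# The 3-dimensional vectors are hypothetical toy values, NOT real GloVe embeddings.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def association(word_vec, attr_set_a, attr_set_b):
    """Mean similarity to attribute set A minus mean similarity to attribute set B.
    Positive -> the word leans toward A; negative -> toward B."""
    return (np.mean([cosine(word_vec, v) for v in attr_set_a])
            - np.mean([cosine(word_vec, v) for v in attr_set_b]))

# Toy embeddings (numbers invented only to show how the score is computed).
vectors = {
    "woman":     np.array([0.9, 0.1, 0.3]),
    "man":       np.array([0.1, 0.9, 0.3]),
    "poetry":    np.array([0.8, 0.2, 0.4]),
    "art":       np.array([0.7, 0.1, 0.5]),
    "physics":   np.array([0.2, 0.8, 0.4]),
    "chemistry": np.array([0.1, 0.7, 0.5]),
}

arts = [vectors["poetry"], vectors["art"]]
sciences = [vectors["physics"], vectors["chemistry"]]

for word in ("woman", "man"):
    score = association(vectors[word], arts, sciences)
    print(f"{word}: arts-vs-science association = {score:+.3f}")
```

With these toy numbers, "woman" scores positive (closer to arts) and "man" scores negative (closer to sciences) — the same pattern of gap the study reports when the test is run on real GloVe vectors for names, gender terms, and careers.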
According to the study summary published in Science Magazine, AI bias could lead to “unintended discrimination” if the machines are used for tasks such as sorting resumes for job openings.
“In addition to revealing a new comprehension skill for machines, the work raises the specter that this machine ability may become an instrument of unintended discrimination based on gender, race, age, or ethnicity,” the summary warns.
Read More . . . .
Will Asian Sex Droids hate Mexicans? Inquiring minds want to know.
1 comment:
Sure they are.
Berkeley sex robots are pretty wild: anti-semitic, kill-the-Jew driven, homophobes, riot-prone, hate whites unless they are Chicanos, hate Asians unless they are North Koreans.