Tuesday, October 16, 2018

A.I. finds culturally based "politically correct" identity politics factually incorrect!

From here:

Amazon's artificial intelligence: Too rational for PC culture
Amazon’s high-powered recruiting computers are apparently sexist. So, they have been fired.
In 2014, Amazon programmers began developing what one insider described as the “holy grail” of artificial intelligence (AI) applications. The plan was for the computer to sort intelligently through the massive number of resumes and select only the top candidates for hire.

But there was a problem.

The computer analyzed personnel data spanning a 10-year period and learned the patterns behind hiring top talent, the kind of talent that built Amazon into the economic and technical juggernaut it is today. By 2015, the company realized the system was not behaving in a gender-neutral manner. The AI had taught itself to penalize resumes containing the word “women’s,” as in “women’s studies” or “women’s business club.” It also downgraded graduates of all-women colleges in the scoring process. Developers edited the system to be neutral regarding those particular terms, but soon realized the machine was capable of learning other patterns that would let it identify and discriminate against female candidates. Ultimately, Amazon pulled the plug on the entire project before it was ever fully deployed.
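
To see how a model can teach itself that kind of penalty, consider a minimal sketch. This is not Amazon's system; the resumes, labels, and term choices below are fabricated purely for illustration. A classifier trained on historical hiring decisions that happen to skew against resumes containing female-coded terms will assign a negative weight to those terms, even though nobody programmed it to.

# Minimal sketch (not Amazon's system): a classifier trained on
# historical hiring outcomes picks up a proxy feature like the word
# "women's". All data here is fabricated for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy "historical" resumes: past hires (label 1) skew male-coded,
# past rejections (label 0) skew toward female-coded terms.
resumes = [
    "chess club captain, software engineering intern",            # hired
    "software engineering intern, hackathon winner",              # hired
    "women's chess club captain, software engineering intern",    # rejected
    "women's business club, engineering coursework",              # rejected
]
labels = [1, 1, 0, 0]

vectorizer = TfidfVectorizer()  # lowercases; default tokenizer drops "'s"
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, labels)

# Inspect the learned weight for the token "women": it comes out
# negative, i.e. the model has taught itself to penalize the term.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])

Note that zeroing out the "women" weight would not solve much: any other token correlated with the historical outcome (a college name, a hobby, a sport) can serve as a proxy, which is exactly the problem the article describes Amazon running into before it scrapped the project.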

Stories like these are fascinating for two reasons.

First, it is uncharted territory. There has never been a time in human history when we had to concern ourselves that machines we built were learning things we do not want them to learn and going about their work accordingly.

Second, it is a very stark demonstration of our own cultural taboos.

For the past 150 years or so, Western civilization has enforced a strict moral code under which certain patterns are not to be recognized.

That taboo is an increasing problem for machine learning, which recognizes patterns whether or not we approve of them.

Artificial intelligence is simply a machine. AI is rational. It does not understand taboos. Computers do not understand sacred cows. Machines cannot comprehend the victimhood caste system in which we live or the web of unspoken social rules to which we adhere. They only know data. The news stories reporting that Google Photos had tagged images of black people as gorillas (and the subsequent outrage) are another example of this. Since Google’s engineers could not get the Photos app to stop making that cultural mistake, they finally gave up earlier this year and simply blocked the term “gorilla” from the app’s search function.
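
The reported fix was blunt: rather than retrain the model, simply refuse to return the offending label. Here is a minimal sketch of that kind of blocklist; this is an assumption about the general approach, not Google's actual code, and the names are hypothetical.

# Hypothetical blocklist filter illustrating the blunt fix described
# above (an assumption about the approach, not Google's code).
BLOCKED_LABELS = {"gorilla"}

def filter_labels(predicted_labels):
    # Drop any blocked label before it reaches search or display.
    return [label for label in predicted_labels
            if label.lower() not in BLOCKED_LABELS]

print(filter_labels(["person", "gorilla", "forest"]))
# -> ['person', 'forest']

The filter does not make the underlying model any less prone to the mistake; it just hides one specific output, which is why it reads as giving up rather than solving the problem.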

The machine could not be made to understand cultural taboos or hurt feelings.

Humans do this, too. Every so often an Asian leader will comment on the poor educational performance or crime rates of blacks and Hispanics in America versus whites and Asians. They do not understand that they are not supposed to notice such patterns unless they blame them on a socially approved excuse. In the same way, African leaders in recent years have made statements or taken actions regarding homosexuals that caused a stir in the gay community of the Western world. It is the stark cultural distinction that makes these stories newsworthy.

These are cultural misunderstandings. Cultures have different histories and taboos that are woven into our interactions, languages and conduct. AI is an ultimately rational entity without a social culture. It only knows and produces hard reality. It is not worried about being fired, about what its neighbors will think, or about being scorned by society. It only knows its task, and it completes that task without shading the results to reach a conclusion acceptable to a sliding scale of social groups within a culture.

The persistence of artificial intelligence in offending social taboos starkly illustrates how far political correctness has separated Western civilization from rational behavior.
