Face recognition technology has advanced by leaps and bounds in recent years, and several commercial products can already identify the gender of a person in a photo. When the person in the photo is a white man, the software is right about 99% of the time; when the person is a dark-skinned woman, the error rate is far higher. Just how differently does face recognition perform across races and genders? A new study measures exactly this, and the results show that the darker the subject's skin, the lower the accuracy: when identifying dark-skinned women, the error rate approaches 35%.
The study, by MIT Media Lab researcher Joy Buolamwini, shows that some of the biases of the real world have seeped into artificial intelligence (AI), the technology that face recognition is built on.
Skin color matters in machine vision technology
Face recognition algorithms from Microsoft, IBM, and Face++ all had higher error rates when identifying dark-skinned women than when identifying light-skinned men:
In a group of 385 photos of lighter-skinned men, the gender classification error rate was 1%.
In a group of 296 photos of lighter-skinned women, the gender classification error rate was 7%.
In a group of 318 photos of darker-skinned men, the gender classification error rate was 12%.
In a group of 271 photos of darker-skinned women, the gender classification error rate was 35%.
In modern AI, data is key: an AI system is only as good as the data used to train it. If the training data contains far more white men than black women, the system will be worse at recognizing black women. Another study showed that in one widely used face recognition data set, more than 75% of the images were male and more than 80% were white. This new study therefore raises a question: as investment in and adoption of AI keep growing, how can its fairness and accountability be guaranteed?
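One concrete way to check for this kind of skew is to audit a training set's demographic composition before any model is trained. Below is a minimal sketch of such an audit, assuming a hypothetical metadata file (face_dataset_metadata.csv) with one row per image and illustrative "gender" and "skin_type" columns; neither the file nor the column names come from the study itself.

```python
# Minimal sketch of a training-data composition audit.
# The CSV path and the "gender"/"skin_type" columns are assumptions for illustration.
import csv
from collections import Counter

def audit(path: str) -> None:
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    total = len(rows)
    for field in ("gender", "skin_type"):
        counts = Counter(row[field] for row in rows)
        for value, n in counts.most_common():
            print(f"{field}={value}: {n} images ({n / total:.1%})")

audit("face_dataset_metadata.csv")  # might reveal, say, >75% male and >80% lighter-skinned
```

A skew surfaced this way can then be addressed by collecting more images of the underrepresented groups or by reweighting them during training.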
Today, commercial companies deploy face recognition software in many ways, including targeting product promotions based on social media photos. Some companies are also trying to fold face recognition and other AI technologies into automated decision-making, such as hiring and lending decisions. Researchers at Georgetown University's law school estimate that law enforcement face recognition networks cover 117 million American adults (drawing on photographs of criminals and suspects taken by police), and that African Americans are the most likely to be singled out, because they make up a disproportionately large share of those databases.
Face recognition technology is rarely regulated
Suresh Venkatasubramanian, a professor of computer science at the University of Utah, said: "The time has come to take a serious look at how AI systems work and where they fail, and to hold them accountable from a social perspective." There have already been examples of computer vision technology making discriminatory mistakes. In 2015, for instance, Google's image recognition photo app labeled African Americans as "gorillas," and Google later apologized. Sorelle Friedler, a computer scientist at Haverford College, said experts have long suspected that face recognition software performs differently on different groups of people. "But this is the first study I know of that demonstrates that difference," Friedler said.
Buolamwini, a 28-year-old African-American computer scientist, has experienced face recognition bias firsthand. As an undergraduate at Georgia Tech, she found that face recognition technology worked well on her white friends but could not recognize her own face. She assumed the flaw would soon be fixed. But a few years later, when she arrived at the MIT Media Lab, she ran into the same problem: the software could identify her face only when she put on a white mask. By then, face recognition software was increasingly moving out of the lab and into mainstream society. "This is a very serious matter," she recalled thinking. "It's time to do something." So she turned her attention to fighting bias in digital technology. Buolamwini, a Rhodes Scholar and Fulbright researcher, now advocates "algorithmic accountability" and works to make automated decisions more transparent, explainable, and fair. Her TED talk on coded bias has been viewed more than 940,000 times, and she founded the Algorithmic Justice League, a project designed to raise awareness of the issue.
Experiments with face recognition software from three companies
Buolamwini will present a newly published paper at a conference this month. For the paper she studied the performance of face recognition systems from Microsoft, IBM, and China's Megvii, measuring how well they identify the gender of people with different skin colors. She chose these companies because their face analysis software offers gender classification and their code is publicly available for testing.
She found that the software from all three companies has room to improve. Buolamwini built a test data set of 1,270 faces, using images of parliamentarians from countries with high proportions of women in parliament, including three African countries with predominantly darker-skinned populations and three Nordic countries with predominantly lighter-skinned populations. She then graded the African and Nordic faces according to the six-point scale dermatologists use to classify skin, a medical classification that is more objective and precise than labeling by race. She labeled each face by gender and skin type and ran the images through the three companies' software. Microsoft's error rate for darker-skinned women was 21%, while IBM's and Megvii's were close to 35%. All three companies' error rates for lighter-skinned men were below 1%.
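The core of this methodology is a disaggregated evaluation: the error rate is computed separately for each intersection of skin type and gender rather than as a single overall accuracy number. Below is a minimal sketch of that calculation, using made-up records and assumed field names rather than Buolamwini's actual data.

```python
# Sketch of a disaggregated (intersectional) error-rate evaluation.
# The records below are illustrative; field names and values are assumptions.
from collections import defaultdict

records = [
    # (true_gender, skin_type_I_to_VI, predicted_gender)
    ("female", "V", "male"),
    ("female", "II", "female"),
    ("male", "VI", "male"),
    ("male", "I", "male"),
    # ... one tuple per test image
]

def error_rates(records):
    totals, errors = defaultdict(int), defaultdict(int)
    for true_gender, skin_type, predicted in records:
        # Collapse the six dermatological types into darker (IV-VI) vs. lighter (I-III).
        tone = "darker" if skin_type in ("IV", "V", "VI") else "lighter"
        group = (tone, true_gender)
        totals[group] += 1
        if predicted != true_gender:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

for group, rate in sorted(error_rates(records).items()):
    print(group, f"{rate:.1%}")
```

Reporting results per subgroup in this way is what exposes a gap that a single aggregate accuracy figure would hide.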
Buolamwini released the results of the research. IBM said in a statement that it has steadily improved its face analysis software and is committed to being "unbiased" and "transparent." IBM said a software upgrade being released this month would improve accuracy on darker-skinned women by nearly tenfold. Microsoft said it has "taken steps to improve the accuracy of face recognition technology" and is investing resources in research on how to "identify, understand and eliminate bias."
Buolamwini said Megvii's Face++ software is widely used in China for online payments and ride-hailing services, but the company did not respond to requests for comment. Buolamwini has released her data set for others to use. She called her research "a starting point for solutions, essentially a first step." She has also gone further, working with the IEEE, a major professional organization for computing, to set up a group that is developing accountability and transparency standards for face analysis software. She also meets regularly with other scholars, public policy groups, and philanthropies concerned about AI. Ford Foundation president Darren Walker said the new technology could be a "platform for opportunity," but that it will not be if it replicates and amplifies the biases and discrimination of the past. "A battle over fairness, inclusion, and justice is being fought in the digital world," Walker said.