What has not been noted, however, is the way in which these systems likely discriminate against people with disabilities. The problem people with disabilities face with this kind of AI is that, even if they have a strong set of positive qualities for a given job, the AI is unlikely to highlight those qualities and could generate low scores ...
However, people with disabilities will not benefit if their qualities manifest physically in ways the algorithm has not seen in its training data. If their facial attributes or mannerisms differ from the norm, they get no credit, even if their traits would be just as beneficial on the job.
Broadly, this issue of coverage (whether the training data contains enough relevant examples) is a genuine concern whenever AI systems are applied to people with disabilities.
Google researchers demonstrated that some AI treats language about having a disability as inherently negative. As another problematic example, imagine how driverless cars might learn human movements in order to avoid pedestrians' paths. This is a type of situation in which humans still dramatically outperform AI: choosing not to narrowly interpret a...
About 13% of Americans have a disability of some kind, and they already face worse employment outcomes. Their unemployment rate stands at 6.1%, twice that of people without disabilities.