2015-08-31

Technology Fueling Negative Mood Transition

Daily Mail: It's not just humans, COMPUTERS can be prejudiced too: Software may accidentally sort job applications based on race or gender
You might think that computer software operates without bias because it uses code to reach conclusions.

But scientists have found that programs used to sort job applications and loans could be just as prejudiced as humans.

Machine learning algorithms used in some CV scanning software may make generalisations based on data in an application that accidentally mimic human discrimination based on race or gender, they warn.
If you write an algorithm to sort candidates on objective criteria, it still exhibits racial "bias." The conclusion drawn will not be that bias in society is overstated, even though machines with no prejudice of their own produce results similar to those of human decision makers. Instead, the software will be hobbled to make it conform to ideology. This is the anti-scientific bias that permeates periods of negative mood.
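To see how this happens mechanically, here is a minimal sketch (my own illustration, not the researchers' code): a classifier is trained on historical hiring decisions that were partly driven by group membership. The protected attribute is removed from the inputs, but a correlated proxy feature is left in, and the model reproduces the disparity anyway. All data here is synthetic and the feature names are hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (never shown to the model)
group = rng.integers(0, 2, size=n)
# A proxy feature correlated with group (e.g., neighborhood)
proxy = group + rng.normal(0, 0.5, size=n)
# A genuinely job-relevant feature, identically distributed in both groups
skill = rng.normal(0, 1, size=n)
# Historical decisions were partly driven by group membership (past bias)
hired = (skill + 1.5 * group + rng.normal(0, 0.5, size=n)) > 1.0

X = np.column_stack([skill, proxy])   # protected attribute excluded
model = LogisticRegression().fit(X, hired)

# The model still selects group-1 applicants at a much higher rate,
# having learned the historical bias through the proxy feature.
for g in (0, 1):
    print(f"selection rate, group {g}: {model.predict(X[group == g]).mean():.2f}")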

Meanwhile, others are using technology to reject integration of all types.
Websites could soon block people from accessing their content due to their race or gender following the development of software that checks the genetic profile of users.

The Genetic Access Control application works by requesting access to information compiled by DNA analysis firms like 23andMe before allowing users to access a website.

This would allow the website to check the user's ethnicity or gender to limit or deny them access, according to the programme's creator.

The software, which has already been blocked by 23andMe, relies upon the user having already had their DNA profile analysed by genetic firms.
23andMe has blocked the software, but it can't stop people from supplying the information on their own.
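The pattern described is an ordinary API gate, which is what makes it hard to stamp out. Below is a hedged sketch of the flow: the site obtains an OAuth token for the visitor's DNA profile, fetches an ancestry summary, and allows or denies access. The endpoint URL, response shape, and function names are all assumptions for illustration, not 23andMe's actual (now-blocked) API.

import requests

# Hypothetical endpoint standing in for a DNA firm's profile API
ANCESTRY_URL = "https://api.dna-firm.example/ancestry"

def allow_access(oauth_token: str, required_ancestry: str,
                 threshold: float = 0.5) -> bool:
    """Return True only if the visitor's reported ancestry fraction for
    `required_ancestry` meets the threshold. Purely illustrative."""
    resp = requests.get(
        ANCESTRY_URL,
        headers={"Authorization": f"Bearer {oauth_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: {"ancestry": {"european": 0.72, ...}}
    fractions = resp.json().get("ancestry", {})
    return fractions.get(required_ancestry, 0.0) >= threshold

The point of the sketch is that nothing exotic is required: any site that can obtain a token for a user's DNA profile can apply whatever admission rule it likes, which is why blocking one application doesn't close the door.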
