In honour of George Floyd; Sorry, but you discriminate as a HUMAN

In honour of George Floyd: this post from March 22, 2019

Not only analytically and practically intelligent robots, but also socially and emotionally intelligent robots in the future?

Won't these robots discriminate against us in upcoming job interviews, just like HUMANS do? See the video below and read more about all of this.

We view minorities and the vulnerable as less than HUMAN

By Christian Jarrett

It’s a question that’s reverberated through the ages – are we humans, though imperfect, essentially kind, sensible, good-natured creatures? Or deep down are we wired to be bad, blinkered, idle, vain, vengeful and selfish? There are no easy answers and there’s clearly a lot of variation between individuals, but this feature post aims to shine some evidence-based light on the matter. Here in the first part of a two-part feature – and deliberately side-stepping the obviously relevant but controversial and already much-discussed Milgram, Zimbardo and Asch studies – we digest 10 dispiriting findings that reveal the darker and less impressive aspects of human nature:

1 We view minorities and the vulnerable as less than human
Through history humans have demonstrated a sickening willingness to inflict cruelty on one another. Part of the explanation may be that we have an unfortunate tendency to see certain groups – especially outsiders and vulnerable people perceived as low status – as being less than fully human. One striking example of this “blatant dehumanisation” came from a small brain-scan study that found students exhibited less neural activity associated with thinking about people when they looked at pictures of the homeless or of drug addicts, as compared with higher-status individuals. Many more studies have since demonstrated subtle forms of dehumanisation (in which we attribute fewer mental states to outsiders and minorities) and there have been further demonstrations of blatant dehumanisation – for instance, people who were opposed to Arab immigration or in favour of tougher counter-terrorism policy against Muslim extremists tended to rate Arabs and Muslims as literally less evolved than average. Among other examples, there’s also evidence that young people dehumanise older people; and that men and women alike dehumanise drunk women.

What’s more, the inclination to dehumanise starts early – children as young as five view out-group faces (those belonging to people who live in a different city or who are of a different gender than the child) as less human than in-group faces.

Items 2 through 9 can be found in the news item Human Error, March 2019


MARCH 13TH 2019, BY KRISTIN HOUSER


You’re Hired

The next time you interview for a job, the recruiter you need to impress might not be human.

Since October, Swedish recruitment agency TNG has been using an artificially intelligent robot head called Tengai to conduct test interviews in place of a human recruiter. Starting in May, the device will begin interviewing candidates for actual jobs with the goal of eliminating the biases human recruiters bring to the hiring process — an encouraging example of an AI eliminating discrimination rather than amplifying it.

Perfect Tengai

Tengai is the work of Furhat Robotics, a conversational AI and social robotics startup. Furhat designed the robot head to be placed on a table where it rests at about eye level with a job candidate. It then asks the person a series of questions, with its voice and face designed to mimic human inflections and expressions.

Unlike a human recruiter — who might develop unconscious biases about a candidate based on anything from their gender and ethnicity to how they respond to informal chitchat before the interview — Tengai asks every question in the same order and in the same way.

It then provides a human recruiter with a transcript of the candidate’s answers so that they can make a decision about whether or not to move forward with that person.
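To make the idea concrete, here is a minimal sketch of such a standardized interview flow, assuming a fixed question list and a plain-text transcript. The names (QUESTIONS, conduct_interview, export_for_recruiter) and the answer-capturing step are illustrative placeholders, not Furhat's or TNG's actual implementation.

```python
# Hypothetical sketch of a standardized interview flow: every candidate gets the
# same questions in the same order, and the only output is a transcript that a
# human recruiter reviews before deciding whether to move forward.

from dataclasses import dataclass, field
from typing import Callable, List, Tuple

# Assumed fixed question list, for illustration only.
QUESTIONS = [
    "Can you describe a project you are proud of?",
    "How do you handle conflicting deadlines?",
    "What would your previous colleagues say about working with you?",
]


@dataclass
class InterviewTranscript:
    candidate_id: str
    entries: List[Tuple[str, str]] = field(default_factory=list)  # (question, answer)


def conduct_interview(candidate_id: str, get_answer: Callable[[str], str]) -> InterviewTranscript:
    """Ask every question in the same order and record the answers verbatim.

    `get_answer` stands in for whatever captures the candidate's reply
    (e.g. speech-to-text); here it is just a callable taking the question.
    """
    transcript = InterviewTranscript(candidate_id)
    for question in QUESTIONS:
        answer = get_answer(question)
        transcript.entries.append((question, answer))
    return transcript


def export_for_recruiter(transcript: InterviewTranscript) -> str:
    """Format the transcript so a human recruiter can review it and decide."""
    lines = [f"Candidate: {transcript.candidate_id}"]
    for question, answer in transcript.entries:
        lines.append(f"Q: {question}")
        lines.append(f"A: {answer}")
    return "\n".join(lines)


if __name__ == "__main__":
    # Stand-in for the real answer-capturing step.
    demo = conduct_interview("candidate-001", lambda q: input(f"{q}\n> "))
    print(export_for_recruiter(demo))
```

The point of this design is that the variable parts of a human-led interview (question order, phrasing, small talk) are held constant, and the only thing handed to the recruiter is the candidate's own answers.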

Eventually, Furhat hopes to program the robot to make its own decisions on which applicants should proceed to the next round of interviews. It already has an English-language version of the bot in development, with plans to roll that out in early 2020.
