De-biasing the Employment Process

From Computer Laboratory Group Design Projects
Revision as of 13:26, 23 October 2020 by afb21

Client: John Pettigrew, Umbrella Analytics <johnp@umbrellaanalytics.net>

The past year has seen a significant rise in awareness of misogyny, racism and other forms of discrimination in society and, in particular, in workplaces. In addition, there is growing awareness of how ‘algorithms’ can reinforce bias rather than remove it. Your task is to produce a system that helps businesses recruit more fairly by removing biased language from their job adverts that would otherwise deter many candidates. Your system should allow users to upload the text of a job advert, identify problems using natural-language processing and statistics, and recommend changes so that users can make iterative improvements. Ideally, your system would give each text an overall score as well as word-level feedback.
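A minimal sketch of the word-level feedback and scoring described above might look as follows. The word list here is purely illustrative (a real system would use larger, validated lexicons of gendered and exclusionary language, and NLP models rather than exact string matches), and the scoring rule — the fraction of words not flagged — is just one simple choice for an overall score.

```python
import re

# Illustrative examples only; not a validated lexicon of biased language.
BIASED_TERMS = {
    "ninja": "expert",
    "rockstar": "high performer",
    "aggressive": "proactive",
    "dominant": "leading",
    "manpower": "workforce",
    "chairman": "chair",
}

def review_advert(text: str):
    """Flag potentially biased words and suggest neutral replacements.

    Returns (issues, score), where issues is a list of
    (word, word_index, suggestion) tuples giving word-level feedback,
    and score is the fraction of words not flagged, in [0, 1].
    """
    words = re.findall(r"[A-Za-z']+", text)
    issues = []
    for i, word in enumerate(words):
        suggestion = BIASED_TERMS.get(word.lower())
        if suggestion is not None:
            issues.append((word, i, suggestion))
    score = 1.0 if not words else 1.0 - len(issues) / len(words)
    return issues, score
```

For example, `review_advert("We need a rockstar developer with aggressive drive")` would flag "rockstar" and "aggressive" with suggested replacements, supporting the iterative edit-and-rescore loop the brief asks for.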