Personal Ambiguator


Client: Eleanor Drage, Centre for Gender Studies <ed575@cam.ac.uk>

Many areas of life, such as housing, health, education and employment, are subject to gender, racial and other biases that can be detected with machine learning classification systems. Your task is to identify potential bias in personal documents such as a CV, health record or university application, and then to use a generative neural network approach to tweak these documents (perhaps by adjusting text or selectively removing information) so that they cannot clearly be classified. The algorithm should be packaged in an interactive tool that allows people to write ambiguated personal documents, and that also highlights the sources of bias and educates users about them.
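
As a rough illustration of one possible pipeline (a sketch, not a prescribed design), the snippet below trains a simple bag-of-words classifier to predict a protected attribute from document text, then greedily redacts the words that most drive that prediction until the classifier is close to guessing. The toy training data, the binary label and the ambiguate() helper are illustrative assumptions only; the generative rewriting approach asked for in the brief would replace the crude redaction step with learned text edits.

# A minimal sketch, assuming scikit-learn is available, a binary protected
# attribute, and word-level redaction as the ambiguation step. The training
# data and ambiguate() helper are illustrative, not part of the brief.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus: CV snippets labelled with a protected attribute (0/1).
docs = [
    "captain of the rugby team and keen weightlifter",
    "enjoys netball, ballet and volunteering at a nursery",
    "rugby coach, weightlifting, five-a-side football",
    "ballet dancer, netball captain, childcare volunteer",
]
labels = [0, 1, 0, 1]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(docs, labels)

def ambiguate(text, threshold=0.1, max_steps=10):
    """Greedily remove the word whose deletion moves the classifier's
    prediction closest to 50/50, stopping once the document can no longer
    be clearly classified. Returns the redacted text plus the removed
    words, which the tool could highlight as likely sources of bias."""
    words = text.split()
    removed = []
    for _ in range(max_steps):
        if not words:
            break
        prob = clf.predict_proba([" ".join(words)])[0][1]
        if abs(prob - 0.5) <= threshold:  # ambiguous enough, stop
            break
        # Score every single-word deletion and keep the most ambiguating one.
        scores = []
        for i in range(len(words)):
            trial = " ".join(words[:i] + words[i + 1:])
            p = clf.predict_proba([trial])[0][1]
            scores.append((abs(p - 0.5), i))
        _, best = min(scores)
        removed.append(words.pop(best))
    return " ".join(words), removed

text, redactions = ambiguate("rugby captain who also enjoys ballet")
print(text, redactions)

The list of redacted words gives the interactive tool something concrete to surface when explaining where the apparent bias signal in a document comes from.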