Personal Ambiguator

From Computer Laboratory Group Design Projects
Revision as of 16:49, 26 October 2021 by afb21.

No client confirmed

Many areas of life, such as housing, health, education and employment, are subject to gender, racial and other biases that can be detected with machine-learning classification systems. Your task is to identify potential bias in official documents, such as a CV, health record or university application, and then to use a generative approach to tweak these documents (perhaps by adjusting text or selectively removing information) so that they can no longer be clearly classified. The algorithm should be packaged in an interactive tool that lets people write ambiguated personal documents, and that also highlights the sources of bias and educates users about them.
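One way to sketch the core idea is a classify-then-ambiguate loop: train a classifier on labelled documents, then greedily remove the words that most strongly drive its prediction until its output is near-ambiguous (probability close to 0.5). The tiny synthetic corpus, the scikit-learn pipeline and the stopping threshold below are illustrative assumptions, not part of the project brief; a real system would use richer models and generative rewriting rather than plain deletion.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy corpus: documents whose vocabulary leans toward class 0 or class 1.
docs = [
    "captain rugby team leadership award",
    "rugby captain strong leadership",
    "netball society volunteer childcare",
    "volunteer childcare netball organiser",
]
labels = [0, 0, 1, 1]

vec = TfidfVectorizer()
X = vec.fit_transform(docs)
clf = LogisticRegression().fit(X, labels)

def ambiguate(text, clf, vec, tolerance=0.1, max_steps=20):
    """Greedily drop the word whose removal moves p(class=1) closest to 0.5."""
    words = text.split()
    for _ in range(max_steps):
        p = clf.predict_proba(vec.transform([" ".join(words)]))[0, 1]
        if abs(p - 0.5) <= tolerance or len(words) <= 1:
            break
        # Try removing each word in turn; keep the most ambiguating removal.
        best = None
        for i in range(len(words)):
            cand = words[:i] + words[i + 1:]
            q = clf.predict_proba(vec.transform([" ".join(cand)]))[0, 1]
            if best is None or abs(q - 0.5) < best[0]:
                best = (abs(q - 0.5), cand)
        if best[0] >= abs(p - 0.5):
            break  # no single removal reduces the bias signal; stop
        words = best[1]
    return " ".join(words)

original = "rugby captain leadership volunteer"
result = ambiguate(original, clf, vec)
```

In an interactive tool, the per-word probability shifts computed inside the loop could double as the "highlight and educate" signal, showing users which phrases leak classifiable information.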