Personal Ambiguator
From Computer Laboratory Group Design Projects
Client: Eleanor Drage <ed575@cam.ac.uk>
Many areas of life such as housing, health, education and employment are subject to gender, racial and other biases that can be detected with machine learning classification systems. Your task is to identify potential bias in personal documents such as a CV, health record or university application, and then to use a generative neural network approach to tweak these documents (perhaps by adjusting text or selectively removing information) so that they can no longer be clearly classified. The algorithm should be packaged in an interactive tool that allows people to write ambiguated personal documents, and that also highlights the sources of bias and educates users about them.
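
To make the ambiguation idea concrete, here is a minimal, hypothetical sketch in Python. It assumes a toy bag-of-words classifier (scikit-learn's CountVectorizer and LogisticRegression trained on placeholder CV snippets) and substitutes a simple greedy token-removal loop for the generative neural network the brief calls for: tokens whose removal moves the classifier's prediction closest to chance are stripped and reported back, which is the kind of information an interactive tool could highlight and explain to the user. All data, names and thresholds below are illustrative assumptions, not part of the project specification.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy labelled corpus (placeholder data, purely illustrative).
docs = [
    "captain of the women's netball team and sorority treasurer",
    "member of the women's chess society and girls' coding club",
    "captain of the men's rugby team and fraternity treasurer",
    "member of the men's chess society and boys' coding club",
]
labels = [0, 0, 1, 1]  # the two classes the toy classifier predicts

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(docs)
clf = LogisticRegression().fit(X, labels)

def ambiguate(text, max_removals=5, target_margin=0.1):
    """Greedily remove the tokens that most strongly drive the prediction,
    stopping once the classifier's output is close to chance (0.5)."""
    tokens = text.split()
    removed = []
    for _ in range(max_removals):
        prob = clf.predict_proba(vectoriser.transform([" ".join(tokens)]))[0, 1]
        if abs(prob - 0.5) <= target_margin:  # already ambiguous enough
            break
        best_idx, best_gap = None, abs(prob - 0.5)
        for i in range(len(tokens)):
            candidate = " ".join(tokens[:i] + tokens[i + 1:])
            p = clf.predict_proba(vectoriser.transform([candidate]))[0, 1]
            if abs(p - 0.5) < best_gap:
                best_idx, best_gap = i, abs(p - 0.5)
        if best_idx is None:  # no single removal makes the text more ambiguous
            break
        removed.append(tokens.pop(best_idx))
    return " ".join(tokens), removed

ambiguous_text, removed_terms = ambiguate("captain of the women's netball team")
print(ambiguous_text)   # tweaked document with the most revealing tokens stripped
print(removed_terms)    # terms the interactive tool could highlight and explain

A full solution would likely replace the greedy loop with a learned rewriting model (for example a sequence-to-sequence generator trained to reduce classifier confidence while preserving meaning), and would need classifiers and corpora appropriate to each document type (CVs, health records, applications).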