RACIAL BIAS / FACIAL RECOGNITION – Black student Robin Pocornie repeatedly received 'face not found' and 'room too dark' error messages from Proctorio during exams.
Vrije Universiteit Amsterdam · reported by ResearchImport
RACIAL BIAS / FACIAL RECOGNITION – Black student Robin Pocornie repeatedly received 'face not found' and 'room too dark' error messages from Proctorio during exams. She was forced to sit with a lamp shining directly in her face to be detected, while white classmates faced no such requirement.
Course / Exam
Bioinformatics exams (multiple)
Program
Undergraduate
Outcome
Pocornie filed a complaint with the Netherlands Institute for Human Rights. In October 2023 the Institute cleared VU Amsterdam of discrimination in its use of the software, citing an analysis of login data, but found that the university had discriminated in how it handled her complaint, stating that it 'should have handled the complaint more carefully'. The Institute noted that its ruling 'does not rule out that the use of Proctorio or similar AI software may lead to discrimination in other situations.' Scientific studies (including one presented at ACM FAccT 2025) later confirmed that the OpenCV face-detection algorithm Proctorio used showed dramatically higher failure rates for darker-skinned faces.
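For context on the failure mode described above, here is a minimal sketch of face detection with an OpenCV Haar cascade, the family of detector the cited studies point to; the exact model and version Proctorio shipped is not specified in this record, and the image filename `webcam_frame.jpg` is hypothetical.

```python
import cv2

# Load OpenCV's bundled Haar-cascade frontal-face model (an assumption:
# the record only says Proctorio relied on an OpenCV face-detection
# algorithm, not which model or version).
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

def check_face(image_path: str) -> str:
    """Return a Proctorio-style status string for one webcam frame."""
    frame = cv2.imread(image_path)
    if frame is None:
        raise FileNotFoundError(image_path)

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Haar cascades key on local brightness contrasts; underexposed or
    # low-contrast frames often yield zero detections, which a proctoring
    # layer could surface as 'face not found' or 'room too dark'.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "face not found"
    return f"face detected ({len(faces)} region(s))"

if __name__ == "__main__":
    print(check_face("webcam_frame.jpg"))  # hypothetical test image
```

Because detection depends on brightness contrast in the grayscale frame, darker-skinned faces under the same lighting can fall below the detector's threshold, which is consistent with Pocornie needing a lamp aimed at her face while white classmates did not.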
Source
Research / Academic / Legal
Original
Imported research record