Running head: ONLINE PROCTORING AT SCALE, MULTI-SOURCE ANALYSIS

Online Proctoring at Scale: A Multi-Source Analysis of Vendor Failures, Student Harms, and Policy Responses (2018 to 2026)

Michael Steven Moates, Ed.D., LBA, QBA, CEAP

ProctorProblems Research Initiative

Author Note

Correspondence concerning this article should be directed to ProctorProblems, https://proctorproblems.com.

April 29, 2026

Abstract

Online proctoring expanded explosively during the 2020 to 2022 COVID-19 emergency and has since become an embedded feature of higher education and high-stakes occupational licensure in the United States and abroad. This paper aggregates a multi-source corpus of 691 student-reported incidents, 39 active or resolved lawsuits, and 202 policy artifacts to characterize the empirical landscape of remote-proctoring failures and the regulatory response. Sources include published research datasets, university student-government petitions, cohort discussions documented through opt-in messaging audits, government and nonprofit-organization filings, peer-reviewed literature, and journalistic investigations. Findings indicate that connection failures, mid-exam disconnections, racial and disability bias in algorithmic flagging, and undisclosed data-handling practices recur across all major vendors, while regulatory and judicial responses remain fragmented across jurisdictions. The paper offers recommendations for higher-education institutions, occupational licensing boards, civil-rights regulators, and student advocates, and identifies open questions for further research.

Keywords: online proctoring, biometric privacy, algorithmic bias, higher education, occupational licensure, exam administration, student rights, accessibility

Online Proctoring at Scale: A Multi-Source Analysis of Vendor Failures, Student Harms, and Policy Responses (2018 to 2026)

The rapid migration of high-stakes assessment to remote-proctored formats during the 2020 to 2022 COVID-19 emergency placed millions of students under continuous algorithmic and human surveillance during examinations (Inside Higher Ed, 2020; The Markup, 2020; Washington Post, 2020a). What began as a pandemic-era accommodation has since stabilized into a multi-billion-dollar education-technology submarket dominated by ProctorU/Meazure Learning, Proctorio, Honorlock, and Respondus, with adjacent products from ExamSoft/Examplify, ProctorTrack, PSI/Talview, Pearson VUE, and Inspera serving occupational-licensure, higher-education, and language-testing markets (EDUCAUSE, 2021).

The transition has been controversial. Civil-liberties organizations have documented systematic harms including racial bias in facial-recognition flagging, accessibility failures for students with disabilities, invasive collection of biometric and behavioral data, and overreliance on opaque artificial-intelligence (AI) systems whose decisions go unreviewed by faculty (Center for Democracy & Technology [CDT], 2020; Electronic Frontier Foundation [EFF], 2020a, 2021; Electronic Privacy Information Center [EPIC], 2020). Major incidents, including the catastrophic February 2025 California Bar Examination platform failures affecting approximately 5,600 candidates, have led to formal regulatory actions and litigation (American Bar Association Journal, 2025; State Bar of California, 2025).

This paper presents a multi-source empirical analysis of the online-proctoring landscape over an eight-year window (2018 to 2026). It draws on (a) a curated database of 691 student-reported incidents aggregated from public records, news coverage, social media, and first-person submissions; (b) 39 federal, state, appellate, and administrative proceedings naming proctoring vendors as defendants; and (c) 202 statutes, bills, regulations, board actions, agency complaints, university policy decisions, advocacy letters, news articles, op-eds, and vendor statements affecting the sector. The objective is to characterize patterns of failure, identify cross-vendor and cross-jurisdictional structural issues, and recommend institutional, regulatory, and student-advocacy responses.

Background and Literature Review

Pandemic-Era Adoption and Industry Growth

Pre-pandemic remote-proctoring use was largely confined to distance-learning programs and a small set of professional-licensure examinations administered by the Law School Admission Council (LSAC), Educational Testing Service (ETS), and certification testing programs (EDUCAUSE, 2021). Beginning in March 2020, however, U.S. and international institutions rapidly migrated end-of-term examinations to remote formats. By May 2020, online proctoring was reported to be “surging” across higher education (Inside Higher Ed, 2020), with vendor sales growing several-fold over baseline (Washington Post, 2020a). The Open University, University of Sydney, University of Amsterdam, and others adopted ProctorU or Proctorio at scale during the same period (Honi Soit, 2022; Open University, 2020; Racism and Technology Center, 2022).

Documented Harms in Existing Literature

A substantial peer-reviewed and gray literature has documented harms associated with automated proctoring. Yoder-Himes et al. (2022), publishing in Frontiers in Education, reported statistically significant racial, skin-tone, and sex disparities in commercial automated-proctoring software performance. Coghlan et al. (2021), in Philosophy & Technology, framed the ethical critique through a “Big Brother” lens. Swauger (2020) provided one of the earliest synthetic critiques in Hybrid Pedagogy, arguing that remote proctoring constitutes algorithmic surveillance with disproportionate impact on marginalized students. Mita (2023), in the UCLA Law Journal, addressed how AI proctoring's privacy and bias concerns have been overshadowed by perceptions of algorithmic infallibility.

Civil-liberties analyses have similarly converged on a set of recurring concerns. The Center for Democracy & Technology (CDT, 2020) documented that AI proctoring algorithmically profiles students for suspicious behavior, frequently flagging disabilities including Attention-Deficit/Hyperactivity Disorder (ADHD), Tourette's syndrome, cerebral palsy, autism, dyslexia, and visual impairments as cheating indicators. The American Civil Liberties Union (ACLU, 2023) extended this analysis in its report Digital Dystopia, finding that current proctoring tools fail constitutional and civil-rights standards. The Electronic Privacy Information Center (EPIC, 2020) filed parallel complaints with the District of Columbia Attorney General and the Federal Trade Commission alleging that Respondus, ProctorU, Proctorio, Examity, and Honorlock had collected excessive personal data, relied on opaque AI tools, and made deceptive product statements.

Regulatory and Judicial Responses

Regulatory responses have unfolded across multiple jurisdictions. In the United States, state biometric-information privacy laws, most notably the Illinois Biometric Information Privacy Act (BIPA, 2008), have produced large class-action settlements, including a $6.25 million Respondus settlement covering Illinois test-takers between 2015 and 2023 (Top Class Actions, 2023). California's Student Test Taker Privacy Protection Act (STTPPA, 2023) prohibits proctoring providers from collecting, retaining, or using personal information beyond what is strictly necessary to provide the service. In the Netherlands, the Netherlands Institute for Human Rights (College voor de Rechten van de Mens, 2022) ruled that VU Amsterdam's use of Proctorio created a presumption of indirect racial discrimination. In Ogletree v. Cleveland State University (2022), a federal district court held that mandatory pre-exam “room scans” violate the Fourth Amendment.

Despite these developments, the regulatory landscape remains fragmented. CalMatters (2023) reported that California universities continued to deploy proctoring software after the STTPPA took effect, citing San Diego State University, Cal Poly San Luis Obispo, Cal State San Marcos, Chico State University, and the University of California, Berkeley as continuing users. Similarly, vendor responses to a December 2020 inquiry by U.S. Senators Blumenthal, Booker, Warren, Wyden, Van Hollen, and Smith were characterized by EFF (2020b) and Diverse Issues in Higher Education (2021) as dismissive of equity and accessibility concerns.

Method

Data Sources

This paper aggregates four classes of data:

Student-Reported Incidents

The primary corpus consists of 691 unique incident records covering the period 2014 to 2026, sourced from: (a) the ProctorU Student Complaints Database (a research-curated dataset of public Reddit, Trustpilot, Better Business Bureau, social-media, news, academic, and legal records, n ≈ 460); (b) a parallel multi-vendor Proctoring Tools Complaints Database covering Proctorio, Respondus, Honorlock, ExamSoft/Examplify, ProctorTrack, PSI/Talview, Pearson VUE, Inspera, Mercer Mettl, and Kryterion (n ≈ 213); (c) cohort-internal communications from the Herzing University Direct Entry MSN nursing program documented through cohort logs and through opt-in audits of the lead author's GroupMe (n ≈ 32) and iMessage (n ≈ 21) cohort traffic (received messages only, with personally identifying information redacted prior to ingestion); and (d) first-person submissions made directly through the public ProctorProblems intake form. Each record was classified by problem type, vendor, university, source platform, and approximate incident date, with personally identifying information redacted via a regular-expression-based pipeline before public display.

Litigation

A total of 39 federal, state, appellate, and administrative proceedings naming proctoring vendors as defendants were documented from CourtListener, the State Bar of California's published litigation, the BIPA settlement notice corpus, the Sauder Schelkopf class-action document repository, and ABA Journal coverage. For each case, the case caption, case number, court, filing date, plaintiffs, defendants, causes of action, allegations, outcome, settlement amount, and appellate history were recorded.

Policy Artifacts

A total of 202 policy items were classified into nine categories: (a) statutes and regulations, (b) legislation and hearings, (c) occupational licensure board actions, (d) government agency actions, (e) university policy decisions, (f) nonprofit and advocacy positions, (g) journalism, (h) op-eds and editorials, and (i) vendor statements and advertising. Sources included state and federal legislative trackers, the EFF Deeplinks proctoring tag, EPIC's online-proctoring dossier, news archives from Inside Higher Ed, the Chronicle of Higher Education, EdSurge, The Verge, Vice/Motherboard, MIT Technology Review, NPR, and others, as well as institutional repositories of faculty-senate resolutions and student-government motions.

Public Submissions

The ProctorProblems site at https://proctorproblems.com accepts ongoing first-person submissions through a public web form, with optional account creation, optional uploaded evidence, and a privacy-preserving identifier scheme. All submissions are subject to the same redaction pipeline used for imported records before being made public.

Coding Scheme

Problem types were initially coded using the source datasets' categories (e.g., “Technical/Connection Failure,” “Camera/Microphone Issue,” “False Cheating Flag,” “Privacy/Data Breach”). The original 14-category taxonomy was subsequently disaggregated through a keyword-classification pipeline that decomposed the largest catchment categories into more specific subtypes, yielding 28 problem-type categories at the time of analysis. Reclassification was applied to all records through automated keyword matching against the report description and original-source text.
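The reclassification step described above can be sketched as a simple keyword matcher that promotes records from a coarse catchment category to a more specific subtype. The category names and keyword lists below are illustrative assumptions for exposition, not the project's actual taxonomy or rule set:

```python
# Sketch of keyword-based reclassification. Rules are illustrative
# assumptions; the real pipeline uses the project's 28-category taxonomy.
KEYWORD_RULES = [
    ("Session disconnected mid-exam", ["disconnect", "dropped mid", "kicked out"]),
    ("Setup/connection took hours", ["waited", "queue", "setup took", "hours to connect"]),
    ("False cheating flag", ["flagged", "accused", "integrity violation"]),
    ("Privacy/data concern", ["privacy", "biometric", "recording"]),
]

def reclassify(description: str, original_category: str) -> str:
    """Return the first matching specific subtype; if no keyword matches
    the report text, keep the original coarse category."""
    text = description.lower()
    for category, keywords in KEYWORD_RULES:
        if any(kw in text for kw in keywords):
            return category
    return original_category
```

Because rules are checked in order, a record mentioning both a disconnection and a false flag would resolve to the first matching subtype; the actual pipeline's tie-breaking behavior is not documented here.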

Privacy and Ethics

All personally identifying information from imported records was processed through a redaction pipeline replacing names, emails, phone numbers, and known unique identifiers with neutral tokens (e.g., [name], [email]) prior to public display. Original unredacted text was retained only for site-administrator review. Cohort messaging audits processed only messages received by the lead author (excluding messages he sent), and applied identical redaction before any public surface. The research initiative is non-institutional and was conducted independent of any university or vendor.
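The redaction step can be sketched as an ordered sequence of regular-expression substitutions. The neutral tokens [name] and [email] follow the paper's description; the patterns themselves, and the [phone] token, are illustrative assumptions rather than the pipeline's actual rules:

```python
import re

# Sketch of the PII-redaction pass. Patterns are illustrative assumptions;
# a production pipeline would cover more name forms and identifier types.
PATTERNS = [
    (re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"), "[email]"),
    (re.compile(r"\b(?:\+?1[-. ]?)?\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}\b"), "[phone]"),
    (re.compile(r"\b(?:Mr\.|Ms\.|Mrs\.|Dr\.)\s+[A-Z][a-z]+\b"), "[name]"),
]

def redact(text: str) -> str:
    """Replace emails, phone numbers, and title-prefixed names with
    neutral tokens before any public display of a record."""
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text
```

Running substitutions in a fixed order avoids one token being partially consumed by a later pattern; the unredacted original would be retained only for administrator review, as described above.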

Limitations

The corpus reflects publicly reachable records and audit-accessible cohort communications. It systematically under-represents (a) students who experienced incidents but did not document them publicly; (b) institutions whose incidents have not received press or regulatory attention; (c) jurisdictions outside the English-language record; and (d) vendor-internal data. Findings should therefore be interpreted as a lower bound on incident prevalence, not a population-level estimate. Records imported from public discussion forums may also contain self-selection bias, with dissatisfied users more likely to post than satisfied users.

Results

Volume and Distribution Across Vendors

As of April 29, 2026, the corpus contained 691 unique incident records spanning 148 named institutions and 11 proctoring services. Figure 1 displays the distribution of records across vendors, showing dominance of records concerning ProctorU/Meazure Learning, followed by Proctorio, Respondus, Honorlock, and the smaller-vendor cohort.

Figure 1. Distribution of student-reported incidents by proctoring vendor. Note. N = 691. Counts reflect aggregated records from research datasets, public records, and first-person submissions through April 29, 2026. Source: ProctorProblems Research Initiative.

Problem-Type Taxonomy

Figure 2 displays the distribution of records across the 28-category problem-type taxonomy. The largest category by absolute count is “session disconnected mid-exam,” followed by “setup/connection took hours” and “privacy/data concern.” Categories tied to false-cheating allegations, racial/algorithmic bias, and disability accommodation denials each account for between 4% and 8% of records.

Figure 2. Distribution of student-reported incidents by problem-type category. Note. N = 691. Categories were derived from source-dataset coding and refined through keyword-based reclassification. Source: ProctorProblems Research Initiative.

Institutional Distribution

Table 1 displays the 20 institutions with the largest record counts. Concentration at the top of the distribution reflects, in part, the salience of specific cohort-level documentation efforts (e.g., the Herzing University HESI petition cohort) and high-volume reporting from Western Governors University and Athabasca University.

Table 1. Top 20 institutions by aggregated incident count.

Rank | Institution | Reports
1 | Western Governors University | 39
2 | Multiple institutions | 32
3 | Herzing University | 20
4 | California State Bar | 19
5 | Athabasca University | 15
6 | LSAT (law admissions) | 11
7 | University of Sydney | 6
8 | University of British Columbia | 5
9 | GRE (graduate admissions) | 5
10 | University of Florida | 5
11 | University of Utah | 4
12 | Louisiana State University | 4
13 | New York Bar | 4
14 | University of Wisconsin-Madison | 4
15 | Arizona State University | 4
16 | Cisco certification | 3
17 | University of Colorado Boulder | 3
18 | Swinburne University | 3
19 | University of Technology Sydney | 3
20 | University of California, Santa Barbara | 3

Note. The “Unknown / Not specified” bucket (n = approximately 280) is excluded. Source: ProctorProblems Research Initiative.

Source Composition

Figure 3 displays the composition of records by source platform. Trustpilot, Better Business Bureau, and SiteJabber consumer-review platforms account for the largest share of imported records, followed by social media and forum discussions, Reddit, academic and legal research, and direct journalism.

Figure 3. Distribution of records by source platform. Note. N = 691. “Direct submission” records are first-person reports made through the ProctorProblems public submission form; all other categories are records imported from external sources. Source: ProctorProblems Research Initiative.

Litigation Landscape

Litigation activity is concentrated in a small number of federal districts and Illinois state court. Of the 39 cases tracked, the Northern District of California and the Northern District of Illinois account for the largest share, followed by the Circuit Court of Cook County, Illinois, the Southern District of Florida, and the Central District of Illinois. ProctorU/Meazure Learning, ExamSoft/Examplify, and Pearson VUE are the most frequently named defendants; common causes of action include violations of the Illinois Biometric Information Privacy Act, California consumer-protection statutes (CLRA, UCL, FAL), negligence and breach of duty in data-handling, and breach of contract. Detailed case-by-case data are available at the public ProctorProblems litigation page.

Policy Landscape

Across 202 policy items, university policy decisions (n = 33) and journalism (n = 56) constitute the largest categories. Statutes and regulations (n = 25) include state biometric-privacy laws, state student-privacy laws, GDPR, FERPA, the California Student Test Taker Privacy Protection Act, and CNIL guidance on online examinations. Occupational-licensure-board actions (n = 15) reflect a pattern in which high-stakes licensure programs (USMLE, NCSBN, AICPA, NAPLEX) have largely declined to adopt remote proctoring at scale, while admissions and credentialing programs (LSAT, GRE, TOEFL), along with state bar examinations during the pandemic emergency, have adopted it with mixed results.

Discussion

Recurring Patterns Across Vendors

Three patterns recur across the vendor cohort. First, technical reliability remains a persistent failure mode. Mid-exam disconnections, multi-hour setup delays, multi-attempt failures, and platform-wide outages collectively account for approximately one-third of imported records and underlie major regulatory actions including the California State Bar's complaint against Meazure Learning (State Bar of California, 2025). Second, algorithmic bias against Black students, students with darker skin tones, students with disabilities, and students wearing religious head coverings is documented across vendors and across multiple jurisdictions, with the National Institute of Standards and Technology's (2019) facial-recognition vendor evaluation providing a baseline empirical foundation for these concerns. Third, vendor responses to formal inquiry, including the December 2020 U.S. Senate inquiry, have been characterized by external observers as dismissive rather than remediating (EFF, 2020b).

Disability and Accessibility Failures

The CDT (2020) characterization of automated proctoring as fundamentally incompatible with disability accommodations finds substantial support in the present corpus. Records describe pre-approved Americans with Disabilities Act (ADA) accommodations being refused at the point of exam administration; assistive technologies (screen readers, speech-to-text software, dictation tools) being rejected or causing the proctoring software to flag the student as suspicious; and disabilities (motor tics, stims, vocalizations, atypical eye movements) themselves being treated as cheating indicators. Compliance with the WCAG 2.1 AA accessibility standard, claimed by Proctorio (2021) and other vendors, does not reach these structural-conflict issues, which arise from the fundamental design choice to flag deviation from algorithmic norms.

Privacy and Data-Handling Concerns

Vendors deploy combinations of biometric collection (face, voice, gait, keystroke), continuous video and audio capture, screen recording, behavioral profiling, kernel-level browser lockdown, and in some implementations remote-control utilities (e.g., LogMeIn) granting proctors administrator-level access during exam sessions. The Washington State University data breach affecting approximately 440,000 ProctorU users (NBC News, 2020) and the security vulnerability disclosed in Proctorio's browser extension affecting an estimated 2,000+ U.S. colleges (Chronicle of Higher Education, 2021) illustrate the population-scale risk of consolidated biometric collection. Compliance with state biometric-privacy laws, particularly Illinois BIPA, has produced multi-million-dollar settlements but has not generally produced design changes (Top Class Actions, 2023).

The California Bar Examination as Marketplace Failure

The February 2025 California Bar Examination provides a uniquely documented marketplace failure. Internal Meazure Learning communications obtained through litigation discovery reportedly indicated that the company recognized capacity limitations as early as September 2024 but did not remediate before the examination administration (State Bar of California, 2025). Approximately 5,600 candidates were affected; over 85% of surveyed test-takers reported at least one technology issue, and over 80% reported at least one issue with proctors (American Bar Association Journal, 2025). The California Supreme Court issued emergency remedies allowing affected examinees alternative paths to licensure. The case provides one of the clearest demonstrations to date of cascading harm when a single vendor failure intersects a high-stakes regulatory bottleneck.

Cross-Cohort Patterns: The Herzing Case Study

Within the present corpus, the Herzing University Direct Entry MSN nursing cohort provides a rich case-study record of within-institution patterns. A petition with 600+ signatures (February 2026) and a Change.org campaign with subsequent national-media coverage (April 2026) cite documented technical failures preventing examinees from sitting Health Education Systems, Inc. (HESI) examinations; appeals to the institution's grievance system being routinely denied; and the institution and the vendor each attributing fault to the other. Six recurring failure modes appear in the cohort record: (a) ProctorU sessions auto-cancelled after multiple consecutive failures; (b) inconsistent disclosure to administration; (c) cohort-scale impacts (60+ students reported affected over a four-day window in March 2026); (d) ADA-accommodation refusal; (e) appeals denied at scale; and (f) cross-institutional confirmation that the same vendor had produced similar patterns at peer institutions.

Recommendations

For Higher-Education Institutions

(a) Re-evaluate vendor contracts with reference to documented disability and racial-bias evidence, and require contractual representations and warranties on accessibility-audit results. (b) Establish institutional appeals procedures that do not require students to litigate against the vendor as a precondition to academic redress. (c) Publish lists of student-surveillance technologies in active use, including the data they collect and retention durations. (d) Pilot and evaluate proctoring-free assessment models (open-book, project-based, oral defenses, in-person controlled-environment examinations) in courses where the disruption cost of proctoring software exceeds the integrity benefit.

For Occupational Licensure Boards

(a) Adopt a presumption against remote proctoring for high-stakes licensure absent independently validated capacity, accessibility, and bias-audit evidence. (b) Require vendor submission of incident-rate, false-flag-rate, and accessibility-complaint data as a condition of contract. (c) Establish public-facing post-administration review boards modeled on the California Supreme Court's response to the February 2025 Bar Examination.

For Regulators and Legislators

(a) Adopt biometric-information privacy laws with private rights of action in jurisdictions that have not yet done so, modeled on Illinois BIPA. (b) Adopt student-test-taker privacy statutes modeled on the California Student Test Taker Privacy Protection Act. (c) Strengthen federal oversight by funding U.S. Department of Education Office for Civil Rights enforcement of disability-accommodation complaints and Federal Trade Commission investigations of unfair-and-deceptive-practices complaints. (d) Require independent third-party bias audits as a condition of vendor procurement at public institutions.

For Students and Student Advocates

(a) Document incidents in real time via screen-capture, time-stamped chat-log preservation, and contemporaneous email reports to faculty advisors. (b) Use BIPA and equivalent statutes' private-right-of-action mechanisms where applicable. (c) Pursue ADA accommodations through institutional disability-services offices in writing prior to scheduled examinations to create a documentary record. (d) Report incidents to public databases including the present site, regulatory complaint portals, and journalists.

Limitations and Future Research

The corpus is limited to publicly reachable records, audit-accessible cohort communications, and first-person submissions; it does not include vendor-internal data or comprehensive licensure-board incident logs. Future research should incorporate (a) controlled experimental evaluations of vendor algorithmic flagging across demographic groups; (b) quantitative analysis of disposition outcomes in BIPA and analog cases; (c) comparative cross-jurisdictional analyses of regulatory enforcement; and (d) longitudinal cohort studies of student outcomes (course completion, mental-health utilization, withdrawal rates) under different proctoring regimes.

Conclusion

Online proctoring at the scale documented here functions as a high-stakes, high-uncertainty technology embedded in academic and licensure infrastructure, one that has, to date, evolved faster than the regulatory mechanisms designed to constrain it. The empirical record shows persistent technical failures, algorithmic bias, accessibility failures, and dismissive vendor responses to formal inquiry, patterns that recur across vendors, institutions, and jurisdictions. The marketplace failure of the February 2025 California Bar Examination, in particular, suggests that high-stakes licensure programs and the test-takers who depend on them remain structurally exposed to vendor failure absent stronger contractual and regulatory protections. The path forward requires concerted action across institutional, regulatory, and student-advocate domains, supported by transparent and continuously updated empirical records of the kind this paper describes.

References

American Bar Association Journal. (2025). Following repeat problems with remote bar exam, California releases investigation findings. https://www.abajournal.com/news/article/following-repeat-problems-with-remote-bar-exam-california-releases-investigation-findings

American Civil Liberties Union. (2023). Digital dystopia: The danger in buying what the EdTech surveillance industry is selling. https://www.aclu.org/press-releases/new-aclu-report-shines-light-on-shadowy-edtech-surveillance-industry-and-the-dangerous-consequences-of-surveillance-in-schools

CalMatters. (2023, February 22). E-proctoring still used at California colleges despite ruling. https://calmatters.org/education/higher-education/college-beat/2023/02/remote-proctoring-california-colleges/

Center for Democracy & Technology. (2020). How automated test proctoring software discriminates against disabled students. https://cdt.org/insights/how-automated-test-proctoring-software-discriminates-against-disabled-students/

Chronicle of Higher Education. (2021, July 13). Coverage of Proctorio browser-extension vulnerability. https://www.chronicle.com/

Coghlan, S., Miller, T., & Paterson, J. (2021). Good proctor or “Big Brother”? Ethics of online exam supervision technologies. Philosophy & Technology, 34(4), 1581 to 1606. https://doi.org/10.1007/s13347-021-00476-1

College voor de Rechten van de Mens [Netherlands Institute for Human Rights]. (2022, December 9). Ruling on the use of anti-cheating software at VU Amsterdam. https://racismandtechnology.center/2022/12/24/dutch-institute-for-human-rights-use-of-anti-cheating-software-can-be-algorithmic-discrimination-i-e-racist/

Diverse Issues in Higher Education. (2021). Online proctoring companies respond to senators' equity concerns. https://www.diverseeducation.com/home/article/15108554/online-proctoring-companies-respond-to-senators-equity-concerns

EDUCAUSE. (2021). Proctoring software in higher education: Prevalence and patterns. EDUCAUSE Review. https://er.educause.edu/articles/2021/2/proctoring-software-in-higher-ed-prevalence-and-patterns

Electronic Frontier Foundation. (2020a). Students are pushing back against proctoring surveillance apps. https://www.eff.org/deeplinks/2020/09/students-are-pushing-back-against-proctoring-surveillance-apps

Electronic Frontier Foundation. (2020b). Proctoring companies must be better than mocking students. https://www.eff.org/deeplinks/2020/12/proctoring-companies-must-be-better-mocking-students

Electronic Frontier Foundation. (2021). A long overdue reckoning for online proctoring companies may finally be here. https://www.eff.org/deeplinks/2021/06/long-overdue-reckoning-online-proctoring-companies-may-finally-be-here

Electronic Frontier Foundation. (2022). Federal judge: Invasive online proctoring "room scans" are unconstitutional. https://www.eff.org/deeplinks/2022/08/federal-judge-invasive-online-proctoring-room-scans-are-also-unconstitutional

Electronic Privacy Information Center. (2020). EPIC complaint in re: Online test proctoring companies. https://epic.org/wp-content/uploads/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf

Fight for the Future. (2021). 19 human rights, civil liberties, and youth advocacy organizations demand schools stop using e-proctoring. https://www.fightforthefuture.org/news/2021-07-08-19-human-rights-civil-liberties-and-youth

Honi Soit. (2022, August 15). University of Sydney $2.5M ProctorU spending exposé. Honi Soit. https://honisoit.com/

Inside Higher Ed. (2020, May 11). Online proctoring is surging during COVID-19. https://www.insidehighered.com/news/2020/05/11/online-proctoring-surging-during-covid-19

McKenzie, L. (2021a, February 1). U of Illinois says goodbye to Proctorio. Inside Higher Ed. https://www.insidehighered.com/news/2021/02/01/u-illinois-says-goodbye-proctorio

McKenzie, L. (2021b, April 6). Proctoring tool failed to recognize dark skin, students say. Inside Higher Ed. https://www.insidehighered.com/quicktakes/2021/04/06/proctoring-tool-failed-recognize-dark-skin-students-say

McKenzie, L. (2021c, May 24). ProctorU abandons business based solely on AI. Inside Higher Ed. https://www.insidehighered.com/news/2021/05/24/proctoru-abandons-business-based-solely-ai

The Markup. (2020, October 13). Remote exam software is crashing when the stakes are the highest. The Markup. https://themarkup.org/coronavirus/2020/10/13/remote-exam-software-failures-privacy

Mita, S. (2023). AI proctoring: Academic integrity vs. student rights. UCLA Law Journal. https://uclawjournal.org/wp-content/uploads/10-Mita_final.pdf

National Institute of Standards and Technology. (2019). Face recognition vendor test (FRVT) part 3: Demographic effects (NIST IR 8280). https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf

NBC News. (2020, August 12). Coverage of Washington State University ProctorU data breach. https://www.nbcnews.com/

NPR. (2022, August 25). Scanning student rooms during remote tests is unconstitutional, judge rules. NPR. https://www.npr.org/2022/08/25/1119337956/test-proctoring-room-scans-unconstitutional-cleveland-state-university

Proctorio. (2021). Accessibility. https://proctorio.com/about/accessibility

Racism and Technology Center. (2022). Robin Pocornie's case: Algorithmic discrimination at VU Amsterdam. https://racismandtechnology.center/theme/robins-case/

Senator Richard Blumenthal. (2020). Blumenthal leads call for virtual exam software companies to improve equity, accessibility, and privacy for students amid troubling reports. https://www.blumenthal.senate.gov/newsroom/press/release/blumenthal-leads-call-for-virtual-exam-software-companies-to-improve-equity-accessibility-and-privacy-for-students-amid-troubling-reports

State Bar of California. (2025). Meazure Learning documents prompt State Bar to amend lawsuit against February 2025 bar exam administration vendor. https://www.calbar.ca.gov/news/meazure-learning-documents-prompt-state-bar-amend-lawsuit-against-february-2025-bar-exam-administration-vendor

Swauger, S. (2020). Our bodies encoded: Algorithmic test proctoring in higher education. Hybrid Pedagogy. https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

Top Class Actions. (2023). Respondus online exam BIPA $6.25M class action lawsuit settlement. https://topclassactions.com/lawsuit-settlements/closed-settlements/respondus-online-exam-bipa-6-25m-class-action-lawsuit-settlement/

Washington Post. (2020a, April 1). Closed colleges are using online proctoring services to monitor students during exams. https://www.washingtonpost.com/technology/2020/04/01/online-proctoring-college-exams-coronavirus/

Washington Post. (2020b, November 12). Students rebel over remote test monitoring during the pandemic. https://www.washingtonpost.com/technology/2020/11/12/test-monitoring-student-revolt/

Yoder-Himes, D. R., Asif, A., Kinney, K., Brandt, T. J., Cecil, R. E., Himes, P. R., Cashon, C., Hopp, R. M. P., & Ross, E. (2022). Racial, skin tone, and sex disparities in automated proctoring software. Frontiers in Education, 7, 881449. https://doi.org/10.3389/feduc.2022.881449