Proctoring software: unreliability coupled with high-stakes consequences
A common configuration for "online proctoring" software is that being "flagged" by the software halts the online exam. This supposedly prevents people from blatantly cheating and selling the exam answers in real time.
Even more moderate configurations still suck. If the software flags the student, and the exam invigilators check the recorded video and then allow the exam to continue, that's a time penalty which may result in marks lower than the student's actual knowledge and skill deserve.
If the proctoring software fails, then the student's examination is invalid. Put another way: better pray your operating system is no good at detecting malware-like behaviour.
Proctoring software usually objects to other people in the room. Students can't reliably take an exam from a public library, or from a share house, or from a room with popstar posters.
The software objects to the student's face leaving the camera's frame. Better not drop a pencil, be harassed by your little sister, or knock the laptop lid.
The software does eye tracking, the idea being to detect the use of notes. Better not sneeze.
Some proctoring software fails to detect people present when their skin colour has insufficient contrast. Got to love that the "systematic racism" here involves two meanings of "system".
For high-stakes exams the invigilators will often want the laptop's camera taken on a tour of the room. Bedrooms are sometimes too complex a space to pass this inspection. Of course students don't know this until just before the exam.
The presence of proctoring software raises the stakes of already high-stakes exams. Why a university would choose to do this -- rather than rapidly redesign assessments -- in the midst of the most demanding teaching year since 1939 says a lot about the university's relationship with its students. It's also a great illustration that education isn't only what is said in the course.