Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

Law Firms Using Monitoring Software to Review Contract Review Attorneys’ Work

November 16, 2021

The Washington Post reported on November 11 on the issue of law firms monitoring the work of contract reviewer employees working from home. One black employee noted that the system often failed to recognize her face or identified the Bantu knots in her hair as unauthorized recording devices, forcing her to log back in, sometimes more than 25 times a day.

When she complained to her bosses, they dismissed it as a minor technical issue, though colleagues with lighter skin told her they didn't have that problem. This is not unusual: facial recognition systems often perform worse for people of color.

She had to re-scan her face from three angles so she could get back to a job where she was often expected to review 70 documents an hour.

Facial recognition systems have become more common in a work-from-home world as employers look for a simple and secure way to monitor a distributed workforce. 

But to contract attorneys, they serve as a dehumanizing reminder that every second of their workday is probed and analyzed. After verifying their identity, the software evaluates their level of attention or distraction and boots them out of their work networks if the system thinks they’re not focused enough.

The software has raised broad questions about how companies treat their remote workforces, especially those, like contract attorneys, whose short-term gigs reduce their ability to lobby for change.  

“There’s always going to be a desire to control more of the workplace, just because you can … and because the cost of all the heavy-handedness comes down on the employee,” said Amy Aykut, a contract attorney in the D.C. area.

The monitoring is a symptom of “these pervasive employer attitudes that take advantage of these technologies to continue these really vicious cycles … that treat employees as commodities,” she said. “The irony in this situation is that it’s attorneys, who traditionally advocate for employee rights or justice when they’re made aware of intrusions like these.”

Keystroke tracking, screenshots, and facial recognition: They may well be here to stay.

Contract attorneys go through thousands of documents entered as potential evidence during a lawsuit, redacting sensitive information and highlighting relevant details lawyers may need while arguing a case. Law firms hire them on an as-needed basis — such as when a complicated lawsuit involves lots of internal records or emails — and terminate them when they are not needed.

Contract attorneys say their short-term contracts mean they work without benefits, at reduced hourly rates, and with no expectation of job security after the work is complete. What they do have is law school debt to pay off. They simply need the work.

The Washington Post spoke with 27 contract attorneys across the United States who had been asked to use facial recognition software while working remotely.  Most of them hadn’t expected anything like the facial recognition monitoring they’ve been asked to permit. The software uses a worker’s webcam to record their facial movements and surroundings and will send an alert if the attorney takes photos of confidential documents, stops paying attention to the screen or allows unauthorized people into the room. The attorneys are expected to scan their face when they join the network so their identity can be reverified minute by minute to limit potential fraud.

Not all of them minded. But many said the systems were error-prone and imprecise thanks to general weaknesses in facial recognition systems, which can show wild swings in accuracy based on factors such as a room’s lighting, a person’s skin color or the quality of their webcam.

Lawyers said they had been kicked off their work networks if they shifted slightly in their chairs, looked away for a moment or adjusted their glasses or hair. The systems gave them grief for innocent behaviors, such as holding a coffee mug (mistaken for an unauthorized camera) or listening to a podcast or the TV.

The constant interruptions have become a major annoyance in a job requiring long-term concentration and attention to detail. Several contract attorneys said they worried that their performance ratings, and potential future employability, could suffer solely based on the color of their skin. Loetitia McMillion, a contract attorney in Brooklyn who is black, said she’d started wearing her hair down or pushing her face near to the screen to keep the software from forcing her offline.

Some contract attorneys said they felt the burden especially affected people of color, who fill a greater portion of the short-term legal roles. People of color make up about 15 percent of all lawyers in the United States but about 25 percent of the “non-traditional track/staff attorney” jobs, which include contract attorneys, according to recent statistics from the American Bar Association and the National Association for Law Placement.

Contract attorneys are not alone. Delivery workers, call-center representatives and Uber drivers are increasingly assessed by face- or voice-analyzing software, which their employers say can help the companies verify worker identity, performance or productivity.

Verificient Technologies, one of the companies selling such work-monitoring software, also offers a similar “online proctoring” service that colleges use to monitor students during exams. The systems have caused some test-takers to urinate in their seats for fear of being punished or flagged as cheaters if they stepped away.

The company’s “on-demand monitoring” software, RemoteDesk, can track workers’ “idle” and “active” time; record their screens and web-browser history; discern their background noise for unauthorized music or phone calls; and use the webcam to scan a worker’s face or room for company rule-breaking activity, such as eating and drinking or “suspicious expressions, gestures, or behavior.”

Nada Awad, the company’s chief sales officer, said suspicious behaviors include working for too long without a break or looking away from the monitor for extended periods of time. In an online guide on “the ethical complexity of remote workforce monitoring,” the company wrote that its software identifies “various levels of deceit and misconduct based on the guidelines defined by the corporation.”

An example screenshot of the RemoteDesk interface for employers, which the company shared with The Post, logged every online activity a worker had done during the workday, with each classified as “productive” or “unproductive,” as well as an overall “productivity score.” It also showed data on total hours worked and a “webcam feed” that included snapshots of violations, such as when a worker opened a social media website, used their phone or blocked the camera’s view.
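For readers curious how such a “productivity score” might be calculated, the mechanics are typically simple: logged activities are matched against an employer-defined allowlist, and the score is the share of time spent on allowlisted activities. Here is a minimal illustrative sketch in Python — the activity names and the allowlist are hypothetical assumptions for the example, not RemoteDesk's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical allowlist of "productive" activities (assumed for illustration).
PRODUCTIVE_APPS = {"document_review", "email", "legal_research"}

@dataclass
class Activity:
    app: str        # application or website the worker used
    minutes: float  # time logged in this activity

def productivity_score(log: list[Activity]) -> float:
    """Percentage of logged time spent in allowlisted activities (0-100)."""
    total = sum(a.minutes for a in log)
    if total == 0:
        return 0.0
    productive = sum(a.minutes for a in log if a.app in PRODUCTIVE_APPS)
    return round(100 * productive / total, 1)

log = [
    Activity("document_review", 50),
    Activity("social_media", 5),
    Activity("email", 5),
]
print(productivity_score(log))  # 91.7
```

The crudeness of this kind of ratio is part of the problem attorneys describe: any activity the allowlist doesn't anticipate counts against the worker, regardless of context.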

Some attorneys said they worry that this is only the beginning for work-from-home surveillance. Call center workers in Colombia told NBC News in August that they had been asked to consent to in-home camera monitoring. Google and Microsoft already offer tools that employers can use to automatically gauge their workers’ productivity. And some companies, including Amazon, have considered monitoring workers’ mouse movements and keyboard strokes to detect impostors.

As the president of a company that provides IT and cybersecurity support to many law firms, I read the story with growing horror. Our experience has been that we were asked to install monitoring software only when a law firm suspected a “bad actor” – often someone suspected of purloining data or failing to secure confidential information.

But the Post’s report makes me concerned that the monitoring of all employees, not just law firm employees, may gain traction soon.

Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225, Fairfax, VA 22030
Email: Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology