The CFPB Curbs Worker Surveillance – Will the Government Live Up to Its Own Privacy Standards?

10/31/2024
The Consumer Financial Protection Bureau (CFPB) is warning businesses that the use of “black-box AI” or algorithmic scores about workers must be consistent with the rules of the Fair Credit Reporting Act. This means employers must obtain workers’ consent, provide transparency when data is used for an adverse decision, and make sure that workers have a chance to dispute inaccurate reports. That’s a good move for privacy, as far as it goes. The problem is, it doesn’t go nearly far enough, because the federal government doesn’t impose these same standards on itself.

First, PPSA agrees with the tightening of employers’ use of digital dossiers and AI monitoring. Whenever someone applies for a job, the prospective employer will usually perform a search about them on a common background-check site. It is not surprising that businesses want to know about applicants’ credit histories, to check on their reliability and conscientiousness, and to learn whether they have a criminal past.

But third-party consumer reports offer much more than those obvious background checks. Some sites, for example, are used to predict the likelihood that you might favor union membership. More invasive still are apps that many employers require new employees to install on personal phones to monitor their conduct and assess their performance. Decisions to reassign employees, promote or demote them, or fire them are increasingly coming from automated systems, machines that often lack context or key information.

Federal agencies, from the CFPB to the Federal Trade Commission, have not been shy about calling out such privacy violations by businesses for years now. Too bad our government cannot live up to its own high standards.

The government freely acknowledges that a dozen agencies – ranging from the FBI to the IRS, Department of Homeland Security, and the Pentagon – routinely buy the most intimate and personal data of Americans, scraped from our apps and sold by shadowy data brokers. The data the government collects on us is far more extensive than anything a commercial data aggregator could find. The government can track our web browsing, those we communicate with, what we search for online, and our geolocation histories. This is far more invasive and intrusive than anything private businesses are doing in screening applicants and monitoring employees.

Worse, the government observes no obligation to reveal how this data might be used to compile evidence against a criminal defendant in a courtroom, or whether agencies are using purchased data to create dossiers on Americans to predict their future behavior.

There is no equivalent of the Fair Credit Reporting Act when it comes to the government’s use of our data. But there is the Fourth Amendment Is Not For Sale Act, a bill that would require the government to obtain a probable cause warrant – as required by the Constitution – before inspecting our digital lives.

The Fourth Amendment Is Not For Sale Act passed the House this year and awaits action in the U.S. Senate. Passing it in the coming lame-duck session would be one way to remove the hypocrisy of the federal government on the digital surveillance of American workers, consumers, and citizens.