It’s like something out of dystopian fiction. A lone car creeps along a darkened street. High-tech scanners and surveillance software create a stream of data for every car passed, logging dates, times and coordinates and snapping photos. The software pings the driver every time it passes a car whose owner has defaulted on their loan.
The driver is not a thief. He’s not a police officer looking for a criminal. He’s not even a private detective. He’s a repo man. He frequents this neighborhood because the people who live here are lower income, many are people of color and many have been hit hard by economic woes. It’s easy pickings.
Nicole McConlogue, an associate professor of law, clinic director at WVU and consumer protection advocate, theorizes that practices like this are exacerbating economic barriers for marginalized people. Over the years, she has amassed enough evidence to make her argument — now published in the Stanford Journal of Civil Rights & Civil Liberties in an article titled “Discrimination on Wheels: How Big Data Uses License Plate Surveillance to Put the Brakes on Disadvantaged Drivers.”
Her findings are startling, and repo men are just the beginning. Auto financiers and insurers can use the data this technology collects to score consumers, sometimes based solely on where they live and the economic disparity of their neighborhoods, effectively limiting access to essential products and services. McConlogue calls this a revival of historic redlining practices.
What could this technology mean for consumer rights and privacy? McConlogue has some answers.
How did you find out about this type of surveillance? Who is using it?
I saw an article in the Washington Post about repo men using automatic license plate readers (ALPR) to skim lots of license plates with little effort. Now, instead of walking around with a flashlight looking at one car at a time, you could use a device almost like a radar gun to read all the plates for you as you drive by.
Something about this technological development unsettled me. Although people at any income level can and do default on loans, it would be too easy for Black, brown and low-income people to be targets — there would be no downside to circling neighborhoods where these folks live every single day. Among police, this tool is widely used to solve crimes and enforce parking restrictions, but law enforcement departments can also use it to track [and share data with organizations such as ICE on] immigrants and Muslims and to scrutinize low-income communities of color [according to reporting done at the “San Diego Union-Tribune,” the “L.A. Times” and the “Electronic Frontier Foundation” among others].
Advocates for privacy and fair policing are pushing back on the law enforcement side. And the police data is not public. But the consumer rights issues are flying under our radar. ALPR data could impact what interest rate you pay for auto financing or your auto insurance rate. Commercial uses of this technology show the same tendency toward discrimination as police and immigration uses of ALPR data. This data is finding its way into all kinds of automated decision-making processes that may impact the financial futures of consumers across the country.
Can you explain redlining and how this might be a revival of that practice?
Banks used to use color-coded maps provided by the federal government [specifically The Home Owners’ Loan Corporation, which operated from 1933 to 1954 (including the economically critical postwar period)] to determine which loan investments were riskiest. Red zones, where Black people lived, were considered “hazardous” and could only get very expensive and unfair loans, if they could get loans at all. The ability to purchase and improve real estate has been a direct pipeline for generational wealth, so this practice set down an enormous roadblock.
Today, ALPR technology has taken its place among other forms of big data, which data brokers use to enhance predictive analytics. First, those collecting the data, scanning the license plates, are often in businesses such as repossession. They are financially incentivized to surveil neighborhoods where they think people do not pay their bills, disproportionately monitoring poor, marginalized communities. This data is added to analytics programs that try to predict consumer behavior and “score” their creditworthiness. Race and class become an inevitable part of this data and very likely influence the results. Once businesses have this data, they can then draw potentially biased conclusions about consumers and adjust prices in a discriminatory manner, effectively putting things like auto financing and insurance out of reach and setting up new roadblocks.
Is anyone fighting back?
One of the biggest issues is that these businesses’ rate-setting formulas are trade secrets. So, you can’t even look at them and say, “Your algorithm is flawed.” States have been working to take certain pieces out of the equation, like marital status, and groups like Economic Action Maryland have developed a tool where consumers can pull rates from different insurers, change their sex, their marital status or their ZIP code and see how the rate changes. This kind of reverse engineering and crowdsourcing of information can reveal problematic patterns to consumers and, hopefully, push action.