We generally think of credit scoring as benign, while something like China’s Social Credit system is at best creepy—and at worst a totalitarian nightmare. But what if credit reporting agencies and similar companies are gathering data on you and scoring you in ways that are neither as transparent nor as regulated as credit reporting is (at least in the United States)?
The linked op-ed is interesting as a quick survey of all the ways in which familiar companies like TransUnion (one of the three major credit reporting companies) and unfamiliar ones like CoreLogic, Retail Equation, and Sift gather information on us, score it, and sell it to companies we do business with. Those companies use the scores, without our knowledge, to decide (for example) whether to accept a return on a purchase or whether to rent an apartment to someone. The authors analogize the current state of this secretive surveillance scoring to the early days of credit reporting, before (in the United States) the Fair Credit Reporting Act and related legislation made credit reporting more transparent and redress easier to obtain for those injured by mistakes or abuse. Perhaps we in the West are closer to something like the Social Credit system than we think.
LINK: Data isn’t just being collected from your phone. It’s being used to score you. (by Harvey Rosenfield and Laura Antonini for The Washington Post [via Houston Chronicle])
Operating in the shadows of the online marketplace, specialized tech companies you’ve likely never heard of are tapping vast troves of our personal data to generate secret “surveillance scores” – digital mug shots of millions of Americans – that supposedly predict our future behavior. …
CoreLogic and TransUnion say that scores they peddle to landlords can predict whether a potential tenant will pay the rent on time, be able to “absorb rent increases,” or break a lease. … Other employers use Cornerstone’s score, which considers where a job prospect lives and which web browser they use to judge how successful they will be at a job. …
Surveillance scoring is the product of two trends. First is the rampant (and mostly unregulated) collection of every intimate detail about our lives, amassed by the nanosecond from smartphones to cars, toasters to toys. … The second trend driving these scores is the arrival of technologies able to instantaneously crunch this data: exponentially more powerful computers and high-speed communications systems such as 5G, which lead to the scoring algorithms that use artificial intelligence to rate all of us in some way.
The result: automated decisions, based on each consumer’s unique score, that are, as a practical matter, irreversible. … It is mostly impossible to know when one has become the casualty of a score, let alone whether a score is inaccurate, outdated or the product of biased or discriminatory code programmed by a faceless software engineer. There is no appeal. …
What do you think?