Companies on the Internet are tracking you with vastly powerful Big Data algorithms to determine what to sell you, for how much, and what financial opportunities (loans, accounts, credit cards) to offer you. Today at 10am, I join an FTC workshop on Alternative Scoring Products to debate the transparency and fairness of the system. Privacy and technology experts from industry, academia and the public interest will also participate. They include Professor Joe Turow of the Annenberg School for Communication at Penn, author of The Daily You: How the New Advertising Industry is Defining Your Identity and Your Worth (link is to an interview on NPR's Fresh Air with Terry Gross), and Ashkan Soltani, a privacy and technology consultant (formerly to the FTC) who also helped the Wall Street Journal crack behavioral tracking, scoring and dynamic pricing codes for its award-winning "What They Know" series. You can attend in person, watch online, or view the archived webcast after the event.
The FTC has published a short blog post explaining predictive scoring. Along with Jeff Chester of the Center for Digital Democracy, we have submitted a detailed comment (scroll down to see the full pdf) explaining our views. Only a few other comments have been posted so far, but you can file your own comments (due by 21 April) after you watch the workshop.
For many years, financial marketing based on your financial profile (your credit report) was conducted under the strict "prescreening rules" of the Fair Credit Reporting Act of 1970. Companies could use your financial profile only for credit or insurance marketing (not general targeted or direct marketing), and if you replied to the offer, they had to make a "firm offer of credit." Importantly, you have the right to say no, or opt out, of this use of your credit report to protect your privacy.
But in the new Big Data world, the firms claim that their uses of information don't "determine eligibility" and so don't trigger your FCRA rights; so, they assert, they don't have to give you any disclosure of their practices or the right to say no. They claim this even though their financial marketing offers -- essentially prescreening on rocket fuel -- are now being made in real time (in hundredths of a second) and aren't based only on their compilation of financial data. Their algorithms may also include the powerful new added dimension of your location (from your mobile phone), as well as other data bits, all analyzed together in milliseconds to decide where to lead you, what to show you, and what to offer you. In our view, the vast majority of these instant decisions are still based largely on information that amounts to financial profiling and are designed to encourage you to accept financial offers, whether good or bad for you.
For example, some of the non-transparent, deceptive pages you are led to on the Internet when you type "I need a loan" may appear to be lenders, but aren't. These websites are actually "lead generators" that ask you a few questions to determine your value and then auction you off to the highest bidder, often an online payday lender or for-profit school. Lead generators are the target of numerous enforcement inquiries, including in New York.
In a nutshell: consumers will be harmed if the protections of a regulated prescreening system for financial marketing are diluted by a switch to scores generated by largely unregulated Internet algorithms, built by sharing cookies and other tracking bits across a vast, interconnected network of business-to-business firms that consumers neither know about nor do business with.
As the authoritative National Consumer Law Center pointed out in its recent report on "Big Data," the Big Data algorithms, which may not be accurate or have any relation to your creditworthiness, may also implicate your rights under the Equal Credit Opportunity Act, which prohibits discrimination on the basis of gender, race, age or other factors. Further, fellow workshop panelist Pam Dixon of the World Privacy Forum will point out today that some of the algorithms used by some players in the Big Data universe target consumers based on "vulnerability-based marketing."
Here is a summary of the joint U.S. PIRG/Center for Digital Democracy comment:
The growing use of so-called “e-scores” — a form of online rating invisible to the consumer — can help determine our creditworthiness, “lifetime value,” or even the prices we pay. These e-scores can be used to blacklist or engage in discriminatory practices against individuals or even groups of consumers. We are aware that numerous online scores are being generated for a variety of generally non-controversial uses, including predicting identity theft or fraud. However, we remain concerned that the largest and most important uses of online scoring are to substitute for the highly regulated prescreening regime that for years has governed the use of consumer credit reports for marketing purposes. Its proponents claim that the files developed are not on individual consumers but on clusters of consumers, and that online scores are simply a method for establishing audiences for serving ads. They assert that scores and other products identifying consumers on an aggregate basis (which for them means information narrowed to a small cluster of households at the ZIP+4 level), or not naming consumers by name, are not subject to Fair Credit Reporting Act (FCRA) regulation. We disagree with these representations and commend the FTC for its inquiry.
CDD and U.S. PIRG also explain our views in a recent Suffolk University Law Review article, "Selling Consumers, Not Lists." Probably the best news resource on e-scores is the 2012 New York Times article by Natasha Singer, "Secret E-Scores Chart Consumers’ Buying Power."
I am looking forward to the debate. You can attend or watch online.