Recently, the New York Times had an article on the new trend of e-scores.
Companies gather data on you and calculate your value as a customer. As I have blogged before, and as the Times notes here, this can create a new kind of discrimination:
Federal regulators and consumer advocates worry that these scores could eventually put some consumers at a disadvantage, particularly those under financial stress. In effect, they say, the scores could create a new subprime class: people who are bypassed by companies online without even knowing it. Financial institutions, in particular, might avoid people with low scores, reducing those people’s access to home loans, credit cards and insurance.
The Times explains:
Here’s how eScores work:
A client submits a data set containing names of tens of thousands of sales leads it has already bought, along with the names of leads who went on to become customers. EBureau then adds several thousand details — like age, income, occupation, property value, length of residence and retail history — from its databases to each customer profile. From those raw data points, the system extrapolates up to 50,000 additional variables per person. Then it scours all that data for the rare common factors among the existing customer base. The resulting algorithm scores prospective customers based on their resemblance to previous customers.
EScores might range from 0 to 99, with 99 indicating a consumer who is a likely return on investment and 0 indicating an unprofitable one. But in some industries, “knowing the bottom is more important than knowing the top,” Mr. Meyer says. In online education, for instance, scores help schools winnow prospective students who are not worth the investment of expensive course catalogs or attentive follow-up calls — like people who use fake names or adopt the identities of relatives.
“If we can find 25 percent who have zero chance of enrolling, we can say ‘don’t waste your money on them,’ ” he says.
EBureau charges clients 3 to 75 cents a score, depending on the industry and the volume of leads.
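To make the Times' description concrete, here is a minimal sketch of that pipeline in Python. Everything in it is illustrative: the function names, the crude per-trait conversion-rate model, and the resemblance metric are my assumptions, not eBureau's actual (proprietary) algorithm. It shows the three steps the article names: learn common factors among past converters, score prospects 0 to 99 by resemblance to them, and winnow the bottom fraction.

```python
# Hypothetical sketch of the scoring pipeline the Times describes.
# Not eBureau's real algorithm; names and the similarity model are invented.
from collections import defaultdict

def train(leads):
    """leads: list of (features_dict, converted_bool) for past sales leads.
    Returns conversion rate per (feature, value) pair -- a crude stand-in
    for finding 'the rare common factors among the existing customer base'."""
    counts = defaultdict(lambda: [0, 0])  # (feature, value) -> [converted, total]
    for features, converted in leads:
        for key, value in features.items():
            counts[(key, value)][1] += 1
            if converted:
                counts[(key, value)][0] += 1
    return {fv: conv / total for fv, (conv, total) in counts.items()}

def escore(model, features):
    """Score a prospect 0-99: average conversion rate over their known traits,
    so a 99 resembles past customers and a 0 resembles past non-converters."""
    rates = [model[(k, v)] for k, v in features.items() if (k, v) in model]
    if not rates:
        return 0
    return round(99 * sum(rates) / len(rates))

def winnow(model, prospects, cutoff_pct=25):
    """Drop the lowest-scoring fraction -- the '25 percent who have zero
    chance of enrolling' in Mr. Meyer's example."""
    scored = sorted(prospects, key=lambda p: escore(model, p))
    drop = len(scored) * cutoff_pct // 100
    return scored[drop:]
```

In practice, of course, the real systems extrapolate tens of thousands of variables per person and fit far more sophisticated models; the point of the sketch is only that the mechanism is ordinary supervised learning on whatever data about you can be bought.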
Obviously, as time goes by, more and more online data — one's use of social media, blogging, contributions to political parties, and so on — will get fed into the algorithm, and no good deed will go unpunished.
Is there any ray of light in this? Such activity can obviously be regulated — indeed, the Times article describes the great lengths these companies go to in developing algorithms that do not fall afoul of the laws regulating credit reporting.
But, more optimistically, let's suppose a society in which we were serious about helping those under challenge from financial and other crises. The very algorithms that show who is undesirable as a customer probably also show who is in need of help — and who, if government invested in helping them, might ultimately have less need for governmental services. It's surely hard to imagine such an approach being popular today, but it is at least one way of thinking. Until then, we urgently need serious consideration of regulation.
Richard–I agree with you
Sorry, I got cut off. I don’t think that any of the disclaimers/waivers on these sites include enough information on what they are collecting and how they are using it to create tools/analytics that will lead to a more credit-segregated society. To the degree that they are using public resources (the licensed-airwaves equivalent in the online Wi-Fi context, through the FTC), there should be some requirement that those using the equivalent of the public airwaves (the old radio analogy) not use those national resources to bring back segregation and discrimination in new ways/manifestations. At a minimum, I don’t think that consumers are being provided enough information for their acceptance of these policies to rise to the level of informed consent. Who expects to have 50,000 other data fields associated with their cyber identity when they buy a book online, download the next version of Angry Birds for the kids, etc.? The expectation that consumers will rise up and demand better will probably not pan out. So the market won’t resolve these issues, and regulators and advocates need to monitor these types of practices, for a lot of good reasons, including civil rights reasons.