WASHINGTON, DC – The Subcommittee on Digital Commerce and Consumer Protection, chaired by Rep. Bob Latta (R-OH), and the Subcommittee on Communications and Technology, chaired by Rep. Marsha Blackburn (R-TN), held a joint hearing today on algorithms and their impact on consumers.
Chairman Latta kicked things off with a statement on why this hearing is so important for consumers, “Polls show that Americans feel technology has had a positive effect on our society, but they are also skeptical about how their personal information is used by major technology companies. As policymakers, it is our obligation to ask the tough questions and make sure consumers understand how their information is being used in our digitally driven economy. That is why we will explore today how personal information about consumers is collected online and – importantly – how companies use that information to make decisions about the content consumers see.”
In her opening remarks, Chairman Blackburn spoke on the topic of net neutrality and content providers, “In some very concrete ways, the open internet is more threatened by certain content management practices. These two-year-old FCC rules have not and cannot address these threats, so it is disheartening to see Title II regulatory advocates happily conflating the two to divert attention from who is actually blocking content.”
Full Energy and Commerce Committee Chairman Greg Walden (R-OR) discussed the new risks and challenges posed by the incredible growth of the internet economy, saying, “The smartphones we carry with us everywhere, the tablets we log on to, and the smart home devices in our kitchens all represent a transformational shift in how Americans gather information, receive news and content, and connect with friends and family. These services are convenient, efficient, and provide value and tangible benefits to American consumers. The companies behind the services have created jobs, and brought the U.S. into the forefront of technological innovation. In exchange for using certain websites or platforms, consumers are willing to share personal details about themselves – names, locations, interests, and more. The context of the relationship drives that exchange.”
Witnesses listen as members provide opening remarks
Dr. Omri Ben-Shahar, Leo and Eileen Herzel Professor of Law, University of Chicago, elaborated on his research regarding online privacy policies in his written statement, “Massive amounts of evidence show that people don’t read the disclosures and don’t use them to make more informed choices. In reality, disclosures are regularly ignored. They are an empty ritual. It is tempting to think that disclosures can be more effective if designed to deliver information to consumers in simpler formats. But simplifications, too, have been tried for decades and failed. My research shows that simplified disclosures about data privacy and security will have no effect on the behavior of consumers or the companies that collect their information.”
Dr. Michael Kearns, Professor and National Center Chair, Department of Computer and Information Science, University of Pennsylvania, answered Chairman Latta’s question about the risks that machine learning and algorithms can pose to consumers, “The use of machine learning allows one to make many inferences that are statistically quite accurate about consumers, that aren’t written down anywhere in the data about that consumer. … This is the kind of thing that’s hard for people to understand, and it’s even hard for the scientists at these companies to understand, this sort of predictive power that they have. When these models are built, they don’t really know a priori, and maybe even afterwards, exactly what properties of consumers, or inferences they’re making about them, go well beyond the latent data itself.”
Dr. Kate Klonick, Resident Fellow, Information Society Project, Yale Law School, commented on the issue of free speech with regard to online companies and consumers, “The free speech implications of the vast power of these platforms to self-regulate are twofold. One, it has a lot of implications for the users’ speech rights in how these private platforms can unilaterally control what goes up and what goes down on their sites. But also, these platforms have arguably free speech rights themselves. Their right to create the community at Facebook or at Twitter, for example, is arguably their own First Amendment right.”
Dr. Catherine Tucker, Sloan Distinguished Professor of Management Science and Professor of Marketing, MIT Sloan School of Management, discussed the tradeoff between consumers caring about their privacy and utilizing the benefits of many online services, “This is a so-called ‘privacy paradox.’ So many people say they care about privacy, but then act in ways that don’t live up to that. We did a little study at MIT where we showed that undergraduates were willing to share really very personal data in exchange for a slice of cheese pizza. What was slightly disconcerting about it was even the people who said that they really cared about privacy, they would usually behave in accordance with those norms, but the moment they saw the cheese pizza was the moment they were willing to share the most personal information. … We do see this inconsistency between the way consumers talk about their privacy and actually act in the online world.”
A background memo, witness testimony, and an archived webcast of the hearing can be found online here.