This post originally appeared on Friday, August 12 in my weekly newsletter.
After reading the Tesla stories this week that prompted Kevin’s story above, I tried to talk to my kid about our upcoming trip to visit some colleges. To prepare, my husband and I read The Price You Pay for College, a book that details the extensive data collection colleges perform as they try to determine how ready a family is to pay for college.
A summary of the practices can be found here, but in short, colleges take financial information and even email tracking-pixel data to gauge how interested a child might be in attending their school, as well as the family’s financial resources, then price the experience accordingly. After I shared this with my child, they tried to “game the system” by waiting several hours before opening emails from the colleges we planned to visit, hoping to muddy any interest signals.
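The pixel-tracking mechanism works because most email clients fetch images when a message is opened: if the sender embeds a unique, per-recipient image URL, the fetch itself tells the sender who opened the email and when. A minimal sketch in Python (the tracker domain and all names are hypothetical, not any college’s actual system):

```python
import datetime
import uuid

# In-memory log standing in for the sender's database:
# token -> recipient and list of open timestamps.
open_log = {}

def make_pixel_url(recipient_email):
    """Create a unique 1x1-image URL to embed in one recipient's email."""
    token = uuid.uuid4().hex
    open_log[token] = {"email": recipient_email, "opens": []}
    return f"https://tracker.example.com/pixel/{token}.gif"

def record_open(token):
    """Called when a mail client fetches the image: the sender now knows
    the message was opened, and when."""
    open_log[token]["opens"].append(
        datetime.datetime.now(datetime.timezone.utc)
    )

# Simulate sending an email and the recipient's client loading the image.
url = make_pixel_url("student@example.com")
token = url.rsplit("/", 1)[1].removesuffix(".gif")
record_open(token)
print(len(open_log[token]["opens"]))  # one recorded open
```

Delaying the open, as my child tried, only shifts the timestamp; the fetch still happens, which is why clients that block or pre-load remote images are the more reliable defense.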
I heard them describe their strategy, which made me sad. First, I’m not sure it really helps. Second, this is just one source of data about my child available to college admissions officers and their army of counselors. And finally, this is just one small corner of life where my child is tracked by a set of sophisticated and largely invisible tools as they go about their day.
And these are not isolated cases. I have agreed to share some of my FitBit data with third parties. My location data is available from Google and my mobile phone provider, and tracking pixels follow my emails and web searches. My connected devices regularly report on me to my family. When I bake cookies in my connected oven, my husband gets a notification. When I drive my Tesla around town, he can check the app to see where I am.
I showed my husband and child how I can see everything they ask Alexa in the app, something few people realize is shared with whoever owns the app. There are two problems here. One is that any connected device can expose data about the people living in a home with that device or using it. The other is that much of this data also travels to the device manufacturer, and from there can be shared with others, including law enforcement, advertisers and data brokers.
I address the first issue by regularly explaining the sensors and their capabilities to anyone who will be using the product and getting their consent. The second problem requires a policy solution. And it looks like we may get one.
This week, the Federal Trade Commission said it will begin a public process to develop rules that reduce “harmful commercial surveillance and lax data security.” On Thursday, the agency announced an advance notice of proposed rulemaking that will solicit public comment on the harms resulting from what it calls commercial surveillance and on whether new rules are needed to protect people’s privacy and information.
I believe we desperately need new rules. Just watching how constrained my child feels trying to navigate social media and apps that want to know more and more about them, and that put them in a demographic box while they’re still trying to figure out who they are, is painful. As they realize that such surveillance also affects the prices people pay through differential pricing, whether for college or concert tickets, I can see them getting frustrated.
Some of the solutions to this problem will come from private companies. Witness Apple’s move to block tracking pixels in email. I also expect we will see more messaging platforms adopt end-to-end encryption (a fact that will stress the government to no end) as they seek to avoid having to reveal incriminating information about users.
But the government has a huge role to play. We need rules that recognize the scope of the challenge and of the existing data markets. Data brokers are a huge, invisible force collecting highly personal information and selling it to the highest bidder. The uses buyers put it to can be merely annoying, such as ads that are surprisingly well targeted, or incredibly harmful, with the potential to get someone deported or charged with a crime.
Regulations to cut off access to such data will have enemies in Washington and in the private sector. The FTC wants to hear from consumers about the actual harms caused by commercial surveillance and plans to host a public forum on September 8 for additional comment. This is your chance to share your stories and concerns. At best, the FTC will enact new rules, but a real solution would also require Congress to act and tech firms to acknowledge that their business models leave consumers vulnerable to harm.
Then maybe Kevin won’t have to trade his privacy for convenience when he decides to buy a new car.