Everyone craves downtime. For many people juggling full-time jobs and numerous obligations, the only way to get enough leisure time is to find life hacks—services like TaskRabbit, Stitch Fix, Asana Rebel, and Blue Apron.
That’s why advertising has the power it does. It’s easy to be swayed by ads that claim you’ll be fitter, smarter, happier, and more relaxed. And personalized advertising works even better because it’s relevant and timely. According to a 2017 Epsilon survey, 80% of respondents say they are “more likely to do business with a company if it offers personalized experiences.”
But that personalization can feel creepy. For example, while people scroll through their Instagram feeds, personalized ads pop up between the photos of cats, kids, sunsets, and food. The recommendations might be right on the nose and could save people a lot of time, but there’s a catch: Consumers are being used like bait.
Companies that depend on the advertising model for profit are giving preferential treatment to third parties, making certain recommendations more visible than others because they're paid ads. Worse, the inundation of advertising disrupts people's days while harvesting untold amounts of their information.
Without legislation to counterbalance the advertising model, it will consume everyone's privacy. People who opt for free services from Google, Facebook, Instagram, and others may be selling more of their soul than they realize. That cost only bubbles up to the public consciousness when data is misused, as in the Cambridge Analytica/Facebook scandal.
Customer First, Company Second
People have a baseline right to certain protections of their data. The silver lining of data breaches and unsanctioned uses of data is that they are building momentum in Congress for national data-protection legislation. In May, Europeans' right to personal data privacy was significantly strengthened under the General Data Protection Regulation (GDPR). But companies all over the world should tell people what data they're collecting and why they're collecting it, and they need to give customers greater control over what they do with that data.
Shortly after GDPR went into effect, California finalized its own legislation, enacting the California Consumer Privacy Act (CCPA). The bill, which establishes significant privacy rules for companies doing business with California consumers and incorporates many principles of GDPR, is set to go into effect in 2020.
But while the CCPA has the right intention, it was drafted hastily, without clarity or nuance. And even if it had been more fleshed out, data-protection legislation needs to go beyond the state level. If every state defines privacy differently, companies face 50 different laws to comply with. Large organizations with teams of lawyers can handle a regulatory environment that complicated, but it's much harder for smaller companies to manage. That stifles innovation by edging out smaller companies that can't scale and compete.
Federal legislation would level the playing field because it would make it easier and less expensive for all tech companies to comply. Recently, the BSA software-industry association released the BSA Privacy Framework to tee up key issues that federal legislation must address. The country needs a national law that gives people control, transparency, security, and consistency—not some big, blanket bludgeon that fixes issues with Facebook but unduly harms other companies.
Trust Equals Confidence and Opportunity
While national legislation will make it easier for all companies to compete and innovate, it will also build consumer trust. Enforceable law empowers and protects consumers. And when customers feel safe—when they have control over their data and can move, share, or delete it as they wish—they're more likely to participate in data activities that actually add value for them and for the world.
It also catalyzes innovation because people are more comfortable and confident participating. Once customers understand what they gain by opting in to particular services or what they lose by opting out, they can make better decisions that aren't based on fear or mistrust.
For instance, I trust 23andMe. I get value from its health-and-ancestry services, and it's transparent about the information it collects and how it's used. If I opt in to 23andMe Research, the company anonymizes my DNA and adds it to a pool of customer data with the goal of making scientific breakthroughs.
And my DNA and personal info are not sold or rented to third parties without my consent; I know 23andMe won't monetize my data for someone else's benefit. I don't trust Google, though it could earn back my trust if my data remained my own and I weren't being targeted and bartered off to someone else. But I have a relationship with 23andMe that I trust, so I gladly consent to its use of my encrypted data. Time will tell whether that trust is misguided, but—for now—it's there.
Saying Yes to Inspiration and Innovation
If enacting stricter national privacy laws (in the EU, US, and elsewhere in the world) means companies don’t make as much money through the ad model and have to find other ways to make a profit, so be it.
With transparency, companies that aren’t on a mega-enterprise level can more easily highlight the economic benefit to the consumer and provide them with more innovative services. For example, CEO Katrina Lake launched Stitch Fix, a personal style service, with the explicit intent to be customer-centric, focused “on personalization and building a human connection scalably.”
Combining data science with human interaction, Stitch Fix personalizes fashion to a person's style and body type. The more you work with a particular stylist, the better he or she gets to know you, sending you clothing you'll like and want to keep.
Legislation that gives consumers more control can create a breeding ground for more customer-centric companies like Stitch Fix. Because you should be able to opt in to recommendations but opt out of being profiled for advertising. And even if you don't build a direct relationship with a human, services with recommendation engines can help you find things you like faster—saving you the pain of blindly searching online. It seems to work for Netflix subscribers: 80% of the content people watch on Netflix comes from its recommendation algorithm.
What if you could opt in to receive retail recommendations you knew you could trust, without any preferential bias? If you were served up relevant suggestions rather than paid ads, it wouldn’t feel like an intrusion, would it? I’d rather get recommendations that aren’t influenced by third parties or advertisers.
Meanwhile, the recommendation engine is valuable to the companies selling something if they can convince enough people to buy their products. If they’re creating a good product that people want and not just paying for ads to make sure their recommendations show up first, that’s better for the consumer.
At the end of the day, companies need to move in that direction because national privacy legislation is picking up steam. The goal of legislation should be to incentivize companies to use data to return value rather than just extract it. Then companies will need to find ways to use data that serve, rather than only monetize, their customers.
If companies are transparent and give customers control of their data, the opportunities for innovation will be ripe for the picking. And maybe that innovation will help people take a vacation once in a while.