“I love ads.” It wasn’t what I expected to come out of my friend’s mouth. I was having dinner with Magda and the discussion shifted at some point from work to AI to all things digital, and I was surprised by her admission, so I pressed her on it. She explained that she feels like her personal data is well used by the likes of Instagram—she regularly clicks on sponsored posts and feels like she’s able to discover brands that align with her interests—and LinkedIn and Gmail, which give her nudges and propose responses. Her upshot was essentially: take more of my data, abuse my privacy even, as long as you’re helping me discover and learn.
I couldn’t feel more differently. I definitely don’t love ads. I’m very public all day and like to control what I do and see in my private life. It can all feel really inhumane, and I don’t want my life to be watched and shaped by big corporations.
Sometimes it feels like that ship has sailed. Personal data is the most valuable thing many companies own. I love how a Scientific American article recently put it: We have become “infinitely knowable,” yet have little power over how our data is used. Some big voices are speaking up about that. Tim Cook got a lot of media attention in October when he called for the US to institute a federal privacy law and railed against the weaponization of data. “This is surveillance,” he said. “And these stockpiles of personal data serve only to enrich the companies that collect them.” He’s asking big questions about where we go from here.
But maybe there’s another one to be asked: How do we create more Magdas? In her view, Cook isn’t entirely right: Yes, her data might enrich other companies, but it also creates valuable discovery and enriches her own life.
In a world where the prevailing sentiment can be that ads are junk or Big Brother-ish, how do digital teams tasked with putting data to use change people’s perceptions? It’s a well-timed question. Companies have for the better part of the last decade invested in the infrastructure to collect our data, and the analysis and application of that data is now coming into play more significantly. But when the view is laser focused on increasing transactions, an opportunity is lost: to have a human-centred view rather than a purely customer-centred one. The difference? It’s about seeing people as people, not just as users of your product or service; the narrower view restricts your opportunities to create value for them, directly or indirectly.
We need to ask ourselves the questions that are not being asked often enough: Data for what purpose and what value? What does it really mean to give people value? How do we understand what the different values are for different people? Is there a way to be generous with data?
When I started asking these questions, I started noticing companies that were moving in the right direction on this front—and being transparent about what they’re doing. Tesco tracks what its 16 million Clubcard customers put in their shopping baskets, and in October it announced it’s going to use that data to try to influence its customers to make healthier decisions—this after learning that 7 out of 10 customers believe supermarkets could help them do this very thing. Sure, this could increase brand loyalty, or be manipulated to push certain products for financial gain. But Tesco identified a real need: 1 in 4 of its customers say they get confused about which foods are healthy and unhealthy.
Clorox, for its part, didn’t have data of its own. But Kinsa did. It’s a tech start-up whose internet-connected thermometers are in half a million US homes. Clorox licensed the resulting data, which didn’t contain personal information but was broken down by zip code—allowing the company to see, in real time, where fevers were spiking around America and where flu and flu-like illnesses might be striking. Clorox then curtailed its ad spend in healthy areas and increased it in those afflicted locations, pitching relevant products like disinfecting wipes. The two partnered on this last winter, and Kinsa says Clorox’s ads were interacted with 22% more often as a result.
These efforts feel a little more useful and value-driven while still being profit-minded. They haven’t gotten me feeling like a Magda just yet. But it’s possible I could get there, so long as those in a position to abuse user data choose instead to use it in the right way.
The next time you’re tasked with designing, building, or iterating a service, ask yourself these three questions before you proceed:
- Have you considered that each data point is actually a person?
- Have you contemplated how the data could be used to truly benefit that person (from their perspective, not yours)?
- Have you thought about your role in clearly conveying to the user that they’re getting a benefit in exchange for the precious data they’re sharing with you?
This is part of Signals, Winter 2018