The emergence of big data is opening up new, previously unthinkable opportunities every day. Especially when the data concerns people, their behavior, preferences and relationships, it is highly valuable because of what it can be used for.
There are people who don’t take privacy seriously and claim, “I’ve got nothing to hide.” While that may be true, to me it seems irresponsibly ignorant of what a person’s digital footprint can reveal about them.
Just a few years ago, a group of researchers led by Michal Kosinski revealed how data from Facebook Likes could be used to infer several things about a person. With good confidence, attributes such as one’s gender, sexual orientation, race, religion, smoking, drug use, alcohol consumption and many other factors could be inferred from public data. Soon after publication, Facebook made this data private, but the tool devised by the University of Cambridge to generate your psychological profile, with your permission, is still available.
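The core idea behind such trait inference is simple: treat each Like as a binary feature and fit a predictive model against a known trait. The toy sketch below, on synthetic data, is only an illustration of that idea, not Kosinski et al.’s actual code, data or model (their published method also involved dimensionality reduction):

```python
import numpy as np

# Toy illustration of inferring a binary trait from a user-by-page
# "Likes" matrix with logistic regression. All data here is synthetic.
rng = np.random.default_rng(0)

n_users, n_pages = 500, 40
likes = rng.integers(0, 2, size=(n_users, n_pages))  # 1 = user liked page

# Synthetic ground truth: the first five pages are predictive of the trait.
true_w = np.zeros(n_pages)
true_w[:5] = 2.0

def sigmoid(z):
    # Clip to avoid overflow in exp for extreme logits.
    return 1 / (1 + np.exp(-np.clip(z, -30, 30)))

trait = (rng.random(n_users) < sigmoid(likes @ true_w - 5.0)).astype(float)

# Fit logistic regression by plain gradient descent.
w, b = np.zeros(n_pages), 0.0
for _ in range(2000):
    p = sigmoid(likes @ w + b)
    w -= 0.5 * likes.T @ (p - trait) / n_users
    b -= 0.5 * np.mean(p - trait)

pred = sigmoid(likes @ w + b) > 0.5
accuracy = np.mean(pred == trait)
print(f"training accuracy: {accuracy:.2f}")
```

Even this crude model recovers the trait well above chance, which is the unsettling point: with thousands of real Likes per person, far richer signals are available.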
When it comes to big behavioral data such as Facebook’s, there is a world of opportunities for its use in marketing. I’ve personally enjoyed the great targeting opportunities Facebook offers to anyone out-of-the-box. If you create your own custom audiences (facilitated by their tracking Pixel, for instance), targeting becomes even more specific and powerful.
In the past months, a somewhat questionable use of these powers has come into the spotlight after the seemingly surprising, and tantalizing, victory of US Republican presidential candidate Donald Trump. An excellent story on the topic by Grassegger and Krogerus inspired me to voice my opinion about the use of data in influencing people. In Trump’s case, the story comes down to one particular company.
The questionable data deals
Cambridge Analytica is one such company that drives behavioral change through various marketing channels. They excel at skillfully mixing behavioral data from all possible channels, not just Facebook but also email, television, credit card companies and so forth, to profile individuals. In the US, much of this data is available for sale, thanks to the many data-as-a-service (DaaS) providers out there. Cambridge Analytica’s CEO Alexander Nix states in their marketing video that during the election campaign:
We hold 5000-6000 data points on each US individual
Given roughly 320 million inhabitants, that is big data: at 5,000 data points per person, it adds up to some 1.6 trillion data points. There is also a question about the fairness of acquiring this particular data, including speculation over whether Cambridge Analytica in fact had consent from the Facebook users involved. Then again, data by itself is of no value.
To get value, they segment the full audience into smaller psychographic segments, relying on the five-factor personality model known as OCEAN (also called the Big Five). Once they know who you are, they add probabilistic estimates of the desired activity (such as voting for a presidential candidate or for Brexit) and select the most promising segments for targeting with different types of marketing messages in every channel imaginable, including door-to-door campaigners.
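As a rough illustration of that segmentation logic, one could bucket people by their dominant OCEAN trait and rank the buckets by the modeled probability of the desired behaviour. The sketch below is hypothetical and greatly simplified; Cambridge Analytica’s actual models are not public:

```python
from dataclasses import dataclass

# Hypothetical sketch of psychographic segmentation: group people by
# their dominant OCEAN trait, then rank segments by the average modeled
# probability of a desired action (e.g. turning out to vote).
TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

@dataclass
class Person:
    ocean: dict       # trait name -> score in [0, 1]
    p_action: float   # modeled probability of the desired behaviour

def dominant_trait(p: Person) -> str:
    return max(p.ocean, key=p.ocean.get)

def rank_segments(people):
    segments = {t: [] for t in TRAITS}
    for p in people:
        segments[dominant_trait(p)].append(p)
    # (trait, average action probability, segment size), best first.
    scored = [(t, sum(q.p_action for q in ms) / len(ms), len(ms))
              for t, ms in segments.items() if ms]
    return sorted(scored, key=lambda s: s[1], reverse=True)

people = [
    Person({"openness": 0.9, "conscientiousness": 0.4, "extraversion": 0.5,
            "agreeableness": 0.3, "neuroticism": 0.2}, 0.7),
    Person({"openness": 0.2, "conscientiousness": 0.8, "extraversion": 0.4,
            "agreeableness": 0.5, "neuroticism": 0.3}, 0.4),
    Person({"openness": 0.3, "conscientiousness": 0.9, "extraversion": 0.2,
            "agreeableness": 0.6, "neuroticism": 0.5}, 0.6),
]
for trait, avg_p, size in rank_segments(people):
    print(f"{trait:18s} avg p(action)={avg_p:.2f} n={size}")
```

The top-ranked segments would then receive messaging tailored to their personality profile, which is where the real (and really questionable) leverage lies.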
There is a video of Nix talking about their work for the other US 2016 Republican presidential candidate, Ted Cruz, who rose from the margins to become Donald Trump’s main competitor within the party. After Cruz lost, Trump invested heavily in Cambridge Analytica’s services.
Marketing people as they are, Cambridge Analytica like to attribute Trump’s shock victory to their unique psychological targeting over social media. The aforementioned Motherboard story reveals more questionable details about the actual marketing messages and strategies that were applied on top of the segmentation, but it is evident that they were as effective as they were questionable.
Although the most recent and still topical, this is not the only case in which experimenting on unknowing users, or otherwise questionable use of their data, has made the headlines.
Facebook researchers have faced two such events of *it hitting the fan. In the first case, they were accused of meddling with a US congressional election by nudging some people to vote. In the second, Facebook’s study of emotional contagion, which manipulated the transfer of happy or sad moods between users, was questioned. The companies have, I think rightfully, defended their practices, even though the problem is hardly settled for good.
Naturally, this is not only about Facebook. Many other companies monetize their user data in a DaaS manner even more aggressively. It has been known for some time that one business model for ‘free’ mobile apps is to sneak out users’ phonebooks and compile a huge registry that is later sold.
This unethical data collection and use is not constrained to personal phones and computers. Most recently, the American TV manufacturer Vizio was found guilty of spying on the viewing habits of 11 million smart TV owners and brokering the data onwards, possibly coupled with demographic information, all without consent. This resulted in a rather modest $2.2 million payment to the Federal Trade Commission in February 2017.
Ethical uses of data in design
Data-driven design is one approach to utilizing data to achieve goals. In my writings (e.g.) I like to associate data-driven design with product and service design. Design for behavioral change, as in the case of elections, can also be a subject of data-driven design.
But the nature of the goals is important. Many people working under the title of designer like to see themselves as working in the best interest of the user or customer. Of course there are conflicts, but I’d like to believe that most designers genuinely empathize with users and the other humans affected by their designs (at least when not designing weapons of mass destruction or genocide). Placed on the bipolar political map of the US, I’d expect the centre of mass of designers to lie towards the left.
When it comes to data-driven design, I’ve repeatedly been asked how it relates to similar approaches such as growth hacking and conversion optimization. I tend to find the differences, and define the unique identity of data-driven design, through goals, methods and ethics. My interpretation of the situation is summarized below:
|                   | Growth hacking         | Conversion optimization    | Data-driven design            |
|-------------------|------------------------|----------------------------|-------------------------------|
| Home BU           | Marketing              | Sales                      | R&D                           |
| Goals             | More money / customers | More profits / revenue     | Any change*                   |
| Mindset & values  | ?                      | Business                   | Design                        |
| Question / slogan | ?                      | How will we improve a KPI? | How will we do things better? |
In short, I see that designers have a different set of values and intentions. Designers are not going to be happy just satisfying typical sales or marketing goals. Marketing drives behavioral change for the benefit of the promoted product or service, and it can do so without touching the product at all. The same goes for conversion optimization.
Keeping data-driven efforts ethical
The above presentation captures my ideal view. Naturally, it is up to every designer to live up to high moral standards. This is not easy.
At the Interaction’15 conference in San Francisco, several presentations discussed so-called ‘black projects’. The term possibly originates from companies such as IDEO which, with the glorious growth of their business, have gotten involved in more projects for undisclosed clients with unknown goals. Designers tend to get nervous about this type of work even when the secret goals are not necessarily at odds with their own ethical orientation.
So even though design is not a unified profession and does not swear a Hippocratic Oath to maintain professional integrity, designers will face similar, and possibly even more serious, moral dilemmas in dealing with data. A data-driven designer’s oath might actually become good practice if things continue to develop as they currently do.
When it comes to data, there are at least four points of ethical candor to look out for when dealing with human-related data:
In the future, I see a need for a digital equivalent of Fairtrade for data brokering, to ensure that issues of user privacy and consent are properly dealt with. This calls for a new type of data transparency reporting on behalf of companies.
Text: Lassi A. Liikkanen, @lassial, Data-Driven Design Specialist, SC5
Featured image: Sami Rouhiainen, SC5