Authors: John Havens
Blabdroid isn’t intended to be manipulative, however. As Hoff pointed out in our interview, the experiment is designed to provide
an emotional outlet for people based on deeper questions than modern entertainment typically addresses:
Instead of watching a reality TV show, we’re interested in what kind of emotional reaction people will have with a little robot. Despite a relatively low level of artificial intelligence, people are having phenomenally emotional experiences. And isn’t that the point? Do robots have to be incredibly smart to make our lives better? No, they just have to be designed right and fit.
11
Hoff’s documentary The Love Competition also explores the intersection of emotions and machines. Seven volunteers met with Stanford University neuroscientists, who measured their brain patterns in an MRI machine. Volunteers were asked to vividly imagine their experiences with a current or past love; a winner would be determined by the output of brain activity focused on emotion. The results are described by Angela Watercutter in the Wired article “Neuroscientists Measure Brain Activity in Love Competition,” where she points out that, based on physiological results (levels of dopamine and serotonin activity), one person can demonstrably love someone more deeply than another can.
12
Having watched the video myself, I found the effect the experience had on competitors more powerful than the empirical evidence: they expressed deep emotion after leaving the MRI, many of them almost in tears. And in this case, the ELIZA effect of the MRI machine is less overt than with Blabdroid, but still just as poignant: In the presence of robots or technology, people will willingly, or inadvertently, express emotions that would have stayed hidden otherwise. Fueled by a sense of freedom to express sentiment that might be construed as inappropriate or questionable by humans, people open up to machines. Even though they know they’re doing it.
In our interview, Reben offered a final insight about people’s responses to artifacts engineered by humans, making a powerful point about the nature of some of our oldest companions:
A lot of people have fears about artificial intelligence and social robotics. They think, if I get a robotic animal as a pet, won’t that be bad? I’ll be replacing social connections with technology. Newsflash—we’ve had this precedent for eons. It’s called a dog. Dogs have been technologically bred for generations through genetic selection to be our companions. Carbon or silicon, sometimes we need to vent our emotions on something that’s nonjudgmental.
13
Reflections
Mirrors aren’t always fun. In light of how we’re looking at ourselves, we may smile and love what we see. Or we may view ourselves through a lens of criticism, noting every blemish. Quantified self and the Internet of Things provide multiple ways to reflect on our humanity. They also let others peek from behind our shoulders and see us in ways we didn’t recognize before.
Being accountable in the Connected World with its multifaceted mirrors doesn’t need to be scary, just informed. The tools involved, like ELIZA, can provide catharsis rather than criticism on your journey to optimization. But as the pace of technological change increases exponentially, you can’t afford to linger at the glass without embracing your digital identity. Privacy isn’t dead, but it requires being proactive—be accountable so the identity you broadcast is the one you mean to project.
[PART]
2
Be a Provider
BROADCASTING VALUE IN THE PERSONAL DATA ECONOMY
I WANT YOU TO GET UP RIGHT NOW AND GO TO THE WINDOW. OPEN IT, AND STICK YOUR HEAD OUT, AND YELL, “I’M AS MAD AS HELL, AND I’M NOT GOING TO TAKE THIS ANYMORE!”
—Howard Beale, in the film Network
8
BIG DATA
Courts have recognized celebrities’ claims to a property interest in their name and fame to seek compensation whenever such an image is used for a commercial purpose. Why not extend such a property interest to the personal data of ordinary individuals? For, with the advent of digital technologies, hasn’t personal data of us all become an asset that is worth real money?
1
CORIEN PRINS
This isn’t a book about getting angry. But it is a book about Hacking H(app)iness, which involves reevaluating ideas about the ways you measure what you value in your life. And like Howard Beale from the movie Network, I think your life has value. And in the Connected World, that value is fiscal as well as inherent.
Much of the debate around privacy with new technologies doesn’t stem from ethics, but economics. When I say you have a right to privacy no matter what your preference, I’m also saying you have a right to your money. Your currency. The stuff you put in a bank.
Nonetheless, in relation to privacy issues, it’s common to hear phrases like “But kids these days don’t care about privacy—they’re used to sharing their pictures on Facebook and grew up using social media.” First off, the scope of these statements is simply absurd. Not all “kids” feel the same about privacy, and most teens are more aware of how to set their privacy controls than many adults. Second, once you become aware that data regarding people’s identity is being sold, stop making the conversation about privacy. Make it about economics and see the reaction.
Old Conversation
CONCERNED ADULT:
Don’t you care you’re giving private data away to brokers?
SAMPLE YOUTH:
Not if they give me a coupon or whatever.
New Conversation
CONCERNED ADULT:
Don’t you care you’re giving $1,200 per year away to data brokers in exchange for a few coupons?
SAMPLE YOUTH:
Why don’t I get any of that money?
Alexis C. Madrigal elaborated on this point in the Atlantic:
In a survey by Carnegie Mellon’s Lorrie Cranor and Stanford’s Aleecia McDonald, only 11 percent of Americans would be willing to pay a dollar per month to withhold their data from their favorite news site. However, 69 percent of Americans were not willing to accept a dollar discount on their Internet bills in exchange for allowing their data to be tracked. That is to say: if people think data is already flowing to a website, few would pay to hold it back . . . The companies making the data-tracking tools have serious incentive to erode the idea of privacy not just because they can make (more) money, but because privacy erosion leads to more privacy erosion.
2
We haven’t all become complacent about privacy simply because of our preferences toward technology; it’s also because the people who stand to lose money if we owned our data don’t want us to cut into their profits. While the value of a person’s data depends on things like their age, where they live, and how much time they spend online, keep this point clear in your mind: Other people make more money off of your data than you do.
As the Internet advertising model shifts to accommodate consumer awareness of the personal data economy, we also need to be accountable for our actions regarding payment of content providers. The 11 percent of Americans willing to pay one dollar to withhold their data may be opting to pay that dollar to the news site as an exchange of value. Content providers need to pay bills like anyone else, so many offer visitors the chance to pay in exchange for an advertising-free environment. We’ve been trained, however, to know we can find similar free content on dozens of sites, so we typically don’t remain loyal where content feels commoditized. By and large, this means content providers need advertising dollars to derive revenue from any eyeballs that visit their site, however fleetingly.
What this means for consumers is we want the best of both worlds: We don’t want to be tracked or have our data be sold to brokers. But we’re also not willing to pay for content, so we unwittingly keep a broken advertising model afloat that erodes consumer privacy while profiting a diminishing number of Internet services that don’t want things to change.
But that’s the way things are, you say. Who cares?
You do. You just don’t realize that technologies like augmented reality and facial recognition mean people can tag your image and sell it, just as data brokers sell your data right now. But the visual economy doesn’t have terms and conditions for you to sign. Whether or not you care about privacy, if the broken Internet model goes virtual, people will make money off your image and identity without your even knowing. So the next time you’re thinking of leaving a content provider’s site because they asked you to contribute money so they don’t have to be reliant on advertising dollars, remember: In the virtual world you’re not just the product.
You’re the content.
The Personal Data Economy
If your personal data is the same as money, it deeply affects the economics of your life. When you broadcast your data, whether it’s personal via quantified self or public via the Internet of Things, start picturing yourself walking around with dollar bills hanging out of your pockets. Then picture someone taking those dollars from your pockets while saying, “Can you wear looser pants tomorrow to make it easier for me to fleece you?”
Seriously, picture this image and tell me you’re still complacent. As a parent, picture someone doing that to your kids while they’re also exposing their image and private information for anyone to see or stalk. Are you angry yet? Now picture a future where this practice will accelerate, where your currency gets traded without your involvement. Now join me in opening a window to let everyone hear you as you scream to the world, “I’m mad as hell, and I’m not going to take this anymore!”
People controlling an economy control power. The term “Big Data” could just as well be called “big money.” It doesn’t matter if your data isn’t worth as much as someone else’s; it’s worth something. When broadcast, any digital information relating to your image, words, or actions becomes part of the personal data economy. Broadcasting, in this sense, is an economic act, one where personal data becomes an issue of property versus privacy. As Corien Prins notes in an article in SCRIPTed:
In looking at privacy as a problem of social cost, commentators have argued that the prospects for effective personal data protection may be enhanced by recognizing a property right of such data. They feel that the present conception of privacy is an ineffectual paradigm and that, if we want strong privacy protection, we must replace it with the more powerful instrument of a property right.
3
Data is property. Intellectual property, or IP, is such a big deal for companies because it implies ownership. Your likeness, actions, and history belong to you—at least until you give them away.
You’ve got an intimate and personal stake in the Big Data revolution: Your Little Data is part of it.
The Basics of Big Data
Among data scientists and tech geeks, “Big Data” is largely seen as a marketing term. It’s too general a concept to be tied to any one industry, referring as it does to massive data sets drawn from multiple sources. The term evolved as chips and hardware became smaller and cheaper in the past few years, allowing for an explosion of data that previously was too granular to collect. To give a sense of how vast the world of Big Data has become, the International Data Corporation’s digital universe study from December 2012 says that the digital world will reach forty zettabytes by 2020, an amount equal to fifty-seven times all the grains of sand on all the beaches on earth.
4
You begin to see why using the phrase “Big Data” is akin to saying “the Internet.” It’s too vast and general to have meaning in and of itself. As Rufus Pollock, founder and codirector of the Open Knowledge Foundation, pointed out in his article for the Guardian, the size of data isn’t the priority. What’s important is having the data and the best context to derive insights from it.
5
In terms of characterizing Big Data, many experts refer to the “three Vs” of Big Data—volume, variety, and velocity: