How I Learned to Stop Worrying and Love Big Data

by beck haberstroh

A popular meme argues that “if we want the rewards of being loved we have to submit to the mortifying ordeal of being known.” These words were written by Tim Kreider in a New York Times essay where he reflected on the experience of being accidentally cc’d on an email from one friend to some other friends about his goats. Not everyone likes Tim’s goats! 

Kreider goes on to imagine that “the single most devastating cyberattack a diabolical and anarchic mind could design” would be simply to “make every email and text ever sent universally public.” If we could all see, as he did, what our friends and colleagues really say about us, “the fabric of society would instantly evaporate.” Civilization depends on our ability to gossip and whisper. Privacy is essential for social order. And yet, as he processes his discomfort, Kreider comes to the conclusion that building trust with others means acknowledging that they observe you in your strengths and flaws. We must reveal our imperfections if we want to form authentic, loving relationships. 

Today, in the United States, we are experiencing the flipside of Kreider’s stated fear: our digital life is being shared, not publicly, but with a complex web of private actors. Kreider’s essay was published in 2013, the same year that Edward Snowden leaked classified information from the National Security Agency (NSA) that revealed extensive government surveillance of American civilians. More recently, we have been coming to terms with an audience that includes not just our government, but also private companies. In the TikTok hearings this month, members of Congress questioned the app’s chief executive about its Chinese owner, ByteDance, as well as its handling of U.S. user data. As Dell Cameron writes in Wired, these hearings did not reveal so much about TikTok as they did about “how desperately the United States needs national data-privacy protection.” These issues are not at all exclusive to TikTok, but extend to many, mostly American-owned companies. “Surveillance capitalism,” a term coined by Shoshana Zuboff in 2014, is one way to describe this commodification of our personal data. 

We are increasingly confronted with the various forces to which we must ‘submit’ in order to gain access to goods and services. Phones, apps, and websites might ask permission to collect your data. Stores might inform you that they are recording your image and purchases. Social media companies, websites, phone and computer producers, internet providers, and indeed insurance agencies, healthcare corporations, government offices, and others make the argument that we can only be properly cared for if we give them all of our information. Even our mental health data, as a 2023 study by Joanne Kim at Duke University revealed, is not safe from being sold by data brokers. By submitting to the mortifying ordeal of being known to a company, we are told, we will reap benefits. Without a doubt, many of us do.

The “rewards of being loved” are, of course, not distributed equitably. Imbalances of power are amplified by big data. The public relations ideal of this transaction promises ads for shockingly perfect dream products, as well as hilarious videos, insightful articles, and lucrative networking. Instead, the mass collection and exchange of our information has produced a reality that affects the cost and accessibility of insurance, housing, and healthcare, and empowers law enforcement and the carceral state. Algorithmic bias reaches deep into the folds of each of our daily lives, some examples of which are detailed in Safiya Noble’s 2018 book, Algorithms of Oppression, and Cathy O’Neil’s 2016 book, Weapons of Math Destruction.

This ordeal is “mortifying,” in part, because we know that we are being watched and because it can feel like there is nothing, really, to do about it. The demands of our work, living conditions, or social relationships might make it difficult or impossible to negotiate the terms of our participation. Of course, many people can and must evade watchful eyes and there will always be those who, for reasons that include lack of resources, threat of violence, or personal politics and values, live their lives offline to varying degrees. The margins offer many alternative models—but most of us have chosen or are forced, consciously or not, to submit.

Sometimes I think about my actions across different devices and platforms as a kind of performance. Surveillance capitalism relies upon this performance being predictable. By behaving with consistency, I ensure that the models that are built about me are accurate and profitable. The routines of my daily life are translated into scripts. 

I’d like to rewrite some of the show.

Within performance art, a “score” is a short set of instructions for a performer to follow, taking its name from the idea of a musical score. Scores are a way to explore everyday objects and situations. My collaborator Katie Giritlian and I often use scores as a part of our process, and our first ‘collaboration’ was hosting a night with friends exploring these kinds of rapid, impromptu exercises. 

Scores can interrupt what is otherwise routine. While these interventions may be absurd, they may also give us insight into how our activities are transformed into data. Scores give us a way of understanding how a piece of a structure operates, or how we operate within it. Or, at the least, they can be fun. 


These scores target the ways that our actions in digital spaces are used to build models of our behavior and are sold to data brokers. They are written to work across a wide variety of apps and devices—phones, laptops, search engines, social media platforms, messaging apps, and web browsing. The efficacy of these scores will likely be affected by what kinds of privacy restrictions you have set up on different devices, websites, and applications. Before you begin working with these scores, you may want to research and review, to the extent that you can, what data is being collected about you and how different applications have categorized you already.

Some of these scores can be ‘performed’ within moments, some require a deeper investment, and some act more as poetic provocations. As you enact these micro performances, observe what happens to your feed, the kinds of ads and content that you are served, and the ways that your data categorizations on different apps change. Remember that what you can see is only a small part of what is collected and sold about you.

A limitation of the approach offered in this syllabus is that it doesn’t take into account the different policies and protocols of each of these digital environments. Different companies reveal different elements of their data-collection methods and give users, each of us, different degrees of choice and insight. Some companies and governments work together and some do not. All of this is constantly changing, and most of it remains a series of black boxes. Part of the problem is that the granular specifics of this ‘system’ are impossible to neatly summarize, even for an expert (which I am not). 


Make a Collective Data Body

  • Decide on an account, or several, that you and people who you trust will share in common. Together, establish community agreements about how you will use the application or device.
  • Consider whether it is important that you share with others that this personal account is being used by multiple people (for instance, social networking sites) and if so, how you will communicate to others that it is a shared account.
  • Make exquisite corpse drawings of your collective data body and use this as your profile image.
  • Most personal accounts are designed for use by a single person. What happens if we use personal accounts as collectives? How could this scramble the effectiveness of the data that is harvested from this account? How does it impact your personal enjoyment or use of the account?
  • Examples of collective data bodies include my parents’ joint email and Google accounts, certain media streaming accounts on my apartment’s shared TV, Kyle McDonald’s 2013 Going Public performance, and a 2020-21 collaboration that I did with Max Fowler called “Score for a Queerfake (1: How to Make a Collective Data Body)” where we used a shared Facebook account in order to intervene in the platform’s data collection mechanism and profile of me.
A screenshot of the Facebook settings section called "Your Topics," which includes a list of topics Facebook thinks the user likes and wants to see.

Live your best life

  • Many digital devices, apps, and services promise to help us become better, more optimized and efficient versions of ourselves, while at the same time building platforms that facilitate doom scrolling, hate-clicking, and working 24/7.
  • Within the constraints of your current life, are there other ways that you wished you used technology? Is it possible to use digital spaces to truly fulfill or nourish you?
  • For one hour, use your devices and accounts as if you are the dream version of yourself. Try to only search for, look at, read, watch, or surf through things that genuinely serve your ideal you.
Screenshot of a search engine bar with the words "samuel r delaney reading to" typed in, and the suggestions " you" and " baby" appearing below.

Live your alter ego’s life

  • Pick an alter ego. This ‘alter ego’ could be modeled on an amplified slice of yourself, or a public figure that interests you.
  • For one hour, use your devices and accounts as if you are this alter ego. 
  • What is it like to try to perform as someone else for a data broker?
A screenshot of a Google feed of "Related questions," all related to living on the moon.

DIY Home Decoration

  • Turn the advertising space on websites and apps into decorating space.
  • What are things that you like to look at, but will never buy? Search for these things until they appear in all the ad spaces on websites that you visit.
Screenshot of the NYTimes homepage with a photo of rocks in the ad space at the top.

Find the sky

  • Make your social media or search feed entirely blue.
A fuzzy screenshot of a grid of blue images.

Visit an echo chamber

  • *use this score with caution*
  • Choose an app or website such as YouTube, TikTok, or Instagram. Spend one hour searching this space as if you hold political views that are different from your own.
  • Make a sound.
Screenshot of a graphic that says "Can you spot the bait?" and has a warning about misinformation, with a cartoon of an alarmed person staring at their computer below.

Write a one-person play

  • Draft the opening line for a new play. Type this line into your search engine.
  • Scroll through the first page of results, without clicking on any links. Find a word, phrase, or sentence that stands out. This will be the second line of the play. Type this line into your search engine.  
  • Scroll through the first page of results, without clicking any links. Find a word, phrase or sentence that stands out. This will be the third line of the play. Type this line into your search engine.
  • Continue until you’ve finished your play. As you ‘write’, consider whether your one-person play will require any descriptions of scenes, actions, or tone. 
  • Look at your “search history” to view your completed work. Read it out loud.
Screenshot of an example of a completed search history play, written in the way beck describes.
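For the procedurally minded, the loop in this score can be sketched in code. This is a toy simulation, not a real search: the `fake_search` function below is a stand-in that returns canned result snippets, and the “standout” word is chosen mechanically (the longest word not yet used) where the score asks you to choose by feel. Both are my assumptions for the sake of a runnable sketch.

```python
import re

def fake_search(query):
    # Stand-in for a real search engine: returns a canned snippet.
    # In the actual score, you read a real results page yourself.
    corpus = {
        0: "the curtain rises on an empty stage",
        1: "empty stages and how to fill them",
        2: "filling silence with a single voice",
    }
    return corpus[len(query) % len(corpus)]

def write_play(opening_line, lines=4):
    """Iterate the score: each search result seeds the next line."""
    play = [opening_line]
    seen = set(re.findall(r"\w+", opening_line.lower()))
    for _ in range(lines - 1):
        snippet = fake_search(play[-1])
        # "Standout" word: here, simply the longest word not yet used.
        candidates = [w for w in re.findall(r"\w+", snippet.lower())
                      if w not in seen]
        if not candidates:
            break
        next_line = max(candidates, key=len)
        seen.add(next_line)
        play.append(next_line)
    return play

# Example run with a famous opening line:
# write_play("to be or not to be", lines=3)
```

Swapping `fake_search` for your own searching (and your own judgment for the longest-word rule) turns the simulation back into the performance; your search history is the script.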

Play Clue

  • Sit down with someone that you are in regular communication with. Choose a platform or service that you both use (for instance, maybe you are on the same social media, or you read the same news website).
  • In the game Clue, players navigate a board in order to solve a murder mystery by identifying the person who committed the crime, the location in which it took place, and the weapon used.
  • Scroll through your feeds until you find advertisements or content that relate to the same things. Find a common person, object, and place in order to solve the mystery and win (perhaps what you are each shown is not identical, but references the same person, place, or item).
A photo of a score card from the game Clue, against a blue background.

Question Diversity Day

  • In an early, (in)famous episode of the popular American television show The Office, the boss character leads his staff through a horrific diversity training exercise. He asks his workers to treat each other like the identities that are written on index cards affixed to their foreheads. 
  • How do your devices, social media apps, advertisers, or others treat you like the workers in The Office’s Diversity Day? Are there goods, services, or content that you’ve been shown that relates to stereotypes about identities that you hold? Conversely, are there goods, services, or content that you’ve been shown that affirm aspects of your identity in ways that feel positive?
  • How do you communicate about your identities to the platforms, websites, and devices that you use? How do these companies communicate to you about your identities? Who do they think you are? Are they right?
  • Decide on an aspect of your identity that you want to share with the machine. Declare it.
  • Decide on an aspect of your identity that you want to hide from the machine. Disguise it.
Screenshot of Google's 'visual matches' image search function, with an image of a person's hair.

Try slapstick

  • Find a lightweight prop, ideally one that makes a fun sound. Hit the keyboard of your device repeatedly with your prop. Be sure to routinely press enter.
Screenshot of Google search results from a nonsense input.

Improvise a dance

  • What parts of your body, movements, and actions do you typically use to navigate your digital devices? Navigate your browser, apps, or device with a body part, or from a physical position, that you do not usually engage. How would Charlie Chaplin use a smartphone? 
  • Perform at least three of these movements without the presence of the device.
  • Many of the devices that we use create and reinforce ideas about how our bodies should look and what our bodies should be capable of. Liat Berdugo explored these questions in a 2015-16 performance titled Unpatentable Multitouch Aerobics.
Diagrams of a laptop and a music player with various parts labeled with numbers.

Travel time

  • A 2012 New York Times Magazine article by Charles Duhigg explained the way that companies work with statisticians to analyze their customers’ shopping habits and anticipate profitable life events. This is called “predictive analytics.” The article went viral because it relayed an incident where a teen girl’s father found out that she was pregnant through targeted mail advertisements. Mailers—how quaint!
  • Choose a moment from your life that you enjoy remembering, or pick a future moment that you savor anticipating. What if you were graduating high school and moving to a new city, or planning a big party for your retirement from your dream job? Use your apps and devices as if that moment is now. 
  • Better yet, for one week, travel backwards and forwards between these past and future moments. If you want to up the ante even more, consider what those life events would be like in an entirely different time period.
Screenshot of Google's shopping results related to a retirement party.

Connect-the-dots on a field trip

  • Turn on the ‘location services’ in your devices (phones, laptops, tablets, smart watches, etc). Take them on a field trip to an unfamiliar location, preferably using a mode of transportation that is unusual for you. 
  • Create a connect-the-dots drawing using your journey with these devices. 
  • If your housemates enthusiastically consent, you can even make it a family vacation by bringing along their devices as well.
A Google Maps settings screenshot that says "See trips you've taken"

Create your very own temporary utopia

  • For one hour, use your devices and apps as if you exist in the utopia of your wildest dreams. Consider, in this fantasy world, what you might search for, purchase, or communicate to friends and family. Consider whether any of these devices, apps, or platforms exist, and if they do, what their purpose is. 
  • In the coming hours and days, is there anything that you notice change in terms of the kinds of material shown to you?
  • You can also adapt this score to explore the worst dystopia you may fear.
A YouTube screenshot of videos about growing green beans.

Write your own! 

  • Consider an aspect of your personal data that you would like to explore.
  • Write a short score. Include action words. Give it a title.
  • Email it to me, or, if you have an account, add it to this shared document for all to enjoy.


  • How did it feel to (mis)use these digital spaces? 
  • What, if anything, did you observe as a response to your scored performance? Did you notice any changes in the kind of ads you were served, content you were shown, or other functioning of these spaces?
  • What did you learn about your existing habits within these different apps, platforms, and devices? What did it tell you about what kind of ‘user’ you are? In what ways is yourself as a ‘user’ an extension of yourself, and in what ways is it not?
  • What did you learn about your values and beliefs? 
  • Did any emotions arise through completing any of these scores? Curiosity, vulnerability, fear, anger, joy, shame?
  • What questions do you have now?

A highly abbreviated, partial list of artists working with scores and issues of data privacy who inspire me and this syllabus especially:


Thank you to Julia and Gillian of the Syllabus Project, as well as Max, Katie, and my dad, for thoughtfully reading and offering feedback on earlier drafts of this text.

beck’s syllabus references Katie Giritlian and Lai Yi Ohlsen, both of whom are Syllabus contributors.