In March of this year, Mark Zuckerberg announced that “[o]ver the last 15 years, Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square. But people increasingly also want to connect privately in the digital equivalent of the living room.” While this is of course a strategic move to associate his dying platform with the “privacy” of a living room after the Cambridge Analytica scandal, Zuckerberg also touches on a shift that has occurred since Facebook’s public launch in 2006. Facebook and other social media platforms preach global connectivity, while most user experience today boils down to the mundane chatter of everyday, “wat should we make 4 dinner?”, life.
Over the last decade, digital platforms have seeped further into the domestic sphere and closer to the body, through devices we can no longer imagine our lives without. The Amazon Echo, the Fitbit bracelet, the Apple Watch, the iPad, and every shade of smartphone keep us constantly connected—not just to each other, but to profit-driven platforms that have aims of their own. Whatever liberation was found in the disembodied “user” of the 1990s, when personal computers first entered the domestic sphere, has now become, through this closer proximity, an entry point for a more deeply integrated daily consumption of advertisements and goods (targeted at YOU).
If we compare just the visual language of contemporary Facebook with Myspace circa 2005, there is an obvious shift in control, from user to platform. Maybe you remember Myspace in the early days—there were a few formatting constraints, but the rest was up to the user’s whim. You could have a background of glitter GIF animations with an overlay of multicolored horses, or text dripping 8-bit blood over an array of spider webs, or SpongeBob on top of SpongeBob with the theme song on autoplay. Myspace was one of the first online platforms I remember interacting with; easily implemented code snippets gave at least the illusion of hacking the façade. Something could be made there that wasn’t there before, and it was up to you to decide what version of yourself to put forth.
Beyond the surface of Facebook today, content provided by users is formatted and censored, messages are read, photos are scanned for the faces they contain, your interests are cataloged, your reactions monitored. Aside from what is taken by the platform, what the platform “provides” is another consideration. We are given a range of six reactions to express our emotional response to a post, one “current city” we live in, one profile image to represent ourselves, a “workplace” which asks for the name of a “company” to explain how we earn a living. This simplification of the complexity of a life into quantifiable data points is harmful, especially if it is also the way we start to see ourselves. In the early 90s, psychologist Mihaly Csikszentmihalyi wrote that “an object with a specific form and function inevitably suggests the next incarnation of that object.” I would take this a step further and argue that the marketing category of an object inevitably drives the next incarnation of that object. A chair no longer considers the best bodily posture for sitting, or the mood of the sitter, or any actions performed while sitting, as much as it reiterates the visual style of chairs that have been successfully marketed and sold previously. Digital platforms follow the same path, further abstracting our needs to create and market consumable products, each platform acting to perpetuate the same system over and over again. As this abstraction continues, we distance ourselves more and more from the natural processes of our bodies, our immediate environment, our social interactions, and our psychology, to focus on the image of how our lives “should” look on a surface level. With digital platforms, this surface is built, made public, and viewed through a device which we have altered our lives to accommodate, not the other way around.
Many of us, myself included, feel that we can’t just leave or “opt out” of such platforms, as they hold too much social weight, contain too much valuable information, and provide a sense of proximity to those who live far away. The “digital detox” is of course an option, although this is ultimately a trend that gets people to buy vacations in remote locations, where they sit for a week and “experience nature” before returning to the same habits they had before. What, then, can be done to break these constructs and shift power back to the user? This guide offers a few tips on where to begin:
Digital platforms are by definition businesses, regardless of the way they operate. The “success” of a platform relies on the number of users who perform the tasks anticipated by the business that runs it. Those tasks might not reveal their value directly to users, and we are often quick to ignore them (the truth can be scary). WhatsApp, like Facebook, generates capital by selling user data; Yelp generates capital by getting businesses to pay a subscription to boost their ratings; Pinterest generates capital by discreetly integrating targeted ads into your feed. Oftentimes, platforms have multiple streams of income—YouTube’s main source of income is advertising, while it also sells a “premium” subscription that lets users view content without ads. A win-win formula. Additionally, digital platforms that might have nothing in common in terms of function capitalize in very similar ways. Hulu and Tinder, for example, are both based on a subscription model, although with one you watch Game of Thrones and with the other you look for your next bad date. Through selling advertising space, selling users’ data, selling subscriptions, or taking a cut of monetary transactions, digital platforms have a variety of ways of monetizing their services. Informing yourself on how a platform earns its money is a great place to start in understanding the real intentions behind the design of that platform, and how you’re expected to use it. With this in mind, you can more directly question your participation in that platform, make demands for changes in policy, and find ways to subvert its functionality if you wish to use it more than it uses you.
I often think FOMO, rather than actual need, is what drives most usage of digital platforms. Certain platforms have become synonymous with their intended usage, while they don’t necessarily do it better than pre-internet methods (e.g. Amazon vs. shopping locally, Tinder vs. meeting IRL, Uber vs. taxis). The scale of the audience is one thing digital platforms have going for them, but while reaching a large number of people has its obvious benefits, we don’t have to play out the scenario a platform wants us to play. Instead of engaging with platforms by default, like aimlessly checking Facebook every day, we could try a more targeted approach (we all know Facebook does). We could set an intention to do something, and not just stick to the conventional methods of getting it done.
Tinder is normally used to rapidly see who you don’t want to date, to find one-night stands, and occasionally to find a relationship. It is probably not normally used to sell a piano, find an apartment, create an alter ego, learn about pandas, practice a foreign language, or reach voters to support your political campaign, although none of those things would be out of the question. The platform allows you to upload nine photos and provides a free text area of 500 characters and a messaging function. That is all the functionality you need to potentially do any of those things. To top it all off, since none of those things are expected on Tinder, they might actually work better there than on platforms made specifically for them. By opening these small fissures in the predictability of your actions online, you mess with the data, remain conscious of the trail you’re leaving, and make it more difficult for platforms to influence your decisions.
“Blackboxing” is a term coined by Bruno Latour, meaning “the way scientific and technical work is made invisible by its own success”. Much of our world today is blackboxed: our sewage systems, the selling of our data, the production of our clothing, and the harvesting of our coffee are all processes we generally know nothing about but interact with, directly or indirectly, every day without question. The invisibility of these systems keeps those who profit from them in control. But now we have this great tool called the internet, where we can look those things up to better understand them, and then consequently receive a stream of ads for composting toilets, data protection software, organic cotton t-shirts, and fair trade coffee.
YouTube makes its money mainly through advertising. Since YouTube is owned by Google, it tracks its users’ activities across a number of websites, as well as in Google Chrome, to build a user profile, which it sells to advertisers to match specific ads with specific users. These ads are for products and services related to the videos you have been watching and the things you have searched for recently. You make it much easier for them to advertise products to you if you watch unboxing videos, but there is one genre of video that might come close to subverting this model—the DIY tutorial. “How-to” is one of the top four categories of videos on YouTube, with searches increasing 70% per year, ranging from how to fix your glasses to how to build your own house. These of course serve as advertisements in themselves for the materials and tools you need to make whatever it is yourself, but they also potentially push for a reconsideration of the value of something by revealing the process through which it is made (e.g. you realize it’s very simple to fix your immersion blender that stopped working once you break open the solid plastic casing designed to prevent you from doing so; or, after seeing how long it takes to knit a balaclava, you don’t mind spending some extra money on a handmade one). The more we know about the way things are made, how they can be fixed, and how they can be hacked, the more we can live outside of a capitalist model that privileges profit above all else.
So now you know how they make their money, you have an idea of how to subvert their functionality, and you are ready to “unblackbox” your shit. But let’s return to the ideal of most of these platforms—connecting people. What happens online should enrich and extend your “real” life, and not vice versa. One consequence of living on a planet of 7.5 billion people, with more than half of them online and 3 billion active on social media, is that we often don’t stray from our comfort zones to really learn about the lives of others or apply what we’ve learned to our own lives.
I currently host an Airbnb Experience, a workshop called “Make Your Own Bergh*** Clubbing Outfit”. In this workshop, guests are provided with material, patterns, and instructions to make an item of clothing to be worn clubbing. Although simple to sign up for, the class is “authentically” strenuous, requiring guests’ full attention and hard labor for roughly five hours. Although Airbnb takes a cut of my profits (about 15%), the platform enables me to meet a demographic of people who don’t have much experience making things with their hands, and who probably normally buy new clothing made by children in Bangladesh. Because of this, I’m willing to overlook the 15% received by Airbnb if I can convert two people per class into anti-consumerist makers, or at least encourage them to consider their clothing purchases through the physical memory of what it feels like to sew a garment.
A Word for the Weary
None of these techniques aims to take down a digital platform entirely, but they might, at the very least, help us consider what we really want from these platforms and how to keep them from “using” us. We live within so many systems that we take for granted and prefer not to learn about, because complexity can be overwhelming, but leaving stones unturned doesn’t make them go away. Despite what the tech utopians say, we can’t “fix” all of our problems with technology, but we can learn about our problems and experiment with ways of engaging with them through technology. The advantage we have as humans is that we have intricate relationships, nuanced emotions, unpredictable habits, and irrational desires, which are ultimately unquantifiable and therefore unsellable. All we need to do is embrace the complexity of being human and remind ourselves that we existed without these platforms less than 20 years ago, and that we will continue to exist without them after their demise.
About Anna: Anna Reutinger is a sculptor originally from California, living and working in Berlin. Reutinger’s work subverts the position of the digital-native-global-citizen by soothing digitally induced anxieties and providing coping mechanisms to apocalyptic inevitabilities through a return to craft and empathy. She received her M.F.A. in Dirty Art from the Sandberg Instituut in Amsterdam, and her B.A. in Design Media Arts and Digital Humanities from UCLA (University of California, Los Angeles). Her work has been exhibited at the Saint Etienne Biennale, FR; Jan van Eyck Academie, W139, Ram Foundation, De Fabriek, NL; Macao, Milan, IT; The Hammer Museum, The Getty Center, The New Wight Gallery, Chin’s Push and Control Room, Los Angeles, US.
✨This is the fifth in a series of seven commissioned essays for 2019. With these original essays, our aim is to publish work that engages with digital visual culture, both in its niche manifestations and within the technological, political, and mainstream reality of the internet.