Spread: Power Dynamics across Digital Technology
By Yanely Lopez
Illustration by Taylor Brunnschweiler
Classical economists might suggest that the market is a place where a wide variety of actors come together on a level playing field. In contrast, the modern digital economy presents an economic reality that is anything but even: the algorithmic systems and databases that matter most are controlled by a single platform, and often by just one individual (O’Reilly et al., 2024). As the dangers of data capitalism, in which our personal information is collected and sold for profit (Marciano et al., 2020), continue to grow, this raises the question: how can we spread power more evenly across digital technologies? Does the answer perhaps lie in technology itself?
If power is the ability to control and influence the behaviour of others, then power online consists of the management of digital platforms, the capacity to shape how users connect, and the ability to determine what users see and for how long. This lucrative collection of data and the subsequent curation of algorithmic systems are common practice across most digital platforms, with TikTok serving as a clear example.
The popular social media platform collects extensive personal data from its users, including IP addresses, scrolling behaviours, app preferences, and complete purchase histories (Jawed, 2024). Access to and management of this sensitive information are concentrated within a small, centralised group of people. Although TikTok’s operational network spans various countries and organisational tiers (He, 2024), a large part of its decision-making remains in the hands of the ByteDance team in China. This includes product development and key strategic decisions, but, more importantly, access to behavioural data and user information requires approval from the team in China (Rodriguez, 2021).
This raises significant concerns, as the sensitive data of users from around the world is not only concentrated within a select group of people but is also held within a country with an undemocratic system. In 2021, the Chinese government acquired a “golden share” in TikTok’s parent company, ByteDance, a stake that grants a veto over certain firm activities, including the content provided to the public (He, 2023). This again points to the nature of power online: it is concentrated within a single company, a single country, and, in this case, a single political party.
This unequal distribution of power presents both explicit and subtle dangers to society. Most alarmingly, it erodes democratic systems, because social platforms are managed with little transparency and limited civilian involvement. Most people report having very little understanding of how companies use their data, and 71% rate this a major concern (McClain et al., 2023).
In modern democracies, it is vital for citizens to be well informed, not only about positive developments but also about wrongdoing, so that those responsible can be held to account. For example, Facebook faced various charges and fines following the Cambridge Analytica scandal in 2018 (Cadwalladr & Graham-Harrison, 2018). However, none of the executives directly involved in the improper handling of users’ data, and its subsequent use in various political campaigns, were held personally responsible.
Assigning responsibility is no easy feat, though, as it is far easier to hold individuals accountable than systems (Aveling et al., 2016). The protective shield around Big Tech, and the deliberately opaque systems it runs, allows injustices that put everyday people in danger. Social media platforms were intended to facilitate human connection, but the harder it becomes to allocate responsibility, the easier it becomes to put monetary gain above human experience.
Cambridge Analytica used access to the content users liked, and to their networks of connections, to create targeted campaign ads for its various political clients (Hern, 2018). Normalising the exploitation of emotions such as fear and anger contradicts the original purpose of social media platforms and constitutes a significant intrusion on individuals’ autonomy.
Given the imminent dangers that accompany such a concentrated distribution of power over digital technology, it is imperative to find a solution. With privacy violations, misinformation, and environmental harms accompanying technological innovation, a certain wariness of its development is understandable. In fact, there is a rising trend of going “offline”, of opting out of social media participation completely (Kaspersky, 2019). However, whether one believes in technological inevitability or not, finding ways to use innovation to move humanity forward has always been important. There have been several attempts to democratise digital platforms, that is, to make them more widely accessible and to restore agency to users.
Many of these solutions involve co-voting systems or user councils that give users direct oversight and input alongside developers (Gersbach, 2020). Such measures do not, however, change the underlying nature of the technology itself, which also needs a solid democratic foundation.
Instead, measures such as the development of open-source technology, a “data-as-commons” approach, and the implementation of self-sovereign identity systems (Fischli & Muldoon, 2024) must become common practice. Each of these would promote transparency in how data is stored, let citizens decide how that data is used, and, most importantly, give individuals greater control over what information they share. Ultimately, the challenge lies not in rejecting digital technologies, but in shaping them so that their advancement supports public welfare rather than the narrow interests of a select few.
References:
Aveling, E., Parker, M., & Dixon‐Woods, M. (2016). What is the role of individual accountability in patient safety? A multi‐site ethnographic study. Sociology of Health & Illness, 38(2), 216–232. https://doi.org/10.1111/1467-9566.12370
Cadwalladr, C., & Graham-Harrison, E. (2018, March 18). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
Fischli, R., & Muldoon, J. (2024). Empowering digital democracy. Perspectives on Politics, 22(3), 1–17. https://doi.org/10.1017/s1537592724000409
Gersbach, H. (2020). Democratizing Tech Giants! A roadmap. Economics of Governance, 21(4), 351–361. https://doi.org/10.1007/s10101-020-00244-5
He, L. (2023, January 27). China still wants to control Big Tech. It’s just pulling different strings. CNN Business. https://edition.cnn.com/2023/01/27/tech/china-golden-shares-tech-regulatory-control-intl-hnk/index.html
He, L. (2024, March 18). Analysis: Wait, is TikTok really Chinese? CNN Business. https://edition.cnn.com/2024/03/18/tech/tiktok-bytedance-china-ownership-intl-hnk
Hern, A. (2018, May 7). Cambridge Analytica: how did it turn clicks into votes? The Guardian. https://www.theguardian.com/news/2018/may/06/cambridge-analytica-how-turn-clicks-into-votes-christopher-wylie
Jawed, A. (2024, August 12). What data does TikTok collect and why it matters. Hollyland. https://www.hollyland.com/blog/tips/what-data-does-tiktok-collect
Kaspersky. (2019, June 18). 38% would give up social media to guarantee lifetime data privacy, Kaspersky’s study says. https://www.kaspersky.com/about/press-releases/give-up-social-media-to-guarantee-lifetime-data-privacy
Marciano, A., Nicita, A., & Ramello, G. B. (2020). Big data and big techs: understanding the value of information in platform capitalism. European Journal of Law and Economics, 50(3), 345–358. https://doi.org/10.1007/s10657-020-09675-1
McClain, C., Faverio, M., Anderson, M., & Park, E. (2023, October 18). How Americans view data privacy. Pew Research Center. https://www.pewresearch.org/internet/2023/10/18/how-americans-view-data-privacy/
O’Reilly, T., Strauss, I., & Mazzucato, M. (2024). Algorithmic attention rents: A theory of digital platform market power. Data & Policy, 6. https://doi.org/10.1017/dap.2024.1
Rodriguez, S. (2021, June 25). TikTok insiders say social media company is tightly controlled by Chinese parent ByteDance. CNBC. https://www.cnbc.com/2021/06/25/tiktok-insiders-say-chinese-parent-bytedance-in-control.html