Relevance Filter Application

First off, here is the link to my tool for scoring the relevance of documents/data: https://relevance-filter.digitaldrivenworld.com/

As a non-technical researcher, I have run into some major difficulties when analysing social media data. Most of my research examines a particular social issue, and most of the scraped data turn out to be irrelevant to my research questions. For example, I worked for Social@risk to investigate Vietnamese textile workers’ grievances and collective actions by analysing a large corpus of posts and comments on Facebook. The most time-consuming task was probably finding the relevant data: we had to scrape all the posts of many Facebook groups and pages, and most of the posts and comments were irrelevant to our research problems (too many ads). I had the same difficulties when analysing Singaporean citizens’ sentiments and opinions about the country’s Smart Nation initiative. We scraped millions of posts and comments from politicians’ and organizations’ Facebook pages and had to filter out everything irrelevant to the Smart Nation initiative. It was always like looking for a needle in a haystack, and we were always afraid of accidentally removing important data while filtering out the irrelevant.

That’s why I developed an application and proposed a method for scoring the relevance of textual data. The application is not perfect yet, but I tried it out on some of my data corpora and got relatively positive results. At the very least it saves time compared to searching for keywords in Excel, and I believe it can do more than that. I have attached a document explaining my approach, the rationale behind it, and what researchers should take into account when using the tool. I hope to find time to modify and improve the tool based on my professor’s suggestions.
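The attached document describes the actual method behind the tool; purely to illustrate the general idea of scoring relevance instead of relying on binary keyword matching, here is a minimal sketch using TF-IDF and cosine similarity against a seed query. The example documents, the seed query, and the threshold are invented for illustration and are not the tool’s built-in settings.

```python
# Minimal sketch of relevance scoring (illustrative only; not the tool's exact method).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical scraped posts/comments
documents = [
    "Workers at the factory discussed unpaid overtime and wage delays.",
    "Huge discount this weekend only, visit our store!",
    "The strike lasted three days before management agreed to negotiate.",
]

# Seed query describing the research topic (an assumption for this example)
seed_query = "textile workers grievances strike wages overtime collective action"

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(documents + [seed_query])

# Cosine similarity between each document and the seed query
scores = cosine_similarity(matrix[:-1], matrix[-1]).ravel()

threshold = 0.1  # illustrative cut-off; in practice it should be tuned per corpus
for doc, score in zip(documents, scores):
    label = "relevant" if score >= threshold else "irrelevant"
    print(f"{score:.3f}  {label}  {doc[:60]}")
```

A real corpus would of course need language-specific preprocessing (for example Vietnamese tokenization) and a threshold tuned against a manually labelled sample.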

I’m very grateful to my professor, Dr. Bernhard Rieder, who gave tutorials on Python (and much beyond that) and practical advice for my very first programming project. I’m not a fast learner in programming, but his patience and dedication helped me get through. My fellow classmates in the tutorials also gave lots of suggestions for my project. And last but not least, I wouldn’t have completed this project without my partner’s help. You see, I only designed the tool and coded the back-end. He helped me build the database, set up the server, develop the front-end, and fix my bugs :). I can’t be grateful enough for all the support I had.

I hope the tool will be useful to someone. Let me know if you have any suggestions or run into any difficulties; they are highly appreciated.

When would you write an abstract?

When I searched “write abstract before or after paper” on Google, the first result was “Write your paper first, save abstract for last”, and the second was “Writing an abstract months before the paper”. I used to write the abstract after finishing the paper; however, after writing a few abstracts for conference and paper proposals, I have realized that the abstract should be neither the first step nor the last in writing a paper.

Writing an abstract in the early stages of a research project and submitting it as a conference or paper proposal pre-defines the directions the research will take (including methodology, hypotheses, and theories), which can become problematic later. I think an abstract should be written after we have collected and analyzed a significant part of the data. When I already have some key findings and have confirmed that the methodology works and the data can answer my research questions, I can write the abstract and commit to writing the paper.

Why not save it for last? As I said, I used to write the abstract after everything else was done, and I know it’s a common approach. However, it was the wrong approach for me, because I often found myself getting lost and confused in the large amount of data and theories that could potentially help answer my questions. As a result, some parts of my papers were not very well connected or consistent. On the other hand, writing an abstract right after the reading and analysis were done helped me define the focus of my research and my main arguments. This is just something I learned as a novice researcher. It could be wrong or could be improved, so I welcome all suggestions.

Rethinking Informational Privacy in The Datafication Age (Part 2)

In the first part, I introduced influential concepts of informational privacy and its dimensions. This post discusses how personal information is processed and used by algorithmic personalization systems and how such systems should operate to ensure personal privacy – that is, to permit and protect a person’s autonomous life.

No raw data – no raw personal information

Professor of Information Sciences Philip E. Agre (1994) identified the grammars of action that are central to data collection, which he termed “the capture model of privacy”. Capture is not a new state of surveillance but a process of grammatizing human activities, which involves identifying fundamental units of an activity, restructuring these units in a sensible way in computational languages, and then imposing these algorithmic structures of human activities on users (p. 744-747). This model, Agre wrote, “has manifested itself principally in the practices of information technologies” (p. 744). Due to the “messy” nature of information, decisions have to be made on how data are aggregated and categorized. As illustrated in Facebook’s data model below, classification is the foundation for systems to make sense of data inputs. Accordingly, as the title of the volume edited by Lisa Gitelman (2013), “Raw Data” Is an Oxymoron, suggests, data are never ‘raw’ but always already ‘cooked’ (p. 2). In other words, the personal information collected and processed by algorithmic systems is not neutral or objective but is constructed by the engineers of those systems in particular historical, social, cultural and economic circumstances. Yet the logics of these systems are opaque, if not concealed from the public eye in the name of protecting intellectual property. We live in what legal scholar Frank Pasquale (2015, p. 3) has called a “black box society,” in which algorithms determine who we are and the contours of our world without us knowing exactly how. What is fundamental here is that the algorithmic systems of data collection and analysis do not purely model reality and human behavior as they are; they are embedded with a certain ideology and perception of the world and of behaviors even before they collect data about the world. These analyses support the argument that, given the logics of data collection and analysis, there is no raw personal information per se.

Figure 2. Facebook’s data model, reverse engineered (Rieder, slide 37)
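As a rough illustration of what “grammatizing” an activity means in practice, consider the simplified sketch below: the platform’s schema, not the user, decides which units of an action exist and how they are classified. The field names and categories are assumptions for illustration, not Facebook’s actual model.

```python
# Simplified sketch of a "captured" activity record; illustrative field names only.
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional

class ReactionType(Enum):
    # the platform's pre-defined vocabulary of feeling
    LIKE = "like"
    LOVE = "love"
    ANGRY = "angry"

@dataclass
class CapturedActivity:
    user_id: int                      # the person, reduced to an identifier
    object_id: int                    # the post or page acted upon
    action: str                       # e.g. "reaction", "comment", "share"
    reaction: Optional[ReactionType]  # anything outside the vocabulary cannot be recorded
    timestamp: datetime               # metadata attached automatically

# Whatever the user actually felt or meant, only these units are stored:
record = CapturedActivity(
    user_id=42,
    object_id=1001,
    action="reaction",
    reaction=ReactionType.ANGRY,
    timestamp=datetime(2018, 11, 1, 9, 30),
)
print(record)
```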

But why do the logics of data collection and analysis matter to the protection of individual privacy? Rössler (2005) clarified how self-determination is violated when one’s expectations of being watched do not match the actual practices of surveillance:

The protection of informational privacy matters so much to people because it is an intrinsic part of their self-understanding as autonomous individuals to have control over their self-representation. By means of the information they give other people about themselves or that they know other people have always had about them, individuals simultaneously regulate the range of very diverse social relations within which they live. Without this form of self-determined control over their personal information, neither would self-determined, context-dependent, authentic behavior towards others be possible, nor would it be possible to find an answer authentically to the question of how one wants to live. (p. 116)

Consider an example in which a user grants a media platform permission to record all of her personal information (metadata and data), aware that the data will then be used for behavioral targeting advertising. Does this mean that the user has complete control over her self-realization and self-determination? Is there any other expectation that should be taken into account besides the expectation that her data are collected for behavioral targeting advertising, for example how her data are processed and quantified to produce such advertisements? By knowing the socio-algorithmic process of data collection and analysis, data subjects could make more informed decisions regarding what they are confronted with. Without this knowledge, in the age of datafication and personalization systems, their control over the knowledge others have of them is incomplete. They have little idea how the socio-algorithmic process captures and calculates flecks of their identities; therefore, they do not understand how certain information is presented to them. This can have bitter consequences: in a credit scoring system, for example, the subject does not fully understand how his personal information is perceived and how his “worthiness” and “reliability” are calculated. What “worthiness” and “reliability” mean is up to the algorithms’ authors. As the data subject has little knowledge on which to form well-founded assumptions and expectations about how the systems perceive him, the norms that constitute his informational privacy are not fulfilled. The very moment the subject becomes aware of the structure of the surveillance systems and the decomposition and composition of his bits of data may result in a change or shift in perspective, because now the person is fully aware of the assumptions, conditions, and consequences of his actions. Protecting a person’s control over his or her personal information is therefore insufficient to provide respect for that person as an autonomous subject, as Rössler (2005) argued:

the social and legal norms here must not just be a matter of form but also effective enough for me to be able to assume that my rights to informational privacy are in principle guaranteed, for autonomy can be threatened by the very fact that I no longer feel able to count on the self-evident assumption that these expectations are justified… People want to have control of their own self-presentation; they use the information others have about them to regulate their relationships and thus the role they play in various social spaces. (p. 118)

In summary, following the argument that privacy has the function of permitting and protecting an autonomous life – that is, a person is autonomous if she can ask herself what sort of person she wants to be, how she wants to live, and so on – informational privacy should take into account the categorizing, contextualizing, transforming, and meaning-making of personal information (or “the capture model”, in Agre’s term). To secure fair governance of personal information in the datafication age, informational privacy should encompass both persons’ control over access to their personal information and the socio-algorithmic processing of that information. Only when people are informed of how their self-representations and life choices are influenced by the logics of data collection and analysis can they lead an autonomous life.

References:

Agre, Philip E. “Surveillance and Capture: Two Models of Privacy”. The Information Society 10.2 (1994): 101-127.

Amazon.Jobs. “Personalization.” 2018, https://www.amazon.jobs/en/teams/personalization-and-recommendations. Accessed 1 Nov. 2018.

Cheney-Lippold, John. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society 28.6 (Nov. 2011): 164-81.

Cheney-Lippold, John. We are data: Algorithms and the making of our digital selves. NYU Press, 2017. Print.

Cole, David, “‘We Kill People Based on Metadata,’” NYR Daily (blog), New York Review of Books, May 10 2014, www.nybooks.com.

Solove, Daniel J. “A Taxonomy of Privacy.” University of Pennsylvania Law Review 154 (2006): 477-560.

Solove, Daniel J. Understanding Privacy. Harvard University Press, 2008. Print.

Fried, Charles. “Privacy [A Moral Analysis].” Yale Law Journal 77 (1968): 475-493.

Gitelman, Lisa, ed. Raw data is an oxymoron. MIT Press, 2013. Print.

Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt, 2013.

Nissenbaum, Helen. Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press, 2009.

Pasquale, Frank. The black box society: The secret algorithms that control money and information. Harvard University Press, 2015. Print.

Rieder, Bernhard. “Week 2: Affordances.” New Media Theories. University of Amsterdam. Amsterdam, 14 Sep. 2017.  

Rössler, Beate. The value of privacy. Cambridge: Polity Press, 2005. Print.

Schneier, Bruce. Data and Goliath: The hidden battles to collect your data and control your world. WW Norton & Company, 2015. Print.

Van Dijck, José. “Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology.” Surveillance & Society 12.2 (2014): 197-208.

Varian, Hal R. “Beyond big data.” Business Economics 49.1 (2014): 27-31.

Westin, Alan F. Privacy and Freedom. New York: Atheneum, 1967. Print.

Westin, Alan F. “Science, privacy, and freedom: Issues and proposals for the 1970’s. Part I – The current impact of surveillance on privacy.” Columbia Law Review 66.6 (1966): 1003-1050.

Rethinking Informational Privacy in the Datafication Age (Part 1)

The notions of personal information and privacy have often been addressed as intertwined, tightly connected concepts, where one constrains the other. That is, current theories and discussions of informational privacy are closely related to a specific notion of personal information, with privacy being conceptualized as one’s ability to control access to one’s personal information. In this paper, I seek to understand what constitutes informational privacy in the age of datafication and personalization technologies. The paper is organized in three sections: the first section is a brief summary of the prevailing concepts of informational privacy that form the main theoretical framework for this paper; the second section is dedicated to metadata as an important dimension of personal information; the third section explains why there is no such thing as “raw personal information” in algorithmic systems and why a normative privacy concept should take this into account.

Before I delve into the literature review, I want to emphasize that when I talk about privacy, I am referring to individual privacy, and that this paper aims to develop a normative concept of informational privacy within the context of contemporary logics of data collection and analysis. Moreover, since, as Solove (2008) argued, theories are meant to “be tested, doubted, criticized, amended, supported, and reinterpreted” (p. ix), this paper is not an attempt to introduce a new theory of informational privacy but to build on and develop existing theories in light of an evolving social and technological environment.

Theoretical Framework

Although various aspects of privacy can be found in the literature, for the time being I shall limit myself to exploring the dimension of informational privacy. According to Rössler (2005), informational privacy has been considered the central dimension of privacy by numerous theorists (p. 111). While there are different approaches to defining privacy, in this essay I take up the approaches adopted by Charles Fried, Alan Westin, and Beate Rössler.

American jurist and lawyer Charles Fried (1968) defined the right to privacy as one’s control over knowledge about oneself, not only in terms of the quantity of information but also of modulations in the quality of the knowledge (p. 475). In similar terms, American law professor Alan Westin (1967) developed what is now considered a philosophical groundwork for current debates about privacy law: “privacy as the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (p. 7). In arguing why individuals need privacy, he wrote, “privacy is neither a self-sufficient state nor an end in itself… It is basically an instrument for achieving individual goals of self-realization” (1966, p. 1029). This view is further extended by ethics professor Beate Rössler (2005), who proposed that “something counts as private if one can oneself control the access to this ‘something’” and that “the protection of privacy means protection against unwanted access by other people” (p. 8). Informational privacy is obtained if one is able to control the access of others to information about oneself and to form well-founded expectations and assumptions concerning what others know about oneself and how they acquire that knowledge. Accordingly, informational privacy is violated when control over knowledge about the person is infringed upon, and when her expectations and assumptions “prove to be false, are disappointed or come to nothing” (p. 111-113). The fundamental point here is that:

violations of informational privacy always… result in violations of the conditions of autonomy…. Only on the basis of the (fragile) stability of her fabric of expectations, knowledge, assumptions and selective self-disclosure is it possible for a person to exercise control over her self-presentation and thus, in a broader sense, to enjoy the possibility of a self-determined life (p. 112, 140).

To sum up this branch of thought, the dimension of informational privacy serves to secure the individual’s control over access to knowledge about herself and her expectations regarding what other people or institutions know about her, for this is essential to her self-realization and autonomous life. However, in the age of datafication and personalization technologies, this normative concept seems insufficient to guarantee one’s autonomy and self-realization, because it does not take into account the possibility of interference in one’s actions even when access to one’s information is deliberately granted. Furthermore, given the contemporary logics of data aggregation, there are dimensions of personal information that Internet users are not fully aware of, such as metadata. Therefore, although this definition of privacy in terms of control of access is highly valuable, crucial elements are left out of account. In line with this argument, technology professor Helen Nissenbaum (2009) proposed that privacy concerns should not be limited solely to control over personal information. To address this gap, she suggested regulating how institutions use data rather than how data are collected, as there is considerable evidence that the “transparency-and-choice” framework, built on the notion of “privacy as control of access”, cannot solve the core problems of flagrant data collection, dissemination, aggregation, analysis, and profiling (p. 34). It is important to note that, as legal scholar Daniel Solove (2006) argued, while technology is involved in various privacy problems, it is not their main or only cause; privacy problems have “primarily occurred through activities of people, businesses, and the government” (p. 560). This observation is an important reminder for privacy theorists to go beyond a techno- or socio-deterministic perspective. Therefore, I propose to ask what personal information (data) is collected, how neutral those data are, and what happens to the data after they are collected. By exploring these questions, we shall see how to re-conceptualize informational privacy in a way that can address the challenges posed by algorithmic profiling and data-driven logic.

The dimensions of personal information 

Since there is great ambiguity in the way the term “personal information” is used (Nissenbaum 2009, p. 4), it is important to define the dimensions of personal information used in this paper before exploring what personal information is collected. Personal information is information “relating to an identified or identifiable natural person (‘data subject’)” (definition by the European Union Directive, cited in Nissenbaum 2009, p. 4). This includes sensitive or intimate information, information about a person, and metadata generated through a user’s activities on the Net. Metadata constitute a dimension of personal information that generally receives less consideration in privacy discourses. Since metadata do not contain the actual content of communication, the threat to privacy is less acknowledged and sometimes said to be negligible. Exchanging metadata for communication and entertainment services has become the norm; most users click consent to give away “something as private” without fully understanding what it is (Van Dijck 2014, p. 200).

While users may not be accustomed to thinking about metadata because they are not immediately visible, metadata have a direct impact on users’ self-representation and identity, which in turn regulates their social relations and autonomy. According to security expert Bruce Schneier (2015), “Data is content, and metadata is context. Metadata can be much more revealing than data, especially when collected in the aggregate” (p. 75). David Cole’s research on the NSA showed that it is precisely metadata that articulate our relationship to the state as surveilled subjects (2014). Examples of metadata include whom you talk to, when you talk to them, how long you talk, which sites you visit, how long you stay, and which buttons you click on those sites. As Mayer-Schönberger and Cukier (2013) observed, in the age of datafication, “everything under the sun – including ones we never used to think of as information at all, such as a person’s location” is transformed into a data format and thereby quantified (p. 15). Although metadata do not include the actual content of your communication, “metadata alone can provide an extremely detailed picture of a person’s most intimate associations and interests” (Cole 2014, para 2). In short, metadata are crucial elements in categorizing and profiling data subjects; they consequently define who deserves “a higher level of scrutiny” (Cheney-Lippold 2017, p. 92).
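To make concrete why aggregated metadata can be so revealing even without any content, here is a minimal sketch with invented log entries; the contacts, timestamps, and durations are assumptions for illustration only.

```python
# Minimal sketch: inferring patterns from communication metadata alone.
# The log entries below are invented; no message content is used at any point.
from collections import Counter
from datetime import datetime

# (caller, callee, start time, duration in minutes) -- metadata only
call_log = [
    ("alice", "dr_nguyen_oncology", datetime(2018, 3, 2, 8, 15), 12),
    ("alice", "dr_nguyen_oncology", datetime(2018, 3, 9, 8, 10), 25),
    ("alice", "insurance_hotline",  datetime(2018, 3, 9, 9, 0), 40),
    ("alice", "mom",                datetime(2018, 3, 9, 20, 30), 55),
]

contacts = Counter(callee for _, callee, _, _ in call_log)
total_minutes = sum(duration for *_, duration in call_log)

# Without reading a single word of the conversations, the aggregate already
# suggests a health concern and who is being consulted about it.
print("Most contacted:", contacts.most_common(2))
print("Total minutes on the phone:", total_minutes)
```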

Media theorist Matteo Pasquinelli argued that since metadata now serve as the measure of the value of social relations and of what our algorithmic associations mean, “the current mode of Deleuze’s ‘societies of control’ might be termed the ‘societies of metadata’” (cited in Cheney-Lippold 2017, p. 92). As former NSA general counsel Stewart Baker said, “Metadata absolutely tells you everything about somebody’s life. If you have enough metadata you don’t really need content” (cited in Cole 2014, para 2). Furthermore, the contemporary use of metadata is oriented towards predicting human behavior. In the case of the NSA, for example, it is to determine the probability of someone committing a terrorist act (Cheney-Lippold 2017, p. 85). A frequently cited example of commercial platforms’ predictive analytics is Amazon’s recommendation engine, which the company argues exists “to create a personalized shopping experience” (Amazon.Jobs 2018). Google Now is materializing the vision of its creator: “It should know what you want and tell it to you before you ask the question” (Page, cited in Varian 2014, p. 28). In other words, from the viewpoint of surveillance and marketing, predictive analytics correlating (meta)data patterns with individuals’ behavior yields powerful information about who we are and what we do, which is in turn repurposed for institutional objectives such as the manipulation of desire and demand or the policing of groups and individuals. These examples show that the privacy risk of metadata lies in the practices of data filtering and algorithmic manipulation for commercial and other purposes, carried out by platforms that claim to be neutral facilitators of users’ web experience. In fact, metadata have become a kind of invisible asset, processed mostly separately from their original context and outside people’s awareness. Moreover, there is evidence time and again that companies and other institutions that own communication platforms monetize metadata by repackaging and selling them to advertisers or data companies (Van Dijck 2014, p. 200). Therefore, a normative concept of informational privacy must address all dimensions of personal information, including metadata, in order to protect the most private realm of individuals: the context of their communication. The next question is how the processes of data collection and analysis influence one’s self-realization and self-determined life. To answer it, we first need to understand how the algorithmic systems of data collection and analysis work at a basic level.

References: 

Agre, Philip E. “Surveillance and Capture: Two Models of Privacy”. The Information Society 10.2 (1994): 101-127.

Amazon.Jobs. “Personalization.” 2018, https://www.amazon.jobs/en/teams/personalization-and-recommendations. Accessed 1 Nov. 2018.

Cheney-Lippold, John. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society 28.6 (Nov. 2011): 164-81.

Cheney-Lippold, John. We are data: Algorithms and the making of our digital selves. NYU Press, 2017. Print.

Cole, David, “‘We Kill People Based on Metadata,’” NYR Daily (blog), New York Review of Books, May 10 2014, www.nybooks.com.

Solove, Daniel J. “A Taxonomy of Privacy.” University of Pennsylvania Law Review 154 (2006): 477-560.

Solove, Daniel J. Understanding Privacy. Harvard University Press, 2008. Print.

Fried, Charles. “Privacy [A Moral Analysis].” Yale Law Journal 77 (1968): 475-493.

Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt, 2013.

Nissenbaum, Helen. Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press, 2009.

Rössler, Beate. The value of privacy. Cambridge: Polity Press, 2005. Print.

Schneier, Bruce. Data and Goliath: The hidden battles to collect your data and control your world. WW Norton & Company, 2015. Print.

Van Dijck, José. “Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology.” Surveillance & Society 12.2 (2014): 197-208.

Varian, Hal R. “Beyond big data.” Business Economics 49.1 (2014): 27-31.

Westin, Alan F. Privacy and Freedom. New York: Atheneum, 1967. Print.

Westin, Alan F. “Science, privacy, and freedom: Issues and proposals for the 1970’s. Part I – The current impact of surveillance on privacy.” Columbia Law Review 66.6 (1966): 1003-1050.

Media activism and the public sphere (Part 2)

In Part 1, I described the main concepts central to the public sphere and digital networks. Part 2 analyzes how civic tech activist groups’ practices and imaginaries can offer useful insights into constructing the public sphere. In the conclusion, I suggest agendas for further study of the public sphere from the media and data activism perspective.


Reflections on civic tech activists’ enacting the public sphere

In this paper, the term civic tech activists refers to groups that aim to use technology to empower citizens, accelerate social change, or advance advocacy. While civic tech includes various genres, such as hacktivism, the open source movement, the open data movement, and citizens’ media, which have different agendas, goals, means, and methods, an analysis of their common ideas and concepts helps us understand how these communities are shaping the configuration of the public and the public sphere.

According to Habermas, the key members of the public sphere are civil society groups who act as a form of sensor, playing an important role in initiating, organizing and steering critical debate on matters of public interest.

“Civil society is composed of those more or less spontaneously emergent associations, organizations and movements that, attuned to how social problems resonate in the private life spheres, distill and transmit such reactions in amplified form to the public sphere. The core of civil society comprises a network of associations that institutionalizes problem-solving discourses on questions of general interest inside the framework of organized public spheres” (Habermas, Between Facts and Norms 367).

However, civil society, without formal authority, can only influence the political system “through the filters of the institutionalized procedures of democratic opinion- and will-formation and enters through parliamentary debates into legitimate law-making” (371). This argument may not always hold true, for two reasons. First, a number of grassroots civil society groups have rejected policy processes and sought to initiate critical public discourse and challenge the practices of the state not through established or legitimate means (such as elections) but through their own media or arguably illegal tactics. For example, hackers and radical tech activists deconstruct existing technology (e.g. the Chaos Computer Club reverse engineering governmental surveillance software such as the Federal Trojan in Germany)[1], or build alternative communication infrastructures to facilitate the development of an unrestricted public sphere (e.g. Tor network servers, Indymedia)[2]. Hackers’ and radical tech activists’ practices thus effectively challenge the notion of the public sphere as a set of legitimate social institutions, as opposed to the gatekeeping function of publishing in the mainstream press. In this case, the legitimacy of public opinion is not earned through dialogue with institutions. In fact, to hackers and civic tech activists, legitimacy is less important than contestation, which is exercised through protest and disruptive action. Therefore, I argue that Dean’s neodemocracies model provides a better foundation for understanding the goals and means of radical tech activists’ engagement in democratic politics.

Second, tech activists have proved able to produce effective public debate through a bottom-up approach instead of entering the institutionalized public sphere. For instance, projects like “The Setup method”,[3] which scraped the birth data of all Dutch citizens in order to set up a fake birthday gift service, provide a good starting point for debate on data and privacy issues. Using the bottom-up approach, grassroots tech collectives bridge the realms of public and private and stimulate discussion on subjects of general interest while touching on the private life spheres. To civic tech activists, especially those in the open data movement, the bottom-up approach means providing access to data and software to all citizens so that they can make their own interpretations of raw data (data as collected) and thus generate their own knowledge (Baack, Datafication and empowerment 3). At the heart of the bottom-up approach is a vision of citizen empowerment practiced by breaking the monopoly of public and privatized institutions over the control, collection, and distribution of knowledge (5). The bottom-up approach, however, mainly operates at a local level, because producing effective public debate requires close collaboration and engagement between individuals and a focus on the immediate issue.

From the open data movement’s perspective, data meaning-making as a collective practice includes recognizing the roles of actors who may have been neglected in the constitution of the public sphere, especially non-human actors. These activists perceive the infrastructure of the public sphere as relations of people, machines, software, protocols, and processes, and hold that it is important to uncover these relations to the public. Examples can be found in the projects of Share Lab,[4] which has conducted investigations into global data surveillance, unravelling the invisible infrastructure of networks and making it comprehensible to ordinary citizens. The outcomes of these projects opened up questions about the transparency of media infrastructure, privacy issues, immaterial labor, data discrimination, and the free access to and exchange of knowledge, information and technology. More importantly, they emphasize the roles of non-human actors, algorithms and infrastructure, in our daily public and private communication. This finding echoes Peters’s and Couldry’s arguments on the erosion of the public and the public sphere in increasingly fragmented, scattered environments and personalized communication networks. A significant contribution of civic tech activists is highlighting the importance of the transparency of media infrastructures in enabling open, inclusive, and critical public debate.

It is important to note that while civic tech activists are passionate about the social and political issues they promote, their practices tend to focus more on providing the infrastructure for formulating public debate than on participating in the debate itself over the longer term. In most cases, they engage in public discussion to raise awareness of the issues, initiate the debate, or assist the advocacy of organized institutions and other activist groups. This could be explained by the fact that they see their expertise as more useful in mobilizing the public sphere than in organizing and maintaining it. An interview with members of the mySociety project also confirms this understanding: “What we do is present the facts…. It is then up to other people to do with that what they will, which might well be using to promote a cause” (cited in Baack, Civic Tech n.p.).

Conclusion

It can be concluded that while much of Habermas’s understanding of the public sphere as a democratic ideal is still valuable, consensus and legal, rational procedures are not necessarily the only goal and medium for constructing the public sphere. In some cases, it is organized through contestation and conflict. Acknowledging the inevitable antagonism of political life is important to avoid the traps of “counterfeit” public opinion in the form of referendums or polls and the refeudalization of the public sphere in the form of publicity and public relations. Learning from the practices of civic tech activists, the public sphere model should also include the bottom-up approach and the transparency of non-human actors.

Since the public is a space emerging as an effect of power relations, there are different publics in different contexts. This essay focuses on conceptualizing the public sphere produced by civic tech activists in Western countries. Further research could look into how activists in other civil societies accelerate public debate. Other questions to examine could be: How do power relations in datafied publics affect the constitution of the public sphere? How do data discrimination and the digital divide affect the practices of civic tech activists in constructing the public sphere? What are the implications of assembling publics around issues through the use of big data?

References

Baack, Stefan. “Datafication and empowerment: How the open data movement re-articulates notions of democracy, participation, and journalism.” Big Data & Society 2.2 (2015): 2053951715594634.

Baack, Stefan. “Civic Tech at mySociety: How the Imagined Affordances of Data Shape Data Activism.” Krisis: Journal for Contemporary Philosophy 1 (2018). <http://krisis.eu/civic-tech-at-mysociety-how-the-imagined-affordances-of-data-shape-data-activism/>

Couldry, Nick. “The Myth of ‘Us’: Digital Networks, Political Change and the Production of Collectivity.” Information, Communication & Society 18.6 (2015): 608–626.

Dean, Jodi. “Why the Net is not a Public Sphere.” Constellations: An International Journal of Critical and Democratic Theory 10.1 (2003): 95–112. https://doi.org/10.1111/1467-8675.00315

Habermas, Jürgen. “The Public Sphere.” Media Studies: A Reader. Ed. P. Marris. (1996).

Habermas, Jürgen. Between Facts and Norms. Trans. W. Rehg. Cambridge: Polity Press, 1996.

Peters, John Durham. “God and Google” in The Marvelous Cloud: Toward a Philosophy of Elemental Media. Chicago: The University of Chicago Press, 2015, 315-369.

[1] See Kubitschko, Sebastian. “Chaos Computer Club: The Communicative Construction of Media Technologies and Infrastructures as a Political Category.” Communicative Figurations. Palgrave Macmillan, Cham, 2018. 81-100.

[2] See Milan, Stefania. (2016). Three decades of contention. The roots of contemporary activism. In Social Movements and Their Technologies. Wiring Social Change (pp. 19–48). Basingstoke, UK: Palgrave MacMillan.

[3] Read more about the project at https://www.tijmenschep.com/the-national-birthday-calendar/

[4] Read more about the projects at https://labs.rs/en/

Media activism and the public sphere (Part 1)

The public sphere as conceptualized by Jürgen Habermas concerns the relation between mass media and association in public space. It has important implications for democratic development and political transformation. The great virtue of the public sphere idea is that it calls our attention to critical reflection on the role of the media in social communication. As the media landscape and infrastructure become more and more complex, scholars have increasingly invested in revising and critiquing the public sphere idea. However, there is little scholarship looking at the practices and imaginaries of the media and technology activist groups in civil society, whom I will refer to as civic tech activists, to conceptualize how they enact the public sphere and enable rational public discourse. Since they are important actors in the constitution of the public sphere, we need to understand what they are doing to produce and shape public discussion. This paper therefore aims to answer that question and to provide a constructive evaluation of Habermas’s public sphere idea from the perspective of civil society groups. While there are many different segments of civil society, this paper focuses on media and technology activist groups as they are essentially at the front line of acting on and changing the media. In the following, I first describe the main concepts central to the public sphere and digital networks. Next, I analyze how civic tech activist groups’ practices and imaginaries can offer useful insights into constructing the public sphere. In the conclusion, I suggest agendas for further study of the public sphere from the media and data activism perspective.

Conceptual frameworks

In this section, some of the central concepts and theories mobilized for this study are explored, including Jürgen Habermas’s public sphere and its critique, John Durham Peters’s theory of elemental media, and Nick Couldry’s critique of digital networks theory.

Habermas’s theory of the public sphere has powerfully influenced academic discourses in various disciplines, from political science to media and communication studies. It is widely accepted as a standard reference for discussion and reflection on citizens’ participation in political discussion and the role of media in politics and everyday life. Through his analysis of the history of the public sphere, Habermas points out its weaknesses and contradictions, and from there formulates a normative claim of what a public sphere should be. First, it should be, in principle, accessible to all citizens. Second, Habermas defines the public sphere as:

“A domain of our social life in which such a thing as public opinion can be formed. Access to the public sphere is opened in principle to all citizens… Citizens act as a public when they deal with matters of general interest without being subject to coercion; thus with the guarantee that they may assemble and unite freely, and express and publicize their opinions freely” (The Public Sphere 55).

Despite being legitimized by the public through elections, the state is not a part of the public sphere but its counterpart. Ideally, within the public sphere, private persons come together as equals for open and rational debate and discussion in order to form public opinion, and consensus is achieved only through the force of rationality (55-58). In the process by which private opinions are transformed into public opinion, the media play an important role in mediating and mobilizing the public sphere (58). Public opinion, not mere opinion or personal opinion made public, “is formed only if a public that engages in rational discussion exists” (56). In short, Habermas’s model of the public sphere can only be achieved in the context of a liberal political culture, when citizens are free and active agents in an already rationalized society, and when the media, as the mandatary of an enlightened public, facilitate public discussion without the influence of their own commercial or political interests.

Habermas’s concept of the public sphere has received both acclaim and criticism, mostly because his normative claims were based on an analysis of the bourgeois public sphere, which relied on small-scale print media and face-to-face discussion, thus making it difficult to apply to modern societies in which citizens’ relations to the media are far more complex. Craig Calhoun suggests that the notion of a homogeneous bourgeois public sphere is damaging to practices of democracy, as it excluded women and ethnic and racial minorities and was built on the backs of the working class (cited in Dean 96). Taking this further, Dean makes the point that the norms of the public sphere have been adopted by “communicative capitalism” and turned into an ideology of publicity in the service of those with resources and power (102). One of her main concerns is that the notion of the public sphere has been based on the idea that power is external, hidden, secret (110). Therefore, Dean puts forward the theory of neodemocracies, which reject the fantasy of a public and work from the antagonism of political life (108). Neodemocracies are configured through contestation and conflict, instead of the legal and rational procedures and consensus of the public sphere model (109).

Table 1. Public sphere and neodemocracies models. Adapted from Dean, p. 108.

The development of the Internet and the emergence of personal blogging and social network platforms gave hope that the norms of the Habermasian public sphere could be fulfilled and a deliberative democracy actualized. However, as Peters’s analysis of ethereal digital media shows, the practices and structures of the new platforms are extensions of earlier media practices. Whereas Habermas is more concerned with the social and political structure of the media in mobilizing the public sphere, Couldry and Peters point to the media’s technical aspects as strongly influencing human interaction and knowledge-making. As much as Google’s search algorithms and storage infrastructure are phenomenal achievements, they essentially configure human knowledge and memory. Building his analysis on how Google’s systems work, Peters argues that the algorithms of networks are constantly updated and secret, and that the networks manage the relations people have with themselves, others, and the natural world. An important argument Peters makes is that anything that is not recorded or indexed may be lost forever in the realm of Google and eventually in the social world. This reflection matters because our identities and social lives are increasingly recorded and datafied, yet there are still so many aspects, activities, forms of knowledge, and people that remain undocumented (329). It becomes clear that networked technologies are shaping our knowledge, which becomes part of the self-constitution of the public.

In similar fashion, Couldry argues that digital networks fail to facilitate long-term public discussion and political transformation; instead, their infrastructures and protocols accelerate short-term commitments, which results in less stability in collective debate and action. From Couldry’s perspective, the public is a myth of digital networks, which encourage us to believe that our participation and interaction in public debate on social media platforms make an important contribution to social and political change (608). In fact, digital networks result in users retreating to their private spheres. As Couldry explains:

“Social media target individuals, drawing them into regular interactions with other individuals that they choose, and then monetizing that potential attention and the related consumer data that such interactions generate… A new myth is emerging about the types of collectivity that we form when we use social networking platforms: a myth of natural collectivity whose paradigmatic form lies in how we gather on platforms such as Facebook” (619-20).

Deconstructing this myth is therefore crucial to understanding how digital networks threaten the structural transformation of the public sphere: public space becomes increasingly privatized. That is to say, our participation and engagement in public discussion is partly determined by algorithms designed with commercial and private interests in mind, and individuals shrink away from public debate with those holding contrasting opinions.


References: 

Baack, Stefan. “Datafication and empowerment: How the open data movement re-articulates notions of democracy, participation, and journalism.” Big Data & Society 2.2 (2015): 2053951715594634.

Baack, Stefan. “Civic Tech at mySociety: How the Imagined Affordances of Data Shape Data Activism.” Krisis: Journal for Contemporary Philosophy 1 (2018). <http://krisis.eu/civic-tech-at-mysociety-how-the-imagined-affordances-of-data-shape-data-activism/>

Couldry, Nick. “The Myth of ‘Us’: Digital Networks, Political Change and the Production of Collectivity.” Information, Communication & Society 18.6 (2015): 608–626.

Dean, Jodi. “Why the Net is not a Public Sphere.” Constellations: An International Journal of Critical and Democratic Theory 10.1 (2003): 95–112. https://doi.org/10.1111/1467-8675.00315

Habermas, Jürgen. “The Public Sphere.” Media Studies: A Reader. Ed. P. Marris. (1996).

Habermas, Jürgen. Between Facts and Norms. Trans. W. Rehg. Cambridge: Polity Press, 1996.

Peters, John Durham. “God and Google” in The Marvelous Cloud: Toward a Philosophy of Elemental Media. Chicago: The University of Chicago Press, 2015, 315-369.

Data justice conference 2018

Facebook’s world is crumbling under the Cambridge Analytica investigation. But the case is only one example of how data have been overused and misused all over the world. I’ve signed up for the Data Justice Conference at Cardiff University, a two-day conference on 21-22 May 2018 examining the relationship between datafication and social justice. It will also be my first time going to England and Wales, so I’m very excited. I will post updates about the trip and what I learn from the conference here.

The perks of software development outsourcing jobs

In recent years Vietnam has emerged as a dynamic destination for software development outsourcing in Asia, due to the increasing number of young Vietnamese software engineers and lower labour costs compared to other IT outsourcing countries such as India, China and the Philippines (Gallaugher and Stoller 3; Phan 7; Shillabeer 159-160). According to a research report from AT Kearney, Vietnam ranks eighth among software outsourcing destinations. It is estimated that as of 2016 over 1,000 companies were operating in the software sector, employing around 70,000 people (qtd. in Austrade, n. pag.). The nation’s outsourcing industry reached US$500 million in 2014 (VINASA qtd. in Sturgeon and Zylberberg 146).

It is well known that workers in outsourcing companies have to work overtime to meet their clients’ deadlines and suffer from time zone differences with their clients (Upadhya 10). However, there is little research about the working conditions, expectations and motivations of these workers, who are making a significant contribution to the country’s economic growth. It is therefore important to carry out qualitative research on the software outsourcing industry from the perspective of labour sociology. This research is primarily based on Andrew Ross’s work on the New Economy workplace and the no-collar work mentality of new media labour. Supporting theoretical frameworks are Enda Brophy’s study of the post-Fordist relationship to labour and Monder Ram and Ruth Holliday’s work on family culture in the workplace. Nevertheless, the nature of the economy investigated in Ross’s study differs from that of developing countries’ economies. Therefore, this research consists of an analysis of the outsourcing companies’ online content and interviews with current employees of a digital art outsourcing company. Combined with insights into Vietnamese culture and education, the research examines, refines, and extends the understanding of outsourcing jobs to answer the question: What do outsourcing workers in the new media industry see as the attributes of a ‘good job’?

Theoretical framework 

The main features of what is seemingly a ‘good job’ were identified by Andrew Ross through his ethnographic research on new media companies in New York’s Silicon Alley as follows: personal freedom, high autonomy, flexibility, and opportunities for self-actualisation (7-17). Digital and online companies with these features are regarded as New Economy workplaces, in contrast with the brick-and-mortar firms of the Old Economy (9).

While there appeared to be a hidden cost of overwork and lack of job security, employees felt attached to the “irresistible work environment, one that they feared they may never enjoy again in their personal lives” (15). In New Economy workplaces, a no-collar work mentality has emerged which rejects authority and labels. The creativity and individuality of this nonconformist attitude capture the ideal of work that New Economy companies try to sell to both customers and their workers (32-50). Online and digital companies offer employees the opportunity to contribute to something bigger than themselves, and therefore deliver a form of human self-recognition that has changed the labour culture in the new media industry (Turner 89-91). In the technology scene, idealist status is earned by creating something that can change the world (Marwick 82); hence computer programmers and software engineers are motivated to pursue “meaningful” projects even if they have to work seventy-plus hours a week and sacrifice their personal lives (Ross 19).

Flexible management is one of the main features of post-Fordist workplaces, especially in the new media industry. Workers in the high-tech industry report a high level of independence, with little monitoring of their progress (Brophy 621, 629). However, the flexible system of production gives rise to precarity, which implies a range of different and less stable relations to labour: short-term contracts, part-time jobs, self-employment, no clear separation between work and free time, lack of union protection, no steady work rhythms, etc. (621).

However, these studies have been concerned primarily with the experience of workers in the West, especially in the United States, and there have been relatively few academic works on the Vietnamese software outsourcing industry from the point of view of labour. This research will document and conceptualise the workplace conditions, aspirations, and, most importantly, the attitudes of young employees towards their jobs. The research will therefore fill a gap in academic knowledge about New Economy workplace culture and politics in developing countries.

Methodology 

In order to understand the nature of the software development outsourcing workplace, this research will draw on an analysis of online content posted by two of the most well-known IT outsourcing companies in Vietnam: Glass Egg Digital and TMA Solutions. Glass Egg Digital, located in Ho Chi Minh City, is one of the leading offshore software development companies specialising in multimedia projects and has over 300 employees. Its production team includes producers, graphic artists and programmers. TMA Solutions, in contrast, with 2,000 engineers on staff, provides a wide range of IT outsourcing services. Both companies are active on Facebook, which is the most popular social medium in Vietnam, with 46 million users, and therefore an effective medium for attracting new talent (We Are Social 179). Thus, I will analyse content on the companies’ Facebook pages and the employees’ comments on this content.[2] These resources will provide information about the companies’ activities, working conditions, and employees’ attitudes towards their workplaces and their jobs.
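As a rough indication of how such exported page content could be handled in a first pass (a sketch only, not the coding scheme of this research; the file name, column name, and theme keywords are assumptions), recurring themes in the comments could be tallied before closer qualitative reading:

```python
# Rough first-pass tally of themes in exported Facebook page comments.
# The CSV file name, column name, and theme keywords are illustrative assumptions.
import csv
from collections import Counter

THEMES = {
    "training": ["training", "learn", "mentor", "course"],
    "overtime": ["overtime", "deadline", "crunch"],
    "benefits": ["salary", "bonus", "insurance", "trip"],
}

theme_counts = Counter()
with open("company_page_comments.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        text = row["comment_text"].lower()
        for theme, keywords in THEMES.items():
            if any(kw in text for kw in keywords):
                theme_counts[theme] += 1

print(theme_counts.most_common())
```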

This paper also draws initial findings from my observations while working as a project manager in Vietnamese outsourcing companies, and from informal interviews with two employees of Glass Egg Digital. The interviews give a more detailed understanding of how workers view and feel about their outsourcing jobs, workplaces, and career paths. It is important to note that this research is a starting point for a larger ethnographic study in which, given time and resources, researchers would immerse themselves in the outsourcing workplaces, observe and document the employees’ behaviour, and interview them while they are on the job.

Vietnam’s software development outsourcing industry 

Since ‘đổi mới’, Vietnam’s economic and political reforms launched in 1986, a country that was once embargoed by the US has adopted market-friendly policies to attract foreign investment into its economy. It has grown from one of the poorest countries in the world into a lower-middle-income nation (The World Bank, n. pag.). The key turning point came when Vietnam joined the WTO in 2007, which increased the country’s participation in the global economy. Furthermore, the Vietnamese government has invested heavily in ICT infrastructure to boost the export of IT services, which are considered a key sector contributing to the country’s economic development (Sturgeon and Zylberberg 158). With the government’s pledge to invest approximately US$415 million in the ICT sector by 2020, the industry is expected to continue growing (BMI qtd. in Austrade, n. pag.)[1].

Vietnam’s outsourcing companies have attracted customers who previously outsourced to China, India, and Eastern Europe, thanks to their cost advantage. More specifically, Vietnamese labour is far less expensive than China’s or India’s (Ho Chi Minh City Computer Association qtd. in Sturgeon and Zylberberg 147). However, it is estimated that demand for human resources in the software development sector will far outweigh supply over the next three years. As a result, local businesses are losing the young employees they spent time and effort training to international companies that make higher offers (Shillabeer 160).

A ‘flexible’ assembly line

Activities involved in the software production process can be divided into conception and execution tasks. Most of the work in outsourcing companies requires execution skills, including coding, testing and maintenance; only a few projects require preliminary analysis and solution design.

After analyzing the client’s brief, a project manager or senior software engineer divides the work into small pieces and assigns many workers to work on them simultaneously. For example, at Glass Egg Digital, “Each artist is responsible for a component of an art product” (Nhan, Vu. Interview, 21 December 2017). Nhan manages a long-term project that has been assigned to 50 digital artists. It is precisely this ‘assembly line’ that makes outsourcing companies time- and cost-competitive. As on a traditional assembly line, workers have limited opportunities to think outside the box. Their tasks are strictly described and monitored, which leaves them almost no space for exploring creative solutions.

According to the interviewees, only a few senior artists at Glass Egg are highly individualistic and able to turn preliminary concepts into concrete models. These senior artists are often put to work when the company competes with others in bidding for projects. After a project is acquired, the work is assigned to the artists on the ‘assembly line’. The digital artisans and the industrialisation of bohemia that Ross found in the new media industry are therefore not entirely applicable to the outsourcing industry. Creativity and individuality, the main attributes of the nonconformist mentality, are not celebrated on a large scale in outsourcing workplaces, because these companies need to put the client’s requirements above all. The competitiveness of outsourcing companies, in fact, mostly relies on their ability to deliver products exactly as described in the client’s brief.

However, as remarked by interviewees in Brophy’s study, software development work still differs significantly from a manufacturing assembly line because of the “globalized workflow of the software industry” (629). Interestingly, although working on two different continents with different business cultures, interviewees in both Brophy’s research and this study emphasised a culture of tinkering, autonomy, self-teaching, and self-motivation. The project-based production found in Silicon Valley informational and design firms (Turner 91) is remarkably pronounced in software outsourcing firms. Artists at Glass Egg Digital are encouraged to manage their own schedules as long as they deliver their work on time. This flexibility in time management is a crucial difference from manufacturing assembly lines and Old Economy firms, and it is also a compelling feature that both interviewees find irresistible.

Nevertheless, this flexible management is only possible because of the use of various tracking and reporting tools. The role of project managers, who are also called producers or accounts in some companies, is important in outsourcing firms as they are the ones who communicate with clients to understand the requirements. They connect all resources, make project plans, and drive projects to completion. In order to meet clients’ deadlines and requirements, project managers have to monitor their technical and art colleagues’ workflows and progress through tracking and efficiency tools. In some companies, employees have to report their daily activities and progress on timesheets because the outsourcing companies bill clients on the basis of man-days. Ultimately, outsourcing companies keep tight control over employees’ working progress to maintain their profit margins, even though they present themselves as flexible and open workplaces.

On the other hand, the flexible culture allows companies to demand ‘flexibility’ from their employees. As Nhan described, “Our plan depends on clients’ plan. Sometimes they change their priorities and we also have to change our plan. For urgent requirements, we have to work over time to meet their new request.” In outsourcing companies, both employees and managers “have to prepare to work overtime any time at request” (Ha, Uong. Interview, 17 December 2017).

However, the testimonies of employees of Glass Egg Digital and TMA Solutions generally suggest that they find their jobs rewarding and enjoyable. The following section will address the perks of working in Vietnamese software outsourcing companies, and from there I will propose a definition of a ‘good job’ in the new media outsourcing industry.  

The perks of outsourcing jobs

1. Training and working experience abroad

The technology disciplines, especially software development, appear to be among the most attractive career choices for Vietnamese students (Le n. pag). Young Vietnamese workers are generally characterised as resourceful and willing to take risks, but lacking initiative and English communication skills (Shillabeer 159). While Vietnam is making an effort to invest in training human resources for the technology sectors, its universities are still “obsolete”, even in comparison with other Southeast Asian countries (Mai and Yang 171). Education in general focuses extensively on theory and memorisation, which results in graduates, even those from prestigious technical universities, not being qualified enough to meet the standards of the modern international business environment (Sturgeon and Zylberberg 134). Therefore, opportunities to be trained by working on complex international projects are highly valued. It is precisely this training opportunity that is considered one of the main attributes of a good job, and one for which new graduates or inexperienced new media workers are willing to accept a low salary and long hours.

For example, Glass Egg Digital’s training programmes, held for art students every six months, attract a large number of talented artists and programmers, even though only a few of them are offered a full-time position at the company at the end of the programme. However, according to a survey conducted by Adecco (n. pag), after accumulating experience and improving their technical skills, programmers and digital artists can earn from US$500 to US$2,000, which is comparatively high given that the average monthly salary of Vietnamese university graduates in 2017 was about US$340 (Ministry of Labour, War Invalids and Social Affairs qtd. in Hoang, n. pag). Programmers and artists with fewer than five years of experience often choose workplaces where they can benefit from managers’ mentorship and training programmes. This is also reflected in TMA Solutions employees’ testimonies about why they joined the company in the first place.

English training is also one of the key ‘selling points’ of new media companies in Vietnam. Compared to Indian and Eastern European outsourcing firms, the Vietnamese workforce is at a disadvantage in terms of English communication skills, and the language barrier has been shown to affect the success of outsourcing projects (Shillabeer 159). Workers with excellent English communication skills can communicate directly with clients and thus increase their efficiency, their negotiating power with their employers, and their promotion opportunities.

While software outsourcing workers generally do not get involved in the conception phase of a project, this does not mean that their jobs are not challenging at all. In fact, most TMA Solutions employees refer to the competition with Indian companies as a challenge they find thrilling to pursue. The senior software engineers cited working on-site at clients’ main offices in the West while bidding for projects as the most valuable experience of their working lives. Business trips abroad not only provide workers with an opportunity to observe Western working styles and improve their skills; they are also perceived as a perk that only privileged employees can obtain, since travelling to the West is a lengthy and very expensive procedure.

Training and working experience abroad are considered important factors when choosing a company to work for because they increase software engineers’ bargaining power in the job market. Similar to workers in Silicon Valley (Ross 17), many software engineers and artists in outsourcing companies actively engage in building their own ‘personal brands’. This culture of individualism is motivated by an economic manner of thinking which is central to neoliberalism:

The worker has become human capital… They become individuals for whom every action, from taking courses on a new computer software application to having their teeth whitened, can be considered an investment in human capital (Read 5, 7).

A good job in the software outsourcing industry, therefore, must provide the means for workers to further their careers or to pursue a higher salary: technical and English training, and challenging work. Similar to the permatemps in Brophy’s study, junior workers in the outsourcing high-tech industry are the most vulnerable to labour precarity. For instance, junior artists at Glass Egg Digital work as contractors for six months, without dependable benefits, before signing a 12-month contract. Junior workers therefore have to sacrifice most of their waking hours to work, as a trade-off for future financial gains and job security. Conversely, due to the shortage of high-quality software engineers and artists, senior employees in the new media industry can work the system to their own advantage by using the ‘exit weapon’ at the slightest dissatisfaction or for a higher offer. The relatively high rate of employee turnover is perceived as a problem by the outsourcing companies. Thus, they have engaged in various strategies and management efforts to foster employees’ loyalty, such as cultivating a family culture.

2. Family culture and kinship

The existence of a family culture in firms which employ no family members has been acknowledged by scholars. Family culture in this case is a management ideology that is nurtured at work, without a predominance of blood ties, to encourage trust and support between employees and managers and thus ease the problems of delegation (Ram and Holliday 642). The family culture can therefore be seen as a ‘rational’ system of management. Testimonies of Glass Egg Digital employees on the company’s Facebook pages frequently refer to the friendly and ‘informal’ environment in which colleagues support each other and have fun together at work. According to Ha, the company’s culture and morale coordinator, the family culture took root when the company was small and overcoming hardships together, and it continues to be maintained through various strategies, for example numerous parties, heavily invested team-building activities, and company outings. Most importantly, the open working space without cubicles encourages workers to interact with and assist each other. Managers themselves exhibit an approachable, helpful, and personal management style.

The family culture is, in fact, not new to Vietnamese company culture, as many Vietnamese companies are run by family members. Vietnam is also one of the countries significantly influenced by Confucianism, “an ethical system, a political ideology, and a scholarly tradition developed from the teachings of the ancient Chinese philosopher Confucius”, whose influence has persisted to the present day (Walker and Truong). At the heart of Confucian discourse lies the notion that one’s identity is defined by one’s relationships with others: family, friends, and the organisations one is a member of. One’s responsibility is therefore to build a harmonious family, organisation, or society, and to contribute to the common goals of these communities. As observed, while the new media companies operating in Vietnam have adopted a flexible and open management style, Confucian ideas still run deep in the business culture and in employees’ expectations. Sometimes employees find themselves torn between the desire to achieve their financial goals and the fulfilment they can only get in a company they feel they belong to. Therefore, some workers choose to maintain a full-time job at the company they consider their ‘second big family’ and work on side projects in their free time.

It is important to note that new media outsourcing companies have cultivated family culture and kinship with innovative strategies. As Ross observed in Silicon Valley companies, “Employees’ social life draw heavily on their immediate colleagues. There are no longer boundaries between work and leisure. No one who held a New Economy job was immune to this biohazard” (19). The software outsourcing companies, in which the majority of employees are young, actively erode the boundaries between work and life. For instance, TMA Solutions has hosted an internal dating ‘game show’ to help employees find their significant other within the company. It is not clear how much influence these strategies have on employees’ attitudes and loyalty towards their company, but employees’ comments indicate that they find these ‘bonding’ activities unique and intriguing. Such unconventional human resource management can be considered an influential attribute of New Economy ‘good jobs’.

Conclusion 

Unlike no-collar workers in Internet companies in Silicon Valley, employees in software outsourcing companies do not pursue stimulating, irresistible work that can change the world; instead they pursue financial security and a sense of belonging. However, the research also found that software outsourcing workplaces in Vietnam are remarkably influenced by Silicon Valley company culture while still maintaining their own Asian values. Outsourcing companies strive to offer a modern, flexible, open, and relatively unorthodox workplace culture, which has become common in the high-tech industry. Employees in outsourcing firms therefore face the same dilemmas of the flexible management system as those in Silicon Valley. It can be concluded that while the nature of the work in outsourcing firms and in new media startups is relatively different, the hidden costs in the global informational economy, i.e. precarity and the loss of work-life balance, are the same. An important finding of this research is that the new media workforce in developing countries like Vietnam is also influenced by neoliberalism, as these workers start to see themselves as ‘human capital’, consciously increase their value, and use it as a bargaining chip with their employers.

Further research could extend the interviews to more software engineers in order to gain a more comprehensive understanding of the outsourcing workforce. Given time and resources, researchers could carry out a proper ethnography, making observations in software outsourcing companies in Vietnam, which would allow them to look critically at the nature of these companies and of outsourcing work from an outsider’s perspective. Another potential object of further in-depth research is family culture in Vietnamese new media companies, as there is little scholarship on this system of management.

Notes

[1] Since the Vietnamese government does not collect systematic data on IT services, nor does it publish all collected data on the Internet, the statistics presented in this paper may not be entirely correct. During the research, I encountered different figures for the number of firms, revenue, and employment. I chose to use statistics from the most reliable sources, such as published books and government reports. Nevertheless, the figures available from disparate sources still suggest the significant development of the software development outsourcing industry in Vietnam.

[2] Glass Egg Digital’s Facebook page: https://www.facebook.com/glassegg.gameoutsourcing/

TMA Solutions’ Facebook page: https://www.facebook.com/tmasolutions/

References

Adecco. “[Infographics] Salary Guides for IT Position”. Adecco. 2017. 19 December 2017. <https://www.adecco.com.vn/en/knowledge-center/detail/salary-guides-for-it-position>

Austrade. 2017. The Australian Trade and Investment Commission. 20 December 2017. <https://www.austrade.gov.au/australian/export/export-markets/countries/vietnam/industries/ICT>

Brophy, Enda. “System Error: Labour Precarity and Collective Organizing at Microsoft.” Canadian Journal of Communication, vol. 31, no. 3, Oct. 2006, pp. 619-638.

Gallaugher, John, and Greg Stoller. “Software outsourcing in Vietnam: A case study of a locally operating pioneer.” The Electronic Journal of Information Systems in Developing Countries 17 (2004).

Le, Van. “7 ngành được thí sinh lựa chọn nhiều nhất kỳ thi tuyển sinh đại học 2017.” Vietnamnet, 2017. Ministry of Information and Communication. 20 December 2017. <http://vietnamnet.vn/vn/giao-duc/tuyen-sinh/7-nganh-duoc-thi-sinh-lua-chon-nhieu-nhat-ky-tuyen-sinh-dai-hoc-2017-381027.html>

Mai, Phu Hop, and Jun Wu Yang. “The current situation of Vietnam education.” Social Sciences, vol. 2, no. 6, 2013, pp. 168-178.

Marwick, Alice E. Status Update: Celebrity, Publicity, and Branding in the Social Media Age. Yale University Press, 2013, pp. 73-111.

Ministry of Information and Communications. National Commission on Application of IT: Vietnam Information and Communication Technology White Book 2014. Hanoi: Information and Communications Publishing House, 2014.

Hoang, Manh. “Lao động trình độ ĐH: Thu nhập trung bình đạt 7,49 triệu đồng/tháng.” Dân trí, 2017. TW Hội Khuyến học Việt Nam. 20 December 2017. <http://dantri.com.vn/viec-lam/lao-dong-trinh-do-dh-thu-nhap-trung-binh-dat-749-trieu-dong-thang-20170915144255701.htm>

Phan, Chau. “Vietnam as an IT Outsourcing Destination.” Capstone Project for Business, Policy and Strategy Course. Available from Washington Research Library Consortium: http://aladinrc.wrlc.org/handle/1961/4778 (2008).

Ram, Monder, and Ruth Holliday. “Relative merits: Family culture and kinship in small firms.” Sociology, vol. 27, no. 4, 1993, pp. 629-648.

Read, Jason. “A Genealogy of Homo-Economicus: Neoliberalism and the Production of Subjectivity.” Foucault Studies, Feb. 2009, pp. 25-36.

Ross, Andrew. No-collar: The humane workplace and its hidden costs. Temple University Press, 2004.

Shillabeer, Anna. “The Impact of I.T. Development Outsourcing on Worker Dynamics in Vietnam.” Proceedings of the International Conference on Managing the Asian Century, Singapore, 11-13 July 2013. Ed. Purnendu Mandal. Singapore: Springer Science+Business Media, 2013. pp. 153-162.

Sturgeon, Timothy, and Ezequiel Zylberberg. “Vietnam’s Evolving Role in ICT Global Value Chains.” Vietnam as a Crossroads: Engaging in the Next Generations of Global Value Chains. Eds. Claire Hollweg, Tanya Smith, and Daria Taglioni. Washington: World Bank Group, 2017. pp. 133-158.

Turner, Fred. “Burning Man at Google: A Cultural Infrastructure for New Media Production.” New Media & Society, vol. 11, no. 1–2, 2009, pp. 73-94.

Upadhya, Carol. “Controlling offshore knowledge workers: Power and agency in India’s software outsourcing industry.” New Technology, Work and Employment, vol. 24, no. 1, 2009, pp. 2-18.

Walker, Allan, and Thang Dinh Truong. “Confucian Values and Vietnamese School Leadership.” Encyclopedia of Educational Philosophy and Theory, 2016, pp. 1-6.

We Are Social. Digital in 2017: Southeast Asia. 2017. Facebook Usage Analysis, Slideshare, <https://www.slideshare.net/wearesocialsg/digital-in-2017-southeast-asia>. Slide 179.

The World Bank. “The World Bank in Vietnam – Overview”. 2017. The World Bank. 19 December 2017. <http://www.worldbank.org/en/country/vietnam/overview>

How to research new media production

This is a short reflection on research methodology for new media production culture. Two main groups of methods are presented here:

  1. Ethnography and interviewing
  2. Discourse/textual analysis.

The methods and objects of study here are illustrated by academic research conducted in areas such as politics, media, cultural studies, and organisation and management studies.

Ethnography and Interviewing

The ethnographic method is suitable for studying a particular group of people who have the same interest, identity, and culture. Researchers who employ the ethnographic method immerse themselves in the environment or culture with the subjects of their research for an extended period of time. While “living” among the subjects, researchers observe and document their patterns of communication, behaviour and the social structure of the group (Evans 9). Researchers also employ interviewing to assist with their observation and documentation. In other words, by interviewing, researchers gain a better understanding and interpretation of their observations from their subjects’ point of view.

Andrew Ross employs ethnography and interviewing to understand employees’ attitudes towards and perspectives on their workplaces. He spent eighteen months deep inside two prominent New Economy companies, Razorfish and 360hiphop, in New York’s Silicon Alley. He also interviewed employees in the technology industry in other cities. By immersing himself in their daily working life, he was able to “learn from employees while they were on the job” (3). For example, through observation and interviewing, he shows how first-generation websters viewed their own companies as an example of how capitalist society could be reformed, and he identifies the costs (i.e. overwork) of what seem to be good and humane jobs (18-20).

Interviewing is especially important for understanding movement and conflict, as Brophy states in his research on labour conflict in digital capitalism (634). The aim of Brophy’s research is “to prioritize the fears, hopes, and desires of high-tech workers themselves over the strategies of established labour”, so interviewing is the most appropriate approach to capture these workers’ emotions, conflicts, and pressures (634).

Similarly, drawing on ethnographic research in an Internet healthcare start-up in Boston, media labs in Berlin, and startups in Bangalore, Christopher Kelty explores the cultural significance of the Free Software movement (n.p). With the aim of conceptualising geeks’ social imaginaries, Kelty carefully selected a few interviewees based on their background and experience, and then conducted extensive interviews with them on a wide range of topics (72-91). This approach differs slightly from Ross’s, whose interviews focused on specific issues, because Kelty’s object of study is broader and more philosophical. Kelty, an anthropologist himself, successfully shows how geeks use the Protestant Reformation as a metaphor for their position by looking closely at the words, tools, and stories that geeks use (72-91).

Another topic that has been explored through interviewing and ethnography is the cultural infrastructure for new media production, researched by Fred Turner (73). By participating in the Burning Man festival, Turner takes a close look at how the festival in Black Rock City became a social model for project-based production in the new media industry, one “driven by the pursuit of self-realization, project engineering, and communication” (91). The research consists of archival research and interviews with long-time participants, including computer programmers and software engineers who worked for Silicon Valley information and design firms. Hence, it not only answers the question posed at the start of the research, “How does Burning Man appeal so much to them?”, but also reveals that the spirit of the festival lingers throughout the year and plays a central role in motivating both commons-based peer production and commercial product development (89-91).

While most of the ethnographic research above was conducted in workplace contexts, Marwick’s highly anecdotal research on the social order of the new media industry includes participation in and observation of social media platforms and social gatherings of San Francisco’s tech community from 2006 to 2010 (73-111). To better understand the hierarchy implied in the interactions within the tech community, Marwick also conducted qualitative interviews with some of the tech scene’s entrepreneurs, “micro-celebrities”, tech journalists, and bloggers (73-111). These interviews then help her see the implications of the hierarchy, such as the tech scene’s ideal that status is earned by creating something that can change the world (82).

Furthermore, a study on the identities and culture of entrepreneurial labour by Neff, Wissinger, and Zukin shows that research on the new media industry is not necessarily limited to observing and interacting with workers in this industry but can be expanded to other cultural industries, as this provides a cross-disciplinary perspective on the object of study (308). Participant observation in New York City and interview data from 100 workers in the fashion media and technology fields allowed the researchers to identify the forces that led to entrepreneurial labour and to establish a model of the “good work hierarchy in new media” (308-330).

Ethnography and interviewing complement each other in most studies: interviewing serves to understand the perspectives, experiences, motivations, feelings, and expectations of the subjects of study (i.e. employees, geeks, entrepreneurs), while ethnography allows the researcher to look critically at the new media industry (i.e. workplaces, jobs, and free software communities) from an outsider’s perspective.

Discourse/textual analysis 

Discourse/textual analysis is an approach which involves deconstructing written work to examine its relationship with the ideologies and social norms of a particular society, culture or community. In other words, researchers who perform textual analysis on a text would try to make sense of the ways people in particular cultures at times see themselves and the world they live in (McKee 1).

An important aspect of new media production studies is contemporary business culture, which is analysed by Eve Chiapello and Luc Boltanski, since it is fundamental to the organisation and economic activities of new media companies (n.p). Chiapello and Boltanski trace the changes in organisations over the past 30 years by analysing published works from the field of management studies, which are argued to influence the thinking of employees and to play an important role in corporate organisational reforms (164-165). The researchers chose texts written in French in the 1960s and the 1990s that were endorsed by professional management reviews (164). This is argued to be “an easier option than studying changes in practices that can be disparate and dispersed, of varying magnitude, and which can affect the organisation of the companies” (162). While the object of this research is similar to Brophy’s, post-Fordist work structures, its goal is to provide a theoretical framework for a better understanding of the transformations that capitalism has undergone. Both studies showed how organisations can be changed by employee initiative; Chiapello and Boltanski’s study, however, aims to provide a model of change based on pragmatic analysis.

Another structural study is Alan Liu’s analysis of the mixed structure of a “politics for the really cool”. Since this cyber-politics exists almost everywhere in our activities on the Internet (the digital everyday), Liu uses a data set drawn from various sources (i.e. Wired magazine, Internet technology news sites, news and commentary from leading activist organisations, critical commentary, etc.) to identify the elements of the free cyber society and the underlying structure of cyberlibertarianism (239-241). Although Liu’s book was published in 2004, he uses many sources from the 80s and 90s (i.e. Processed World magazine’s “Bad Attitude” edition) to make a historical critique of the digital everyday (278-282). Liu explained in an interview with Geert Lovink that “the now and the far past are necessary to each other” and “can be brought into meaningful engagement” when positioned within the generational changes that “made us what we are today” (n.p).

Besides published works, magazines, online commentary, etc., researchers can also analyse the technical artifacts written and coded by the subjects of study (i.e. computer programmers, hackers, etc.). Coleman studies the ethical and aesthetic values of free and open source software developers (also called hackers) by closely reading the artifacts of the developer collective (technical documentation, code, help messages, warning messages, communication on developers’ IRC channels, etc.) (91-158). Coleman’s object of study is similar to Kelty’s: the free software movement. Yet, by looking at the technical artifacts, Coleman is able to explore hackers’ clever code, its aesthetics and effects, and how hackers organise themselves through the process of labour.

Conclusion 

In general, ethnography is most appropriate for research questions that aim to obtain insights into people’s views, feelings, motivations and actions, as well as the nature of the settings (i.e. workplace, social gatherings). It is usually a combination of detailed observations and interviews. Depending on the scope of the study subject, researchers may select a few interviewees and develop multiple extensive interviews with them, or design a broad range of questions focusing on specific issues for a number of interviewees.

In contrast, textual analysis is suitable for research that aims to understand underlying systems, structures, and ideologies. The outcome of research using textual analysis is usually a framework or model that can be used to solve contemporary problems.

 

References

Boltanski, Luc, and Eve Chiapello. “The New Spirit of Capitalism.” International Journal of Politics, Culture, and Society, vol. 18, no. 3/4, 2005, pp. 161-88.

Brophy, Enda. “System Error: Labour Precarity and Collective Organizing at Microsoft.” Canadian Journal of Communication, vol. 31, no. 3, Oct. 2006.

Coleman, E. Gabriella. Coding Freedom: The Ethics and Aesthetics of Hacking. Princeton University Press, 2012, pp. 91-158.

Evans, Sarah. Qualitative Research Methods Bibliography (2017). Web.

Kelty, Christopher. Two Bits: The Cultural Significance of Free Software. Duke University Press Books, 2008, pp. 64-94.

Liu, Alan. The Laws of Cool: Knowledge Work and the Culture of Information. University of Chicago Press, 2004, pp. 239-282.

Lovink, Geert. “Interview with Alan Liu.” Institute of Network Cultures. http://networkcultures.org/geert/interview-with-alan-liu/

Marwick, Alice E. Status Update: Celebrity, Publicity, and Branding in the Social Media Age. Yale University Press, 2013, pp. 73-111.

McKee, Alan. Textual Analysis: A Beginner’s Guide. 1st ed., SAGE, 2003.

Neff, Gina, et al. “Entrepreneurial Labor among Cultural Producers: ‘Cool’ Jobs in ‘Hot’ Industries.” Social Semiotics, vol. 15, no. 3, Dec. 2005, pp. 307–34. Taylor and Francis+NEIJM, doi:10.1080/10350330500310111.

Ross, Andrew. No Collar: The Humane Workplace And Its Hidden Costs. Temple University Press, 2004, pp. 1-20.

Turner, Fred. “Burning Man at Google: A Cultural Infrastructure for New Media Production.” New Media & Society, vol. 11, no. 1–2, 2009, pp. 73-94.

Rethinking freedom in the algorithmic age

How was your life this year? Do you want to reflect on it? Do you want to look back at what you have achieved or lost this year? Some of us might want to do that, but I believe many of us have moments we do not want to relive. While we have the right not to, we might not be able to run away from them in the new age of algorithms. In 2014, Facebook incorporated the app “Year in Review” into its social network. The app was designed to help users relive their favourite memories by featuring a selection of highlights from their year (mostly photos), pulled from their profiles, in a video. Eric Meyer, despite seeing the videos created by others popping up on his Facebook News Feed, avoided making one of his own (2). Then, one afternoon, the app inadvertently showed him what his year looked like, with one featured photo (3). There, at the top of his News Feed, amid celebratory animations, was the face of his daughter, who had died that year (4).

The app’s algorithms were designed to push a picture into the user’s timeline to urge them to use the app. The pictures were probably selected algorithmically based on their interaction scores, and the code was clearly not designed with all scenarios in mind. However, my essay does not set out to denounce algorithms or programmers, but to contemplate users’ freedom on the world wide web. The story of Meyer is painful, yet it shows how important it is to ask and answer, “What is the nature of freedom in the algorithmic age?” This essay takes up the conceptual and historical work of the philosophers Michel Foucault, Gilles Deleuze, and Isaiah Berlin to theorise personal freedom through the analysis of some digital platforms. Foucault’s and Deleuze’s readings of disciplinary power and the society of control are anticipatory and reflect our realities today; however, it is also important to approach these realities from the perspective of freedom. In the essay, I will first establish the foundation of the analysis by defining algorithms and reviewing concepts of freedom and neoliberalism, then analyse the important subjects of freedom, before concluding with what freedom is when algorithms are at work.

Algorithms

Algorithms are systems or processes consisting of sets of rules and grammars of action that describe how to perform a task. While algorithms do not have a consciousness of their own, they are not unbiased or neutral (Cheney-Lippold 166); ultimately, it is impossible to design without bias. In fact, the “architects” (programmers, product owners, project managers, etc.) either deliberately or unintentionally embed their own ideology, their own perception of the world, or their ways of doing things into the system, which are usually opaque to users unless it is an open-source system. Moreover, algorithms, especially self-learning machines, need initial data to train themselves. The historical data fed to the algorithms inevitably embeds historical practices and patterns; algorithms simply pick up on these patterns and propagate them (Agre 745). Moving further from there, the operation of personalisation algorithms is a constant feedback loop that continuously aggregates user data, i.e. users’ consumption behaviour, to classify users into dynamic categories and then modulate their online experiences (Cheney-Lippold 168).
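To make this feedback loop concrete, here is a minimal Python sketch of the logic Cheney-Lippold describes: behaviour is aggregated into a profile, the profile is mapped to a dynamic category, and the category modulates what is shown next. The category labels, the scoring rule, and the content inventory are all invented for illustration; no platform’s actual implementation looks like this.

```python
from collections import Counter

# Hypothetical illustration of a personalisation feedback loop:
# behaviour -> profile -> category -> modulated experience -> new behaviour.

def update_profile(profile: Counter, clicked_topics: list) -> Counter:
    """Aggregate the user's latest consumption behaviour into their profile."""
    profile.update(clicked_topics)
    return profile

def categorise(profile: Counter) -> str:
    """Assign a dynamic category based on whatever the profile currently shows."""
    if not profile:
        return "unknown"
    top_topic, _ = profile.most_common(1)[0]
    return f"likely-{top_topic}-consumer"  # category labels are invented here

def modulate_feed(category: str, inventory: dict) -> list:
    """Filter the content shown to the user according to their category."""
    topic = category.removeprefix("likely-").removesuffix("-consumer")
    return inventory.get(topic, inventory.get("general", []))

# One pass through the loop with made-up data.
profile = Counter()
profile = update_profile(profile, ["fitness", "fitness", "travel"])
category = categorise(profile)                      # -> "likely-fitness-consumer"
feed = modulate_feed(category, {
    "fitness": ["protein ad", "running shoes ad"],
    "general": ["news digest"],
})
print(category, feed)
# Whatever the user clicks in `feed` is fed back into update_profile(),
# closing the loop and reinforcing the category.
```

The point of the sketch is the circularity: the category is derived from past behaviour, and the behaviour it enables is already filtered by that category.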

I agree with Foucault that a conception of power should not be simply negative or juridical, and algorithms can be analysed as a technology of power which is productive because it operates by producing knowledge and desire (The Meshes of Power 154). An important note here is that in Foucault’s concept of power, knowledge is inseparable from power (Discipline and Punish: The Birth of the Prison 27). Algorithms indirectly produce knowledge of us by capturing and analysing our activities and by categorising us. Based on this knowledge, algorithms show us the advertisements we may find relevant and thereby create the desire for the products shown in those advertisements. They also produce new forms of desire, sometimes at a higher level, such as the desire to expose ourselves and to see others expose their lives and thoughts, or the desire to create an online persona, as in the case of social networks. Nevertheless, if we analyse the power of algorithms critically, we can understand its relationship with human freedom and then contemplate the dimensions of freedom. Before conceptualising freedom in the algorithmic context, it is useful to review and reflect on the two concepts of liberty identified by Isaiah Berlin, a liberal philosopher, in 1958.

The two concepts of liberty/freedom

Negative freedom, advocated by classical liberals such as John Locke, John Stuart Mill, and Benjamin Constant, is freedom from outside interference (Berlin 371). The concept of negative freedom can be summarised as: “I am no one’s slave. Freedom is my ability to operate within a certain sphere where no one else is interfering with me.” The classical liberal thinkers believed that there should be a minimum range of personal freedom and a clear division between the public and the private sphere (371). “Freedom in this sense is not, at any rate logically, connected with democracy or self-government” (373). Therefore, governing in the sense of negative freedom does not attempt to model and impose an overarching sense of the common good, but to make sure that one’s autonomy is free from constraints.

Positive freedom, advocated by socialists and even some liberals, can be summarised as: “I am my own master. Freedom is my ability to achieve my goals regardless of interference” (373). The concept of positive freedom entails being a rational, active, and responsible being. But human beings can be passionate, irrational, and ignorant. Therefore, democratic optimists such as Fichte and T. H. Green believed that democratic governing should involve creating social conditions which would provide each individual with the means to exercise their free will. It became legitimate, for them, to impose constraints or obligations on individuals so that higher forms of freedom could flourish (381). Berlin rejects this argument, which seems to defend authority and has been used by those who needed justifications for imposing their doctrines on society (382). For Berlin, in imposing positive freedom, one might actually act as an enemy of personal freedom.

Then we have neoliberalism, which adopts the strict definition of negative freedom and maintains that individuals are free to pursue their self-interest and participate in a competitive market (Read 5). It should be noted that neoliberalism rejects the positive freedom tradition and advocates reducing the role of government to a minimum. Neoliberal governmentality operates through choice architecture, shaping environments and the rules of the game to intensify individual freedom and responsible choice (Thaler and Sunstein 6).

Self-determination and Interference

What the two concepts of freedom have in common is self-determination. In both, one must ultimately be able to exercise one’s free will. Thus, it is important to analyse the self-determination of individuals in the algorithmic context. To understand how our self-determination takes place in the digital space, we can examine the case of Google’s search algorithm, one well-known component of which is PageRank. Google explains its search algorithms as follows: when a search query is entered into Google Search, the algorithms look for clues (i.e. the user’s search history) to understand what the user means by it rather than just the literal content of the query. They then match the user’s query with information on web pages and use a formula to decide how relevant a piece of content is to what the user is looking for. They also examine different aspects of the web pages to decide the order of the search results. Within this system, our knowledge is, to some extent, dictated by the various aspects that the algorithms take into account. Furthermore, the algorithms constantly evolve to better understand what you mean, sometimes better than you know yourself. For example, if you search for “how to go to Schiphol Airport”, you see a map with directions, not just links to other sites. By optimising the search results to best match users’ search intentions and deciding what is more relevant to the query, the algorithm becomes an affecter of knowledge and therefore influences users’ self-determination.
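Google’s ranking system is proprietary and vastly more sophisticated, but the basic move described above, scoring each page against a query and ordering results accordingly, can be sketched with a toy formula. The documents, the term-overlap score, and the small ‘search history’ boost below are all hypothetical; this illustrates the principle, not Google’s actual algorithm.

```python
# Toy ranking sketch: score each document against the query, then sort.
# This is NOT Google's algorithm; the scoring formula is invented for illustration.

def score(query: str, doc: str) -> float:
    """Crude relevance score: fraction of query terms that appear in the document."""
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    if not q_terms:
        return 0.0
    return len(q_terms & d_terms) / len(q_terms)

def rank(query: str, docs: dict, history_terms: set) -> list:
    """Order documents by relevance, with a small boost for pages that match the
    user's (hypothetical) search history -- the 'clues' beyond the literal query."""
    def combined(doc_id: str) -> float:
        base = score(query, docs[doc_id])
        personal = 0.1 if history_terms & set(docs[doc_id].lower().split()) else 0.0
        return base + personal
    return sorted(docs, key=combined, reverse=True)

docs = {
    "page1": "train and bus directions to Schiphol Airport",
    "page2": "history of Amsterdam Schiphol",
    "page3": "cheap flights from Schiphol",
}
print(rank("how to go to Schiphol Airport", docs, history_terms={"train"}))
```

Even in this toy version, which “aspects” enter the formula and how they are weighted is decided by the designer, which is exactly where the influence on users’ self-determination comes from.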

Optimising for success is at the core of these algorithms, which is most evident in the recommendation engines that have become enormously popular. Netflix, Amazon, and YouTube gently suggest content or products they think we will like. While one can argue that users retain the ability to choose what they want to watch on YouTube and Netflix or what they want to buy on Amazon, the space to explore options other than those presented to us is clearly very limited. In a letter to shareholders in April 2015, Jeff Bezos, CEO of Amazon, declared that the company generates “a steady stream of automated machine-learned nudges (more than 70 million in a typical week)” and that “these nudges translates to billions in increased sales to sellers” (Mac 17). It is hard to imagine that these 70 million nudges did not somehow manipulate consumers’ self-determination. This manipulation is a subtle form of interference with personal freedom.
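The machine-learned nudges Bezos mentions come out of proprietary systems, but the underlying pattern, recommending items that frequently co-occur with what a user has already chosen, can be illustrated with a minimal co-occurrence recommender. The purchase baskets and item names below are fabricated, and real recommendation pipelines are far more elaborate.

```python
from collections import defaultdict
from itertools import combinations

# Minimal "people who bought X also bought Y" nudge generator.
# Purchase histories and items are fabricated; real systems are far more complex.

purchases = [
    {"yoga mat", "water bottle"},
    {"yoga mat", "resistance band"},
    {"water bottle", "running shoes"},
    {"yoga mat", "water bottle", "resistance band"},
]

# Count how often each pair of items is bought together.
co_counts = defaultdict(lambda: defaultdict(int))
for basket in purchases:
    for a, b in combinations(sorted(basket), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def nudge(owned: set, top_n: int = 2) -> list:
    """Suggest items that most often co-occur with what the user already owns."""
    scores = defaultdict(int)
    for item in owned:
        for other, count in co_counts[item].items():
            if other not in owned:
                scores[other] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(nudge({"yoga mat"}))  # e.g. ['water bottle', 'resistance band']
```

The nudge is never coercive; it simply narrows the visible options to what past behaviour, aggregated over millions of users, makes most profitable to show.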

Control and Freedom 

In this final section, I want to discuss the relationship between control and freedom. From a Foucauldian perspective, freedom and control are not in opposition to each other; rather, they are mutually constitutive. In other words, they cannot exist without one another. “Power is exercised only over free subjects” (Foucault, The Subject and Power 221). People are controlled through freedom, not in conflict with freedom. This is certainly true in the algorithmic age, which is strongly influenced by neoliberalism: for example, Google’s ad auction lets the market set the price, and the dating website OkCupid finds your match by analysing your answers to the site’s survey.

GPS technology improves our ability to find our way in an unknown location and provides a means to exercise our free will in a strange environment. On the other hand, it also takes away our freedom in some opaque ways. For example, our movement becomes data that we give away in exchange for directions. We could also say that we depend on the directions given by the GPS technology, and that our autonomy is taken away in the process. This example illustrates the nuances of control and freedom. We are left with as much freedom as possible because any attempt to control the subjects (i.e. social media users) would undermine the system’s ability to study them. However, algorithms have optimised the environment not only to ensure that users benefit the most from their services but also to accommodate the goals of the platform owners.

Another example is that Google’s search results are not the same for everyone; they depend on various factors, including location. When someone in Germany searches for “4th of June 1989”, they are shown the historical event “Tiananmen Square protests of 1989”. By contrast, the same search query on the same platform in China gives different results; for instance, “famous birthdays” appeared at the top of the results.

Even as individual autonomy is increased, algorithmic control can be leveraged. Such practices are established on the ground of as little coercion as possible, so that one can exercise one’s freedom according to current social norms. While not forcefully imposing the current norms of freedom, civility, and industriousness on individuals, the platform owners can act as the public arbiters of social values and knowledge through the use of algorithms.

Conclusion 

The exercise of power and freedom in the West has taken new forms. Deleuze anticipated and explained the workings of control in his essay “Postscript on the Societies of Control”, which is a foundation of this essay. But what about freedom? In the algorithmic context, we are exercising a form of negative freedom, though not in its strict definition, as we are free to act and to choose within the options filtered for us based on our previous actions. Power is embedded in the algorithms with economic and/or political incentives. This is not to say that we are suffocated under the algorithmic mechanism of control. Rather, algorithms empower a kind of freedom which I would call “filtered freedom”. We are free from constraints and have a sense of full self-determination, and yet remain subject to surveillance and potential manipulation.

Ideally, this essay would have provided a more comprehensive review of the debate on freedom by including the work of Nikolas Rose, who challenged Foucault’s view of freedom. Further research on freedom in the algorithmic age could look into that and explore the ethical values of this freedom. Moreover, in any society, resistance to control always exists; it would therefore be important to research how algorithmic forms of resistance affect our understanding of freedom.

 

References

Agre, Philip E. “Surveillance and Capture: Two Models of Privacy”. The Information Society 10.2 (1994): 101-127.

Berlin, Isaiah. “Two concepts of liberty.” The Idea of Freedom. Oxford: Oxford University Press, 1979. 175-93.

Cheney-Lippold, John. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society 28.6 (Nov. 2011): 164-81.

Deleuze, Gilles. “Postscript on the Societies of Control.” October 59 (Winter 1992): 3–7.

Foucault, Michel. “The Meshes of Power.” Space, Knowledge and Power. (2007): 153-162.

Foucault, Michel. “The Subject and Power.” Critical Inquiry 8.4 (1982): 777-95.

Google. “How Google Search Works | Search Algorithms.” Google Search, https://www.google.com/search/howsearchworks/algorithms/. Accessed 24 Oct. 2017.

Mac, Ryan. “Jeff Bezos’ Letter To Shareholders: ‘Don’t Just Swipe Right, Get Married (A Lot).’” Forbes (2015), https://www.forbes.com/sites/ryanmac/2015/04/24/jeff-bezos-letter-to-shareholders-dont-just-swipe-right-get-married-a-lot/. Accessed 25 Oct. 2017.

Meyer, Eric. “Inadvertent Algorithmic Cruelty.” Thoughts From Eric, 24 Dec. 2014, http://meyerweb.com/eric/thoughts/2014/12/24/inadvertent-algorithmic-cruelty/.

Read, Jason. “A Genealogy of Homo-Economicus: Neoliberalism and the Production of Subjectivity.” Foucault Studies, (Feb. 2009): 25-36.

Thaler, Richard H., and Cass R. Sunstein. Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven, London: Yale University Press, 2008.