Understanding Disinformation

Author: Mohan Dutta

Singapore’s proposed Protection from Online Falsehoods and Manipulation Act[1] (POFMA) is one of the most extensive approaches set out to combat “fake news”, or disinformation, since the phenomenon gained global public attention following the 2016 US presidential elections. In addition to criminalising certain acts of disinformation, POFMA would grant broad discretionary powers to the executive branch of government to curtail online communications and regulate the platforms that enable them. A key premise of this legislation is the ostensibly large effect of digital disinformation.

…the challenge of disinformation is salient in monoculture environments where power is consolidated in the hands of state, military and private interests. Addressing this challenge, therefore, requires greater democracy, diversity of voices, and participation, not less

The online dissemination of disinformation is indeed a challenge, given the proliferation of online hate, threats to human health and wellbeing, and various forms of cross-border influence in electoral processes. Understanding the key theories in the literature will help us grasp the current body of evidence on the effects of digital disinformation. Situating this evidence in conversation with the broader question of power and democracy, I argue that the challenge of disinformation is salient in monoculture environments where power is consolidated in the hands of state, military and private interests. Addressing this challenge, therefore, requires greater democracy, diversity of voices, and participation, not less.

What effect does the media actually have?

Whenever a widely disseminated new medium is introduced in society, the lay idea that it has large effects tends to emerge. For example, in Myanmar, a false claim that two Muslim men had raped a Buddhist woman spread widely within 24 hours. This was followed by public disorder, riots, and violence.[2] This incident has been cited as an example of the effects of disinformation spread on social media.[3]

Yet this characterisation does not acknowledge several important material factors, such as the broader context of Islamophobic hatred sowed by the state-police-military apparatus, including through active cultivation on digital platforms, and the simultaneous repression of freedom of expression. The attribution of causality to the spread of information via social media thus misrepresents the picture. Similarly, the claim that the outcome of the most recent US presidential elections was caused by disinformation disseminated on social media fails to take into account the broader context of growing inequalities in the US and associated feelings of disenfranchisement and mistrust. Media reports of Russiagate circulated a straightforward narrative of large social media effects.[4]

…the evidence suggests complex effects that are mediated and moderated by variables ranging from individual dispositions, to communication characteristics, to broader contexts

Yet such claims of direct effects are not usually borne out. Instead, the evidence suggests complex effects that are mediated and moderated by variables ranging from individual dispositions, to communication characteristics, to broader contexts.

The following section discusses theories of the effects that the media (including social media) have.

1. Magic bullet theory: the media directly shapes what the public thinks

Scholars began to study the effects of media amid large-scale societal transformation in the 1920s and 1930s, involving rapid urbanisation and the growth of the mass media.[5] The prevailing assumption in this era was that the media were all-seeing and all-influential, directly shaping what people thought, how they felt and how they behaved. The media were portrayed as hypodermic needles that could inject beliefs directly into the public.[6] If message content was placed into a medium, it was assumed to have persuasive effect immediately on reaching the audience. Early researchers argued that “the mass production of communications” created a “mass audience”: a “conglomerate of millions who could now attend to the same message”.[7] The mass audience, it was argued, was increasingly alienated, volatile and rootless in an urbanised and industrialised society, and therefore intrinsically susceptible to manipulation. This idea that audiences could be “brainwashed” by mass mediated messages had gained legitimacy during World War I, given the preoccupation with the role of mass media in propaganda.[8]

2. Limited effects theory: the effects of media are limited

Yet the earliest classic studies failed to demonstrate these large effects. Instead, they demonstrated that the effects of media on public opinion (how the public perceive an issue or a political candidate) were minimal or, at best, limited. Summarising the findings of studies conducted between the 1930s and 1960s, the media scholar Paul Lazarsfeld proposed the limited media effects theory.[9] This theory holds that the media do not directly influence the public; instead, it proposes a range of mediators that shape the effects of mass media. Based on his review of the major media studies, Joseph Klapper published The Effects of Mass Communication in 1960, concluding that the media reinforce or sustain existing thoughts and ideas.[10]

What this research suggests is that there does not exist a single, perfect strategy for correcting misinformation

The limited effects theory now forms the mainstream paradigm of media effects research, with ongoing research examining the mediating and moderating variables that might explain how media influence audiences. For instance, the cumulative body of scholarship on effects examines message characteristics, source characteristics, channel characteristics, and the interactions among them. Other scholarship examines the influence of audience characteristics, including demographics and dispositions, on media effects. Consider, for instance, the scholarship on the effectiveness of interventions correcting misinformation[11]: what this research suggests is that there does not exist a single, perfect strategy for correcting misinformation. The success of a corrective intervention depends on factors such as the characteristics of the audience, the nature of the original misinformation, and the medium of the correction.

3. Selective processing theories: people consume media which reinforces their existing beliefs

Building on the findings summarised by Klapper, some scholars have put forward selective processing theories of media, arguing that active audience members choose to consume media content that reinforces their existing beliefs, attitudes and behaviours. Lazarsfeld, Berelson and Gaudet conducted voter choice studies, noting that partisan voters in US presidential elections would encounter messages aligned with their partisan beliefs more often than they would encounter incongruent messages.[12] Selective exposure theory observes that individuals are driven by their existing dispositions to expose themselves to media content that is aligned with those dispositions[13], such as subscribing to newspapers with the same political leaning as their own. Selective retention theory similarly notes that individuals tend to remember messages that are consistent with their existing beliefs and values.

4. Uses and gratifications theory: people use media to fulfil their needs

The uses and gratifications theory argues that individuals use various media to fulfil their needs.[14] The driving questions for this theory are “Why do people use media?” and “What uses do they put them to?” The media repertoire of an individual is shaped by the needs that they seek to fulfil through media consumption; for social media uses, these include the needs for connection, socialisation, entertainment, and information.[15]

5. Agenda setting theory: the media don’t tell us what to think, but what to think about

According to agenda setting theory, the media don’t tell us what to think, but what to think about.[16] The public agenda (the issues that the public think about) reflects the media agenda (the issues that appear as important in the media). Existing studies document the correlation between media (including digital media) and public agendas, supporting this theory. Studies also document the correspondence between mainstream media agenda and digital media agenda.[17]

6. Framing theory: how an issue is framed affects how people perceive it

Framing theory attends to how an event or issue is presented to the audience (called “the frame”), and the effects of the frame on individuals.[18] Frames offer structure to the message, and therefore constitute its meaning. The news media, for instance, frame the information they convey in specific ways by organising the message. In doing so, framing theory suggests that news media influence the audience. Existing scholarship documents the role of individual differences, channel, and other variables that constitute the effects of media frames on audience perceptions.

7. Cultivation theory: a cumulative effect over time

Cultivation theory suggests a cumulative effect, observing that exposure to media over time cultivates the audience’s perception of reality. Initially, this research focused on the effects of television, due to its ubiquity in American life and culture. As a common environment of symbols, television bound people together, and socialised them into identities, rules, and roles. Cultivation scholars suggest that we look at how stories in the digital environment cultivate perceptions of the world among their audiences.[19] Media therefore cultivate ideologies through sustained use over a long period of time.

Additional theories specific to digital communication

Although the theories presented above were initially conceptualised and empirically tested in the context of traditional media, almost all have also been tested in the context of digital media. In addition, theories developed specifically for digital communication have grappled with the distinctive features of digital communication technologies.

8. Channel complementarity theory: the internet complements, not replaces, traditional media

At the turn of the millennium, as internet penetration was rapidly growing, I started observing that the internet did not displace the other traditional sources of news and information for the American public. Based on this empirical observation across a number of different studies, I proposed the channel complementarity theory,[20] arguing that the internet, rather than displacing uses of traditional information sources, complements them. Individuals interested in procuring information in a particular content area—such as politics or health—expose themselves to a multitude of media outlets to optimise the information they obtain. The individual who seeks out specific information content on the internet is also more likely to read newspapers, watch television, or listen to radio in the same content area to fulfil their information needs.

For instance, in a study conducted in Singapore, our research team observed channel complementarity in health information seeking patterns. In a study of exposure to political information during a campaign cycle in India, our team likewise observed complementarity between exposure to political information on the internet and on traditional channels. In addition, this study observed complementarity across functions: individual sharing of political information through social media was related to exposure to political information across traditional media and the internet, as well as to sharing of political information through traditional channels.

9. Hybrid media system: you can’t separate “traditional” and “new” media

The hybrid media system proposed by Andrew Chadwick suggests that the dichotomous classification of media systems as “traditional” and “new” does not tap into the complexity, interpenetration and dynamic nature of the current media system.[21] Through close readings of several case studies, he demonstrates how various forms of interactions across a wide array of communication channels reinforce one another. The hybrid media system captures the flows between different communication channels, depicting the mixing of the logics of the various communication systems. Contemporary political campaigns, for instance, use mixes of online and offline elements in campaign communication, with these elements drawing on each other in dynamic ways.

The empirically demonstrated complexity of media effects is at odds with the direct, large effects claims made in POFMA. The literature points to the role of deliberative democracies as an antidote to the manipulative strategies of disinformation originating from economic and political power

Summary: What effect does the media actually have?

In summary, what these existing media theories propose is that the effects of media are complex and multifaceted, shaped by a wide variety of moderating and mediating variables, with effect sizes ranging from small to moderate to large. Contextual features—especially democratic and media climates—matter in understanding the effects of media on audiences.[22] The empirically demonstrated complexity of media effects is at odds with the direct, large effects claims made in POFMA. The literature points to the role of deliberative democracies as an antidote to the manipulative strategies of disinformation originating from economic and political power.

Media effects of digital disinformation (“fake news”)

The media effects literature on “fake news” suggests limited to moderate effects, with a number of questions that remain largely unanswered.[23] The research can be broadly classified into two categories: (a) sharing of and engagement with disinformation; and (b) the effects of disinformation on the beliefs, attitudes, and behaviours of the public.

Sharing and virality

Studies note the key role of digital infrastructures in political campaigns, with many political campaigns, for instance, turning to Twitter.[24] A growing body of scholarship suggests that disinformation is more likely than accurate information to go viral.[25] Moreover, these studies suggest that bots are key players in automating the spread of information and in making disinformation viral. In examining what makes a message go viral, scholars point to the role of emotions: the expression of sentiment through emojis, for instance, influences what is shared and how rapidly.

Noting that digital advertising does not have large effects, [Neuman] instead suggests that most individuals are not persuadable pawns, and are equipped to ignore advertising and propaganda

Digital campaign effects

As suggested by the limited effects literature, the effects of campaigns are shaped by partisanship: targeted campaigns reach partisan audiences and reinforce the existing beliefs of the audiences they reach. Online political advertisements are unable to convert the attitudes of voters.[26] In the US, partisanship shapes how people vote, and micro-targeting campaigns don’t shape voting outcomes.[27] As Neuman notes, a “media message, intended to be persuasive or otherwise, is not likely to stimulate a singular response, but rather a distribution of responses across a population of those who have encountered the message”.[28] Noting that digital advertising does not have large effects, he instead suggests that most individuals are not persuadable pawns, and are equipped to ignore advertising and propaganda.

Effects of disinformation

In their synthesis of studies of the effects of “fake news”, Egelhofer and Lecheler note the absence of studies that empirically document such effects. The limited studies that do exist suggest limited effects. In the US, visitors to “fake news” websites constitute a small audience compared to the total population, as well as to the audience of mainstream news websites.[29]

Examining online audience data from late October through late November of 2016 in the US, Allcott and Gentzkow concluded that only a small fraction of Americans consumed fake news.[30] Moreover, visitors to established news sites spend more time with them than visitors to fake news sites do with those sites.[31] An online survey of a national sample of 2,525 Americans, combined with their web use behaviours between 7 October and 14 November 2016, found that approximately one in four Americans visited a fake news website.[32] With Facebook forming the primary source of fake news traffic, the pattern of selective exposure was largely concentrated among participants with the most conservative information diets. Similar observations are made by Nelson and Taneja, who observe that the primary audience of fake news sources is a small group of heavy internet users who are not loyal to those sources.[33]

Inter-media agenda setting

The hybrid media model discussed earlier documents the interlinked relationships among diverse media. The agenda of mainstream media establishes the agenda of social media and vice versa, as newsrooms dedicate teams to monitoring digital spaces for stories. Disinformation, when picked up and disseminated through traditional media channels, performs an agenda-setting function.[34]

Context of digital disinformation

Digital information is embedded in relationships and networks of power. Consider, for instance, Cherian George’s analysis of case studies of “hate spin”—the “manufactured vilification or indignation, used as a political strategy that exploits group identities to mobilise supporters and coerce opponents”.[35] He finds that the manufacturing and dissemination of hate is a mainstream political strategy used by ruling governments. Power is central to George’s analysis, which attends to the powerful forces that gain politically through the production and circulation of hate.

…the manufacturing and dissemination of hate is a mainstream political strategy used by ruling governments

In contexts where “fake news” or disinformation seems to have had a strong effect, including contributing to violence, some combination of the following factors is observed:

  1. democratic structures are weak;
  2. democratic institutions have been depleted;
  3. instruments of the state sponsor and circulate disinformation;
  4. mainstream political parties/actors support the disinformation campaign; and
  5. audiences perceive a climate of crisis.[36]

Disinformation is weaponised as a political tool that works alongside other tools such as the censorship of opposing views, the erasure of truthful accounts and the direct use of violence. For instance, in India, Islamophobic disinformation is planted by networks/organisations attached to the ruling political party and feeding on majoritarian politics. The disinformation is quickly shared through social media such as Twitter, WhatsApp and Facebook, and works together with offline hate-driven organising (Islamophobic outfits attached to the ruling Hindutva party). Incidents of hate-driven violence therefore have been actively sponsored by the ruling political party that has simultaneously attacked democratic avenues of alternative expression. Given the difficulty in designing experiments around these external factors, however, there is ultimately limited quantitative and causal support for the conclusion that disinformation can be tied to large effects in the form of violence.

Similarly, propaganda played a key role in organising state/military-led violence and control in the mass genocide of individuals alleged to be attached to the Indonesian Communist Party (PKI) in the 1960s.[37] The state and military were involved in manufacturing disinformation that blamed the PKI for the 30 September coup. This disinformation sought to imply a PKI conspiracy, vilifying the PKI and narrating a story of PKI brutality, in order to cultivate a Communist “Other”. Manufactured accounts of sexualised torture seeded fear and planted large-scale hysteria about a putative Communist threat, amid a crisis created by the military and the overt censorship of media. Here, disinformation was part of a political campaign of repression, censorship, manipulation and organised violence. The power of this campaign, and of the active state-military collaboration to silence voices, continues to erase and downplay current conversations on the genocide. That this disinformation campaign succeeded, and continues to succeed, despite originating in an era without any form of digital media illustrates the importance of the key factors noted above: the roles of power, violence, censorship, democratic climate, and opportunity for voice in constituting the success of disinformation campaigns.

…the power to disseminate information is increasingly consolidated in the hands of a narrow and networked power elite. The perception created on digital platforms—of many voices—should be juxtaposed against the backdrop of limited opportunities for voices to be heard in ways that matter

It is worth noting the proliferation of communication monocultures across the globe, where the power to disseminate information is increasingly consolidated in the hands of a narrow and networked power elite.[38] The perception created on digital platforms—of many voices—should be juxtaposed against the backdrop of limited opportunities for voices to be heard in ways that matter.[39] The growing sense of disenfranchisement among people across the globe is tied to neoliberal policies that have on one hand entrenched political and economic power, and on the other hand, have systematically removed opportunities for deliberative democratic participation.[40] The challenge for democracy now is the erasure of public spaces for meaningful participation. The systematic study of the effects of disinformation needs to be situated within this broader context, and in balance with the literature that documents the democratising role of digital platforms.

Conclusion

The media reports and public discourse on the effects of disinformation are based upon an assumption of large effects that is not currently borne out by existing research.[41] The media depictions of data-driven campaigns do not correspond with the reality of political campaigning on the ground.[42] Similarly, the rhetoric of large effects, which is often used by data-driven political marketing companies in securing large client bases, does not match the scientific evidence on the effectiveness of data-driven political marketing.

Certainly, the accelerated circulation of information on digital platforms presents important possibilities for democratising communication while at the same time posing important questions about the role of disinformation. This presents a challenge for both media researchers and policy makers. However, at this point, political communication scientists note the absence of evidence for making strong claims about the large effects of online disinformation and thus the putative need for rapid regulatory responses.[43] Instead, systematic studies of effects of disinformation are needed.[44]

The claims of “large effects” in the submissions to the Select Committee hearings and the incorporation of these effects claims into the Bill ought to be closely interrogated in the backdrop of the current science of media effects, the role of narrative power, and the nature of deliberative democracy in Singapore

Given the limits to democracy and freedom of expression in Singapore, particular caution should be exercised in assessing the regulatory reach of POFMA. The claims of “large effects” in the submissions to the Select Committee hearings, and the incorporation of these effects claims into the Bill, ought to be closely interrogated against the backdrop of the current science of media effects, the role of narrative power, and the nature of deliberative democracy in Singapore. Particularly salient is the role of diverse voices in contesting the “truth claims” made by those in power. POFMA, by placing decision-making power in the hands of ministers, consolidates this power rather than enabling spaces for democratic articulations as the basis of deliberation.

What we do know from the literature is the role of power. The consolidation of power and the absence of democratic opportunities and institutions for participation reproduces an environment where disinformation acts oppressively, actively erasing the opportunities for democratic voices to emerge in open communication processes for seeking truth. Often, powerful private and state actors are the sources of disinformation, actively building a communication monoculture through instruments and tools controlled by the state.

When the overarching environment has been constrained by limited choices and limited possibilities for democratic participation, disinformation planted by hegemonic political-economic actors can quickly spread and contribute to strong effects. Against this backdrop, what is the role of voice, especially the marginalised voice, in seeking truth?[45] As we work on building the evidence base on the effects of digital disinformation, both media theory and practice need to explore the role of what I call “voice democracy”—democratising the communicative spaces for voices of citizens (particularly marginalised citizens) to be heard[46]—in countering disinformation and strengthening democracy.

References

[1]https://www.parliament.gov.sg/docs/default-source/default-document-library/protection-from-online-falsehoods-and-manipulation-bill10-2019.pdf

[2]United Nations Human Rights Council. (2019). Report of the international Fact Finding Mission on Myanmar.

[3]Channel News Asia. (2019, April 13). Proposed law on falsehoods has ‘clear oversight mechanism’ to prevent abuse by Government, says Shanmugam. https://www.channelnewsasia.com/news/singapore/proposed-law-on-falsehoods-has-clear-oversight-mechanism-to-11438132

[4]Mayer, Jane (October 1, 2018). “How Russia Helped Swing the Election for Trump”. The New Yorker. Retrieved May 2, 2019.

[5]De Fleur, M. L. (1998). Where have all the milestones gone? The decline of significant research on the process and effects of mass communication. Mass Communication & Society, 1, 85–98; De Fleur, M. L., & Ball-Rokeach, S. (1988). Theories of mass communication (5th ed.). New York: Longman.

[6]Curran, J., Gurevitch, M., & Woollacott, J. (1982). The study of the media: Theoretical approaches. In M. Gurevitch, T. Bennett, J. Curran, & J. Woollacott (Eds.), Culture, society and the media (pp. 11-29). New York: Methuen and Co.

[7]Neuman, W. R., & Guggenheim, L. (2011). The evolution of media effects theory: A six-stage model of cumulative research. Communication Theory, 21(2), 169-196.

[8]Lasswell, H. D. (1935). The person: Subject and object of propaganda. The Annals of the American Academy of Political and Social Science, 179(1), 187-193; Lasswell, H. D., Casey, R. D., & Smith, B. L. (Eds.). (1969). Propaganda and promotional activities: an annotated bibliography. Chicago: University of Chicago Press.

[9]Lazarsfeld, P. F., Berelson, B., & Gaudet, H. (1948). The people’s choice: How the voter makes up his mind in a presidential election. New York: Columbia University Press; Katz, E. (2001). Lazarsfeld’s map of media effects. International Journal of Public Opinion Research, 13(3), 270-279.

[10]Klapper, J. T. (1960). The effects of mass communication. New York, NY: Free Press.

[11]Southwell, B. G., & Thorson, E. A. (2015). The Prevalence, Consequence, and Remedy of Misinformation in Mass Media Systems. Journal of Communication, 65(4), 589-595.

[12]Lazarsfeld, P. F., Berelson, B., & Gaudet, H. (1948). The people’s choice: How the voter makes up his mind in a presidential election. New York: Columbia University Press.

[13]Stroud, N. J. (2017). Selective exposure theories. In The Oxford handbook of political communication.

[14]Katz, E., Blumler, J. G., & Gurevitch, M. (1973). Uses and gratifications research. The Public Opinion Quarterly, 37(4), 509-523.

[15]Phua, J., Jin, S. V., & Kim, J. J. (2017). Uses and gratifications of social networking sites for bridging and bonding social capital: A comparison of Facebook, Twitter, Instagram, and Snapchat. Computers in Human Behavior, 72, 115-122.

[16]McCombs, M. E., Shaw, D. L., & Weaver, D. H. (2014). New directions in agenda-setting theory and research. Mass communication and society, 17(6), 781-802.

[17]Weimann, G., & Brosius, H. B. (2015). A new agenda for agenda-setting research in the digital era. In Political Communication in the Online World (pp. 26-44). Routledge.

[18]Scheufele, D. A., & Tewksbury, D. (2006). Framing, agenda setting, and priming: The evolution of three media effects models. Journal of communication,57(1), 9-20.

[19]Morgan, M., Shanahan, J., & Signorielli, N. (2015). Yesterday’s new cultivation, tomorrow. Mass Communication and Society, 18(5), 674-699.

[20]Dutta, M. J. (2004). Complementarity in consumption of news types across traditional and new media. Journal of Broadcasting & Electronic Media, 48(1), 41-60.

[21]Chadwick, A. (2017). The hybrid media system: Politics and power. Oxford University Press.

[22]Habermas, J. (2006). Political communication in media society: Does democracy still enjoy an epistemic dimension? The impact of normative theory on empirical research. Communication theory, 16(4), 411-426.

[23]Egelhofer, J. L., & Lecheler, S. (2019). Fake news as a two-dimensional phenomenon: a framework and research agenda. Annals of the International Communication Association, 1-20.

[24]Jungherr, A. (2016). Twitter use in election campaigns: A systematic literature review. Journal of information technology & politics, 13(1), 72-91.

[25]Shao, C., Ciampaglia, G. L., Varol, O., Yang, K. C., Flammini, A., & Menczer, F. (2018). The spread of low-credibility content by social bots. Nature communications, 9(1), 4787.

[26]Broockman, D. E., & Green, D. P. (2014). Do online advertisements increase political candidates’ name recognition or favorability? Evidence from randomized field experiments. Political Behavior, 36(2), 263-289.

[27]Kreiss, D. (2017). Micro-targeting, the quantified persuasion. Internet Policy Review, 6(4). doi: 10.14763/2017.4.774.

[28]Neuman, W. R. 2016. The digital difference: Media technology and the theory of communication effects. Cambridge, MA: Harvard University Press.

[29]Nelson, J. L., & Taneja, H. (2018). The small, disloyal fake news audience: The role of audience availability in fake news consumption. new media & society, 20(10), 3720-3737.

[30]Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Stanford, CA: Stanford University. Available at: http://web.stanford.edu/~gentzkow/research/fakenews.pdf

[31]Nelson, J. L., & Taneja, H. (2018). The small, disloyal fake news audience: The role of audience availability in fake news consumption. new media & society, 20(10), 3720-3737.

[32]Guess, A., Nyhan, B., & Reifler, J. (2018). Selective exposure to disinformation: Evidence from the consumption of fake news during the 2016 US presidential campaign. European Research Council, 9.

[33]Nelson, J. L., & Taneja, H. (2018). The small, disloyal fake news audience: The role of audience availability in fake news consumption. new media & society, 20(10), 3720-3737.

[34]Vargo, C. J., Guo, L., & Amazeen, M. A. (2018). The agenda-setting power of fake news: A big data analysis of the online media landscape from 2014 to 2016. New Media & Society, 20(5), 2028-2049

[35]George, C. (2016). Hate spin: The twin political strategies of religious incitement and offense-taking. Communication Theory, 27(2), 156-175; George, C. (2016). Hate spin: The manufacture of religious offense and its threat to democracy. MIT Press.

[36]Udupa, S. (2018). Enterprise Hindutva and social media in urban India. Contemporary South Asia, 26(4), 453-467.

[37]Roosa, J. (2006). Pretext for Mass Murder: the September 30th Movement and Suharto’s coup d’état in Indonesia. Univ of Wisconsin Press.

[38]McChesney, R. W. (2013). Digital disconnect: How capitalism is turning the Internet against democracy. The New Press.

[39]Couldry, N. (2010). Why voice matters: Culture and politics after neoliberalism. Sage publications.

[40]Calhoun, C. (2015). Democratizing inequalities: Dilemmas of the new public participation. NYU Press; Dutta, M. J. (2016). Neoliberal health organizing: Communication, meaning, and politics. Routledge.

[41]Cadwalladr, C. (2017). The great British Brexit robbery: how our democracy was hijacked. The Guardian, 7; Cadwalladr, C., & Graham-Harrison, E. (2018). Cambridge Analytica: links to Moscow oil firm and St Petersburg university. The Guardian, 17.

[42]Baldwin-Philippi, J. (2017). The myths of data-driven campaigning. Political Communication, 34(4), 627-633.

[43]Bodó, B., Helberger, N., & de Vreese, C. H. (2017). Political micro-targeting: a Manchurian candidate or just a dark horse?. Internet Policy Review, 6(4).

[44]Zuiderveen Borgesius, F., Möller, J., Kruikemeier, S., Ó Fathaigh, R., Irion, K., Dobber, T., … & de Vreese, C. H. (2018). Online political microtargeting: Promises and threats for democracy. Utrecht Law Review, 14(1), 82-96.

[45]Dutta, M. J. (2014). A culture-centered approach to listening: Voices of social change. International Journal of Listening, 28(2), 67-81.

[46]Dutta, M. J. (2011). Communicating social change: Structure, culture, and agency. Routledge.


Mohan Dutta

Mohan J. Dutta is Dean's Chair Professor of Communication and Director of the Center for Culture-Centered Approach to Research and Evaluation (CARE) at Massey University. Previously, he was Provost Chair Professor and Head of Communications and New Media at the National University of Singapore.
