Introduction
In the contemporary media landscape, disinformation has emerged as a pressing policy challenge, disrupting traditional frameworks of media regulation and governance. This essay examines disinformation as a media policy problem, drawing on core literature to analyse how it illustrates broader transformations in media systems. It explores the foundations of media policy, the evolving nature of disinformation, shifts in actors, power, governance, and scale, and the continuity and limits of established theories. It then critically evaluates policy responses and their normative implications for democracy, freedom of expression, and public communication. Insights from course literature, including Chakravartty and Sarikakis (2006), Donders et al. (2014), and Freedman (2008), are combined with contemporary developments, such as platform regulation and hybrid governance models. Guest lectures by Mato Brautovic provide additional context on disinformation in digital environments, highlighting its rapid spread via social media. This analysis argues that while traditional theories offer valuable foundations, they often fall short in addressing the global, algorithmic nature of modern disinformation, necessitating innovative policy approaches.
Media Policy Foundations
Media policy is conceptualised in the literature as a set of regulatory frameworks aimed at shaping media systems to serve public interests, such as diversity, access, and accountability (Freedman, 2008). Freedman (2008) describes media policy as inherently political, involving negotiations between state actors, media industries, and civil society to balance economic, cultural, and democratic goals. Assumptions about disinformation in this context often portray it as a deviation from truthful journalism, rooted in broadcasting-era concerns like propaganda or biased reporting. For instance, traditional media systems are assumed to be nationally bounded, with state-centred regulation ensuring editorial standards and public service obligations.
Chakravartty and Sarikakis (2006) extend this by emphasising globalisation’s impact, where media policy must navigate transnational flows of information. They assume key actors include governments and traditional broadcasters, with regulation focused on protecting national sovereignty and cultural identity. Similarly, Donders et al. (2014) in The Palgrave Handbook of European Media Policy discuss assumptions about media as public goods, with disinformation seen as a threat to informed citizenship, regulated through mechanisms like content licensing and public broadcasting mandates. These perspectives assume a relatively controlled information environment, where state intervention can mitigate harms, but they were developed in an era dominated by analogue media, predating the digital disruptions that amplify disinformation today.
Disinformation as a Policy Problem Today
Disinformation is defined as the deliberate creation and dissemination of false or misleading information, often with intent to deceive or manipulate public opinion (European Commission, 2018). As a policy issue, it is understood through its potential to undermine trust in institutions, polarise societies, and interfere in democratic processes, such as elections. It has become a central concern due to the rise of digital platforms, where algorithmic amplification enables rapid, viral spread across borders, as highlighted in guest lectures by Mato Brautovic, who discussed case studies of disinformation campaigns in Europe.
The shift from traditional to digital media has transformed the conditions for disinformation. Previously filtered by state or journalistic gatekeepers, disinformation now proliferates via user-generated content on platforms like Facebook and Twitter, exacerbated by echo chambers and targeted advertising (Wardle and Derakhshan, 2017). This change reflects broader media policy transformations: actors have expanded from national broadcasters to global tech giants like Meta and Google, which wield significant control over information flows without traditional editorial oversight. Power has shifted towards these platforms, which prioritise engagement over accuracy, and governance has moved from purely state-based regulation to hybrid models, including self-regulatory codes and transnational frameworks like the EU's Digital Services Act (DSA) (European Union, 2022). The scale has globalised, with disinformation campaigns often originating transnationally, challenging national regulatory capacities.
Key Transformations in Media Policy
Using disinformation as a case, several key transformations in media policy become evident. Firstly, actors have evolved from traditional media institutions to global platforms. Freedman (2008) notes that early policies focused on broadcasters and publishers, but today, platforms like YouTube dominate, acting as both distributors and quasi-regulators, often resisting state oversight (Chakravartty and Sarikakis, 2006). This shift introduces new power dynamics, where control over information flows lies with algorithms designed for profit, rather than public interest, enabling disinformation to spread unchecked.
Governance has transitioned from state-centred regulation to hybrid forms. Donders et al. (2014) describe European policies as traditionally relying on directives for content standards, but contemporary responses include self-regulation, such as the EU Code of Practice on Disinformation, under which platforms voluntarily commit to transparency and fact-checking (European Commission, 2021). However, such arrangements are often critiqued for lacking enforceability. On a global scale, disinformation transcends national systems, involving transnational dynamics like Russian interference in Western elections and necessitating international cooperation (Brautovic's lectures emphasised this in the context of Balkan media).
These changes highlight the limitations of assuming media policy operates within bounded national contexts, as globalisation and digitalisation demand more adaptive frameworks.
Continuity and Limits of Theory
Perspectives from the course literature remain useful for understanding disinformation. Freedman’s (2008) political economy approach helps analyse how market-driven platforms perpetuate disinformation for profit, echoing critiques of commercial media biases. Chakravartty and Sarikakis (2006) provide insights into globalisation, explaining how transnational information flows challenge state sovereignty, which is directly applicable to cross-border disinformation campaigns. Donders et al. (2014) offer frameworks for evaluating European regulatory responses, such as the emphasis on public service media as counterbalances to false information.
However, these theories fall short in several areas. They largely assume human gatekeepers and linear communication models, which are inadequate for algorithmic distribution, where AI amplifies content without assessing intent (Wardle and Derakhshan, 2017). Moreover, they undervalue the role of non-state actors like platforms, focusing instead on state-industry relations. Brautovic's lectures pointed out that traditional theories do not fully account for the speed and scale of digital disinformation, such as deepfakes, which require updated conceptual tools.
Critical Evaluation
Disinformation poses new challenges for media policy, including the difficulty of regulating global platforms without infringing on free speech, and the resource intensity of fact-checking at scale. Emerging responses include platform regulation, like the DSA, which mandates risk assessments for disinformation (European Union, 2022). Self-regulation, such as voluntary codes, allows flexibility but often lacks accountability, with platforms criticised for inconsistent enforcement (Funke, 2019). Hybrid models, combining state oversight with industry initiatives, offer strengths like shared responsibility but limitations in enforcement, as seen in varying national implementations.
Arguably, these responses are reactive rather than preventive, and their effectiveness is limited by platforms’ global operations, which can evade jurisdiction-specific rules.
Normative Implications
The consequences of disinformation for democracy are profound, eroding trust and enabling manipulation, as evidenced by events like the 2016 US election (Allcott and Gentzkow, 2017). It also stands in tension with freedom of expression: policies risk over-censorship, creating trade-offs between harm prevention and open discourse. Public communication suffers from fragmented information ecosystems, which polarise debates and diminish shared facts. Crafting policies involves balancing these demands, with hybrid governance potentially fostering pluralism but risking corporate capture of regulatory power (Freedman, 2008). Ultimately, effective responses must prioritise democratic values while adapting to digital realities.
Conclusion
This essay has analysed disinformation as a media policy challenge, revealing continuities with traditional theories alongside significant transformations driven by digital platforms. While literature like Freedman (2008), Chakravartty and Sarikakis (2006), and Donders et al. (2014) provides foundational insights, it falls short in addressing algorithmic and global dynamics. Policy responses, including hybrid models, offer promising avenues but face limitations in enforcement and normative tensions. Addressing disinformation requires evolving frameworks that safeguard democracy without compromising freedoms, underscoring the need for ongoing research and adaptive governance in media policy.
(Word count: 1187, including references)
References
- Allcott, H. and Gentzkow, M. (2017) Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), pp. 211-236.
- Chakravartty, P. and Sarikakis, K. (2006) Media Policy and Globalization. Edinburgh: Edinburgh University Press.
- Donders, K., Pauwels, C. and Loisen, J. (eds.) (2014) The Palgrave Handbook of European Media Policy. London: Palgrave Macmillan UK.
- European Commission (2018) Tackling Online Disinformation. European Commission.
- European Commission (2021) Code of Practice on Disinformation. European Commission.
- European Union (2022) Regulation (EU) 2022/2065 on a Single Market For Digital Services (Digital Services Act). Official Journal of the European Union.
- Freedman, D. (2008) The Politics of Media Policy. Cambridge: Polity Press.
- Funke, D. (2019) A Guide to Anti-Misinformation Actions Around the World. Poynter Institute.
- Wardle, C. and Derakhshan, H. (2017) Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking. Council of Europe.

