The Emotional Ecosystem of Propaganda (EEP) Model: An Expanded Framework
Introduction
The Emotional Ecosystem of Propaganda (EEP) is a theoretical framework that redefines propaganda as a dynamic network of emotional influence rather than a mere vehicle of fixed ideology. The original EEP model highlighted how modern propagandists primarily exploit a triad of emotions – fear, anger, and pride – to capture attention, mobilize groups, and cement loyalties. In this view, propaganda in the 21st century has “shifted decisively” from overt ideological indoctrination to the engineering of emotional states as the chief means of shaping public perception. While this provided a powerful lens on contemporary misinformation (e.g. outrage cycles, nationalist fervor, fear of enemies), it became clear that a fuller emotional and cognitive account was needed. In particular, the initial model under-emphasized positive or prosocial emotions, the role of pseudo-rational and ideological content, and the crucial distinction between illegitimate manipulation and constructive persuasion.
This revised EEP model builds on the original by deeply integrating three new dimensions: (1) a broader emotional palette that includes hope, love, compassion, inspiration, and other positive emotions alongside fear, anger, and pride; (2) the interplay of rationality and ideology with emotion, explaining how selective “facts,” expert endorsements, and ideological narratives reinforce emotional appeals; and (3) an ethical and contextual evaluation framework to distinguish propaganda that is manipulative and harmful from emotionally resonant advocacy that can be morally justified. The result is a comprehensive model reflecting systemic integrity, interdisciplinary rigor, and explanatory depth suitable for advanced scholarship and teaching. Each section below elaborates on one of these dimensions with precise definitions and real-world examples, illustrating how the EEP operates as an emotional–cognitive ecosystem of influence.
Expanding the Emotional Palette of Propaganda
The Emotional Triad vs. a Full Spectrum: The original EEP core identified fear, anger, and pride as the dominant emotions weaponized by propagandists. Fear is used to seize attention and induce a sense of threat; anger is stoked to direct aggression towards scapegoats and galvanize action; pride (often in nation, religion, or group identity) is invoked to foster loyalty and a sense of superiority. Indeed, extensive research confirms that negative high-arousal emotions like fear and outrage make messages more viral and memorable, aiding propagandists in bypassing critical thinking. However, propaganda “traffics mostly in emotions, and not just negative ones”. Analysts note that propagandists will use any emotion – including hope, love, empathy, and inspiration – “so long as it shortcuts critical thinking”. A revised emotional palette therefore recognizes positive emotions as equally instrumental in propaganda’s arsenal, either to manipulate or to motivate.
Positive Emotions in Manipulative Propaganda: While fear and hatred are the blunt instruments of coercive propaganda, positive emotions can be co-opted to mask manipulative intent or broaden appeal. For example, hope can be exploited by authoritarian movements to present a utopian vision or a savior figure. A notorious Nazi poster from the early 1930s showed destitute Germans with the caption “Unsere letzte Hoffnung: Hitler” (“Our Last Hope: Hitler”) – explicitly framing Adolf Hitler as the beacon of hope for a nation in despair. By offering false hope of salvation and renewal, Nazi propagandists tapped a positive emotion to seduce the public, complementing the fear and anger they also fomented. Likewise, propagandists frequently invoke love and empathy in distorted ways. Totalitarian regimes may foster a cult of love or trust toward the leader (e.g. portraying the dictator as a loving father-protector of the people) to engender loyal devotion. Such paternalistic propaganda manipulates compassion and trust – “we care for you like a loving parent” – to discourage dissent. Extreme nationalist or sectarian propaganda often cultivates an intense love of the in-group (fatherland, ethnic group, religious community) as the flipside of hatred toward an out-group. This emotional duality was evident in fascist and ultranationalist campaigns that paired love of country with fear of traitors or outsiders. Even inspiration and heroic pride can be harnessed for dark purposes: terrorist organizations and extremist militias frequently produce slick videos glorifying martyrs and warriors, aiming to inspire recruits with a sense of heroic purpose and pride in joining a grand cause, while glossing over the violent or nihilistic reality. In sum, positive emotions are not inherently “good” in propaganda contexts – they too can be weaponized as seductive forces that disarm skepticism.
As one analysis emphasizes, “our emotional system can be manipulated to destructive ends” when propagandists play on virtues like courage, hope, or love to advance a harmful agenda.
Positive Emotions in Constructive Persuasion: Conversely, the expanded EEP model also accounts for prosocial and constructive uses of emotional appeal – cases that blur the line between propaganda and public service messaging. History offers examples where stirring hope, inspiration, and unity served ethical or widely beneficial ends. During World War II, Allied governments produced propaganda not only to vilify the enemy but also to uplift and rally their own populations. For instance, American campaigns like the famous “Rosie the Riveter” poster (“We Can Do It!”) inspired pride and confidence among women entering the workforce, using uplifting imagery rather than fear. The U.S. “Knit Your Bit” posters urged women at home to knit socks for soldiers, appealing to patriotic love and compassion for the troops and offering citizens a hopeful sense of contributing to victory. In the Soviet Union’s fight for survival against Nazi invasion, propaganda posters often depicted strong, hopeful figures gazing toward a brighter future – implicitly giving Soviet citizens hope that the war would end in victory if they persevered. These optimistic emotional appeals helped sustain morale in dire times. In more recent memory, political and social movements have explicitly leveraged positive emotions alongside anger at injustice. The 2008 Obama presidential campaign in the US famously centered on the slogan “HOPE,” with posters (echoing revolutionary art) that inspired millions with a sense of optimistic unity and change. Social movements like civil rights or environmental campaigns often balance outrage (at racism or climate destruction) with hope for a better future and love/solidarity among participants, thereby motivating constructive action rather than despair.
Even public health and humanitarian communications use emotional narratives: compassion is elicited by campaigns showing suffering children or disaster victims, aiming to spur charitable relief efforts; inspirational stories of survivors or community heroes are told to encourage resilience and collective efficacy. The key distinction is that in these cases, emotions are invoked in service of truthful causes and prosocial outcomes – for example, promoting health, unity, or resistance to oppression – rather than to deceive or divide. The revised model therefore treats propaganda as an emotional spectrum: from fear and hatred to hope and love, any affective state can be strategically amplified. What differs is the strategic purpose and psychological function of each emotional cluster. Fear and anger tend to narrow focus, create urgency, and identify enemies; hope and inspiration broaden outlook, encourage patience and effort toward long-term goals (or, alternatively, complacency if it is a false hope); pride and love bolster group identity – which can be inclusive (solidarity) or exclusive (chauvinism) depending on context. By mapping this broader emotional palette, the EEP model can analyze how different feelings are orchestrated in propaganda campaigns to reinforce each other. For instance, an extremist narrative might simultaneously stoke anger at an out-group, fear of an existential threat, and pride in one’s own community, while also dangling hope for a coming triumph – a potent cocktail that addresses multiple psychological motivators. Modern propagandists are effectively “emotion engineers” who mix negative and positive affects to achieve the desired social control or mobilization. Recognizing the full palette of emotions – including those that feel pleasant or virtuous – is essential, since as propagandists have long known, inspirational or aspirational tones can be as impactful as fear in bypassing critical scrutiny.
Rationality, “Pseudo-Rational” Content, and Ideological Narratives in Symbiosis with Emotion
A critical expansion of the EEP model is understanding how cognitive-seeming elements – rational appeals, data, expert authority, and ideological framing – operate in tandem with raw emotion in propaganda. It is a simplification to view propaganda as purely emotional or irrational; in practice, successful propaganda often mimics rational discourse and taps into belief systems to fortify its emotional impact. The revised model examines this interplay, revealing a symbiotic relationship where emotional narratives and pseudo-rational content reinforce one another.
Pseudo-Rational Appeals and Selective “Evidence”: Modern propaganda frequently clothes itself in the trappings of logic, science, or empiricism – especially when targeting educated or skeptical audiences. Propagandists use selective evidence, misleading statistics, and fraudulent experts to create a veneer of rationality around emotionally charged claims. A classic historical example comes from tobacco advertising in the mid-20th century: as health concerns about smoking grew, cigarette companies responded with campaigns touting doctors and scientific “studies” to assuage fear. The R.J. Reynolds company infamously ran ads claiming “More Doctors Smoke Camels than any other cigarette.” This claim was backed by the appearance of a survey – in reality a marketing ploy where doctors were given free Camels and then asked their preferred brand. By enlisting medical authority, the ad gave a rational tone to what was essentially propaganda encouraging an unhealthy habit. Similarly, climate change denial campaigns have presented graphs, petitions signed by “scientists,” and cherry-picked climate data to argue seemingly rational points (e.g. questioning CO₂’s role or highlighting cold weather events), all to reinforce an emotional/political stance that climate action is unnecessary or harmful. Such cherry-picking of facts and expert endorsements exploit our cognitive heuristics: people tend to trust figures in lab coats or data that confirm their pre-existing beliefs. By mimicking the form of rational discourse, propagandists can lower the audience’s guard – the arguments feel factual, giving emotional narratives a firmer cognitive grounding. It’s important to note that these rational elements are typically pseudo-rational: they sound logical but are built on distorted or false premises. 
As Aldous Huxley observed in 1958, “rational propaganda in favor of action that is consonant with the enlightened self-interest of those addressed… appeals to reason by means of logical arguments based upon the best available evidence fully and honestly set forth.” In contrast, propaganda that appeals to base passions “offers false, garbled or incomplete evidence, avoids logical argument and seeks to influence its victims by… repetition of catchwords [and] furious denunciation of… scapegoats.” Modern propaganda often follows this formula: it mixes bits of truth with falsehoods, avoids genuine debate, and relies on mantras and slogans (“Make X Great Again”, “Enemies of the People”) to cement emotional resonance while giving an illusion of reason or common sense.
Cognitive-Emotional Mechanisms: The effectiveness of blending rationality with emotion lies in well-documented psychological mechanisms. Propaganda leverages cognitive biases to make emotional messages stick. One such mechanism is the illusory truth effect – simply repeating a claim, even a false one, increases the likelihood people will accept it as true. High-volume propaganda exploits this by flooding the information space with the same themes across many channels (TV, social media, bots, etc.), so that even rational-minded individuals become familiar with its talking points. The Kremlin’s “Firehose of Falsehood” model is a case in point: Russian disinformation outputs “a high volume of conflicting messages, many of which are outright false or only partially true,” with no commitment to consistency or objective reality, using sheer repetition and contradiction to “muddy the waters.” This strategy shows that confusion can be as useful as persuasion – by overwhelming audiences with cognitive overload, it induces cynicism and resignation (people give up on discerning truth). In such an environment, emotional reactions (like tribal anger or anxiety) guide people’s responses more than facts. Another mechanism is confirmation bias: propaganda supplies “rational” arguments that align with the audience’s existing emotional commitments or ideologies, which people eagerly accept while dismissing contrary evidence. For example, a person already fearful of vaccines will readily latch onto a pseudo-scientific article by an “expert” highlighting vaccine side effects, reinforcing their fear with a feeling of rational justification. Authority bias is also exploited: endorsements by figures perceived as authorities (doctors, generals, professors, celebrities) imbue emotional narratives with credibility. The interplay is cyclical: an emotional claim (e.g. “Group X is dangerous”) is introduced; pseudo-evidence or expert testimony then “proves” it (giving the audience intellectual permission to believe it); this combination intensifies the emotional response (fear/anger toward Group X), which further entrenches belief in the claim. In effect, the emotional narrative and the fake rational scaffolding co-evolve, creating a self-reinforcing conviction. As an illustrative case, consider conspiracy theories. Many conspiracies present an elaborate, quasi-rational explanation for events – they have timelines, technical details, and citations (often misinterpreted or fabricated) that appeal to those seeking a logical story in chaotic events. Yet, they “heavily leverage emotion – fear of a hidden threat, anger at elites, pride in being ‘awake’ to the truth.” The rational façade (say, a complex theory about a government plot) hooks people intellectually, but the real driving force that causes someone to share it and cling to it is the emotional payoff: the thrill of secret knowledge, the validation of seeing one’s anxieties confirmed, the tribal pride of feeling smarter than the “sheep.” The revised EEP model thus stresses that propaganda seldom relies purely on emotion or reason alone – it concocts a blend, using bits of reason to “weaponize” rationality in service of emotional narratives. This synthesis exploits the full range of human psychology, engaging both our gut reactions and our reasoning faculties (albeit in a distorted way).
Ideological Narratives and Identity Cues: Although the EEP theory posits a “post-ideological” era of ad-hoc emotional coalitions, ideology in a broader sense is still very much alive in propaganda, typically meshing with emotion in implicit ways. By ideology here we mean a belief system, worldview, or narrative about how society should be organized. Propagandists often embed ideological cues and dog-whistles in emotionally charged content to evoke a sense of tribal identity. For example, a piece of populist propaganda may arouse anger about economic hardships and direct it at immigrant groups (emotional scapegoating), while simultaneously invoking ideological slogans of nationalism or economic protectionism. The ideology (“our nation should come first” or “globalism is harming us”) gives a justificatory context for the anger, framing it as righteous and logical. Similarly, religious extremist propaganda (whether jihadist or anti-abortion or others) will harness emotional stories – outrage at purported moral decay, compassion for the innocent victims (real or imagined), fear of divine punishment or societal collapse – bundled with explicit ideological tenets (quoting scripture, citing moral philosophy) to reinforce the faithful’s identity. By interweaving identity markers (phrases, symbols, values important to the target group), propaganda taps into group psychology: people are more receptive to messages that affirm their in-group and vilify an out-group. This is why propaganda is replete with tribal signifiers – flags, traditional motifs, language like “patriots” versus “traitors,” “true believers” versus “infidels,” etc. These cues automatically trigger emotional loyalty in the in-group and hostility to outsiders, often bypassing analytic thought. Notably, even when grand ideologies (like communism, fascism, liberal democracy) are less sharply defined, propagandists lean on simplified ideological narratives.
Many contemporary propaganda campaigns revolve around nebulous but powerful identity ideologies: “freedom vs. tyranny,” “the true people vs. the corrupt elites,” “order vs. chaos.” These are not detailed doctrines, but binary narratives that carry moral and emotional weight. They serve as cognitive frames that make emotional appeals more resonant. For instance, COVID-19 disinformation in some communities framed public health measures as an assault on personal freedom (an ideological stance), thereby heightening anger and resistance among those who value libertarian ideals. In summary, the rational/ideological dimension of propaganda provides a scaffolding for the emotional ecosystem: pseudo-rational content lends plausibility and shields the mind from outright rejecting the message, while ideological narratives lend meaning and identity to the emotions provoked. The revised EEP model recognizes this synergy, analyzing propaganda not as emotion versus reason, but as a sophisticated cocktail of emotion, cognition, and cultural narrative. This more nuanced view aligns with classic propaganda theory (e.g. Ellul, Lasswell, Chomsky) which always saw propaganda as working on both the “thinking and feeling” parts of people, even as new technology and psychological insights amplify the emotional emphasis today.
Ethical and Contextual Evaluation of Propaganda
Expanding the EEP model to include an ethical dimension allows us to formally distinguish between illegitimate manipulative propaganda and potentially legitimate persuasive communication, even when both employ emotionally resonant techniques. Propaganda has a notoriously negative connotation – often associated with deception, manipulation, and authoritarian control – but history and practice show a spectrum of ethicality. Some emotionally charged campaigns serve beneficial or at least non-nefarious ends, such as promoting public health or encouraging civic behavior. Others blatantly aim to mislead, exploit, or oppress. To navigate this, we introduce criteria for evaluating propaganda practices:
Intent: The purpose behind the message is a key ethical litmus test. Propaganda deployed to deceive the public, incite prejudice or violence, or entrench an unaccountable power is ethically illegitimate in intent. By contrast, communication intended to inform or rally people for the common good, even if emotionally evocative, can be seen as more legitimate. For instance, during a pandemic a government might use strong emotional appeals (fear of disease, hope for safety) to encourage vaccination – the intent here is to save lives, aligning with the public interest. In contrast, an authoritarian regime spreading emotional disinformation about a fake “terrorist threat” to justify cracking down on opponents has malign intent (manufacturing fear to cement power). Intent is often the difference between education and manipulation. As media ethicists note, even democratic actors use emotional appeals, but the difference is often one of degree and intent – whether the goal is informed consent or emotional coercion.
Truthfulness and Accuracy: Truth-alignment of propaganda content is a fundamental ethical yardstick. Factual, honest messages (even if selectively presented) are far less objectionable than outright lies and fabrications. “White” propaganda, in scholarly terms, is information that is truthful and openly sourced, whereas “black” propaganda consists of lies and false sources. An emotional message grounded in reality (e.g. highlighting true stories of injustice to spur reform) retains ethical integrity that falsehood-ridden propaganda (e.g. concocting hoaxes or scapegoating innocent groups) utterly lacks. Huxley’s distinction echoes here: rational persuasion uses “the best available evidence fully and honestly set forth,” whereas propagandistic manipulation uses “false, garbled or incomplete evidence” and distortion. Importantly, omission and exaggeration fall on this spectrum too. Ethical persuasion strives for transparency about what is known and unknown, while unethical propaganda will suppress inconvenient facts or wildly exaggerate claims to stoke emotion. For example, a public service poster might show the real dangers of drunk driving (emotionally jarring but truthful), whereas a propagandist may spread a rumor that a certain minority group causes a social problem without evidence, purely to vilify them (emotional but false). The more a campaign strays from truth into deception, the more it crosses into unethical territory, regardless of how noble the stated cause.
Transparency of Source and Method: Ethical evaluation also considers transparency – is the source and persuasive intent made clear to the audience? Propaganda is unethical when it masquerades as something else (e.g. fake grassroots (“astroturf”) campaigns, partisan messages disguised as news, trolls posing as ordinary citizens). In contrast, what we might call legitimate advocacy or public messaging is usually upfront about who is speaking and why. For instance, government health campaigns or NGO advertisements typically disclose their sponsorship; you know you are seeing a persuasive message. In covert propaganda (especially common online), the audience is manipulated without even realizing who is behind the content or that it is propaganda – this breach of transparency is a red flag. Propaganda scholarship classifies “gray” and “black” propaganda as those with hidden or false sources, versus “white” propaganda which comes from an open, identified source. From an ethical standpoint, covert manipulation undermines autonomy far more than overt persuasion. Additionally, transparency of method implies not using undue coercion. Emotional appeals can be strong, but are they supplemented by opportunities for rational consideration (ethical) or are they paired with suppression of opposing views and disinformation (unethical)? A campaign that uses emotion while welcoming dialogue (e.g. a charity appeal that tugs heartstrings but publishes accountability reports) stays on firmer ethical ground than one that uses emotion to drown out all other perspectives.
Context and Outcome: The broader context and consequences of propaganda determine its ethical valence as well. One must ask: Does the propaganda strengthen or undermine democratic discourse? Does it respect or erode the audience’s agency? What are the societal outcomes if it succeeds? For example, propaganda urging an occupied population to non-violently resist a tyrannical regime (such as clandestine Resistance pamphlets in WWII Europe) might be judged ethical because its outcome – liberty – is a positive one, and it counters a greater evil. Propaganda used to promote public health, like vivid campaigns to get people to stop smoking or practice COVID hygiene, can save lives and thus produce broadly beneficial outcomes. On the other hand, propaganda that scapegoats a minority might lead to discrimination or even mass violence (as Rwanda’s hate radio, notably RTLM, did in inciting genocide). The ethical framework must consider these outcomes: harm vs. benefit, division vs. unity, repression vs. empowerment. Cultural context also matters; what one society views as a legitimate emotional appeal (say, strong religious sentiment in messaging) another might view as manipulative. Ultimately, the ethical “line” is crossed when propaganda’s methods (deceptive, covert, coercive) and aims (self-serving power or malice) override its value to society. As one commentator put it, propaganda is ethical if it “empowers people to make informed decisions or promotes positive societal change,” but unethical if it “exploits fears, prejudices, or vulnerabilities.”
Using these criteria, we can formulate a typology of propagandistic practice. On one end of the spectrum might be “constructive propaganda” or ethical persuasion: emotionally engaging messages that are transparent, truth-based, well-intended, and aligned with public benefit. Examples include wartime morale posters that, while one-sided, did not rely on lies and served to defend a population, or public awareness campaigns (against drunk driving, for disaster preparedness) that use fear or hope responsibly to encourage prudent action. We might also include activist and protest messaging that appeals to emotions of justice, solidarity, and outrage in order to challenge oppression – for instance, the anti-apartheid movement’s posters and songs that stirred both anger at injustice and hope for equality. These could be seen as legitimate uses of emotional propaganda for collective good. On the other end is “malignant propaganda”: the classic unethical kind, characterized by falsehood (or gross distortion), hidden agendas, and intent to mislead or harm. Totalitarian propaganda from Nazi Germany is a paradigmatic case – it spread heinous lies about Jews and others, incited hatred and violence, and was orchestrated by a regime bent on aggressive war and genocide. Another example is extremist recruitment propaganda today (whether from ISIS or white supremacists), which plays on fear and hatred with deceptive claims to lure individuals into destructive acts. In between, there are gray areas. Many propaganda campaigns start with a seemingly positive goal but slide into unethical tactics over time. For instance, World War I recruitment propaganda in Britain and America began as straightforward patriotic appeals (to duty, honor, protecting the homeland), which one could argue had legitimate rationale during a national emergency.
Yet, as the war dragged on, these campaigns resorted to extreme demonization of the enemy – fabricating atrocity stories and dehumanizing Germans as “Huns” – thus crossing into deception and inflaming hatred. The initial intent (recruit soldiers to defend the nation) was arguably justifiable; the later methods (xenophobic lies) were not, illustrating how context and degree matter. The Soviet Union’s propaganda in the 1930s offers another dual-edged case: early propaganda touted industrialization and modernization – optimistic themes to rally people for economic development – but soon it became a tool to cover up brutal realities (famines, purges) and to enforce an ideology, clearly veering into immoral territory when truth was discarded and personality cult took over. These examples reinforce that intentions, methods, and outcomes must be evaluated together. Propaganda can “start with ethical intentions but become unethical” if its messaging shifts toward manipulation or falsehood that causes harm.
By incorporating an ethical typology, the revised EEP model not only describes how propaganda works, but also provides a framework for normative assessment. This is crucial for applications in media literacy and policy: one can ask whether a given campaign’s use of emotional tactics is warranted (e.g. is it truth-based and serving a just cause?) or abusive (relying on deceit and fear for cynical ends). Criteria like intent, truthfulness, transparency, and outcome act as evaluative guideposts. They remind us that emotional appeals per se are not “evil” – what matters is how they are used. A savvy propagandist might even use positive symbols and half-truths to sell negative ends, whereas an ethical communicator might use stark frightening facts to serve positive ends. The EEP model’s ethical dimension captures these nuances, acknowledging that propaganda exists on a moral spectrum from enlightened persuasion to exploitative manipulation.
Conclusion
The fully revised Emotional Ecosystem of Propaganda (EEP) model offers a robust and comprehensive understanding of propaganda in contemporary society by integrating a broader emotional range, the rational–ideological interplay, and an ethical evaluation into the framework. This enriched model recognizes that today’s propaganda functions as a systemic ecosystem – an evolving network of emotional triggers and reinforcements, interlaced with selective facts and identities, thriving in our algorithm-driven media environment. We now see propaganda not just as a top-down dissemination of lies or fear, but as a complex emotional orchestration: it can provoke fear and rage, but also co-opt hope and empathy; it mimics reason and taps ideology to anchor those emotions; and its impact can be judged along a spectrum from liberating to devastating. By accounting for positive emotional appeals, we better explain propaganda’s ability to inspire and co-opt movements (for good or ill). By analyzing pseudo-rational content, we expose how propaganda feels persuasive by abusing the appearance of logic and expertise, reinforcing cognitive biases. By delineating ethical criteria, we confront the critical question: when does emotional persuasion become immoral manipulation? – thus bridging descriptive analysis with normative concerns. The revised EEP model has systemic integrity in that it weaves together psychological, technological, and cultural factors into one explanatory tapestry. It has interdisciplinary rigor, drawing on communication theory, social psychology, political science, and ethics. And it provides explanatory depth for phenomena ranging from social media misinformation storms to historical propaganda campaigns.
In an era often described as “post-truth” and saturated with emotion-driven narratives, such a model is timely. It illuminates how propaganda can rapidly spread and evolve in the digital age’s ecosystems – amplified by algorithms, modulated by community engagement, sustained by feedback loops – as originally posited by EEP. Yet the revisions remind us that beneath the new technological wrappers, the human element remains central: hearts and minds are the terrain of struggle. Understanding the Emotional Ecosystem of Propaganda in this richer light arms us to recognize when our feelings of fear, anger, pride or hope are being orchestrated, to discern when “facts” are serving feelings, and to evaluate the legitimacy of those who seek to move us. Ultimately, the goal of articulating this model is not only analytical – it is also to empower an informed citizenry. By mapping the ecosystem, we can better resist malign propaganda and support constructive communication. In the spirit of what the EEP framework calls “emotional sovereignty,” the revised model encourages individuals and societies to reclaim control over their emotional landscapes. That means fostering awareness of emotional triggers, demanding truth and transparency, and cultivating critical thinking even when faced with messages that stir the soul. In doing so, we uphold the ideal that persuasion in public discourse should engage both our reason and emotion without betraying our trust – an ideal that this comprehensive model helps to both explain and aspire toward.
Sources: The analysis builds on theoretical insights from propaganda studies and real-world examples. Key references include Aldous Huxley’s distinction between rational and non-rational propaganda, modern research on disinformation tactics like the “Firehose of Falsehood,” historical cases of emotional propaganda in wartime, and evaluations of propaganda’s ethical dimensions, among others as cited throughout. These illustrate and support the expanded EEP model as outlined above.
Brian Maxwell
The Integrity Dispatch