Economists Are Finally Catching Up – But Will Politicians Listen?

For years, many of us outside the ivory tower have watched economists confidently explain the world using tidy models that don’t quite match reality. Now, it seems even the experts are starting to wake up. Nobel laureate Angus Deaton, a man who has spent over five decades shaping economic thought, recently admitted that he’s rethinking much of what he once believed. In his essay, “Rethinking My Economics,” he acknowledges something the rest of us have known for a long time: economics, as it has been practiced, has ignored some fundamental truths about power, fairness, and the actual lives of working people.

One of his biggest realizations is that power—not just free markets or technological change—determines wages, prices, and opportunities. The old economic story said that workers got paid what they were worth, and if wages were low, it was because of “supply and demand.” Deaton now recognizes that corporate power has a much bigger role than economists have admitted. Employers dictate pay, not some invisible hand. This is what workers and unions have been saying for generations.

Speaking of unions, Deaton now regrets his past views on them. Like many economists, he once saw unions as a drag on efficiency. Now he sees them as a necessary counterbalance to corporate power. He even links their decline to some of today’s biggest problems—like stagnant wages and the rise of populism. Those of us who watched good union jobs disappear over the decades could have told him that.

Deaton also revisits the supposed wonders of free trade and globalization. He used to believe they were unquestionably good for everyone, lifting millions out of poverty worldwide. Now he wonders whether the benefits of global trade have been overstated, especially for North American workers. It turns out that shipping jobs overseas and gutting local industries does have consequences. Again, not news to the factory workers and small-town business owners who saw their livelihoods disappear.

Even on immigration, Deaton has had a rethink. While he still sees its benefits, he admits he hadn’t fully considered its effects on low-wage workers. Many working-class folks—especially in industries like construction and manufacturing—have long argued that an influx of labor can drive down wages. For decades, economists dismissed these concerns as uninformed or even xenophobic. Now, Deaton is realizing that, actually, those workers had a point.

One of the biggest flaws in modern economics, Deaton argues, is its obsession with efficiency. The field has spent too much time focusing on what is “optimal” in theoretical terms while ignoring what is fair. Efficiency is great if you’re a CEO looking at profit margins, but for ordinary people trying to build stable lives, fairness matters just as much—if not more.

Perhaps most importantly, Deaton now believes that economics needs to learn from other disciplines. Historians, sociologists, and philosophers have long been tackling questions about inequality, power, and justice that economists are only now beginning to take seriously. Maybe if more economists had paid attention to those fields earlier, we wouldn’t be in such a mess now.

Which brings us to Mark Carney. Once the golden boy of central banking, Carney is now stepping into the political arena with Canada’s federal Liberals, promising policies that sound progressive but still carry the scent of Bay Street. The big question is: will his economic approach reflect the real-world reckoning that Deaton and others are finally having, or will it be more of the same old technocratic tinkering? Carney has talked a lot about inclusive growth and climate action, but will he acknowledge—like Deaton now does—that power imbalances, corporate dominance, and the decline of unions are at the heart of inequality? Will he push policies that actually shift power back to workers, or just dress up neoliberal economics with a few social programs? If Carney truly embraces Deaton’s new thinking, we might see a real departure from the old economic playbook. But if he sticks to the well-worn path of market-friendly “solutions,” it will just be another round of the same policies that got us here in the first place.

It’s refreshing to see someone like Deaton openly question his own past beliefs. It’s a rare thing for a leading economist to admit they’ve been wrong, but for those of us who have lived through the consequences of these flawed economic theories, starting with the years of Reagan and Thatcher, the real question is: Why did it take them so long to figure this out? And now that they have—will the politicians actually do anything about it?

The Delusions of Authoritarians: Why It Never Ends Well for Fascist Leaders

Fascist and authoritarian leaders rarely see themselves as doomed figures in history. On the contrary, they often believe they are exceptional – capable of bending the course of history to their will. Whether through the cult of personality, the rewriting of historical narratives, or sheer force, they assume they can control how they will be remembered. This delusion has led many to catastrophic ends, yet new generations of authoritarians seem undeterred, convinced that they will be the ones to succeed where others failed. Trump and his allies fit squarely into this pattern, refusing to believe that history might judge them harshly or that their actions could lead to their own downfall.

Mussolini provides one of the most vivid examples of this phenomenon. He envisioned himself as a modern-day Caesar, reviving the grandeur of the Roman Empire through Fascism. His brutal repression of dissent, his alliance with Hitler, and his reckless military ambitions ultimately led to disaster. When the tide of World War II turned, Mussolini found himself abandoned, hunted, and finally executed by his own people; his corpse hung upside down in Milan as a stark rejection of his once-grandiose vision. And yet, to the very end, he believed he was the victim of betrayal rather than the architect of his own demise.

Hitler, too, was utterly convinced of his historical greatness. He meticulously curated his own image, producing propaganda that cast him as Germany’s savior. Even as the Third Reich collapsed around him, he ranted in his bunker about how the German people had failed him rather than the other way around. His ultimate act, suicide rather than surrender, was an attempt to control his narrative, ensuring he would never be paraded as a prisoner. But history did not grant him the legacy he sought. Instead of being remembered as a visionary, he became the ultimate symbol of genocidal tyranny.

The pattern continued into the later 20th century. Nicolae Ceaușescu, the Romanian dictator, had convinced himself that his people adored him. He built extravagant palaces while his citizens starved, crushed opposition, and developed a personality cult that portrayed him as a paternal figure of national strength. When the moment of reckoning arrived in 1989, he seemed genuinely shocked that the crowd in Bucharest turned on him. Within days, he and his wife were tried and executed by firing squad, their supposed invincibility revealed as an illusion.

Even those who manage to hold onto power longer do not always escape history’s judgment. Augusto Pinochet ruled Chile through terror for nearly two decades, believing that his iron grip would secure him a revered place in history. But his crimes – torture, executions, forced disappearances – eventually caught up with him. Though he escaped trial for most of his life, his reputation was destroyed. His legacy became one of shame rather than strength.

Trump, like these figures, operates in a world where loyalty and spectacle take precedence over reality. He dismisses mainstream historians as biased, preferring the adulation of his base over any broader judgment. He likely assumes that as long as he can retain power, whether through elections, legal battles, or intimidation, he can dictate how history views him. But history has a way of rendering its own verdict. Those who believe they can shape their own myth while trampling on democratic institutions, rule of law, and public trust often find themselves remembered not as saviors, but as cautionary tales.

Abandoned Sovereignty: How Canada Gave Up on Its Own Defence Industry

I began writing this piece over a year ago, and now it seems time to publish. I have seen firsthand, during my time working for the UK feds, the way most members of NATO, not just Canada, have purchased U.S. military equipment, often under political pressure and to the detriment of their own defence industries. NATO interoperability standards should mean that any compatible equipment is a viable option, considered through open competitive bidding, yet the geopolitical reality is something completely different.

Canada has long faced intense pressure—political, economic, and social—to purchase U.S. military equipment for its armed forces, a reality that has shaped its defence procurement decisions for decades. This pressure is deeply rooted in history, from Cold War-era alliances to modern-day trade dependencies, and it has left Canada with little choice but to align its military acquisitions with American interests. The consequences of this alignment go beyond procurement choices; they have also played a role in the erosion of Canada’s own defence research and development capabilities.

The political pressure to buy American is most evident in Canada’s commitment to joint defence initiatives, particularly NORAD and NATO. From the early days of the Cold War, Canada’s defence policies have been deeply entwined with those of the United States. The integration of North American air defence under NORAD meant that Canada’s fighter aircraft, radar systems, and missile defence strategies had to be compatible with those of the U.S. When Canada scrapped its own Avro Arrow fighter program in 1959, ostensibly for cost reasons, it conveniently cleared the way for the adoption of American aircraft like the CF-101 Voodoo, locking the Royal Canadian Air Force (RCAF) into a reliance on U.S. technology that continues to this day.

This trend persisted throughout the latter half of the 20th century. Canada’s navy, which once built world-class destroyers and anti-submarine vessels, saw its shipbuilding industry decline, and by the 1990s, the country was purchasing used British submarines while remaining dependent on American-built weapons and sensors. Similarly, Canada’s decision to buy the CF-18 Hornet fighter in the 1980s followed a pattern of choosing U.S. aircraft over European or domestic alternatives. While the CF-18 has served well, it locked Canada into the U.S. military supply chain for parts, upgrades, and replacements. Now, with the planned acquisition of F-35 stealth fighters, that dependence is only deepening.

Economically, Canada’s military procurement is heavily influenced by its integration with the U.S. defence industrial base. The Defence Production Sharing Agreement (DPSA), signed in 1956, allowed Canadian defence firms to bid on U.S. military contracts, but it also cemented Canada’s role as a supplier of components rather than a leader in weapons development. This effectively sidelined Canadian military research and engineering projects, making it far more difficult to revive independent initiatives. When the Arrow was cancelled, it wasn’t just a single aircraft project that was lost—it was an entire aerospace industry that could have positioned Canada as a technological leader rather than a perpetual customer of American defence contractors.

The economic argument for buying American is always framed in terms of cost-effectiveness and interoperability, but the reality is that it often comes with trade-offs. The purchase of American equipment frequently involves hidden costs—maintenance contracts, dependency on U.S. technology, and restrictions on modifications. The recent push to buy American-made submarines, replacing the troubled British-built Victoria-class boats, is another example of how Canada’s choices are limited by its reliance on U.S. and NATO systems. In many cases, American weapons systems are the only viable option simply because Canada has not maintained the capability to produce its own alternatives.

Public sentiment in Canada is often skeptical of major military purchases, and this can create social and political tensions. Many Canadians are uncomfortable with high military spending, particularly when it benefits American defence giants like Lockheed Martin or Boeing. This unease has been reinforced by past procurement scandals, such as the costly and controversial EH-101 helicopter cancellation in the 1990s, which resulted in years of delays in replacing Canada’s aging Sea Kings. Yet, despite public resistance, successive Canadian governments—Liberal and Conservative alike—have found it almost impossible to escape the gravitational pull of American defence procurement.

Interoperability with U.S. forces is the most frequently cited justification for this dependence, and in some cases, it is a valid one. Canadian troops often train and deploy alongside U.S. forces, making shared equipment a practical necessity. However, this argument is often overstated to justify buying American even when other options exist. The recent decision to acquire P-8 Poseidon maritime patrol aircraft from Boeing, rather than exploring alternatives like the Airbus C295 or continuing to develop Canadian-built options, reflects this bias. The same was true with the decision to buy Sikorsky CH-148 Cyclone helicopters, a troubled program that has suffered significant delays and technical issues.

Over time, Canada’s ability to independently design and produce advanced military hardware has been systematically dismantled. The cancellation of the CF-105 Arrow was just the first in a series of decisions that saw Canadian innovation sacrificed in favour of American procurement. The shelving of independent drone development efforts and the abandonment of domestic tank production have left Canada as a nation that buys rather than builds. While there are still areas of strength—such as armoured vehicle production through General Dynamics Land Systems Canada—the overall trajectory has been one of increasing dependence on the U.S.

The reality is that Canada’s defence procurement strategy is shaped as much by geopolitics as by practical military needs. The U.S. is both Canada’s closest ally and its largest trading partner, and any significant deviation from American military procurement norms risks diplomatic and economic fallout. The fear of upsetting Washington is a powerful deterrent against seeking alternatives, whether from European manufacturers or through domestic production.

In the end, Canada’s military procurement is not just a matter of choosing the best equipment—it is a strategic and political decision that reflects the country’s place in the global order. Until Canada makes a concerted effort to rebuild its defence research and production capabilities, it will remain at the mercy of U.S. military priorities. Whether that is an acceptable trade-off is a question that Canadian policymakers—and the public—must continue to grapple with.

Update
Since writing the core of this piece, there have been some signs that Canada is trying to rekindle its own defence industry: the shipbuilding program for the new River-class destroyers, the conversation about purchasing European-designed and -built submarines, and early discussions about reducing the F-35 purchase program in favour of the Swedish Saab Gripen. The Swedish proposal promised that aircraft assembly would take place in Canada and that intellectual property would be transferred, allowing the aircraft to be maintained in this country—very different from the U.S. F-35 program, where major maintenance, overhaul, and software upgrades would happen in the States. The second Trump administration might just be the catalyst Canada needs to seek alternative solutions rather than the business-as-usual approach we have seen over the last 75 years.

Thatcher’s Flawed Philosophy: How Community Really Does Define Us

Margaret Thatcher’s infamous declaration that “there is no such thing as society” has sparked decades of debate and remains a contentious cornerstone of her political philosophy. Her emphasis on self-interest over community solidarity, however, neglects a fundamental truth: humans are inherently social beings, and society is not an abstract ideal but a lived reality. To dismiss the concept of society is to deny the interconnectedness that defines human existence. 

From the earliest days of our evolution, humans have depended on cooperation and collective effort for survival. Group solidarity enabled us to hunt, share resources, build shelters, and ultimately thrive. Language, culture, and complex societal structures emerged from this cooperation, underscoring that our progress has always been rooted in community. Thatcher’s rejection of society as a meaningful entity ignores this profound evolutionary history.

Modern science further reinforces the critical role of social connections. Studies in sociology, psychology, and anthropology repeatedly demonstrate that strong social ties contribute to better mental and physical health, greater happiness, and longer life expectancy. Conversely, social isolation and loneliness have devastating consequences, leading to increased rates of mental illness, substance abuse, and even early mortality. Community is not just a philosophical idea; it is an essential foundation for individual and collective well-being.

History provides countless examples of the power of community to create positive change. Civil rights movements, environmental activism, labor struggles—these are not the outcomes of individuals acting in isolation but of people coming together in solidarity to challenge injustice and fight for shared goals. Such movements illustrate that progress is often born from collective action rather than solitary self-interest.

Even Thatcher’s own notion of self-interest fails to account for the human capacity for empathy, reciprocity, and altruism. While individuals may act in their own interests, they do so within a framework of interconnected relationships. Acts of kindness and generosity are not rare deviations from human nature but deeply ingrained aspects of it. Recognizing the well-being of others as intertwined with our own is not only logical but vital to the fabric of any functioning society.

Thatcher’s dismissal of society as a nonentity represents a reductionist and ultimately flawed view of human nature. Far from being atomized individuals, we are part of a larger web of connections that sustains us. Acknowledging the reality and importance of community is essential if we are to build resilient societies that prioritize the common good and provide a sense of belonging for everyone. Society does exist—and it is the very foundation upon which we stand.

The Failing Republic: Why the U.S. is Losing Its Separation of Powers

The United States was designed as a carefully balanced system, drawing from Polybius’ theory of anakyklosis, the ancient idea that governments cycle through different forms of rule as they degenerate. The Founders sought to prevent this cycle from repeating in America by creating a mixed government – a system that combined elements of monarchy (the presidency), aristocracy (the Senate and judiciary), and democracy (the House of Representatives and popular elections). This balance was supposed to be maintained through separation of powers and checks and balances, preventing any single branch from becoming dominant. However, over time, this system has eroded, leading to political dysfunction, growing authoritarian tendencies, and an increasing sense that American democracy is failing to sustain itself.

One of the most obvious signs of this breakdown is the expansion of executive power. The U.S. presidency, originally designed to be a limited office constrained by Congress, has grown into an institution that wields enormous influence over both domestic and foreign policy. Congress’ constitutional power to declare war has been effectively ignored for decades, with presidents engaging in military actions without formal approval. Executive orders, once meant for administrative matters, now serve as a way for presidents to bypass legislative gridlock and unilaterally shape national policy. Emergency powers, originally intended for genuine crises, have been used to consolidate authority, further tipping the balance away from Congress and toward the executive. What was once a system of monarchy constrained by law is increasingly resembling the early stages of tyranny, where power becomes concentrated in the hands of a single leader.

Meanwhile, the institutions meant to act as a wise, stabilizing force, the Senate and the judiciary, have themselves become distorted. The Senate, originally designed to serve as a check on populist excess, has become a bastion of partisan gridlock, where legislative action is often blocked not through debate and compromise but through procedural loopholes like the filibuster. The Supreme Court, meant to provide legal stability, has evolved into a de facto policymaking body, issuing rulings that shape national laws based on the ideological leanings of its justices rather than broad democratic consensus. The fact that justices serve lifetime appointments ensures that political biases from decades past continue shaping the present, often overriding the will of the electorate. Rather than serving as an aristocratic check on instability, the judiciary and Senate have increasingly acted as oligarchic strongholds, where entrenched power resists democratic accountability.

At the same time, the democratic elements of the system have begun to decay into their own worst tendencies. Gerrymandering has allowed political parties to carve up districts in ways that virtually guarantee electoral outcomes, stripping voters of meaningful representation. Populist rhetoric has taken over political campaigns, where leaders appeal not to reasoned debate but to emotional manipulation and fear-mongering. The rise of social media-driven outrage politics has further fueled division, turning every issue into an existential battle where compromise is seen as betrayal. The January 6th attack on the Capitol was not just an isolated event but a symptom of a deeper problem: the slide of democracy into ochlocracy, or mob rule, where decisions are no longer made through structured governance but through force, intimidation, and the manipulation of public anger.

This erosion of balance has led to a state of chronic political paralysis. Congress, once the heart of American governance, now struggles to pass meaningful legislation, forcing presidents to govern through executive action. Public trust in institutions is collapsing, with many Americans believing that elections, courts, and government bodies are rigged against them. And looming over it all is the increasing potential for authoritarianism, as political leaders, on both the left and right, flirt with the idea that democratic norms can be bent, ignored, or rewritten to serve their interests. This is precisely the pattern that anakyklosis predicts: when democracy becomes too unstable, people turn to strong leaders who promise to restore order, often at the cost of their freedoms.

If the United States is to avoid falling deeper into this cycle, it must take deliberate action to restore the balance of power. Congress must reclaim its authority over war, legislation, and oversight. The judiciary, particularly the Supreme Court, may need reforms such as term limits to prevent long-term ideological entrenchment. Electoral integrity must be strengthened, ensuring fair representation through independent redistricting commissions and protections against voter suppression. And perhaps most importantly, the American public must become more politically literate, resisting the pull of demagoguery and demanding a return to governance based on reason, debate, and compromise.

Without these changes, the U.S. risks following the path of so many republics before it, where democracy fades, power consolidates, and the cycle of anakyklosis completes its turn once again.

Does National Service Strengthen Democracy?

Over the decades, my views on national service have shifted in ways I never anticipated. In the 1970s, I opposed it as a right-wing strategy to control young people. By the 1990s, after working in military settings that fostered aggressive elitism, I argued that civilians should remain separate from the patriarchal uniformed culture. Then, in the 2010s, I found myself engaged in change management projects within uniformed teams plagued by misogyny and racism. Now, after six decades of reflection, I find myself reconsidering my stance yet again.

National service has long been debated as a tool for unity, civic responsibility, and military readiness. But its potential to erode military elitism and foster a stronger connection between soldiers and society is often overlooked. Professional militaries, especially in nations where service is voluntary, tend to cultivate exclusivity—a culture where soldiers see themselves as distinct from, even superior to, the civilians they serve. This divide reinforces the notion of the military as a separate class rather than an integrated part of society. National service disrupts this dynamic by compelling a broader cross-section of the population to serve, reshaping military identity from an elite institution to a shared civic duty.

In voluntary systems, the military often attracts those who seek discipline, structure, or prestige—creating an insular culture with its own rigid hierarchy. Civilians, in turn, either glorify or distance themselves from this world, reinforcing the idea that service is for a dedicated few rather than a collective obligation. By contrast, when participation is mandatory across social classes and career paths, the military becomes more representative of society. The uniform is no longer a symbol of an exclusive warrior class, but a temporary role worn by people from all walks of life.

This integration fosters deeper civilian-military interaction. In countries like Switzerland and Israel, where service is universal, military experience is common rather than exceptional. Nearly everyone has served or knows someone who has, preventing the formation of a professional military caste detached from the society it protects. In contrast, nations with fully voluntary forces risk developing a military with its own insular traditions and perspectives, further widening the civilian-military gap.

Scandinavian countries offer compelling examples of how national service can shape military culture. Norway introduced gender-neutral conscription in 2015, significantly increasing female participation and reinforcing the country’s commitment to equality. Sweden, after briefly abolishing conscription, reinstated a selective system in 2017 to address recruitment shortages. While both countries prioritize inclusivity, Norway enforces universal service more strictly, while Sweden selects only those necessary for military needs. These models highlight how national service can be adapted to different societal priorities while still promoting integration.

This shift from exclusivity to civic duty is essential for preventing an isolated, professionalized force with an “us vs them” mentality. In a national service system, military service is just one form of contribution, alongside disaster relief, infrastructure projects, and community assistance. This broader framework erodes the idea that military life is inherently superior, reinforcing the principle that national service—whether military or civilian—is about collective responsibility, not personal status.

The benefits of this integration extend beyond military culture. Veterans who return to civilian life find themselves in a society where their experience is widely shared, reducing post-service isolation and preventing the hero-worship that can distort public perceptions of the military. When nearly everyone has served in some capacity, soldiers are seen not as a privileged class, but as fellow citizens fulfilling a duty like everyone else.

Perhaps most importantly, national service strengthens democracy itself. By grounding military power in the citizenry, it prevents the rise of a professional warrior class detached from national values. It ensures that defense, like governance, remains a shared responsibility rather than the domain of a select few. In this way, national service transforms military duty from an elite pursuit into a universal expectation—one that keeps soldiers connected to, rather than separate from, the society they serve.

16-Year-Olds Should Be Allowed to Vote in Canada

I firmly believe in the right of 16- and 17-year-old Canadians to vote. They are more than ready to shoulder this responsibility, and society already entrusts them with far greater challenges. Here’s why I support enfranchising them.

The Responsibilities They Already Bear
At 16, young Canadians can obtain a driver’s license, manage the responsibilities of operating a vehicle, and comply with traffic laws. Many also join the workforce, contributing taxes that fund services without having a say in how those funds are spent. This taxation without representation runs counter to the principles of fairness in a democratic society.

Some 16-year-olds live independently, taking full responsibility for their finances, households, and futures. These young people already make life-altering decisions, proving their ability to assess and manage complex situations.

They also have the legal right to make important healthcare decisions without parental consent in most provinces. From mental health treatments to reproductive choices, they show the capacity to evaluate critical issues. Moreover, the age of consent in Canada is 16, and in some cases, they can even join the military, committing themselves to a life of service and sacrifice. If we trust them with these decisions, why not trust them with a vote?

Their Political Awareness
Critics say 16-year-olds lack the maturity to vote, but that argument doesn’t hold water. Today’s youth are incredibly engaged with issues like climate change, education, and social justice. They organize protests, sign petitions, and participate in grassroots movements. They are not just passive observers; they are active participants in shaping their world.

Civics education in Canadian schools equips them with the knowledge to understand governance and the electoral process. Giving them the vote would deepen their connection to democracy, encouraging lifelong participation.

Looking at Other Democracies
Canada wouldn’t be breaking new ground here. Jurisdictions like Austria, Brazil, and Scotland already allow 16-year-olds to vote, and studies show these younger voters are as thoughtful and engaged as older ones. Early enfranchisement fosters a lifelong habit of voting, strengthening democratic systems for everyone.

A Voice for the Future
The decisions made today—on climate policy, education, and job creation—will define the futures of these young Canadians. Denying them a voice in these matters is short-sighted. They are the generation that will live with the long-term consequences of today’s elections.

It’s time we acknowledge the responsibilities and contributions of 16-year-olds and empower them with the right to vote. They have proven their maturity and commitment to society. Including them in the democratic process would make Canada’s democracy stronger, more inclusive, and better prepared for the future.

Is the USA a Fascist State Struggling with Democracy?

Is America flirting with fascism, or are such claims the product of alarmist hyperbole? It’s a question that divides dinner tables, social media feeds, and even academic circles. Some argue that the United States is a democracy fighting for its soul; others see it as a country standing perilously close to authoritarian rule. But to call America fascist – or even on the road to it – requires a careful unpacking of what fascism truly entails, and how it might resonate within the American political landscape.

Let’s be clear: fascism isn’t a vague insult for policies we don’t like. It’s an authoritarian ideology with specific hallmarks. Think Mussolini’s Italy, Hitler’s Germany – regimes steeped in violent nationalism, the suppression of dissent, and a drive to create a monolithic cultural identity. Robert Paxton, one of the leading scholars on the subject, described fascism as thriving on crises, exalting the group over the individual, and depending on a strong leader to restore a supposedly decaying nation. So, how does America stack up against these criteria? Let’s dig deeper.

Nationalism and Authoritarian Rhetoric
Nationalism is the drumbeat of every fascist regime, and it’s undeniable that America has had its moments of chest-thumping pride. But the “America First” rhetoric of recent years has pushed nationalism to a different level, stirring debate about its compatibility with democratic ideals. Take the Trump administration, where slogans like “Make America Great Again” dovetailed with a barrage of attacks on immigrants, minorities, and even the democratic process itself. Muslim travel bans, family separation policies at the southern border, and the vilification of immigrants as existential threats bear a troubling resemblance to the exclusionary policies of fascist regimes.

And then there’s the attack on the press—“the enemy of the people,” as Trump called it. Fascism thrives on controlling narratives, suppressing inconvenient truths, and manufacturing enemies to unite the populace. These tactics were echoed in efforts to discredit media outlets, undermine trust in elections, and dismiss dissenting voices. While America still enjoys a free press and opposition parties, these tactics are red flags in any democracy.

Civil Liberties Under Pressure
A free society requires robust protections for civil liberties, yet the U.S. has shown cracks in its foundation. Think about the use of force against peaceful protesters during the George Floyd demonstrations, or the revelations of mass surveillance by whistleblower Edward Snowden. Then there are laws in certain states aimed at curbing protests – an unsettling echo of fascist regimes that treated dissent as treason.

Still, America hasn’t crossed the line into wholesale repression. Dissent exists, opposition thrives, and courtrooms regularly challenge abuses of power. These are democratic lifelines, but they must be safeguarded vigilantly.

Corporate Power and Economic Control
Fascism often entails a symbiotic relationship between the state and corporations, where economic power is wielded for nationalist purposes. In America, the government doesn’t control corporations outright, but the influence of corporate money in politics is undeniable. Lobbying, dark money in elections, and the revolving door between big business and government raise questions about whether democracy is being eroded by oligarchic forces.

Economic inequality is another point of tension. Policies favoring the wealthy over the working class may not fit the fascist mold exactly, but they exacerbate social divisions, fueling the kind of crises that fascism preys upon.

Racial and Cultural Tensions
A defining feature of fascism is the enforcement of a singular racial or cultural identity, often to the detriment of minorities. The U.S. has a long history of systemic racism, from slavery and segregation to redlining and mass incarceration. Contemporary issues – like police brutality and racial inequality – continue to expose deep wounds in the fabric of American democracy.

White nationalist groups, emboldened in recent years, represent another disturbing trend. The normalization of their rhetoric in certain political spaces harks back to fascist tendencies to scapegoat minorities for societal woes. Yet, these groups remain fringe elements rather than central powers, and their rise has been met with strong opposition from civil society.

America’s Democratic Struggle
Despite these troubling signs, it would be a mistake to paint America as fully fascist. The U.S. retains institutions that fascist regimes dismantle: a separation of powers, an independent judiciary, and regular elections. Social movements – from Black Lives Matter to grassroots environmental campaigns – demonstrate that the democratic spirit is alive and well.

America’s story is not one of fascism triumphant, but of democracy under pressure. Its history is riddled with contradictions, from its founding on ideals of liberty while maintaining slavery, to its championing of free speech while tolerating systemic inequality. Yet, those contradictions are precisely why it remains a battleground for change.

So, Is America Fascist?
Not yet – and perhaps not even close. But the warning signs are there. The flirtation with authoritarianism, the normalization of exclusionary rhetoric, and the entrenchment of corporate influence all demand vigilance. America isn’t Mussolini’s Italy or Hitler’s Germany, but it is a nation grappling with the forces that could pull it in that direction. The question isn’t just “Is America fascist?” – it’s “What are we doing to ensure it never becomes so?”

Americans must keep democracy’s flame alive by holding power to account, protecting civil liberties, and fighting for the inclusive ideals the country was built on. After all, democracy isn’t just a system – it’s a struggle. And that struggle is theirs to win.

Dies Natalis Solis Invicti

December 25th, the Dies Natalis Solis Invicti or “Birthday of the Unconquered Sun,” has long been associated with cosmic renewal and light’s triumph over darkness, aligning with the winter solstice. Its roots may stretch back to ancient Persia, and it found deep resonance in Roman religion, particularly within the mystery cult of Mithras.

Mithraism flourished in the shadowy corners of Roman society, appealing to soldiers, merchants, and officials. Mithras, a god of light and justice, was central to a complex mythology that emphasized cosmic order and renewal. His worship featured the tauroctony, a scene depicting Mithras slaying a sacred bull in a cave, from whose blood life and fertility emerged. This act symbolized victory over chaos and the cycles of life and death, themes reinforced by Mithras’ divine “rock birth” (Petra Genetrix), which emphasized his eternal and unshakable essence.

Mithras’ bond with Sol Invictus, the Roman solar deity, was central to his worship. Together, they were shown feasting after the bull’s defeat, celebrating cosmic renewal and the return of light. This connection tied Mithras to the December 25th celebration, when the days began to grow longer, signifying hope and rebirth for his followers. Worship took place in cave-like Mithraea, where initiates advanced through seven secretive ranks, fostering bonds of loyalty and discipline, particularly among Roman soldiers.

Christianity rose alongside Mithraism, drawing parallels with Mithras in themes of salvation, sacrifice, and divine light. Jesus Christ, like Mithras, came to symbolize victory over darkness, spiritual rebirth, and eternal life. By aligning Christ’s nativity with December 25th, Christianity absorbed and redefined the pagan imagery of the “unconquered sun,” positioning Jesus as the true “light of the world.” This synthesis appealed to Mithraic followers and others drawn to sun worship, securing Christianity’s dominance within the empire.

Mithras endures as a figure of cosmic mystery, his story largely conveyed through art and ritual. From his rock birth to his symbolic feast, he remains a mediator between worlds, forever linked to humanity’s quest for order and light in the face of darkness.

Asimov’s Warning Is Just As Valid Today

Isaac Asimov’s assertion about the “cult of ignorance” in the United States, where the false equivalence of ignorance and knowledge undermines democracy, is disturbingly evident in many elected U.S. leaders. This trend, marked by anti-intellectualism and the rejection of expertise, is not only a historical thread, but also a contemporary issue with serious consequences. When political leaders prioritize personal beliefs or populist rhetoric over evidence-based decision-making, the nation’s progress is stymied.

One glaring example is the response to the COVID-19 pandemic, during which several federal leaders publicly rejected scientific consensus and medical expertise. President Donald Trump, for instance, consistently downplayed the severity of the virus, promoted unproven treatments like hydroxychloroquine, and suggested bizarre remedies such as injecting disinfectant. His administration’s frequent clashes with public health experts, including Dr. Anthony Fauci, showcased a dangerous preference for misinformation over evidence-based policy. This rejection of expertise delayed critical responses, contributing to the unnecessary loss of lives and eroding public trust in institutions.

Climate change denial is another prominent example of Asimov’s warning in action. Despite decades of scientific research and warnings about the catastrophic effects of global warming, U.S. federal leaders like Senator James Inhofe have openly dismissed the issue. Inhofe’s infamous act of bringing a snowball to the Senate floor in 2015 to mock climate science epitomized the rejection of intellectual rigor in favor of simplistic and misleading arguments. Under President Trump, the United States announced its withdrawal from the Paris Agreement in 2017, a decision that disregarded global consensus and expert recommendations. This move not only hampered international climate action, but also showcased a willingness to prioritize political posturing over long-term environmental sustainability.

Education policy also reflects this strain of anti-intellectualism. Federal and state leaders have fueled culture wars over curricula, targeting topics like evolution, climate science, and systemic racism. Florida Governor Ron DeSantis, for example, has led efforts to restrict discussions of race and gender in schools, framing them as “woke indoctrination.” His administration’s actions, including rejecting the Advanced Placement African American Studies course, reflect a fear of critical thinking and a broader trend of politicizing education. Such measures not only undermine intellectual growth, but also perpetuate ignorance by denying students access to nuanced perspectives.

Another aspect of this “cult of ignorance” is the weaponization of populist rhetoric. Politicians like Marjorie Taylor Greene and Lauren Boebert frequently champion “common sense” over expertise, dismissing intellectual rigor as elitist. Greene’s baseless claims about space lasers causing wildfires or her rejection of vaccine science exemplify how some leaders amplify misinformation to appeal to their base. This rhetoric undermines trust in institutions, promotes conspiratorial thinking, and fosters a climate where ignorance is celebrated over informed debate.

The Trump administration’s broader approach to governance further illustrates Asimov’s critique. From rejecting intelligence assessments on foreign interference in elections to downplaying the impact of climate policies, the administration often sidelined expertise in favor of politically convenient narratives. This pattern was not limited to one administration. Leaders across the political spectrum have, at times, embraced anti-intellectualism, whether through denial of scientific consensus, opposition to educational reform, or a reluctance to address systemic issues.

Asimov’s warning resonates because it touches on the core principle that democracy requires an informed citizenry and leaders willing to engage with complex realities. Yet, when leaders dismiss expertise and elevate ignorance to a virtue, they erode the foundations of democratic governance. The COVID-19 pandemic, climate change denial, and educational censorship demonstrate how the conflation of ignorance with knowledge can have dire consequences for public health, global stability, and intellectual progress.

Reversing this trend demands a renewed commitment to intellectual integrity and informed leadership. Politicians must prioritize evidence-based policymaking, foster trust in expertise, and resist the allure of populist rhetoric that sacrifices long-term progress for short-term gains. Only by respecting knowledge and promoting critical thinking can the United States counteract the “cult of ignorance” Asimov so aptly described and ensure a democratic future guided by reason and understanding.