Thursday, 19 March 2026

Intellectual Integrity in a World Without Void Thinking

 

(AI Generated Image)

A recent plagiarism flap, in which an activist accused a public figure who happens to be a politician, academic, and social advocate, captures a familiar anxiety about originality. The politician threatened a defamation suit, and a week later the activist apologized, conceding that the politician had published the idea earlier. Yet the activist maintained he had not read that prior work and that his view arose from his own independent thinking. That claim, whether true or not, spotlights a deeper puzzle: if thought is built from observation and experience, what exactly do we mean by “independent” thinking? Perhaps what we often witness is not theft but convergence: two minds attending to the same patterns in the world and assembling similar conclusions from shared materials.

In academia, the chorus against plagiarism swells, and calls for “originality” and “independent thought” grow ever more insistent. But the word “independent” can be a romantic overreach. Imagination needs raw material; no mind thinks out of a void. We observe, remember, compare, and extrapolate from the known to press into the unknown. On this empiricist picture, cognition is not spontaneous generation. It is construction: intelligent, disciplined, sometimes dazzling construction from what experience supplies.

Still, this framing can underrate the mind’s capacity for abstraction, pattern recognition, analogy, and synthesis. Even if imagination depends on existing materials, it can reorder them into forms that feel startlingly new. The mind’s originality often lies less in the bricks and more in the architecture. Dependence on input is undeniable; the question is whether dependence precludes novelty. It need not. Novelty may arise from the structure and depth of reorganization rather than from detachment from experience.

This suggests a refined empiricism in which originality is not creation from nothing but transformation of something. The mind is not a creator ex nihilo; it is a reconfigurer. In that light, “independent thought” is never independent of input, but it can be independent in method: in how it selects, filters, and reinterprets the available content.

Opponents press a nativist, rationalist case: the mind is not just a processor of experience but comes equipped with innate structures that make certain kinds of thinking possible. Descartes famously claimed some ideas (mathematical truths, the infinite) are not derived from the senses. Kant argued that the mind contributes a priori forms (space, time, causality) that structure experience from the outset. Chomsky proposed an inborn language faculty whose complexity outstrips what pure induction from stimulus could supply. On this account, two points challenge the empiricist’s comfort: the mind isn’t a blank slate, and thought is at least partly generative, producing concepts not strictly traceable to specific sensory inputs.

You can translate this into evolutionary terms: innate structures are inherited cognitive architectures shaped by selection. That move makes the nativist view scientifically plausible without smuggling in fully formed ideas. But it doesn’t secure the conclusion that experience is secondary. Early humans may have possessed capacities for abstraction and language, yet capacity is not expression. These potentials need triggers, social scaffolding, and cumulative culture. A child might be wired for mathematics, but without exposure and pedagogy, algebra won’t materialize in isolation. Experience does not merely decorate an inner architecture; it activates and calibrates it.

At this point, the disagreement narrows. The key issues are whether stimulation builds or merely triggers, and what “independent” should mean. The empiricist leans toward construction: without stimulation, nothing meaningful forms. The nativist counters that stimulation is necessary but primarily tunes and switches on pre-existing systems. The definitional snag is equally crucial. If “independent” means independence from external content, it is a myth. If it means independence in the rules and standards of cognition, the internal constraints that shape how we think, then independence survives at the structural level. Edge cases cut both ways: infants display early object expectations, suggesting pre-structuring, while humans can imagine higher dimensions and fictional worlds, seemingly beyond direct experience. The empiricist replies that even these feats are extrapolations from prior inputs, executed by a mind adept at recombination.

A hybrid view emerges as not only attractive but hard to avoid: the mind may be innately structured, yet actual thinking requires experiential activation. All real thought depends on prior input, even if that input does not fully construct it from scratch. That isn’t naïve empiricism but a measured synthesis: innate potential married to experiential development.

Classical voices deepen the picture. Socrates, via Plato, treats learning as recollection: stimulus functions as a midwife, drawing out latent knowledge rather than depositing content from outside. This challenges the empiricist at the root: perhaps the mind contains seeds that dialogue merely awakens. Thiruvalluvar, by contrast, exalts cultivation: listening, exposure, and moral discipline transform raw experience into wisdom. He aligns with the empiricist emphasis on input but insists that without reflection and virtue, exposure remains inert. Put together, these positions triangulate a compelling map: internal latency needs external engagement, and external engagement needs disciplined processing.

What does this mean for originality and the plagiarism panic? First, convergence is real. Two thinkers can witness the same social currents and, independently, produce strikingly similar analyses. Second, independence should be reconceived: not independence from sources, but independence in the quality of transformation, that is, how rigorously, ethically, and creatively one reworks the given. Third, responsibility shifts to the learner and the writer. Even if structures are innate and stimulus indispensable, wisdom is not passively received. It is painstakingly constructed through attention, judgment, and character.

From a Vedantic angle, this can be seen as a duality unfolding: inherent capacity meets experiential reality, and through conscious effort (study, reflection, practice) it becomes insight. The mind may not be a blank slate, and it certainly isn’t a sealed vault; it is a living architecture that must be animated, tested, and refined. So is “independent thinking” a myth? Only if we define it as thinking without lineage or input. If instead we define it as the disciplined power to transform what we encounter, to make it truer, clearer, and more generative, then independence is not only real; it is precisely what responsible thought demands.

Cheers.

ravivarmmankkanniappan, 20:03 19/03/2026, 3.0567° N, 101.5851° E

© All Rights Reserved

Tuesday, 10 March 2026

THE APE THAT DREAMED THAT IT WAS A GOD

(AI Generated Image)

 


For most of our history, humans appear to have been reasonably competent creatures, in small doses. Place a dozen of us in a village, a hunting band, or a tribal encampment, and society functions with disarming simplicity. Everyone knows who grows the tapioca, who fixes the roof, and, crucially, who stole the goat. Leadership tends to fall to the person who can keep the fire burning, ward off predators, and remind the rest of the tribe not to eat the bright red berries. But magnify this arrangement to millions of strangers, add bureaucratic labyrinths, televised debates, and an occasional flag-waving ceremony, and suddenly the system produces something extraordinary: a natural habitat in which theatrical, hyper-ambitious, and occasionally shameless individuals rise, almost effortlessly, to positions of authority. A rare evolutionary niche indeed: the apex predator of the political savannah.

Meanwhile, the sensible people, the ones who actually fix the roof, slowly edge toward the back of the room, confused. The meeting that was supposed to be about replacing a broken ladder has somehow morphed into a three-hour argument about who deserves to hold it. Progress, we call it. Though one suspects the ladder would get repaired sooner if left to a tribe of moderately organized squirrels. Civilization, governance, social institutions: all grand words for what occasionally looks like an elaborate filtration system for selecting individuals who should not be trusted with the office coffee machine, let alone the machinery of a country.

And perhaps this is the real flaw in our planetary project: humans were never designed to run the world. We were built for ambling across grasslands, sharing berries, and checking over our shoulders to ensure we hadn’t been designated as someone else’s lunch. A modest, sensible role in the food chain. But give us symbolic thought, agriculture, philosophy, science, industrialization, psychology, and digital networks, and we take it as a sign that we should appoint ourselves CEOs of the ecosystem. Yuval Noah Harari (historian and philosopher, b. 1976) might call this the inevitable side effect of our dangerous superpower: the ability to conjure shared fictions at scale. Once a few of us agreed that lines on a map are sacred, that paper is money, and that slogans are a form of truth, we could coordinate in vast numbers, but sadly we could also mislead one another with professional efficiency. A species that once negotiated over berries now negotiates over narratives, currencies whose value rises with repetition. The same cognitive fireworks that let us imagine a better future also let us invent better justifications, more decorative delusions, and myths elaborate enough to require ministries.

Humanity sometimes looks like a species that accidentally promoted itself. We evolved for a modest job, wandering grasslands, sharing berries, and nervously checking whether we were about to become lunch, yet the moment we discovered stories and symbols, we interpreted that as a mandate to run the entire planet. In global geopolitics, this plays out like a prehistoric foraging tribe that somehow acquired a corporate org chart: borders are sacred office partitions, currencies are colourful reward points, and ideologies are motivational posters everyone pretends to understand. Leaders gather in diplomatic boardrooms to negotiate narratives the way our ancestors once negotiated berry bushes, only now the berries are trade routes, alliances, sanctions, and “national interests.” The strange trick is that most of it works because we all agree to treat the memos as real; repeat a slogan often enough and it graduates into policy. Our great cognitive superpower, the ability to believe the same story at scale, lets billions coordinate, innovate, and occasionally build rockets, but it also means the world is effectively managed by a former band of foragers who discovered PowerPoint and decided that what the ecosystem really needed was quarterly strategy meetings.

Long before any of this, Thiruvalluvar (an ancient Indian philosopher, circa 4th century CE) had the courtesy to warn us in couplets that a ruler without virtue is a calamity, that greed corrodes judgment, and that governance without justice is merely a louder form of theft. He might have phrased it more elegantly, but the gist is familiar: better the leader who rescues the drowning than the one who asks whether the drowning have filled in the correct form. In a village this is common sense; in a nation it becomes a manifesto nobody reads. If small societies rely on character because everyone can see it, large societies rely on spectacle because character no longer fits on a billboard. We keep mistaking applause for approval and volume for validity, and thus the ladder remains tragically unfixed.

Francis Bacon (an English philosopher, 1561-1626), who never met a cognitive bias he didn’t try to categorize, would likely diagnose our misadventures as an infestation of idols. The Idols of the Tribe: our species-wide habits of overgeneralizing and seeing patterns where none exist. The Idols of the Cave: our private preconceptions and pet theories, which we defend with the ferocity usually reserved for family heirlooms. The Idols of the Marketplace: the way language turns confusion into policy by giving vague ideas sturdy names. And the Idols of the Theatre: our fondness for grand systems that are more elegant than accurate. Put these together and you get modern governance, a theatre crowded with idols and not nearly enough exits, where the debate about the ladder proceeds flawlessly in the passive voice (mistakes were made, responsibilities were misunderstood, repairs were delayed, but the press conference went very well).

Then comes Sigmund Freud (a neurologist and the founder of psychoanalysis, 1856-1939), whispering that the true ruler of the polis might be the unconscious, the vast, inconvenient ocean beneath our carefully ironed intentions. We advertise to desires we don’t admit, vote for stories we can’t resist, and then rationalize our choices as if logic had been invited from the start. The superego drafts the manifesto, the id writes the campaign jingle, and the ego edits the minutes afterward to make it all sound deliberate. Industrial society discovered that the psyche is a lever, and so we built entire industries to pull it. If Bacon taught us to watch our errors, Freud taught us to watch the watcher, to suspect that the person holding the ladder might be doing so to impress their father, terrify their rival, or seduce the electorate, anything, really, except fix the roof.

Meanwhile, the spiritual economy upgraded itself to a doctrine of ownership. Not just land or cows, but attention, identity, opinion, and afterlife options. We collect followers the way our ancestors collected firewood, then pray that the algorithm, our new household god, will smile upon our sacrifices. Death once retired us from the world; now it threatens to interrupt our brand strategy. If Thiruvalluvar counselled restraint and justice, our age prefers a more actionable virtue: scale. We confuse “bigger” with “better,” “louder” with “truer,” and “trending” with “true.” It turns out you can capture the world’s attention without once capturing the problem at hand, which is why the ladder’s defect has more publicity than solutions.

A contemporary thinker like David Graeber (an anthropologist and political activist, 1961-2020) would add that bureaucracy expands not to solve problems but to define them into eternity. Paperwork is our civilization’s poetry, a sprawling epic in which the hero is a form and the dragon is a missing signature. Whole categories of “bullshit jobs” arise to service the narrative machinery that services the other narrative machinery, until the only thing being produced at scale is justification. We used to hunt deer; now we hunt compliance. We used to share meat; now we share meeting invites. If Harari mapped how fictions make us many, Graeber mapped how paperwork makes us busy, too busy, often, to notice that the roof is leaking onto the file labelled “Roof Integrity.”

The Scientific Revolution promised us a method: fewer idols, more evidence; fewer feelings, more facts. We honoured that promise by building instruments of astonishing precision and then using them to measure our preferences. We split atoms and then our attention. Rationality became a toolkit for building better machines and better excuses; the same empiricism that could heal a city could also optimize a distraction. We tell ourselves that the data will save us, but data, like the gods, have priests, and priests, like the rest of us, have incentives. Thus, empiricism often arrives at the policy table on time, only to discover that the seating chart is already fixed.

Industrialization dragged us into cities and into ourselves. The modern psyche, half spectacle, half surveillance, oscillates between craving visibility and fearing exposure. Freud’s descendants help us label the oscillation, advertisers help us monetize it, and the rest of us post about it. Digital networks turned our cognitive village into a global amphitheatre where everyone speaks and nobody listens long enough to pass the ladder. We call this “networked cognition,” a charming euphemism for outsourcing memory to machines and delegating judgment to trends. We have reached the point where the town crier is automated, and the town itself is an app asking us to rate our experience of the fire while the house burns.

Perhaps the truth is embarrassingly simple. Maybe humanity was never meant to design social constructs spanning continents. Maybe our wiring was optimized for cooperative foraging, not parliamentary theatrics. Maybe the cognitive revolution was less an upgrade than a cosmic glitch, a misfired mutation that gave primates the ability to invent bureaucracy. Thiruvalluvar would urge us to rediscover virtue and restraint. Bacon would plead for method over myth. Freud would ask us to interrogate our motives before we broadcast them. Harari would remind us that our superpower is a shared story, and that stories can hand us both tools and chains. Graeber would advise us to notice when the structure we built to help us has become the reason help cannot arrive.

A clear 21st-century example is the global response to the COVID-19 pandemic. A microscopic virus spread through a species capable of sequencing its genome within weeks and designing vaccines in under a year, an astonishing triumph of science. Yet the crisis quickly became a theatre of competing narratives. Governments argued over borders, political parties turned masks and vaccines into identity badges, and social media flooded the public square with conspiracies and counter-stories. In some places, the logistics of saving lives were slowed by bureaucratic procedures, ideological battles, and mistrust of institutions. Scientists pleaded for evidence-based method, echoing the spirit of Francis Bacon, while psychologists pointed to the fear, denial, and tribal thinking that Sigmund Freud might have recognized. Meanwhile, the crisis revealed how global coordination depends on shared beliefs, much as Yuval Noah Harari argues. Humanity possessed the tools to solve the problem, but our stories about power, identity, and authority often made the solution harder to reach.

Perhaps the quiet absurdity of our age is that competence whispers while confidence campaigns. The thoughtful hesitate, the theatrical govern. Our institutions resemble a ladder with missing rungs, still ceremonially displayed, endlessly discussed, but rarely repaired by the few who actually know how to climb. So, we polish speeches, redesign platforms, and issue declarations of progress while the roof continues its patient leaking. Civilization becomes a ritual of announcing solutions rather than practicing them. Meanwhile, the squirrels, unburdened by ideology, bureaucracy, or televised debates, solve the practical problem of winter with an efficiency our committees might envy.

The tragedy is not that humanity dreamed boldly, but that the ape that dreamed it was a god occasionally forgot it was still an ape. Power magnifies the illusion, language decorates it. And so, we continue negotiating over narratives while the scaffolding of reality creaks beneath us. As Friedrich Nietzsche warned, “He who fights with monsters should look to it that he himself does not become a monster.” Until humility climbs the ladder before ambition does, progress may remain what it too often is: a press conference about repairs rather than the quiet work of fixing the roof.

Cheers.

ravivarmmankkanniappan, 19:08 10/03/2026, 3.0567° N, 101.5851° E

© All Rights Reserved

 


Saturday, 7 March 2026

THE ITCH OF WAR: FROM KURUKSHETRA TO HORMUZ

 

(AI Generated Image)

War rarely begins with grand strategy or noble declarations. More often, it begins with something far smaller and far more human. Imagine an itch, an irritation that refuses to go away. One person feels it first, perhaps pride wounded, ego bruised, grievance unresolved. Instead of calming the irritation through restraint, reflection, or compromise, he provokes another. Soon the second person begins scratching as well. What started as a private discomfort becomes shared agitation. Retaliation follows retaliation, and the scratching becomes a spectacle. Others join in, to defend honour, to settle scores, or simply because conflict has a way of pulling spectators onto the stage. Before long, the original irritation is forgotten, yet the pain has spread everywhere. That, in essence, is how wars often grow, not merely from necessity, but from unchecked impulses and the human tendency to export one’s own unrest.

A striking illustration of this dynamic appears in the ancient Indian epic Mahabharata and the catastrophic Kurukshetra War. The conflict did not begin with armies marching across plains; it began with humiliation, envy, and pride. The rivalry between the Pandavas and the Kauravas escalated through insults, manipulation, and the infamous dice game in which power, honour, and dignity were gambled away. The public humiliation of Draupadi transformed a palace dispute into a moral crisis that demanded redress. What might have remained a family quarrel hardened into an existential struggle involving kingdoms across the subcontinent. By the time diplomacy failed, the original grievances had become secondary. Pride, vengeance, and the perceived need to restore honour had already set the stage for a war that would devastate an entire generation.

History shows that this pattern repeats itself with uncomfortable regularity. Conflict is rarely spontaneous; it usually emerges within larger cycles of power, insecurity, and shifting influence. When dominant powers sense their authority weakening or their economic foundations wobbling, strategic anxiety tends to rise. Military posturing becomes more visible, statements grow sharper, and warships suddenly begin what might politely be described as “presence missions.” Aircraft carriers do not wander oceans by accident. They are floating signals. When global power feels uncertain, the world often witnesses a season of muscle flexing disguised as diplomacy.

This dynamic is not new. In the nineteenth century, Britain and Russia engaged in a prolonged geopolitical rivalry in Central Asia that later became known as “the Great Game.” The term was first used by Captain Arthur Conolly of the British East India Company’s Bengal Light Cavalry in the 1840s to describe the strategic contest unfolding across Afghanistan, Persia, and the Central Asian Khanates. Later, Rudyard Kipling’s novel Kim gave the phrase its romantic and mysterious aura, portraying a shadowy world of spies, agents, and imperial manoeuvrings. Behind the literary drama, however, the Great Game was simply two empires attempting to secure influence, buffer zones, and strategic advantage without triggering a full-scale war between themselves.

What is unfolding today in the Middle East resembles a far more dangerous version of that rivalry. Observers increasingly describe the current crisis as a “New Great Game,” but the comparison is only partially accurate. The nineteenth-century contest revolved largely around territory and imperial boundaries. The modern one revolves around regime survival, strategic deterrence, economic choke points, and global alliances that stretch far beyond the region itself.

The present escalation began dramatically at the end of February 2026, when the United States and Israel launched coordinated high-intensity strikes against Iranian political, military, and nuclear infrastructure. The operations, reported as large-scale precision campaigns, targeted command centres, missile facilities, and key figures within Iran’s leadership. Reports from multiple outlets indicated that the attacks killed Iran’s Supreme Leader, Ayatollah Ali Khamenei, along with several senior military commanders and government officials. Iranian authorities later confirmed the deaths and declared a national mourning period.

This moment represented a decisive break from the shadow war that had defined US-Iran tensions for decades. Until then, confrontation largely occurred through proxies, cyber operations, covert sabotage, and limited missile exchanges. Directly targeting the leadership of the Iranian state crossed a threshold that previous administrations had avoided. The strategic logic behind the strike appeared to be the classic doctrine of overwhelming force: cripple the command structure quickly and create internal political shock large enough to weaken the regime itself. Officials in Washington framed the operation partly in those terms, suggesting that the Iranian population should seize the moment to reclaim political control from its ruling system.

But wars rarely unfold according to the tidy logic of strategic planners. Iran responded with immediate retaliation, launching waves of drones and ballistic missiles at American installations and allied states across the Gulf. The scale of the response was notable not only for its intensity but for its geographic reach. Missiles and drones targeted locations in Qatar, Bahrain, the United Arab Emirates, Kuwait, and Saudi Arabia, while some strikes extended toward Cyprus, Turkey, and Azerbaijan. Explosions were reported near major infrastructure hubs, including ports, energy terminals, and military bases. In geopolitical terms, Iran was sending a blunt message: if its regime was threatened, the entire regional system would feel the shock. Interestingly, Iran’s approach relies heavily on cheap drones, notably the Shahed-136 and Shahed-131 kamikaze models. Estimates place production costs between $20,000 and $50,000 per unit, a fraction of the cost of more sophisticated US and Israeli systems like the Patriot missile or Israel’s David’s Sling and Arrow-3 interceptors, which can range from $1 million to over $3 million per launch; at those figures, a single interceptor costs roughly 20 to 150 times as much as the drone it destroys. By leveraging affordability and sheer numbers, Iran can project strategic disruption without the enormous financial burden of high-end missile exchanges, turning cost asymmetry into a tactical advantage.

The escalation deepened when Iran announced the closure of the Strait of Hormuz, one of the most critical maritime chokepoints in the global economy. Roughly a fifth of the world’s oil and a substantial portion of liquefied natural gas normally transit through that narrow passage between the Persian Gulf and the Gulf of Oman. Declaring the strait closed, and threatening vessels attempting passage, instantly disrupted global energy flows. Hundreds of tankers were stranded or forced to reroute. Insurance firms began withdrawing coverage for shipping in the region. The economic ripple effects spread quickly through global markets.

At the centre of the crisis now stands a clear strategic confrontation between the United States and Israel on one side and Iran on the other. Washington and Tel Aviv appear to be pursuing objectives that go well beyond slowing Iran’s nuclear ambitions. The pattern of strikes suggests a broader effort to degrade Iran’s missile capabilities, dismantle the network of allied militias often described as the “Axis of Resistance,” and limit Iran’s ability to project influence across Lebanon, Syria, Iraq, and Yemen. Israeli leadership has emphasized that the war is intended to be decisive rather than permanent, though history offers little reassurance that conflicts launched with such confidence remain contained.

Meanwhile, other major powers are behaving with notable caution. Russia and China both condemned the strikes and called for emergency discussions at the United Nations, yet neither has shown serious interest in entering the conflict militarily. Their restraint is not altruism; it is calculation. Russia remains heavily engaged in its own war in Ukraine and has little appetite for a second direct confrontation with the United States. China, while deeply dependent on Middle Eastern energy supplies, prioritizes stability above ideological alignment. India, for its part, is walking a delicate line. New Delhi relies heavily on Gulf energy imports and maintains strategic partnerships with both Washington and Tehran, making overt support for either side risky. As a result, India has largely called for de-escalation and dialogue, emphasizing diplomacy while quietly managing its energy security and regional influence. An open war involving great powers would threaten precisely the economic and strategic stability that Beijing and New Delhi alike rely upon.

That does not mean Iran stands entirely alone. Diplomatic backing, intelligence sharing, technological assistance, and strategic coordination are all possible forms of indirect support. Iranian officials have hinted at receiving “political and other assistance” from both Moscow and Beijing, though the ambiguity appears intentional. In geopolitics, uncertainty itself can function as a strategic tool.

Perhaps the most uncomfortable position belongs to the Gulf monarchies. Countries such as Saudi Arabia, the United Arab Emirates, and Qatar host major American military bases while simultaneously depending on regional calm to sustain their economic growth. That dual reality places them directly in the crossfire. Iranian missile and drone attacks have already struck installations in several of these states, including Qatar’s Al Udeid air base and key port infrastructure in the UAE and Bahrain. At the same time, disruptions to shipping through the Strait of Hormuz threaten the very energy exports that underpin their economies. In effect, they are both partners and potential victims in the same strategic arrangement.

There is a historical echo here that stretches far back into antiquity. When Xerxes I of Persia, the self-styled “King of Kings,” pushed the Achaemenid Empire westward, his campaign was driven not only by expansion but by the need to reinforce imperial authority after internal revolts in provinces like Egypt and Babylon. His massive expedition against the Greek world required extraordinary engineering feats, such as the bridge across the Hellespont, and culminated in decisive setbacks at battles like Salamis. Empires often expand outward when internal pressures rise, projecting strength to consolidate authority at home. The comparison is imperfect, yet the pattern is familiar: a hegemonic power attempts to maintain dominance across a vast strategic theatre while managing political strain, logistical limits, and unpredictable resistance.

The crisis differs from the nineteenth-century Great Game in several important ways. First, the struggle today is less about territorial control than about control over infrastructure and systems: energy routes, financial networks, cyber capabilities, missile defences, and global supply chains. A single disruption in the Strait of Hormuz can shake energy markets from Tokyo to London in a matter of hours.

Second, the battlefield now includes powerful non state actors. Groups such as Hezbollah and various regional militias extend the reach of state power while maintaining enough ambiguity to complicate retaliation. Their involvement blurs the line between conventional war and proxy conflict, increasing the risk of escalation across multiple fronts simultaneously.

Third, and perhaps most significant, the consequences are global rather than regional. Energy prices surge, stock markets wobble, and shipping routes stretch thousands of miles longer as vessels avoid conflict zones. Insurance companies withdraw coverage, logistics networks slow down, and the spectre of recession begins to hover over distant economies that have no direct involvement in the fighting.

All of this makes the modern “Great Game” less like a chess match and more like a complex web of dominoes. One move rarely affects only one square on the board.

The role of media further complicates how people understand these events. If you really want to experience modern warfare without leaving your sofa, forget streaming platforms and try channel surfing instead. Start with CNN, where the graphics move fast, the music is urgent, and the strikes often sound like decisive acts of strategic necessity. Then switch to Al Jazeera and watch the same event transform into an entirely different film: suddenly the language shifts, the victims have names, and the missiles look less like strategy and more like devastation. For an added intellectual workout, flip to the BBC, which usually delivers the most carefully balanced version, a calm voice explaining that everything is “deeply concerning,” followed by a panel discussion that politely circles the issue without quite landing anywhere. By the time you finish rotating through the channels, you will have watched three different movies about the same war, complete with heroes, villains, tragedy, and suspense. It is better than any thriller: geopolitics, emotion, moral ambiguity, and contradictory plotlines, all broadcast live.

And yet, for most people far from the front lines, daily life continues. Bills must still be paid, children still go to school, friends still gather, and football debates remain as heated as ever. The spectacle of geopolitics often unfolds on screens rather than streets. For most of the world’s population, war exists as background noise, distant, dramatic, and strangely abstract. The evening news may flash images of missiles and burning infrastructure, but outside the window the bus still arrives, shops still open, and someone somewhere is passionately arguing about the weekend match. Life has a stubborn rhythm that refuses to pause simply because history is making noise elsewhere.

Yet that distance is fragile. Every global conflict begins somewhere specific: one border, one grievance, one decision taken in a quiet room by a handful of leaders. But the consequences rarely remain contained. Energy prices rise, alliances shift, economies tremble, and narratives harden. The real danger lies not only in the violence itself but in the human tendencies that ignite it: pride that refuses compromise, leaders who gamble with escalation, and societies that mistake retaliation for justice. When those forces converge, the irritation spreads outward like ripples in water. The scratching multiplies.

And eventually, everyone feels the itch.

Still, human history has never been defined solely by conflict. It is also defined by resilience, courage, and the stubborn refusal to surrender to fear. The spirit of that resilience is captured beautifully in the words of Subramania Bharati (aka Mahakavi, meaning the ‘Great Poet’; a 20th-century revolutionary, patriot, and social reformer), whose famous poem Achamillai Achamillai reminds us that uncertainty and danger have never been enough to stop the human will to live freely:

“அச்சமில்லை அச்சமில்லை அச்சமென்பதில்லையே
இச்சகத்துள் ஒருவனுக்கு இந்நிலையே அமையுமோ?
அச்சமில்லை அச்சமில்லை அச்சமென்பதில்லையே.

உச்சிமீது வானிடிந்து வீழுகின்ற போதினும்
அச்சமில்லை அச்சமில்லை அச்சமென்பதில்லையே.

அஞ்சுவது யாதொன்றும் இல்லை,
அஞ்சி நிற்கும் காலமும் இல்லை.”

(an excerpt from Subramania Bharati’s poem “The Fear”)

“There is no fear, there is no fear, there is no such thing as fear.
Can such a state truly exist for a human in this world?
Yet I say again, there is no fear, there is no fear.

Even if the sky itself were to shatter
And come crashing down upon my head,
Still there is no fear, there is no fear.

There is nothing in this world to be afraid of,
Nor is this a time to stand trembling in fear.”

Bharati was not naive about the dangers of the world; he understood them deeply. But his message was clear: humanity must not surrender its courage, its dignity, or its hope. Wars may erupt, powers may compete, and the machinery of geopolitics may grind loudly in the background. Yet ordinary life continues precisely because people refuse to let fear define the boundaries of their existence.

So, while empires manoeuvre and alliances shift, people will still gather around dinner tables, argue about football, plan their futures, and teach their children to dream of a better world. History may move in storms, but humanity moves in hope.

Cheers.

ravivarmmankkanniappan, 20:53 07/03/2026, 3.0567° N, 101.5851° E

© All Rights Reserved.