Sunday, 12 April 2026

Echoes of Aroma and Revolution in Old Taiping

 

Changchun Villa

I attended a friend’s son’s wedding in Taiping last Saturday, a joyous occasion filled with laughter, warmth, and celebration. As the festivities drew to a close, our friend Selva mentioned a heritage gem located nearby, the Antong Coffee Mill. Intrigued by the promise of history and tradition, we set off without hesitation, eager for an unexpected adventure. What awaited us was not merely a visit, but a journey through time.

The moment I stepped into the coffee complex, it felt as though I had crossed the threshold into another era. The rich, intoxicating aroma of roasted beans hung in the air, welcoming me like an old friend. Standing proudly at the entrance was the famed Changchun Villa, a silent witness to history and once home to one of the remarkable figures connected to the founding of the Republic of China. Its presence lent an air of reverence and mystery, inviting us to uncover stories long preserved within its walls.

Oven

Founded in 1933, Antong Coffee Mill is officially recognized as the oldest coffee mill in Malaysia still in operation. Nestled in Taiping, Perak, the factory is a living museum that has faithfully preserved its traditional wood-fired roasting methods for more than ninety years. Established by Mr. Tiah Ee Mooi and now managed by the third generation of his family, Antong stands as a testament to dedication, resilience, and heritage. The compound itself holds layers of history: Tiah rented Changchun Villa in 1933 before purchasing it two years later, and the coffee mill was originally built from the villa’s stables.

The villa once served as the residence of Chen Cuifen, often remembered as the “Forgotten Female Revolutionary” and the devoted partner of Dr. Sun Yat-sen, the Father of Modern China. It is believed that this tranquil residence became a strategic planning ground for anti-Qing revolutionary activities in the early 1900s. Born in Hong Kong in 1873, Chen Cuifen played a crucial yet understated role in the 1911 Xinhai Revolution. For more than two decades, she supported Sun Yat-sen through exile and hardship, managing logistics, transporting weapons, and tending to wounded soldiers during their time in Japan and Malaya. Despite her unwavering dedication, her contributions were often overshadowed in official histories.

Chen Cuifen
(photo courtesy of Wikipedia)

After the establishment of the Republic of China in 1912, Chen Cuifen settled in Malaya, where she lived independently. She adopted a daughter named Sun Rong and engaged in business ventures, including establishing a rubber plantation. For a period, she resided in Taiping, at a villa now adjacent to Antong Coffee Mill, where it is said that Sun Yat-sen himself once stayed. Standing there, surrounded by echoes of history, it was impossible not to feel a profound sense of awe.

As I continued my exploration, I had hoped to witness Antong’s famed traditional production process firsthand. Unfortunately, we arrived too late in the day, as the roasting can only be observed in the morning. Though I missed the spectacle, the lingering aroma of coffee and the preserved machinery allowed us to vividly imagine the time-honoured craft.

Sand Roast

In the early hours, beans are roasted in wood-fired ovens fuelled by recycled timber and mangrove logs, imparting a distinctive smoky fragrance that defines Antong’s signature brew. The celebrated double-roasting technique then transforms the beans into a bubbling mixture blended with sugar and margarine, creating a rich, caramelized essence. Once cooled, the hardened mass is manually smashed into fragments before being ground into fine powder. While modern methods are now employed for efficiency, the preserved mill stands as a living exhibit, offering a captivating glimpse into the meticulous craftsmanship of the past.

Old Mill Machines

The experience was nothing short of enchanting. Visitors are free to observe the roasting process, explore the artifacts housed within Changchun Villa, and savour complimentary coffee samples in the air-conditioned showroom. Antong’s signature Kopi O remains a timeless favourite, while contemporary offerings such as Durian White Coffee, espresso ice cream, and specialty golden coffee showcase its evolution through the decades.

Entrance to The Old Mill

It was truly a journey that captured the passage of time. The old coffee mill stands as a proud testament to the enduring legacy of Nanyang-style coffee. Though the historic machinery now rests as a silent exhibit, the entire complex is permeated with an irresistible coffee aroma that evokes nostalgia and wonder. The Changchun Villa, now transformed into a museum adjoining the café, offers a stirring glimpse into the past. Knowing that Dr. Sun Yat-sen and Chen Cuifen once lived and planned there made the experience exhilarating, giving us goosebumps.

A Statue of Sun Yat-sen at the Entrance of Changchun Villa

If you ever find yourself in Taiping, do not miss the opportunity to visit Antong Coffee Mill. Pause for a cup of its aromatic brew, wander through its storied halls, and immerse yourself in the rich tapestry of history. It is more than a destination; it is an adventure through time, where every sip tells a story and every step echoes with the legacy of those who shaped the future.

Cheers

ravivarmmankkanniappan@205712042026 3.0567° N, 101.5851° E

© All Rights Reserved

Tuesday, 7 April 2026

The Managed Myth of Work-Life Balance in Late Capitalism

 

(AI Generated)

From the perspective of critical theory, the modern discourse of work-life balance is less a humanitarian breakthrough than an adaptive response to the internal contradictions of capitalism. What appears as a progressive concern for employee well-being is, in fact, deeply embedded in the same system that produces the very conditions it seeks to alleviate. The language of balance does not resolve the tension between human needs and economic imperatives; it manages it.

Since the Industrial Revolution, work has been progressively abstracted, measured, and optimized. This transformation reorganized not only production but also human identity. As Karl Marx observed, the worker becomes alienated, reduced to a function within a system that values output over experience. In contemporary terms, this reduction is encoded in the evolution of language, from labour to “human resources,” and now to “human capital.” Each term reflects a deeper internalization of market logic, where human capacities are treated as assets to be maximized.

The Frankfurt School offers a sharper lens through which to interpret this shift. Thinkers like Theodor Adorno and Max Horkheimer argued that advanced capitalism sustains itself not merely through economic structures, but through cultural and psychological integration. Dissent is not eliminated; it is absorbed. In this sense, work-life balance functions as what might be called a “managed contradiction”, a concept that acknowledges distress while neutralizing its disruptive potential.

Nowhere is this more visible than in contemporary corporate practices, particularly within the technology sector. Companies such as Google and Meta have pioneered expansive employee wellness ecosystems, including on-site gyms, mindfulness programs, flexible work arrangements, and even nap pods. These initiatives are often celebrated as evidence of a more humane workplace. Yet they also blur the boundary between work and life in ways that intensify engagement. When the workplace provides not only income but also social life, leisure, and identity, disengagement becomes psychologically and socially costly. The result is not less work, but a more totalizing form of it.

Similarly, platform-based companies like Uber and Grab exemplify the neoliberal reconfiguration of labour. Here, the rhetoric shifts from employment to “flexibility” and “independence.” Workers are framed as autonomous entrepreneurs, free to choose when and how they work. However, this autonomy is constrained by algorithmic management systems that dictate pricing, visibility, and access to opportunities. The risks traditionally borne by employers, such as income stability, health benefits, and long-term security, are transferred onto individuals, who must now continuously adapt to fluctuating conditions.

This transformation aligns closely with Michel Foucault’s concept of governmentality. In neoliberal societies, power operates less through direct control and more through the shaping of subjectivity. Individuals come to see themselves as projects to be managed, constantly optimizing their skills, time, and well-being. Work-life balance, within this framework, becomes a personal obligation rather than a collective right. Failure to achieve it is internalized as a personal deficiency rather than recognized as a structural outcome.

Even the rise of corporate wellness and mental health initiatives reflects this logic. Programs promoting mindfulness, resilience, and emotional intelligence are framed as tools for personal empowerment. Yet they often function to recalibrate individuals to endure high-pressure environments without questioning the conditions that produce stress. The focus shifts from changing the system to adapting the self.

The paradox, then, is stark. Work-life balance is simultaneously necessary and unattainable. It is necessary because human beings cannot sustain indefinite productivity without psychological and physiological consequences. Yet it remains elusive within a system that continuously expands its demands and redefines its limits. The concept persists not because it resolves this contradiction, but because it renders it tolerable, giving individuals a language to cope without fundamentally altering the structure that produces the strain.

A contemporary illustration of this tension can be seen in the rise and subsequent normalization of “quiet quitting,” a term that gained global traction through platforms like TikTok. Workers, particularly younger professionals, began advocating for doing only what their roles formally required: no unpaid overtime, no emotional overextension, no constant availability. At first glance, this appeared to be a reclaiming of boundaries, a grassroots correction to the excesses of modern work culture. Yet organizations quickly absorbed and reframed the phenomenon. Corporate discourse shifted toward “employee engagement,” “wellness initiatives,” and flexible work policies, not as structural concessions but as strategic responses to maintain productivity and retention.

Even in companies such as Amazon, where reports have highlighted intense performance metrics and high-pressure environments, the response has not been a reduction in systemic demands but the introduction of coping mechanisms, such as mental health resources, resilience training, and carefully calibrated flexibility. These measures acknowledge the human cost, yet they stop short of redistributing or reducing the underlying pressures. Instead, they enable workers to endure them more sustainably.

Thus, the paradox deepens. Work-life balance becomes both a necessity for survival and a tool that stabilizes the very system that undermines it. It does not dismantle the contradiction between human limits and economic expansion; it manages it. In doing so, it transforms a structural tension into a personal responsibility, ensuring that the system can continue to evolve without ever having to truly resolve the imbalance at its core.

In this sense, work-life balance operates as a stabilizing myth of late capitalism. It offers the promise of reconciliation between human flourishing and economic rationality, while deferring any substantive restructuring of their relationship. The individual is encouraged to believe that balance is achievable through better choices, better habits, better self-management, obscuring the structural conditions that make such balance elusive.

What emerges is a subtle but profound shift in responsibility. Where institutions once bore some obligation for the welfare of workers, that burden is increasingly displaced onto individuals. It is framed as empowerment, freedom, flexibility, autonomy, but experienced as obligation, requiring the individual to constantly negotiate, optimize, and justify their own existence within the system.

Work-life balance, then, marks not the humanization of work but the normalization of its contradictions. What appears as a concession to human need is, in many ways, an adaptation that allows the system to endure without addressing its core imbalance. The language of balance reframes strain as something to be managed individually rather than structurally resolved, placing the burden back on the worker to negotiate the limits of their own exhaustion.

This tension is not new. The classical Tamil text Thirukkural captures a timeless awareness of excess and restraint. Consider the couplet below by Sage Thiruvalluvar:

“The life of one who does not live within limits may seem to exist, but it will perish without truly being.” - Kural 476

Here, Thiruvalluvar speaks not only to personal moderation but to the sustainability of any system that ignores natural limits. When applied to modern work culture, the insight becomes strikingly relevant. A structure that continually stretches human capacity under the guise of flexibility risks hollowing out the very lives it depends on.

Work-life balance, in this light, becomes less a solution and more a coping mechanism, an acknowledgment that the system demands more than it can justly sustain, while subtly urging individuals to self-regulate rather than question the demand itself.

Cheers.

ravivarmmankkanniappan@181007042026 3.04384, 101.58062

© All Rights Reserved

Thursday, 19 March 2026

Intellectual Integrity in a World Without Void Thinking

 

(AI Generated Image)

A recent plagiarism flap, an activist accusing a public figure who happens to be a politician, academic, and social advocate, captures a familiar anxiety about originality. The politician threatened a defamation suit, and a week later the activist apologized, conceding that the politician had published the idea earlier. Yet the activist maintained he hadn’t read that prior work and that his view arose from his own independent thinking. That claim, whether true or not, spotlights a deeper puzzle: if thought is built from observation and experience, what exactly do we mean by “independent” thinking? Perhaps what we often witness is not theft, but convergence, two minds attending to the same patterns in the world and assembling similar conclusions from shared materials.

In academia, the chorus against plagiarism swells, and calls for “originality” and “independent thought” grow ever more insistent. But the word “independent” can be a romantic overreach. Imagination needs raw material; no mind thinks out of a void. We observe, remember, compare, and extrapolate from the known to press into the unknown. On this empiricist picture, cognition is not spontaneous generation. It is construction, intelligent, disciplined, sometimes dazzling construction, from what experience supplies.

Still, this framing can underrate the mind’s capacity for abstraction, pattern recognition, analogy, and synthesis. Even if imagination depends on existing materials, it can reorder them into forms that feel startlingly new. The mind’s originality often lies less in the bricks and more in the architecture. Dependence on input is undeniable; the question is whether dependence precludes novelty. It need not. Novelty may arise from the structure and depth of reorganization rather than from detachment from experience.

This suggests a refined empiricism where originality is not creation from nothing but transformation of something. The mind is not a creator ex nihilo; it is a reconfigurer. In that light, “independent thought” is never independent of input but can be independent in method: how it selects, filters, and reinterprets the available content.

Opponents press a nativist, rationalist case: the mind isn’t just a processor of experience but comes equipped with innate structures that make certain kinds of thinking possible. Descartes famously claimed some ideas (mathematical truths, the infinite) are not derived from the senses. Kant argued that the mind contributes a priori forms, space, time, causality, structuring experience from the outset. Chomsky proposed an inborn language faculty whose complexity outstrips what pure induction from stimulus could supply. On this account, two points challenge the empiricist’s comfort: the mind isn’t a blank slate, and thought is at least partly generative, producing concepts not strictly traceable to specific sensory inputs.

You can translate this into evolutionary terms, treating innate structures as inherited cognitive architectures shaped by selection. That move makes the nativist view scientifically plausible without smuggling in fully formed ideas. But it doesn’t secure the conclusion that experience is secondary. Early humans may have possessed capacities for abstraction and language, yet capacity is not expression. These potentials need triggers, social scaffolding, and cumulative culture. A child might be wired for mathematics, but without exposure and pedagogy, algebra won’t materialize in isolation. Experience does not merely decorate an interior; it activates and calibrates it.

At this point, the disagreement narrows. The key issues are whether stimulation builds or merely triggers, and what “independent” should mean. The empiricist leans toward construction: without stimulation, nothing meaningful forms. The nativist counters that stimulation is necessary but primarily tunes and switches on pre-existing systems. The definitional snag is equally crucial. If “independent” means independence from external content, it is a myth. If it means independence in the rules and standards of cognition, the internal constraints that shape how we think, then independence survives at the structural level. Edge cases cut both ways: infants display early object expectations, suggesting pre-structuring, while humans can imagine higher dimensions and fictional worlds, seemingly beyond direct experience. The empiricist replies that even these feats are extrapolations from prior inputs, executed by a mind adept at recombination.

A hybrid view emerges as not only attractive but hard to avoid: the mind may be innately structured, yet actual thinking requires experiential activation. All real thought depends on prior input, even if that input does not fully construct it from scratch. That isn’t naïve empiricism; it’s a measured synthesis, innate potential married to experiential development.

Classical voices deepen the picture. Socrates, via Plato, treats learning as recollection: stimulus functions as a midwife, drawing out latent knowledge rather than depositing content from outside. This challenges the empiricist at the root; perhaps the mind contains seeds that dialogue merely awakens. Thiruvalluvar, by contrast, exalts cultivation: listening, exposure, and moral discipline transform raw experience into wisdom. He aligns with the empiricist emphasis on input but insists that without reflection and virtue, exposure remains inert. Put together, these positions triangulate a compelling map: internal latency needs external engagement, and external engagement needs disciplined processing.

What does this mean for originality and the plagiarism panic? First, convergence is real. Two thinkers can witness the same social currents and, independently, produce strikingly similar analyses. Second, independence should be reconceived: not independence from sources, but independence in the quality of transformation, how rigorously, ethically, and creatively one reworks the given. Third, responsibility shifts to the learner and the writer. Even if structures are innate and stimulus indispensable, wisdom is not passively received. It is painstakingly constructed through attention, judgment, and character.

From a Vedantic angle, this can be seen as a duality unfolding: inherent capacity meets experiential reality, and through conscious effort, study, reflection, practice, it ripens into insight. The mind may not be a blank slate, and it certainly isn’t a sealed vault, but it is a living architecture that must be animated, tested, and refined. So is “independent thinking” a myth? Only if we define it as thinking without lineage or input. If instead we define it as the disciplined power to transform what we encounter, to make it truer, clearer, and more generative, then independence is not only real, it is precisely what responsible thought demands.

Cheers.

ravivarmmankkanniappan200319032026 3.0567° N, 101.5851° E

© All Rights Reserved

Tuesday, 10 March 2026

THE APE THAT DREAMED THAT IT WAS A GOD

AI Generated Image

 


For most of our history, humans appear to have been reasonably competent creatures, in small doses. Place a dozen of us in a village, a hunting band, or a tribal encampment, and society functions with disarming simplicity. Everyone knows who grows the tapioca, who fixes the roof, and, crucially, who stole the goat. Leadership tends to fall to the person who can keep the fire burning, ward off predators, and remind the rest of the tribe not to eat the bright red berries. But magnify this arrangement to millions of strangers, add bureaucratic labyrinths, televised debates, and an occasional flag-waving ceremony, and suddenly the system produces something extraordinary: a natural habitat in which theatrical, hyper-ambitious, and occasionally shameless individuals rise, almost effortlessly, to positions of authority. A rare evolutionary niche indeed, the apex predator of the political savannah.

Meanwhile, the sensible people, the ones who actually fix the roof, slowly edge toward the back of the room, confused. The meeting that was supposed to be about replacing a broken ladder has somehow morphed into a three-hour argument about who deserves to hold it. Progress, we call it. Though one suspects the ladder would get repaired sooner if left to a tribe of moderately organized squirrels. Civilization, governance, social institutions, all grand words for what occasionally looks like an elaborate filtration system for selecting individuals who should not be trusted with the office coffee machine, let alone the machinery of a country.

And perhaps this is the real flaw in our planetary project: humans were never designed to run the world. We were built for ambling across grasslands, sharing berries, and checking over our shoulders to ensure we hadn’t been designated as someone else’s lunch. A modest, sensible role in the food chain. But give us symbolic thought, agriculture, philosophy, science, industrialization, psychology, and digital networks, and we take it as a sign that we should appoint ourselves CEOs of the ecosystem. Yuval Noah Harari (historian and philosopher, b. 1976) might call this the inevitable side effect of our dangerous superpower: the ability to conjure shared fictions at scale. Once a few of us agreed that lines on a map are sacred, that paper is money, and that slogans are a form of truth, we could coordinate in vast numbers, but sadly, we could also mislead one another with professional efficiency. A species that once negotiated over berries now negotiates over narratives, currencies whose value rises with repetition. The same cognitive fireworks that let us imagine a better future also let us invent better justifications, more decorative delusions, and myths elaborate enough to require ministries.

Humanity sometimes looks like a species that accidentally promoted itself. We evolved for a modest job, wandering grasslands, sharing berries, and nervously checking whether we were about to become lunch, yet the moment we discovered stories and symbols, we interpreted that as a mandate to run the entire planet. In global geopolitics, this plays out like a prehistoric foraging tribe that somehow acquired a corporate org chart where borders are sacred office partitions, currencies are colourful reward points, and ideologies are motivational posters everyone pretends to understand. Leaders gather in diplomatic boardrooms to negotiate narratives the way our ancestors once negotiated berry bushes, only now the berries are trade routes, alliances, sanctions, and “national interests.” The strange trick is that most of it works because we all agree to treat the memos as real; repeat a slogan often enough and it graduates into policy. Our great cognitive superpower, the ability to believe the same story at scale, lets billions coordinate, innovate, and occasionally build rockets, but it also means the world is effectively managed by a former band of foragers who discovered PowerPoint and decided that what the ecosystem really needed was quarterly strategy meetings.

Long before any of this, Thiruvalluvar (an ancient Indian philosopher, c. 4th century CE) had the courtesy to warn us in couplets that a ruler without virtue is a calamity, that greed corrodes judgment, and that governance without justice is merely a louder form of theft. He might have phrased it more elegantly, but the gist is familiar: better the leader who rescues the drowning than the one who asks whether the drowning have filled in the correct form. In a village, this is common sense; in a nation, it becomes a manifesto nobody reads. If small societies rely on character because everyone can see it, large societies rely on spectacle because character no longer fits on a billboard. We keep mistaking applause for approval and volume for validity, and thus the ladder remains tragically unfixed.

Francis Bacon (an English philosopher, 1561–1626), who never met a cognitive bias he didn’t try to categorize, would likely diagnose our misadventures as an infestation of idols. The Idols of the Tribe, our species-wide habits of overgeneralizing and seeing patterns where none exist. The Idols of the Cave, our private preconceptions and pet theories, which we defend with the ferocity usually reserved for family heirlooms. The Idols of the Marketplace, the way language turns confusion into policy by giving vague ideas sturdy names. And the Idols of the Theater, our fondness for grand systems that are more elegant than accurate. Put these together and you get modern governance, a theatre crowded with idols and not nearly enough exits, where the debate about the ladder proceeds flawlessly in the passive voice, mistakes were made, responsibilities were misunderstood, repairs were delayed, but the press conference went very well.

Then comes Sigmund Freud (a neurologist and the founder of psychoanalysis, 1856–1939), whispering that the true ruler of the polis might be the unconscious, the vast, inconvenient ocean beneath our carefully ironed intentions. We advertise to desires we don’t admit, vote for stories we can’t resist, and then rationalize our choices as if logic had been invited from the start. The superego drafts the manifesto, the id writes the campaign jingle, and the ego edits the minutes afterward to make it all sound deliberate. Industrial society discovered that the psyche is a lever, and so we built entire industries to pull it. If Bacon taught us to watch our errors, Freud taught us to watch the watcher, to suspect that the person holding the ladder might be doing so to impress their father, terrify their rival, or seduce the electorate, anything, really, except fix the roof.

Meanwhile, the spiritual economy upgraded itself to a doctrine of ownership. Not just land or cows, but attention, identity, opinion, and afterlife options. We collect followers the way our ancestors collected firewood, then pray that the algorithm, our new household god, will smile upon our sacrifices. Death once retired us from the world; now it threatens to interrupt our brand strategy. If Thiruvalluvar counselled restraint and justice, our age prefers a more actionable virtue: scale. We confuse “bigger” with “better,” “louder” with “truer,” and “trending” with “true.” It turns out you can capture the world’s attention without once capturing the problem at hand, which is why the ladder’s defect has more publicity than solutions.

A contemporary thinker like David Graeber (an anthropologist and political activist, 1961–2020) would add that bureaucracy expands not to solve problems but to define them into eternity. Paperwork is our civilization’s poetry, a sprawling epic in which the hero is a form and the dragon is a missing signature. Whole categories of “bullshit jobs” arise to service the narrative machinery that services the other narrative machinery, until the only thing being produced at scale is justification. We used to hunt deer; now we hunt compliance. We used to share meat; now we share meeting invites. If Harari mapped how fictions make us many, Graeber mapped how paperwork makes us busy, too busy, often, to notice that the roof is leaking onto the file labelled “Roof Integrity.”

The Scientific Revolution promised us a method: fewer idols, more evidence; fewer feelings, more facts. We honoured that promise by building instruments of astonishing precision and then using them to measure our preferences. We split atoms and then our attention. Rationality became a toolkit for building better machines and better excuses; the same empiricism that could heal a city could also optimize a distraction. We tell ourselves that the data will save us, but data, like the gods, have priests, and priests, like the rest of us, have incentives. Thus, empiricism often arrives at the policy table on time, only to discover that the seating chart is already fixed.

Industrialization dragged us into cities and into ourselves. The modern psyche, half spectacle, half surveillance, oscillates between craving visibility and fearing exposure. Freud’s descendants help us label the oscillation, advertisers help us monetize it, and the rest of us post about it. Digital networks turned our cognitive village into a global amphitheatre where everyone speaks and nobody listens long enough to pass the ladder. We call this “networked cognition,” a charming euphemism for outsourcing memory to machines and delegating judgment to trends. We have reached the point where the town crier is automated, and the town itself is an app asking us to rate our experience of the fire while the house burns.

Perhaps the truth is embarrassingly simple. Maybe humanity was never meant to design social constructs spanning continents. Maybe our wiring was optimized for cooperative foraging, not parliamentary theatrics. Maybe the cognitive revolution was less an upgrade and more a cosmic glitch, a misfired mutation that gave primates the ability to invent bureaucracy. Thiruvalluvar would urge us to rediscover virtue and restraint. Bacon would plead for method over myth. Freud would ask us to interrogate our motives before we broadcast them. Harari would remind us that our superpower is a shared story, and that stories can hand us both tools and chains. Graeber would advise us to notice when the structure we built to help us has become the reason help cannot arrive.

A clear 21st-century example is the global response to the COVID-19 pandemic. A microscopic virus spread through a species capable of sequencing its genome within weeks and designing vaccines in under a year, an astonishing triumph of science. Yet the crisis quickly became a theatre of competing narratives. Governments argued over borders, political parties turned masks and vaccines into identity badges, and social media flooded the public square with conspiracies and counter-stories. In some places, the logistics of saving lives were slowed by bureaucratic procedures, ideological battles, and mistrust of institutions. Scientists pleaded for evidence-based method, echoing the spirit of Francis Bacon, while psychologists pointed to the fear, denial, and tribal thinking that Sigmund Freud might have recognized. Meanwhile, the crisis revealed how global coordination depends on shared beliefs, much as Yuval Noah Harari argues. Humanity possessed the tools to solve the problem, but our stories about power, identity, and authority often made the solution harder to reach.

Perhaps the quiet absurdity of our age is that competence whispers while confidence campaigns. The thoughtful hesitate, the theatrical govern. Our institutions resemble a ladder with missing rungs, still ceremonially displayed, endlessly discussed, but rarely repaired by the few who actually know how to climb. So, we polish speeches, redesign platforms, and issue declarations of progress while the roof continues its patient leaking. Civilization becomes a ritual of announcing solutions rather than practicing them. Meanwhile, the squirrels, unburdened by ideology, bureaucracy, or televised debates, solve the practical problem of winter with an efficiency our committees might envy.

The tragedy is not that humanity dreamed boldly, but that the ape who dreamed it was God occasionally forgot it was still an ape. Power magnifies the illusion, language decorates it. And so, we continue negotiating over narratives while the scaffolding of reality creaks beneath us. As Friedrich Nietzsche warned, “He who fights with monsters should look to it that he himself does not become a monster.” Until humility climbs the ladder before ambition does, progress may remain what it too often is, a press conference about repairs rather than the quiet work of fixing the roof.

Cheers.

ravivarmmankkanniappan190810032026 3.0567° N, 101.5851° E

© All Rights Reserved

 


Saturday, 7 March 2026

THE ITCH OF WAR: FROM KURUKSHETRA TO HORMUZ

 

(AI Generated Image)

War rarely begins with grand strategy or noble declarations. More often, it begins with something far smaller and far more human. Imagine an itch, an irritation that refuses to go away. One person feels it first, perhaps pride wounded, ego bruised, grievance unresolved. Instead of calming the irritation through restraint, reflection, or compromise, he provokes another. Soon the second person begins scratching as well. What started as a private discomfort becomes shared agitation. Retaliation follows retaliation, and the scratching becomes a spectacle. Others join in, to defend honour, to settle scores, or simply because conflict has a way of pulling spectators onto the stage. Before long, the original irritation is forgotten, yet the pain has spread everywhere. That, in essence, is how wars often grow, not merely from necessity, but from unchecked impulses and the human tendency to export one’s own unrest.

A striking illustration of this dynamic appears in the ancient Indian epic Mahabharata and the catastrophic Kurukshetra War. The conflict did not begin with armies marching across plains; it began with humiliation, envy, and pride. The rivalry between the Pandavas and the Kauravas escalated through insults, manipulation, and the infamous dice game in which power, honour, and dignity were gambled away. The public humiliation of Draupadi transformed a palace dispute into a moral crisis that demanded redress. What might have remained a family quarrel hardened into an existential struggle involving kingdoms across the subcontinent. By the time diplomacy failed, the original grievances had become secondary. Pride, vengeance, and the perceived need to restore honour had already set the stage for a war that would devastate an entire generation.

History shows that this pattern repeats itself with uncomfortable regularity. Conflict is rarely spontaneous; it usually emerges within larger cycles of power, insecurity, and shifting influence. When dominant powers sense their authority weakening or their economic foundations wobbling, strategic anxiety tends to rise. Military posturing becomes more visible, statements grow sharper, and warships suddenly begin what might politely be described as “presence missions.” Aircraft carriers do not wander oceans by accident. They are floating signals. When global power feels uncertain, the world often witnesses a season of muscle flexing disguised as diplomacy.

This dynamic is not new. In the nineteenth century, Britain and Russia engaged in a prolonged geopolitical rivalry in Central Asia that later became known as “the Great Game.” The term was first used by Captain Arthur Conolly of the British East India Company’s Bengal Light Cavalry in the 1840s to describe the strategic contest unfolding across Afghanistan, Persia, and the Central Asian Khanates. Later, Rudyard Kipling’s novel Kim gave the phrase its romantic and mysterious aura, portraying a shadowy world of spies, agents, and imperial manoeuvrings. Behind the literary drama, however, the Great Game was simply two empires attempting to secure influence, buffer zones, and strategic advantage without triggering a full-scale war between themselves.

What is unfolding today in the Middle East resembles a far more dangerous version of that rivalry. Observers increasingly describe the current crisis as a “New Great Game,” but the comparison is only partially accurate. The nineteenth-century contest revolved largely around territory and imperial boundaries. The modern one revolves around regime survival, strategic deterrence, economic choke points, and global alliances that stretch far beyond the region itself.

The present escalation began dramatically at the end of February 2026, when the United States and Israel launched coordinated high-intensity strikes against Iranian political, military, and nuclear infrastructure. The operations, reported as large-scale precision campaigns, targeted command centres, missile facilities, and key figures within Iran’s leadership. Reports from multiple outlets indicated that the attacks killed Iran’s Supreme Leader, Ayatollah Ali Khamenei, along with several senior military commanders and government officials. Iranian authorities later confirmed the deaths and declared a national mourning period.

This moment represented a decisive break from the shadow war that had defined US-Iran tensions for decades. Until then, confrontation largely occurred through proxies, cyber operations, covert sabotage, and limited missile exchanges. Directly targeting the leadership of the Iranian state crossed a threshold that previous administrations had avoided. The strategic logic behind the strike appeared to be the classic doctrine of overwhelming force: cripple the command structure quickly and create an internal political shock large enough to weaken the regime itself. Officials in Washington framed the operation partly in those terms, suggesting that the Iranian population should seize the moment to reclaim political control from its ruling system.

But wars rarely unfold according to the tidy logic of strategic planners. Iran responded with immediate retaliation, launching waves of drones and ballistic missiles at American installations and allied states across the Gulf. The scale of the response was notable not only for its intensity but for its geographic reach. Missiles and drones targeted locations in Qatar, Bahrain, the United Arab Emirates, Kuwait, and Saudi Arabia, while some strikes extended toward Cyprus, Turkey, and Azerbaijan. Explosions were reported near major infrastructure hubs, including ports, energy terminals, and military bases. In geopolitical terms, Iran was sending a blunt message: if its regime was threatened, the entire regional system would feel the shock. Interestingly, Iran’s approach relies heavily on cheap, expendable drones, frequently referred to as Shahed-136 and Shahed-131 kamikaze drones. Estimates place production costs between $20,000 and $50,000 per unit, a fraction of the cost of more sophisticated US and Israeli systems like the Patriot missile or Israel’s David’s Sling and Arrow-3 interceptors, which can range from $1 million to over $3 million per launch. By leveraging affordability and sheer numbers, Iran can project strategic disruption without the enormous financial burden of high-end missile exchanges, turning cost asymmetry into a tactical advantage.
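To make that cost asymmetry concrete, here is a minimal back-of-envelope sketch in Python. It uses only the rough ranges quoted above ($20,000–$50,000 per drone, $1 million–$3 million per interceptor launch); these are ballpark figures from public reporting, not verified procurement data.

```python
# Back-of-envelope cost-exchange calculation using the rough
# estimates quoted in the text (not verified procurement figures).

DRONE_COST_USD = (20_000, 50_000)              # Shahed-131/136, estimated cost per unit
INTERCEPTOR_COST_USD = (1_000_000, 3_000_000)  # Patriot / David's Sling / Arrow-3 class, per launch

def drones_per_interceptor(drone_usd: float, interceptor_usd: float) -> float:
    """How many drones an attacker can field for the price of one interceptor."""
    return interceptor_usd / drone_usd

# Least favourable case for the attacker: priciest drone vs cheapest interceptor.
low = drones_per_interceptor(DRONE_COST_USD[1], INTERCEPTOR_COST_USD[0])
# Most favourable case: cheapest drone vs priciest interceptor.
high = drones_per_interceptor(DRONE_COST_USD[0], INTERCEPTOR_COST_USD[1])

print(f"For the price of one interceptor, an attacker can field "
      f"roughly {low:.0f} to {high:.0f} drones.")
# => For the price of one interceptor, an attacker can field roughly 20 to 150 drones.
```

Even at the conservative end of those ranges, the defender spends an order of magnitude more per engagement than the attacker, which is precisely why sheer numbers become a strategy in themselves.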

The escalation deepened when Iran announced the closure of the Strait of Hormuz, one of the most critical maritime chokepoints in the global economy. Roughly a fifth of the world’s oil and a substantial portion of liquefied natural gas normally transit through that narrow passage between the Persian Gulf and the Gulf of Oman. Declaring the strait closed, and threatening vessels attempting passage, instantly disrupted global energy flows. Hundreds of tankers were stranded or forced to reroute. Insurance firms began withdrawing coverage for shipping in the region. The economic ripple effects spread quickly through global markets.

At the centre of the crisis now stands a clear strategic confrontation between the United States and Israel on one side and Iran on the other. Washington and Tel Aviv appear to be pursuing objectives that go well beyond slowing Iran’s nuclear ambitions. The pattern of strikes suggests a broader effort to degrade Iran’s missile capabilities, dismantle the network of allied militias often described as the “Axis of Resistance,” and limit Iran’s ability to project influence across Lebanon, Syria, Iraq, and Yemen. Israeli leadership has emphasized that the war is intended to be decisive rather than permanent, though history offers little reassurance that conflicts launched with such confidence remain contained.

Meanwhile, other major powers are behaving with notable caution. Russia and China both condemned the strikes and called for emergency discussions at the United Nations, yet neither has shown serious interest in entering the conflict militarily. Their restraint is not altruism; it is calculation. Russia remains heavily engaged in its own war in Ukraine and has little appetite for a second direct confrontation with the United States. China, while deeply dependent on Middle Eastern energy supplies, prioritizes stability above ideological alignment. India, for its part, is walking a delicate line. New Delhi relies heavily on Gulf energy imports and maintains strategic partnerships with both Washington and Tehran, making overt support for either side risky. As a result, India has largely called for de-escalation and dialogue, emphasizing diplomacy while quietly managing its energy security and regional influence. An open war involving great powers would threaten precisely the economic and strategic stability that Beijing and New Delhi alike rely upon.

That does not mean Iran stands entirely alone. Diplomatic backing, intelligence sharing, technological assistance, and strategic coordination are all possible forms of indirect support. Iranian officials have hinted at receiving “political and other assistance” from both Moscow and Beijing, though the ambiguity appears intentional. In geopolitics, uncertainty itself can function as a strategic tool.

Perhaps the most uncomfortable position belongs to the Gulf monarchies. Countries such as Saudi Arabia, the United Arab Emirates, and Qatar host major American military bases while simultaneously depending on regional calm to sustain their economic growth. That dual reality places them directly in the crossfire. Iranian missile and drone attacks have already struck installations in several of these states, including Qatar’s Al Udeid air base and key port infrastructure in the UAE and Bahrain. At the same time, disruptions to shipping through the Strait of Hormuz threaten the very energy exports that underpin their economies. In effect, they are both partners and potential victims in the same strategic arrangement.

There is a historical echo here that stretches far back into antiquity. When Xerxes I of Persia, the self-styled “King of Kings,” pushed the Achaemenid Empire westward, his campaign was driven not only by expansion but by the need to reinforce imperial authority after internal revolts in provinces like Egypt and Babylon. His massive expedition against the Greek world required extraordinary engineering feats, such as the bridge across the Hellespont, and culminated in decisive setbacks at the Battle of Salamis. Empires often expand outward when internal pressures rise, projecting strength to consolidate authority at home. The comparison is imperfect, yet the pattern is familiar: a hegemonic power attempts to maintain dominance across a vast strategic theatre while managing political strain, logistical limits, and unpredictable resistance.

The crisis differs from the 19th-century Great Game in several important ways. First, the struggle today is less about territorial control than about control over infrastructure and systems: energy routes, financial networks, cyber capabilities, missile defences, and global supply chains. A single disruption in the Strait of Hormuz can shake energy markets from Tokyo to London in a matter of hours.

Second, the battlefield now includes powerful non-state actors. Groups such as Hezbollah and various regional militias extend the reach of state power while maintaining enough ambiguity to complicate retaliation. Their involvement blurs the line between conventional war and proxy conflict, increasing the risk of escalation across multiple fronts simultaneously.

Third, and perhaps most significant, the consequences are global rather than regional. Energy prices surge, stock markets wobble, and shipping routes stretch thousands of miles longer as vessels avoid conflict zones. Insurance companies withdraw coverage, logistics networks slow down, and the spectre of recession begins to hover over distant economies that have no direct involvement in the fighting.

All of this makes the modern “Great Game” less like a chess match and more like a complex web of dominoes. One move rarely affects only one square on the board.

The role of media further complicates how people understand these events. If you really want to experience modern warfare without leaving your sofa, forget streaming platforms and try channel surfing instead. Start with CNN, where the graphics move fast, the music is urgent, and the strikes often sound like decisive acts of strategic necessity. Then switch to Al Jazeera and watch the same event transform into an entirely different film: suddenly the language shifts, the victims have names, and the missiles look less like strategy and more like devastation. For an added intellectual workout, flip to the BBC, which usually delivers the most carefully balanced version, a calm voice explaining that everything is “deeply concerning,” followed by a panel discussion that politely circles the issue without quite landing anywhere. By the time you finish rotating through the channels, you will have watched three different movies about the same war, complete with heroes, villains, tragedy, and suspense. It is better than any thriller: geopolitics, emotion, moral ambiguity, and contradictory plotlines, all broadcast live.

And yet, for most people far from the front lines, daily life continues. Bills must still be paid, children still go to school, friends still gather, and football debates remain as heated as ever. The spectacle of geopolitics often unfolds on screens rather than streets. For most of the world’s population, war exists as background noise, distant, dramatic, and strangely abstract. The evening news may flash images of missiles and burning infrastructure, but outside the window the bus still arrives, shops still open, and someone somewhere is passionately arguing about the weekend match. Life has a stubborn rhythm that refuses to pause simply because history is making noise elsewhere.

Yet that distance is fragile. Every global conflict begins somewhere specific, one border, one grievance, one decision taken in a quiet room by a handful of leaders. But the consequences rarely remain contained. Energy prices rise, alliances shift, economies tremble, and narratives harden. The real danger lies not only in the violence itself but in the human tendencies that ignite it, pride that refuses compromise, leaders who gamble with escalation, and societies that mistake retaliation for justice. When those forces converge, the irritation spreads outward like ripples in water. The scratching multiplies.

And eventually, everyone feels the itch.

Still, human history has never been defined solely by conflict. It is also defined by resilience, courage, and the stubborn refusal to surrender to fear. The spirit of that resilience is captured beautifully in the words of Subramania Bharati (aka Mahakavi, meaning the ‘Great Poet’, a 20th-century revolutionary, patriot, and social reformer), whose famous poem Achamillai Achamillai reminds us that uncertainty and danger have never been enough to stop the human will to live freely:

“அச்சமில்லை அச்சமில்லை அச்சமென்பதில்லையே
இச்சகத்துள் ஒருவனுக்கு இந்நிலையே அமையுமோ?
அச்சமில்லை அச்சமில்லை அச்சமென்பதில்லையே.

உச்சிமீது வானிடிந்து வீழுகின்ற போதினும்
அச்சமில்லை அச்சமில்லை அச்சமென்பதில்லையே.

அஞ்சுவது யாதொன்றும் இல்லை,
அஞ்சி நிற்கும் காலமும் இல்லை.”

(an excerpt from Subramania Bharati’s poem “The Fear”)

“There is no fear, there is no fear, there is no such thing as fear.
Can such a state truly exist for a human in this world?
Yet I say again, there is no fear, there is no fear.

Even if the sky itself were to shatter
And come crashing down upon my head,
Still there is no fear, there is no fear.

There is nothing in this world to be afraid of,
Nor is this a time to stand trembling in fear.”

Bharati was not naive about the dangers of the world; he understood them deeply. But his message was clear: humanity must not surrender its courage, its dignity, or its hope. Wars may erupt, powers may compete, and the machinery of geopolitics may grind loudly in the background. Yet ordinary life continues precisely because people refuse to let fear define the boundaries of their existence.

So, while empires manoeuvre and alliances shift, people will still gather around dinner tables, argue about football, plan their futures, and teach their children to dream of a better world. History may move in storms, but humanity moves in hope.

Cheers.

ravivarmmankkanniappan205307032026 3.0567° N, 101.5851° E

© All Rights Reserved.

Friday, 23 January 2026

WHERE TIME BOWS TO IMPERMANENCE: A Lineage of Faith, Memory, and the Living Dhamma

 

The Buddhist Maha Vihara
(Brickfields, Kuala Lumpur)

Beneath the quiet dignity of the Buddhist Maha Vihara in Brickfields, founded in 1894 and now enclosed by the glass, steel, and ceaseless motion of a modern city, time itself appears to pause. The world beyond presses forward with urgency and ambition, yet within these grounds, stillness abides. Perhaps it is so because where the Dhamma is the very purpose of existence, time relinquishes its authority.

This sacred place stands not merely as architecture fashioned of stone and cement, but as a living continuum of faith sustained through intention, sacrifice, and unwavering devotion. My wife, Greeja, traces her lineage to her great-great-grandfather, Mr. Udanis, among the earliest Sinhalese settlers in Malaya, whose efforts helped establish and nourish the early flowering of the Buddha’s Dispensation in this land. What was once a fragile seed, planted with faith and perseverance, has endured through generations, taking firm root and maturing into the thriving Buddhist congregation that exists in Kuala Lumpur today, among whom is the De Silva family, to which Greeja belongs.

Today, however, our presence here is of a more intimate and solemn nature. We have come to offer prayers and merit in remembrance of Greeja’s dearly departed aunt, Madam Lalitha Pathmalata De Silva. In this act of recollection and offering, the temple becomes a mirror, reflecting back to us the fundamental truth proclaimed by the Blessed One:

“All conditioned things are impermanent.”
(Sabbe saṅkhārā aniccā)

This truth was not taught by the Buddha as a matter of abstraction but revealed through compassion grounded in wisdom. Once, a woman named Kisā Gotamī, distraught by the death of her only child, approached the Buddha carrying the child’s lifeless body, imploring him for medicine to restore him to life. Seeing her sorrow, the Buddha neither dismissed her anguish nor fed her despair with false hope. Instead, he asked her to bring a mustard seed obtained from a household untouched by death.

With faith in his words, Kisā Gotamī went from door to door. Mustard seeds were readily given, yet in every household she encountered the same truth, a parent lost, a spouse mourned, a child remembered. There was not a single home free from death. Through this quiet pilgrimage, her grief was gradually transformed. What had been borne as a private tragedy was revealed as the universal condition of all beings subject to birth. Returning to the Buddha, she understood that what arises must pass away, and that clinging to what is impermanent is itself the root of suffering.

So too does loss remind us, gently, yet unmistakably, that life is fleeting, that those we love are entrusted to us only for a time, and that all compounded things are in ceaseless change. Yet within this truth there is no call to despair. As the Buddha taught Kisā Gotamī, within impermanence lies the ground for wisdom, restraint, and compassion. The Dhamma does not ask us to deny sorrow, but to see clearly the nature of existence and to live in a way that is blameless, mindful, and generous.

Thus, amid the fragrance of incense and the measured cadence of chanting, surrounded by generations of devotion and the quiet certainty of change, we are reminded why the path matters. Not to stand against impermanence, but to understand it, and not to cling to what must pass, but to cultivate what does not decay. In aligning the heart with the Dhamma, one learns to meet arising and passing away with wisdom, dignity, and peace.

SADHU … SADHU … SADHU

ravivarmmankkanniappan@154024012026 3.12786° N, 101.68679° E

©ravivarmmank

Saturday, 10 January 2026

“The Neanderthal Paradox” - Outer Progress, Inner Regression

 

AI Generated Image

When Neanderthals shaped stone into knives and spears, they weren’t just making tools; they were externalizing thought. Each strike against stone reflected judgment, foresight, and risk. A poorly made spear meant hunger or death. Tool making, then, was not convenience; it was cognition made visible. That is why early tools marked a genuine progression in human development: they extended imagination without replacing it.

This raises a troubling question today. If stone tools signalled human ascent, does artificial intelligence signal another evolutionary leap or a quiet regression to a new kind of prehistory, where thinking itself is outsourced?

At first glance, the arc of history seems clear. Tools evolved from survival aids to instruments of comfort, then to systems of efficiency. What began as necessity slowly became desire. Discovery was once driven by hunger and danger, but now it is driven by optimization and convenience. Yet this shift has altered not just what we make, but how we think.

To understand this transformation, it helps to briefly align a few thinkers, not as authorities, but as lenses.

Charles Darwin explains the biological groundwork. From his perspective, tool use is an evolutionary advantage, not a moral or historical turning point. Humans who could cooperate, imagine, and manipulate objects survived better. Tools followed intelligence; they did not direct it. Darwin’s account is powerful but limited: it explains how tool making emerged, not how tools later came to reorganize human life.

Friedrich Engels fills that gap. For him, labour and tool making were not passive outcomes of evolution but active forces shaping the human hand, brain, language, and society. Tools didn’t just help humans survive; they helped create humans as conscious, social beings. Here, tool making is transformative, not merely adaptive.

Karl Marx extends this insight into history. Tools become “means of production,” and whoever controls them controls social life. Technological progress, Marx argues, restructures society and concentrates power. Tools amplify productivity, but under certain systems they also alienate humans from their own creative capacities. Progress outward, impoverishment inward.

Yuval Noah Harari updates this story for the present. What distinguishes modern humanity, he suggests, is not tools alone but shared imagination: myths, money, laws, and now algorithms. Today’s tools are no longer just physical objects; they are systems of belief encoded in software. AI, financial models, and data infrastructures don’t just assist decision-making; they define what counts as a decision.

At this point, a pattern becomes visible. Human development is not driven by biology alone, nor labour alone, nor economics alone, but by their interaction with imagination. Tools once expanded imagination. Now they increasingly replace it.

Ancient traditions sensed this risk intuitively.

In Indian thought, craft (śilpa) was never just mechanical skill. The Śilpa Śāstras treated toolmaking as disciplined knowledge aligned with cosmic order and ethical purpose. Even Vedic metaphors compared crafting an object to crafting a thought, both acts of mindful construction. Action without reflection was never idealized.

Greek philosophy made this distinction explicit through technē. Plato warned that writing, an early cognitive tool, could weaken memory by externalizing it. Aristotle valued technē but insisted it be guided by phronēsis, practical wisdom. Tools were legitimate only when governed by judgment and ethical ends.

In both traditions, tools were subordinate to inner clarity. Thought preceded action. Skill served wisdom.

Modern technological society reverses this order.

Today, tools do not merely help us think; they structure how thinking happens. Recommendation algorithms decide what we read. GPS decides how we navigate. AI copilots draft our emails, summarize our meetings, and increasingly suggest what decisions to make. None of this is coercive. That is precisely the danger. Dependence arrives disguised as ease.

Martin Heidegger foresaw this condition. He warned that modern technology is not neutral; it “enframes” reality, turning everything, including humans, into resources to be optimized. Under this logic, thinking becomes calculative rather than contemplative. We learn how to operate systems fluently while losing the habit of questioning their purpose.

You can see this everywhere. University students rely on AI not to test ideas, but to avoid struggling with them. Professionals follow dashboards and metrics without understanding what is being measured or why. Social media platforms optimize “engagement,” subtly shaping attention spans, desires, and outrage cycles, while users feel more informed than ever. Judgment hasn’t vanished; it has been deferred.
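To see how innocuous that optimization looks from the inside, consider a deliberately toy sketch of an engagement-ranked feed. Everything here is invented for illustration, the fields, the weights, the posts; real platforms use far more elaborate learned models, but the structural point is the same: the ranking objective, not any editor, decides what surfaces.

```python
# A deliberately toy model of an engagement-ranked feed. All fields,
# weights, and posts are invented for illustration; real platforms use
# far more elaborate learned models.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float   # hypothetical model output, 0..1
    predicted_outrage: float  # hypothetical emotional-arousal proxy, 0..1

def engagement_score(post: Post) -> float:
    # Arousing content keeps people scrolling, so an engagement-maximizing
    # objective implicitly rewards it; no one "decided" that outcome.
    return post.predicted_clicks * (1.0 + post.predicted_outrage)

feed = [
    Post("Calm, careful explainer", predicted_clicks=0.30, predicted_outrage=0.05),
    Post("Outrage-bait hot take",   predicted_clicks=0.25, predicted_outrage=0.90),
]

# Rank the feed purely by the objective, highest score first.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.3f}  {post.text}")
# The hot take ranks first (0.475 vs 0.315) despite fewer predicted clicks:
# the optimization target, not any editor, decides what surfaces.
```

Nothing in the sketch is malicious; the outrage simply falls out of the objective, which is exactly the quiet deferral of judgment the essay describes.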

Hannah Arendt helps explain the moral consequence. In her analysis of thoughtlessness, she showed how responsibility dissolves when individuals stop thinking and start merely following processes. Today’s conformity is not enforced by authority but by systems. “The algorithm recommended it.” “The model decided.” Obedience has become procedural.

Herbert Marcuse sharpens the critique. Technological societies, he argued, produce the “one-dimensional” human: highly capable within systems, yet incapable of imagining alternatives. This is not ignorance; it is a narrowing of possibility. A person may optimize workflows flawlessly and still struggle to ask whether the workflow should exist at all.

Ancient wisdom offers a counterpoint. The Thirukkural insists that action must be preceded by reflection:

“எண்ணித் துணிக கருமம்; துணிந்தபின்
எண்ணுவம் என்பது இழுக்கு.” - Kural 467

“Think carefully before acting; once resolved, wavering is weakness.”

Here, dignity lies in judgment, not execution. When action becomes automated and thought outsourced, efficiency increases, but agency erodes.

This is where the Neanderthal comparison becomes illuminating rather than insulting. Neanderthals lived amid uncertainty. Every tool demanded engagement, improvisation, and risk. Their tools expanded human capability without replacing human responsibility.

Modern humans, surrounded by vastly superior tools, risk becoming cognitively passive. We execute without originating, optimize without imagining, comply without questioning. The regression is not biological; it is existential.

The danger of AI and advanced technology is not that machines will become human-like. It is that humans may become machine-like: precise, efficient, obedient, and inwardly hollow. Civilization advances outward while retreating inward.

True progress is not measured by the intelligence of our tools, but by the vitality of the minds that wield them. When tools assist imagination, humanity advances. When tools replace imagination, humanity regresses, quietly, comfortably, and with great efficiency.

That is the real question before us: not whether AI can think, but whether humans will continue to do so.

Note:
This writing was inspired by my friend Rajender, who posed this question a few days ago.

ravivarmmankkanniappan@155111012026 3.0567° N, 101.5851° E

©ravivarmman