Tuesday, 12 August 2025

BOOK NOTE - The Blade Itself by Joe Abercrombie


As a fan of fantasy books, I have grown accustomed to the slower pace, expansive timelines, and numerous characters that define the great works in this genre. Think of G. R. R. Martin’s multi-volume A Song of Ice and Fire series, which I have read in full and in the right order. Think of the 20+ volume Sword of Truth series by Terry Goodkind, of which I have managed only three—starting somewhere in the middle. Or the mind-numbingly expansive Malazan Book of the Fallen series by Steven Erikson, which I am reading in chronological order but have only reached book four of the 25+ instalments.

A common thread in all these is the sheer expanse of the universe—the people, lands, and magical systems. They can be enchanting, transporting you to different worlds, complete with maps and side stories to disappear into. But they can also be overwhelming; just keeping pace with their scale and complexity takes effort. The Malazan series in particular is so vast and intricate that it can be difficult to get through—and I mean that in a good way. Given my interest in both fiction and non-fiction, committing to such sprawling series can limit the variety of my reading life.

Enter the likes of Patrick Rothfuss’s Kingkiller Chronicle and Joe Abercrombie’s First Law trilogy—both great fantasy series, but relatively faster-paced and, in subtle ways, less demanding on one’s reading cadence.

Abercrombie’s The Blade Itself is a fine example of this. I had been meaning to get to it for at least two years, ever since it was recommended by someone with a good eye for epic fantasy, and I am glad I finally did.

The worldbuilding is both brisk and evocative. The familiar fantasy tropes are here—the vain and prickly hero, the noble brute, the wise wizard, the tragic mentor, the cheeky dame in distress, and the evil forces from beyond the borders and within. There is magic that is taboo, historical figures assembling a fellowship, and wars and beasts at the periphery. But what I especially enjoyed was that the politics and motivations felt very contemporary. Perhaps it is Abercrombie’s language, but unlike Malazan, which feels truly ancient, the characters in The Blade Itself often seem modern. The sense of the ancient is there, but the events feel distinctly of our time.

It is a fine balance to strike. Lean too modern and it starts to feel like sci-fi with magic—a tone better suited to young adult vampire dramas on TV.

One example, without spoilers: the book features merchant guilds that are ruthless capitalist empires, sanctioned by the King. Over time, they have grown in wealth and influence, overtaking the old-money aristocrats. This sets the stage for a different kind of “eat-the-rich” revolution—not from the poor masses, but from the royal and erstwhile nobility. Many fantasy novels I have read have powerful merchants, usually individuals, but I cannot recall another that plays with this almost post-capitalist dynamic. For that alone, the book deserves credit.

A great read, highly recommended.

Sunday, 3 August 2025

Telling stories from the future


Stories from the future are always interesting.

Yes, you read that right. Stories from the future can be told today, but they are not always what you think they are. They might fall under science fiction or fantasy, but beyond that, they offer us a way to imagine what lies ahead, engaging both our creative and cognitive faculties. They serve an important role in helping us visualize plausible worlds.

I loved reading “The Story of You: What might Singapore look like for those born today?” in the 8 August 2025 edition of The Straits Times. As the summary puts it, the piece envisions life for a child born in 2025, projecting all the way to 2105 and beyond.

Drawing on current trends and data, as well as interviews with 19 experts, The Straits Times envisions one speculative and possible future for the first members of Generation Beta who are born in 2025, as part of its Born Tomorrow series.


The Story of You, The Straits Times, 8th August 2025.

Read Here

This kind of speculative futures work, with storytelling at its core, is not prophecy, nor is it pure science fiction. It’s a fascinating discipline at the intersection of science, art, and management that helps us imagine what could be. And rather than predict the future, it helps us prepare for it. We should take this seriously. It’s not just a story emerging from an inventive mind; it emerges from a structured method, one based on trends, uncertainties, and expert insights, that explores the edges of science and culture.

Of course, any attempt to think about the future is always constrained by the past. As futurist Arthur C. Clarke once said, “The most daring prophecies seem laughably conservative.” I did feel this article didn’t push the boundaries of imagination enough; it’s rather cautious, even. But it’s still heartening to see this kind of scenario thinking becoming more mainstream, rather than being confined to academic circles.

Interestingly, I had written a similar imagined future scenario back in 2016: "October 10th, 2030". It was my own small experiment in speculative storytelling - combining data, trends, and imagination.

And now, with the support of AI, this process of informed ‘hallucination’ and envisioning possible futures can become far more accessible.

Maybe in this field, some of AI’s hallucination problems could actually be a feature, and not a bug.

Banner Image Credit © Edisaacs | Stock Free Images

Sunday, 20 July 2025

Pulp fiction - fast plots, timeless lessons



Books come in various styles. One of my favourites is pulp fiction. What do I mean by pulp fiction? It is the genre that usually consists of murder mysteries, crime novellas, espionage stories, lawyers solving crimes, and some adventure novels. The common thread across all of these is that they are fast paced, with mostly morally ambiguous characters—either trying to commit a crime, hide a crime, solve a crime, or falling victim to one.

Some of the great authors of all time were pulp fiction writers. Legendary writers like Ray Bradbury and Philip K. Dick started off writing pulp stories in magazines before they wrote their famous works. But there is an entire league of authors who are great pulp fiction writers, full stop. The best among them, in my view, is James Hadley Chase. Nothing beats a good Chase potboiler when you have a few hours and do not want anything that taxes the mind.



Classic pulp noir: the seductive femme fatale and the hard-boiled detective, caught in a moment of danger, desire, and double-cross.


The hallmark of a great James Hadley Chase book is that the mystery is no mystery at all. As the reader, you usually begin the story already knowing who the criminal is and who the slightly better-off person chasing the criminal might be. The suspense lies in the story and in how the game of cat and mouse plays out. Chase’s characters are never entirely good or bad. They live real lives and are full of contradictions. Sometimes they are hard-boiled criminals with a cruel edge who kill, torture, and commit senseless crimes. But the great author places them in situations and storylines that have you rooting for these terrible characters—and wondering why. And long before George R. R. Martin became famous for killing off his main protagonists (I still cannot believe Ned Stark died. Genius), James Hadley Chase was routinely offing some of the best characters—the ones you were kind of rooting for. But clearly, the little bit of goodness they had in them is what most likely got them killed.

As an avid reader, I often find myself reading many books at once. I usually try to finish every book I start, but sometimes, between great literature, non-fiction, and other average or hard-to-read titles, the reading mind needs some dessert. Enter a pulp novel. It is usually about 200 pages long, fast paced, and filled with action. I can usually finish a pulp book in less than three days, sometimes overnight. But it does the trick—it gets me motivated to return to my more substantial reading material. Pulp fiction is the original Instagram Reels or YouTube Shorts of the analog world. And like their modern equivalents, they can be addictive too—but without any of the side effects that come with doomscrolling.

I have been using books as a way to stay away from consuming media incessantly, and to resist what I call "dopamitis"—a constant need for stimulation. Pulp books help a lot to inject some good-natured fun and adventure.

In the last six months alone, I have read many pulp fiction books. The most recent ones are The Vulture is a Patient Bird by James Hadley Chase and The Best Laid Plans by Sidney Sheldon. These books are among the best by their respective authors and feature all the well-loved pulp tropes—murders, gunfights, damsels in distress, sneaky villains, handsome men, and of course, lots of gratuitous sexy stuff.

Other recent reads include A Case of the Negligent Nymph, a Perry Mason mystery by Erle Stanley Gardner, and Well Now, My Pretty by James Hadley Chase. As a teenager, and well into my twenties and thirties, I must have read at least 50 to 100 great pulp novellas, including series like Nick Carter: Killmaster.

Pulp books are excellent material to learn the art of storytelling from. They are mass market and commercial, and often demonstrate best practices in how to capture a reader’s attention. The genre is also very good at subverting traditional story structures like the Hero’s Journey, often twisting them into darker or more ironic forms.



A familiar pulp setup—tension, beauty, and danger—but with just enough ambiguity to leave you questioning who holds the upper hand.


What pulp fiction does well is to understand its audience. Historically, pulp writers published in magazines, and stories were often serialised. This created a feedback loop. Authors learned from how readers responded to each instalment, and that shaped the next one. Pulp fiction was a consumer product. Unlike literary writers, who had to rely on pure content and critical reception, pulp writers were always responsive to their readers. In that sense, pulp fiction offers great lessons for marketing and advertising, which also work best when grounded in consumer insight.

That said, good pulp fiction did not necessarily cater to the lowest common denominator. While there was plenty of that, the truly standout works knew how to hook readers, keep the pace, and land the ending in a way that kept them coming back. The tools they used were consistency and clear positioning. Perry Mason was about the wily defence lawyer outsmarting the justice system to prove his client’s innocence against overwhelming odds. He usually did this by revealing the actual killer in dramatic courtroom scenes. His clients were often attractive but morally ambiguous young women. Nick Carter was Killmaster, a globe-trotting spy in exotic locales with damsels in distress and gadgets galore. James Hadley Chase told stories of criminals clashing with other, slightly less evil criminals. The bounty was always something attained at great personal cost.

By codifying these tropes, the great pulp fiction writers ensured they kept their audiences coming back for more.


The drama of pulp lies not just in violence but in mystery, tension, and the unknown—books and bullets, both loaded


And finally, one piece of writing or marketing wisdom that can be drawn from good pulp fiction is how the writing leaves much to the reader’s imagination. This, to me, is the master stroke. Great writers know that each reader’s mind is fertile ground for imagination, and they use that to their advantage. The prose is tight and sharp, allowing the reader to imagine what something might feel or look like. By not being overly descriptive and trusting the reader to fill in the gaps, great pulp stories plant seeds in the reader’s mind and immerse them deeply in the experience. That feeling is probably only ever replicated in good cinema.


Note: Illustrations in this post, excepting the book covers, were generated by ChatGPT using AI image tools to evoke the visual spirit of classic pulp fiction.

Tuesday, 8 July 2025

BOOK NOTE - Tomorrow, and Tomorrow, and Tomorrow by Gabrielle Zevin


One of the advantages of a National Library subscription is that I get to read many books for free. But a persistent disadvantage is that it seems to drive me to read fast—sometimes way too fast—just to stay within the loan period. Especially since you don’t always get to extend the loan on a book, and you can end up stuck on the waitlist for an indefinite amount of time. This isn’t great for the reading experience. It doesn’t let you reflect or even enjoy the book fully.

I read Gabrielle Zevin’s Tomorrow, and Tomorrow, and Tomorrow in under seven days because my ‘Skip the Line’ loan only allowed me to keep the book for that long. The eBook, at about 400 pages, isn’t meant to be a fast read. But the story was compelling and engaging enough that I ended up finishing it. It also helped that I had two flights during that period—nothing like the internet blackout on planes to force some quality reading time. Frankly, that’s one of the best things about flights, and I hate that some now offer Wi-Fi.

NOTE: Spoilers ahead.

The book is about gamers and game developers, which connected well with me. It is also a book about friendship and love. But honestly, the main characters are hard to like. They’re depressed, sometimes psychotic, and go through or inflict a fair bit of mental torture.

Sam, one of the protagonists, has endured several emotional traumas: his parents aren’t together, he witnesses a freak suicide at a young age, and he’s in a horrific car accident that kills his mother and injures his leg badly. Despite all this, he remains broadly positive—rightfully a little reserved.

Sadie, on the other hand, has had less childhood trauma. She comes from wealthy parents, has a sister who recovers from cancer, and turns out to be brilliant—but also more bitter. The only truly likeable character is Marx, a key player in their story. He’s almost tragically good-natured. But I liked that. The idea that someone can be consistently cheerful—and that this could be a sign of intelligence—really stayed with me.

I felt Sadie brings some of her problems on herself by getting into a torrid relationship with her married gaming professor, Dov. She pays the price by being jilted and ‘used,’ and unfortunately takes it out on her best friend Sam and their co-workers.

Overall, the characters didn’t quite connect with me. Especially with the mixed Japanese-Korean heritage of two of the leads, the whole thing started to feel very Murakami-like. The emotional turmoil, the torrid love affairs, the possessive pining—it was all very Murakami, just without the Murakami authorship.

What made the book work for me was the gaming industry storytelling, and the way computer games and their development shaped an entire generational cohort. I really enjoyed that part.

 

Monday, 30 June 2025

The fog and the mountain - how insight reveals itself



Through the fog of distractions you must go,
The Mountain's wisdom awaits at the far edge

 

Clarity and insight come mostly from dwelling on things over long periods of time.

It might feel like they appear in a moment, as if in a flash of brilliance, but they actually come when the mind has been quietly marinating in certain ideas.

For me, insight emerges from looping through a familiar mix: writing, reading, thinking while walking, connecting thoughts, listening, consuming ideas, and then putting pen to paper again.

A busy, distracted mind has no energy to create. Distraction is perhaps the single biggest barrier to creativity. Writing helps anchor the mind.

But beyond writing, there is also a need for clear frameworks that help simplify and clarify thinking. It is only after one understands—by linking one idea to another—that it becomes possible to express what something really means, and make it simple and profound.

To show what this looks like in practice, consider the following example.

I was reading a book about early 20th-century expeditions to the South Pole. One striking detail was how explorers described the grand Antarctic mountains—massive, majestic—and yet often completely hidden behind fog and snow.

That sparked a thought. It is easy to become accustomed to seeing grand vistas like the Grand Canyon or the Himalayas—accessible places with seasons that allow for clear views. But in remote places like Antarctica and the Arctic, some of the most spectacular sights may remain unseen—not just because of their remoteness, but because they’re perpetually obscured by the elements.

There was a compelling fact—fog covers monumental mountains in Earth’s remotest places. And a wishful thought: that maybe some of the most beautiful vistas on Earth will never be seen or felt by humans.

At this point, it’s not yet insight. Just a curious observation. To make something of it, one has to write it down. Then review it. Then revisit it, using a framework like the one below.

This is where the earlier point about inhabiting a thought becomes real. The framework that follows is what allows a fleeting idea to evolve into something meaningful. It is a way of staying with the thought long enough for it to reveal something new—something that moves from noticing to insight.

This framework helps move from absorbing to seeing. From gathering to generating. From skimming past something to inhabiting it.

Here’s how it works.

First, to inhabit a thought, one needs to stay longer with it. Add more stillness between inputs. This is where reflection starts to deepen. Helpful questions include:

What is the emotional core of this?
Which part of me is responding to it?
What does it evoke in me?

Second, one has to make it strange again. Be childlike and indulge in some divergent thinking. Ask:

What would a child or a weird philosopher say about this?
What is this a metaphor for?
Where else does this pattern show up?

Third, one must zoom out and give attention more weight. Let the subconscious do its quiet work. Questions that help here:

What’s the larger truth being hinted at?
How can this be said more simply?

This is how uncommon connections form. And that’s where insight begins to surface.

Returning to the Antarctica example: the emotional core of the thought was longing—for beauty, for inspiration. The part that responded was the curious inner explorer. What it evoked was a kind of FOMO—not of missing out on trends, but missing out on inspiration.

Then, making it weirder—imagining that a philosopher might say that there is beauty out there, but it’s hidden—not because it doesn’t exist, but because it cannot be seen through the fog.

From a note about unseen Antarctic mountains, it became a metaphor about inspiration hidden by the fog of distraction.

That’s when the ideas clicked. There are things that remain out of reach not because they’re distant, but because attention is clouded. This is what people experience when they have writer’s block. Or creative plateaus. Because writing requires us to go to remote places - internally. Insight is often found in the places that are off the map.

What is the larger truth? To see the sublime, one must clear space. Remove the fog.

The mountains are there for the seeing—but they finally come into view only when one clears the fog of noise, haste, and mental clutter, and dares to venture to the edges where true insight lurks.

That is the power of inhabiting a thought—not grazing past it. Insight doesn't come from more inputs. It comes from deeper attention. And writing, more than anything, helps engage the whole of one’s consciousness.

The only caution: this should not become a mechanical exercise. One must not confuse deep thinking with sounding clever. Inhabiting thoughts for longer quietly dissolves that urge. 


Sunday, 29 June 2025

BOOK NOTE - Table for Two - Amor Towles



The mark of great writing is when you can imagine a scene so vividly that, after a while, you start doubting whether it came from a book, a movie, or even a real-life experience. Especially when you are reading another book, and a scene from that earlier book plays in your mind like a movie and then shapes how you imagine the current book’s scene.

For example, in one of James Hadley Chase’s books, a couple of crooks rob a casino. I read that about a year ago. Now, as I am reading Amor Towles’ Table for Two, there is a novella where a character robs a casino, and I imagined it in Chase’s prose. It is also because of the vivid nature of stories set in Los Angeles.

Chase’s books are pulp fiction. But iconic pulp fiction. Towles’ writing is good literature. And I am sure that sometime later, I will experience Towles’ prose like a movie too, when I am reading another heist story.

Table for Two is a wonderful collection of short stories and a novella. They are characterized by Towles’ evocative prose, rich in history and culture, and written with subtle humour. Towles evokes the sense of the cities his stories are set in beautifully. Reading the ones set in New York in the 90s evokes a deep sense of nostalgia.

I had not visited New York at the time of reading; the nostalgia I am referring to is the kind shaped by consuming great New York stories—like The Fountainhead by Ayn Rand and The Great Gatsby by F. Scott Fitzgerald—and by movies like The Godfather and TV shows like Mad Men.

I loved the evocation of the old bookseller who inhabits the literary world in The Ballad of Timothy Touchett, and the simple optimism of Pushkin in The Line. Both are amazing and touching stories. But my favourite was The DiDomenico Fragment, which is brilliant in its description of old money in New York and the art world.

Sunday, 1 June 2025

BOOK NOTE - Sharing a house with the Never-Ending Man - Steve Alpert


I recently saw a YouTube short about how the 90s and 2000s in Japan were a kind of nostalgic golden age. Japan then was still the second-largest economy in the world and a major cultural influence—both on its own and through its ties with the US and the rest of the world. Japanese consumer tech was ahead of its time – think Walkman and Nintendo. Japanese cars were state of the art – Toyotas and Hondas were top-notch. Japanese management ideas – kaizen, just-in-time, etc. – were all the rage in business schools. Tokyo, especially, was the biggest city in the world, a unique blend of high-tech modernity and ancient, arcane traditions.

It’s in this setting that Steve Alpert’s book about his time at Studio Ghibli takes place. It’s an easy read, and Alpert writes with a casual pace, recounting his time as a Gaijin inside the world of Japanese office culture. He captures the eccentricity of people like Hayao Miyazaki and Toshio Suzuki, while also drawing an insightful contrast with their interactions with large American establishments like Disney and Hollywood. Alpert doesn’t take sides, but it’s clear he has a soft spot for his Japanese colleagues. He often points out how brutal and crass the American style of business could be compared to the more refined, if rigid and dogmatic, Japanese way.

I picked up this book while researching something I was writing – whether the AI threat to art is real – and wanted to understand more about Miyazaki and his potential views on the Ghiblification trend. While the book doesn’t go deep into Miyazaki’s biography, it gave me a good sense of the people, the culture, and a feel for the Ushinawareta Jūnen—the Lost Decade: that period of post-bubble economic stagnation in Japan which ironically turned out to be rich in culture.

Alpert describes the evocative scenes of central Tokyo visible from the offices of Tokuma Shoten, the publishing company behind Studio Ghibli. The passage captures the metropolitan madness that is Tokyo beautifully – almost like a scene from a Studio Ghibli movie.

Our tenth-floor office also had large corner windows with views that extended all the way out to Tokyo Bay. From my desk I could see all of Japan's various modes of transportation at a single glance. There was the shinkansen Bullet Train just slowing on its final glide toward Tokyo Station. The various color-coded JR local and long-distance lines came and went every few minutes. The newly built and driverless Yurikamome train zipped along on rubber wheels toward the Odaiba entertainment area and the Big Site convention center.
The aging but still graceful Monorail, a leftover from the 1964 Olympics, leaned precariously leftward as it rounded a curve on its way to Haneda Airport. There were the gracefully arching branches of the Shuto, the overhead highways. These were clogged with traffic that barely moved all day. Once or twice a day I could spot a ferry just easing into its berth at the Takeshiba piers after completing its twenty-four-hour trip from one of the far-away Izu-Ogasawara Islands, incongruously an official part of metropolitan Tokyo. There was the newly built Rainbow Bridge standing astride the harbor and linking it to the island of Odaiba. The bridge was silvery white in the morning sunshine or bathed in colored lights against a hazy pink and purple sky at dusk.
All day, passenger jet aircraft banked low over Tokyo Bay on their final approach to Haneda Airport. Immediately below, bustling Shinbashi's wide main streets were packed with cars and buses mired in the heavy traffic. The warren of narrow pedestrian-only alleys in the Mizu Shobai (bar) district were mostly empty in the morning and crammed with wandering pedestrians once the evening rush began. At the beginning and end of the lunch hour, which everyone took at exactly the same time, the sidewalks were full of people.

The more humorous and interesting parts are his observations of Japanese office quirks. One example that stood out was nemawashi (“securing the roots”), where everything is decided before the meeting even happens. I see this a lot in management in 2025 too. It’s just called stakeholder management now, but doesn’t sound half as cool as nemawashi. 

The process of visiting and obtaining the approval of all required persons in advance is called nemawashi (securing the roots). In this way the arguments for or against any proposal or new idea and the decision makers' positions on the proposal have all been fixed long before any formal meeting takes place. Once the nemawashi has occurred, a meeting is called to pretend to discuss the matter in question, and the attendees vote on the outcome in accordance with the positions they have previously (and privately) confirmed they would take. By the time the meeting has been called, everyone attending already knows what's been decided.

Alpert also gives a strong picture of Miyazaki’s temperament and what drives him. Miyazaki is a genius, and like most geniuses, has his own unique quirks. Working with him was tough:

Hayao Miyazaki's way of making a film was particularly stressful, and that was exactly how he thought it should be. He would often say that a person only does his best work when faced with the real possibility of failure and its real consequences.

But Alpert also shows how Miyazaki’s ideas took shape. Miyazaki was always drawing, and out of those loose sketches, something would take hold. It was a long process of trial and error in which many ideas percolated, were thrown away, and resurfaced, until slowly the one that stuck would emerge. Then, after more iterations and more stress, about two years later, a finished film, a work of art, would appear.

This reminded me of a classification I had read of how creative people think and create. Some are experimental innovators, who iterate and let ideas and trials percolate before anything is finalized. Others are conceptual innovators, whose ideas burst forth like a sudden fount. Miyazaki was clearly the first kind.

Alpert also notes how Miyazaki treated a film once it was done:

When Miyazaki signed off on one of his and it was officially done, he preferred to never think about that film again. It was done. There was nothing more he could do to improve or change it. He always wanted to be moving forward and thinking about the next film.

There’s also what I think is a glorious moment from the book when Miyazaki visits the US for Princess Mononoke’s release. Hollywood royalty, including Martin Scorsese, sends an invite to meet him for after-dinner drinks. To the horror of the American studio executives, who think this is a great honour, Miyazaki and Suzuki decline politely and go hang out with a Japanese architect friend instead. This prioritization of personal interests and artistic integrity over commercial networking is a recurring theme in the book.

The book gives a window into a world that feels lost now. Before the digital age flattened everything, and creative edges got smoothed out.

Alpert sees this too:

In a world where more and more of the little things that make one place different from the next are disappearing, it's somehow comforting to hold on to a few bits of the past.

This book was one of those bits. A reminder.

 


Monday, 12 May 2025

BOOK NOTE - The Last Place on Earth - Roland Huntford


All stories are stories of adventure.

In the dawn of time, when the first adventurer left his tribe’s campfire and dared to wander beyond its glow, he brought back a story. Paul Zweig’s book The Adventurer calls this the source of storytelling itself.

Man risking his life in perilous encounters constitutes the original definition of what was worth talking about.

Roland Huntford’s The Last Place on Earth is one such book about adventure. It is a biography of two men, told through the lens of their grand journey to the South Pole at the beginning of the 20th century. In chronicling Robert Falcon Scott’s and Roald Amundsen’s journeys—one ending in death and failure, the other in success that is almost clinical—the book takes us through a world on the brink of the modern era. A world without plastics, vitamins, or germ theory—this was adventure before comfort.

By the late 1890s, most of the world had been mapped. The last unconquered frontier was Antarctica, and at its heart, the South Pole. British explorers like Scott and Shackleton made attempts. But it was the Norwegian Roald Amundsen who won the race, reaching the Pole in 1911.

What drew me to this book was a reference to leadership. Greg McKeown in Effortless talks about consistency over intensity. He compares Amundsen—whose team succeeded—to Scott, whose team did not survive, on this leadership dynamic.

The key difference was in their approach. Amundsen was consistent. His team did 15-mile treks daily, regardless of weather, making steady progress without exhaustion. Scott’s team, on the other hand, was erratic. Some days they pushed heroically hard; on other days they stalled completely from exhaustion and poor planning. That emotional, unscientific style cost them the race—and their lives.

But this book offers many more lessons for modern leaders. Lessons on preparedness, risk-taking, empowering autonomy, learning from the best, trying new methods, valuing cultural knowledge, and the underrated power of good humour.

It took me over four years to finish reading it. Slowly. For the joy of it. For the story of an adventure from the last great expedition of the Age of Discovery.

The book chronicles the personal journeys of Scott and Amundsen—from their roots to their eventual race to the South Pole.

Scott was emblematic of a declining Britain. By the 1870s, the race of giants that defined the glory of Queen Victoria’s reign was coming to an end. Dickens died in 1870, Darwin’s last major work was published in 1871, and Livingstone died in 1873. Scott, born in 1868, was steeped in a world where that glory was slipping away—a hero for a nation in decline. It is because of this, and Scott’s temperament, that the author argues he made the expedition to the Pole an affair of heroism for heroism’s sake. Amundsen, by contrast, was a cool and calculated Norwegian who turned the conquest of the Pole into something between an art and a sport.

This led to remarkable differences in their motivations and approaches.

Scott, with civilizational pride behind him, expected the elements to align in his favor—and grew resentful when they did not. Amundsen, on the other hand, had the Viking respect for nature and prepared for the worst. When conditions favored him, he responded with gratitude, not entitlement.

Amundsen learned from failure. When his first sledging journey under his command failed spectacularly, he noted in his diary that they had “harvested experience.” Scott, more driven by passion, failed time and again to learn from experience—his ambition clouding his judgment.

Amundsen had a good read of people. Early in his apprenticeship journeys, he realized that under stress, passivity dissolved into apathy—and in extreme conditions, apathy was fatal. He developed ways to identify those prone to such weaknesses and removed them from the team before it was too late. Scott, meanwhile, was more swayed by flattery than by character. His choice of companions for the final push to the Pole reflected personal preference more than competence.

Amundsen had the humility to learn from anyone. He studied how to live under polar conditions from the indigenous Eskimos. He believed no civilization held a monopoly on wisdom—and that so-called primitive people had much to teach the modern man. From them, he learned how to dress in a way that avoided sweat (a deadly hazard in the cold) and how to effectively use dogs for sledging. Scott, rooted in Royal Navy traditions and ceremonies, resisted such learning. His reluctance to adopt key innovations in clothing, and his disdain for dog sledding, proved fatal for his team.

Scott carried the trappings of status as a Royal Navy officer but lacked true connection with his men. Beneath the mask of a gentleman officer, he smouldered with ambition, yet failed to truly move or inspire his followers. Locked into a command-and-control mindset, he couldn’t bring out the best in his team. Amundsen, by contrast, was quietly confident and inspired confidence in others. He preferred trust over control, noting that when “you let everybody have the feeling of being independent within their own sphere, there arises a spontaneous and voluntary discipline, which is worth far more than compulsion.”

This is not to say Amundsen was perfect. But he demonstrated the best traits of leadership where it mattered most.

There is a poignant passage when Amundsen finally reaches the South Pole. He experiences not jubilation, but something closer to emptiness:

Amundsen had learned what the Duke of Wellington had meant when, in the moment of victory, he wrote that ‘Nothing except a battle lost can be half so melancholy as a battle won.’ Such, then, was the attainment of the South Pole: a muted feast; a thing of paradox, of classic detachment; of disappointment almost.

And yet, it is Scott’s ill-fated expedition that lives on in popular memory. Amundsen, almost too perfect in his accomplishments, is largely forgotten.

That, perhaps, is the final lesson in leadership: success demands both science and art. The science of preparation and execution. And the art—not just of living the moment—but of making it live in others.

Friday, 2 May 2025

Ghiblification is not the problem. Concentration of artistic power is.

I hold the view that those arguing the Ghiblification trend is an insult to artists have got it mostly wrong.

The panic is misplaced. What people should be concerned about isn’t whether AI is replacing artists. It’s who owns the tools, who controls the pipelines, and how creativity is quietly being fenced in by a few tech players.

This itself is not a new story. It is an old one — one of power concentrating around access, infrastructure, and enforcement, while everyone else argues about taste and authenticity.

What we’re watching is not the death of art. It is the quiet, at-scale takeover of the conditions for making and sharing it. It’s not imitation we should fear—it’s a world where only a handful of companies get to decide who imitates, and who gets paid. The scale of concentration that AI enables is what should concern us. Those who claim AI might be humanity’s last invention need to focus on this, its real threat.

And now that this point is on the table, let’s address the three other things people keep circling back to: Would Miyazaki be offended? Is AI art real craft? And does imitation degrade originality?

I. Would Miyazaki care?

I don’t think Hayao Miyazaki would care about the Ghiblification trend that flooded the internet in early 2025. It’s mostly those unable to create something like the great director who seem to be obsessing over and ranting about this supposed “affront to artists”—with their predictable arguments about copyright infringement and creativity deterrence.

Now, yes, there’s the infamous quote from 2016 when Miyazaki was shown an AI-generated animation. He was horrified. He said, “I strongly feel that this is an insult to life itself.”

But context matters.

This moment comes from the documentary Owaranai Hito (The Never-Ending Man)—a portrait of Miyazaki grappling with retirement while continuing to create. In it, he embarks on his first CGI project, Boro the Caterpillar. He doesn’t reject the new tool. He experiments with it, albeit with disdain and reluctance. Eventually he adapts, then returns to what he does best. That’s what artists do.

The point is: Miyazaki was not rejecting technology in absolute terms. He was reacting to a shallow use of it. The AI demo he saw lacked not skill, but soul. It had no intentionality. It had a sense of movement that was not grounded in meaning. His discomfort wasn’t with technology — it was with indifference.

And as of today, neither Miyazaki nor Studio Ghibli has issued any public statement on the Ghiblification trend. That silence should tell you something.

II. What counts as craft?

Let’s stay with Miyazaki to understand the ethos of artists and craftsmen like him. To understand his relationship with craft, you need to look at how he works—or rather, how he obsesses. Ghibli’s work isn’t made for the attention economy. It’s a meditation. Built over years. Layered with meaning.

As Steve Alpert recounts in Sharing a House with the Never-Ending Man, while watching Princess Mononoke scenes repeatedly during production:

“I once repetitively watched a sequence where the heroine San charges into the Tataraba Fortress, leaps up onto the roof, and speeds across it. Then the hero Ashitaka leaps up and goes after her. What I noticed after seeing this again and again was how the tiles on the rooftop react to being stepped on, first by the light and lithe San, barely registering the weight of her compact body and small feet, and then by the heavier and less graceful Ashitaka. Just by how the rooftop registers the tread of their feet you have a sense of the weight, mass, velocity, and physical force exerted by each character. What I also noticed in the sequence was that when Ashitaka jumps up onto the roof he causes a few of the tiles at the edge of the roof to crumble. Pieces of them fall to the ground.

With my newly gained knowledge of animation, I realized that what was unusual about this is that the roof is a part of the background and not something that normally moves in animation. Princess Mononoke was the last major feature-length animated film to be drawn by hand and animated on hand-painted cels. In hand-painted cel animation the moving pieces are done in a somewhat simplified style that allows them to be more easily replicated and manipulated. But the elaborate backgrounds on which they move are too detailed, too intricate, and too finely done to be manipulated (animated) in that way. Also, they are done in watercolor and not pencil.

In other words, in order to get those few pieces of rooftop tile to crack and crumble to the ground, Miyazaki would have had to get an animator to specially create elaborate hand-painted cels to match the background image and to painstakingly recreate them in enough versions of crumbling to make the effect work. This sequence lasted on screen for perhaps only a few seconds or less. But it would have taken a large chunk of someone's time (and therefore money) to create. This on a project that was already precariously in danger of not meeting its production deadlines.”

That level of care for a detail barely anyone would consciously notice — that’s not just technique. That is philosophy. That’s the difference between making art and just making content.

Someone with that kind of dedication isn’t going to be threatened by a trend. Leonard Cohen put it best: 

“You can add new technology — synthesizers, effects — but a song without a soul is still a dead song.” 

That’s Miyazaki’s position. I doubt he’d waste time being offended by machine mimicry.

All AI is doing right now is mimicking. Some results are charming, sure. But they’re not creating anything truly original. The outrage against Ghiblification is a kind of virtue signalling, because originality was never about output. It was about seeing, noticing, intending. As Antoine de Saint-Exupéry said,

“A rock pile ceases to be a rock pile the moment a single man contemplates it, bearing within him the image of a cathedral.”

AI doesn’t bear anything within it. It just recombines. But even recombination has its place. The right question isn’t whether AI threatens art. The real question is whether audiences will still care enough to tell the difference.

III. Imitation isn't theft. It’s how craft evolves.

Levelling the “AI imitates” accusation is shallow. Are we really arguing that human artists don’t learn from existing works? Does the emergence of a new tool completely threaten an art form? The emergence of photography did not kill painting. And when photography took over realism, painters didn’t give up—they reimagined what it meant to see. The Impressionist and Surrealist movements were born. The two art forms remain distinct, each valuable in its own right.

Are we seriously going to ask artists to avoid digital brushes and synthetic paints because they’re too easy? Should we instruct storytellers to avoid the Hero’s Journey because it is templated? This kind of logic borders on the absurd.

As Leonard Read showed in I, Pencil, everything today is built through distributed craft—no single creator, but a web of specializations, tools, and imaginations. Artists will use AI just as they used synthetic paints—and make better things.

Just like fast food doesn’t destroy high cuisine, Ghiblification doesn’t impact what Studio Ghibli stands for. One is engineered for mass attention. The other is for depth. We know the difference.

AI is just the next new tool humanity has discovered. Artists will experiment with it, the market will react, and once it commodifies, new things will have to be found—things that strike a chord the way Spirited Away once did. AI won’t be creating that, at least not without a human prompt. And what an artist creates with a new tool will often be better than what a non-artist can. Tools don’t make the artist; on the contrary, new tools make artists even better.

The concern is not that anyone can now produce a Ghibli-style frame in five minutes. The concern is when those five minutes belong to a platform that logs your data, watermarks your work, and licenses your prompt to someone else.

And that brings us back to where we started.

The real danger isn't that artists will be replaced. It's that artists — and all of us — will be locked out.

Locked out of meaningfully shaping the tools.
Locked out of owning our own work.
Locked out of the new infrastructure being built under the guise of creativity.

Ghibli’s films were born from a context where the director had the final say. Not the algorithm. Not the platform. That’s what we’re at risk of losing.

So yes — use the new tools. Experiment with Ghiblification. Remix. Reimagine. But don’t confuse the tool for the vision.

And don’t be distracted by outrage over mimicry while the platforms build moats around your cathedral.

Rage, if you must, against the monopolization of creativity.


PS: My previous post on this was more of a rant and had some errors, like wrongly calling the book by Steve Alpert by the name of the documentary. I have updated my views here and have also presented a more comprehensive view.


Sunday, 13 April 2025

SERIES: Potentially Contrarian Ideas - Miyazaki wouldn't care

I think Hayao Miyazaki would not be too worried about ChatGPT’s Ghiblification trend. It’s mostly those who are unable to create something like Miyazaki who seem to be obsessing over and ranting about copyright infringement.

Yes, he was horrified back in 2016 when he saw what AI animation produced and called it “an insult to life.” But if I understand him right, he was saying that AI would never be able to create what he can — and that he would never incorporate it into his work. And he’s right.

All AI is doing is mimicking. And while some of the results are charming — especially considering they often take less than five minutes of effort — they aren’t doing anything truly original.

And let’s be honest: the trend is literally called Ghiblification.

Even if it does produce something original, are we really arguing that artists don’t learn from existing works? Are we suggesting that if a machine does it, it’s somehow more of an infringement than if a human artist had done the same?

Are we going to ask artists to use only natural paint, which takes years to produce, instead of the incredible synthetic paints we have now?
Are we going to say that artists shouldn’t use MS Paint just because it allows things to be created faster than with watercolours?
Are we going to tell advertisers and filmmakers not to create themes inspired by popular films?
Are we going to instruct storytellers to avoid the Hero’s Journey as a structure?

It borders on the absurd.

Leonard Read already explained the modern world through I, Pencil. AI is just the next new tool humanity has discovered.

Yes, some claim AI might be the last invention of humanity. But I, for one, remain skeptical.
We shouldn’t confuse a fad with a trend. Artists will experiment with this, the market will react, and once it commodifies, new forms will need to emerge — forms that strike a chord, like Spirited Away once did. AI won’t be creating that. At least not without a prompt from a human.

Miyazaki and Studio Ghibli had a clear passion for their craft. AI today cannot mimic that.

What we’re witnessing is not just a copyright issue. It’s the age-old tension between new tools and existing power structures.
The real concern shouldn’t be mimicry — it should be about the concentration of these capabilities in the hands of anti-competitive forces.
It’s regulation and intellectual monopolies that risk stifling creativity.

On a side note, to better understand Miyazaki’s and Studio Ghibli’s potential views, I am reading Sharing a House with the Never-Ending Man by Steve Alpert—a book by one of the few foreigners (gaijin) who spent years working with Miyazaki and Studio Ghibli. I haven’t finished the book, but in the introduction I found this passage:

Studio Ghibli's films have influenced and inspired both animated and live-action filmmakers worldwide. Reflections and versions of images originally created by Hayao Miyazaki can be found in the films of well-known independent and Hollywood motion picture directors, including some major box office hits. Hayao Miyazaki has been called the Walt Disney and the Steven Spielberg of Japanese film. His influence on other filmmakers has been enormous.

Craft is about learning from the best.

And in some ways, this Ghiblification trend is a kind of homage to Miyazaki’s passion.

It’s not going to derail his creative output.

Owaranai Hito, the Japanese title of the book, translates to “The Man Who Is Never Finished.”

I will return and update my views if the book reveals otherwise.