Posted on

The real imposter is the system: Rethinking education in the age of GenAI

For decades, we’ve treated education as the ultimate equaliser.

Study hard. Get certified. Climb the ladder.

That formula powered the industrial economy and then the early knowledge economy. Degrees signalled competence. Credentials signalled readiness. Access to elite institutions signalled advantage.

Then GenAI arrived.

And quietly, without protest, it collapsed the scarcity model that education was built upon.

Today, anyone with a prompt can access legal reasoning, financial modelling, medical summaries, code scaffolding, strategic frameworks, and global research. The gates are no longer guarded.

This raises a difficult question: If AI has access to almost all codified knowledge — and most people now do too — what exactly is the education system optimising for?

The original purpose of education

Modern education systems were designed for three primary objectives:

  • Standardisation of knowledge
  • Industrial workforce readiness
  • Credential-based sorting

It rewarded:

  • Memorisation
  • Compliance
  • Accuracy within structured evaluation
  • Linear problem-solving

In the industrial era, this worked. Fact recall was valuable. Access to information was limited. Standardisation ensured predictable output.

But in the GenAI era, memorisation is automated. Information retrieval is instant.

Structured reasoning can be generated in seconds. If the value of knowledge used to lie in having it, today the value lies in knowing what to do with it.

And this distinction exposes the cracks in the current system.

When AI has all the answers

GenAI does not experience imposter syndrome. It doesn’t doubt its competence. It doesn’t gatekeep information. It doesn’t fear being “found out.”

It simply accesses and synthesises.

Ironically, humans — who built the education system — are now the ones experiencing inadequacy. Because we were trained in scarcity.

Scarcity of:

  • Access
  • Tools
  • Elite networks
  • Research
  • Mentorship

AI operates in abundance.

So the question shifts from: “Can you recall the answer?” to “Can you ask the better question?”

And this is where the current education model shows its limits.


The hidden limitation: Education rewards convergence

Most education systems reward convergence thinking:

  • Find the correct answer
  • Follow the expected method
  • Produce the accepted framework

But GenAI excels at convergence.

What it struggles with — and where human advantage lies — is divergence:

  • Challenging premises
  • Identifying unseen patterns
  • Questioning assumptions
  • Connecting disciplines in novel ways
  • Acting with contextual judgment

Our education systems largely assess answers. The future economy will reward judgment. Those are not the same.

Education as a signalling mechanism is weakening

Degrees once signalled:

  • Rigor
  • Persistence
  • Domain expertise
  • Access to curated knowledge

But when AI can:

  • Summarise an MBA textbook
  • Draft a legal memo
  • Generate a financial model
  • Write production-ready code

Then the credential alone becomes insufficient. Not irrelevant — but insufficient.

What differentiates tomorrow’s knowledge worker is no longer: “How much you know.”

It becomes:

“How deeply you understand.”

“How effectively you apply.”

“How clearly you decide.”

Education, in its current form, does not consistently measure these dimensions.

The new divide: Curiosity vs compliance

GenAI does something profound. It removes knowledge access as a structural advantage.

But it introduces a new differentiator: curiosity.

Two individuals can access the same AI.

Only one chooses to:

  • Probe deeper
  • Refine prompts
  • Challenge outputs
  • Cross-check assumptions
  • Explore adjacent domains

Education traditionally rewarded compliance:

  • Follow curriculum.
  • Pass exam.
  • Meet benchmark.

The new economy rewards inquiry:

  • What else?
  • Why not?
  • What’s missing?
  • What’s next?

This is not a minor adjustment. It’s a systemic shift.


What education must evolve into

If we are serious about preparing a generation of true knowledge workers, education must shift across five structural dimensions.

  • From memorisation → Meta-learning

Teach students:

  • How to learn
  • How to unlearn
  • How to validate AI outputs
  • How to interrogate sources

AI can retrieve answers. Humans must validate relevance.

  • From siloed disciplines → Interdisciplinary synthesis

Real-world problems do not come neatly packaged:

  • Climate intersects with finance.
  • Healthcare intersects with data ethics.
  • Supply chains intersect with geopolitics.

True knowledge workers will be synthesisers, not specialists confined within narrow lanes.

  • From fixed curriculum → Dynamic learning models

Curricula often lag the industry by years.

In a world where AI models update in months, static syllabi become outdated quickly.

Education must become:

  • Modular
  • Continuous
  • Adaptive
  • Stackable

Learning cannot end at graduation.

  • From exams → Applied judgment

Assessment should increasingly measure:

  • Scenario reasoning
  • Ethical trade-offs
  • Decision framing
  • Risk calibration

The world does not grade people on multiple-choice questions. It rewards decision quality under uncertainty.

  • From credential prestige → Portfolio evidence

Future differentiation will likely come from:

  • Projects
  • Problem-solving artifacts
  • Real-world experimentation
  • Public thinking

What you build may matter more than where you studied, because building implies application.

The knowledge worker of the new age

Peter Drucker popularised the term “knowledge worker” decades ago.

But GenAI forces us to redefine it.

A true knowledge worker in the AI era:

  • Does not compete on access
  • Does not compete on recall
  • Does not compete on surface frameworks

Instead, they compete on:

  • Depth
  • Context
  • Original framing
  • Decision velocity
  • Ethical clarity
  • Strategic foresight

Education systems must therefore cultivate:

  • Systems thinking
  • Probabilistic reasoning
  • Bias awareness
  • Creativity under constraint
  • Communication clarity
  • Cross-domain fluency

These are not exam-friendly traits. But they are future-critical capabilities.

Talent vs experience in an AI-accelerated world

AI compresses learning curves.

A junior analyst can produce outputs once reserved for senior professionals.

An executive can independently generate strategy drafts without layers of support.

So, where does experience fit?

Experience now becomes:

  • Pattern recognition under ambiguity
  • Judgment calibrated by lived consequence
  • Crisis-tested decision making
  • Ethical discernment

Talent becomes:

  • Speed of synthesis
  • Intellectual curiosity
  • Cross-domain integration
  • Learning agility

Education should nurture both.

But today, it often privileges standardised performance over adaptive capability.


The structural recalibration we need

If we continue educating for yesterday’s scarcity economy, we will produce graduates optimised for irrelevance.

If instead we redesign education for:

  • Abundance of information
  • AI-augmented productivity
  • Continuous reinvention
  • Portfolio-based credibility
  • Judgment-based differentiation

Then we create a generation that does not fear AI — but compounds with it.

The real imposter is not the human.

It is the outdated system that measures humans by metrics AI can outperform.

In conclusion

GenAI is not replacing education. It is exposing what education was truly built to optimise for.

The future knowledge worker will not win by competing with AI on answers.

They will win by:

  • Asking sharper questions
  • Integrating broader perspectives
  • Exercising wiser judgment
  • Pursuing depth relentlessly
  • Exploring “what’s next” before it becomes obvious

Education must therefore evolve from a delivery system of knowledge into a training ground for discernment.

In a world where AI knows almost everything, the true advantage belongs to those who know what matters.

And that begins with rethinking how we educate — not just what we teach.

This article is Part 4 of a four-part series on “Redefining Knowledge Work: AI, Ownership, and the Future of Value.” Explore the rest of the series: Part 1, Part 2, Part 3.

Editor’s note: e27 aims to foster thought leadership by publishing views from the community. You can also share your perspective by submitting an article, video, podcast, or infographic.

The views expressed in this article are those of the author and do not necessarily reflect the official policy or position of e27.

Join us on WhatsApp, Instagram, Facebook, X, and LinkedIn to stay connected.

The post The real imposter is the system: Rethinking education in the age of GenAI appeared first on e27.


Value creation: The compression principle — How to edit your pitch down to its atomic core

Your pitch isn’t too long because you care too much. It’s too long because you don’t yet understand.

January 9, 2007 — Macworld, San Francisco. Steve Jobs takes the stage.

“Today, we’re introducing three revolutionary products. A widescreen iPod with touch controls. A revolutionary mobile phone. And a breakthrough internet communications device.”

He pauses. Repeats it. Then pauses again.

“These are not three separate devices. This is one device.”

Most people remember the reveal. Almost nobody studies the structure. Jobs did not explain — he engineered expectation, then collapsed it. He let the audience close the distance themselves, then handed them the reward of inevitability.

That is compression.

Meanwhile, your forty-seven-slide deck is sitting quietly in someone’s inbox. Unread. A polite pass is already half-drafted.

These two facts are not coincidental.

The battlefield you chose

Daniel Kahneman divided the mind into two systems: fast, intuitive System 1 and slow, deliberate System 2. Founders tend to assume investors operate in the second. They don’t.

Early-stage decisions are made in System 1. System 2 exists mostly to justify them afterwards.

Here is what that means in practice: The moment your pitch becomes dense — over-explained, over-hedged — you force System 2 online. And when System 2 activates, the investor is no longer listening. They are auditing.

Auditing is adversarial by design. It looks for gaps, inconsistencies, overreach — and it finds them, because every business has them. You have chosen to fight on the only terrain where you are guaranteed to lose.

The deck didn’t just fail to convince. It selected the wrong game.


Evolutionary biologist Amotz Zahavi described this from the opposite direction. The peacock’s tail is inefficient. Costly. Dangerous. Which is precisely why it works. Only a genuinely strong organism can afford that level of waste. The handicap is the proof.

The founder who speaks less — but lands precisely. Who answers without rushing toward silence. Who leaves space unguarded. That restraint carries a signal no slide deck can manufacture: I don’t need to persuade you. This already stands.

Over-explanation is not passion. It is fear, wearing the costume of diligence.

What Shannon knew

Claude Shannon defined information as entropy — the degree of surprise in a message. What you cannot predict carries information. What you already expect carries none.

The average pitch deck is 90 per cent predictable. TAM/SAM/SOM. A competitive matrix. Five-year projections no one believes. Entropy: zero. Signal: zero. Noise, presented with the production value of rigour.
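Shannon’s measure can be made concrete with a toy calculation. The sketch below (the `entropy_bits` helper is ours, purely illustrative) shows why a fully predictable message carries zero information:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

certain = entropy_bits([1.0])        # 0.0 bits: a fully predictable message
uniform = entropy_bits([0.25] * 4)   # 2.0 bits: four equally likely outcomes
skewed = entropy_bits([0.9, 0.1])    # ~0.47 bits: mostly predictable, little surprise
```

By this yardstick, the boilerplate deck is the `certain` case: every slide the listener can predict in advance contributes nothing, however polished it looks.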

Now consider this: Fei-Fei Li raised US$230 million anchored on two words — Spatial Intelligence. No slides. No deck. A concept so compressed it reshaped the room. Inside those two words was the complete answer to every investor’s three-part question: Why now? Why her? Why does it change everything?

That is density. That is what a pitch is supposed to be — not a document, but a gravitational event.

A black hole compresses vast mass into finite space — not by removing meaning, but by eliminating everything that isn’t load-bearing. A great pitch obeys the same physics. 120 seconds that hold the market’s contradiction, the team’s irreversible proof, and the investor’s fear of missing it — all at once, without remainder.

If you cannot compress, you have not yet reached the centre of your own idea. The pitch is not the failure. The understanding is.

The asset called silence

Japanese aesthetics has a concept: ma — the charged space between notes, between gestures, between words. Not absence. Potential. The silence doesn’t mean the music has stopped; it means something is about to land.

In the best pitches, silence is not a gap in delivery. It is where the investor’s imagination enters. And once they begin to co-create the narrative — once they are supplying the ending — you no longer need to sell it.


Experienced investors share a quiet heuristic: distrust founders who cannot stop explaining. They recognise the pattern. Those who fear uncertainty try to eliminate it with words. And in doing so, they dilute the only thing that matters — coherence.

Mike Moritz once reflected on his first meeting with Google’s founders. “Larry and Sergey said very little,” he recalled. “But their silence said everything.”

That is not charisma. That is structure — trust, rendered in its most economical form.

Three tests

Don’t audit your pitch by adding. Audit it by removing.

  • The subtraction test

Delete one slide. If the narrative collapses, it belonged. If the pitch holds — if you barely notice the absence — that slide was never about your business. It was about your anxiety. It belongs in neither version.

  • The adversarial audience test

Assume the person in front of you already dislikes you. Do your first 30 seconds earn the next 30? Or are you asking for patience you have not yet justified? If your narrative requires goodwill, it isn’t self-sustaining.

  • The one-graph test

Can you render your unit economics in a single image — without a caption? Visual information routes through the amygdala, the brain’s emotional processor, bypassing deliberation entirely. Logic can be argued with. Feeling cannot. If your numbers still need explanation, they are not yet a story.


What length actually reveals

Da Vinci wrote: “Simplicity is the ultimate sophistication.” In venture, this is not a sentiment. It is filtration. Investors use it as a screen before they finish their second slide.

A 120-second pitch is not a format. It is a measurement device. It reveals — with a precision no due diligence process can match — whether you have reached the centre of your own idea, or are still orbiting it.

If your pitch is getting longer, stop. You don’t have a communication problem. You have an understanding problem.

And investors can read that signal before they open the deck.

“Your pitch isn’t getting longer because you care more. It’s getting longer because you’re not done thinking.”

This article is part of David Kim’s Value Creation column. It sits alongside the Asia Value Creation Awards, which aim to recognise PE and VC teams driving long-term, fundamentals-led value creation across the region.






The keys to your kingdom: Navigating crypto custody in 2026

In the digital asset ecosystem, custody is not just a feature. It is the foundation upon which everything else rests. Cryptocurrency operates as a bearer asset. This means whoever holds the private keys effectively owns the funds. This simple truth carries profound implications. Unlike traditional banking, where a forgotten password triggers a straightforward reset process, losing or compromising a private key in the crypto world often results in permanent and irreversible loss. There is no customer service hotline to call, and no administrator exists to undo a transaction. No safety net catches you when you fall.

As we navigate 2026, the importance of proper custody has evolved from a technical consideration to an existential necessity. Blockchain immutability means that transactions cannot be undone. If assets are stolen via a compromised key, there is simply no recourse to recover them. The numbers tell a sobering story. Approximately 20 per cent of all Bitcoin, or roughly 4 million BTC, is estimated to be permanently inaccessible due to lost keys or poor personal custody practices. That is billions of dollars worth of value vanished into the digital ether.

The threat landscape has grown increasingly sophisticated. Phishing attacks were responsible for approximately 83 per cent of stolen funds in 2025, and high-profile exchange breaches like the devastating US$1.5 billion Bybit hack in early 2025 sent shockwaves through the industry. These incidents underscore a harsh reality: basic storage methods are no longer sufficient to protect against modern threats. Meanwhile, institutional adoption has reached a tipping point. As of 2026, 74 per cent of family offices are actively engaged in cryptocurrency, and they come with stringent requirements. These sophisticated investors demand qualified custodians who can meet fiduciary duties, ensure proper asset segregation, and provide comprehensive insurance coverage. The message is clear: custody has matured from a DIY experiment into a professional service industry.


For those entering the crypto space, the question is not whether to use custody solutions. The question is which model best fits their needs. The industry has converged around three primary approaches. Each comes with distinct advantages and trade-offs.

Self-custody remains the purist choice. It offers total autonomy and privacy. This model appeals to tech-savvy individuals who value sovereignty above all else. When you hold your own keys, you answer to no one. No platform can freeze your assets. No intermediary can deny your transactions. No third party can surveil your holdings. This freedom comes with a sobering responsibility, because there is no “forgot password” button. User error is the primary risk, and mistakes are unforgiving. A lost seed phrase, a compromised device, or a simple typo can result in permanent loss. Self-custody demands technical competence, meticulous attention to detail, and an acceptance of absolute personal responsibility.

Third-party custody offers professional security and insurance coverage, making it ideal for institutions and beginners alike. These platforms employ teams of security experts, maintain robust infrastructure, and often carry insurance policies to protect against losses. The trade-off is counterparty risk: you are trusting another entity with your assets. Platform insolvency, regulatory action, or internal malfeasance can all threaten your holdings, and recent history has shown that even the most reputable exchanges can fall, taking customer funds with them. Third-party custody simplifies the user experience, but it demands careful due diligence in selecting a trustworthy provider.

Emerging as the goldilocks solution for many is the hybrid model utilising Multi-Party Computation (MPC) technology. This approach offers distributed control and flexibility, which makes it particularly attractive to enterprises and exchanges. MPC splits private keys into encrypted shares distributed across different parties, ensuring the complete key never exists in one place, even during transaction signing. This eliminates single points of failure while maintaining operational efficiency. The sophistication comes at a cost: operational complexity is the primary risk, and implementing and managing MPC solutions requires technical expertise and careful coordination among multiple parties.


Modern custody solutions have evolved far beyond simple password protection. Today, the security arsenal includes multiple layers of defence designed to eliminate vulnerabilities and protect against increasingly sophisticated threats. Cold storage remains the bedrock of secure custody, keeping private keys entirely offline in air-gapped hardware that cannot be accessed remotely. This physical separation from the internet provides robust protection against hacking attempts and makes cold storage ideal for long-term holdings. For those who choose this path, hardware wallets have become increasingly user-friendly while maintaining military-grade security.

Multi-Party Computation represents the cutting edge of custody technology. By splitting private keys into encrypted shares distributed across different locations or devices, MPC ensures that no single point of failure exists. Even during the critical moment of transaction signing, the complete key never materialises in one place. This mathematical elegance provides security that is greater than the sum of its parts. Multi-signature technology adds another layer of protection. It requires multiple independent keys to authorise transactions. A typical setup might require three out of five designated keys to approve a transfer. This ensures that a single compromised device cannot move funds. This distributed authorisation creates a system of checks and balances. It mirrors traditional financial controls. Hardware Security Modules provide tamper-resistant physical protection for key generation and storage. These specialised devices automatically wipe their contents if physical interference is detected. This provides a final line of defence against determined attackers.
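The key-splitting idea can be illustrated with a minimal sketch. Real MPC custody uses threshold cryptography and never reconstructs the key at all, even when signing; the toy all-or-nothing XOR split below (helper names are ours, not any custody product’s API) only demonstrates how a key can be held as shares that are individually useless:

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n shares; every share is required to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:  # XOR the key with each random share
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine_key(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out
```

Any subset smaller than the full set is statistically indistinguishable from random bytes, which is the property the paragraph describes. Production systems add a threshold (for example, 3-of-5) using schemes such as Shamir’s secret sharing, so that losing one or two shares is survivable.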

So how should you approach custody? The answer depends on your technical comfort, risk tolerance, and usage patterns. For long-term holdings that you do not need to access frequently, cold storage via hardware wallets remains the gold standard; the inconvenience of physical access is a small price to pay for the security of keeping your keys completely offline. For active trading or frequently accessed funds, reputable exchanges offer convenience but should be used judiciously. A prudent approach is to keep only a small portion of your portfolio, perhaps less than 20 per cent, on exchanges. Treat them as transactional tools rather than storage solutions, move profits to cold storage regularly, and never leave more on an exchange than you can afford to lose. For those managing significant assets or operating businesses, the hybrid MPC model offers an attractive balance of security and functionality, though it requires careful implementation and ongoing management.


The crypto custody landscape reflects the maturation of the entire ecosystem. What began as a libertarian experiment in self-sovereignty has evolved into a sophisticated industry offering solutions for every type of user, from the casual investor to the institutional giant. The technology is more robust, the options are more diverse, and the stakes are higher than ever. Your private keys are more than just strings of code. They are the keys to your financial kingdom.

Choose your custody solution wisely. Understand the trade-offs. Never forget that in the world of cryptocurrency, you are ultimately your own bank. With great power comes great responsibility. In 2026, the tools to exercise that responsibility have never been more advanced. The question is not whether you can afford to take custody seriously. It is whether you can afford not to.






Tovtrip introduces Cambodia’s first travel super app at Echelon Singapore 2026

Cambodia’s digital tourism sector takes a major step forward as Tovtrip, the country’s first comprehensive travel super app, showcases its platform at Echelon Singapore 2026, one of Asia’s leading technology and startup events.

Built to transform the way travelers explore Cambodia, Tovtrip offers a seamless booking experience across the entire travel journey. The platform integrates flights, hotel bookings, transportation rentals, spa and massage services, tours, activities, and local travel experiences — all within a single application.

To date, more than 600 Cambodian merchants including hotels, tour operators, transportation providers, and local experience vendors have joined the platform, making Tovtrip one of the fastest-growing local travel marketplaces in Cambodia.

Why Tovtrip was created

Tovtrip was developed to solve key challenges within Cambodia’s tourism ecosystem and empower local businesses through digital technology.

Enhancing productivity and visibility

Tovtrip directly connects travelers with local vendors, giving small and medium tourism businesses greater visibility in the digital marketplace. By optimizing how travelers discover and book services, the platform helps local merchants increase productivity while delivering better service to visitors.

Driving the transition from offline to online tourism

Cambodia’s tourism sector has historically been dominated by offline bookings. Tovtrip aims to accelerate the transition to digital travel services, enabling local communities and businesses to participate in the growing online travel economy while promoting destinations across the country.

Data-driven tourism development

Tovtrip also generates valuable insights into traveler behavior and preferences. These analytics provide stakeholders — including businesses and tourism authorities — with actionable data to make informed decisions about tourism strategies, resource allocation, and destination development.


Cambodia’s growing travel market

The opportunity for digital travel services in Cambodia continues to grow rapidly.

The Total Addressable Market (TAM) for Cambodia’s internal travel market reached approximately US$1.9 billion in 2024 and is expected to grow at 6.9% annually, reaching around US$2.78 billion by 2025. The number of internal travelers is projected to increase from 22.5 million travelers in 2024 to approximately 33 million travelers by 2025.

Within this ecosystem, the Serviceable Available Market (SAM) — representing the online travel booking segment — is expected to reach around US$139 million, equivalent to approximately 1.65 million travelers booking travel services online.

With the growth of local digital platforms, around 10% of these travelers (165,000 users) are expected to use local platforms to plan and book their trips.

Tovtrip aims to capture 0.5% of this market by 2026, representing approximately 8,500 active users. With an estimated average booking value of US$84 per transaction, this could generate roughly US$700,000 in revenue.
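Taking the quoted figures at face value (and reading the 0.5 per cent target as a share of the 1.65 million online travelers), the funnel arithmetic can be checked in a few lines:

```python
# Back-of-the-envelope check of the market funnel quoted above.
online_travelers = 1_650_000                      # SAM: travelers booking online
local_platform_users = online_travelers * 0.10    # ~10% expected on local platforms
tovtrip_users = online_travelers * 0.005          # Tovtrip's 0.5% target share
avg_booking_usd = 84                              # average booking value per transaction
projected_revenue_usd = tovtrip_users * avg_booking_usd

# local_platform_users  -> ~165,000
# tovtrip_users         -> ~8,250
# projected_revenue_usd -> ~US$693,000
```

The sketch yields about 8,250 users and US$693,000 in revenue, in line with the rounded figures cited above.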

Empowering Cambodia’s tourism ecosystem

What makes Tovtrip particularly meaningful is the strong support from Cambodia’s local users and businesses.

Unlike global platforms, Tovtrip is designed specifically for Cambodia’s tourism ecosystem — supporting local language, local payment preferences, and direct connections with verified local merchants.

The platform also ensures:

  • Authentic listings from real Cambodian businesses
  • Direct engagement between travelers and local service providers
  • Better economic opportunities for local communities
  • Increased exposure for emerging destinations across Cambodia

This local-first approach has helped Tovtrip build trust with both merchants and travelers while strengthening Cambodia’s digital tourism infrastructure.


Representing Cambodia on the regional stage

By participating in Echelon Singapore 2026, Tovtrip aims to showcase Cambodia’s growing innovation ecosystem and demonstrate how local startups can play a vital role in shaping the future of tourism in Southeast Asia.

As Cambodia’s first travel super app, Tovtrip is committed to building a connected, data-driven, and inclusive tourism ecosystem — empowering local businesses while making travel across Cambodia easier than ever.


This article was sponsored by TovTrip


Featured Image Credit: TovTrip



The AI wave is real, but it won’t lift everyone

I am a product marketer. My world is go-to-market strategy, user journeys, and positioning. For most of my career, if I had an idea for a product flow or wanted to test a concept with users, I had to brief a developer, wait, iterate through feedback loops, and hope the final output matched what I had in my head. That process could take weeks. Now with AI, I can build the prototype myself without coding. Not a rough sketch, an actual working prototype, sometimes in the same afternoon I had the idea.

And workstreams that used to be separate (building the product, documenting it, creating the marketing materials, crafting the use cases) now feed into each other. I can build a prototype, screenshot the flow, write the copy around it, and have a landing page up before the end of the day. That kind of speed used to require a team.

I am not alone in this. Developers I know are writing their own documentation in minutes instead of hours. The walls between disciplines that used to define what each of us could and could not do are coming down fast. And with them, the barriers around who can access jobs, capital, and the tools to build something from scratch.

But what made that possible for me is part of a much bigger shift happening right now, and not everyone is going to benefit from it equally.

The stakes are bigger than most people realise

In mid-March, Jensen Huang stood on stage at GTC 2026 and told the world he could see at least US$1 trillion in AI infrastructure spending through 2027. Anthropic CEO Dario Amodei, speaking recently in Bangalore, said AI adoption in India has doubled in just three to four months. Andrej Karpathy, one of the people who helped build the foundations of modern AI, admitted he has not typed a line of code himself since December.

Also Read: Ethical implications of using AI in hiring

These are not just impressive numbers; they are signals that the rules of the next economy are being written right now, mostly by a small number of players with very large infrastructure budgets. In simple terms, tokens are the units AI runs on and gets paid for. Every response, every piece of analysis, every bit of generated content is measured in tokens, and a gigawatt data centre costs around US$40 billion before a single chip goes in. The countries and companies that can build those factories will define the cost of intelligence for everyone else. In this new economy, your token budget is becoming as critical as your cloud spend, and if that cost gets set by a handful of players in one part of the world, everyone else ends up buying at a price they had no say in.
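The token-budget arithmetic above can be sketched in a few lines. Every figure below (price per million tokens, request volume) is a hypothetical placeholder, not a quote from any real provider:

```python
# Illustrative back-of-envelope token budgeting. All numbers are
# hypothetical placeholders, not real provider pricing.

def monthly_token_cost(tokens_per_request: int,
                       requests_per_day: int,
                       usd_per_million_tokens: float,
                       days: int = 30) -> float:
    """Estimate a monthly spend from per-request token usage."""
    total_tokens = tokens_per_request * requests_per_day * days
    return total_tokens / 1_000_000 * usd_per_million_tokens

# e.g. 2,000 tokens per request, 10,000 requests a day,
# at a hypothetical US$10 per million tokens:
cost = monthly_token_cost(2_000, 10_000, 10.0)
print(f"US${cost:,.0f} per month")  # US$6,000 per month
```

The point of the sketch is the sensitivity: if the per-million-token price is set elsewhere and doubles, the monthly bill doubles with it, which is exactly the pricing power the paragraph above describes.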

This is what infrastructure inequality looks like in practice, and it is worth understanding what game you are actually playing.

The part that gets missed

Dario made a point in Bangalore worth paying attention to. He said Anthropic does not come to markets like India looking for consumers; they want to work with local builders who actually understand their own market. Every two or three months, a new model release opens up something that was not possible before.

OpenClaw, the open source agentic AI framework that Jensen described as the most downloaded open source project in history, surpassing Linux in weeks, makes this even more concrete. Karpathy called it the operating system for agentic computers, the same role Windows played for the personal computer. A developer anywhere in the world can now build on the same foundation as one in San Francisco.

The infrastructure layer requires billions to build, which means it is dominated by players with the deepest pockets, but the application layer, meaning the tools and products built on top of AI, is still wide open. That is where the real opportunity sits for founders and builders in this region. That window is real, but only if you know it is there.

The real barrier is the on-ramp

Most people assume the barrier to AI adoption is access: that if you just had the right tools, you would be fine. But that is not what the data shows: 64 per cent of Southeast Asian sellers cite high costs and time as major obstacles, and while 41 per cent of SMEs say they are adopting AI, only five per cent are actually using it in a meaningful way. The barrier is not access. It is the on-ramp.

A small logistics company in Southeast Asia, with five people and no dedicated tech team, recently started using AI to handle customer communications, route queries, and generate weekly ops summaries. What used to need a part-time coordinator now runs largely on its own, and the founder ended up not hiring the person she had budgeted for. That is what the tool working as promised actually looks like.

But getting there took three weeks of trial and error, a developer friend who helped with setup, and the willingness to push through a lot of frustration. Three weeks and a developer friend are not things everyone has.

Also Read: A new era of automation: Establishing best practices for intelligent automation and generative AI

This is the gap that does not show up in press releases or keynote slides. The tools exist, but most people still cannot figure out how to get started. Karpathy talks about spending sixteen hours a day in what he jokingly calls AI psychosis, basically an obsessive state of directing multiple AI agents at the same time, each working on different tasks, while he reviews, adjusts, and keeps them all moving. That is what mastery looks like right now, and that gap does not close on its own.

What actually needs to change

So what does closing it actually look like? Some of it is already happening. Google’s Stitch update in March 2026 means a founder who cannot afford a designer can now generate a full UI, interactive prototype, and design system in under an hour, for free, with no design skills required. Figma’s stock dropped 8.8 per cent the day the update was announced. The market saw the shift before most people did, and this is exactly the direction things need to keep moving: tools that start from what you want to achieve rather than assuming you already know how to build it.

That is why I think this category of operator-first tools matters, including what we are building with Fuseful Workflow Studio at Morpheus Labs. Most automation tools still assume you know how to build the system. Fuseful starts from the business outcome instead, built with operators in mind, not engineers.

But tools alone are not enough. The average enterprise now runs more than ten AI applications, yet 76 per cent report negative outcomes because the tools do not connect, and nobody was trained to use them together. Anthropic has a team they internally call the Ministry of Education, and that is not a trivial signal. Companies serious about equity need to treat capability-building in their users the same way they treat feature development, not as an afterthought but as the actual product.

And the last piece sits with local builders. Dario is right that Anthropic cannot and should not build for every vertical. The real opportunity for domain-specific, market-specific, culturally-grounded applications sits with the people who actually know those markets. Funding those builders and not cannibalising them when they find success is what building equity by design actually looks like in practice.

The big labs will keep building, and the infrastructure will keep scaling. That part is not really up for debate anymore. What is still up for debate is what gets built on top of all of it, who gets trained to use it, and who gets funded to try.

Those decisions do not belong to Jensen or Dario. They belong to every founder, operator, and builder in ecosystems like this one. And we are still early enough to get them right.

Editor’s note: e27 aims to foster thought leadership by publishing views from the community. You can also share your perspective by submitting an article, video, podcast, or infographic.

The views expressed in this article are those of the author and do not necessarily reflect the official policy or position of e27.

Join us on WhatsApp, Instagram, Facebook, X, and LinkedIn to stay connected.

The post The AI wave is real, but it won’t lift everyone appeared first on e27.

Posted on

From pilot to production: Where robotics actually breaks

Rahul Nambiar, CEO and co-founder of Botsync

Industrial automation in Southeast Asia is moving beyond experimentation into real-world execution, and the stakes are rising quickly. In January 2026, Singapore-based robotics company Botsync secured additional Series A funding from SGInnovate, signalling growing investor confidence in the region’s smart manufacturing push. But scaling robotics is not just about deploying machines; it is about integrating intelligence into complex, live operations.

In this interview with e27, Rahul Nambiar, CEO and co-founder of Botsync, breaks down what it takes to move from pilot projects to multi-site rollouts, where automation delivers real ROI, and why orchestration is becoming the true battleground. As labour shortages intensify and manufacturing digitises, Botsync’s journey offers a closer look at how robotics is evolving from a technical solution into critical industrial infrastructure.

Also Read: What’s changing inside Southeast Asia’s factories with IsCoolLab

Edited excerpts:

What’s the single most important capability this additional investment buys: hardware, software, talent, or market access?

The capital allows us to invest further in expanding the orchestration and intelligence capabilities of our no-code and vendor-agnostic automation control platform SyncOS. This will ensure our users get the best possible solutions for their robotic fleets.

You’re positioning Botsync as moving from “startup momentum” to “regional scaleup”. What operational bottleneck tends to break first when autonomous mobile robots (AMR) deployments move from pilot to multi-site rollouts in the region?

During a pilot phase, users primarily focus on technical feasibility and whether your product can work in their facility. The impact on their operations is minimal.

After transitioning to a full rollout, this changes quickly, as you become a critical component in their operations. Users now care more about whether their operational key performance indicators (KPIs) are being met than whether the technology looks feasible or impressive. This involves handling edge cases, defining response timelines in the event of failure, optimising systems based on continuous feedback, and implementing business continuity plans for system failure.

The operational elements of the company, whether we have hired the right support team, built the right processes, and built redundancy into our systems and processes, soon become as critical as the technology itself.

Your pitch leans on labour shortages and inefficiencies. In practice, where does automation deliver the fastest payback in warehouses here: picking, putaway, replenishment, line-feeding, or yard operations? Where do buyers still overestimate what robots can do?

The greatest value from automation occurs when, in addition to automating a manual process, the data it collects enables users to further optimise their operations. This could take the form of reduced operational error rates, more accurate prioritisation, shorter order fulfilment times, and so on.

Also Read: 🤖Rise of the machines: 20 robotics startups shaping Southeast Asia’s future

In Botsync’s case, we leverage the integration enabled by SyncOS and the data we collect from multiple machines at each stage of production to ensure the accurate, timely delivery of parts between assembly lines and the warehouse within factories. This allows us to deliver value by ensuring higher manufacturing process uptime and better visibility into the entire process, in addition to the physical automation we provide.

Two hundred and thirty per cent revenue growth is big, but growth can be “cheap” or “expensive”. What’s driving it: larger contract values, more sites per customer, better margins, or simply more hardware shipped?

We are seeing revenue growth from these areas:

  • Larger expansion within the same site of a customer
  • Expansion to new sites by the same customer

This has also allowed us to ensure customer acquisition costs are managed as we scale up.

Botsync works across manufacturing, warehousing, and intra-logistics. What’s your core product wedge today: fleet management software, the robots themselves, integration services, or a full-stack automation solution? How does that choice affect scalability?

Botsync’s primary product wedge today comes from the integration and process intelligence capabilities of SyncOS, which encompass fleet management, allowing AMRs and automated guided vehicles (AGVs) to communicate with other automation systems such as robotic arms, programmable logic controllers (PLCs), and conveyor belts, and use AI to enable data-driven decision-making. This allows customers to maximise the efficiency of their deployed automation.

Singapore often talks about smart manufacturing and advanced automation. From your on-the-ground conversations with manufacturers and logistics players, where is policy genuinely accelerating adoption, and where is it still not translating into operational reality?

Singapore’s push for smart manufacturing and logistics automation is closely aligned with Manufacturing 2030, which aims to grow the sector by 50 per cent in value-added output and establish Singapore as a global hub for smart, green, and high-value manufacturing. Policies and funding have accelerated adoption among mid-sized manufacturers and third-party logistics (3PL) operators, while manpower constraints and tighter foreign worker quotas have made automation a commercial necessity. Budget 2026 further strengthens this drive, with expanded support under the productivity solutions grant (PSG) for AI and automation, the launch of National AI Missions and Council to coordinate sector-wide transformation, and continued RIE2030 investment in robotics, AI, and advanced manufacturing.

Despite these measures, adoption isn’t uniform. Legacy systems and fragmented operations continue to slow integration, and many companies that run successful pilots struggle to scale across multiple sites due to interoperability and workforce-readiness gaps. ROI expectations versus real-world deployment timelines also remain a challenge, particularly for smaller firms trying to translate grant support into measurable productivity gains.

Looking to 2026, what’s the biggest technical or commercial bet in your roadmap: multi-robot orchestration, richer perception and safety, interoperability with legacy systems, or moving towards robotics-as-a-service? What would make you change course?

Looking to 2026, our biggest bets are multi-robot orchestration, interoperability with legacy systems, and enhanced intelligence to handle dynamic operations and edge cases.

Also Read: The transformative potential of humanoid robots: A VC perspective

Multi-robot orchestration is increasingly practical thanks to Singapore’s national robotics standards and testbeds, which enable coordination of heterogeneous fleets. Interoperability continues to be a challenge, as highlighted by IMDA’s AMR x Digital Leaders initiative, which helps companies integrate new robotics with existing warehouse management systems.

We continually assess the market landscape and customer needs, and we see growing demand for autonomous mobile robots (AMRs) and integrated robotics solutions. Our commitment remains to provide autonomous solutions tailored to our customers, and we would adjust our roadmap if breakthroughs in perception and safety, broader ecosystem standardisation, or shifts in customer priorities make alternative approaches more effective or efficient.

The post From pilot to production: Where robotics actually breaks appeared first on e27.

Posted on

The coming identity crisis of agentic AI

In the race to build autonomous AI agents—software that can book flights, negotiate contracts, execute financial transactions, or run entire workflows on behalf of humans—a quieter but equally critical debate is unfolding behind the scenes.

How do you identify and authorise an AI agent?

Right now, several major technology communities are attempting to answer that question simultaneously. Groups such as the World Wide Web Consortium, the OpenID Foundation, the Decentralised Identity Foundation, and the Trust Over IP Foundation are all exploring mechanisms for identity, authentication, and delegation in what many now call the agentic economy.

Each community brings its own philosophy.

The World Wide Web Consortium focuses on core web architecture and decentralised identifiers. The OpenID Foundation specialises in authentication protocols like OAuth and OpenID Connect. The Decentralised Identity Foundation builds open infrastructure for self-sovereign identity systems. Meanwhile, the Trust Over IP Foundation focuses on governance frameworks and trust networks.

Individually, each effort is valuable.

Collectively, they risk creating a fragmented identity landscape just as AI agents begin to proliferate across the internet.

And the stakes are high.

If autonomous agents are going to operate in financial markets, government services, enterprise systems, and consumer platforms, the world will need a reliable way to verify who or what is acting.

Without that, the agentic internet could quickly become a chaotic ecosystem of unverifiable bots.

Why fragmentation is inevitable

The risk of fragmentation is not simply the result of organisational rivalry. It is largely structural. Technology evolves far faster than standards bodies.

Developers building agent frameworks today cannot wait three years for formal protocols to emerge. They will ship systems using whatever identity mechanisms exist — API keys, OAuth tokens, decentralised identifiers, or proprietary authentication models.

Meanwhile, standards organisations deliberate carefully, balancing security, interoperability, and governance.

By the time a standard is finalised, the ecosystem may already have moved on.

This dynamic has played out before. The early internet saw competing encryption protocols, rival messaging systems, and incompatible browsers before a handful of dominant standards emerged.

The same evolutionary process may now be happening with agent identity.

Also Read: The digital lag: How traditional consulting is failing to grasp the agentic AI revolution

The real goal is interoperability

The instinctive response to fragmentation is often to call for a single universal standard.

But the internet rarely works that way.

Instead, it evolves through layers. Different technologies coexist, but they communicate through shared interfaces. Email servers may run different software, but they all speak SMTP. Websites may be built with different frameworks, but they all rely on HTTP and TLS.

The same layered model may be the best path forward for agent identity.

Rather than forcing convergence on one protocol, the ecosystem may need to focus on shared primitives that allow different systems to interoperate.

These primitives could include portable identity artefacts such as decentralised identifiers, verifiable credentials, and authorisation tokens.

An AI agent might authenticate using one protocol while presenting credentials issued by another system, with trust frameworks defining how those credentials are validated.

In other words, the agent identity ecosystem may look less like a single standard and more like a modular identity stack.

Open implementations matter more than documents

One lesson from past internet standards is that specifications alone rarely drive adoption.

Working code does.

Open reference implementations—wallets, credential exchanges, agent authorisation frameworks—can serve as anchors for the ecosystem. When multiple communities build on shared open-source infrastructure, fragmentation often resolves itself organically.

Developers gravitate toward tools that work. And once those tools gain momentum, standards tend to follow the architecture already in use.

The importance of cross-foundation collaboration

Another way to reduce fragmentation is simple: collaboration.

If the W3C defines core identity primitives, the OpenID Foundation could create authentication profiles for agents. The Decentralised Identity Foundation could build the supporting infrastructure. The Trust Over IP Foundation could establish governance frameworks that determine how trust is established between networks.

Also Read: Agentic AI in action: How Southeast Asia’s startups are turning constraints into strengths

This kind of layered collaboration mirrors how the internet itself evolved.

No single organisation built the web. Instead, a loose constellation of standards bodies, open-source communities, and industry alliances shaped its architecture over time.

Agent identity may require the same approach.

A new kind of digital identity

What makes the challenge especially complex is that agent identity is fundamentally different from human identity.

A human identity system answers questions like:

  • Who is this person?

Agent identity must answer additional questions:

  • Who authorised this agent?
  • What permissions does it have?
  • Who is accountable for its actions?

An AI agent booking a meeting might have minimal privileges. One managing supply chains or executing financial trades might have enormous authority.

Identity systems must therefore support delegation chains, where humans or organisations grant agents specific capabilities—and where those capabilities can be audited or revoked.
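To make the delegation idea concrete, here is a minimal sketch of a signed capability grant that verifies intact and fails verification when its capabilities are tampered with. The field names and the HMAC-based signing are illustrative assumptions for the sketch, not any existing DID or verifiable-credential standard (a real system would use asymmetric keys, not a shared secret):

```python
# Hypothetical sketch of a delegation grant for an AI agent.
# HMAC with a shared secret stands in for a real issuer's private key.
import hashlib
import hmac
import json

SECRET = b"issuer-signing-key"  # placeholder for a real signing key

def sign(grant: dict) -> dict:
    """Attach a signature over the canonicalised grant."""
    payload = json.dumps(grant, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {**grant, "sig": sig}

def verify(credential: dict) -> bool:
    """Recompute the signature and compare in constant time."""
    grant = {k: v for k, v in credential.items() if k != "sig"}
    payload = json.dumps(grant, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(credential.get("sig", ""), expected)

# A human delegates a narrow capability to an agent...
grant = sign({
    "issuer": "did:example:alice",      # who authorised the agent
    "subject": "did:example:agent-42",  # the agent itself
    "capabilities": ["calendar:book"],  # what it may do
    "expires": "2026-12-31T00:00:00Z",  # when the authority lapses
})
assert verify(grant)

# ...and any tampering with the granted capabilities breaks verification.
tampered = {**grant, "capabilities": ["funds:transfer"]}
assert not verify(tampered)
```

Auditing and revocation would sit on top of this: an auditor replays the chain of grants to see who authorised what, and revocation invalidates a grant before its expiry.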

This problem sits at the intersection of identity, authorisation, and governance.

And no single standards body currently owns all three.

Competitive convergence

If fragmentation sounds alarming, history suggests it may also be necessary. Innovation often begins with competing ideas. Over time, the ecosystem experiments, discards weak approaches, and converges around the solutions that prove scalable and secure.

The early internet did not begin with cleanly aligned standards. Neither did cloud computing, mobile ecosystems, or cryptocurrencies.

Agent identity may follow the same trajectory.

A period of experimentation—messy, decentralised, and occasionally incompatible—may ultimately produce stronger systems than a prematurely unified standard.

The infrastructure of the agentic economy

As AI agents begin acting autonomously across the digital economy, identity will become one of the most critical pieces of infrastructure.

Without reliable identity and delegation mechanisms, autonomous agents cannot safely interact with banks, governments, enterprises, or consumers.

But solving the problem will require more than a single protocol.

It will require an ecosystem — a layered architecture where multiple standards, technologies, and governance models can interoperate.

Fragmentation may be unavoidable.

The real question is whether the communities building agent identity today can ensure that their systems eventually connect.

If they do, the agentic internet could become as interoperable as the web itself.

If they do not, the next generation of AI agents may inherit a fragmented identity landscape just as complex—and contentious—as the early days of the internet.


The post The coming identity crisis of agentic AI appeared first on e27.

Posted on

What should states do about Meta-national platforms

For decades, governments have regulated companies as market participants. Tax them. License them. Fine them. Break them up if necessary.

But what happens when a company stops behaving like a firm — and starts functioning like infrastructure?

Not infrastructure in the traditional sense of roads or ports. Digital infrastructure.

Platforms that:

  • Clear cross-border payments.
  • Allocate capital at scale.
  • Coordinate labour across jurisdictions.
  • Optimise supply chains in real time.
  • Govern participation through code.

In a previous piece, we explored how fintech and AI infrastructure startups could evolve into “meta-national” platforms — systems that operate across borders, arbitrage jurisdictions, and become indispensable to economic coordination.

The question now is no longer hypothetical. It is strategic.

What should governments do about them?

Ignore them? Co-opt them? Compete with them? Constrain them?

The wrong answer could accelerate fragmentation. The right answer could reshape sovereignty for the digital age.

First: Recognise the shift from firms to systems

Governments often treat large platforms as “big companies.”

That’s outdated.

Some fintech and AI platforms are evolving into:

  • Monetary rails.
  • Identity layers.
  • Capital allocation engines.
  • Labour coordination networks.

When a platform becomes the default system through which millions earn, transact, and allocate capital, it is no longer just a private actor.

It becomes systemic infrastructure. And systemic infrastructure carries sovereign implications.

The first mistake governments make is assuming this is simply a competition issue.

It’s not.

It’s an institutional evolution.

Second: Avoid reflexive overregulation

The instinctive response to systemic platforms is control:

  • Heavy licensing regimes
  • Data localisation mandates
  • Strict capital restrictions
  • Forced domestic hosting

These measures may protect short-term policy control.

But they also create fragmentation.

When digital systems are forced into rigid territorial silos, two outcomes emerge:

  • Platforms are designed around the restrictions and relocated strategically
  • Domestic innovation falls behind global infrastructure layers

Overregulation may weaken state control rather than strengthen it.

Meta-national platforms thrive in regulatory arbitrage environments.

If a government makes participation too difficult, the platform does not disappear.

It simply routes around.

Third: Compete through performance, not prohibition

Digital platforms gain legitimacy through performance:

  • Faster settlement
  • Lower costs
  • Better allocation
  • Higher reliability

If citizens prefer digital currency rails over domestic banking systems, the problem is not merely regulatory. It is competitive.

Governments must ask:

Why are users choosing external platforms?

  • Is domestic banking too slow?
  • Are remittance costs too high?
  • Is SME credit inaccessible?
  • Is regulatory friction excessive?

The long-term solution is not prohibition.

It is upgrading domestic infrastructure.

Central bank digital currencies (CBDCs), instant payment systems, open banking frameworks — these are performance responses, not defensive reactions.

States that compete on efficiency retain legitimacy. States that rely solely on restrictions lose them.

Also Read: The first Meta-nation won’t be a country — and it might be built in Southeast Asia

Fourth: Engage platforms as strategic actors

As platforms scale, governments should shift from viewing them as adversaries to recognising them as stakeholders.

This does not mean surrendering authority. It means acknowledging mutual dependency.

Fintech platforms can:

  • Expand financial inclusion
  • Reduce remittance friction
  • Enhance capital access for SMEs
  • Improve transparency in economic flows

AI infrastructure platforms can:

  • Improve supply chain resilience
  • Enhance economic forecasting
  • Optimise public resource allocation

Rather than defaulting to hostility, governments should create structured engagement channels:

  • Regulatory sandboxes
  • Joint policy forums
  • Public-private coordination frameworks
  • Crisis response integration

The goal is not to capture. It is alignment.

Fifth: Preserve monetary sovereignty strategically

The greatest vulnerability meta-national platforms create is monetary.

If large segments of a population transact primarily in stable digital assets outside domestic banking systems, central banks lose:

  • Policy transmission tools
  • Visibility into capital flows
  • Control over liquidity conditions

Governments should respond in three ways:

  • Develop a credible digital currency infrastructure
  • Modernise domestic payment rails
  • Ensure interoperability with global systems

Total exclusion is unrealistic.

Interoperability preserves influence.

If domestic systems can plug into global digital infrastructure, states remain relevant in layered sovereignty rather than being sidelined by it.

Sixth: Protect identity without over-centralising it

Digital identity is the next frontier of sovereignty.

If platforms control identity verification and reputation scoring, they influence access to credit, employment, and participation.

Governments should:

  • Develop strong, portable digital identity frameworks
  • Enable API-based integration with private platforms
  • Ensure privacy standards are competitive globally

Over-centralised identity systems risk fragility.

Underdeveloped identity systems risk irrelevance.

The balance is delicate — but critical.

Seventh: Prepare for layered sovereignty

The 20th-century model assumed sovereignty was exclusive. You belonged to one nation-state. Period. The 21st-century model is layered.

An individual may simultaneously belong to:

  • A territorial state (passport)
  • A digital monetary network
  • An AI-driven labour marketplace
  • A cross-border capital ecosystem

Governments should not attempt to eliminate these layers.

They should design policies assuming coexistence.

Layered sovereignty does not automatically erode state authority.

It reshapes it.

States that adapt will remain central nodes. States that resist entirely may find themselves bypassed.

Also Read: You’re designing the wrong thing: Why SEA founders should focus on decision environments, not culture decks

Eighth: Avoid turning platforms into geopolitical weapons

In an era of US–China rivalry, digital infrastructure is increasingly politicised. Export controls. Sanctions. Data restrictions. Capital scrutiny.

But weaponising infrastructure has consequences. If digital platforms are perceived as extensions of geopolitical blocs, adoption narrows. Neutral platforms become more attractive.

This dynamic is especially important for Southeast Asia.

The region thrives on strategic balance. Governments here should resist binary alignment pressures that turn infrastructure into ideological tools. Neutrality enhances economic leverage.

Ninth: Build domestic champions — but don’t cage them

Many governments aim to build national champions in fintech and AI. That’s sensible.

But overprotection can backfire.

If domestic startups are shielded from global competition through restrictive barriers, they may fail to scale beyond home markets.

Meta-national platforms require:

  • Cross-border functionality
  • Regulatory sophistication
  • Global trust

Governments should:

  • Support outward expansion
  • Encourage global compliance capabilities
  • Invest in regional interoperability frameworks

Champion-building should focus on capability, not containment.

Tenth: Redefine sovereignty for the digital era

The deepest shift required is conceptual.

Sovereignty is no longer defined solely by territory.

It increasingly depends on:

  • Control over infrastructure layers
  • Influence over protocol standards
  • Participation in global coordination networks

Governments that cling to a purely territorial model will struggle.

Those that embrace infrastructure diplomacy — shaping standards, fostering interoperability, and partnering with platforms — will remain central.

The meta-national future does not eliminate states. It challenges them to evolve.

The strategic choice ahead

Meta-national platforms will not announce themselves.

They will scale quietly:

  • Through adoption in emerging markets.
  • Through integration with global SMEs.
  • Through developer ecosystems.
  • Through performance advantages.

By the time governments recognise them as sovereignty-adjacent actors, they may already be embedded in economic life.

The choice for governments is not whether to allow them. They are already emerging. The choice is whether to:

  • Fight them blindly,
  • Partner strategically,
  • Or upgrade state capacity to compete.

The most resilient governments will do all three — selectively.

Because the next decade will not be defined solely by great-power rivalry between states.

It will be defined by the rise of infrastructure actors that operate across them.

The most powerful economic system in your jurisdiction may not belong to your central bank.

It may run on code.

And how governments respond will determine whether sovereignty fractures — or adapts.


The post What should states do about Meta-national platforms appeared first on e27.


While stocks rally, gold hits US$4,780 and crypto correlation tells a hidden story

The crypto market’s modest 0.57 per cent gain over the last 24 hours, which brought total capitalisation to US$2.35T, tells a story far more nuanced than the headline suggests. The strength of the Ethereum ecosystem drove this movement, with the network outperforming the broader market by a significant margin. This divergence matters because it reveals where smart capital currently seeks refuge and growth. The 46 per cent correlation between crypto and gold further underscores a market positioning itself for inflationary pressures, even as traditional risk assets rally on geopolitical hopes. I see this not as contradictory behaviour but as a sophisticated reallocation in which digital assets serve dual roles: as vehicles for speculative growth and as emerging stores of value.

Ethereum’s outperformance stems primarily from an unexpected source: a major security incident on Solana. The Drift Protocol exploit, where an attacker extracted substantial value, triggered a fascinating capital rotation. The exploiter has since been swapping over US$270M in stolen Solana-based assets into ETH, creating tangible on-chain buying pressure. This dynamic illustrates Ethereum’s evolving role as the preferred settlement layer during periods of uncertainty across competing chains. Rather than fleeing crypto entirely, capital seeks the network with the deepest liquidity, most robust developer activity, and strongest institutional recognition. I interpret this as validation of Ethereum’s long-term thesis: security and decentralisation compound value over time, especially when alternatives face stress. The market rewards resilience, and Ethereum’s ability to absorb this inflow without significant slippage demonstrates the maturity of its infrastructure.

Beyond the hack-driven flows, broader sentiment around Ethereum is supported by credible institutional developments and clarity on the protocol roadmap. Franklin Templeton’s move to launch an institutional crypto division signals traditional finance deepening its commitment to digital asset infrastructure. This is not speculative noise but strategic positioning by a firm managing hundreds of billions. Simultaneously, Ethereum’s 2026 protocol upgrades, including Glamsterdam and Hegotá, provide a tangible catalyst for long-term holders. These upgrades promise meaningful improvements to scalability and user experience, addressing the very concerns that limit broader adoption. Meanwhile, speculative capital rotates into low-market-cap tokens like StakeStone and TrustSwap, which posted triple-digit gains. This risk-taking behaviour indicates healthy market appetite, though I caution that such moves often precede consolidation. The combination of institutional validation and retail speculation creates a supportive, if uneven, foundation for prices.

Also Read: The keys to your kingdom: Navigating crypto custody in 2026

From a technical perspective, Ethereum’s near-term trajectory hinges on its ability to reclaim the US$2,400-US$2,600 resistance zone. A confirmed close above the 50-day exponential moving average would signal strengthening momentum, potentially opening a path toward US$3,000. Immediate support rests near US$2,200, a level bulls must defend to maintain the current structure. I watch these levels closely because they reflect not just chart patterns but the collective psychology of market participants. The situation remains fluid pending further details on the Drift Protocol exploit. Any new information could alter the flow dynamics currently supporting ETH. Protocol upgrades also warrant attention: successful testnet deployments and clear timelines would reinforce confidence, while delays might trigger profit-taking. Technical analysis in crypto never operates in isolation; it intersects with on-chain data, macro sentiment, and narrative shifts.
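The 50-day exponential moving average referenced above can be sketched as follows. The smoothing factor alpha = 2 / (span + 1) and seeding with the first price are common conventions; the closing prices here are synthetic illustration data, not actual ETH closes.

```python
# Sketch: a 50-day exponential moving average (EMA) over daily closes.
# Prices below are hypothetical, for illustration only.

def ema(prices, span=50):
    """Return the EMA series using alpha = 2 / (span + 1),
    seeded with the first observed price."""
    alpha = 2 / (span + 1)
    out = [prices[0]]
    for p in prices[1:]:
        # Each new EMA value blends today's close with yesterday's EMA.
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

closes = [2200 + i * 5 for i in range(60)]  # hypothetical rising closes
ema50 = ema(closes, span=50)
print(round(ema50[-1], 2))  # latest smoothed value
```

A confirmed close above this smoothed line is what the paragraph above treats as a momentum signal; because the EMA weights recent prices more heavily than a simple average, it reacts faster to a genuine trend change.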

This crypto market movement unfolds against the backdrop of a rallying global risk-asset market. On 2 April 2026, major indices posted gains as de-escalating tensions in the Middle East reduced the geopolitical risk premium. The S&P 500 closed at 6,575.32, up 0.72 per cent, while the Nasdaq Composite gained 1.16 per cent to 21,840.95, led by technology stocks. The Dow Jones Industrial Average rose 0.48 per cent to 46,565.74. Crude oil prices pulled back, with Brent futures falling 1.15 per cent to US$100.00 per barrel and WTI slipping to US$98.71 per barrel, as investors anticipated reduced risk of supply disruptions. Treasury yields edged higher, with the 10-year note yielding 4.33 per cent, reflecting capital rotation from safe-haven bonds into equities. Asian markets surged, notably South Korea’s KOSPI, which jumped 8.4 per cent. This global risk-on sentiment typically supports crypto, and Bitcoin held relatively steady near US$68,103, suggesting digital assets currently follow idiosyncratic drivers more than broad equity beta.

Gold’s strength amid this risk-on environment deserves particular attention. Spot gold rose to approximately US$4,780.40 per ounce despite de-escalation headlines, indicating persistent demand for inflation hedges. The 46 per cent correlation between crypto and gold suggests a segment of the market treats digital assets as complementary to precious metals in portfolio construction. I find this convergence logical: both assets offer alternatives to fiat currency systems, though through different mechanisms. Gold provides physical scarcity and historical precedent; crypto offers programmable scarcity and network utility. When investors allocate to both, they express a nuanced view: scepticism about long-term fiat stability coupled with confidence in technological innovation. This dual positioning explains why crypto can rise alongside traditional risk assets while maintaining a hedge-like correlation with gold.
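A correlation figure like the 46 per cent quoted above is typically a Pearson coefficient over paired daily returns. The sketch below shows the computation; the return series are synthetic placeholders, not the article's underlying data, which would in practice be daily returns of a crypto index and spot gold over some lookback window.

```python
# Sketch: Pearson correlation between two daily return series.
# The sample data is synthetic, for illustration only.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

crypto_returns = [0.012, -0.004, 0.008, 0.015, -0.010, 0.006]
gold_returns = [0.004, -0.001, 0.003, 0.006, -0.002, 0.001]
print(round(pearson(crypto_returns, gold_returns), 2))
```

A value near +1 means the two assets move together, near 0 means no linear relationship; a mid-range positive reading like 0.46 is what supports the "complementary hedge" interpretation rather than lockstep movement.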

Also Read: Breaking: US Labour Department opens door to crypto in 401(k) plans, market jumps 1.86%

The current market structure rewards selective participation. Broad index exposure may underperform focused positions in ecosystems demonstrating clear catalysts and resilient infrastructure. Ethereum’s dual role as a technological platform and a liquidity sink during cross-chain stress events positions it uniquely. I caution against overextrapolating short-term flows: the US$270M in exploited assets represents a transient catalyst, not a fundamental revaluation. Lasting gains require sustained developer activity, user adoption, and regulatory clarity. The convergence of institutional interest, protocol innovation, and macro hedging demand creates a compelling setup, but execution risk remains. I advocate for disciplined position sizing and continuous monitoring of on-chain metrics alongside traditional technical levels.

In this complex environment, my perspective emphasises independent analysis over narrative conformity. The market’s modest gain masks significant underlying dynamics: capital rotation among chains, shifts in institutional strategy, and macro hedging behaviour. These forces interact in ways that simple headlines cannot capture. I believe the next phase of crypto market development will reward those who understand network fundamentals, liquidity dynamics, and macro correlations simultaneously. 


The post While stocks rally, gold hits US$4,780 and crypto correlation tells a hidden story appeared first on e27.


Circulate Capital’s US$220M fund targets Asia’s recycling gap

Singapore-based Circulate Capital has raised US$220 million in the first close of its second Asia-focused fund, a vote of confidence for circular economy investing at a time when much of the market’s attention and money has been swallowed by artificial intelligence (AI).

The new vehicle, Circulate Capital Asia II, has already reached more than 70 per cent of its US$300 million target, surpassing the firm’s first fund, which closed at US$188 million. The capital will be deployed into recycling and circular supply chain businesses across India, Indonesia, Thailand, Vietnam, the Philippines, and Malaysia, with a focus on plastics and packaging, as well as electronics and apparel.

Also Read: Circulate Capital makes final close of US$76M fund to advance circular economy for plastics

The investor line-up comprises strategic corporates, development finance institutions, pension-linked capital, and family offices. Returning backers include The Coca-Cola Company, Danone, Dow, Procter & Gamble, British International Investment, Proparco, IFC and Builders Vision, while new investors include EMCAF, Impact Fund Denmark, SIFEM and Australian Development Investments.

Why circular economy investing is growing

The broader investment case for circularity is getting harder to dismiss. Globally, companies are facing volatile raw material costs, supply chain disruptions, tighter environmental regulations, and rising pressure from customers and consumer brands to reduce waste. At the same time, the world still extracts more than 100 billion tonnes of raw materials each year while remaining only 7.2 per cent circular.

Asia is central to that story. The region combines rapid consumption growth with weak waste management systems and a manufacturing base that increasingly needs reliable, locally sourced recycled materials.

For investors, this creates a more direct commercial opportunity than the old sustainability pitch. Recycling, recovery, and reuse are no longer just about impact reports; they are about securing feedstock, reducing import dependence, and building more resilient supply chains.

Plastics remain the biggest entry point, but the market is widening. Investors are looking not only at mature recycling streams, such as PET, but also at harder-to-process materials, including polyolefins, flexible packaging, textiles, batteries and electronic waste.

Is AI crowding out circular economy funding?

In short, yes, but not completely.

AI is dominating global venture and growth funding, which makes life harder for circular-economy startups and infrastructure plays, especially in Asia, where many investors still prefer software-led models with faster scaling potential. Circularity businesses are usually more capital-intensive, more operationally messy and slower to mature. That is not exactly catnip for momentum-driven investors.

But the sector is not entirely competing for the same pools of money. Much of Circulate Capital’s backing comes from corporates, development finance institutions, and impact-oriented investors with longer time horizons and strategic reasons to be in the market. For them, circular economy investing is less about chasing the next valuation spike and more about addressing supply chain risks, regulatory exposure, and material scarcity.

That distinction may help the sector keep growing even while AI hoovers up headlines.

Asia’s plastic problem is still severe

If anything, the region’s waste crisis remains underfinanced relative to its scale.

South and Southeast Asia generate vast volumes of plastic waste, while collection, sorting, and recycling systems often lag far behind demand. Low-value and flexible plastics remain especially difficult to recover at scale, and leakage into waterways and coastlines continues to be one of the region’s defining environmental failures.

Also Read: Circulate Capital joins bio-based plastic developer Algenesis’s US$5M seed round

Investors are paying more attention than they were a few years ago, but not enough to match the problem. Circulate Capital estimates that plastics alone represent a US$100 billion cumulative investment opportunity in collection and recycling infrastructure by 2030. That figure underlines the gap between what is needed and what has actually been deployed.

Where the market is heading

The next phase of circular economy investing in Asia is likely to move beyond straightforward bottle recycling into more complex areas: flexible plastics, alternative packaging, textile recovery, battery recycling, and the recovery of rare and critical materials from electronics.

That shift is important because the easiest opportunities have already been identified. The future will depend on whether investors can back businesses that not only process waste but also build dependable circular supply chains around it. The winners are likely to be firms that can supply recycled inputs to major manufacturers and consumer brands on an industrial scale.

Rob Kaplan, founder and CEO of Circulate Capital, said the firm’s track record shows the circular economy is “a sophisticated asset class that can deliver liquidity to private equity investors”.

Circulate Capital’s record so far

Circulate Capital said it has completed more circular economy deals in Asia than any other manager. However, it did not disclose the exact number of deals it has made since launch.

It pointed instead to exits as proof that the model can produce returns. Fund I has fully exited Indian digital waste management platform Recykal, and partially exited Lucro, a recycler focused on hard-to-manage flexible plastics, and Srichakra Polyplast, described as India’s first food-grade plastic recycler.

Since 2020, the firm says its Asia portfolio has added nearly 900,000 tonnes of annual collection and recycling capacity. Fund II aims to finance nearly two million tonnes more.

The bigger takeaway is that circular economy investing in Asia is no longer a fringe climate theme. It is slowly becoming an industrial and supply chain play. AI may still be the market’s favourite shiny object, but waste, unlike hype cycles, has a habit of sticking around.

The post Circulate Capital’s US$220M fund targets Asia’s recycling gap appeared first on e27.