
I was trained as a general practitioner. My early career began in rural clinics, working face-to-face with patients who often arrived late into illness, and sometimes too late for care. Some walked for hours just to reach the nearest facility. Many couldn’t afford the medicine. A few never came at all, out of fear, stigma, or the belief that nothing would change anyway.
Practising in those conditions changed how I understood health systems. The pathways that lead people to care, or keep them away from it entirely, run far deeper than treatments or interventions. Distance, cost, shame, and bureaucracy are systemic and technical obstacles, not human ones (and certainly not something physicians can remove with a snap of their fingers). These obstacles exist long before diagnosis or treatment.
That experience stayed with me. It led me to ask harder questions about how health should work. It also led me into innovation, initially in small ways. I started testing different approaches: patient vital-sign self-check-ins, mobile consults, early screening tools, digital workflows that could reduce drop-offs in care, even insurance products based on real cost and claims data (within our own beta population). But it didn’t take long to realise that the moment you try something new in healthcare, you run into friction.
That friction doesn’t always come directly from users, whether physicians, health administrators, or patients themselves. Often it’s policy. Sometimes it’s culture, and sometimes it’s simply inertia. It becomes difficult to move fast when every step is bound by systems designed to minimise risk. That instinct makes sense in a clinical context, but it creates resistance when we’re trying to redesign the system itself.
At this stage, finding healthcare facilities or payors with a sandbox for innovation, places where we could gather user feedback and commence trials, was a lifesaver for an early-stage company like mine. We started integrating with other services to strengthen their reach, building mutual channel partnerships. We saw how technology, when placed carefully, could expand care without increasing pressure on already overburdened systems. We focused on design that removed barriers for both patients and providers.
Now, I work more deeply with AI in healthcare. I see the same patterns re-emerging. We talk about scribing, supply chain coordination, clinical decision support, Software as a Medical Device (SaMD), and even risk modelling for population health. Each use case offers clear advantages. Yet the resistance often comes before the discussion starts.
People worry about safety, scope, ownership, ethical review, and clinical validity. These concerns matter. But what I’ve observed is that this resistance is no stronger than the pushback we’ve seen before: new drugs, supplements, wearables, even robotic surgery once faced the same level of resistance, and some even greater scrutiny. Every medical innovation in history has gone through it, whether it was antiseptics, laparoscopic surgery, or digital health records. Change is often uncomfortable. But it is never new.
So what’s the real challenge?
Instead of calling it a blocker, I think we need to shift the frame. The misconception is that value alone drives innovation. Only after innovating do you understand that adoption is really about regulation, workforce, education, procurement, reimbursement, and behaviour. Medical innovation becomes normalised when the whole ecosystem is ready to hold it, aligned across multiple levels of influence, not through a single overnight breakthrough.
I’ve seen AI pilots fail not because of the models, but because the workflows couldn’t adapt to the day-to-day operations our healthcare workers face. I’ve seen great tools ignored because they didn’t match how clinicians document cases. I’ve seen hospitals decline adoption because IT budgets weren’t structured to handle long-term updates or retraining. These are signals that we need better integration strategies and regulatory pathways, just as we do for any new drug on the market.
Healthcare is complex because it should be. We are dealing with lives. We are dealing with trust. But complexity shouldn’t stop us from building. It should shape how we build.
In Southeast Asia, the opportunities are real. We have gaps that technology can help close. The transformation starts with people who understand those gaps and are willing to build bridges. It starts with small, focused systems that can grow and scale. It starts with conversations that go beyond hype and address what readiness actually looks like. Once we understand that, building products becomes solving problems at a deeper level.
My path began in rural clinics. I now build for broader systems. The problems have changed shape, but the mission remains the same. Make care more reachable. Make care more trusted. Make care feel possible.
If we want AI in healthcare to succeed, we need to stop waiting for the perfect pilot. We need to understand what adoption truly takes. We need to stop labelling every pause as resistance, and start seeing it as part of a wider transformation journey. Every advancement in medicine required coordination. This one is no different.
—
The post A doctor’s journey through rural practice, healthcare economics, innovation, and ecosystem appeared first on e27.
