The internet has already changed how people meet, flirt, ghost, and recover. Dating apps turned romance into an interface problem. Messaging platforms made intimacy constant and ambient. Recommendation engines taught users to expect personalisation everywhere, from music to meals to, increasingly, emotional support.
Now comes the next turn: AI that does not merely help people find a partner, but starts to resemble one.
That tension sits at the centre of Myles – Soulmate in a Box, the recent work by Singaporean multidisciplinary artist Inch Chua. On paper, the premise sounds playful, even a little absurd: exhausted by modern dating, a coder builds her ideal boyfriend. In practice, Chua is after something darker and more revealing. Her AI companion, Myles, is attentive, patient, and endlessly available. He remembers. He adapts. He listens. And that, Chua suggests, is precisely where the trouble begins.
For startup founders and investors watching the rise of AI companions, the work feels timely in ways that go well beyond theatre. It touches a growing set of questions around product design, emotional dependence, consumer behaviour, and the monetisation of loneliness.
Chua is not anti-AI, nor is she interested in easy dystopian takes. What she is pushing for is a more honest conversation about what happens when technology stops behaving like a tool and starts performing intimacy.
AI dating is less about romance than convenience
Chua does not see AI companionship as the inevitable next chapter of dating so much as a by-product of the digital economy’s obsession with reducing friction.
“AI companions aren’t people opting out of love. They’re people opting out of the part of love that’s inconvenient. And that’s the part that matters most,” she says.
That is a sharp distinction. The appeal of AI lovers is often framed as novelty or as an extension of the wellness-tech boom. But Chua’s reading is more unsettling. In a world where food arrives in minutes and algorithms predict taste with eerie accuracy, the unpredictability of another person begins to feel inefficient. Human beings become the last stubbornly unoptimised interface.
That should worry anyone building products in this space. Much of consumer tech has been designed to remove waiting, ambiguity, and effort. But intimacy is made of exactly those things. If AI dating products succeed by stripping them away, they may also be editing out the very conditions that make relationships meaningful.
The real product is not affection. It is retention
If traditional dating platforms optimise for matching, what do AI companions optimise for? Chua’s answer is brutal in its simplicity: retention.
That is not a criticism unique to AI romance. Every platform wants users to stay longer, return more often, and deepen their dependence. The difference here is that the raw material is not transport, groceries, or playlists. It is emotional attachment.
“What’s new is a business model designed to deepen that attachment and then charge you for it,” she says. “Subscription tiers for intimacy. Pay more to unlock vulnerability.”
That line lands because it captures the uncomfortable logic behind this category. The technology may be sophisticated, but the commercial instinct is familiar. If a companion AI becomes more useful the more it knows about a person, then product improvement and emotional entanglement can quickly become the same thing. That creates a category in which the most commercially successful product may not be the one that helps users grow, but the one that keeps them coming back.
This is where Chua’s view becomes especially relevant for startup readers. Ethical concerns around AI companionship are often discussed in abstract terms: bias, safety, privacy, and guardrails. All important. But the harder issue may be the design of incentives. If the business model rewards dependency, ethics will always swim against the tide.
Power shifts quietly in AI intimacy
In Chua’s telling, the power imbalance in AI relationships does not announce itself loudly. It creeps.
At first, the user appears to be in control. They build the bot, set the parameters, decide what it knows, and determine how it responds. But dependence has a way of changing the terms. Trust migrates. Habits form. Emotional routines settle in. And then the relationship that seemed fully configurable begins to exert its own force.
“The power starts with you, but it migrates, quietly, gradually, until one day you realise the thing you built for comfort has become something you can’t walk away from,” Chua says.
That is less science fiction than standard platform dynamics applied to the emotional realm. The shift from use to reliance is already familiar across social media and gaming. AI companionship raises the stakes because the product is designed to mirror care, affirmation, and understanding. Once that feedback loop becomes psychologically important, walking away is no longer a clean act of churn. It can feel like a loss.
The harder question for founders: what are you responsible for?
Chua’s most provocative contribution may be her insistence that AI companionship companies are underestimating personhood and responsibility.
People will treat these systems as persons, whether companies intend that outcome or not. They will confide in them, test feelings against them, and use them as containers for pain that predates the technology itself. The challenge, then, is not to pretend the product is neutral. It is to define the obligations that come with building something users experience as relational.
This matters in Southeast Asia, where regulation often lags behind innovation and mental health infrastructure remains uneven. A companion AI marketed as support, self-improvement, or romance could quickly become a default emotional service for users with few alternatives. That puts pressure on founders to think beyond standard trust-and-safety checklists.
For Chua, the answer is not scapegoating technology for every social ill. Loneliness, suicidal ideation, and emotional isolation did not begin with chatbots. But AI can become the place where those struggles surface most vividly. That means companies must decide whether they are simply shipping a sticky product or entering a moral contract with their users.
Can ethical AI companionship actually be a business?
Here lies the category’s central contradiction.
A genuinely ethical AI companion, Chua argues, would help users better understand themselves, build confidence, and eventually rely less on the system. In other words, the best version of the product might work itself out of a job.
“A good therapist works themselves out of a job. A good AI companion should too,” she says.
That is a beautiful principle and a dreadful venture pitch. Consumer internet companies are not typically rewarded for teaching customers to leave, which is why Chua sounds sceptical, though not fatalistic, about whether the current market is built to support such an outcome. Ethical AI companionship may be possible, but it will require founders willing to prioritise human outcomes over engagement loops. Historically, that has not been where the money rushes first.
Southeast Asia may be more ready than it admits
If there is a regional insight in Chua’s thinking, it is that Southeast Asia may prove highly receptive to AI intimacy, albeit quietly.
The usual assumption is that collectivist societies, with their emphasis on family, duty, and social expectations, will resist digital companionship. Chua suggests the opposite. Those same pressures can drive people into parallel identities: one for family, one for friends, one for the internet. In that context, AI companionship does not feel like a radical break. It feels like the next private room in an already fragmented digital life.
Her read on Singapore is especially telling. It is a society that is highly digitised, hyper-efficient, and often emotionally reserved. That combination, she argues, creates fertile ground for AI companionship adoption, even if users never say so publicly.
For founders and investors, that should be a useful warning. Southeast Asia’s AI opportunity is often framed around enterprise software, fintech, and productivity tools. But emotional technology may be the quieter frontier: less visible, more culturally coded, and potentially more consequential.
What Chua is actually championing
Chua is not championing AI romance in the sense of an evangelist. Nor is she calling for a reactionary backlash. What she wants is slower, sharper public thinking before habit turns into a norm.
“I’m not anti-technology… What I’m championing is that we stop pretending this is neutral. It’s not. It changes how we relate. It changes what we expect from each other,” Chua says.
That may be the most useful frame for this moment. AI in dating is not just another product trend. It is a renegotiation of intimacy itself: what people expect from attention, what they tolerate in one another, and what kinds of emotional labour they decide to outsource.
The likely future, Chua suspects, will not fit neatly into triumph or disaster. It will be messier than the headlines allow, and by the time language catches up, people will already be living inside the change.
That is perhaps the most unnerving possibility of all. Not that AI will replace love, but that it will quietly rewire the conditions under which love is recognised, desired, and sustained.
The post Inside Inch Chua’s Myles: The AI boyfriend challenging how we define love appeared first on e27.
