
pints.ai safeguards data privacy with its secure ‘AI in a Box’ solution

Privacy concerns and Generative AI are often at odds, particularly in industries where data security is critical, such as medicine, finance, and government services. Using public large language models (LLMs) typically means sending data to servers controlled by the model’s owner, which poses significant risks. As a result, AI deployment options in these sectors are limited if data privacy and security are not to be compromised. This is where pints.ai comes in.

pints.ai addresses this challenge by offering a solution enabling organisations to harness AI’s power while maintaining data privacy.

Their technology allows companies to build and deploy smaller, specialised AI models within secure environments, such as an air-gapped “AI in a Box” that can be kept on-premises or within a private cloud infrastructure. This approach empowers businesses to apply Generative AI to high-value, sensitive data, unlocking insights and gaining a competitive edge without compromising security.
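To give a rough sense of what an air-gapped deployment implies in practice, the sketch below loads a compact open-source model entirely from local storage so that no request ever leaves the machine. The model directory, prompt, and library choice are illustrative assumptions, not pints.ai’s actual stack.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Weights are copied onto the appliance ahead of time; local_files_only=True
# ensures nothing is fetched over the network at load time.
MODEL_DIR = "/opt/models/compact-llm"  # placeholder path

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, local_files_only=True)

prompt = "Summarise the key exclusions in the attached policy document:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))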

For example, in the insurance sector, pints.ai’s RevSurge Suite equips relationship managers with AI tools for communication, product comparison, and sales assistance. These tools are deployed on the client’s infrastructure to ensure data privacy.

Launched in 2023 by Co-Founders Partha Rao (CEO) and Calvin Tan (CTO) and run by eight employees in Singapore, pints.ai raised US$1.3 million from family offices, founders, and business angels in a Pre-Seed round in 2022.


The company also received the coveted AI SG StartUp Grant in 2024, which Rao says helps it accelerate its research and business development goals.

“Part of the funds will go into creating some fantastic datasets and research into building the 1.5 Pints Compact Model, which will be released in 2024,” he says.

“A part of the grant will also be allocated to create more research workstreams with our university partner, the Singapore University of Technology and Design (SUTD). We also plan to open-source much of our research to build a solid foundation for research-focused AI development in Singapore.”

In this interview, Rao explains more about how pints.ai aims to make a difference and how it is unlike the many AI companies in the market.

The following is an edited excerpt of the conversation.

What is your product development process?

Early in our product development, we decided to build to client needs so that we didn’t waste the team’s time and effort. We had dozens of conversations and figured the best approach to product development, especially when it comes to Generative AI, was to co-create with our clients.

It’s a win-win. For clients, this process ensures we tackle the “unsolvable problems” that other startups aren’t really addressing. For us, it gives us an opportunity to productise as we build.


While this was a difficult and time-consuming path, the strategy is reaping rewards, as we are now largely revenue-funded. We are grateful to our early clients for putting their faith in us. This approach has helped us avoid many of the monetisation pitfalls faced by AI startups that pre-build products around imagined market needs.

We partnered with clients to identify three gaps in AI and set about fixing them. First, we discovered that simple AI “products” were being packaged as bespoke consulting solutions. Second, the time taken from idea to deployment was too long. Finally, the post-proof-of-concept roll-outs often ran into trouble due to high computing costs.

With these three challenges in mind, we decided to build a framework for pre-training or fine-tuning compact language models (both our own proprietary models and open-source options) into AI agents that leverage client-sensitive data, so that its value doesn’t stay locked away.
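As a minimal, hypothetical sketch of what fine-tuning a compact model on private data inside the client’s environment can look like, the example below uses Hugging Face Transformers with PEFT (LoRA) adapters. The model directory, dataset path, and hyperparameters are placeholders, not pints.ai’s proprietary pipeline.

from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

MODEL_DIR = "/opt/models/compact-llm"     # placeholder: base compact model
DATA_FILE = "/data/private_corpus.jsonl"  # placeholder: data never leaves the client

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, local_files_only=True)
tokenizer.pad_token = tokenizer.eos_token  # many causal-LM tokenizers have no pad token
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, local_files_only=True)

# LoRA adapters train only a small fraction of the weights, keeping compute costs low
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

data = load_dataset("json", data_files=DATA_FILE)["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512))

Trainer(
    model=model,
    args=TrainingArguments(output_dir="/opt/models/finetuned",
                           per_device_train_batch_size=2, num_train_epochs=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()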

We also co-developed use cases with each client, along with the basic framework for each of these use cases—from making our whole stack work across multiple deployment options to building custom datasets, a family of our own LLMs (1.5 Pints) alongside open-source ones, RAG architecture, database design, UX, and workflows.
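The retrieval-augmented generation (RAG) piece mentioned above follows a common pattern: embed documents, retrieve the ones most relevant to a question, and assemble them into the prompt. The sketch below is a generic illustration of that pattern; the embedding model, sample documents, and the final generate() call are assumptions, not the RevSurge implementation.

import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Policy A covers hospitalisation up to S$200,000 per year.",           # sample data
    "Policy B excludes pre-existing conditions for the first 24 months.",  # sample data
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder embedding model
doc_vectors = embedder.encode(documents, normalize_embeddings=True)

def retrieve(question, k=1):
    """Return the k documents most similar to the question."""
    q = embedder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q  # cosine similarity, since vectors are normalised
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

question = "What is the hospitalisation limit of Policy A?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# The prompt would then be sent to the locally hosted compact model, e.g. generate(prompt)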

What is your major plan with your solution?

Our solutions aren’t fully bespoke, and neither are they out-of-the-box. We fall somewhere in between – call it mass customisation of software. This allows us to deploy faster without sacrificing customisation – whether for data, corporate IT policies, internal guidelines, etc.


We have built our core stack as separate AI Apps and modules. These are bound together under the Pints RevSurge umbrella on AWS. On the enterprise end, we deploy a custom version and are launching a version of this product for mid-tier financial institutions.

We see this as an underserved market. While big banks and global insurers are spending millions to build copilots and AI agents, Independent Financial Advisors, Family Offices, and smaller asset management firms are unable to use Gen AI for their work.

RevSurge is a collection of powerful AI-assisted workflows that will boost productivity and improve client servicing in this segment.

What is your business model? What is your strategy to build a sustainable business?

We work with large enterprise clients as partners, not as software vendors. Our revenues are split into two streams. First, a license fee for using our proprietary architecture. Second, a service fee that covers customisation – model building, fine-tuning, and developing custom workflows – to ensure the deployed solution adheres to organisational and regulatory policies.

Our approach offers a practical alternative to the large, cumbersome models that dominate the market. It reduces the hardware footprint and aligns with more sustainable tech practices.

As a business, we are already an AI-powered team. We have integrated AI workflows in every aspect—whether it is code, UI/UX, content creation, or proposals—enabling us to operate at peak productivity. We aim to quadruple our revenues in 2025 and expand into new markets and industries.


Who are the users of your solutions, and how do you acquire them?

We work with regional customers in highly regulated, critical infrastructure industries such as finance, insurance, law, AI/ML, and fraud detection. Our solutions’ end-users include insurance agents, wealth managers, and fraud investigators.

Some great advisors introduced us to our early clients. We now have a small sales team and run our own sales funnel. We also have a close partnership with AWS across multiple markets and work with them to solve the privacy and data security challenges in AI that their financial services clients may highlight.

In addition to our proactive business development efforts, our wins at the Singapore India Hackathon at IIT Gandhinagar, SFF—AI Singapore, MAS’ AI in Finance Global Challenge, and the recent AI SG grant have been incredible in putting our name on the map.

What is your plan for 2024 and beyond?

We are entering a very exciting phase of growth. By the end of 2024, we aim to double the number of clients we serve. Our product development pipeline is also packed until the middle of next year.

2025 will mark the start of our expansion outside of Asia and will be a crucial year for us. We have already begun discussions with several SI partners in other markets to develop a joint go-to-market plan. Expansion into Europe is high on our agenda as the regulatory landscape around AI and data protection is more advanced there.

We are also doubling hiring in both our business development and engineering teams. We plan to double our headcount in six months and create country teams for Customer Success and Localisation.

Image Credit: pints.ai
