Retool started out as a platform for building line-of-business apps, but over the last few years, the well-funded startup has added a number of back-end services as well, including, most recently, a workflow automation service. Today, it’s launching a set of new tools to help its users build AI-based apps, including a hosted vector store that will allow them to more easily add context to large language models (LLMs).
As Retool CEO and co-founder David Hsu told me, a lot of his customers are already looking at how to use AI in their apps, but for most enterprises, the value of these tools lies in being able to reason over their internal data. They can copy and paste that data into a prompt to add context, but that approach is limited and can quickly become rather costly. While few businesses have the resources to train their own models, they could likely fine-tune existing models with a reasonable amount of data. Hsu, however, argues that fine-tuning a model on all of a company’s production data isn’t really feasible either, and that the data would soon be out of date.
Currently, the state of the art for bringing custom data to LLMs is to vectorize it: the data is converted into embeddings so the most relevant pieces can be retrieved and handed to the model at query time. That’s why the likes of Google, Microsoft, DataStax and MongoDB have all launched vector search services in recent months.
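For the uninitiated, the pattern is simple enough: text gets split into chunks, each chunk is turned into an embedding vector, and at query time the system looks up the chunks closest to the user’s question. Here is a minimal sketch using OpenAI’s Python SDK (the embedding model is an illustrative choice; Retool hasn’t said which one it uses):

```python
# Sketch: turn text chunks into embedding vectors and find the chunk
# closest to a user question. Assumes OPENAI_API_KEY is set; the model
# name is an illustrative choice, not necessarily what Retool uses.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

chunks = [
    "Refunds are processed within 5 business days.",
    "Enterprise plans include SSO and audit logs.",
]
chunk_vecs = embed(chunks)

question = "How long do refunds take?"
q_vec = embed([question])[0]

# Cosine similarity between the question and every chunk.
scores = chunk_vecs @ q_vec / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec))
best = chunks[int(scores.argmax())]
print(best)  # the most relevant chunk, ready to be added to a prompt
```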
In Hsu’s view, there isn’t a lot that differentiates these offerings; they are all pretty similar in their capabilities. “I think the problem actually is not which vector database you choose — whether it’s MongoDB or whatever else. Instead, it’s how do you actually get data into this vectorized database and how do you keep it up to date. How do you sync it with Salesforce, for example, so you could ask your LLM questions and actually get it to pull in fresh data from Salesforce.” That, he argues, is where Retool’s customers face the biggest obstacles in building custom AI applications for their business use cases right now. It’s no surprise, then, that the company is launching Retool Vectors today, a hosted vector storage service built on the open-source pgvector extension for Postgres.
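Retool hasn’t published the schema behind Vectors, but pgvector itself works roughly like this: embeddings live in an ordinary Postgres column and similarity search is just an ORDER BY. A generic sketch (table and column names are made up for illustration):

```python
# Generic pgvector sketch: store embeddings in Postgres and query by
# cosine distance. Table and column names are illustrative, not Retool's.
import numpy as np
import psycopg
from pgvector.psycopg import register_vector

conn = psycopg.connect("dbname=app", autocommit=True)
conn.execute("CREATE EXTENSION IF NOT EXISTS vector")
register_vector(conn)

conn.execute("""
    CREATE TABLE IF NOT EXISTS doc_chunks (
        id        bigserial PRIMARY KEY,
        source    text,                -- e.g. 'salesforce', 'support'
        content   text,
        embedding vector(1536)         -- dimension must match the embedding model
    )
""")

# Insert a chunk along with its embedding (computed elsewhere);
# np.zeros is just a placeholder for a real embedding vector.
conn.execute(
    "INSERT INTO doc_chunks (source, content, embedding) VALUES (%s, %s, %s)",
    ("support", "Refunds are processed within 5 business days.", np.zeros(1536)),
)

# Retrieve the five chunks nearest to a query embedding (<=> is cosine distance).
query_vec = np.zeros(1536)  # placeholder for the embedded user question
rows = conn.execute(
    "SELECT content FROM doc_chunks ORDER BY embedding <=> %s LIMIT 5",
    (query_vec,),
).fetchall()
```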
Internally, Hsu explained, Retool had tested Intercom’s GPT-powered AI chatbot to handle some customer service interactions. That bot, which already had access to a lot of business context, was able to close about 20% of tickets. But a custom bot that coupled an LLM to a vector database containing all of Retool’s Salesforce data, support data and more pushed the close rate up to almost 60%. The company also vectorized all of its sales-call transcripts and then put OpenAI’s API on top of that data to query it.
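Retool didn’t detail the bot’s internals, but coupling an LLM to a vector database generally means retrieving the most relevant chunks and handing them to the model as context. A rough sketch of that final step (model name and prompt wording are illustrative):

```python
# Sketch of the retrieval-augmented step: take the most relevant chunks
# (e.g. support docs or call transcripts) and let the model answer with
# that context. Model name and prompt wording are illustrative.
from openai import OpenAI

client = OpenAI()

def answer(question: str, retrieved_chunks: list[str]) -> str:
    context = "\n\n".join(retrieved_chunks)
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```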
One nifty feature here is that Retool also uses its recently launched Workflows service to keep a business’s production database and the vector store in sync, ensuring that the models always have access to up-to-date information.
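Retool does this with Workflows; conceptually, it boils down to a scheduled job that re-embeds whatever changed since the last run. A sketch of the idea (the Salesforce fetch is a hypothetical stand-in, and embed() refers to the helper from the embedding sketch above):

```python
# Sketch of a scheduled sync job: fetch records updated since the last run,
# re-embed them and upsert them into the vector table from the earlier sketch.
# fetch_updated_rows() is a hypothetical stand-in for a real Salesforce
# integration; embed() is the helper from the embedding sketch above.
from datetime import datetime, timedelta, timezone

def fetch_updated_rows(source: str, since: datetime) -> list[dict]:
    """Hypothetical helper: pull records changed since `since` from e.g. Salesforce."""
    raise NotImplementedError

def sync_since(conn, last_run: datetime) -> None:
    for row in fetch_updated_rows("salesforce", since=last_run):
        vec = embed([row["text"]])[0]
        conn.execute(
            """
            INSERT INTO doc_chunks (id, source, content, embedding)
            VALUES (%s, %s, %s, %s)
            ON CONFLICT (id) DO UPDATE
              SET content = EXCLUDED.content, embedding = EXCLUDED.embedding
            """,
            (row["id"], "salesforce", row["text"], vec),
        )

# Run on a schedule, e.g. hourly:
# sync_since(conn, datetime.now(timezone.utc) - timedelta(hours=1))
```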
In addition to the vector storage service, Retool also launched a number of AI-based actions for common use cases like text summarization and classification, image generation and more. Retool partnered with OpenAI for these features, which also integrate with Retool Workflows.
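Retool hasn’t published the exact shape of these actions, but under the hood they map to familiar OpenAI calls, roughly along these lines (model name and prompt wording are illustrative):

```python
# Rough equivalents of "summarize" and "classify" actions as plain OpenAI
# chat calls. Model name and prompt wording are illustrative guesses, not
# Retool's actual implementation.
from openai import OpenAI

client = OpenAI()

def summarize(text: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarize in two sentences:\n\n{text}"}],
    )
    return resp.choices[0].message.content

def classify(text: str, labels: list[str]) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Classify the following text as one of {labels}. Reply with the label only.\n\n{text}",
        }],
    )
    return resp.choices[0].message.content
```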
“We’re incredibly excited to partner with Retool to empower more companies to leverage generative AI across their business,” said Brad Lightcap, chief operating officer at OpenAI. “From reducing manual work to sharing more knowledge to adding new customer-facing capabilities, we believe tools like Retool help businesses put AI into production much faster without compromising safety.”