DataStax, the well-funded Apache Cassandra-centric database company, is placing a lot of its current bets on AI and on its technology’s ability to offer highly scalable vector search that delivers real-time context to generative AI models. Today, following a short public preview, the company is launching the vector search capabilities of its hosted Astra DB service into general availability.
Vector databases have emerged as a foundational technology for generative AI. “If you’re a database company and you weren’t making this your top priority, I wouldn’t be able to understand that,” DataStax CPO Ed Anuff told me. “This is the most exciting thing that’s happened to databases in a very long time. It’s just super cool. Databases are pretty cool. They’re foundational and all that, but now being the system that provides memory for artificial intelligence — it completely changes why you get up in the morning.”
DataStax customers can now use Astra DB’s new vector search capabilities on AWS, Microsoft Azure and Google Cloud Platform, where it originally launched. DataStax Enterprise users who run the service in their own data centers will get access to vector search within the next month.
Anuff noted that DataStax saw a lot of uptake during the preview period and that, given the nature of the product, customers who use vector search also tend to be highly active users. Within a few days of launching the public preview, he told me, the company saw just over 1,000 signups, and DataStax CEO Chet Kapoor said the company started 50 new major enterprise POCs last week alone.
“I consider myself to be leaning forward and aggressive with our goals,” Kapoor said. “This blew my mind. So I am very surprised. We are the database-as-a service going into real-time AI — and now we are almost showing up in every Pinecone conversation, every Chroma conversation which is there. It’s happening with investors, as well as customers and with partners.”
Given the hype around generative AI and the importance of vector search for augmenting these models with more recent or personalized data, for example, it’s no surprise that other database services are also trying to capitalize on this momentum. The DataStax team argues that its Apache Cassandra-based core technology, which allows it (and its database index) to reach the massive scale many of these use cases demand, together with its wide range of certifications, gives it a competitive edge. Anuff also stressed that Astra DB now supports the popular LangChain framework for building LLM-based applications.
“The ability to trust the output of generative AI models will be critical to adoption by enterprises,” explained Matt Aslett, VP and research director at Ventana Research. “The addition of vector embeddings and vector search to existing data platforms enables organizations to augment generic models with enterprise information and data, reducing concerns about accuracy and trust.”