Elastic Introduces Low-Code Interface for Building RAG Apps

While prototyping conversational search, the ability to experiment with and rapidly iterate on key components of a RAG workflow is essential to getting accurate, hallucination-free responses from LLMs.

SMEStreet Edit Desk
Matt Riley, global vice president and general manager, Search at Elastic

Elastic, the Search AI Company, announced Playground, a low-code interface that enables developers to build RAG applications on Elasticsearch in minutes. Playground’s intuitive interface allows users to A/B test different large language models (LLMs) and refine retrieval mechanisms so that answers are grounded in the proprietary data stored in Elasticsearch indices.
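For context, the workflow Playground streamlines is the familiar retrieve-then-generate loop. A rough, hand-written sketch of that loop using the Elasticsearch and OpenAI Python clients is shown below; the index name, field names, and model are illustrative assumptions for the sketch, not part of Playground itself.

```python
# Minimal RAG sketch: retrieve context from Elasticsearch, then ask an LLM
# to answer using only that context. Index name, field names, and model are
# illustrative placeholders.
from elasticsearch import Elasticsearch
from openai import OpenAI

es = Elasticsearch("http://localhost:9200")  # assumed local cluster
llm = OpenAI()                               # reads OPENAI_API_KEY from the environment

question = "What does our returns policy say about opened items?"

# 1. Retrieval: a simple lexical match query against a hypothetical
#    "company-docs" index; Playground lets you iterate on the retrieval
#    strategy (including hybrid search) without writing this by hand.
hits = es.search(
    index="company-docs",
    query={"match": {"body": question}},
    size=3,
)["hits"]["hits"]
context = "\n\n".join(hit["_source"]["body"] for hit in hits)

# 2. Generation: ground the answer in the retrieved passages.
response = llm.chat.completions.create(
    model="gpt-4o-mini",  # any supported chat completion model
    messages=[
        {"role": "system",
         "content": "Answer using only the provided context. "
                    "If the context is insufficient, say so."},
        {"role": "user",
         "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)
```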

“While prototyping conversational search, the ability to experiment with and rapidly iterate on key components of a RAG workflow is essential to get accurate and hallucination-free responses from LLMs,” said Matt Riley, global vice president and general manager, Search at Elastic. “Developers use the Elastic Search AI platform, which includes the Elasticsearch vector database, for comprehensive hybrid search capabilities and to tap into innovation from a growing list of LLM providers. Now, the playground experience brings these capabilities together via an intuitive user interface, removing the complexity from building and iterating on generative AI experiences, ultimately accelerating time to market for our customers.”

Playground can leverage transformer models directly in Elasticsearch and is augmented by the Elasticsearch Open Inference API, which integrates models from a growing list of inference providers, including Cohere and Azure AI Studio.
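As an illustration of the kind of integration the Open Inference API enables, an inference endpoint backed by a third-party provider such as Cohere can be registered with a single REST call. The endpoint name, model id, and credentials below are placeholders for the sketch, and parameter names may vary across Elasticsearch versions, so check the inference API documentation for your release.

```python
# Sketch: registering a Cohere-backed embedding endpoint through the
# Elasticsearch Open Inference API. Endpoint name, model id, and credentials
# are placeholders.
import requests

ES_URL = "http://localhost:9200"   # assumed cluster address
AUTH = ("elastic", "changeme")     # assumed basic-auth credentials

resp = requests.put(
    f"{ES_URL}/_inference/text_embedding/my-cohere-embeddings",
    auth=AUTH,
    json={
        "service": "cohere",
        "service_settings": {
            "api_key": "<COHERE_API_KEY>",
            "model_id": "embed-english-v3.0",
        },
    },
)
resp.raise_for_status()
print(resp.json())
```

Once registered, an endpoint like this can be referenced from index mappings or ingest pipelines so that documents and queries are embedded through the same provider.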

Playground currently supports chat completion models from OpenAI and Azure OpenAI Service. To get started, read the Elastic blog.
