Dust is a platform for building workflows on top of Large Language Models (LLMs) and semantic search. With the ability to chain LLM apps, iterate on multiple inputs at once, and switch seamlessly between models, Dust provides a versatile way to design against models served by OpenAI, Cohere, AI21, and more. The platform also offers easy deployment and use, version history, caching, and fully managed semantic search engines. Community example apps make it easy to get started.
- Make and Deploy Large Language Model (LLM) Apps: Dust lets users build and deploy LLM apps that power workflows and semantic search.
- Chained LLM Apps: Users can chain together calls to models, code execution, and queries to external services within their LLM app.
- Multiple Inputs: Users can iterate on their LLM app design across several inputs simultaneously, to avoid overfitting to a single example.
- Model Choice: Dust offers models served by OpenAI, Cohere, AI21, and more for users to design against, with seamless switching between them.
- Version History: Iterations, model outputs, and few-shot examples are saved automatically, giving users easy access to their app's version history.
- Caching: Cached model interactions speed up iteration and reduce costs.
- Easy Deployment & Use: LLM apps can be easily deployed to an API endpoint or used directly from Dust.
- DataSources: Dust offers fully-managed semantic search engines that can be queried from workflows.
- Community Example Apps: Users can explore apps built by the community as examples to get started with Dust.
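Once an app is deployed to an API endpoint, running it is an authenticated HTTP call. The sketch below shows what such a call could look like; the endpoint path, header names, and payload fields (`inputs`, `blocking`) are illustrative assumptions, not Dust's documented contract, so check the platform's API reference for the exact shapes.

```python
import json
import urllib.request

# Illustrative base URL -- an assumption for this sketch.
API_BASE = "https://dust.tt/api/v1"


def build_run_request(workspace_id: str, app_id: str, api_key: str,
                      inputs: list) -> urllib.request.Request:
    """Assemble an HTTP request that runs a deployed app on `inputs`.

    The URL layout and payload keys are hypothetical placeholders.
    """
    url = f"{API_BASE}/w/{workspace_id}/apps/{app_id}/runs"
    body = json.dumps({
        "inputs": inputs,      # one dict per input the app iterates over
        "blocking": True,      # assumed flag: wait for the run to finish
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Prepare a run over two inputs (urllib.request.urlopen(req) would send it).
req = build_run_request(
    "my-workspace", "my-app", "my-api-key",
    [{"question": "What is Dust?"},
     {"question": "What is semantic search?"}],
)
```

Building the request separately from sending it keeps the sketch testable offline and mirrors how a workflow might batch multiple inputs into a single run.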