⭐️ Support Rig and give it a star on GitHub!

Rig - Build LLM Applications in Rust

Build modular and scalable LLM Applications in Rust

cargo add rig-core

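Getting started is the one-liner above plus a short main.rs. The sketch below follows the spirit of Rig's hello-world example; it assumes the OpenAI provider, an OPENAI_API_KEY in the environment, and tokio (with the macros and rt-multi-thread features) as the async runtime, and the model name is only illustrative.

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create an OpenAI client from the OPENAI_API_KEY environment variable.
    let openai_client = openai::Client::from_env();

    // Build an agent backed by GPT-4.
    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response.
    let response = gpt4
        .prompt("Explain Rig in one sentence.")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```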

Core Features of Rig

Unified LLM Interface

Consistent API across different LLM providers, simplifying integration and reducing vendor lock-in.
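
In practice this means application code can stay provider-agnostic. A hypothetical helper, written against Rig's Prompt trait and PromptError type, accepts an agent or model from any supported provider:

```rust
use rig::completion::{Prompt, PromptError};

// Works with any Rig agent or model, regardless of provider,
// because they all expose the same `Prompt` trait.
async fn summarize(model: &impl Prompt, text: &str) -> Result<String, PromptError> {
    let prompt = format!("Summarize in one sentence: {text}");
    model.prompt(prompt.as_str()).await
}
```

The same call site works whether the value passed in was built from the OpenAI client shown above or from another provider Rig supports.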

Rust-Powered Performance

Leverage Rust's zero-cost abstractions and memory safety for high-performance LLM operations.

Advanced AI Workflow Abstractions

Implement complex AI systems like RAG and multi-agent setups with pre-built, modular components.
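
As one small illustration of composing agents (the prompts and model name are made up for the example, and anyhow is used only for brevity), two single-purpose agents can be chained into a pipeline; RAG works in a similar spirit by attaching a vector store index to an agent, as in the vector store sketch further down.

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    let client = openai::Client::from_env();

    // Two single-purpose agents composed into a simple pipeline.
    let translator = client
        .agent("gpt-4")
        .preamble("Translate the user's text into English. Reply with the translation only.")
        .build();
    let summarizer = client
        .agent("gpt-4")
        .preamble("Summarize the text in one sentence.")
        .build();

    let input = "Rig est une bibliothèque Rust pour construire des applications LLM.";
    let english = translator.prompt(input).await?;
    let summary = summarizer.prompt(english.as_str()).await?;

    println!("{summary}");
    Ok(())
}
```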

Type-Safe LLM Interactions

Utilize Rust's strong type system to catch errors in LLM interactions at compile time rather than at runtime.
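
One concrete payoff is structured extraction. The sketch below assumes the provider client exposes Rig's Extractor builder and that serde and schemars are added alongside rig-core (exact method names can vary between rig-core versions): data is pulled from free text directly into a Rust struct, so the shape of the result is checked by the compiler rather than by ad-hoc string parsing.

```rust
use rig::providers::openai;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};

// The target type doubles as the schema sent to the model.
#[derive(Debug, Deserialize, Serialize, JsonSchema)]
struct Person {
    name: String,
    occupation: String,
}

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    let client = openai::Client::from_env();

    // Build an extractor that returns a `Person` instead of raw text.
    let extractor = client.extractor::<Person>("gpt-4").build();

    let person = extractor
        .extract("Hi, I'm Ada Lovelace and I work as a mathematician.")
        .await?;

    println!("{person:?}");
    Ok(())
}
```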

Seamless Vector Store Integration

Built-in support for vector stores, enabling efficient similarity search and retrieval for AI applications.
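
A RAG-flavoured sketch of that integration, written against the in-memory store bundled with rig-core; the embedding model name and the exact builder and store methods follow older Rig examples and may differ in the version you install.

```rust
use rig::{
    completion::Prompt,
    embeddings::EmbeddingsBuilder,
    providers::openai,
    vector_store::{in_memory_store::InMemoryVectorStore, VectorStore},
};

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    let client = openai::Client::from_env();
    let embedding_model = client.embedding_model("text-embedding-ada-002");

    // Embed a few documents and load them into an in-memory vector store.
    let embeddings = EmbeddingsBuilder::new(embedding_model.clone())
        .simple_document("doc0", "Rig is a Rust library for building LLM applications.")
        .simple_document("doc1", "Rig ships pre-built abstractions for RAG and multi-agent systems.")
        .build()
        .await?;

    let mut store = InMemoryVectorStore::default();
    store.add_documents(embeddings).await?;

    // Turn the store into an index and attach it to an agent as dynamic context:
    // the top matching document is retrieved and injected into each prompt.
    let index = store.index(embedding_model);
    let rag_agent = client
        .agent("gpt-4")
        .preamble("Answer using the provided context documents.")
        .dynamic_context(1, index)
        .build();

    let answer = rag_agent.prompt("What is Rig?").await?;
    println!("{answer}");
    Ok(())
}
```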

Flexible Embedding Support

Easy-to-use APIs for working with embeddings, crucial for semantic search and content-based recommendations.
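
When you only need the raw vectors, for example to rank documents by similarity yourself, a sketch in the same older-example style works; the field names on the returned embeddings are assumptions that may differ by rig-core version.

```rust
use rig::{embeddings::EmbeddingsBuilder, providers::openai};

// Plain cosine similarity over two embedding vectors.
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm = |v: &[f64]| v.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (norm(a) * norm(b))
}

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    let client = openai::Client::from_env();
    let model = client.embedding_model("text-embedding-ada-002");

    // Embed two short documents in a single batched request.
    let docs = EmbeddingsBuilder::new(model)
        .simple_document("a", "How do I parse JSON in Rust?")
        .simple_document("b", "Recipe for sourdough bread")
        .build()
        .await?;

    // Compare the first embedding of each document.
    let va = &docs[0].embeddings[0].vec;
    let vb = &docs[1].embeddings[0].vec;
    println!("similarity(a, b) = {:.3}", cosine_similarity(va, vb));
    Ok(())
}
```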

Why Developers Choose Rig for AI Development

Efficient Development

  • Type-safe API reduces runtime errors
  • Async-first design for optimal resource utilization
  • Seamless integration with Rust's ecosystem (Tokio, Serde, etc.)

Production-Ready Architecture

  • Modular design for easy customization and extension
  • Comprehensive error handling with custom error types
  • Built-in support for tracing and logging (see the sketch after this list)
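
A brief sketch of what the last two points can look like in practice, assuming the tracing and tracing-subscriber crates are added alongside rig-core; Rig's errors are ordinary Rust error types, so they compose with whatever error-handling strategy you already use.

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Install a subscriber so Rig's internal tracing spans and events are printed.
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::INFO)
        .init();

    let client = openai::Client::from_env();
    let agent = client.agent("gpt-4").build();

    // Errors are concrete types, not strings, so they can be matched and handled.
    match agent.prompt("ping").await {
        Ok(reply) => println!("{reply}"),
        Err(err) => eprintln!("prompt failed: {err}"),
    }
}
```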

Connect with the Rig Community