# rig

**Repository Path**: pojoin/rig

## Basic Information

- **Project Name**: rig
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: bug(lancedb)/rag-embedding-filtering
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-02-06
- **Last Updated**: 2025-02-14

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README
✨ If you would like to help spread the word about Rig, please consider starring the repo!

> [!WARNING]
> Here be dragons! Rig is **alpha** software and **will** contain breaking changes as it evolves. We'll annotate them and highlight migration paths as we encounter them.

## What is Rig?

Rig is a Rust library for building scalable, modular, and ergonomic **LLM-powered** applications.

More information about this crate can be found in the [crate documentation](https://docs.rs/rig-core/latest/rig/).

Help us improve Rig by contributing to our [Feedback form](https://bit.ly/Rig-Feeback-Form).

## Table of contents

- [What is Rig?](#what-is-rig)
- [Table of contents](#table-of-contents)
- [High-level features](#high-level-features)
- [Get Started](#get-started)
- [Simple example](#simple-example)
- [Integrations](#integrations)

## High-level features

- Full support for LLM completion and embedding workflows
- Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, in-memory)
- Integrate LLMs in your app with minimal boilerplate

## Get Started

```bash
cargo add rig-core
```

### Simple example

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create the OpenAI client and an agent backed by GPT-4.
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();
    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```

Note that using `#[tokio::main]` requires you to enable tokio's `macros` and `rt-multi-thread` features, or simply `full` to enable all of them (`cargo add tokio --features macros,rt-multi-thread`).

You can find more examples in each crate's `examples` directory (e.g. [`src/examples`](./src/examples)).
More detailed use-case walkthroughs are regularly published on our [Dev.to Blog](https://dev.to/0thtachi).

## Integrations

Rig supports a range of model providers and vector stores.
Vector stores are available as separate companion crates:
- MongoDB vector store: [`rig-mongodb`](https://github.com/0xPlaygrounds/rig/tree/main/rig-mongodb)
- LanceDB vector store: [`rig-lancedb`](https://github.com/0xPlaygrounds/rig/tree/main/rig-lancedb)
- Neo4j vector store: [`rig-neo4j`](https://github.com/0xPlaygrounds/rig/tree/main/rig-neo4j)
- Qdrant vector store: [`rig-qdrant`](https://github.com/0xPlaygrounds/rig/tree/main/rig-qdrant)
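Each companion crate is a regular dependency added alongside `rig-core`. As a minimal sketch (using `rig-mongodb` as one example from the list above; substitute whichever store you need):

```shell
# Add the core crate plus a vector store companion crate
cargo add rig-core rig-mongodb
```

The same pattern applies to `rig-lancedb`, `rig-neo4j`, and `rig-qdrant`.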