Libby the librarian: an AI agent specialized in creating and querying embeddings for RAG (Retrieval-Augmented Generation).
You can install Libby D. Bot using pip:
pip install -U libby
Libby provides several commands through its CLI:
Create embeddings from your documents in a specified directory:
libby embed --corpus_path /path/to/your/documents --collection_name your_collection
The corpus_path argument defaults to the current directory if not specified. The collection_name parameter lets you organize your embeddings into separate collections (defaults to 'main').
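Relying on those defaults, the two invocations below should be equivalent when run from your documents directory (paths here are illustrative):

```shell
# Run from the directory that contains your documents.
cd /path/to/your/documents

# Explicit form:
libby embed --corpus_path . --collection_name main

# Equivalent form relying on the defaults
# (current directory, 'main' collection):
libby embed
```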
After creating embeddings, you can ask questions about your documents:
libby answer "What is the main topic of the documents?" --collection_name your_collection
You can use Libby to generate content based on prompts:
# Generate using direct prompt
libby generate "Write a summary of..." --output_file output.txt
# Generate using prompt from file
libby generate "" --prompt_file input_prompt.txt --output_file output.txt
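Putting the commands together, a typical session might look like the sketch below; the directory, collection name, and prompts are illustrative, but every flag is one documented above:

```shell
# 1. Embed the documents in ./docs into a collection named 'reports'
libby embed --corpus_path ./docs --collection_name reports

# 2. Ask a question against that collection
libby answer "What were the key findings?" --collection_name reports

# 3. Generate new content, saving the result to a file
libby generate "Summarize the key findings in one paragraph." --output_file summary.txt
```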
- Multiple language support (English and Portuguese)
- Various AI models available (Llama3, Gemma, ChatGPT)
- PDF document processing and embedding
- Question answering with context from your documents
- Content generation capabilities
Libby supports different AI models and languages. You can configure these through environment variables or the config.yml file.
Available Models:
- Llama3 (default)
- Gemma
- Llama3-vision
- ChatGPT
Supported Languages:
- English (en_US)
- Portuguese (pt_BR)
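A config.yml covering these settings might look like the sketch below. The key names are assumptions (check the project's sample config for the actual schema); only the model and language values come from the lists above:

```yaml
# Hypothetical config.yml sketch -- key names are assumptions,
# values are drawn from the supported models and languages above.
model: llama3        # one of: llama3, gemma, llama3-vision, chatgpt
language: en_US      # one of: en_US, pt_BR
```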
This project is licensed under the GPLv3.