# Document Q&A RAG App - Setup Guide

This guide walks through setting up and running the Document Q&A Retrieval-Augmented Generation (RAG) Streamlit app, which uses Ollama to serve a local LLM - a typical introductory RAG project.
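The retrieval step at the heart of a RAG app can be sketched in a few lines of plain Python: embed the question, score it against pre-computed document embeddings by cosine similarity, and hand the best-matching chunks to the LLM as context. The three-dimensional vectors and chunk texts below are made-up stand-ins for what the embedding model and vector store would actually provide:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, chunks, top_k=2):
    """Return the top_k (score, text) pairs most similar to the query."""
    scored = [(cosine_similarity(query_vec, vec), text) for vec, text in chunks]
    return sorted(scored, reverse=True)[:top_k]

# Toy "embeddings" standing in for real embedding-model output.
chunks = [
    ([1.0, 0.0, 0.0], "Ollama serves local LLMs over an HTTP API."),
    ([0.0, 1.0, 0.0], "Streamlit turns Python scripts into web apps."),
    ([0.9, 0.1, 0.0], "Pull models with the ollama pull command."),
]
query = [1.0, 0.05, 0.0]
for score, text in retrieve(query, chunks):
    print(f"{score:.2f}  {text}")
```

In the real app, LangChain and ChromaDB handle the embedding storage and similarity search; the logic they perform is essentially this.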

## Prerequisites

- **Operating System:** macOS or Linux (Windows users need WSL).
- **Python:** Version 3.8 or higher.
- **Streamlit:** Make sure Streamlit is installed (`pip install streamlit`).
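A quick stdlib-only way to confirm the Python prerequisite before installing anything (this checker is an illustration, not part of the app):

```python
import sys

def meets_minimum(version_info=sys.version_info, minimum=(3, 8)):
    """Return True if the interpreter meets the minimum (major, minor)."""
    return tuple(version_info[:2]) >= minimum

# Fail fast with a clear message if the interpreter is too old.
if not meets_minimum():
    sys.exit(f"Python 3.8+ required, found {sys.version.split()[0]}")
print("Python version OK")
```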

## Installation Steps

### Step 1: Install Ollama

1. **macOS:** Install via Homebrew:

   ```shell
   brew install ollama
   ```

2. **Linux:** Download the archive from the Ollama GitHub Releases page, extract it, and move the binary onto your PATH:

   ```shell
   tar -xzf ollama-linux-x86_64.tar.gz
   sudo mv ollama /usr/local/bin/
   ```

3. **Windows (via WSL):** Install WSL and follow the Linux instructions above.
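After Step 1 you can verify that the binary is reachable, since a missed PATH entry is the most common failure here. A small stdlib sketch (`shutil.which` performs the same lookup your shell does):

```python
import shutil

def on_path(binary):
    """Return True if `binary` resolves to an executable on PATH."""
    return shutil.which(binary) is not None

if on_path("ollama"):
    print("ollama found at:", shutil.which("ollama"))
else:
    print("ollama not found - check that /usr/local/bin is on your PATH")
```

Equivalently, running `ollama --version` in a terminal confirms the same thing.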

### Step 2: Download Models

Download the `mistral` LLM and a local embedding model. Note that `text-embedding-ada-002` is OpenAI's hosted embedding model and cannot be pulled through Ollama; a local alternative such as `nomic-embed-text` works instead (if the app's code references the OpenAI model by name, update it to match):

```shell
ollama pull mistral
ollama pull nomic-embed-text
```
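Once pulled, the embedding model is exposed through Ollama's local HTTP API. A minimal sketch of how a request to it is shaped, assuming Ollama's default port 11434 and its `/api/embeddings` endpoint (the request is built but not sent here, so it runs without a live server):

```python
import json
import urllib.request

def embedding_request(text, model="nomic-embed-text",
                      url="http://localhost:11434/api/embeddings"):
    """Build (but do not send) a request for Ollama's embeddings endpoint."""
    payload = json.dumps({"model": model, "prompt": text}).encode()
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

req = embedding_request("What does this document say about setup?")
print(req.get_method(), req.full_url)
```

Sending it with `urllib.request.urlopen(req)` against a running Ollama server returns a JSON body containing the embedding vector; in the app itself, LangChain's Ollama integration wraps this call.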

### Step 3: Install Python Dependencies

Navigate to the app's directory and install the required Python packages:

```shell
pip install -r requirements.txt
```

Ensure that `streamlit`, `langchain`, and `chromadb` are installed.
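A quick way to confirm those packages are importable without starting the app (a stdlib-only sketch; `importlib.util.find_spec` checks importability without actually importing):

```python
import importlib.util

def missing_packages(names):
    """Return the subset of `names` that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

required = ["streamlit", "langchain", "chromadb"]
gaps = missing_packages(required)
if gaps:
    print("Missing:", ", ".join(gaps), "- rerun pip install -r requirements.txt")
else:
    print("All required packages are importable")
```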

## Running the Streamlit App

After setting up Ollama and downloading the required models, make sure the Ollama server is running (`ollama serve`, if it is not already running as a service), then launch the app with Streamlit:

```shell
streamlit run doc-qa.py
```

## Additional Resources