Running cognee with local models¶
🚀 Getting Started with Local Models¶
You'll need to run a local model on your machine or use one of the providers that hosts it.
We had some success with Mixtral, but 7B models did not work well. We recommend Mixtral for now.
Ollama¶
Set up Ollama by following the instructions on the Ollama website.
Set the environment variables in your .env file to use the model.
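A minimal .env sketch for Ollama, assuming the same `LLM_ENDPOINT` and `LLM_MODEL` variable names used in the Anyscale example below also apply here:

```shell
# Hypothetical .env for a locally running Ollama server
LLM_MODEL="mistral:instruct"
LLM_ENDPOINT="http://localhost:11434/v1"
```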
Otherwise, you can set the HOST and model name in the configuration:

```python
cognee.config.llm_endpoint = "http://localhost:11434/v1"
cognee.config.llm_model = "mistral:instruct"
```
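One way to keep the .env values and the in-code configuration consistent is to read the variables with a local fallback. A minimal sketch — `resolve_llm_config` is a hypothetical helper, not part of cognee's API:

```python
import os

def resolve_llm_config():
    # Prefer values from the environment (e.g. loaded from .env),
    # falling back to the local Ollama defaults shown above.
    return {
        "endpoint": os.getenv("LLM_ENDPOINT", "http://localhost:11434/v1"),
        "model": os.getenv("LLM_MODEL", "mistral:instruct"),
    }

cfg = resolve_llm_config()
# These would then be applied as:
# cognee.config.llm_endpoint = cfg["endpoint"]
# cognee.config.llm_model = cfg["model"]
```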
Anyscale¶
Otherwise, you can set the HOST and model name in your .env:

```shell
LLM_MODEL = "mistralai/Mixtral-8x7B-Instruct-v0.1"
LLM_ENDPOINT = "https://api.endpoints.anyscale.com/v1"
LLM_API_KEY = "your_api_key"
```
You can set the HOST and model name in the same way for any other provider that exposes an API endpoint.