# Troubleshooting
## Ollama Not Running

If you see errors about connecting to Ollama:

```bash
# Start Ollama server
ollama serve

# In another terminal, verify it's running
curl http://localhost:11434/api/tags
```
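If you want to perform the same check from a script, a minimal Python sketch using only the standard library works too. The URL below is Ollama's default endpoint; adjust it if you have changed `OLLAMA_HOST`:

```python
import urllib.request
import urllib.error


def ollama_running(url="http://localhost:11434/api/tags", timeout=2.0):
    """Return True if the Ollama HTTP API answers at `url`."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: server not reachable
        return False
```

If this returns `False`, start the server with `ollama serve` and try again.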
## Model Not Found

If harombe can't find your model:

```bash
# List available models
ollama list

# Pull a model (recommended: qwen2.5:7b)
ollama pull qwen2.5:7b

# Update your config
nano ~/.harombe/harombe.yaml  # Change model.name
```
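Ollama's `/api/tags` endpoint returns a JSON object with a `models` list, where each entry carries a `name` such as `qwen2.5:7b`. A small helper (hypothetical, not part of harombe) can check whether the model named in your config is actually available:

```python
def model_available(tags_response: dict, wanted: str) -> bool:
    """Check an Ollama /api/tags response for a model name.

    Matches either the exact name ("qwen2.5:7b") or the bare
    family name ("qwen2.5", any tag).
    """
    for model in tags_response.get("models", []):
        name = model.get("name", "")
        if name == wanted or name.split(":")[0] == wanted:
            return True
    return False
```

If the model is missing, `ollama pull` it first, then point `model.name` at the exact tag you pulled.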
## Installation Issues

```bash
# Ensure Python 3.11+ is installed
python3 --version

# Upgrade pip
pip install --upgrade pip

# Reinstall harombe
pip install --force-reinstall harombe
```
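When several interpreters are installed, `python3 --version` may not report the interpreter that pip actually uses. A short sketch that checks the running interpreter directly removes that ambiguity:

```python
import sys


def check_python(minimum=(3, 11)):
    """Raise a clear error if the running interpreter is too old."""
    if sys.version_info < minimum:
        raise RuntimeError(
            f"harombe needs Python {minimum[0]}.{minimum[1]}+, "
            f"but this interpreter is {sys.version.split()[0]}"
        )
```

Run it with the same interpreter you used for `pip install` to confirm they match.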
## Permission Errors

If you get permission errors during tool execution:

- Check that `confirm_dangerous: true` is set in your config
- Review the operation before approving it
- Consider running in a sandboxed environment
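For reference, the setting lives in `~/.harombe/harombe.yaml`. The surrounding keys may differ between versions (the flag's position in the file is an assumption here), but the flag itself should look like:

```yaml
# ~/.harombe/harombe.yaml
confirm_dangerous: true  # prompt before executing dangerous tool operations
```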
## Getting Help
- Check existing Issues
- Start a Discussion
- Review the Security Policy for security concerns