[Bug]: it's still using OpenAI after setting Ollama in .env file #2599
Labels
- area: backend (Related to backend functionality or under the /backend directory)
- bug (Something isn't working)
What happened?
Modified the .env file to use Ollama:
1.- Used a fake OpenAI API key to skip the OpenAI integration
2.- Set OLLAMA_API_BASE_URL=http://host.docker.internal:11434
3.- Rebuilt and restarted with Docker Compose
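For reference, the steps above correspond to .env entries like the following (a minimal sketch; the OPENAI_API_KEY variable name and the placeholder value are assumptions, adjust to your own .env):

```env
# Placeholder key so chat should never authenticate against OpenAI
# (variable name assumed for illustration)
OPENAI_API_KEY=sk-fake-key

# Point the backend at the Ollama server running on the Docker host
OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```

After editing, the stack was rebuilt so the containers pick up the new environment.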
Checks:
1.- Ollama server is running (verified from the command line)
2.- Quivr: embeddings for uploaded docs are generated with Ollama
3.- Quivr: chat still tries OpenAI, which of course fails in the frontend because of the fake API key
4.- The error log (attached) confirms this
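The expected behavior can be sketched as a small provider-selection routine (hypothetical code, not Quivr's actual implementation; function and key names are illustrative): when OLLAMA_API_BASE_URL is set, chat should route to Ollama just like embeddings do, with OpenAI only as the fallback.

```python
def pick_chat_backend(env: dict) -> str:
    """Return which provider chat requests should go to.

    Hypothetical sketch of the behavior the report expects:
    if OLLAMA_API_BASE_URL is configured, both embeddings and
    chat should use Ollama; OpenAI is only the fallback.
    """
    if env.get("OLLAMA_API_BASE_URL"):
        return "ollama"
    return "openai"


# With the .env from this report, chat should pick Ollama:
print(pick_chat_backend(
    {"OLLAMA_API_BASE_URL": "http://host.docker.internal:11434"}
))  # → ollama

# The reported behavior instead matches the no-Ollama fallback path:
print(pick_chat_backend({}))  # → openai
```

The bug, as observed, is that the embeddings path honors this selection while the chat path does not.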
Relevant log output
NOTE:
This looks like a bug and, worse, a security fault: with a valid API key, chunks of private documents would be uploaded to OpenAI as part of the normal RAG flow.