AI Chat is coming to YaCy!
https://github.com/yacy/yacy_search_server/commit/13fbff0bffc78aa4676e92eb0ff95e5a9e98872f
Here, YaCy acts as a RAG (retrieval-augmented generation) reverse proxy: it creates context for LLMs based on searches in the YaCy index.
So far this commit works with an external chat client; any OpenAI-compatible chat client should work, as the YaCy RAG reverse proxy implements the OpenAI API.
An integrated chat will probably be added soon.
Ollama was chosen as the back-end LLM server. We will make this configurable in the future; right now it is a hard-coded setting.
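Since the proxy implements the OpenAI API, a client only needs to send a standard chat-completions request body to YaCy instead of to api.openai.com. Here is a minimal sketch of such a request body; the endpoint path, port, and model name are assumptions for illustration and are not confirmed by the commit.

```python
import json

# Assumed endpoint on a local YaCy instance (YaCy's default web port is 8090);
# the exact path is hypothetical, not taken from the commit.
YACY_CHAT_URL = "http://localhost:8090/v1/chat/completions"

# Standard OpenAI chat-completions payload. The model name "llama3" is a
# hypothetical Ollama model; the hard-coded back-end may differ.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "What does the YaCy index say about RAG?"}
    ],
}

# Any OpenAI-compatible client serializes and POSTs this JSON body.
body = json.dumps(payload)
print(body)
```

An off-the-shelf OpenAI client library should work the same way by pointing its base URL at the YaCy instance instead of the default OpenAI endpoint.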
@orbiterlab That's an interesting move.