Having the option to use local models instead of OpenAI would be great. In some environments I'm required to keep data on-premises for security reasons, so remote APIs aren't an option. Ollama support would make it easy to switch between the models I already have installed locally.
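For what it's worth, Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`, so the change might be as small as making the base URL and model name configurable. A rough sketch of what I mean (the function and setting names are hypothetical, not from this project):

```python
import os

def resolve_backend(provider: str) -> dict:
    """Return connection settings for the chosen provider.

    Hypothetical example: Ollama serves an OpenAI-compatible endpoint
    under /v1, so existing OpenAI client code can usually be pointed
    at it by swapping the base URL, key, and model name.
    """
    if provider == "ollama":
        return {
            "base_url": os.environ.get("OLLAMA_HOST", "http://localhost:11434") + "/v1",
            "api_key": "ollama",   # Ollama ignores the key, but OpenAI clients require one
            "model": "llama3",     # any model already pulled locally
        }
    return {
        "base_url": "https://api.openai.com/v1",
        "api_key": os.environ.get("OPENAI_API_KEY", ""),
        "model": "gpt-4o-mini",
    }

print(resolve_backend("ollama")["base_url"])
```

With something like this, users could keep the OpenAI default and opt into local models with a single setting.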