Yes. If the AI is not integrated with the IDE, it's not as helpful.
If there were an IDE plugin that let you use a local model, that might be an option, but I haven't seen one. (GitHub Copilot lets you select different models, but I haven't checked whether that includes local ones; does anyone know?)
> (GitHub Copilot lets you select different models, but I haven't checked whether that includes local ones; does anyone know?)
To my knowledge, it doesn't.
On Emacs there's gptel, which integrates different LLMs into Emacs quite nicely, including a local Ollama instance.
> gptel is a simple Large Language Model chat client for Emacs, with support for multiple models and backends. It works in the spirit of Emacs, available at any time and uniformly in any buffer.
I also use Ollama for coding. I have a 32 GB M2 Mac, and the models I can run on it are very useful for coding and debugging, as well as data munging. That said, sometimes I also use Claude 3.5 Sonnet and o1. (BTW, I just published an Ollama book yesterday, so I am a little biased towards local models.)
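If you want to script against a local model directly, outside of any editor integration, Ollama's HTTP API is simple enough. Here's a minimal sketch, assuming the server is running on its default port (11434); the model name is just a placeholder for whatever you've pulled locally:

```python
# Minimal sketch: ask a local Ollama model a coding question.
# Assumes `ollama serve` is running on localhost:11434 and that
# "qwen2.5-coder" (an assumption) has been pulled with `ollama pull`.
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "qwen2.5-coder") -> str:
    """Send one prompt to the local /api/generate endpoint and return the reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_ollama("Write a Python one-liner that reverses a string."))
```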
You can also pick the right model for each need, and it's free.
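For example, you can route different tasks to different local models. A rough sketch of that idea; the task-to-model mapping and model names below are assumptions, not recommendations, so check what you actually have installed:

```python
# Sketch of "right model for the right need" against a local Ollama server.
# The model names and task mapping are hypothetical; /api/tags reports
# what is actually pulled, and we fall back to that.
import json
import urllib.request

# Hypothetical mapping: a small, fast model for quick completions,
# a larger one for code review and explanation.
TASK_MODELS = {
    "complete": "qwen2.5-coder:7b",
    "review": "llama3.1:70b",
}

def installed_models() -> list[str]:
    """Return the names of models the local Ollama server has pulled."""
    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        return [m["name"] for m in json.loads(resp.read())["models"]]

def pick_model(task: str) -> str:
    """Prefer the mapped model for this task; fall back to any installed one."""
    have = installed_models()
    want = TASK_MODELS.get(task)
    return want if want in have else have[0]

print(pick_model("review"))
```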