Coding seems to be becoming more and more of an AI thing. The European options we have:
- Mistral is special because it is the only EU company that directly competes with OpenAI, Anthropic, Google, etc.
- Open-weight hosting is the alternative, where (mostly Chinese) models are hosted on EU hardware
I have not checked benchmarks yet, but the sentiment seems to be that some Chinese models (e.g. Kimi K2.5/2.6) are relatively close to the top models like Opus or Codex.
Did I miss any provider?
There is https://www.infomaniak.com/en/euria (Switzerland)
And https://mammouth.ai/ (France), though they’re more a “middleman” for various providers (including providers serving open-weights models)
And of course you can still run models locally with LLM hosts like https://github.com/ggml-org/llama.cpp (there are hundreds of derivatives, but llama.cpp is the OG/underlying library for most of them). A decent gaming PC can now run local LLMs on par with SOTA proprietary models from 6-12 months ago (qwen3.6 is a beast). https://old.reddit.com/r/LocalLLaMA/ is a decent subreddit for news and discussions about this; I didn't find a real equivalent on Lemmy.
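For anyone curious what running locally actually looks like, here's a minimal sketch with llama.cpp (assumes you've built or installed llama.cpp and already downloaded a GGUF model file; the model path below is just a placeholder, not a real file):

```shell
# Spin up a local, OpenAI-compatible HTTP server on port 8080.
# -ngl offloads model layers to the GPU (use 0 for CPU-only).
llama-server -m ./models/your-model.gguf --port 8080 -ngl 99

# Or run a one-shot prompt straight from the terminal:
llama-cli -m ./models/your-model.gguf -p "Explain tail recursion briefly"
```

Once `llama-server` is up, any tool that speaks the OpenAI API (editors, coding assistants, etc.) can point at `http://localhost:8080` instead of a cloud provider.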
There is !localllama@sh.itjust.works
Of course, it is not as busy as the subreddit.
Lumo, from Proton (Switzerland, I believe)
We have the Swiss AI initiative as well. It's in its early stages and doesn't match the performance of… well, anything. But they list some more providers: https://apertvs.ai/pages/get-started/


