This is super interesting! I'm the founder of Muna (https://docs.muna.ai) with much of the same underlying philosophy, but a different approach:
We're building a general-purpose compiler for Python. Once compiled, developers can deploy across Android, iOS, Linux, macOS, Web (wasm), and Windows in as little as two lines of code.
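To give a rough feel for what that means in practice, here's an illustrative sketch; the client class, method names, and function tag below are placeholders rather than the exact API from our docs:

    # Illustrative sketch only: the client class, method names, and function
    # tag are placeholders, not the exact Muna API.
    from muna import Muna  # hypothetical Python client

    muna = Muna()
    # Invoke a compiled Python function by tag; the same call is meant to
    # work across the platforms listed above.
    prediction = muna.predictions.create(
        tag="@example/whisper",        # placeholder function tag
        inputs={"audio": "clip.wav"},  # placeholder input
    )
    print(prediction.results)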
Oh! Muna looks cool as well! I've just barely glanced at your docs page so far, but I'm definitely going to explore further. One of the biggest issues in the back of our minds is getting models running on a variety of hardware and platforms. Right now we're just using Ollama, with Lemonade support coming soon, but both of these will likely require some manual setup before deploying LlamaFarm (a rough sketch of what that setup looks like is below).
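For context, the kind of manual setup I mean today looks roughly like this with Ollama's Python client (the model name is just an example, and it assumes the Ollama daemon is already installed and running locally):

    # Rough example of the manual Ollama setup mentioned above.
    # Assumes the Ollama daemon is installed and running; the model name
    # is just an example.
    import ollama

    ollama.pull("llama3.2")  # download the model weights locally

    response = ollama.chat(
        model="llama3.2",
        messages=[{"role": "user", "content": "Say hello from LlamaFarm."}],
    )
    print(response["message"]["content"])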
We should collab! We prefer to be the underlying infrastructure behind the scenes, and have a pretty holistic approach towards hardware coverage and performance optimization.
Congrats on the launch!