News in Short
- Osaurus is an open-source AI server built exclusively for Mac users.
- It lets people switch between local AI models and cloud AI providers from one interface.
- The platform keeps files, tools, and AI memory on user hardware instead of moving everything to external servers.
- Osaurus now supports voice features and more than 20 native plugins.
AI is moving beyond the chatbot phase. New startups are now racing to build the software layer sitting above large language models. Osaurus is entering that race with a different approach. Instead of creating another AI assistant, it wants to turn your Mac into a personal AI hub. Within one system, users can run local AI models or connect cloud services while keeping more control over data and workflows.
The project, built specifically for Apple devices, is trying to answer a growing question in AI: what happens when users want flexibility instead of depending on one model or one provider?
What is Osaurus and why is it getting attention?
The idea behind Osaurus began with another product called Dinoki, an AI desktop companion. According to co-founder Terence Pae, users questioned why they needed to buy an app and still pay ongoing AI token costs. That feedback pushed the team toward local AI experiences.
Pae previously worked as a software engineer at Tesla and Netflix. He started exploring whether an AI assistant could run directly on a Mac. The thinking was straightforward. Most personal files, settings, and tools already live on a device. If AI also stayed there, users could gain deeper access and greater control.
Soon after, Pae began developing Osaurus publicly as an open-source project while continuously adding features and fixing issues.
How does Osaurus bring local and cloud AI models together?
The core idea behind Osaurus is flexibility. Users can connect local AI models running on their own machines or switch to cloud providers depending on the task. More importantly, files, memory systems, and tools can stay on personal hardware.
This creates a different workflow compared to many AI assistants available today.
Different models excel at different jobs. One may perform better at coding. Another may be stronger at reasoning. Some may handle research or creative tasks more effectively. Osaurus allows users to switch between them without rebuilding their entire setup.
Today the platform supports models including MiniMax M2.5, Gemma 4, Qwen3.6, GPT-OSS, Llama, and DeepSeek V4. It can also connect with cloud services from OpenAI, Anthropic, Gemini, Grok, OpenRouter, and others.
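The practical payoff of that single interface is easiest to see in code. The sketch below is illustrative rather than taken from the project: it assumes a local server that speaks the widely used OpenAI-compatible chat protocol, and the localhost URL, port, and model names are placeholders, not confirmed Osaurus details.

```python
# Minimal sketch: one client interface, two interchangeable backends.
# The local URL, port, and model names are illustrative assumptions.
from openai import OpenAI

# Backend A: a local server exposing an OpenAI-compatible API on this machine.
local = OpenAI(base_url="http://127.0.0.1:1337/v1", api_key="unused-locally")

# Backend B: a hosted cloud provider, reached through the same client class.
cloud = OpenAI(api_key="sk-...")  # a real key is required for the hosted service

def ask(client: OpenAI, model: str, prompt: str) -> str:
    """Send one chat message and return the reply text."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# Same code path, different backends: swap only the client and model name.
print(ask(local, "llama-3.1-8b-instruct", "Summarize this document."))
print(ask(cloud, "gpt-4o-mini", "Summarize this document."))
```

Because both backends accept the same request shape, moving a task from a local coding model to a cloud reasoning model becomes a one-line change rather than a rebuilt setup.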
Is Osaurus trying to simplify AI for regular users?
Many existing tools for local AI target developers. Some require command-line knowledge or technical setup. Others raise security concerns. Osaurus appears to focus on making that experience simpler.
The platform uses a hardware-isolated virtual sandbox. This limits what AI systems can access and creates separation between the AI environment and the user’s machine. The goal is to reduce security risks while still allowing broad functionality.
That user-friendly approach may become important. Local AI is increasingly capable, but it still feels intimidating to mainstream users.
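The article does not explain how that sandbox works internally, so the snippet below only illustrates the general principle of separation, an allow-listed workspace for AI tool calls. It is a generic sketch, not Osaurus's actual hardware-isolated mechanism, and the paths are hypothetical.

```python
# Generic illustration of the separation principle behind a sandboxed AI
# workspace. NOT Osaurus's actual mechanism (the article says it uses
# hardware-isolated virtualization); this only shows the allow-list idea.
from pathlib import Path

SANDBOX_ROOT = Path("/tmp/ai-workspace").resolve()  # hypothetical workspace

def safe_read(requested: str) -> str:
    """Serve a file only if it resolves inside the sandbox root."""
    target = (SANDBOX_ROOT / requested).resolve()
    if target != SANDBOX_ROOT and SANDBOX_ROOT not in target.parents:
        raise PermissionError(f"{requested} escapes the sandbox")
    return target.read_text()

try:
    safe_read("../../etc/hosts")  # blocked: resolves outside the root
except PermissionError as err:
    print(err)
```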
Can your Mac actually run local AI today?
This remains one of the biggest questions around local AI.
Running AI models directly on a device still requires serious hardware. According to Pae, users need at least 64GB of RAM for local models. Larger models such as DeepSeek V4 may require around 128GB.
That requirement instantly limits access.
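To make that gate concrete, a launcher could compare installed memory against a model's stated requirement before offering it, as in this sketch. The 64GB and 128GB thresholds are the figures Pae cites; the check itself and the model table are hypothetical.

```python
# Hypothetical pre-flight check: flag models the machine cannot hold in RAM.
# Thresholds are the figures Pae cites; the model names are illustrative.
import psutil  # third-party: pip install psutil

MODEL_RAM_GB = {
    "typical-local-model": 64,  # "at least 64GB RAM for local models"
    "deepseek-v4": 128,         # "may require around 128GB"
}

installed_gb = psutil.virtual_memory().total / 1024**3
for model, needed in MODEL_RAM_GB.items():
    status = "runnable" if installed_gb >= needed else "needs more RAM"
    print(f"{model}: needs {needed}GB, have {installed_gb:.0f}GB -> {status}")
```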
Still, the company believes hardware efficiency is improving quickly. Pae pointed to steady gains in "intelligence per watt." A year ago, many local models struggled with basic tasks. Today they can browse, write code, use tools, and perform more advanced functions.
The broader question is whether local AI will eventually run on ordinary laptops and desktops without demanding premium hardware.
Why could this matter beyond consumers?
Osaurus may not stay focused only on individuals.
The startup is already considering enterprise opportunities in industries such as legal services and healthcare, where privacy concerns remain high. Running AI locally could reduce dependence on external infrastructure while keeping sensitive information closer to users.
The company also believes stronger local AI systems could eventually reduce dependence on giant data centers. Instead of constantly scaling cloud infrastructure, organizations might deploy on-premise hardware setups with lower power demands.
That idea still feels early. Yet it reflects a larger shift happening in AI.
As models become easier to access, attention may move toward software layers that organize, secure, and manage them. Osaurus enters that conversation with a simple pitch: your Mac may become more than a device. It could become the place where your entire AI workflow lives. The bigger test now is whether Osaurus can make local AI practical for more people.