Transcription as it should be for the agent era.
Meetings become Markdown files on your Mac. Your AI tools read them directly. No APIs, no rate limits. Private by architecture.
The problem
Granola, Otter, Fireflies. They work great until you try to do something with your data. Want your AI to work with your meetings? You're going through APIs with rate limits, authentication, and someone else deciding how much access you get to your own conversations. Want to feed transcripts to Claude? You're exporting PDFs and copy-pasting like it's 2019.
APIs have rate limits. Your filesystem doesn't. You're paying $20/month for the privilege of restricted access to your own conversations.
The fix
OpenOats is a macOS app that transcribes your meetings and saves them as Markdown files on your machine. Not on someone's server. Not in a proprietary format. Just files in a folder you control.
It looks and feels like the polished apps you're used to. Live transcription, real-time suggestions from your own notes, meeting detection, post-call summaries. The difference is that everything stays local. Your audio never leaves your device. Your transcripts are yours to search, analyze, or hand to any AI tool you want. No rate limits. No permission needed.
What you get
Your transcripts work for you
Every meeting becomes a Markdown file on your Mac. Plain text your AI tools can read directly, with no API standing between you and your own data. Search across months of conversations. Feed your entire meeting history to Claude, build automations, pipe transcripts into any workflow. Your filesystem is the API.
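Concretely, "your filesystem is the API" means any standard tool can work on your meeting history. A minimal sketch using a made-up folder and transcript (OpenOats' actual output location and file layout may differ):

```shell
#!/bin/sh
# Illustrative only: create a fake transcript the way OpenOats would save one.
mkdir -p /tmp/meetings-demo
printf '# Standup 2025-01-10\nAction: ship the beta\n' > /tmp/meetings-demo/standup.md

# "Search across months of conversations" is just grep -- no tokens, no auth:
grep -l "ship the beta" /tmp/meetings-demo/*.md
```

Swap `grep` for `cat ... | <your LLM CLI>` and the same files become input for summaries or automations.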
A real app, not a side project
OpenOats is a native macOS app with live transcription, meeting detection, and AI-powered suggestions drawn from your own notes. It sits next to your call and stays out of the way. Screen-share safe.
Private by architecture
Transcription runs locally on your machine. Audio never touches the network. This isn't a privacy policy; it's how the app is built. There's no server to send data to.
Use any AI you want
OpenRouter for Claude or GPT, Ollama for fully local models, any OpenAI-compatible endpoint. Pick what works for you. Switch anytime.
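In practice, "any OpenAI-compatible endpoint" means pointing a base URL and API key at whichever backend you prefer. A sketch with illustrative values; the variable names follow the common OpenAI-client convention and are not OpenOats settings:

```shell
# Cloud via OpenRouter:
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="sk-or-..."   # placeholder key

# Fully local via Ollama, which serves an OpenAI-compatible API on this port:
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_API_KEY="ollama"      # Ollama ignores the key; clients still require one
```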
Open source, community-driven
MIT licensed. No subscription. No account. Built and maintained by the community. If you choose cloud models, you pay the provider's API costs directly, typically a few dollars a month. Or run everything locally and pay nothing.
Ships today
This isn't a waitlist. The app works today. Over 1,000 stars on GitHub and counting.
- Native macOS app with real-time transcription
- Records both sides of a call (mic + system audio)
- Surfaces relevant context from your notes mid-call
- Detects meetings automatically (Zoom, Teams, Slack)
- Generates post-call summaries with any LLM
- Saves Markdown files you can use with any tool
- Apple Silicon, macOS 15+
Common concerns
"How does it compare to Granola?"
Similar polish, fundamentally different architecture. Granola is a cloud product that gives you gated access to your own data through APIs. OpenOats is a local product that optionally uses the cloud. Your transcripts are Markdown files on your machine. Any tool that can read a file can read your meetings. No tokens, no authentication, no rate limits.
"Is local transcription any good?"
It uses NVIDIA's Parakeet TDT, a production-grade speech recognition model. The quality is solid. And the backend is pluggable. When something better ships, you can swap it in without waiting for us to update.
"Do I need to be technical to use it?"
No. Download the app, open it, grant mic access, and hit record. The defaults work out of the box. If you want to tinker later, everything is open and configurable: swap models, change the output format, build automations on top. But you don't have to.