Run Revdoku on your laptop
Revdoku is open source and the full app runs on your machine. One-line install, Docker under the hood, no signup. Use a cloud AI key, or run a local LLM through Ollama so documents and inference both stay on your hardware.
Full privacy with a local LLM
Run Google Gemma 4 through Ollama on your own machine. After the model is downloaded, no documents and no inference leave your laptop. No API keys, no telemetry, no cloud round-trip.
Quick install
Open Terminal on macOS or Linux. On Windows, open WSL. Then run:
curl -fsSL https://raw.githubusercontent.com/revdoku/revdoku/main/install-local.sh | sh
The installer creates ~/.revdoku/, generates the encryption secrets, starts Revdoku in Docker, and opens a one-time local sign-in link in your browser. No email or password is required for the single-user install.
Any time after that, open Revdoku with:
~/.revdoku/revdoku open
Read the full README · Source on GitHub
http://localhost:3217 shows the email/password sign-in page, which the single-user install doesn't use. Always use ~/.revdoku/revdoku open to get a fresh local sign-in link.
What you need
- Docker. Docker Desktop on macOS or Windows. Docker Engine with Docker Compose on Linux.
- A terminal. Built-in on macOS and Linux. WSL on Windows.
- An AI provider. Either a cloud API key (OpenAI, Google Vertex, OpenRouter) or a local LLM through Ollama. AI reviews require one of the two.
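You can confirm the prerequisites from the terminal before installing. A quick preflight sketch (it only checks that the tools are on your PATH, nothing Revdoku-specific):

```shell
# Confirm the tools from the list above are available.
for tool in docker curl; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "missing: $tool" >&2
  fi
done
# On Linux, the Compose plugin is a separate piece:
docker compose version >/dev/null 2>&1 || echo "missing: docker compose" >&2
```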
Pick your AI provider
Revdoku doesn’t ship its own model. You bring your own. Add provider keys inside the app at Account -> AI -> Providers. Keys are stored encrypted at rest.
Local LLM with Ollama (fully offline)
With Ollama, both documents and inference stay on your laptop. Nothing is uploaded, no API key is required, and after the model is downloaded the workflow runs offline. Useful for sensitive material, evaluations, and air-gapped review.
Follow the steps below to set it up.
1. Install Ollama:
curl -fsSL https://ollama.com/install.sh | sh
Or download it from ollama.com/download on macOS or Windows.
2. Pull the recommended local model (about 9.6 GB):
ollama pull gemma4:e4b
gemma4:e4b is Google Gemma 4 E4B, an edge model with text and image input.
3. Enable Ollama in Revdoku:
~/.revdoku/revdoku enable-ollama
4. In the app, open Account -> AI and pick Local Gemma · Basic for the workflows where you want local inference.
After setup, Revdoku and Ollama work offline as long as both are running. Revdoku reaches Ollama over http://host.docker.internal:11434/v1. If Revdoku can’t reach Ollama, set OLLAMA_HOST=0.0.0.0:11434 and restart Ollama. Keep port 11434 private or firewalled.
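You can verify the connection yourself: Ollama's OpenAI-compatible API exposes a model listing at /v1/models. A quick check from the host, building the URL the same way as the note above (OLLAMA_HOST defaults to localhost:11434 when unset):

```shell
# Probe Ollama's OpenAI-compatible endpoint from the host.
OLLAMA_URL="http://${OLLAMA_HOST:-localhost:11434}/v1"
if curl -fsS "$OLLAMA_URL/models" >/dev/null 2>&1; then
  echo "Ollama reachable at $OLLAMA_URL"
else
  echo "Cannot reach Ollama at $OLLAMA_URL" >&2
fi
```

If this succeeds on the host but Revdoku still can't connect, the issue is usually the Docker-to-host hop, which is what the OLLAMA_HOST=0.0.0.0:11434 workaround addresses.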
Desktop shortcut
Make Revdoku a one-click launch.
macOS
cat > "$HOME/Desktop/Revdoku.command" <<'EOF'
#!/bin/sh
"$HOME/.revdoku/revdoku" open
EOF
chmod +x "$HOME/Desktop/Revdoku.command"
Double-click Revdoku.command on your Desktop. If macOS blocks it the first time, right-click and choose Open.
Linux
mkdir -p "$HOME/.local/share/applications"
cat > "$HOME/.local/share/applications/revdoku.desktop" <<'EOF'
[Desktop Entry]
Name=Revdoku
Comment=Open Revdoku
Exec=sh -lc "$HOME/.revdoku/revdoku open"
Terminal=false
Type=Application
Categories=Office;
EOF
chmod +x "$HOME/.local/share/applications/revdoku.desktop"
Some Linux desktops may ask you to mark the launcher as trusted.
Windows (with WSL)
Create a normal Windows shortcut with this target:
C:\Windows\System32\wsl.exe -e bash -lc "$HOME/.revdoku/revdoku open"
If you use a named WSL distro:
C:\Windows\System32\wsl.exe -d Ubuntu -e bash -lc "$HOME/.revdoku/revdoku open"
Daily commands
~/.revdoku/revdoku open
~/.revdoku/revdoku start
~/.revdoku/revdoku stop
~/.revdoku/revdoku logs
~/.revdoku/revdoku update
~/.revdoku/revdoku backup
~/.revdoku/revdoku enable-ollama
The local app listens on 127.0.0.1 only. Default port is 3217, or the next free one. The exact port is saved in ~/.revdoku/revdoku.env.
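If you need the exact port in a script, you can read it out of the env file. A sketch — the variable name REVDOKU_PORT here is hypothetical, so check ~/.revdoku/revdoku.env for the actual key:

```shell
# Hypothetical key name: assumes revdoku.env contains a line
# like REVDOKU_PORT=3217; falls back to the default port.
port=$(grep -E '^REVDOKU_PORT=' "$HOME/.revdoku/revdoku.env" | cut -d= -f2)
echo "Revdoku is on http://127.0.0.1:${port:-3217}"
```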
Backup
Your local data is encrypted. The key lives in ~/.revdoku/revdoku.env, the data lives in ~/.revdoku/storage/. Keep them together.
~/.revdoku/revdoku backup
Don’t copy storage/ alone. Without revdoku.env the backup can’t be read.
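If you script your own off-machine copy in addition to the built-in backup command, archive the key and the data together, as the warning above says. A minimal sketch (stopping Revdoku first keeps the copy consistent):

```shell
# Archive revdoku.env (the key) and storage/ (the data) into one
# file, so the backup stays readable on restore.
~/.revdoku/revdoku stop
stamp=$(date +%Y%m%d-%H%M%S)
tar -czf "$HOME/revdoku-backup-$stamp.tar.gz" \
  -C "$HOME/.revdoku" revdoku.env storage
~/.revdoku/revdoku start
```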
Windows native (experimental)
WSL is the recommended path on Windows. There’s also an experimental native PowerShell installer:
Invoke-WebRequest -UseBasicParsing https://raw.githubusercontent.com/revdoku/revdoku/main/install-local.ps1 -OutFile "$env:TEMP\revdoku-install-local.ps1"; powershell -ExecutionPolicy Bypass -File "$env:TEMP\revdoku-install-local.ps1"
What’s included
- All checklists, evidence highlights, document compare, and revision re-check from the cloud version
- SQLite storage on your machine
- Two-factor authentication available
- Encrypted documents and database fields at rest
- No telemetry
- AGPL-3.0 source
Need a team server or HIPAA?
- Team server. For a shared instance on your own infrastructure, see self-host on a server.
- HIPAA / BAA. Use the hosted cloud version today. The local install is single-user and not HIPAA-managed.
- Managed enterprise install. Talk to sales for vendor-supported on-premise.