
Installation & Setup

Get BioRouter running in three steps: download, install, and connect an AI provider. The whole process takes about five minutes.

Step 1: Download BioRouter

Download the installer for your platform from the Download page or directly from GitHub Releases.

  • macOS: Open the .dmg and drag BioRouter to /Applications
  • Windows: Unzip and run BioRouter.exe
  • Linux (Debian): sudo dpkg -i biorouter_*.deb
  • Linux (RPM): sudo rpm -i BioRouter-*.rpm
macOS tip: If you see a security warning on Apple Silicon, run chmod 755 ~/.config in Terminal and relaunch.

Step 2: Connect an LLM Provider

BioRouter will guide you on first launch. Choose the option that fits your needs:

  • UCSF users: Select Azure OpenAI (UCSF ChatGPT) with endpoint https://unified-api.ucsf.edu/general, deployment gpt-5-2025-08-07, and your Versa API key. Or select Amazon Bedrock (UCSF Anthropic). Full details in the UCSF Setup Guide.
  • Commercial API: Enter your Anthropic, OpenAI, or Google API key.
  • Fully local: Install Ollama, pull a model (ollama pull qwen3), and select Ollama in BioRouter. No API key, and no data leaves your machine.

Step 3: Verify & Explore

Send a test message in the chat. If BioRouter responds, you're all set. Open the Extensions sidebar to add agents, enable tools, and start building workflows.

macOS CLI Installation

curl -fsSL https://github.com/BaranziniLab/BioRouter/releases/download/stable/download_cli.sh | bash

To update the CLI: biorouter update

Configuration File

All settings are stored in:

~/.config/biorouter/config.yaml    # macOS / Linux

This file is shared between the Desktop app and the CLI. API keys are stored encrypted.
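Because this one file drives both the Desktop app and the CLI, it is worth backing it up before hand-editing. A minimal sketch, assuming the default path above:

```shell
# Back up the shared config file before hand-editing it.
# The path is the default noted above; adjust if yours differs.
cfg="$HOME/.config/biorouter/config.yaml"
if [ -f "$cfg" ]; then
  cp "$cfg" "$cfg.bak"
  echo "Backed up to $cfg.bak"
else
  echo "No config file at $cfg yet"
fi
```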

Troubleshooting

| Issue | Solution |
|---|---|
| No window on launch (macOS M-chip) | chmod 755 ~/.config then relaunch |
| Extension fails to activate | Ensure Node.js (npx) or Python/uv (uvx) is installed |
| API key not working | Verify the key is valid and has available quota |
| Logs | ~/.config/biorouter/logs/ |
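When diagnosing a problem, the log directory from the table above is the first place to look. A small sketch for listing the most recent log files:

```shell
# Inspect recent BioRouter logs; the path below is the default
# log directory from the troubleshooting table.
log_dir="$HOME/.config/biorouter/logs"
if [ -d "$log_dir" ]; then
  # Show the five most recently modified entries
  ls -lt "$log_dir" | head -n 5
else
  echo "No logs yet at $log_dir"
fi
```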

UCSF Setup Guide

BioRouter is built for UCSF researchers. Use your institutional AI access for secure, compliant workflows with enterprise-managed models.

Option A: UCSF Versa / Azure OpenAI (Recommended)

UCSF's institutional AI platform, accessed via Azure OpenAI. Get your API key at:

ai.ucsf.edu/platforms-tools-and-resources/ucsf-versa

  1. In BioRouter, go to Settings → Models → Azure OpenAI → Configure
  2. Enter the endpoint and credentials below, then click Save
  3. Click Launch to start chatting
| Field | Environment Variable | Value |
|---|---|---|
| Endpoint | AZURE_OPENAI_ENDPOINT | https://unified-api.ucsf.edu/general |
| Deployment Name | AZURE_OPENAI_DEPLOYMENT_NAME | gpt-5-2025-08-07 |
| API Version | AZURE_OPENAI_API_VERSION | 2024-10-21 |
| API Key | AZURE_OPENAI_API_KEY | Your UCSF Versa API key |
Where to get your Versa API key: Sign in with your UCSF MyAccess credentials at the link above to generate or retrieve your API key.
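The environment variable names in the table suggest these values can also be supplied via the shell. A sketch with a placeholder key; whether BioRouter picks these up outside the Settings dialog depends on your setup:

```shell
# UCSF Versa / Azure OpenAI settings as environment variables.
# Values match the table above; replace the placeholder with
# your actual Versa API key.
export AZURE_OPENAI_ENDPOINT="https://unified-api.ucsf.edu/general"
export AZURE_OPENAI_DEPLOYMENT_NAME="gpt-5-2025-08-07"
export AZURE_OPENAI_API_VERSION="2024-10-21"
export AZURE_OPENAI_API_KEY="<your-versa-api-key>"
```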

Data stays within UCSF's Azure tenant, so it is safe for institution-approved use cases.

Option B: Amazon Bedrock (UCSF-hosted Anthropic)

  1. Log in via UCSF AWS SSO: aws sso login --profile <your-profile>
  2. In BioRouter: Settings → Models → Amazon Bedrock → Configure
  3. Set AWS_PROFILE and AWS_REGION when prompted
  4. Select Claude Sonnet or another Bedrock-hosted model

Default model: claude-sonnet-4-5. Data stays within UCSF's AWS environment.
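Steps 1 and 3 can be sketched in the shell as follows; the profile and region values are placeholders for your own UCSF AWS settings:

```shell
# Placeholder profile/region for UCSF Bedrock access; substitute
# the profile you configured for AWS SSO.
export AWS_PROFILE="ucsf-research"
export AWS_REGION="us-west-2"
# Then authenticate (run interactively):
# aws sso login --profile "$AWS_PROFILE"
```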

Option C: Local (Ollama, Air-gapped)

For maximum privacy, nothing ever leaves your device:

  1. Install Ollama from ollama.com/download
  2. Pull a model: ollama pull qwen3
  3. In BioRouter, select Ollama as your provider. No configuration needed.

Other Institutions

Check with your institution's IT or compliance office for approved AI hosting options. For institutions without managed AI services, commercial API keys (Anthropic, OpenAI) or local Ollama inference are recommended, but always verify compliance before processing any sensitive data.

โš ๏ธ Patient data & PHI: Never use personal commercial API keys with patient data, PHI, or other regulated research data. Use UCSF-managed services (Azure, Bedrock) or local Ollama only. Always verify with UCSF compliance before processing regulated data.

Providers & Models

BioRouter connects to a wide range of LLM providers. Switch providers at any time from Settings → Models without changing your workflows.

UCSF Institutional Providers

| Provider | Access | Default Model |
|---|---|---|
| Azure OpenAI (UCSF ChatGPT) | UCSF Versa API key · Setup guide → | gpt-5-2025-08-07 |
| Amazon Bedrock (UCSF Anthropic) | UCSF AWS SSO profile | Claude Sonnet 4.5 |

Commercial Cloud Providers

| Provider | Env Variable | Get API Key |
|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | platform.claude.com |
| OpenAI | OPENAI_API_KEY | platform.openai.com |
| Google Gemini | GOOGLE_API_KEY | aistudio.google.com |
| X.AI (Grok) | XAI_API_KEY | console.x.ai |
| OpenRouter | OPENROUTER_API_KEY | openrouter.ai |
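To see which of these keys are already present in your shell environment (variable names taken from the table above), a quick check:

```shell
# Report which commercial provider API keys are set in this shell.
# Variable names come from the table above.
summary=""
for var in ANTHROPIC_API_KEY OPENAI_API_KEY GOOGLE_API_KEY XAI_API_KEY OPENROUTER_API_KEY; do
  # Portable indirect check: is the variable named by $var non-empty?
  if eval "[ -n \"\${$var}\" ]"; then
    status="set"
  else
    status="not set"
  fi
  echo "$var: $status"
  summary="$summary$var "
done
```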

Local Models: Ollama

Install Ollama, pull any model, and select Ollama in BioRouter. Everything stays on your machine.

ollama pull qwen3           # Recommended general-purpose
ollama pull llama3.2        # Fast lightweight option
ollama pull deepseek-r1     # Strong reasoning model

Browse all available Ollama models →

Switching Providers

Desktop: Settings → Models → select a provider card → Configure or Launch.
CLI: biorouter configure → Select "Configure Providers"

Extensions, Skills & MCP

Extend BioRouter with MCP servers, reusable Skills, and built-in platform capabilities, from database access to browser automation.

Built-in Extensions

| Extension | What It Does | Default |
|---|---|---|
| Developer | File operations, shell commands, code search, text editing | ✅ On |
| Computer Controller | Web scraping, file caching, browser automation | Off |
| Memory | Remembers user preferences and context across sessions | Off |
| Auto Visualiser | Auto-generates data visualizations from conversation | Off |
| Chat Recall | Search across all past conversation history | Off |
| Code Execution | Run JavaScript in a sandboxed environment | Off |
| Skills | Load and invoke reusable instruction sets | ✅ On |
| Extension Manager | Discover, enable, and disable extensions mid-session | ✅ On |

Enabling Extensions

Desktop: Sidebar → Extensions → toggle on.
CLI: biorouter configure → Add Extension → Built-in Extension

Adding External MCP Servers

Desktop: Sidebar → Extensions → Add custom extension → enter type, ID, name, command, and env vars.

Config file (~/.config/biorouter/config.yaml):

extensions:
  github:
    name: GitHub
    cmd: npx
    args: ["-y", "@modelcontextprotocol/server-github"]
    enabled: true
    envs:
      GITHUB_PERSONAL_ACCESS_TOKEN: "<your_token>"
    type: stdio
    timeout: 300

Skills System

Skills are reusable .md instruction files that encode your team's workflows and best practices. Drop a skill file in your skills directory and BioRouter makes it callable in any session.

Skills can be shared across a team or institution to standardize analysis pipelines โ€” without sharing any underlying data.
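As a minimal sketch, a skill is just a Markdown file. The directory path, file name, and content below are illustrative assumptions, not a prescribed schema:

```shell
# Create a hypothetical skills directory and a minimal example skill.
# Path and content are illustrative; check your BioRouter settings
# for the actual skills directory.
skills_dir="$HOME/.config/biorouter/skills"
mkdir -p "$skills_dir"
cat > "$skills_dir/variant-triage.md" <<'EOF'
# Variant Triage

When asked to triage genetic variants:
1. Check population frequency first.
2. Summarize predicted functional impact.
3. Flag anything requiring clinical review.
EOF
```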

Recipes & Automation

Recipes package any workflow into a shareable, parameterizable, schedulable unit: the foundation of federated research collaboration in BioRouter.

What is a Recipe?

A Recipe is a YAML file that defines a prompt template, input parameters, and an optional schedule. Recipes enable reproducible, shareable workflows that travel across institutions without carrying any data.

Recipe Structure

name: "Literature Review"
description: "Summarize recent papers on a research topic"
parameters:
  - name: topic
    description: "Research topic to review"
    required: true
  - name: years
    description: "How many years back to cover"
    default: "3"
prompt: |
  You are a scientific literature reviewer.
  Summarize key findings on: {{ topic }}
  Cover publications from the past {{ years }} years.
  Focus on clinical relevance and methodology.

Running a Recipe

Desktop: Sidebar → Recipes → select a recipe → fill parameters → Run.

CLI:

biorouter run recipe.yaml --param topic="SPOKE knowledge graph" --param years=5

Scheduling Recipes

Add a schedule field using cron syntax to run a recipe automatically:

schedule: "0 8 * * 1"   # Every Monday at 8:00 AM

Scheduled recipes run via BioRouter's background service, even when the desktop app is closed.

Sharing Recipes

Recipe files are plain YAML text. Commit them to a shared repository, publish them on GitHub, or email them. Recipients open them with File → Open Recipe or biorouter run <recipe.yaml>. No data is embedded; only the workflow logic.
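For example, a recipe can be written to disk and handed to a colleague as a plain file. The file name and trimmed content below are hypothetical, following the "Recipe Structure" example earlier on this page:

```shell
# Write a minimal shareable recipe to disk (structure follows the
# Recipe Structure example above; file name is hypothetical).
cat > lit-review.yaml <<'EOF'
name: "Literature Review"
description: "Summarize recent papers on a research topic"
parameters:
  - name: topic
    description: "Research topic to review"
    required: true
prompt: |
  Summarize key findings on: {{ topic }}
EOF
# The recipient would then run (requires BioRouter installed):
# biorouter run lit-review.yaml --param topic="multiple sclerosis"
```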

Data Privacy Guide

BioRouter routes your inputs to an LLM provider. Privacy properties depend entirely on which provider you use; here's how to choose the right one.

โš ๏ธ Patient data / PHI: Only use UCSF-managed services (Azure OpenAI, Amazon Bedrock) or local Ollama when working with patient data, PHI, or any regulated research data. Never use personal commercial API keys for sensitive data. Always verify compliance with your institution before processing regulated data.

Provider Privacy Properties

| Provider | Data Stays Within | Best For |
|---|---|---|
| Ollama (local) | Your device only (no network) | Maximum privacy, air-gapped requirements |
| UCSF Azure OpenAI | UCSF's Azure tenant | Institution-approved clinical use cases |
| UCSF Amazon Bedrock | UCSF's AWS environment | Institution-approved clinical use cases |
| Commercial APIs (personal) | Provider's cloud infrastructure | De-identified or non-sensitive data only |

Best Practices

  • De-identify first: remove names, dates of birth, MRNs, and other identifiers before any AI processing
  • Minimize data exposure: provide only what's needed for the task
  • Prefer local models for exploratory work with real data
  • Protect your device: session logs at ~/.config/biorouter/logs/ may contain your inputs
  • Never share sessions that contain patient or sensitive data
  • Always verify with UCSF IT or your compliance office before processing regulated data
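If you need to clear local session logs after working with sensitive inputs, a sketch assuming the default log path noted above:

```shell
# Remove local session logs; the path is the default log
# directory noted in the best practices above.
log_dir="$HOME/.config/biorouter/logs"
if [ -d "$log_dir" ]; then
  rm -rf "$log_dir"/*
  echo "Cleared logs in $log_dir"
else
  echo "No log directory at $log_dir"
fi
```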

Architecture

BioRouter is a modular, plugin-based system built with a Rust backend and Electron + React frontend, connected via a local REST API.

Three-Layer Design

  • Interface: Desktop GUI (Electron + React 19) or CLI; accepts user input and renders responses
  • Agent Core: Rust-based reasoning loop managing LLM interaction, tool execution, context, and session state
  • Extensions: Pluggable MCP servers providing tools (file system, databases, web, code execution, custom agents)

Backend: Rust Workspace

| Crate | Role |
|---|---|
| biorouter | Core agent library: loop, providers, sessions, recipes, scheduling |
| biorouter-server | Local REST API server (biorouterd) the desktop communicates with |
| biorouter-cli | CLI binary: biorouter session, biorouter configure, etc. |
| biorouter-mcp | Built-in MCP servers (Developer, Computer Controller, etc.) |

Key dependencies: tokio (async), axum (HTTP), rmcp (Model Context Protocol), serde/serde_json, tiktoken-rs (token counting), sqlx/SQLite (session persistence), minijinja (recipe templates), tokio-cron-scheduler.

Frontend: Electron + React

Desktop app: Electron 39 + React 19 + TypeScript, built with Vite + Electron Forge. Communicates with the local biorouterd REST API.

Model Context Protocol (MCP)

All extensions use MCP, a standard protocol for LLM tool invocation. Any MCP-compatible server (local process or remote HTTP) can be added as a BioRouter extension, enabling a growing open ecosystem of research agents.

MCP Agents

BioRouter connects to local and remote MCP agents for specialized research tasks. Browse the curated agent ecosystem in the BAAM Marketplace.

Adding an Agent via Desktop UI

  1. Go to Sidebar → Extensions → Add custom extension
  2. Choose type: Command-line (stdio) for local agents or Streamable HTTP for remote
  3. Enter the agent name and command (e.g., uvx --from git+https://... agentname)
  4. Add any required environment variables
  5. Click Add and enable the extension

Adding an Agent via Config YAML

extensions:
  spokeagent:
    name: SPOKE Agent
    cmd: uvx
    args:
      - --from
      - "git+https://github.com/BaranziniLab/SPOKEAgent"
      - spokeagent
    enabled: true
    type: stdio
    timeout: 300

Remote HTTP Agents

extensions:
  remote-agent:
    name: My Remote Research Agent
    url: https://my-agent.example.com/mcp
    type: streamable_http
    enabled: true
    timeout: 300

Prerequisites

Most UCSF agents use uvx (requires Python + uv). Install with:

curl -LsSf https://astral.sh/uv/install.sh | sh   # macOS / Linux
# or: pip install uv

Playwright MCP uses npx, which requires Node.js: nodejs.org
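A quick way to check whether both launchers are already on your PATH before adding agents:

```shell
# Check for the launchers that BioRouter extensions rely on.
# Records "installed" or "missing" for each tool.
uvx_status=$(command -v uvx >/dev/null 2>&1 && echo installed || echo missing)
npx_status=$(command -v npx >/dev/null 2>&1 && echo installed || echo missing)
echo "uvx: $uvx_status"
echo "npx: $npx_status"
```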

Available Agents

Browse all available agents, copy install commands, and find GitHub links in the BAAM Marketplace →