An Expert is an LLM-powered capability that an AI agent can use. Experts are designed to complete small, discrete tasks efficiently, enabling the Orchestrator to compose complex workflows by chaining multiple experts together.

Documentation Index
Fetch the complete documentation index at: https://docs.corti.ai/llms.txt
Use this file to discover all available pages before exploring further.
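As a minimal sketch of that discovery step, the snippet below fetches the index and scans it for page links. The llms.txt convention is plain text with markdown-style links, so the parsing here is a simple heuristic, and the sample entry is illustrative rather than a real index line:

```python
import re
import urllib.request

INDEX_URL = "https://docs.corti.ai/llms.txt"

def fetch_index(url: str = INDEX_URL) -> str:
    """Download the raw documentation index."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def parse_links(index_text: str) -> dict[str, str]:
    """Map page titles to URLs from markdown-style '[title](url)' entries."""
    return dict(re.findall(r"\[([^\]]+)\]\((https?://[^)]+)\)", index_text))

# Entries in llms.txt typically look like "- [Title](url): description";
# this sample line is made up for illustration:
sample = "- [Experts](https://docs.corti.ai/experts): Overview of experts"
links = parse_links(sample)  # {"Experts": "https://docs.corti.ai/experts"}
```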
Expert Registry
Corti maintains a registry of experts that includes both first-party experts built by Corti and third-party integrations. You can browse the available experts in the Available Experts overview, or discover them programmatically through the Expert Registry API endpoint, which returns information about all available experts, including their capabilities, descriptions, and configuration requirements. The registry includes experts for various healthcare use cases, such as:
- Clinical reference lookups
- Medical coding
- Document generation
- Data extraction
- And more
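As a sketch of programmatic discovery, the snippet below queries a registry endpoint and filters the result by description. The URL path, bearer-token auth scheme, and response field names (`key`, `description`) are assumptions for illustration; check the API reference for the actual contract:

```python
import json
import urllib.request

# Hypothetical registry path; see the API reference for the real route.
REGISTRY_URL = "https://api.corti.ai/experts"

def list_experts(token: str, url: str = REGISTRY_URL) -> list[dict]:
    """Fetch the expert registry (assumes bearer-token auth and a JSON array)."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def experts_matching(experts: list[dict], keyword: str) -> list[str]:
    """Return the keys of experts whose description mentions a keyword."""
    return [e["key"] for e in experts
            if keyword.lower() in e.get("description", "").lower()]

# Filtering works the same on any registry-shaped payload:
registry = [
    {"key": "coding-expert", "description": "Medical coding from notes"},
    {"key": "pubmed-expert", "description": "PubMed literature search"},
]
matches = experts_matching(registry, "coding")  # ["coding-expert"]
```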
Common registry experts
A minimal sample of frequently used experts:

| Key | Purpose |
|---|---|
| `memory-expert` | Recall and analyze content from large in-request contexts and files |
| `coding-expert` | Assign diagnosis and procedure codes from notes |
| `medical-calculator-expert` | Compute BMI, HbA1c, glucose conversions, etc. |
| `drugbank-expert` | Drug information and interaction lookups |
| `posos-expert` | Medication guidance and prescribing decision support |
| `pubmed-expert` | PubMed literature search and abstracts |
| `clinical-trials-expert` | Search clinical trial registries |
| `web-search-expert` | Search and retrieve up-to-date web content |
| `interviewing-expert` | Drive structured questionnaire interviews |
Bring Your Own Expert
You can create custom experts by exposing an MCP (Model Context Protocol) server. When you register your MCP server, Corti wraps it in a custom LLM agent with a system prompt that you control. This allows you to:
- Integrate your own tools and data sources
- Create domain-specific experts tailored to your workflows
- Maintain control over the expert’s behavior through custom system prompts
- Leverage Corti’s orchestration and memory management while using your own tools
Expert Configuration
When creating a custom expert, you provide configuration that includes:
- Expert metadata: ID, name, and description
- System prompt: Controls how the LLM agent behaves and reasons about tasks
- MCP server configuration: Details about your MCP server, including transport type, authorization, and connection details (see MCP Authentication for details)
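Put together, a registration payload might look like the following sketch. The field names and the `formulary-expert` example are illustrative assumptions, not the documented schema; consult the Expert Configuration reference for the real shape:

```python
# Illustrative custom-expert configuration. Field names are assumptions,
# not the documented schema -- check the Expert Configuration reference.
expert_config = {
    # Expert metadata
    "id": "formulary-expert",
    "name": "Formulary Expert",
    "description": "Looks up drugs in our internal formulary.",
    # System prompt controlling the wrapped LLM agent's behavior
    "systemPrompt": (
        "You are a formulary assistant. Answer only from the lookup "
        "tools provided, and cite the formulary entry you used."
    ),
    # MCP server configuration (see MCP Authentication)
    "mcpServer": {
        "transport": "http",
        "url": "https://mcp.example.com/formulary",
        "authorization": {"type": "bearer"},
    },
}
```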
MCP Server Requirements
Your MCP server must:
- Implement the Model Context Protocol specification
- Expose tools via the standard MCP `tools/list` and `tools/call` endpoints
- Handle authentication
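To make the two required endpoints concrete, here is a transport-agnostic sketch of the JSON-RPC dispatch an MCP server performs for `tools/list` and `tools/call`. The `lookup_formulary` tool is invented for illustration, and a production server should use an MCP SDK and implement the full specification rather than this hand-rolled handler:

```python
# Minimal illustration of the two MCP methods an expert's server must serve.
# Transport (stdio/HTTP) and authentication are omitted for brevity.
TOOLS = {
    "lookup_formulary": {
        "description": "Look up a drug in the local formulary (example tool).",
        "inputSchema": {
            "type": "object",
            "properties": {"name": {"type": "string"}},
            "required": ["name"],
        },
    }
}

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC request to tools/list or tools/call."""
    method = request["method"]
    if method == "tools/list":
        result = {"tools": [{"name": n, **meta} for n, meta in TOOLS.items()]}
    elif method == "tools/call":
        args = request["params"]["arguments"]
        # Example behavior only; a real server dispatches to the named tool.
        result = {"content": [{"type": "text",
                               "text": f"formulary entry for {args['name']}"}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}
```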
Multi-Agent Composition
We’re working on exposing A2A (Agent-to-Agent) endpoints that will allow you to attach multiple agents together, enabling more sophisticated multi-agent workflows. This will provide:
- Direct agent-to-agent communication using the A2A protocol
- Composition of complex workflows across multiple agents
- Fine-grained control over agent interactions and data flow
Direct Expert Calls
We’re also working on enabling direct calls to experts, allowing you to use them directly in your workflows rather than only through agents. This will provide:
- Direct API access to individual experts
- Integration of experts into custom workflows
- More flexible composition patterns beyond agent-based orchestration
While AI chat is a useful mechanism, it’s not the only option! The Corti Agentic Framework is API-first, enabling synchronous or asynchronous usage across a range of modalities: scheduled batch jobs, clinical event triggers, UI widgets, and direct EHR system calls.

Let us know what types of use cases you’re exploring, from doctor-facing chatbots to system-facing automation backends.
Please contact us if you need more information about Experts or creating custom experts in the Corti Agentic Framework.