Overview
The crewship.toml file configures how your crew is built and deployed. Place it in your project root.
Crewship supports two formats:
Single deployment — one [deployment] section (default, created by crewship init)
Multi-deployment — multiple [deployments.<name>] sections for monorepos with several agents
Run crewship init to auto-generate this file with detected settings from your project.
Single Deployment (default)
CrewAI:

```toml
[deployment]
framework = "crewai"
entrypoint = "src.my_crew.crew:MyCrew"
profile = "slim"
python = "3.11"

[build]
exclude = ["tests"]
```

LangGraph:

```toml
[deployment]
framework = "langgraph"
entrypoint = "src.my_agent.graph:graph"
profile = "slim"
python = "3.11"

[build]
exclude = ["tests"]
```

LangGraph.js:

```toml
[deployment]
framework = "langgraph-js"
entrypoint = "./src/graph.ts:graph"
profile = "slim"

[build]
exclude = ["tests"]
```
Multi-Deployment
For monorepos with multiple agents sharing the same codebase, use named [deployments.<name>] sections:
```toml
[build]
exclude = ["tests"]

[deployments.research-agent]
framework = "crewai"
entrypoint = "research_crew.crew:ResearchCrew"
profile = "slim"
python = "3.11"

[deployments.writer-agent]
framework = "crewai"
entrypoint = "writer_crew.crew:WriterCrew"
```
Each named section becomes its own deployment on Crewship, using the section name as the project name (e.g. research-agent).
[apis] and [chat] can be set at the top level as global defaults, and overridden per deployment:
```toml
# Global defaults
[chat]
input_key = "query"

[deployments.support-agent]
framework = "crewai"
entrypoint = "support_crew.crew:SupportCrew"

[deployments.research-agent]
framework = "crewai"
entrypoint = "research_crew.crew:ResearchCrew"

# Override chat config for this deployment only
[deployments.research-agent.chat]
input_key = "topic"
```
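The same override pattern can be applied to [apis]. A sketch reusing the deployment names above (assuming a per-deployment [deployments.<name>.apis] table overrides the global default the same way [chat] does):

```toml
# Global default: stateless runs only
[apis]
enabled = ["run"]

[deployments.support-agent]
framework = "crewai"
entrypoint = "support_crew.crew:SupportCrew"

# This deployment also accepts threaded conversations
[deployments.support-agent.apis]
enabled = ["thread", "run"]
```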
[deployment] and [deployments.*] are mutually exclusive. Using both in the same file will produce an error.
Selecting a deployment
Use --name / -n on any command to target a specific deployment:
crewship deploy --name research-agent
crewship invoke --name writer-agent --input '{"topic": "AI"}'
crewship env set --name research-agent OPENAI_API_KEY=sk-...
If --name is omitted:
One named deployment — auto-selected
Multiple named deployments — interactive prompt (or error in CI)
Single [deployment] format — used directly (no --name needed)
Deployment IDs
After the first deploy, deployment_id is saved into each section automatically:
```toml
[deployments.research-agent]
framework = "crewai"
entrypoint = "research_crew.crew:ResearchCrew"
deployment_id = "dep_xxx1"  # auto-populated

[deployments.writer-agent]
framework = "crewai"
entrypoint = "writer_crew.crew:WriterCrew"
deployment_id = "dep_xxx2"  # auto-populated
```
Full Example
```toml
[deployment]
framework = "crewai"
entrypoint = "src.research_crew.crew:ResearchCrew"
python = "3.11"
profile = "slim"

[build]
exclude = ["tests"]

[build.install]
packages = ["ffmpeg", "imagemagick"]

[apis]
enabled = ["thread", "run"]

[chat]
input_key = "topic"

[runtime]
timeout = 600
memory = 512

[metadata]
description = "A crew that researches topics and writes reports"
tags = ["research", "writing"]
```
Configuration Reference
[deployment] / [deployments.<name>]
| Field | Type | Required / Default | Description |
|---|---|---|---|
| framework | string | ✅ | Framework to use: crewai, langgraph, or langgraph-js |
| entrypoint | string | ✅ | Entry point for your agent (format varies by framework) |
| python | string | "3.11" | Python version (3.10, 3.11, 3.12) — Python frameworks only |
| profile | string | "slim" | Base image profile |
| dockerfile | string | — | Custom Dockerfile path |
| deployment_id | string | — | Auto-populated after first deploy |
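The dockerfile field lets you supply your own image definition instead of a profile. A minimal sketch (the filename Dockerfile.custom is illustrative, and the path is assumed to be relative to the project root):

```toml
[deployment]
framework = "crewai"
entrypoint = "src.my_crew.crew:MyCrew"
dockerfile = "./Dockerfile.custom"  # illustrative path; replaces the profile base image
```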
Entrypoint Format
The entrypoint format depends on your framework:
CrewAI and LangGraph (Python) — Python module path:
module.path:ClassName_or_variable
# CrewAI: points to a class decorated with @CrewBase
entrypoint = "src.my_crew.crew:MyCrew"
# LangGraph: points to a compiled graph variable
entrypoint = "src.my_agent.graph:graph"
LangGraph.js — file path relative to project root:
./path/to/file.ts:exportName
# LangGraph.js: points to an exported compiled graph
entrypoint = "./src/graph.ts:graph"
Run crewship init to auto-detect your framework and entrypoint. For LangGraph projects, place a langgraph.json in your project root to enable auto-detection.
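For reference, a minimal langgraph.json as used by the LangGraph CLI looks roughly like this (the values are illustrative; consult the LangGraph CLI documentation for the full schema):

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/my_agent/graph.py:graph"
  },
  "env": ".env"
}
```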
Profile Options
| Profile | Description | Frameworks | Use case |
|---|---|---|---|
| slim | Minimal base environment | All | Most agents |
| browser | Includes Playwright + Chromium | crewai, langgraph | Web scraping, screenshots |
[build]
Build configuration options.
| Field | Type | Default | Description |
|---|---|---|---|
| exclude | array of string | [] | Paths to exclude from build |
[build.install]
Additional system packages to install.
```toml
[build.install]
packages = ["ffmpeg", "poppler-utils", "tesseract-ocr"]
```
Only packages available in the base image’s package manager (apt) are supported.
[runtime]
Runtime configuration.
| Field | Type | Default | Description |
|---|---|---|---|
| timeout | integer | 300 | Max execution time in seconds |
| memory | integer | 256 | Memory limit in MB |
```toml
[runtime]
timeout = 900  # 15 minutes
memory = 1024  # 1 GB
```
[apis]
Controls which APIs are enabled for the deployment. This affects both direct API access and which interaction modes are available in Slack.
| Field | Type | Default | Description |
|---|---|---|---|
| enabled | array of string | — | Which APIs to expose: "thread", "run" |
When omitted (or null), all APIs are enabled by default.
```toml
[apis]
enabled = ["thread", "run"]  # Both APIs enabled (same as default)
```

```toml
[apis]
enabled = ["run"]  # Only stateless runs — no thread/conversation support
```

```toml
[apis]
enabled = ["thread"]  # Only threaded conversations — no standalone runs
```
In Slack, the enabled APIs determine which interaction modes work:
Thread API — @mention conversations use threads for multi-turn context
Run API — /crewship run slash commands use single stateless runs
If both are enabled, @mention defaults to thread mode
Not sure which to use? Read Runs vs Threads: When to Use Which.
[chat]
Configures how chat messages (from Slack or other integrations) map to your crew’s input and output.
| Field | Type | Default | Description |
|---|---|---|---|
| input_key | string | "input" | The parameter name that receives the chat message |
| output_key | string | — | Field in the run output containing response messages |
```toml
[chat]
input_key = "topic"
```
When a user sends a message in Slack (e.g. “Tell me about quantum computing”), it gets mapped to:
```json
{"topic": "Tell me about quantum computing"}
```
If output_key is set, the platform extracts that field from the run result to display as the response:
```toml
[chat]
input_key = "query"
output_key = "messages"
```
If your crew already accepts an input parameter, you can omit [chat] entirely — it defaults to input_key = "input".
[metadata]
Optional metadata for organization.
| Field | Type | Description |
|---|---|---|
| description | string | Human-readable description |
| tags | array | Tags for filtering in Console |
```toml
[metadata]
description = "Researches topics and generates blog posts"
tags = ["content", "blog", "research"]
```
Validation
The CLI validates your crewship.toml on deploy:
Common validation errors:
❌ Error: Invalid crewship.toml
- name: must be lowercase with hyphens only
- deployment.entrypoint: required field missing
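For instance, a section that omits the required entrypoint field would trigger the second error above:

```toml
# Fails validation: entrypoint is a required field
[deployment]
framework = "crewai"
```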
Environment-specific Config
For different environments, use separate projects:
# Production
crewship deploy --project my-crew
# Staging
crewship deploy --project my-crew-staging
Set different environment variables per project:
crewship env set OPENAI_API_KEY=sk-prod-... --project my-crew
crewship env set OPENAI_API_KEY=sk-test-... --project my-crew-staging
Example Configurations
Basic CrewAI
```toml
[deployment]
framework = "crewai"
entrypoint = "src.simple_crew.crew:SimpleCrew"
```
Basic LangGraph
```toml
[deployment]
framework = "langgraph"
entrypoint = "src.my_agent.graph:graph"
python = "3.11"
```
Basic LangGraph.js
```toml
[deployment]
framework = "langgraph-js"
entrypoint = "./src/graph.ts:graph"
```
Web Scraping Crew
```toml
[deployment]
framework = "crewai"
entrypoint = "src.scraper.crew:ScraperCrew"
profile = "browser"
python = "3.11"

[runtime]
timeout = 600
memory = 1024
```
Document Processing
```toml
[deployment]
framework = "crewai"
entrypoint = "src.processor.crew:ProcessorCrew"
python = "3.11"

[build.install]
packages = ["poppler-utils", "tesseract-ocr"]

[runtime]
timeout = 900
memory = 2048
```
Slack Chatbot
```toml
[deployment]
framework = "crewai"
entrypoint = "src.support_bot.flows.chat_flow:ChatFlow"
python = "3.11"

[apis]
enabled = ["thread"]

[chat]
input_key = "query"
output_key = "messages"
```
Multi-Agent Monorepo (mixed frameworks)
```toml
[build]
exclude = ["tests", "notebooks"]

[deployments.research-agent]
framework = "crewai"
entrypoint = "agents.research.crew:ResearchCrew"
profile = "browser"
python = "3.11"

[deployments.writer-agent]
framework = "langgraph"
entrypoint = "agents.writer.graph:graph"
python = "3.11"

[deployments.frontend-agent]
framework = "langgraph-js"
entrypoint = "./agents/frontend/graph.ts:graph"
```
Deploy individual agents:
crewship deploy --name research-agent
crewship deploy --name writer-agent