## What We’ll Build
A research agent using LangGraph.js (TypeScript) that:
- Receives a topic as input
- Runs a researcher node to gather key facts
- Runs a reporter node to expand findings into a markdown report
## Prerequisites
- Node.js 20+
- Crewship CLI installed and authenticated
> **Tip:** Want to skip the setup? Clone the `langgraph-js-quickstart` repo and run `crewship deploy` to get a working agent deployed in minutes.
## Step 1: Create the Project

```bash
mkdir research-agent-js && cd research-agent-js
mkdir src
```
Your project structure:

```text
research-agent-js/
├── src/
│   └── graph.ts
├── langgraph.json
├── package.json
├── tsconfig.json
└── crewship.toml
```
## Step 2: Define Your Graph

Create `src/graph.ts`:
```typescript
import { ChatOpenAI } from '@langchain/openai'
import { StateGraph, Annotation } from '@langchain/langgraph'

const State = Annotation.Root({
  topic: Annotation<string>(),
  research: Annotation<string>(),
  report: Annotation<string>(),
})

const llm = new ChatOpenAI({ model: 'gpt-4o-mini', temperature: 0.7 })

async function researcher(state: typeof State.State) {
  const response = await llm.invoke([
    {
      role: 'system',
      content:
        'You are a senior researcher. Given a topic, produce 10 concise bullet ' +
        'points covering the most important facts, recent developments, and key ' +
        'insights. Return only the bullet list.',
    },
    { role: 'user', content: `Topic: ${state.topic}` },
  ])
  return { research: typeof response.content === 'string' ? response.content : String(response.content) }
}

async function reporter(state: typeof State.State) {
  const response = await llm.invoke([
    {
      role: 'system',
      content:
        'You are a senior reporting analyst. Given research bullet points, ' +
        'expand them into a well-structured markdown report with an introduction, ' +
        'detailed sections, and a conclusion.',
    },
    { role: 'user', content: `Research notes:\n${state.research}` },
  ])
  return { report: typeof response.content === 'string' ? response.content : String(response.content) }
}

const builder = new StateGraph(State)
  .addNode('researcher', researcher)
  .addNode('reporter', reporter)
  .addEdge('__start__', 'researcher')
  .addEdge('researcher', 'reporter')
  .addEdge('reporter', '__end__')

export const graph = builder.compile()
```
The exported `graph` is what Crewship invokes. Your input (e.g., `{"topic": "quantum computing"}`) becomes the initial state.
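To build intuition for how the state flows: each node receives the full state and returns only the keys it changed, which are merged back into the shared state before the next node runs. A minimal sketch of that merge semantics (illustrative only — LangGraph's real channel/reducer machinery is more involved, and `AgentState`/`applyUpdate` are made-up names):

```typescript
// Illustrative sketch of LangGraph-style state merging: nodes return
// partial updates that are shallow-merged into the shared state.
type AgentState = { topic: string; research: string; report: string }

function applyUpdate(state: AgentState, update: Partial<AgentState>): AgentState {
  return { ...state, ...update }
}

// The input becomes the initial state; unset keys start empty.
let state: AgentState = { topic: 'quantum computing', research: '', report: '' }

// researcher returns { research: ... }, reporter returns { report: ... }
state = applyUpdate(state, { research: '• Qubits can hold superpositions...' })
state = applyUpdate(state, { report: '# Quantum Computing\n\n## Introduction\n...' })
// state now carries topic, research, and report — the shape of the final run result
```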
## Step 3: Add langgraph.json

Create `langgraph.json` in the project root — this lets `crewship init` auto-detect the framework:
```json
{
  "node_version": "20",
  "graphs": {
    "agent": "./src/graph.ts:graph"
  }
}
```
## Step 4: Add Dependencies

Create `package.json`:
```json
{
  "name": "research-agent-js",
  "version": "1.0.0",
  "type": "module",
  "dependencies": {
    "@langchain/langgraph": "^0.2.0",
    "@langchain/openai": "^0.3.0"
  }
}
```
Create `tsconfig.json`:
```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "strict": true,
    "esModuleInterop": true
  }
}
```
## Step 5: Add Crewship Configuration

Run `crewship init` to auto-generate `crewship.toml` — or create it manually:
```toml
[deployment]
framework = "langgraph-js"
entrypoint = "./src/graph.ts:graph"
profile = "slim"

[build]
exclude = ["tests"]

[runtime]
timeout = 300
memory = 512
```
LangGraph.js uses a file-path entrypoint (with `./`), unlike Python frameworks, which use a module path. The `python` field is not applicable.
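The `path:export` convention is just a file path and an export name joined by a colon. A sketch of how such a string splits — illustrative only, not Crewship's actual loader (`parseEntrypoint` is a made-up name):

```typescript
// Splits an entrypoint like "./src/graph.ts:graph" into its file path
// and export name. Uses lastIndexOf so the path itself may contain colons.
function parseEntrypoint(entrypoint: string): { file: string; exportName: string } {
  const idx = entrypoint.lastIndexOf(':')
  if (idx === -1) throw new Error(`Invalid entrypoint: ${entrypoint}`)
  return { file: entrypoint.slice(0, idx), exportName: entrypoint.slice(idx + 1) }
}

const { file, exportName } = parseEntrypoint('./src/graph.ts:graph')
// file is the module to load; exportName is the compiled graph to invoke
```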
## Step 6: Set Environment Variables

```bash
crewship env set OPENAI_API_KEY=sk-proj-...
```
## Step 7: Deploy

Run `crewship deploy` and watch the build:

```text
📦 Packaging agent...
☁️ Uploading build context...
🔨 Building image...
✅ Deployed successfully!

Deployment: dep_abc123xyz
Project: research-agent-js
```
## Step 8: Run Your Agent

```bash
crewship invoke --input '{"topic": "quantum computing"}' --stream
```
Watch the execution:

```text
▶ Run started: run_xyz789
├─ [10:30:01] Starting graph execution
├─ [10:30:02] Node: researcher starting...
├─ [10:30:12] Node: researcher completed
├─ [10:30:13] Node: reporter starting...
├─ [10:30:45] Node: reporter completed
✅ Run completed in 44.8s
```
## Step 9: Access the Output

The run result is the final graph state — an object with all state keys:
```bash
curl https://api.crewship.dev/v1/runs/run_xyz789 \
  -H "Authorization: Bearer YOUR_API_KEY"
```
```json
{
  "status": "succeeded",
  "result": {
    "topic": "quantum computing",
    "research": "• Quantum computers use qubits...",
    "report": "# Quantum Computing\n\n## Introduction\n..."
  }
}
```
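Since the result is just the final state object, pulling out the report is a one-field access. A minimal sketch of consuming the response payload shown above (the `RunResponse` type and `extractReport` helper are illustrative names, typed to match the JSON shape):

```typescript
// Illustrative types matching the run-response shape shown above.
type RunResponse = {
  status: string
  result: { topic: string; research: string; report: string }
}

// Returns the markdown report, or throws if the run has not succeeded.
function extractReport(run: RunResponse): string {
  if (run.status !== 'succeeded') {
    throw new Error(`Run not finished: ${run.status}`)
  }
  return run.result.report
}

const run: RunResponse = {
  status: 'succeeded',
  result: {
    topic: 'quantum computing',
    research: '• Quantum computers use qubits...',
    report: '# Quantum Computing\n\n## Introduction\n...',
  },
}
const report = extractReport(run)
// report holds the markdown document produced by the reporter node
```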
To save the report as an artifact, write it to the `artifacts/` directory inside your node:

```typescript
import { writeFileSync, mkdirSync } from 'fs'

async function reporter(state: typeof State.State) {
  const response = await llm.invoke([...])
  const content = String(response.content)
  mkdirSync('artifacts', { recursive: true })
  writeFileSync('artifacts/report.md', content)
  return { report: content }
}
```
## Next Steps