Overview

This guide walks through building a chatbot that maintains conversation history across multiple messages. It uses:
  • CrewAI Flows to manage conversation state
  • Threads to persist state between runs
  • The Thread API (or Slack) for multi-turn interaction

Project Structure

my-chatbot/
  src/
    my_chatbot/
      agents/
        chat_agent.py
        suggest_agent.py
      flows/
        chat_flow.py
  crewship.toml
  pyproject.toml

Step 1: Define the Chat Agent

Create an agent that responds to user messages given conversation history:
src/my_chatbot/agents/chat_agent.py
from crewai import Agent, LLM

llm = LLM(model="openrouter/anthropic/claude-sonnet-4")

chat_agent = Agent(
    role="Helpful Assistant",
    goal="Assist users by answering their questions clearly and helpfully",
    backstory=(
        "You are a friendly and knowledgeable assistant. You provide clear, "
        "concise, and helpful responses to user questions."
    ),
    llm=llm,
    respect_context_window=True,
    verbose=False,
)
Optionally, add a second agent that suggests follow-up questions:
src/my_chatbot/agents/suggest_agent.py
from crewai import Agent, LLM

llm = LLM(model="openrouter/anthropic/claude-sonnet-4")

suggest_agent = Agent(
    role="Follow-up Question Generator",
    goal="Generate 3 relevant follow-up questions based on a conversation",
    backstory=(
        "You analyze conversations and suggest exactly 3 brief follow-up "
        "questions the user might want to ask next. Return only the 3 questions, "
        "one per line, without numbering or bullet points."
    ),
    llm=llm,
    respect_context_window=True,
    verbose=False,
)

Step 2: Create the Chat Flow

The flow manages conversation state — it appends each user message and assistant response to a messages list that persists across runs via thread state:
src/my_chatbot/flows/chat_flow.py
from crewai.flow.flow import Flow, listen, start
from pydantic import BaseModel

from my_chatbot.agents.chat_agent import chat_agent
from my_chatbot.agents.suggest_agent import suggest_agent


class ChatState(BaseModel):
    query: str = ""
    messages: list[dict] = []
    suggested_questions: list[str] = []


class ChatFlow(Flow[ChatState]):
    @start()
    def chat(self):
        """Add the user query to messages, call the agent, and return the response."""
        user_msg = {"role": "user", "content": self.state.query}
        history = self.state.messages + [user_msg]

        result = chat_agent.kickoff(history)

        assistant_msg = {"role": "assistant", "content": result.raw}
        self.state.messages = self.state.messages + [user_msg, assistant_msg]
        return result.raw

    @listen(chat)
    def suggest(self, chat_response: str):
        """Generate 3 follow-up questions based on the conversation."""
        recent = self.state.messages[-6:]
        conversation = "\n".join(f"{m['role']}: {m['content']}" for m in recent)

        result = suggest_agent.kickoff(
            f"Based on this conversation, suggest exactly 3 brief follow-up questions "
            f"the user might want to ask next. Return only the 3 questions, one per line, "
            f"without numbering or bullet points.\n\n{conversation}"
        )

        questions = [q.strip() for q in result.raw.strip().split("\n") if q.strip()][:3]
        self.state.suggested_questions = questions

        return {
            "messages": self.state.messages,
            "suggested_questions": self.state.suggested_questions,
        }
How state works: When running inside a thread, Crewship passes the previous thread state (including messages) into the flow. The flow appends the new exchange and returns the updated state, which Crewship saves as a checkpoint.
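The load/append/save cycle above can be sketched in plain Python. This is an illustration only: `run_turn` is a stand-in for one flow run, with a fake reply in place of the real `chat_agent.kickoff()` call, and the load/save steps are what Crewship's runtime performs for you.

```python
def run_turn(state: dict, query: str) -> dict:
    """One flow run: load prior messages, append the new exchange, return state."""
    messages = list(state.get("messages", []))
    messages.append({"role": "user", "content": query})
    reply = f"(assistant reply to: {query})"   # stub for the real LLM call
    messages.append({"role": "assistant", "content": reply})
    return {**state, "messages": messages}     # saved as the thread checkpoint

state = {}                                              # fresh thread
state = run_turn(state, "What are AI agents?")          # run 1
state = run_turn(state, "How do they differ?")          # run 2: sees run 1's messages
```

After two runs, `state["messages"]` holds all four messages, which is exactly the context the agent receives on the next turn.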

Step 3: Configure crewship.toml

crewship.toml
[deployment]
framework = "crewai"
entrypoint = "my_chatbot.flows.chat_flow:ChatFlow"
python = "3.11"

[chat]
input_key = "query"
output_key = "messages"
  • input_key = "query" — maps incoming messages to the query field in ChatState
  • output_key = "messages" — tells integrations (like Slack) to extract the messages field from the output
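Conceptually, the two keys work like this hypothetical sketch (the field names mirror the config above; the actual mapping is performed by Crewship's runtime, not code you write):

```python
chat_config = {"input_key": "query", "output_key": "messages"}

def to_flow_input(user_text: str) -> dict:
    """input_key: wrap incoming message text as the flow's input field."""
    return {chat_config["input_key"]: user_text}

def from_flow_output(output: dict) -> list:
    """output_key: the field integrations extract from the flow's return value."""
    return output[chat_config["output_key"]]

flow_input = to_flow_input("What are AI agents?")
# → {"query": "What are AI agents?"}
```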

Step 4: Deploy

crewship deploy

Using the Chatbot

Via the Thread API

Create a thread and send messages to it:
# Create a thread
curl -X POST https://api.crewship.dev/v1/threads \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"deployment_id": "dep_abc123"}'
# Returns: {"thread_id": "thr_xyz789", ...}

# Send first message
curl -X POST https://api.crewship.dev/v1/threads/thr_xyz789/runs \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"input": {"query": "What are AI agents?"}}'

# Send follow-up (thread state carries conversation history)
curl -X POST https://api.crewship.dev/v1/threads/thr_xyz789/runs \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"input": {"query": "How do they differ from simple chatbots?"}}'
Each run in the thread receives the accumulated messages from previous runs, so the agent has full conversation context.
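The same calls can be made from any HTTP client. This sketch only mirrors the curl requests above (endpoint path and payload shape are taken from them; nothing else is assumed), building the pieces of one run request:

```python
import json

API = "https://api.crewship.dev/v1"
API_KEY = "YOUR_API_KEY"

def run_request(thread_id: str, query: str):
    """Build the URL, headers, and JSON body for one run in a thread
    (mirrors the curl calls above)."""
    url = f"{API}/threads/{thread_id}/runs"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"input": {"query": query}})
    return url, headers, body

url, headers, body = run_request("thr_xyz789", "What are AI agents?")
# Send it with the HTTP client of your choice, e.g.:
#   import requests
#   requests.post(url, headers=headers, data=body)
```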

Via the CLI

# Create a thread
crewship thread create dep_abc123

# Chat in the thread
crewship invoke dep_abc123 --thread thr_xyz789 -i '{"query": "What are AI agents?"}'
crewship invoke dep_abc123 --thread thr_xyz789 -i '{"query": "How do they differ from simple chatbots?"}'

Via Slack

Once you’ve connected Slack and set this deployment as the default, users can simply mention the bot:
@MyBot What are AI agents?
Replies in the same Slack thread automatically continue the conversation — Crewship maps each Slack thread to a Crewship thread behind the scenes.