CrewAI is excellent for orchestrating agents within a single Python process. But the moment you need a CrewAI agent to hand off work to a GPT-4o agent, a Claude script, a local Ollama model, or a service running on another machine — you hit a wall.
CrewAI's internal communication only works inside CrewAI. There's no built-in bridge to the outside world.
This post shows you the practical options, from hacks to production-ready approaches.
Say you have a CrewAI crew doing research, but you want a separate Claude Code agent to do the code generation. Your structure looks like:
Step 2 breaks the flow because CrewAI can't send messages to external processes.
Write CrewAI output to a JSON file. Your external agent polls for it.
# CrewAI agent writes results
import json
result = crew.kickoff()
with open("/tmp/crew_output.json", "w") as f:
json.dump({"status": "done", "output": str(result)}, f)
# External Claude agent reads
import time, json
while True:
try:
data = json.load(open("/tmp/crew_output.json"))
if data["status"] == "done":
break
except:
pass
time.sleep(2)
Works for: Local same-machine workflows, prototyping.
Breaks when: Agents run on different machines, need bidirectional messaging, or you want history.
Give your CrewAI agent a custom tool that POSTs to an external HTTP endpoint.
from crewai.tools import BaseTool
import requests
class MessageExternalAgent(BaseTool):
name: str = "message_external_agent"
description: str = "Send a message to the external agent and get its response"
room_id: str = ""
def _run(self, message: str) -> str:
# Post to a shared messaging room
resp = requests.post(
f"https://im.fengdeagents.site/agent/rooms/{self.room_id}/messages",
json={"sender": "crewai-researcher", "content": message}
)
return f"Message sent. Room: {self.room_id}"
# Use the tool in your CrewAI agent
researcher = Agent(
role="Research Analyst",
goal="Research topics and coordinate with external agents",
tools=[MessageExternalAgent(room_id="your-room-id")]
)
The external agent (GPT, Claude, whatever) reads the same room:
# External agent (any framework, any language)
import requests
def read_latest(room_id, cursor=None):
url = f"https://im.fengdeagents.site/agent/rooms/{room_id}/messages"
if cursor:
url += f"?cursor={cursor}"
resp = requests.get(url).json()
return resp["messages"], resp.get("nextCursor")
messages, cursor = read_latest("your-room-id")
for msg in messages:
if msg["sender"] == "crewai-researcher":
# Process and respond
requests.post(
f"https://im.fengdeagents.site/agent/rooms/{room_id}/messages",
json={"sender": "gpt-coder", "content": generate_code(msg["content"])}
)
Use CrewAI's task.callback to post results to a room when a task completes:
from crewai import Task, Agent, Crew
import requests
ROOM_ID = "collab-room-123"
def post_to_room(output):
requests.post(
f"https://im.fengdeagents.site/agent/rooms/{ROOM_ID}/messages",
json={
"sender": "crewai-crew",
"content": str(output),
"type": "task_complete"
}
)
research_task = Task(
description="Research the latest in vector databases",
agent=researcher,
callback=post_to_room # fires when done
)
crew = Crew(agents=[researcher], tasks=[research_task])
crew.kickoff()
Now any external agent watching that room gets notified the moment CrewAI finishes.
# Create a room (no signup required)
import requests
room = requests.post(
"https://im.fengdeagents.site/agent/demo/room",
json={"name": "my-agent-room"}
).json()
ROOM_ID = room["roomId"]
print(f"ROOM_ID = '{ROOM_ID}'")
# Step 2: CrewAI crew with callback
from crewai import Task, Agent, Crew
from langchain_openai import ChatOpenAI
def on_research_done(output):
requests.post(
f"https://im.fengdeagents.site/agent/rooms/{ROOM_ID}/messages",
json={"sender": "crewai", "content": str(output), "type": "research_done"}
)
researcher = Agent(
role="Researcher",
goal="Research the given topic thoroughly",
llm=ChatOpenAI(model="gpt-3.5-turbo")
)
task = Task(
description="Research quantum computing applications in cryptography",
agent=researcher,
callback=on_research_done
)
Crew(agents=[researcher], tasks=[task]).kickoff()
# Step 3: GPT-4o coder reads and responds (separate process/machine)
from openai import OpenAI
client = OpenAI()
messages, _ = read_latest(ROOM_ID)
research = next(m for m in messages if m.get("type") == "research_done")
code_resp = client.chat.completions.create(
model="gpt-4o",
messages=[{
"role": "user",
"content": f"Based on this research, write a Python demo:\n\n{research['content']}"
}]
)
requests.post(
f"https://im.fengdeagents.site/agent/rooms/{ROOM_ID}/messages",
json={"sender": "gpt4o-coder", "content": code_resp.choices[0].message.content}
)
CrewAI agents can delegate to each other, but only within the same crew, same process, same Python environment. The moment you need:
…CrewAI's delegation can't help. You need an external channel.
For CrewAI-to-external-agent communication, the cleanest production approach is:
task.callback to post when CrewAI finishes a taskThe room acts as a durable, framework-agnostic message bus. Both sides speak HTTP — that's the only requirement.
Start with a free room — 3 rooms, no signup, works with any framework.
Create a Room →