If you’ve been building AI agents with LangGraph or CrewAI — defining state graphs, tool nodes, conditional edges, and bolting on separate memory stores — this guide shows how Pixeltable replaces the graph DSL with declarative tables.
Related use case: Agents & MCP

Concept Mapping

| Agent framework | Pixeltable equivalent |
| --- | --- |
| `StateGraph` / `AgentExecutor` | `pxt.create_table()` with computed columns |
| Graph nodes (functions) | Computed columns; dependencies resolved automatically |
| Graph edges / conditional routing | Column references; Pixeltable infers the DAG |
| `ToolNode` / `@tool` | `pxt.tools()` + `invoke_tools()` |
| `MemorySaver` / checkpointer | Tables are persistent by default |
| Separate vector DB for RAG | `add_embedding_index()` + `@pxt.query` |
| LangSmith for observability | `t.select()` on any column; every step is queryable |

Side by Side: Tool-Calling Agent

An agent that picks tools, calls them, and answers from the results. First, the LangGraph version:
from typing import Annotated, Sequence, TypedDict
from langchain_core.messages import BaseMessage, HumanMessage
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, END, add_messages
from langgraph.prebuilt import ToolNode
from langchain_core.tools import tool

class AgentState(TypedDict):
    messages: Annotated[Sequence[BaseMessage], add_messages]

@tool
def get_weather(city: str) -> str:
    """Get current weather for a city."""
    return f'Weather in {city}: 72°F, sunny'

@tool
def search_docs(query: str) -> str:
    """Search internal documents."""
    return f'Results for: {query}'

tools = [get_weather, search_docs]
model = ChatOpenAI(model='gpt-4o-mini').bind_tools(tools)

def call_model(state):
    return {'messages': [model.invoke(state['messages'])]}

def should_continue(state):
    last = state['messages'][-1]
    return 'tools' if last.tool_calls else END

workflow = StateGraph(AgentState)
workflow.add_node('agent', call_model)
workflow.add_node('tools', ToolNode(tools))
workflow.set_entry_point('agent')
workflow.add_conditional_edges(
    'agent', should_continue, {'tools': 'tools', END: END})
workflow.add_edge('tools', 'agent')
graph = workflow.compile()

result = graph.invoke(
    {'messages': [HumanMessage(content='Weather in SF?')]})
print(result['messages'][-1].content)
Packages: langgraph, langchain-openai, langchain-core, plus a vector DB client for RAG

What Changes

| | LangGraph / CrewAI | Pixeltable |
| --- | --- | --- |
| State | Ephemeral; lost when the process ends | Persistent; every row survives restarts |
| Caching | No built-in caching of tool results | Same input returns cached result |
| Observability | LangSmith (separate service + API key) | `agent.select(agent.tool_output).collect()` |
| Adding RAG | Separate vector DB integration | `add_embedding_index()` + `@pxt.query`; no extra service |
| Graph definition | Nodes, edges, conditional routing DSL | Computed columns; Pixeltable infers the DAG |
| MCP tools | Custom integration | `pxt.mcp_udfs()` loads tools from any MCP server |

Common Patterns

Adding persistent memory

from langgraph.checkpoint.memory import MemorySaver

checkpointer = MemorySaver()
graph = workflow.compile(checkpointer=checkpointer)
# In-process only — lost on restart

Adding RAG to an agent

from langchain_openai import OpenAIEmbeddings
from langchain_pinecone import PineconeVectorStore

embeddings = OpenAIEmbeddings()
vector_store = PineconeVectorStore(
    index_name='docs', embedding=embeddings)

@tool
def search_kb(query: str) -> str:
    """Search the knowledge base."""
    docs = vector_store.as_retriever().invoke(query)
    return '\n'.join(d.page_content for d in docs)
# Must add the tool to the graph, then re-compile...

Inspecting agent behavior

# Requires LangSmith: set LANGSMITH_API_KEY,
# LANGSMITH_PROJECT, then view traces in dashboard

Next Steps

Last modified on March 3, 2026