Autonomous AI agents with memory and tool use
Build AI agents that can call tools, remember context, and integrate with MCP servers, all backed by Pixeltable's persistent storage and orchestration.
Declarative Agents: Instead of imperative control flow, define your agent as a table with computed columns. Each row is a user query; computed columns define the reasoning chain (tool selection → execution → context retrieval → response). Pixeltable handles orchestration, caching, and persistence automatically.
Agent Capabilities
Tool Calling
Register UDFs and queries as tools that LLMs can invoke
Persistent Memory
Store conversation history and retrieved context in tables
MCP Integration
Connect to Model Context Protocol servers for external tools
RAG Retrieval
Semantic search over documents, images, and more
Data Lifecycle
1. Tools
2. Workflow
3. MCP
4. Memory
5. Deploy
Define Tool UDFs
Wrap any Python code as @pxt.udf tools: API calls, web scraping, database queries.
import os

import pixeltable as pxt
import requests
import yfinance as yf

@pxt.udf
def get_latest_news(topic: str) -> str:
    """Fetch latest news using NewsAPI."""
    response = requests.get(
        "https://newsapi.org/v2/everything",
        params={"q": topic, "apiKey": os.environ["NEWS_API_KEY"]},
    )
    articles = response.json().get("articles", [])[:3]
    return "\n".join(f"- {a['title']}" for a in articles)
@pxt.udf
def fetch_financial_data(ticker: str) -> str:
    """Fetch stock data using yfinance."""
    stock = yf.Ticker(ticker)
    info = stock.info
    return f"{info['shortName']}: ${info['currentPrice']}"
UDF Guide
Writing custom functions
Define Query Tools
Turn semantic search into callable tools with @pxt.query.
@pxt.query
def search_documents(query_text: str, user_id: str):
    """Search documents by semantic similarity."""
    sim = chunks.text.similarity(query_text)
    return (
        chunks.where((chunks.user_id == user_id) & (sim > 0.5))
        .order_by(sim, asc=False)
        .select(chunks.text, source_doc=chunks.document, sim=sim)
        .limit(20)
    )

@pxt.query
def search_video_transcripts(query_text: str):
    """Search video transcripts by text."""
    sim = transcript_sentences.text.similarity(query_text)
    return (
        transcript_sentences.where(sim > 0.7)
        .order_by(sim, asc=False)
        .select(transcript_sentences.text, source_video=transcript_sentences.video)
        .limit(20)
    )
Register Tools
Combine UDFs, queries, and MCP tools into a single registry with pxt.tools().
# Register tools from multiple sources
tools = pxt.tools(
    # UDFs - external API calls
    get_latest_news,
    fetch_financial_data,
    # Query functions - agentic RAG
    search_documents,
    search_video_transcripts,
)
Tool Calling Cookbook
Complete tool calling walkthrough
Create Agent Table
Define the workflow as a table with computed columns
# Main workflow table - rows trigger the agent pipeline
agent = pxt.create_table('agents.workflow', {
    'prompt': pxt.String,
    'timestamp': pxt.Timestamp,
    'user_id': pxt.String,
    'system_prompt': pxt.String,
    'max_tokens': pxt.Int,
    'temperature': pxt.Float,
})
Tool Selection (LLM)
First LLM call decides which tool to use
from pixeltable.functions.anthropic import messages, invoke_tools

# Step 1: LLM selects which tool to call
agent.add_computed_column(
    initial_response=messages(
        model='claude-sonnet-4-20250514',
        messages=[{'role': 'user', 'content': agent.prompt}],
        max_tokens=agent.max_tokens,
        tools=tools,  # Available tools
        tool_choice=tools.choice(required=True),  # Force tool selection
        model_kwargs={'system': agent.system_prompt}
    )
)
Tool Execution
Pixeltable executes the selected tool automatically with invoke_tools().
# Step 2: Execute the tool the LLM chose
agent.add_computed_column(
    tool_output=invoke_tools(tools, agent.initial_response)
)
Context Assembly
Combine tool output with retrieved context
# Parallel context retrieval (Pixeltable handles this)
agent.add_computed_column(doc_context=search_documents(agent.prompt, agent.user_id))
agent.add_computed_column(image_context=search_images(agent.prompt, agent.user_id))
agent.add_computed_column(memory_context=search_memory(agent.prompt, agent.user_id))

# Assemble everything into final context
agent.add_computed_column(
    final_context=assemble_context(
        agent.prompt,
        agent.tool_output,
        agent.doc_context,
        agent.memory_context,
    )
)
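The assemble_context UDF is referenced above but not shown. Its body might look like the plain function below, which would be wrapped with @pxt.udf to run as a computed column; the exact shapes of tool_output and the retrieved rows, and the 'text'/'content' keys, are assumptions based on the queries defined elsewhere on this page.

```python
# Hypothetical sketch of the assemble_context logic (decorate with @pxt.udf
# in the real pipeline). It folds tool results and retrieved context into a
# single Anthropic-style messages list.

def assemble_context(prompt: str, tool_output, doc_context, memory_context) -> list[dict]:
    """Build a messages list from tool output and retrieved context."""
    sections = []
    if tool_output:
        sections.append("TOOL RESULTS:\n" + "\n".join(str(t) for t in tool_output))
    if doc_context:
        sections.append("DOCUMENTS:\n" + "\n".join(d["text"] for d in doc_context))
    if memory_context:
        sections.append("MEMORY:\n" + "\n".join(m["content"] for m in memory_context))
    context_block = "\n\n".join(sections)
    return [{"role": "user", "content": f"{context_block}\n\nQUESTION: {prompt}"}]
```

Empty or missing context sections are simply skipped, so the same column works for rows where no tool was invoked or no documents matched.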
Final Response
Second LLM call generates the answer with full context
# Step 3: Generate final answer with all context
agent.add_computed_column(
    final_response=messages(
        model='claude-sonnet-4-20250514',
        messages=agent.final_context,
        max_tokens=agent.max_tokens,
        model_kwargs={'system': agent.system_prompt}
    )
)

# Extract answer text
agent.add_computed_column(
    answer=agent.final_response.content[0].text
)
Tool Calling Cookbook
Complete walkthrough
Connect MCP Server
Load tools from any MCP-compatible server with pxt.mcp_udfs().
# Load tools from MCP server
mcp_tools = pxt.mcp_udfs('http://localhost:8000/mcp')

# Combine with local tools
all_tools = pxt.tools(
    get_latest_news,
    fetch_financial_data,
    search_documents,
    *mcp_tools  # Add MCP tools
)
Pixeltable MCP Server
MCP server for Claude, Cursor, and AI IDEs
Build Your Own MCP Server
Expose Pixeltable tables as MCP tools for AI IDEs
# Example: JFK Files MCP Server
# Exposes document search to Claude Desktop, Cursor, etc.
from mcp.server.fastmcp import FastMCP
import pixeltable as pxt

mcp = FastMCP("jfk-files")

@mcp.tool()
def search_jfk_documents(query: str) -> str:
    """Search declassified JFK documents."""
    docs = pxt.get_table('jfk.documents')
    sim = docs.content.similarity(query)
    results = docs.order_by(sim, asc=False).limit(5).collect()
    return "\n".join(r['content'] for r in results)
JFK MCP Server
Example MCP server with document search
Chat History
Store conversation turns with semantic search
from pixeltable.functions.huggingface import sentence_transformer

# Chat history with embedding index
chat_history = pxt.create_table('agents.chat_history', {
    'role': pxt.String,  # 'user' or 'assistant'
    'content': pxt.String,
    'timestamp': pxt.Timestamp,
    'user_id': pxt.String
})
chat_history.add_embedding_index(
    'content',
    string_embed=sentence_transformer.using(model_id='all-MiniLM-L6-v2')
)

# Recent history query
@pxt.query
def get_recent_chat_history(user_id: str, limit: int = 4):
    return (
        chat_history.where(chat_history.user_id == user_id)
        .order_by(chat_history.timestamp, asc=False)
        .select(role=chat_history.role, content=chat_history.content)
        .limit(limit)
    )

# Semantic search over all history
@pxt.query
def search_chat_history(query_text: str, user_id: str):
    sim = chat_history.content.similarity(query_text)
    return (
        chat_history.where((chat_history.user_id == user_id) & (sim > 0.8))
        .order_by(sim, asc=False)
        .select(role=chat_history.role, content=chat_history.content, sim=sim)
        .limit(10)
    )
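Rows returned by get_recent_chat_history come back newest-first, but LLM chat APIs expect messages in chronological order. A hypothetical helper (not part of the Pixeltable API) for reshaping the rows, assuming each row is a dict with the 'role' and 'content' keys selected by the query:

```python
# Hypothetical helper: turn newest-first history rows into a chronological
# messages list suitable for an LLM chat call.

def history_to_messages(rows: list[dict]) -> list[dict]:
    """Reverse newest-first rows into chronological chat messages."""
    return [{"role": r["role"], "content": r["content"]} for r in reversed(rows)]
```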
Agent Memory Pattern
Persistent conversation context
Memory Bank
Store user-saved snippets (code, text, facts) for recall
# Selective memory - things the user explicitly saves
memory_bank = pxt.create_table('agents.memory_bank', {
    'content': pxt.String,
    'type': pxt.String,  # 'code', 'text', 'fact'
    'language': pxt.String,  # For code: 'python', 'javascript', etc.
    'context_query': pxt.String,  # What triggered this save
    'timestamp': pxt.Timestamp,
    'user_id': pxt.String
})
memory_bank.add_embedding_index('content', string_embed=embed_fn)

@pxt.query
def search_memory(query_text: str, user_id: str):
    sim = memory_bank.content.similarity(query_text)
    return (
        memory_bank.where((memory_bank.user_id == user_id) & (sim > 0.8))
        .order_by(sim, asc=False)
        .select(
            content=memory_bank.content,
            type=memory_bank.type,
            language=memory_bank.language,
            context_query=memory_bank.context_query,
        )
        .limit(10)
    )
Multimodal Knowledge Base
Index documents, images, video, and audio for retrieval
from pixeltable.functions.huggingface import clip
from pixeltable.iterators import DocumentSplitter, FrameIterator

# Documents with chunking
documents = pxt.create_table('agents.collection', {
    'document': pxt.Document,
    'uuid': pxt.String,
    'user_id': pxt.String
})
chunks = pxt.create_view('agents.chunks', documents,
    iterator=DocumentSplitter.create(
        document=documents.document,
        separators='paragraph',
        metadata='title, heading, page'
    )
)
chunks.add_embedding_index('text', string_embed=embed_fn)

# Images with CLIP
images = pxt.create_table('agents.images', {
    'image': pxt.Image,
    'user_id': pxt.String
})
images.add_embedding_index('image', embedding=clip.using(model_id='openai/clip-vit-large-patch14'))

# Video frames
videos = pxt.create_table('agents.videos', {'video': pxt.Video, 'user_id': pxt.String})
video_frames = pxt.create_view('agents.video_frames', videos,
    iterator=FrameIterator.create(video=videos.video, fps=1)
)
video_frames.add_embedding_index('frame', embedding=clip.using(model_id='openai/clip-vit-large-patch14'))
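The workflow earlier on this page calls a search_images query that is never shown. With the CLIP index above in place, a sketch might look like the following; the 0.5 threshold and limit of 5 are illustrative choices, not values from the original, and it relies on the CLIP index supporting text-to-image similarity.

```python
# Hypothetical sketch of the search_images query used in the workflow.
# The CLIP index is cross-modal, so a text query can be scored against
# image embeddings directly.
@pxt.query
def search_images(query_text: str, user_id: str):
    sim = images.image.similarity(query_text)
    return (
        images.where((images.user_id == user_id) & (sim > 0.5))
        .order_by(sim, asc=False)
        .select(images.image, sim=sim)
        .limit(5)
    )
```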
RAG Pipeline
Document retrieval patterns
Flask/FastAPI Endpoint
Expose your agent via HTTP API
from flask import Flask, request
from datetime import datetime
import pixeltable as pxt

app = Flask(__name__)
agent = pxt.get_table('agents.workflow')
chat_history = pxt.get_table('agents.chat_history')

@app.route("/chat", methods=["POST"])
def chat():
    data = request.json
    user_id = data["user_id"]
    prompt = data["message"]

    # Store user message
    chat_history.insert([{
        "role": "user",
        "content": prompt,
        "timestamp": datetime.now(),
        "user_id": user_id
    }])

    # Trigger agent workflow (computed columns run automatically)
    agent.insert([{
        "prompt": prompt,
        "timestamp": datetime.now(),
        "user_id": user_id,
        "system_prompt": "You are a helpful assistant.",
        "max_tokens": 1024,
        "temperature": 0.7,
    }])

    # Get the answer (already computed)
    result = agent.order_by(agent.timestamp, asc=False).limit(1).collect()
    answer = result[0]["answer"]

    # Store assistant response
    chat_history.insert([{
        "role": "assistant",
        "content": answer,
        "timestamp": datetime.now(),
        "user_id": user_id
    }])

    return {"response": answer}
Deployment Guide
Production deployment patterns
Cloud Deployment (Coming Soon)
One-command deployment with pxt serve and pxt deploy.
Cloud Offering
Learn about upcoming Endpoints and Live Tables
Built with Pixeltable
Pixelbot
Multimodal AI agent with infinite memory, file search, and image generation
Pixelagent
Lightweight agent framework with built-in memory and tool orchestration
Pixelmemory
Persistent memory layer for AI applications
MCP Server
Model Context Protocol server for Claude, Cursor, and AI IDEs