Lattice helps you build AI agents in Go with clean abstractions for LLMs, tool calling, retrieval-augmented memory, and multi-agent coordination. Focus on your domain logic while Lattice handles the orchestration plumbing.
Building production AI agents requires more than LLM calls: you also need tool calling, retrieval-augmented memory, and multi-agent coordination. Lattice provides all of this with idiomatic Go interfaces and minimal dependencies.
Implement the Tool interface once and the tool is available everywhere automatically.

Lattice is built for production speed:
See PERFORMANCE_SUMMARY.md for detailed benchmarks.
git clone https://github.com/Protocol-Lattice/go-agent.git
cd go-agent
go mod download
package main
import (
"context"
"flag"
"log"
"github.com/Protocol-Lattice/go-agent/src/adk"
adkmodules "github.com/Protocol-Lattice/go-agent/src/adk/modules"
"github.com/Protocol-Lattice/go-agent/src/subagents"
"github.com/Protocol-Lattice/go-agent/src/memory"
"github.com/Protocol-Lattice/go-agent/src/memory/engine"
"github.com/Protocol-Lattice/go-agent/src/models"
)
func main() {
qdrantURL := flag.String("qdrant-url", "http://localhost:6333", "Qdrant base URL")
qdrantCollection := flag.String("qdrant-collection", "adk_memories", "Qdrant collection name")
flag.Parse()
ctx := context.Background()
// --- Shared runtime
researcherModel, err := models.NewGeminiLLM(ctx, "gemini-2.5-pro", "Research summary:")
if err != nil {
log.Fatalf("create researcher model: %v", err)
}
memOpts := engine.DefaultOptions()
adkAgent, err := adk.New(ctx,
adk.WithDefaultSystemPrompt("You orchestrate a helpful assistant team."),
adk.WithSubAgents(subagents.NewResearcher(researcherModel)),
adk.WithModules(
adkmodules.NewModelModule("gemini-model", func(mctx context.Context) (models.Agent, error) {
return models.NewGeminiLLM(mctx, "gemini-2.5-pro", "Swarm orchestration:")
}),
adkmodules.InQdrantMemory(100000, *qdrantURL, *qdrantCollection, memory.AutoEmbedder(), &memOpts),
),
)
if err != nil {
log.Fatal(err)
}
agent, err := adkAgent.BuildAgent(ctx)
if err != nil {
log.Fatal(err)
}
// Use the agent
resp, err := agent.Generate(ctx, "SessionID", "What is pgvector?")
if err != nil {
log.Fatal(err)
}
log.Println(resp)
}
# Interactive CLI demo
go run cmd/demo/main.go
# Multi-agent coordination
go run cmd/team/main.go
# Quick start example
go run cmd/quickstart/main.go
# CodeMode + Agent as UTCP Tool
go run cmd/example/codemode/main.go
# Multi-Agent Workflow Orchestration
go run cmd/example/codemode_utcp_workflow/main.go
# Agent-to-Agent Communication via UTCP
go run cmd/example/agent_as_tool/main.go
go run cmd/example/agent_as_utcp_codemode/main.go
# Agent State Persistence (Checkpoint/Restore)
go run cmd/example/checkpoint/main.go
cmd/example/codemode/main.go: Demonstrates how to use CodeMode to enable agents to call UTCP tools (including other agents) via generated Go code. Shows the pattern: User Input → LLM generates codemode.CallTool() → UTCP executes tool.
cmd/example/codemode_utcp_workflow/main.go: Shows orchestrating multi-step workflows where multiple specialist agents (analyst, writer, reviewer) work together through UTCP tool calls.
cmd/example/agent_as_tool/main.go: Demonstrates exposing agents as UTCP tools using RegisterAsUTCPProvider(), enabling agent-to-agent communication and hierarchical agent architectures.

cmd/example/agent_as_utcp_codemode/main.go: Shows an agent exposed as a UTCP tool and orchestrated via CodeMode, illustrating natural-language-to-tool-call generation.

cmd/example/checkpoint/main.go: Demonstrates how to checkpoint an agent's state to disk and restore it later, preserving conversation history and shared space memberships.

lattice-agent/
├── cmd/
│ ├── demo/ # Interactive CLI with tools, delegation, and memory
│ ├── quickstart/ # Minimal getting-started example
│ └── team/ # Multi-agent coordination demos
├── pkg/
│ ├── adk/ # Agent Development Kit and module system
│ ├── memory/ # Memory engine and vector store adapters
│ ├── models/ # LLM provider adapters (Gemini, Ollama, Anthropic)
│   ├── subagents/ # Pre-built specialist agent personas
│   └── tools/     # Built-in tools (echo, calculator, time, etc.)
| Variable | Description | Required |
|---|---|---|
| GOOGLE_API_KEY | Gemini API credentials | For Gemini models |
| GEMINI_API_KEY | Alternative to GOOGLE_API_KEY | For Gemini models |
| DATABASE_URL | PostgreSQL connection string | For persistent memory |
| ADK_EMBED_PROVIDER | Embedding provider override | No (defaults to Gemini) |
export GOOGLE_API_KEY="your-api-key-here"
export DATABASE_URL="postgres://user:pass@localhost:5432/lattice?sslmode=disable"
export ADK_EMBED_PROVIDER="gemini"
Lattice includes a sophisticated memory system with retrieval-augmented generation (RAG):
store := memory.NewInMemoryStore() // or PostgreSQL/Qdrant
engine := memory.NewEngine(store, memory.Options{}).
WithEmbedder(yourEmbedder)
sessionMemory := memory.NewSessionMemory(
memory.NewMemoryBankWithStore(store),
8, // context window size
).WithEngine(engine)
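Independent of the library, the retrieve-then-prompt flow behind RAG memory can be sketched in plain Go. This toy ranker uses keyword overlap weighted by importance instead of vector similarity, purely to show the shape of the pipeline (all names here are hypothetical):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// memoryRecord is a stand-in for whatever record type the real store returns.
type memoryRecord struct {
	Content    string
	Importance float64
}

// retrieveTopK ranks memories by naive keyword overlap with the query,
// weighted by importance, and returns the best k. Real engines use
// embeddings and vector similarity; this only illustrates the flow.
func retrieveTopK(memories []memoryRecord, query string, k int) []memoryRecord {
	words := strings.Fields(strings.ToLower(query))
	type scored struct {
		rec   memoryRecord
		score float64
	}
	ranked := make([]scored, 0, len(memories))
	for _, m := range memories {
		lc := strings.ToLower(m.Content)
		var hits float64
		for _, w := range words {
			if strings.Contains(lc, w) {
				hits++
			}
		}
		ranked = append(ranked, scored{m, hits * m.Importance})
	}
	sort.Slice(ranked, func(i, j int) bool { return ranked[i].score > ranked[j].score })
	if k > len(ranked) {
		k = len(ranked)
	}
	out := make([]memoryRecord, 0, k)
	for _, s := range ranked[:k] {
		out = append(out, s.rec)
	}
	return out
}

func main() {
	memories := []memoryRecord{
		{"User prefers Python", 0.9},
		{"User works with PostgreSQL", 0.8},
		{"User likes hiking", 0.3},
	}
	// The top-k results would then be injected into the LLM prompt.
	for _, m := range retrieveTopK(memories, "python postgresql", 2) {
		fmt.Println(m.Content)
	}
}
```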
Features:
Create custom tools by implementing a simple interface:
package tools
import (
"context"
"fmt"
"strings"
"github.com/Protocol-Lattice/go-agent"
)
// EchoTool repeats the provided input. Useful for testing tool wiring.
type EchoTool struct{}
func (e *EchoTool) Spec() agent.ToolSpec {
return agent.ToolSpec{
Name: "echo",
Description: "Echoes the provided text back to the caller.",
InputSchema: map[string]any{
"type": "object",
"properties": map[string]any{
"input": map[string]any{
"type": "string",
"description": "Text to echo back.",
},
},
"required": []any{"input"},
},
}
}
func (e *EchoTool) Invoke(_ context.Context, req agent.ToolRequest) (agent.ToolResponse, error) {
raw := req.Arguments["input"]
if raw == nil {
return agent.ToolResponse{Content: ""}, nil
}
return agent.ToolResponse{Content: strings.TrimSpace(fmt.Sprint(raw))}, nil
}
Register tools with the module system and they’re automatically available to all agents.
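The register-once pattern can be sketched with a plain map-based catalog. All names below are hypothetical simplifications; the real module system has a richer interface:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// tool is a minimal stand-in for the real agent.Tool interface.
type tool interface {
	Name() string
	Invoke(input string) (string, error)
}

// catalog dispatches tool calls by name, the way a module system
// makes registered tools available to every agent.
type catalog struct{ tools map[string]tool }

func newCatalog() *catalog { return &catalog{tools: map[string]tool{}} }

func (c *catalog) register(t tool) { c.tools[t.Name()] = t }

func (c *catalog) call(name, input string) (string, error) {
	t, ok := c.tools[name]
	if !ok {
		return "", errors.New("tool " + name + " not registered")
	}
	return t.Invoke(input)
}

type echoTool struct{}

func (echoTool) Name() string { return "echo" }
func (echoTool) Invoke(input string) (string, error) {
	return strings.TrimSpace(input), nil
}

func main() {
	c := newCatalog()
	c.register(echoTool{})
	out, _ := c.call("echo", "  hello  ")
	fmt.Printf("%q\n", out) // "hello"
}
```

Unknown names fail fast, which is exactly the "tool not registered" error described in Troubleshooting below.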
Use Shared Spaces to coordinate multiple agents with shared memory.
Perfect for:
Lattice treats Agents as first-class Tools. This allows you to expose any agent as a tool to another agent, enabling powerful hierarchical or mesh architectures.
Why use this?
When you wrap an agent as a tool:
The sub-agent runs in its own isolated session (e.g. parent_session.sub.tool_name). It has its own memory and history, preventing the parent's context window from being polluted with the sub-agent's internal thought process.

// 1. Create a specialist agent
researcher, _ := agent.New(agent.Options{
SystemPrompt: "You are a researcher. Search for facts.",
// ...
})
// 2. Create a manager agent that uses the researcher
manager, _ := agent.New(agent.Options{
SystemPrompt: "You are a manager. Delegate tasks.",
Tools: []agent.Tool{
// Expose the researcher as a tool!
researcher.AsTool("researcher", "Delegates research tasks to the specialist."),
},
})
// 3. The manager can now call the researcher tool
// User: "Find out why the sky is blue"
// Manager -> calls tool "researcher" -> Researcher Agent runs -> returns result -> Manager answers
In addition to the internal Tool interface, Lattice agents can be exposed as Universal Tool Calling Protocol (UTCP) tools. This allows them to be consumed by any UTCP-compliant client, enabling cross-language and cross-platform agent orchestration.
Key Functions:
agent.AsUTCPTool(name, description): Wraps an agent as a standalone UTCP tools.Tool struct.

agent.RegisterAsUTCPProvider(ctx, client, name, description): Automatically registers the agent as a tool provider on a UTCP client.

Example:
// 1. Create your specialist agent
researcher, _ := agent.New(agent.Options{
SystemPrompt: "You are a researcher.",
})
// 2. Initialize a UTCP client
client, err := utcp.NewUTCPClient(ctx, nil, nil, nil)
if err != nil {
log.Fatal(err)
}
// 3. Register the agent as a UTCP provider
// This makes the agent available as a tool named "researcher.agent"
err = researcher.RegisterAsUTCPProvider(ctx, client, "researcher.agent", "Deep research agent")
if err != nil {
log.Fatal(err)
}
// 4. Call the agent via UTCP
// The tool accepts 'instruction' and optional 'session_id'
result, err := client.CallTool(ctx, "researcher.agent", map[string]any{
"instruction": "Analyze the latest trends in AI agents",
}, "researcher", nil)
fmt.Println(result["response"])
Benefits:
Lattice supports Checkpointing and Restoration, allowing you to pause agents mid-task, persist their state to disk or a database, and resume them later (even after a crash or restart).
Key Methods:
agent.Checkpoint(): Serializes the agent's state (system prompt, short-term memory, shared space memberships) to a []byte.

agent.Restore(data []byte): Rehydrates an agent instance from a checkpoint.

Example:
// 1. Checkpoint the running agent instance (named myAgent here so it
// doesn't shadow the agent package used below)
data, err := myAgent.Checkpoint()
if err != nil {
log.Fatal(err)
}
// Save 'data' to file/DB...
// 2. Restore the agent (later or after crash)
// Create a fresh agent instance first
newAgent, err := agent.New(opts)
if err != nil {
log.Fatal(err)
}
// Restore state
if err := newAgent.Restore(data); err != nil {
log.Fatal(err)
}
// newAgent now has the same memory and context as the original
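The checkpoint/restore pattern itself is easy to see in isolation. This is a from-scratch sketch using JSON over a hypothetical minimal state struct, not the library's actual (internal) serialization format:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// agentState is a hypothetical, minimal slice of what a checkpoint might
// contain; the real format is internal to the library.
type agentState struct {
	SystemPrompt string   `json:"system_prompt"`
	History      []string `json:"history"`
	SharedSpaces []string `json:"shared_spaces"`
}

// checkpoint serializes the state to bytes suitable for a file or DB row.
func checkpoint(s agentState) ([]byte, error) { return json.Marshal(s) }

// restore rehydrates state from a previously saved checkpoint.
func restore(data []byte) (agentState, error) {
	var s agentState
	err := json.Unmarshal(data, &s)
	return s, err
}

func main() {
	orig := agentState{
		SystemPrompt: "You are a researcher.",
		History:      []string{"user: hi", "agent: hello"},
		SharedSpaces: []string{"team-alpha"},
	}
	data, _ := checkpoint(orig)   // persist 'data' to file/DB
	later, _ := restore(data)     // later, possibly in a new process
	fmt.Println(later.SystemPrompt == orig.SystemPrompt)
}
```

Because the checkpoint is just bytes, it survives process restarts, which is what enables crash recovery.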
Token-Oriented Object Notation (TOON) is integrated into Lattice to dramatically reduce token consumption when passing structured data to and from LLMs. This is especially critical for AI agent workflows where context windows are precious and API costs scale with token usage.
Traditional JSON is verbose and wastes tokens on repetitive syntax. Consider passing agent memory or tool responses:
{
"memories": [
{ "id": 1, "content": "User prefers Python", "importance": 0.9, "timestamp": "2025-01-15" },
{ "id": 2, "content": "User is building CLI tools", "importance": 0.85, "timestamp": "2025-01-14" },
{ "id": 3, "content": "User works with PostgreSQL", "importance": 0.8, "timestamp": "2025-01-13" }
]
}
Token count: ~180 tokens
TOON compresses the same data by eliminating redundancy:
memories[3]{id,content,importance,timestamp}:
1,User prefers Python,0.9,2025-01-15
2,User is building CLI tools,0.85,2025-01-14
3,User works with PostgreSQL,0.8,2025-01-13
Token count: ~85 tokens
Savings: ~53% fewer tokens
TOON is particularly effective for:
When your agent queries its memory system, TOON can encode dozens of memories in the space where JSON would fit only a handful:
// Retrieve memories
memories := sessionMemory.Retrieve(ctx, "user preferences", 20)
// Encode with TOON for LLM context
encoded, _ := toon.Marshal(memories, toon.WithLengthMarkers(true))
// Pass to LLM with 40-60% fewer tokens than JSON
prompt := fmt.Sprintf("Based on these memories:\n%s\n\nAnswer the user's question.", encoded)
Despite its compactness, TOON remains readable for debugging and development. The format explicitly declares its schema, making it self-documenting:
users[2]{id,name,role}:
1,Alice,admin
2,Bob,user
You can immediately see: 2 users, with fields id/name/role, followed by their values.
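The tabular encoding idea is simple enough to reproduce from scratch. This is a toy encoder for uniform rows, not the official toon-go library, and it only handles the header-plus-CSV-rows case shown above:

```go
package main

import (
	"fmt"
	"strings"
)

// encodeTabular emits a TOON-style block for uniform rows: a header
// declaring the row count and field names, then one comma-joined line
// per row. Illustration only; the real format handles nesting, quoting,
// and more.
func encodeTabular(name string, fields []string, rows [][]string) string {
	var b strings.Builder
	fmt.Fprintf(&b, "%s[%d]{%s}:\n", name, len(rows), strings.Join(fields, ","))
	for _, row := range rows {
		b.WriteString("  " + strings.Join(row, ",") + "\n")
	}
	return b.String()
}

func main() {
	out := encodeTabular("users",
		[]string{"id", "name", "role"},
		[][]string{{"1", "Alice", "admin"}, {"2", "Bob", "user"}})
	fmt.Print(out)
}
```

The field names appear once in the header instead of once per record, which is where the token savings over JSON come from.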
Lattice automatically uses TOON for internal data serialization. To use it in your custom tools or memory adapters:
import "github.com/toon-format/toon-go"
// Encode your structs
encoded, err := toon.Marshal(data, toon.WithLengthMarkers(true))
// Decode back to structs
var result MyStruct
err = toon.Unmarshal(encoded, &result)
// Or decode to dynamic maps
var doc map[string]any
err = toon.Unmarshal(encoded, &doc)
For more details, see the TOON specification.
Bottom Line: TOON helps your agents do more with less, turning token budgets into a competitive advantage rather than a constraint.
The Tool Orchestrator is an intelligent decision engine that lets the LLM choose when and how to call UTCP tools. It analyzes user input, evaluates available tools, and returns a structured JSON plan describing the next action.
This brings go-agent to the same capability tier as OpenAI's tool choice, but with fully pluggable UTCP backends and Go-native execution.
Produces a strict JSON decision object:
{
"use_tool": true,
"tool_name": "search.files",
"arguments": { "query": "config" },
"reason": "User asked to look for configuration files"
}
Collect Tool Definitions
rendered := a.renderUtcpToolsForPrompt()
Build the Orchestration Prompt
choicePrompt := fmt.Sprintf(`
You are a UTCP tool selection engine.
A user asked:
%q
You have access to these UTCP tools:
%s
You can also discover tools dynamically using:
search_tools("<query>", <limit>)
Return ONLY JSON:
{ "use_tool": ..., "tool_name": "...", "arguments": { }, "reason": "..." }
`, userInput, rendered)
LLM Makes a Decision (via TOON)
Agent Executes the Tool
The agent dispatches the decision through CallTool, SearchTools, or CallToolStream, and the result becomes the agent's final response.
The orchestrator uses TOON as its structured reasoning layer:
This yields stable, deterministic tool choice behavior.
find all files containing “db connection” in the workspace
{
"use_tool": true,
"tool_name": "search.files",
"arguments": {
"query": "db connection",
"limit": 20
},
"reason": "User wants to search through the workspace files"
}
The search.files UTCP tool is invoked, and its direct output is returned to the user.
UTCP tool calls can run inside the Go DSL:
r, _ := codemode.CallTool("echo", map[string]any{"input": "hi"})
The orchestrator can:
This makes go-agent one of the first Go frameworks with multi-step, LLM-driven tool-routing.
# Run all tests
go test ./...
# Run with coverage
go test -cover ./...
# Run specific package tests
go test ./pkg/memory/...
We follow standard Go conventions:
gofmt for formatting

New LLM Provider: implement the models.LLM interface in pkg/models/

New Tool: implement the agent.Tool interface in pkg/tools/

New Memory Backend: implement the memory.VectorStore interface

pgvector extension (optional, for persistent memory)

For persistent memory with vector search:
CREATE EXTENSION IF NOT EXISTS vector;
The memory module handles schema migrations automatically.
Missing pgvector extension
ERROR: type "vector" does not exist
Solution: Run CREATE EXTENSION vector; in your PostgreSQL database.
API key errors
ERROR: authentication failed
Solution: Verify your API key is correctly set in the environment where you run the application.
Tool not found
ERROR: tool "xyz" not registered
Solution: Ensure tool names are unique and properly registered in your tool catalog.
We welcome contributions! Here’s how to get started:
Create a feature branch (git checkout -b feature/amazing-feature)

Please ensure:

Tests pass (go test ./...)

Code is formatted (gofmt)

This project is licensed under the Apache 2.0 License.
Star us on GitHub if you find Lattice useful! ⭐