# Quick Start
Get a self-learning agent running in under a minute.
## Simplest Example
If you've run `ace setup` (see Setup), you can load your config automatically:
```python
from ace import ACELiteLLM

agent = ACELiteLLM.from_setup()

# Ask related questions — the agent learns patterns across them
answer1 = agent.ask("If all cats are animals, is Felix (a cat) an animal?")
answer2 = agent.ask("If all birds fly, can penguins (birds) fly?")
print(f"Learned {len(agent.skillbook.skills())} strategies")

# Save and reload later
agent.save("my_agent.json")
```
Or specify a model directly (API key must be in the environment):
## Choose Your Integration

- **LiteLLM** (`ACELiteLLM`) — the simplest path; supports 100+ LLM providers.
- **LangChain** — wrap any LangChain Runnable (chains, agents, graphs) with learning.
- **Browser-Use** — browser automation that learns navigation patterns.
## Full Pipeline Example
For full control, use the three ACE roles directly:
```python
from ace import (
    ACE, Agent, Reflector, SkillManager,
    Sample, SimpleEnvironment,
)

# Create roles (each takes a model string directly)
agent = Agent("gpt-4o-mini")
reflector = Reflector("gpt-4o-mini")
skill_manager = SkillManager("gpt-4o-mini")

# Build the adaptive pipeline
runner = ACE.from_roles(
    agent=agent,
    reflector=reflector,
    skill_manager=skill_manager,
    environment=SimpleEnvironment(),
)

# Train on samples
samples = [
    Sample(question="What is the capital of France?", context="", ground_truth="Paris"),
    Sample(question="What is 2 + 2?", context="", ground_truth="4"),
]
results = runner.run(samples, epochs=2)

print(f"Learned {len(runner.skillbook.skills())} strategies")
runner.save("trained.json")
```
## Loading Saved Agents
```python
from ace import ACELiteLLM

# Resume from a saved skillbook
agent = ACELiteLLM.from_model("gpt-4o-mini", skillbook_path="my_agent.json")
answer = agent.ask("New question")  # Uses previously learned strategies
```
## Trying Different Models
```python
from ace import ACELiteLLM

# OpenAI
agent = ACELiteLLM.from_model("gpt-4o-mini")

# Anthropic
agent = ACELiteLLM.from_model("claude-sonnet-4-5-20250929")

# Google
agent = ACELiteLLM.from_model("gemini-pro")

# Local (Ollama)
agent = ACELiteLLM.from_model("ollama/llama2")
```
## What to Read Next
- How ACE Works — understand the three-role architecture
- The Skillbook — how strategies are stored and evolve
- Full Pipeline Guide — build custom ACE pipelines
- Integrations — LangChain, Browser-Use, Claude Code