# Quick Start Guide
Get up and running with AgentSea in minutes. Build your first AI agent with just a few lines of code.
## Installation
Install AgentSea using your preferred package manager:
```bash
pnpm add @lov3kaizen/agentsea-core
npm install @lov3kaizen/agentsea-core
yarn add @lov3kaizen/agentsea-core
```

## Basic Agent
Create your first agent in just a few lines of code:
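The example below passes `process.env.ANTHROPIC_API_KEY` straight to the provider. A small guard (plain Node, not an AgentSea API) makes the failure obvious if the key isn't set:

```typescript
// Hypothetical helper, not part of AgentSea: fail fast on a missing
// environment variable instead of sending an empty API key upstream.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage: const apiKey = requireEnv('ANTHROPIC_API_KEY');
```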
```typescript
import { Agent, AnthropicProvider, ToolRegistry, BufferMemory, calculatorTool } from '@lov3kaizen/agentsea-core';

const provider = new AnthropicProvider(process.env.ANTHROPIC_API_KEY);

const toolRegistry = new ToolRegistry();
toolRegistry.register(calculatorTool);

const agent = new Agent(
  {
    name: 'my-assistant',
    model: 'claude-sonnet-4-20250514',
    provider: 'anthropic',
    systemPrompt: 'You are a helpful assistant.',
    tools: [calculatorTool],
  },
  provider,
  toolRegistry,
  new BufferMemory(50),
);

// 'context' is the execution context for this run (see the examples page).
const response = await agent.execute('What is 42 * 58?', context);
```

## Using Local Models (Privacy & Cost-Free)
Run agents completely locally with Ollama, perfect for privacy-sensitive applications, offline development, or eliminating API costs:
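If Ollama is already installed, you can confirm the server is reachable before wiring it into an agent. This sketch uses plain `fetch` against Ollama's documented `/api/tags` endpoint (default port assumed; no AgentSea code involved):

```typescript
// Lists the models pulled into a local Ollama install via its REST API.
// An error here usually means the Ollama server isn't running.
async function listLocalModels(baseUrl = 'http://localhost:11434'): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama not reachable: HTTP ${res.status}`);
  }
  const body = (await res.json()) as { models: { name: string }[] };
  return body.models.map((m) => m.name);
}
```

If the model you want isn't in the returned list, pull it with `ollama pull` first.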
```typescript
import { Agent, OllamaProvider, ToolRegistry, BufferMemory } from '@lov3kaizen/agentsea-core';

// No API key needed: runs 100% locally!
const provider = new OllamaProvider({
  baseUrl: 'http://localhost:11434',
  model: 'llama3.2', // or mistral, gemma2, qwen2.5, etc.
});

const agent = new Agent(
  {
    name: 'local-assistant',
    model: 'llama3.2',
    provider: 'ollama',
    systemPrompt: 'You are a helpful assistant running locally.',
  },
  provider,
  new ToolRegistry(),
  new BufferMemory(50),
);

const response = await agent.execute('Hello!', context);
```

### 🦙 Get Started with Ollama
Install Ollama in seconds:
```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.2
```

## MCP Integration
Connect to MCP servers to extend your agent with external tools:
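For intuition about what `MCPRegistry` manages for you: MCP clients and servers exchange JSON-RPC 2.0 messages over a transport such as stdio, and tool discovery uses the spec's `tools/list` method. A minimal sketch of that request (protocol-level, not an AgentSea API):

```typescript
// The JSON-RPC 2.0 request an MCP client sends to ask a server which
// tools it exposes (method name per the Model Context Protocol spec).
const listToolsRequest = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/list',
  params: {},
};

// On the stdio transport, each message is serialized as a line of JSON.
const wire = JSON.stringify(listToolsRequest);
console.log(wire);
```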
```typescript
import { Agent, AnthropicProvider, ToolRegistry, MCPRegistry } from '@lov3kaizen/agentsea-core';

const mcpRegistry = new MCPRegistry();
await mcpRegistry.addServer({
  name: 'filesystem',
  command: 'npx',
  args: ['-y', '@modelcontextprotocol/server-filesystem', '/tmp'],
});

// Register every tool the MCP server exposes alongside your local tools.
const mcpTools = mcpRegistry.getTools();
const toolRegistry = new ToolRegistry();
toolRegistry.registerMany(mcpTools);

// 'config' is an agent configuration like the one in the Basic Agent example.
const agent = new Agent(config, new AnthropicProvider(process.env.ANTHROPIC_API_KEY), toolRegistry);
const response = await agent.execute('List the files in /tmp', context);
```

## Multi-Agent Workflows
Orchestrate multiple agents for complex tasks:
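A `sequential` workflow feeds each agent's output into the next agent's input. Conceptually (a hand-rolled sketch of the pattern, not the library's implementation):

```typescript
// Conceptual sequential pipeline: each step consumes the previous step's output.
type Step = (input: string) => Promise<string>;

async function runSequential(steps: Step[], input: string): Promise<string> {
  let current = input;
  for (const step of steps) {
    current = await step(current); // output of step N becomes input of step N+1
  }
  return current;
}
```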
```typescript
import { WorkflowFactory, AnthropicProvider, ToolRegistry } from '@lov3kaizen/agentsea-core';

const workflow = WorkflowFactory.create(
  {
    name: 'research-workflow',
    type: 'sequential',
    agents: [
      { name: 'researcher', systemPrompt: 'Research information.' },
      { name: 'writer', systemPrompt: 'Write a summary.' },
    ],
  },
  new AnthropicProvider(process.env.ANTHROPIC_API_KEY),
  new ToolRegistry(),
);

const result = await workflow.execute('Research AI agents', context);
```

## Next Steps
> 💡 **Tip:** Check out the examples page for complete, runnable examples covering all features.