# Agent Runtime
An embeddable agent execution engine. Plug in any LLM provider, storage backend, and tool set to build production agents.
```bash
npm install @ainative/agent-runtime
```

## Quick Start
```typescript
import { Agent, AINativeProvider, InMemoryStorage, webFetchTool } from '@ainative/agent-runtime';

const agent = new Agent({
  name: 'research-assistant',
  provider: new AINativeProvider({ apiKey: 'your-key' }),
  storage: new InMemoryStorage(),
  tools: [webFetchTool],
  systemPrompt: 'You are a research assistant. Use web_fetch to find information.',
});

const result = await agent.run('What are the latest trends in AI agents?');
console.log(result.output);
console.log(`Tools used: ${result.toolCalls.length}`);
```
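The README does not show what happens inside `agent.run()`, but the usual pattern behind an agent engine is a loop: the model either produces a final answer or requests a tool, and each tool result is appended to the transcript before the model is called again. A minimal, dependency-free sketch of that loop (every name below is illustrative, not the runtime's actual API):

```typescript
// One model turn either answers or requests a tool call.
type Step =
  | { type: 'answer'; text: string }
  | { type: 'tool'; name: string; input: string };

// Loop until the model answers; feed each tool result back in.
function runLoop(
  model: (transcript: string[]) => Step,
  tools: Record<string, (input: string) => string>,
): { output: string; toolCalls: number } {
  const transcript: string[] = [];
  let toolCalls = 0;
  for (;;) {
    const step = model(transcript);
    if (step.type === 'answer') return { output: step.text, toolCalls };
    toolCalls++;
    transcript.push(tools[step.name](step.input));
  }
}
```

This also explains why `result` can report `toolCalls`: the loop naturally counts every tool invocation on the way to the final answer.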
## LLM Providers

Swap providers without changing agent logic.

```typescript
import {
  AINativeProvider,
  OpenAIProvider,
  AnthropicProvider,
  OllamaProvider,
} from '@ainative/agent-runtime';

// AINative (default)
const ainative = new AINativeProvider({ apiKey: 'key' });

// OpenAI
const openai = new OpenAIProvider({ apiKey: 'sk-...' });

// Anthropic
const anthropic = new AnthropicProvider({ apiKey: 'sk-ant-...' });

// Ollama (local)
const ollama = new OllamaProvider({ baseUrl: 'http://localhost:11434', model: 'llama3' });
```
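Providers are swappable because the agent only ever talks to one shared interface. The runtime's real provider contract is not documented here, so the following is a hypothetical sketch of the pattern, with invented names:

```typescript
// Hypothetical: every provider conforms to one interface,
// so agent logic never depends on the concrete backend.
interface ChatProvider {
  complete(prompt: string): Promise<string>;
}

class EchoProvider implements ChatProvider {
  async complete(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

class ShoutProvider implements ChatProvider {
  async complete(prompt: string): Promise<string> {
    return prompt.toUpperCase();
  }
}

// The caller's code is identical for either provider.
async function runOnce(provider: ChatProvider, input: string): Promise<string> {
  return provider.complete(input);
}
```

Swapping `EchoProvider` for `ShoutProvider` changes the output without touching `runOnce`, which is the same property the real providers above give the `Agent`.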
## Storage Backends

Agents persist memory and state across runs.

```typescript
import { InMemoryStorage, LocalStorage, AINativeStorage } from '@ainative/agent-runtime';

// In-memory (development)
const mem = new InMemoryStorage();

// Local filesystem
const local = new LocalStorage({ path: './agent-data' });

// AINative cloud (production)
const cloud = new AINativeStorage({
  apiKey: 'key',
  projectId: 'my-project',
});
```
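As with providers, the three backends are interchangeable because they share one storage contract. The real contract is not shown in this README; a plausible minimal sketch, with invented names, is an async key-value interface:

```typescript
// Hypothetical storage contract: async get/set is enough for
// in-memory, filesystem, and cloud backends to be interchangeable.
interface AgentStorage {
  get(key: string): Promise<string | undefined>;
  set(key: string, value: string): Promise<void>;
}

// The simplest possible backend: a Map behind async methods.
class MapStorage implements AgentStorage {
  private data = new Map<string, string>();

  async get(key: string): Promise<string | undefined> {
    return this.data.get(key);
  }

  async set(key: string, value: string): Promise<void> {
    this.data.set(key, value);
  }
}
```

The async signatures matter even for the in-memory case: they let a filesystem or network-backed implementation slot in later without changing any caller.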
## Built-in Tools

| Tool | Description |
|---|---|
| `webFetchTool` | Fetch and parse web pages |
| `httpTool` | Make arbitrary HTTP requests |
| `bashTool` | Execute shell commands |
| `fileTool` | Read and write local files |
| `gitTool` | Git operations (status, diff, commit) |
| `mcpTools(config)` | Load tools from any MCP server |
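The README does not document the shape of a tool, so the following is purely illustrative: a minimal tool object with a name and description (what the LLM sees) and an `execute` function (what the runtime calls), under invented type names:

```typescript
// Illustrative tool shape, not the package's actual type:
// metadata for the model, plus an execute function for the runtime.
interface Tool {
  name: string;
  description: string;
  execute(input: Record<string, unknown>): Promise<string>;
}

const echoTool: Tool = {
  name: 'echo',
  description: 'Returns the provided text unchanged',
  async execute(input) {
    return String(input.text ?? '');
  },
};
```

A shape like this explains the table above: each built-in export is just a prebuilt object of this kind, and `mcpTools(config)` is a factory that produces an array of them.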
## MCP Tool Adapter

Connect any MCP server as agent tools:

```typescript
import { Agent, AINativeProvider, mcpTools } from '@ainative/agent-runtime';

const agent = new Agent({
  provider: new AINativeProvider({ apiKey: 'key' }),
  tools: [
    ...await mcpTools({ command: 'npx', args: ['ainative-zerodb-memory-mcp'] }),
  ],
});
```
## Dialog Manager

For multi-turn conversations with context management:

```typescript
import { Agent, DialogManager, AINativeProvider, AINativeStorage } from '@ainative/agent-runtime';

const agent = new Agent({
  provider: new AINativeProvider({ apiKey: 'key' }),
  storage: new AINativeStorage({ apiKey: 'key', projectId: 'proj' }),
});

const dialog = new DialogManager(agent);

// Multi-turn with automatic context
await dialog.say('My name is Alice');
const response = await dialog.say('What is my name?');
// "Your name is Alice"
```
## Events

Monitor agent execution in real-time:

```typescript
import { Agent, AgentEventEmitter } from '@ainative/agent-runtime';

const agent = new Agent({ /* ... */ });

agent.on('tool:start', ({ tool, input }) => {
  console.log(`Calling ${tool}...`);
});

agent.on('tool:end', ({ tool, output, duration }) => {
  console.log(`${tool} completed in ${duration}ms`);
});

agent.on('llm:start', ({ messages }) => {
  console.log(`Sending ${messages.length} messages to LLM`);
});
```
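The event names above suggest a typed emitter, where each event name is tied to its payload shape. That is not confirmed by this README, but the pattern is easy to sketch in a few lines (all names invented, not the `AgentEventEmitter` implementation):

```typescript
// Illustrative typed-emitter pattern: the event name's type
// determines the payload type at compile time.
type Events = {
  'tool:start': { tool: string };
  'tool:end': { tool: string; duration: number };
};

class MiniEmitter {
  private listeners: { [K in keyof Events]?: Array<(p: Events[K]) => void> } = {};

  on<K extends keyof Events>(event: K, fn: (payload: Events[K]) => void): void {
    (this.listeners[event] ??= []).push(fn);
  }

  emit<K extends keyof Events>(event: K, payload: Events[K]): void {
    for (const fn of this.listeners[event] ?? []) fn(payload);
  }
}
```

With this shape, subscribing to `'tool:end'` with a handler that reads `duration` type-checks, while reading `duration` on a `'tool:start'` payload would not.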
## Workspace Security

The `WorkspaceManager` sandboxes file operations:

```typescript
import { Agent, WorkspaceManager } from '@ainative/agent-runtime';

const workspace = new WorkspaceManager({
  rootDir: '/tmp/agent-workspace',
  allowedPaths: ['/tmp/agent-workspace/**'],
});
```
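The core of any such sandbox is rejecting paths that resolve outside the workspace root, including `../` escapes. How `WorkspaceManager` enforces `allowedPaths` is not documented here; a minimal sketch of the underlying check using only Node's `path` module (the function name is invented):

```typescript
import * as path from 'path';

// Illustrative containment check: resolve the requested path against
// the root, then verify it did not escape via '..' segments.
function isInsideWorkspace(rootDir: string, requested: string): boolean {
  const root = path.resolve(rootDir);
  const target = path.resolve(root, requested);
  return target === root || target.startsWith(root + path.sep);
}
```

Resolving before comparing is the important step: a naive prefix check on the raw string would accept `'../etc/passwd'`, while the resolved form makes the escape visible.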
## Next Steps
- Agent SDK — API client for Agent Cloud
- Agent Cloud — Deploy agents to production
- MCP Servers — Connect agents to tools