# Installation

Install OrkaJS and configure your first LLM provider.
## Option 1: Full Package

Install the full OrkaJS package with all features included:

```bash
# npm
npm install orkajs

# pnpm
pnpm add orkajs

# yarn
yarn add orkajs
```

## Option 2: Selective Packages (Recommended)
Install only what you need for smaller bundle sizes:
```bash
# Core
npm install @orka-js/core

# LLM Adapters
npm install @orka-js/openai      # OpenAI (GPT-4, etc.)
npm install @orka-js/anthropic   # Anthropic (Claude)
npm install @orka-js/mistral     # Mistral
npm install @orka-js/google      # Google Gemini
npm install @orka-js/ollama      # Ollama (local models)
npm install @orka-js/cohere      # Cohere
npm install @orka-js/replicate   # Replicate

# Vector Databases
npm install @orka-js/memory      # In-memory (dev/testing)
npm install @orka-js/pinecone    # Pinecone
npm install @orka-js/qdrant      # Qdrant
npm install @orka-js/chroma      # ChromaDB
npm install @orka-js/pgvector    # PostgreSQL / Supabase

# Agents & Workflows
npm install @orka-js/agent       # Agents (ReAct, HITL, Toolkits)
npm install @orka-js/workflow    # Multi-step workflows
npm install @orka-js/graph       # Graph-based workflows
npm install @orka-js/durable     # Durable, resumable & scheduled agents
npm install @orka-js/a2a         # Agent-to-Agent (Google A2A protocol)

# Tools & Memory
npm install @orka-js/tools        # Loaders, splitters, parsers, chains
npm install @orka-js/memory-store # Conversation memory
npm install @orka-js/prompts      # Prompt versioning & registry
npm install @orka-js/mcp          # Model Context Protocol (MCP)

# Multimodal
npm install @orka-js/multimodal  # Vision, Audio agents
npm install @orka-js/realtime    # Voice agent (STT → LLM → TTS)
npm install @orka-js/ocr         # OCR & document extraction

# Reliability & Orchestration
npm install @orka-js/cache         # Caching layer (Memory, Redis)
npm install @orka-js/resilience    # Retry, fallback, ResilientLLM
npm install @orka-js/orchestration # Router, Consensus, Race, LoadBalancer

# Observability & Evaluation
npm install @orka-js/observability # Tracer, hooks, logging
npm install @orka-js/otel          # OpenTelemetry (OTLP exporter)
npm install @orka-js/evaluation    # Testing, metrics, assertions
npm install @orka-js/devtools      # Visual debugging dashboard
npm install @orka-js/finetuning    # Model fine-tuning orchestration

# Framework Integrations
npm install @orka-js/nestjs      # NestJS (DI, modules, SSE, CQRS)
npm install @orka-js/express     # Express.js middleware
npm install @orka-js/hono        # Hono (edge-compatible)
npm install @orka-js/react       # React — graph workflow visualizer
npm install @orka-js/server      # Dev server with Vite playground
npm install @orka-js/cli         # CLI — npx orka init, orka dev
npm install @orka-js/test        # Testing utilities & mock LLM
```

| Package | Description |
|---|---|
| @orka-js/core | Types, errors, utils, Knowledge |
| @orka-js/openai | OpenAI adapter (GPT-4, etc.) |
| @orka-js/anthropic | Anthropic adapter (Claude) |
| @orka-js/mistral | Mistral adapter |
| @orka-js/google | Google Gemini adapter |
| @orka-js/ollama | Ollama adapter (local models) |
| @orka-js/cohere | Cohere adapter |
| @orka-js/replicate | Replicate adapter |
| @orka-js/agent | Agents (ReAct, HITL, Toolkits) |
| @orka-js/workflow | Multi-step workflows |
| @orka-js/graph | Graph-based workflows |
| @orka-js/durable | Durable, resumable & scheduled agents |
| @orka-js/a2a | Agent-to-Agent (Google A2A protocol) |
| @orka-js/tools | Loaders, splitters, parsers, chains |
| @orka-js/memory-store | Conversation memory |
| @orka-js/prompts | Prompt versioning, registry |
| @orka-js/mcp | Model Context Protocol (MCP) client, server & gateway |
| @orka-js/multimodal | Vision, Audio agents |
| @orka-js/realtime | Voice agent — STT → LLM → TTS pipeline |
| @orka-js/ocr | OCR & document extraction |
| @orka-js/cache | Memory/Redis cache, CachedLLM |
| @orka-js/resilience | Retry, fallback, ResilientLLM |
| @orka-js/orchestration | Router, Consensus, Race, LoadBalancer |
| @orka-js/observability | Tracer, hooks, logging |
| @orka-js/otel | OpenTelemetry — OTLP exporter, W3C TraceContext |
| @orka-js/evaluation | Testing, metrics, assertions |
| @orka-js/devtools | Visual debugging & observability dashboard |
| @orka-js/finetuning | Fine-tuning orchestration (dataset, API, versioning) |
| @orka-js/nestjs | NestJS — DI, modules, SSE streaming, CQRS |
| @orka-js/express | Express.js middleware with SSE streaming |
| @orka-js/hono | Hono middleware (edge-compatible) |
| @orka-js/react | React — graph workflow visualizer |
| @orka-js/server | Dev server with React+Vite playground |
| @orka-js/cli | CLI — npx orka init, orka dev |
| @orka-js/test | Testing utilities, mock LLM, AgentTestBed, Vitest matchers |
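Packages like @orka-js/resilience exist to wrap flaky LLM calls with retry and fallback behavior. As a rough, generic illustration of that pattern only (this is not the @orka-js/resilience API, and `withRetry`/`withFallback` are hypothetical names):

```typescript
// Generic retry-with-fallback pattern, shown for illustration.
// NOT the actual @orka-js/resilience API.

async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  delayMs = 100
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Exponential backoff between attempts.
      await new Promise((r) => setTimeout(r, delayMs * 2 ** i));
    }
  }
  throw lastErr;
}

async function withFallback<T>(
  primary: () => Promise<T>,
  fallback: () => Promise<T>
): Promise<T> {
  try {
    return await withRetry(primary);
  } catch {
    return fallback();
  }
}

// Demo: the primary call fails twice, then succeeds on the third attempt.
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 3) throw new Error("transient failure");
  return "ok";
};

const result = withFallback(flaky, async () => "fallback");
result.then(console.log); // "ok"
```

A real resilience layer adds more (timeouts, circuit breakers, per-provider policies), but the shape is the same: retry the primary, then fall through to an alternative model.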
## Install an LLM Provider SDK

**OpenAI** (recommended to start):

```bash
npm install openai
```

**Anthropic** (Claude models):

```bash
npm install @anthropic-ai/sdk
```

**Ollama** (local, no API key):

```bash
ollama pull llama3.2
```

## Vector Database (Optional)
| Provider | Install Command | Use Case |
|---|---|---|
| Memory | Built-in | Development, testing |
| Pinecone | npm install @pinecone-database/pinecone | Managed cloud |
| Qdrant | npm install @qdrant/js-client-rest | Self-hosted or cloud |
| ChromaDB | npm install chromadb | Open-source |
## TypeScript Compatibility

OrkaJS works with all `moduleResolution` modes:

```json
// tsconfig.json - All supported
{
  "compilerOptions": {
    "moduleResolution": "node"  // Legacy (supported)
    // or "node16"              // Modern
    // or "nodenext"            // Modern
    // or "bundler"             // Bundlers (Vite, Webpack)
  }
}
```

## Environment Setup
Store your API keys in environment variables. Create a `.env` file:

```bash
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
PINECONE_API_KEY=...
```
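In practice you would load the `.env` file with a package like `dotenv`, but it can help to fail fast on missing keys before constructing any provider. A minimal sketch using only language built-ins (the `parseEnv` and `assertKeys` helpers are hypothetical names, not part of OrkaJS):

```typescript
// Minimal .env-style parser plus a startup check for required keys.
// Illustrative sketch; a real project would typically use `dotenv`.

function parseEnv(src: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of src.split("\n")) {
    // Match KEY=value lines; skips comments and blank lines.
    const m = line.match(/^\s*([A-Z0-9_]+)\s*=\s*(.*)$/);
    if (m) vars[m[1]] = m[2];
  }
  return vars;
}

function assertKeys(env: Record<string, string>, required: string[]): void {
  const missing = required.filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment keys: ${missing.join(", ")}`);
  }
}

const env = parseEnv("OPENAI_API_KEY=sk-test\n# comment line\nANTHROPIC_API_KEY=sk-ant-test");
assertKeys(env, ["OPENAI_API_KEY"]); // passes silently
console.log(env.OPENAI_API_KEY); // "sk-test"
```

Failing at startup with a clear message beats a cryptic 401 from the provider on the first request.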