walkerOS CLI
The walkerOS CLI (`@walkeros/cli`) is a command-line tool for building, testing, and running event collection flows. It handles the complete workflow from configuration to deployment—bundling flows into optimized JavaScript, testing with simulated events, and running collection servers locally or in Docker.
If you're familiar with tools like Segment's CLI or Amplitude's SDK, think of the walkerOS CLI as your build pipeline for analytics infrastructure.
Installation
Global Installation (Recommended)
Install globally to use the `walkeros` command anywhere:
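```sh
npm install -g @walkeros/cli

# Verify the installation
walkeros --version
```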
Local Installation
Install in your project for team consistency:
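```sh
npm install --save-dev @walkeros/cli

# Run through npx so the whole team uses the pinned version
npx walkeros --version
```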
Commands Overview
| Command | Purpose | Use Case |
|---|---|---|
| `bundle` | Build production-ready bundle from flow config | Create deployable JavaScript from configuration |
| `simulate` | Test flow with sample events (mocked) | Validate configuration before deployment |
| `push` | Execute event with real API calls | Integration testing, production validation |
| `run collect` | Start HTTP event collection server | Accept incoming events via HTTP POST |
| `run serve` | Serve web bundles as static files | Host browser tracking scripts |
| `cache` | Manage CLI package and build caches | Clear stale caches, view cache statistics |
Getting Started
Before using the CLI, you need a flow configuration file. Here's a minimal example:
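The field names below are an illustrative sketch, not the authoritative schema; see Flow Configuration for the real structure.

```json
{
  "$comment": "Illustrative sketch only; see the Flow Configuration docs for the real schema",
  "packages": {
    "@walkeros/collector": "latest",
    "@walkeros/destination-demo": "latest"
  },
  "build": {
    "output": "./dist/flow.mjs"
  }
}
```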
Save this as `flow.json`.
Bundle Command
The `bundle` command builds production-ready JavaScript bundles from flow configurations.
Use Case
You've defined your sources, destinations, and transformations in a flow configuration file. Now you need to:
- Download the required npm packages
- Bundle everything into a single optimized JavaScript file
- Deploy it to production (Docker, Cloud Run, serverless functions)
The `bundle` command handles all of this.
Basic Usage
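```sh
# Assumes the flow config path is passed as the first argument
walkeros bundle flow.json
```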
This creates an optimized bundle at the path specified in `build.output`.
Step-by-Step Guide
1. Create a flow configuration
Create `server-collect.json`:
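The package names match the bundle output below; the surrounding field names are illustrative assumptions:

```json
{
  "$comment": "Illustrative sketch only",
  "packages": {
    "@walkeros/collector": "latest",
    "@walkeros/server-source-express": "latest",
    "@walkeros/destination-demo": "latest"
  },
  "build": {
    "output": "./dist/server-flow.mjs",
    "format": "esm"
  }
}
```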
2. Bundle the flow
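```sh
walkeros bundle server-collect.json --stats
```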
Output:
```text
📦 Downloading packages from npm...
✓ @walkeros/collector@latest
✓ @walkeros/server-source-express@latest
✓ @walkeros/destination-demo@latest
🔨 Bundling...
✓ Bundle created: ./dist/server-flow.mjs
📊 Bundle Statistics:
  Size: 45.2 KB (minified)
  Packages: 3
  Format: ESM
```
3. Review the bundle
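```sh
# Check that the file exists and inspect its size
ls -lh dist/server-flow.mjs
head -n 5 dist/server-flow.mjs
```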
The bundle is now ready to deploy!
Options
| Option | Description |
|---|---|
| `-e, --env <name>` | Build specific environment (for multi-env configs) |
| `--all` | Build all environments |
| `-s, --stats` | Show bundle statistics |
| `--json` | Output statistics as JSON (for CI/CD) |
| `--no-cache` | Skip package cache, download fresh |
| `--local` | Execute locally without Docker |
| `-v, --verbose` | Detailed logging |
Multi-Environment Example
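For configs that define multiple environments (see Flow Configuration), `--env` builds one environment and `--all` builds them all:

```sh
# Build a single named environment
walkeros bundle flow.json --env production

# Build every environment defined in the config
walkeros bundle flow.json --all
```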
Simulate Command
The `simulate` command tests your flow configuration with sample events without deploying to production.
Use Case
Before deploying your flow, you want to:
- Verify your configuration is valid
- Test event transformations and mappings
- See what data would be sent to destinations
- Debug issues locally
Simulate executes your flow with test events and captures all destination API calls.
Basic Usage
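```sh
# Assumes the flow config path is passed as the first argument
walkeros simulate test-flow.json \
  --event '{"name": "page view", "data": {"title": "Welcome to Analytics", "path": "/welcome"}}'
```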
Step-by-Step Guide
1. Create a flow to test
Create `test-flow.json`:
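An illustrative sketch; the mapping follows walkerOS destination-mapping conventions, but its exact placement in the flow file is an assumption:

```json
{
  "$comment": "Illustrative sketch only",
  "packages": {
    "@walkeros/collector": "latest",
    "@walkeros/destination-demo": "latest"
  },
  "destinations": {
    "demo": {
      "mapping": {
        "page": {
          "view": {
            "name": "pageview",
            "data": {
              "map": {
                "page_title": "data.title",
                "page_path": "data.path"
              }
            }
          }
        }
      }
    }
  }
}
```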
2. Run simulation
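```sh
walkeros simulate test-flow.json \
  --event '{"name": "page view", "data": {"title": "Welcome to Analytics", "path": "/welcome"}}' \
  --json
```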
3. Review output
```json
{
  "success": true,
  "event": {
    "name": "page view",
    "entity": "page",
    "action": "view",
    "data": {
      "title": "Welcome to Analytics",
      "path": "/welcome"
    },
    "timestamp": 1701234567890
  },
  "captured": [
    {
      "destination": "demo",
      "event": {
        "name": "pageview",
        "data": {
          "page_title": "Welcome to Analytics",
          "page_path": "/welcome"
        }
      }
    }
  ]
}
```
The simulation shows:
- ✅ Event was processed successfully
- ✅ Mapping transformed `"page view"` → `"pageview"`
- ✅ Data mapping extracted `title` → `page_title` and `path` → `page_path`
Options
| Option | Description |
|---|---|
| `--event <json>` | Event JSON string to test |
| `--event-file <path>` | Path to JSON file with event(s) |
| `--json` | Output as JSON |
| `--local` | Execute locally without Docker |
| `-v, --verbose` | Detailed logging |
Multiple Events
Test multiple events at once:
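Assuming `--event-file` accepts a JSON array, create `events.json`:

```json
[
  { "name": "page view", "data": { "title": "Home", "path": "/" } },
  { "name": "product view", "data": { "id": "P123", "name": "Laptop", "price": 999 } }
]
```

```sh
walkeros simulate test-flow.json --event-file events.json
```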
Push Command
The `push` command executes your flow with a real event, making actual API calls to your configured destinations. Unlike `simulate`, which mocks API calls for safe testing, `push` performs real HTTP requests—ideal for integration testing and production validation.
Use Case
You've validated your configuration with simulate and now need to:
- Test with real third-party APIs (GA4, Meta, BigQuery, etc.)
- Verify production credentials and endpoints work
- Debug actual API responses and errors
- Perform integration testing before deployment
Push is your bridge between local testing and production deployment.
Basic Usage
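```sh
# This makes real API calls - run simulate first if unsure
walkeros push api-flow.json --event '{"name": "order complete", "data": {"id": "ORD-1"}}'
```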
Step-by-Step Guide
1. Create a flow configuration
Create `api-flow.json` with a destination that makes real HTTP calls:
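An illustrative sketch; the destination name and its `url` option are placeholders for whatever destination package you actually use:

```json
{
  "$comment": "Illustrative sketch only; destination fields are placeholders",
  "packages": {
    "@walkeros/collector": "latest"
  },
  "destinations": {
    "api": {
      "config": {
        "url": "https://your-endpoint.com/events"
      }
    }
  }
}
```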
2. Create an event file
Create `event.json`:
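The event name matches the output below; the `data` fields are example values:

```json
{
  "name": "order complete",
  "data": {
    "id": "ORD-1",
    "total": 99.99
  }
}
```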
3. Push the event
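```sh
walkeros push api-flow.json --event event.json
```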
4. Review the output
```text
📥 Loading event...
📦 Loading flow configuration...
🔨 Bundling flow configuration...
🖥️ Executing in server environment (Node.js)...
Pushing event: order complete
✅ Event pushed successfully
  Event ID: 1701234567890-abc12-1
  Entity: order
  Action: complete
  Duration: 1234ms
```
The event was sent to your real API endpoint!
Options
| Option | Description |
|---|---|
| `-e, --event <source>` | **Required.** Event to push (JSON string, file path, or URL) |
| `--env <name>` | Environment name (for multi-environment configs) |
| `--json` | Output results as JSON |
| `-v, --verbose` | Verbose output with debug information |
| `-s, --silent` | Suppress output (for CI/CD) |
| `--local` | Execute in local Node.js instead of Docker |
Event Input Formats
The `--event` parameter accepts three formats:
Inline JSON string:
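```sh
walkeros push flow.json --event '{"name": "page view", "data": {"path": "/"}}'
```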
File path:
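```sh
walkeros push flow.json --event ./event.json
```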
URL:
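```sh
walkeros push flow.json --event https://example.com/fixtures/event.json
```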
Push vs Simulate
| Feature | `push` | `simulate` |
|---|---|---|
| API Calls | Real HTTP requests | Mocked (captured) |
| Use Case | Integration testing | Safe local testing |
| Side Effects | Full (writes to DBs, sends to APIs) | None |
| Output | Success/error from real APIs | Captured call data |
When to use each:
- Use `simulate` first to validate configuration without side effects
- Use `push` to verify real integrations work before deployment
JSON Output
For CI/CD pipelines, use `--json` for machine-readable output:
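```sh
walkeros push flow.json --event event.json --json
```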
Output:
```json
{
  "success": true,
  "event": {
    "id": "1701234567890-abc12-1",
    "name": "page view",
    "entity": "page",
    "action": "view"
  },
  "duration": 1234
}
```
On error:
```json
{
  "success": false,
  "error": "Connection refused: https://your-endpoint.com/events",
  "duration": 5023
}
```
Multi-Environment
Push to specific environments:
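```sh
# Environment names come from your multi-environment config
walkeros push flow.json --env staging --event event.json
walkeros push flow.json --env production --event event.json
```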
Run Collect Command
The `run collect` command starts an HTTP server that accepts events and processes them through your flow.
Use Case
You need an HTTP endpoint to:
- Receive events from browser clients, mobile apps, or server-side sources
- Process events through your collector and destinations
- Test the full event pipeline locally before deploying to production
This is similar to running a Segment or Jitsu collection endpoint.
Basic Usage
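```sh
walkeros run collect collect.json
```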
Step-by-Step Guide
1. Create a collection flow
Create `collect.json`:
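An illustrative sketch using the packages from earlier (field names assumed; source and destination wiring per the Flow Configuration docs):

```json
{
  "$comment": "Illustrative sketch only",
  "packages": {
    "@walkeros/collector": "latest",
    "@walkeros/server-source-express": "latest",
    "@walkeros/destination-demo": "latest"
  }
}
```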
2. Start the collector
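```sh
walkeros run collect collect.json --port 8080
```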
Output:
```text
📦 Bundling flow...
✓ Bundle ready
🚀 Starting collection server...
✓ Server running on http://localhost:8080
✓ Endpoint: POST http://localhost:8080/collect
✓ Health check: GET http://localhost:8080/health
```
3. Send test events
Open a new terminal and send events:
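The payload shape below mirrors the walkerOS event model (`name` plus `data`); whether the endpoint expects a bare event or a wrapped batch depends on your source configuration:

```sh
curl -X POST http://localhost:8080/collect \
  -H "Content-Type: application/json" \
  -d '{"name": "page view", "data": {"title": "Home Page", "path": "/"}, "user": {"id": "user123"}}'

curl -X POST http://localhost:8080/collect \
  -H "Content-Type: application/json" \
  -d '{"name": "product view", "data": {"id": "P123", "name": "Laptop", "price": 999}}'
```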
4. See events in console
The collector terminal shows:
```text
[Event Collector] page view
  data: {"title":"Home Page","path":"/"}
  user.id: user123
  timestamp: 1701234567890
[Event Collector] product view
  data: {"id":"P123","name":"Laptop","price":999}
  timestamp: 1701234567891
```
Options
| Option | Description |
|---|---|
| `--port <number>` | Server port (default: 8080) |
| `--local` | Execute locally without Docker |
| `-v, --verbose` | Detailed logging |
Running Pre-Built Bundles
You can also run pre-built bundles directly:
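A sketch, assuming a pre-built bundle path can stand in for the flow config:

```sh
walkeros run collect ./dist/server-flow.mjs
```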
Run Serve Command
The `run serve` command serves web bundles as static files for browser-side tracking.
Use Case
You've created a web flow (browser event tracking) and need to:
- Host the JavaScript bundle for your website to load
- Test browser tracking locally before deploying to a CDN
- Serve multiple versions or configurations
This is like hosting a vendor tracking script such as Segment's analytics.js or Amplitude's SDK.
Basic Usage
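```sh
walkeros run serve web-track.json
```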
Step-by-Step Guide
1. Create a web flow
Create `web-track.json`:
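An illustrative sketch; only the output filename is known from this guide, so the web source and destination wiring are left as placeholders:

```json
{
  "$comment": "Illustrative sketch only; wire up your web source and destination here",
  "packages": {
    "@walkeros/collector": "latest"
  },
  "build": {
    "output": "tracker.js"
  }
}
```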
2. Start serve mode
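```sh
walkeros run serve web-track.json --port 3000 --path /scripts
```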
Output:
```text
📦 Bundling flow...
✓ Bundle ready: tracker.js
🌐 Starting static file server...
✓ Server running on http://localhost:3000
✓ Script available at: http://localhost:3000/scripts/tracker.js
✓ Health check: GET http://localhost:3000/health
```
3. Use in HTML
Create `test.html`:
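A minimal page that loads the served script (whether page views fire automatically depends on your web source configuration):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>walkerOS Test</title>
    <!-- Load the tracking bundle served by `walkeros run serve` -->
    <script src="http://localhost:3000/scripts/tracker.js"></script>
  </head>
  <body>
    <h1>Hello walkerOS</h1>
  </body>
</html>
```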
4. Test in browser
Open `test.html` in your browser. Events will be sent to `http://localhost:8080/collect`.
Options
| Option | Description |
|---|---|
| `--port <number>` | Server port (default: 3000) |
| `--path <string>` | URL path to serve script (default: /) |
| `--name <string>` | Script filename (default: from config) |
| `--local` | Execute locally without Docker |
| `-v, --verbose` | Detailed logging |
Complete Example: Web → Server Flow
This example demonstrates a complete analytics pipeline:
- Browser events captured by web flow
- Sent to server collection endpoint
- Logged to console (swap for BigQuery in production)
1. Create Server Collection Flow
Create `server-collect.json`:
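Same illustrative shape as the earlier sketches, with a console-logging destination:

```json
{
  "$comment": "Illustrative sketch only",
  "packages": {
    "@walkeros/collector": "latest",
    "@walkeros/server-source-express": "latest",
    "@walkeros/destination-demo": "latest"
  }
}
```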
2. Create Web Tracking Flow
Create `web-track.json`:
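An illustrative sketch pointing the browser flow at the local collect endpoint (field names assumed):

```json
{
  "$comment": "Illustrative sketch only; endpoint wiring is an assumption",
  "packages": {
    "@walkeros/collector": "latest"
  },
  "destinations": {
    "api": {
      "config": {
        "url": "http://localhost:8080/collect"
      }
    }
  },
  "build": {
    "output": "tracker.js"
  }
}
```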
3. Start Collection Server
Terminal 1:
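```sh
walkeros run collect server-collect.json
```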
4. Start Web Server
Terminal 2:
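```sh
walkeros run serve web-track.json --path /scripts
```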
5. Test in Browser
Create `demo.html`:
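A sketch assuming the browser source uses walker.js-style tagging (`data-elb` / `data-elbaction` attributes and a global `elb` push function); adapt it to your web source's actual API:

```html
<!DOCTYPE html>
<html>
  <head>
    <script src="http://localhost:3000/scripts/tracker.js"></script>
  </head>
  <body>
    <!-- Fires "promotion view" when visible and "promotion cta" on click -->
    <div data-elb="promotion" data-elbaction="visible:view;click:cta">
      <button>Shop now</button>
    </div>
    <script>
      // Push a custom event manually (assumes a global elb function)
      elb('custom event', { source: 'demo-page' });
    </script>
  </body>
</html>
```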
Open in browser. Terminal 1 shows:
```text
[Server Logger] page view
[Server Logger] promotion view
[Server Logger] promotion cta
[Server Logger] custom event
```
Cache Command
The `cache` command manages the CLI's package and build caches.
Use Case
The CLI caches downloaded npm packages and compiled builds to speed up repeated operations. You may need to:
- Clear stale cached packages when debugging version issues
- Free up disk space by removing old cached builds
- View cache statistics to understand cache usage
How Caching Works
**Package Cache** (`.tmp/cache/packages/`):

- Mutable versions (`latest`, `^`, `~`) are re-checked daily
- Exact versions (`0.4.1`) are cached indefinitely
- Saves network time on repeated builds

**Build Cache** (`.tmp/cache/builds/`):

- Caches compiled bundles based on `flow.json` content + date
- Identical configs reuse cached builds within the same day
- Dramatically speeds up repeated builds (~100x faster)
Basic Usage
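The subcommand names in this section are assumptions based on the operations described here; `walkeros cache --help` shows the authoritative list:

```sh
walkeros cache --help
```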
Commands
View cache info:
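```sh
# "info" subcommand name is an assumption
walkeros cache info
```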
Output:
```text
Cache directory: .tmp/cache
Cached packages: 12
Cached builds: 5
```
Clear all caches:
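```sh
# "clear" subcommand name is an assumption
walkeros cache clear
```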
Clear only package cache:
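```sh
walkeros cache clear --packages
```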
Clear only build cache:
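```sh
walkeros cache clear --builds
```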
Bypassing Cache
To skip the cache for a single build operation:
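```sh
walkeros bundle flow.json --no-cache
```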
This downloads fresh packages and rebuilds without using or updating the cache.
Options
| Option | Description |
|---|---|
| `--packages` | Clear only the package cache |
| `--builds` | Clear only the build cache |
Global Options
These options work with all commands:
| Option | Description |
|---|---|
| `--local` | Execute without Docker (faster, requires local Node.js) |
| `--verbose` | Show detailed logs |
| `--silent` | Suppress output |
| `--json` | Output as JSON (for CI/CD) |
| `--help` | Show help for command |
| `--version` | Show CLI version |
Execution Modes
Docker Mode (Default)
By default, the CLI uses Docker for isolated execution:
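```sh
# Runs in a container by default; requires a running Docker daemon
walkeros bundle flow.json
```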
Advantages:
- Consistent environment across machines
- Isolated from local Node.js setup
- Reproducible builds
Requirements:
- Docker installed and running
Local Mode
Use `--local` to execute in your current Node.js environment:
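```sh
walkeros bundle flow.json --local
```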
Advantages:
- Faster execution (no container startup)
- No Docker dependency
- Useful in CI/CD or containers
Requirements:
- Node.js 18+ installed locally
CI/CD Integration
GitHub Actions
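A workflow sketch (action versions, file paths, and step layout are assumptions; adjust to your repository):

```yaml
name: Build flow

on: push

jobs:
  bundle:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - name: Install walkerOS CLI
        run: npm install -g @walkeros/cli
      - name: Validate flow with mocked events
        run: walkeros simulate flow.json --event-file events.json --local --json
      - name: Build production bundle
        run: walkeros bundle flow.json --local --json
```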
Docker Build
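A minimal Dockerfile sketch; it bundles at image build time with `--local`, since Docker-in-Docker is not available inside the build:

```dockerfile
FROM node:20-slim
WORKDIR /app

# Install the CLI and build the bundle during the image build
RUN npm install -g @walkeros/cli
COPY flow.json .
RUN walkeros bundle flow.json --local

# Serve the collection endpoint
EXPOSE 8080
CMD ["walkeros", "run", "collect", "flow.json", "--local", "--port", "8080"]
```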
Troubleshooting
Package Download Issues
If packages fail to download:
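```sh
# Clear cached packages ("clear" subcommand name assumed), then retry fresh
walkeros cache clear --packages
walkeros bundle flow.json --no-cache --verbose
```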
Docker Permission Issues
On Linux, you may need to run Docker commands with sudo or add your user to the docker group:
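```sh
sudo usermod -aG docker $USER
# Apply the new group membership in the current shell
newgrp docker
```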
Port Already in Use
If the port is already in use:
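```sh
# See what is holding the port, or start on a different one
lsof -i :8080
walkeros run collect collect.json --port 9090
```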
Next Steps
- Flow Configuration - Learn about flow config structure
- Docker Deployment - Deploy flows to production
- Sources - Explore event sources
- Destinations - Configure analytics tools
- Mapping - Transform events for destinations