
walkerOS CLI

The walkerOS CLI (@walkeros/cli) is a command-line tool for building, testing, and running event collection flows. It handles the complete workflow from configuration to deployment—bundling flows into optimized JavaScript, testing with simulated events, and running collection servers locally or in Docker.

If you're familiar with tools like Segment's CLI or Amplitude's SDK, think of the walkerOS CLI as the build pipeline for your analytics infrastructure.

Installation

Install globally to use the walkeros command anywhere:

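A typical global install with npm (the package name comes from this page; yarn or pnpm work the same way):

npm install -g @walkeros/cli

# Verify the installation
walkeros --version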

Local Installation

Install in your project for team consistency:

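For a project-local install, add the CLI as a dev dependency and invoke it through npx (standard npm usage):

npm install --save-dev @walkeros/cli

# Run the locally installed CLI
npx walkeros --version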

Commands Overview

Command       Purpose                                          Use Case
bundle        Build production-ready bundle from flow config   Create deployable JavaScript from configuration
simulate      Test flow with sample events (mocked)            Validate configuration before deployment
push          Execute event with real API calls                Integration testing, production validation
run collect   Start HTTP event collection server               Accept incoming events via HTTP POST
run serve     Serve web bundles as static files                Host browser tracking scripts
cache         Manage CLI package and build caches              Clear stale caches, view cache statistics

Getting Started

Before using the CLI, you need a flow configuration file. Here's a minimal example:

Loading...

Save this as flow.json.

Bundle Command

The bundle command builds production-ready JavaScript bundles from flow configurations.

Use Case

You've defined your sources, destinations, and transformations in a flow configuration file. Now you need to:

  • Download the required npm packages
  • Bundle everything into a single optimized JavaScript file
  • Deploy it to production (Docker, Cloud Run, serverless functions)

The bundle command handles all of this.

Basic Usage

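A minimal invocation, assuming the flow configuration file is passed as a positional argument (as in the step-by-step guide below):

walkeros bundle flow.json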

This creates an optimized bundle at the path specified in build.output.

Step-by-Step Guide

1. Create a flow configuration

Create server-collect.json:

Loading...

2. Bundle the flow

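An illustrative invocation; the positional config path is an assumption, and -s/--stats (documented below) prints the statistics shown in the output:

walkeros bundle server-collect.json --stats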

Output:

📦 Downloading packages from npm...
✓ @walkeros/collector@latest
✓ @walkeros/server-source-express@latest
✓ @walkeros/destination-demo@latest

🔨 Bundling...
✓ Bundle created: ./dist/server-flow.mjs

📊 Bundle Statistics:
Size: 45.2 KB (minified)
Packages: 3
Format: ESM

3. Review the bundle

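For example, inspect the generated file with standard shell tools (the path ./dist/server-flow.mjs comes from the build output above):

ls -lh ./dist/server-flow.mjs
head -n 20 ./dist/server-flow.mjs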

The bundle is now ready to deploy!

Options

Option             Description
-e, --env <name>   Build specific environment (for multi-env configs)
--all              Build all environments
-s, --stats        Show bundle statistics
--json             Output statistics as JSON (for CI/CD)
--no-cache         Skip package cache, download fresh
--local            Execute locally without Docker
-v, --verbose      Detailed logging

Multi-Environment Example

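A sketch of the build commands only; the environment names are hypothetical and the per-environment flow configuration is omitted here. The --env and --all flags are listed in the options table above:

# Build a single environment
walkeros bundle flow.json --env production

# Build all environments defined in the config
walkeros bundle flow.json --all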

Simulate Command

The simulate command tests your flow configuration with sample events without deploying to production.

Use Case

Before deploying your flow, you want to:

  • Verify your configuration is valid
  • Test event transformations and mappings
  • See what data would be sent to destinations
  • Debug issues locally

Simulate executes your flow with test events and captures all destination API calls.

Basic Usage

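A minimal sketch, assuming the flow config is passed as a positional argument and a test event is supplied inline via --event (documented below):

walkeros simulate flow.json --event '{"name": "page view", "data": {"path": "/"}}'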

Step-by-Step Guide

1. Create a flow to test

Create test-flow.json:

Loading...

2. Run simulation

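An illustrative invocation whose event matches the output shown in the next step (the exact argument order is an assumption):

walkeros simulate test-flow.json \
  --event '{"name": "page view", "data": {"title": "Welcome to Analytics", "path": "/welcome"}}'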

3. Review output

{
  "success": true,
  "event": {
    "name": "page view",
    "entity": "page",
    "action": "view",
    "data": {
      "title": "Welcome to Analytics",
      "path": "/welcome"
    },
    "timestamp": 1701234567890
  },
  "captured": [
    {
      "destination": "demo",
      "event": {
        "name": "pageview",
        "data": {
          "page_title": "Welcome to Analytics",
          "page_path": "/welcome"
        }
      }
    }
  ]
}

The simulation shows:

  • ✅ Event was processed successfully
  • ✅ Mapping transformed "page view" → "pageview"
  • ✅ Data mapping extracted title → page_title and path → page_path

Options

Option                Description
--event <json>        Event JSON string to test
--event-file <path>   Path to JSON file with event(s)
--json                Output as JSON
--local               Execute locally without Docker
-v, --verbose         Detailed logging

Multiple Events

Test multiple events at once:

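One possible setup, assuming --event-file accepts a JSON array of events (the events themselves are placeholders). Save the events to a file, then point the simulation at it:

# events.json
[
  { "name": "page view", "data": { "path": "/" } },
  { "name": "product view", "data": { "id": "P123" } }
]

walkeros simulate flow.json --event-file events.json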

Push Command

The push command executes your flow with a real event, making actual API calls to your configured destinations. Unlike simulate which mocks API calls for safe testing, push performs real HTTP requests—ideal for integration testing and production validation.

Use Case

You've validated your configuration with simulate and now need to:

  • Test with real third-party APIs (GA4, Meta, BigQuery, etc.)
  • Verify production credentials and endpoints work
  • Debug actual API responses and errors
  • Perform integration testing before deployment

Push is your bridge between local testing and production deployment.

Basic Usage

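A minimal sketch (the positional config path is an assumption; see the options table below for the accepted --event formats):

walkeros push flow.json --event '{"name": "page view", "data": {"path": "/"}}'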

Step-by-Step Guide

1. Create a flow configuration

Create api-flow.json with a destination that makes real HTTP calls:

Loading...

2. Create an event file

Create event.json:

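An illustrative event file matching the "order complete" event shown in the output below; the data fields are placeholders:

{
  "name": "order complete",
  "data": {
    "id": "order-123",
    "total": 99.9
  }
}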

3. Push the event

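An illustrative invocation (the positional config path is an assumption):

walkeros push api-flow.json --event event.json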

4. Review the output

📥 Loading event...
📦 Loading flow configuration...
🔨 Bundling flow configuration...
🖥️ Executing in server environment (Node.js)...
Pushing event: order complete
✅ Event pushed successfully
Event ID: 1701234567890-abc12-1
Entity: order
Action: complete
Duration: 1234ms

The event was sent to your real API endpoint!

Options

Option                 Description
-e, --event <source>   Required. Event to push (JSON string, file path, or URL)
--env <name>           Environment name (for multi-environment configs)
--json                 Output results as JSON
-v, --verbose          Verbose output with debug information
-s, --silent           Suppress output (for CI/CD)
--local                Execute in local Node.js instead of Docker

Event Input Formats

The --event parameter accepts three formats:

Inline JSON string:

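For example (the payload is a placeholder; the positional config path is an assumption):

walkeros push flow.json --event '{"name": "page view", "data": {"path": "/"}}'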

File path:

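For example, reusing the event.json created earlier:

walkeros push flow.json --event ./event.json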

URL:

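For example (the URL is a placeholder):

walkeros push flow.json --event https://example.com/fixtures/event.json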

Push vs Simulate

Feature        push                                  simulate
API Calls      Real HTTP requests                    Mocked (captured)
Use Case       Integration testing                   Safe local testing
Side Effects   Full (writes to DBs, sends to APIs)   None
Output         Success/error from real APIs          Captured call data

When to use each:

  • Use simulate first to validate configuration without side effects
  • Use push to verify real integrations work before deployment

JSON Output

For CI/CD pipelines, use --json for machine-readable output:

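For example:

walkeros push flow.json --event event.json --json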

Output:

{
  "success": true,
  "event": {
    "id": "1701234567890-abc12-1",
    "name": "page view",
    "entity": "page",
    "action": "view"
  },
  "duration": 1234
}

On error:

{
  "success": false,
  "error": "Connection refused: https://your-endpoint.com/events",
  "duration": 5023
}

Multi-Environment

Push to specific environments:

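For example (the environment name is hypothetical):

walkeros push flow.json --event event.json --env staging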

Run Collect Command

The run collect command starts an HTTP server that accepts events and processes them through your flow.

Use Case

You need an HTTP endpoint to:

  • Receive events from browser clients, mobile apps, or server-side sources
  • Process events through your collector and destinations
  • Test the full event pipeline locally before deploying to production

This is similar to running a Segment or Jitsu collection endpoint.

Basic Usage

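A minimal sketch, assuming the flow config is passed as a positional argument:

walkeros run collect flow.json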

Step-by-Step Guide

1. Create a collection flow

Create collect.json:

Loading...

2. Start the collector

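An illustrative invocation (the positional config path is an assumption; the server defaults to port 8080 per the options table below):

walkeros run collect collect.json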

Output:

📦 Bundling flow...
✓ Bundle ready

🚀 Starting collection server...
✓ Server running on http://localhost:8080
✓ Endpoint: POST http://localhost:8080/collect
✓ Health check: GET http://localhost:8080/health

3. Send test events

Open a new terminal and send events:

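Example curl calls against the endpoint printed above. The payload shape (name, data, user) mirrors the console output in the next step, but the exact request format expected by your source may differ:

curl -X POST http://localhost:8080/collect \
  -H 'Content-Type: application/json' \
  -d '{"name": "page view", "data": {"title": "Home Page", "path": "/"}, "user": {"id": "user123"}}'

curl -X POST http://localhost:8080/collect \
  -H 'Content-Type: application/json' \
  -d '{"name": "product view", "data": {"id": "P123", "name": "Laptop", "price": 999}}'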

4. See events in console

The collector terminal shows:

[Event Collector] page view
data: {"title":"Home Page","path":"/"}
user.id: user123
timestamp: 1701234567890

[Event Collector] product view
data: {"id":"P123","name":"Laptop","price":999}
timestamp: 1701234567891

Options

Option            Description
--port <number>   Server port (default: 8080)
--local           Execute locally without Docker
-v, --verbose     Detailed logging

Running Pre-Built Bundles

You can also run pre-built bundles directly:

Loading...

Run Serve Command

The run serve command serves web bundles as static files for browser-side tracking.

Use Case

You've created a web flow (browser event tracking) and need to:

  • Host the JavaScript bundle for your website to load
  • Test browser tracking locally before deploying to a CDN
  • Serve multiple versions or configurations

This is like hosting a tracking script (similar to Segment's analytics.js or Amplitude's SDK).

Basic Usage

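A minimal sketch (the positional config path is an assumption; web-flow.json is a placeholder name):

walkeros run serve web-flow.json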

Step-by-Step Guide

1. Create a web flow

Create web-track.json:

Loading...

2. Start the serve mode

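An illustrative invocation matching the output below; the script path and filename could equally come from the flow config instead of the flags (see the options table further down):

walkeros run serve web-track.json --path /scripts --name tracker.js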

Output:

📦 Bundling flow...
✓ Bundle ready: tracker.js

🌐 Starting static file server...
✓ Server running on http://localhost:3000
✓ Script available at: http://localhost:3000/scripts/tracker.js
✓ Health check: GET http://localhost:3000/health

3. Use in HTML

Create test.html:

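A minimal test page; the only requirement assumed here is loading the tracking script, whose URL comes from the serve output above:

<!DOCTYPE html>
<html>
  <head>
    <title>walkerOS test page</title>
    <script src="http://localhost:3000/scripts/tracker.js"></script>
  </head>
  <body>
    <h1>Hello walkerOS</h1>
  </body>
</html>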

4. Test in browser

Open test.html in your browser. Events will be sent to http://localhost:8080/collect.

Options

Option            Description
--port <number>   Server port (default: 3000)
--path <string>   URL path to serve script (default: /)
--name <string>   Script filename (default: from config)
--local           Execute locally without Docker
-v, --verbose     Detailed logging

Complete Example: Web → Server Flow

This example demonstrates a complete analytics pipeline:

  • Browser events captured by web flow
  • Sent to server collection endpoint
  • Logged to console (swap for BigQuery in production)

1. Create Server Collection Flow

Create server-collect.json:

Loading...

2. Create Web Tracking Flow

Create web-track.json:

Loading...

3. Start Collection Server

Terminal 1:

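For example:

walkeros run collect server-collect.json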

4. Start Web Server

Terminal 2:

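For example:

walkeros run serve web-track.json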

5. Test in Browser

Create demo.html:

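A sketch of a demo page. The tagging below follows walkerOS's browser tagging conventions (data-elb / data-elbaction) and an elb helper on window, but the actual triggers and globals depend on your web flow configuration; treat it as illustrative only:

<!DOCTYPE html>
<html>
  <head>
    <script src="http://localhost:3000/scripts/tracker.js"></script>
  </head>
  <body>
    <!-- Tagged entity: intended to produce "promotion view" and "promotion cta" -->
    <div data-elb="promotion" data-elbaction="load:view">
      <h1>Summer Sale</h1>
      <button data-elbaction="click:cta">Shop now</button>
    </div>

    <!-- Custom event pushed via the elb helper, if the bundle exposes it -->
    <script>
      window.elb && window.elb('custom event', { source: 'demo' });
    </script>
  </body>
</html>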

Open in browser. Terminal 1 shows:

[Server Logger] page view
[Server Logger] promotion view
[Server Logger] promotion cta
[Server Logger] custom event

Cache Command

The cache command manages the CLI's package and build caches.

Use Case

The CLI caches downloaded npm packages and compiled builds to speed up repeated operations. You may need to:

  • Clear stale cached packages when debugging version issues
  • Free up disk space by removing old cached builds
  • View cache statistics to understand cache usage

How Caching Works

Package Cache (.tmp/cache/packages/):

  • Mutable versions (latest, ^, ~) are re-checked daily
  • Exact versions (0.4.1) are cached indefinitely
  • Saves network time on repeated builds

Build Cache (.tmp/cache/builds/):

  • Caches compiled bundles based on flow.json content + date
  • Identical configs reuse cached builds within the same day
  • Dramatically speeds up repeated builds (~100x faster)

Basic Usage

Loading...

Commands

View cache info:

Loading...

Output:

Cache directory: .tmp/cache
Cached packages: 12
Cached builds: 5

Clear all caches:

Loading...

Clear only package cache:

Loading...

Clear only build cache:

Loading...

Bypassing Cache

To skip the cache for a single build operation:

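For example, using the --no-cache flag documented for the bundle command (the positional config path is an assumption):

walkeros bundle flow.json --no-cache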

This downloads fresh packages and rebuilds without using or updating the cache.

Options

Option       Description
--packages   Clear only the package cache
--builds     Clear only the build cache

Global Options

These options work with all commands:

Option      Description
--local     Execute without Docker (faster, requires local Node.js)
--verbose   Show detailed logs
--silent    Suppress output
--json      Output as JSON (for CI/CD)
--help      Show help for command
--version   Show CLI version

Execution Modes

Docker Mode (Default)

By default, the CLI uses Docker for isolated execution:

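Any command runs this way by default, for example:

# Executed inside a Docker container by default
walkeros bundle flow.json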

Advantages:

  • Consistent environment across machines
  • Isolated from local Node.js setup
  • Reproducible builds

Requirements:

  • Docker installed and running

Local Mode

Use --local to execute in your current Node.js environment:

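For example:

walkeros bundle flow.json --local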

Advantages:

  • Faster execution (no container startup)
  • No Docker dependency
  • Useful in CI/CD or containers

Requirements:

  • Node.js 18+ installed locally

CI/CD Integration

GitHub Actions

Loading...

Docker Build

Loading...

Troubleshooting

Package Download Issues

If packages fail to download:

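Try bypassing the package cache and enabling verbose logging (both flags are documented above; the positional config path is an assumption):

walkeros bundle flow.json --no-cache --verbose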

Docker Permission Issues

On Linux, you may need to run Docker commands with sudo or add your user to the docker group:

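The usual fix on Linux (standard Docker setup, not specific to walkerOS); log out and back in afterwards so the group change takes effect:

sudo usermod -aG docker $USER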

Port Already in Use

If the port is already in use:

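Either stop the conflicting process or pick a different port with the --port flag, for example:

# See what is listening on the default port
lsof -i :8080

# Or start the collector on another port
walkeros run collect collect.json --port 8081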

Next Steps

💡 Need professional support with your walkerOS implementation? Check out our services.