The original chaos-proxy was developed in Node.js, primarily to support testing of TypeScript and JavaScript applications. Node's event-driven model and npm ecosystem made it a natural fit for rapid development and for supporting custom middleware written in JS/TS - features that are especially valuable for frontend and full-stack teams.

However, unlike chaos-fetch (a related testing tool that is tightly coupled to JS runtimes), chaos-proxy's core concept is language-agnostic: it is a standalone HTTP proxy that injects chaos (latency, failures, header/body transforms, and so on) into any API traffic. That makes it a natural candidate for reimplementation in a language better suited to raw performance and concurrency, such as Go.

In this article, I benchmark the original Node.js/Express-based chaos-proxy against a new Go implementation, chaos-proxy-go. While the Go version can't run custom JS/TS middleware, it aims to deliver the same core chaos features with much higher throughput and lower latency. Using a Caddy server as the backend and the hey tool for load testing, I'll compare both proxies (and direct Caddy) in a controlled environment, sharing all configs and results for reproducibility.

System & Test Environment

To ensure a fair and reproducible comparison, all benchmarks were run locally on the same machine with minimal background activity. Below are the full system specifications and environment details:

System Specs:

Test Setup:

All tests were performed on localhost to eliminate network variability. The backend API was served by a Caddy server on port 8080, and both chaos-proxy (Node.js) and chaos-proxy-go were configured to proxy requests to it, each listening on port 5000. No other heavy processes were running during the benchmarks. Each test scenario was run multiple times, and the best and average results are reported.

Software Versions:

Backend Setup: Caddy Server

For all benchmarks, I used a Caddy server as the backend API. Caddy is a modern, fast, and easy-to-configure web server, making it ideal for consistent, low-overhead benchmarking.

Caddyfile Configuration:

# Caddyfile
http://localhost:8080 {
	# Add CORS headers for all responses
	header {
		Access-Control-Allow-Origin *
		Access-Control-Allow-Methods "GET, POST, PUT, DELETE, OPTIONS"
		Access-Control-Allow-Headers "Content-Type, Authorization"
		Content-Type application/json
	}

	# Simple JSON API endpoint	
	route /api/hello {
		respond `{
		"message": "Hello, World!",
		"server": "Caddy",
		"timestamp": "2025-10-03T22:00:00Z"}
		` 200
	}

	# Log all requests
	log {
		output file access.log
		format json
	}
}

The benchmark uses a single /api/hello endpoint that returns a small JSON response. The CORS headers ensure compatibility with any client, and request logging is enabled for potential further analysis.

This setup ensures that the backend is not a bottleneck and that all proxy benchmarks reflect the overhead of the proxy layer itself.
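Before running any load tests, it's worth a quick sanity check that the endpoint responds as expected:

curl -s http://localhost:8080/api/hello

This should return the JSON payload defined in the Caddyfile above.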

Proxy Setups

To compare the performance impact of each proxy implementation, I tested three scenarios:

1. Direct to Caddy (Baseline)

Requests are sent straight to the Caddy backend at http://localhost:8080/api/hello, with no proxy in between. This provides a baseline for the lowest possible latency and highest throughput.

2. chaos-proxy (Node.js/Express)

The original chaos-proxy is a Node.js/Express-based HTTP proxy designed for injecting chaos into API traffic.

For this benchmark, the config file (chaos.yaml) was simply:

target: http://localhost:8080
port: 5000
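With only a target and port set, no chaos is injected, so the benchmark measures pure proxying overhead. For context, a configuration with chaos enabled might look something like the following; note that the chaos field names here are illustrative assumptions, not necessarily chaos-proxy's actual schema:

# hypothetical example: the chaos fields below are assumed, not the real schema
target: http://localhost:8080
port: 5000
chaos:
  latency:
    ms: 500
    probability: 0.1
  failure:
    status: 503
    probability: 0.05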

3. chaos-proxy-go (Go)

chaos-proxy-go is a Go reimplementation of the same proxy concept, focused on performance and concurrency.

Configuration Notes:

chaos-proxy-go was configured equivalently to the Node.js version: listening on port 5000 and forwarding to the Caddy backend at http://localhost:8080, with no chaos rules enabled. Both proxies were therefore measured in pure pass-through mode.
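chaos-proxy-go's actual source isn't reproduced here, but the core of such a proxy is small in Go. Below is a minimal sketch of the idea using only the standard library; the port, backend URL, and chaos hook are assumptions matching this benchmark's setup, not the real chaos-proxy-go code:

package main

import (
	"log"
	"math/rand"
	"net/http"
	"net/http/httputil"
	"net/url"
	"time"
)

func main() {
	// Backend and listen port match the benchmark setup (assumed, not actual source).
	target, err := url.Parse("http://localhost:8080")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(target)

	// chaosRate is 0 here, i.e. pure pass-through, as in the benchmark.
	// Raising it (e.g. to 0.1) would delay that fraction of requests.
	const chaosRate = 0.0

	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if rand.Float64() < chaosRate {
			// Inject 100-300 ms of artificial latency.
			time.Sleep(time.Duration(100+rand.Intn(200)) * time.Millisecond)
		}
		proxy.ServeHTTP(w, r)
	})

	log.Fatal(http.ListenAndServe(":5000", handler))
}

httputil.ReverseProxy reuses backend connections through Go's default HTTP transport, which is a large part of why the Go proxy's overhead stays low at 50 concurrent connections.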

Benchmarking Methodology

To ensure a fair and transparent comparison, I used the same benchmarking tool, request pattern, and methodology for all scenarios. The backend was restarted between tests to clear any caches.

Test Command:

For each scenario, I ran:

hey -n 1000 -c 50 http://localhost:<port>/api/hello

Here, -n 1000 sets the total number of requests, -c 50 sets the number of concurrent connections, and <port> is 8080 for the direct-to-Caddy baseline or 5000 for either proxy.

Reproducibility:

All configuration files, commands, and system specs are included in this article. Anyone with a similar setup should be able to reproduce these results.
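For reference, the exact commands per scenario (both proxies listen on port 5000, so only one proxy process was running at a time):

hey -n 1000 -c 50 http://localhost:8080/api/hello   # direct to Caddy
hey -n 1000 -c 50 http://localhost:5000/api/hello   # chaos-proxy (Node.js)
hey -n 1000 -c 50 http://localhost:5000/api/hello   # chaos-proxy-go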

Results

1. Direct to Caddy (Baseline)

./hey -n 1000 -c 50 http://localhost:8080/api/hello

Summary:
  Total:        0.0352 secs
  Slowest:      0.0121 secs
  Fastest:      0.0001 secs
  Average:      0.0016 secs
  Requests/sec: 28383.8519

  Total data:   94000 bytes
  Size/request: 94 bytes

Response time histogram:
  0.000 [1]     |
  0.001 [567]   |■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
  0.003 [342]   |■■■■■■■■■■■■■■■■■■■■■■■■
  0.004 [32]    |■■
  0.005 [5]     |
  0.006 [2]     |
  0.007 [19]    |■
  0.009 [8]     |■
  0.010 [3]     |
  0.011 [2]     |
  0.012 [19]    |■


Latency distribution:
  10% in 0.0002 secs
  25% in 0.0005 secs
  50% in 0.0012 secs
  75% in 0.0018 secs
  90% in 0.0024 secs
  95% in 0.0063 secs
  99% in 0.0116 secs

Details (average, fastest, slowest):
  DNS+dialup:   0.0002 secs, 0.0001 secs, 0.0121 secs
  DNS-lookup:   0.0002 secs, 0.0000 secs, 0.0047 secs
  req write:    0.0001 secs, 0.0000 secs, 0.0017 secs
  resp wait:    0.0011 secs, 0.0001 secs, 0.0058 secs
  resp read:    0.0002 secs, 0.0000 secs, 0.0016 secs

Status code distribution:
  [200] 1000 responses

2. chaos-proxy (Node.js/Express)

./hey -n 1000 -c 50 http://localhost:5000/api/hello

Summary:
  Total:        0.2346 secs
  Slowest:      0.0430 secs
  Fastest:      0.0049 secs
  Average:      0.0115 secs
  Requests/sec: 4262.3420

  Total data:   94000 bytes
  Size/request: 94 bytes

Response time histogram:
  0.005 [1]     |
  0.009 [9]     |
  0.013 [880]   |■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
  0.016 [60]    |■■■
  0.020 [0]     |
  0.024 [8]     |
  0.028 [0]     |
  0.032 [10]    |
  0.035 [22]    |■
  0.039 [0]     |
  0.043 [10]    |


Latency distribution:
  10% in 0.0093 secs
  25% in 0.0097 secs
  50% in 0.0103 secs
  75% in 0.0110 secs
  90% in 0.0126 secs
  95% in 0.0224 secs
  99% in 0.0417 secs

Details (average, fastest, slowest):
  DNS+dialup:   0.0002 secs, 0.0049 secs, 0.0430 secs
  DNS-lookup:   0.0002 secs, 0.0000 secs, 0.0046 secs
  req write:    0.0000 secs, 0.0000 secs, 0.0002 secs
  resp wait:    0.0112 secs, 0.0049 secs, 0.0370 secs
  resp read:    0.0000 secs, 0.0000 secs, 0.0005 secs

Status code distribution:
  [200] 1000 responses

3. chaos-proxy-go (Go)

./hey -n 1000 -c 50 http://localhost:5000/api/hello

Summary:
  Total:        0.1133 secs
  Slowest:      0.0222 secs
  Fastest:      0.0004 secs
  Average:      0.0053 secs
  Requests/sec: 8828.0577

  Total data:   94000 bytes
  Size/request: 94 bytes

Response time histogram:
  0.000 [1]     |
  0.003 [202]   |■■■■■■■■■■■■■■■■■■■■■■■■■
  0.005 [318]   |■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
  0.007 [214]   |■■■■■■■■■■■■■■■■■■■■■■■■■■■
  0.009 [135]   |■■■■■■■■■■■■■■■■■
  0.011 [59]    |■■■■■■■
  0.013 [57]    |■■■■■■■
  0.016 [6]     |■
  0.018 [4]     |■
  0.020 [2]     |
  0.022 [2]     |


Latency distribution:
  10% in 0.0018 secs
  25% in 0.0028 secs
  50% in 0.0046 secs
  75% in 0.0071 secs
  90% in 0.0099 secs
  95% in 0.0118 secs
  99% in 0.0140 secs

Details (average, fastest, slowest):
  DNS+dialup:   0.0002 secs, 0.0004 secs, 0.0222 secs
  DNS-lookup:   0.0002 secs, 0.0000 secs, 0.0046 secs
  req write:    0.0000 secs, 0.0000 secs, 0.0009 secs
  resp wait:    0.0048 secs, 0.0004 secs, 0.0157 secs
  resp read:    0.0001 secs, 0.0000 secs, 0.0060 secs

Status code distribution:
  [200] 1000 responses

Analysis

The benchmark results highlight the performance impact of each proxy layer:

Scenario                 Requests/sec   Avg Latency (s)   99th %ile Latency (s)   Fastest (s)   Slowest (s)   Errors
Direct to Caddy          28,384         0.0016            0.0116                  0.0001        0.0121        0
chaos-proxy (Node.js)    4,262          0.0115            0.0417                  0.0049        0.0430        0
chaos-proxy-go (Go)      8,828          0.0053            0.0140                  0.0004        0.0222        0

Key Observations

Direct to Caddy sets the ceiling at roughly 28,400 req/s with an average latency of 1.6 ms.
chaos-proxy (Node.js) reaches about 4,260 req/s, roughly 15% of the baseline, adding about 10 ms of average latency per request.
chaos-proxy-go sustains about 8,830 req/s, just over twice the Node.js throughput, while adding only about 3.7 ms of average latency.
Tail latency shows the starkest gap: the Go proxy's 99th percentile (0.0140 s) stays close to the direct baseline (0.0116 s), while the Node.js proxy's (0.0417 s) is about three times higher.
All three scenarios completed 1000/1000 requests with status 200 and zero errors.

Conclusion

This benchmark demonstrates the performance advantage of Go for a pass-through HTTP proxy: chaos-proxy-go delivered roughly twice the throughput and less than half the average latency of the Node.js version, with a far tighter latency tail. While the original chaos-proxy is highly extensible and integrates well with JS/TS applications, it incurs a substantial performance penalty under load. If you need maximum performance and don't require custom JavaScript/TypeScript middleware, chaos-proxy-go is the clear winner. For teams prioritizing extensibility in JS/TS, the original chaos-proxy remains a flexible option; just be aware of the performance tradeoff.