This project compares the performance of two JavaScript runtimes, Node.js and Bun, across multiple API workloads. The focus is on CRUD operations, concurrency handling, cold start times, and throughput. The goal is to identify which runtime is better suited to which kind of API workload.
```
benchmark-node-bun/
├── bun-api/            # Bun API implementation
│   ├── server.js
│   └── Dockerfile
├── node-api/           # Node.js API implementation
│   ├── server.js
│   └── Dockerfile
├── utils/              # Shared utilities (e.g., Fibonacci function)
│   └── fibonacci.js
├── benchmarks/         # Benchmark scripts and results
│   ├── benchmark.sh
│   └── results/
│       └── benchmark-results.md
├── .github/            # CI/CD configuration
│   └── workflows/
│       └── benchmark.yml
└── README.md           # Main documentation (this file)
```
- CRUD Operations (see the server sketch after this list):
  - `POST /tasks`: Create a new task.
  - `GET /tasks`: Retrieve all tasks.
  - `GET /tasks/:id`: Retrieve a specific task.
  - `PATCH /tasks/:id`: Update a task.
  - `DELETE /tasks/:id`: Delete a task.
- Fibonacci Sequence Calculation:
  - `GET /fibonacci/:n`: Calculate and return the nth Fibonacci number.
- Cold Start Time Measurement:
  - Measure how quickly each runtime starts and serves its first request.
- Concurrent Load Testing:
  - Use tools like `wrk` and `ab` to measure throughput and latency under concurrent requests.
- Automated CI/CD Benchmarking:
  - Run benchmarks automatically with GitHub Actions on every code push.
- Consistent Environments with Docker:
  - Ensure reproducible results by running both APIs in Docker containers.
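For orientation, here is a minimal sketch of the API surface described above. This is an illustrative assumption, not the repo's actual `server.js`: the in-memory task store and helper names are invented, and the inlined Fibonacci stands in for the shared `utils/fibonacci.js`.

```js
// Sketch of the benchmarked API (assumed shape, not the repo's actual code).
// Uses node:http so it runs on Node.js and on Bun's Node compatibility layer.
const http = require("node:http");

const tasks = new Map();
let nextId = 1;

// Naive recursive Fibonacci: intentionally CPU-bound for benchmarking.
// (The repo keeps its shared implementation in utils/fibonacci.js.)
function fibonacci(n) {
  return n < 2 ? n : fibonacci(n - 1) + fibonacci(n - 2);
}

function send(res, status, body) {
  res.writeHead(status, { "Content-Type": "application/json" });
  res.end(JSON.stringify(body));
}

function readBody(req, callback) {
  let data = "";
  req.on("data", (chunk) => (data += chunk));
  req.on("end", () => callback(JSON.parse(data)));
}

const server = http.createServer((req, res) => {
  const fib = req.url.match(/^\/fibonacci\/(\d+)$/);
  if (req.method === "GET" && fib) {
    const n = Number(fib[1]);
    return send(res, 200, { n, result: fibonacci(n) });
  }
  if (req.url === "/tasks" && req.method === "GET") {
    return send(res, 200, [...tasks.values()]);
  }
  if (req.url === "/tasks" && req.method === "POST") {
    return readBody(req, (body) => {
      const task = { id: nextId++, ...body };
      tasks.set(task.id, task);
      send(res, 201, task);
    });
  }
  const match = req.url.match(/^\/tasks\/(\d+)$/);
  if (match) {
    const id = Number(match[1]);
    if (!tasks.has(id)) return send(res, 404, { error: "task not found" });
    if (req.method === "GET") return send(res, 200, tasks.get(id));
    if (req.method === "PATCH") {
      return readBody(req, (body) => {
        tasks.set(id, { ...tasks.get(id), ...body });
        send(res, 200, tasks.get(id));
      });
    }
    if (req.method === "DELETE") {
      tasks.delete(id);
      res.writeHead(204);
      return res.end();
    }
  }
  send(res, 404, { error: "route not found" });
});

server.listen(process.env.PORT ?? 3000);
```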
- Performance: Handle at least 500 concurrent requests and CPU-bound tasks with minimal latency.
- Scalability: Support multi-core operations through clustering or workers (see the sketch after this list).
- Reliability: Handle traffic spikes without downtime.
- Reproducibility: Ensure consistent results across multiple runs.
- Portability: Use Docker to maintain consistent performance across environments.
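One common way to meet the scalability goal on Node.js is the built-in `cluster` module. The sketch below is illustrative (the repo may or may not cluster its servers); on Bun, worker threads would play a similar role.

```js
// cluster.js (hypothetical): fork one worker per CPU core so the API can
// serve requests on all cores; the primary process restarts dead workers.
const cluster = require("node:cluster");
const os = require("node:os");

if (cluster.isPrimary) {
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork(); // each worker re-runs this file in worker mode
  }
  cluster.on("exit", () => cluster.fork()); // keep the pool full after a crash
} else {
  // Each worker starts the HTTP server; the primary distributes
  // incoming connections among the workers.
  require("./server");
}
```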
- Docker installed on your machine.
- Node.js and Bun installed locally for development.
- `wrk`, ApacheBench (`ab`), and `hyperfine` installed for benchmarking.
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd benchmark-node-bun
   ```

2. Run the APIs with Docker (a sketch of the Dockerfiles follows this list):

   - Node.js API:

     ```bash
     docker build -t node-api ./node-api
     docker run -p 3000:3000 node-api
     ```

   - Bun API:

     ```bash
     docker build -t bun-api ./bun-api
     docker run -p 3001:3001 bun-api
     ```

3. Verify the APIs are running:

   - Node.js API: `curl http://localhost:3000/tasks`
   - Bun API: `curl http://localhost:3001/tasks`
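The Dockerfiles themselves are not reproduced in this README; as a minimal sketch, `node-api/Dockerfile` might look like the following (the base image and commands are assumptions, not the repo's actual file):

```dockerfile
# Hypothetical node-api/Dockerfile (a sketch, not the repo's actual file)
FROM node:20-slim
WORKDIR /app
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

The Bun variant would be analogous, swapping the base image for `oven/bun`, the exposed port for 3001, and the start command for `CMD ["bun", "run", "server.js"]`.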
Throughput on the tasks endpoint with `wrk`:

```bash
wrk -t12 -c100 -d30s http://localhost:3000/tasks  # Node.js
wrk -t12 -c100 -d30s http://localhost:3001/tasks  # Bun
```

CPU-bound Fibonacci endpoint with `wrk`:

```bash
wrk -t12 -c100 -d10s http://localhost:3000/fibonacci/20  # Node.js
wrk -t12 -c100 -d10s http://localhost:3001/fibonacci/20  # Bun
```

Fixed request count with ApacheBench:

```bash
ab -n 10000 -c100 http://localhost:3000/tasks  # Node.js
ab -n 10000 -c100 http://localhost:3001/tasks  # Bun
```

Startup comparison with `hyperfine`:

```bash
hyperfine "bun run ./bun-api/server.js" "node ./node-api/server.js"
```
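Note that `hyperfine` times a command from launch to exit, so pointing it at a server that runs indefinitely will never complete a run; a cold-start measurement needs the process to terminate after its first response. One hypothetical wrapper (not part of the repo) that `hyperfine` could time instead:

```bash
#!/usr/bin/env bash
# coldstart-node.sh (hypothetical): start the server, poll until the first
# successful response, then stop it, so the timed run covers
# start -> first request served.
node ./node-api/server.js &
SERVER_PID=$!
until curl -sf http://localhost:3000/tasks > /dev/null; do
  sleep 0.01
done
kill "$SERVER_PID"
```

Then run `hyperfine --warmup 3 ./coldstart-node.sh` (and an equivalent script for the Bun server).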
The project uses GitHub Actions to automate benchmark tests on every push.
- Benchmarks run via the `.github/workflows/benchmark.yml` workflow (a sketch follows).
- Results are stored in `benchmarks/results/benchmark-results.md` for tracking.
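The workflow is not reproduced here; a minimal sketch of what `benchmark.yml` might contain (the job layout and step names are assumptions, though `benchmarks/benchmark.sh` is the repo's script):

```yaml
# Hypothetical .github/workflows/benchmark.yml (a sketch, not the actual file)
name: benchmark
on: [push]
jobs:
  benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build and start both APIs
        run: |
          docker build -t node-api ./node-api && docker run -d -p 3000:3000 node-api
          docker build -t bun-api ./bun-api && docker run -d -p 3001:3001 bun-api
      - name: Run benchmarks
        run: ./benchmarks/benchmark.sh
```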
- Binary File Upload Benchmarking: Measure performance under large file uploads.
- Monitoring with Prometheus/Grafana: Track resource usage trends.
- Gradual Load Testing with k6: Identify runtime bottlenecks by increasing load gradually (see the sketch after this list).
- Chaos Testing: Test resilience under network disruptions and simulated crashes.
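For the planned k6 tests, a ramping script could look like the following sketch (stage durations and targets are illustrative assumptions, not decided values):

```javascript
// Hypothetical k6 ramp test (a sketch; stages are assumptions).
import http from "k6/http";

export const options = {
  stages: [
    { duration: "30s", target: 100 }, // warm up to 100 virtual users
    { duration: "1m", target: 500 },  // push toward the 500-concurrent goal
    { duration: "30s", target: 0 },   // ramp back down
  ],
};

export default function () {
  http.get("http://localhost:3000/tasks"); // point at :3001 for the Bun API
}
```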
This project is licensed under the MIT License. See LICENSE for details.