Speedracer

Run, trace, and report JavaScript performance with Chrome

Speedracer executes JavaScript races in Chrome, captures DevTools traces, and generates concise performance reports for regression testing, benchmarking, and analysis.

Overview

Speedracer is a CLI tool that runs JavaScript "races" inside Google Chrome, captures low‑level DevTools trace events, and produces both raw .trace.gz files and summarized .speedracer reports. It is designed for developers and performance engineers who need precise metrics on scripting, rendering, and painting phases.

How it works

Install globally via npm, write races using the simple `race(name, fn)` API (ES6 or CommonJS), and invoke `speedracer run` to execute them. Chrome (preferably Canary on macOS for headless mode) is driven through the DevTools protocol, and all artifacts are stored in a `.speedracer` directory. Use `speedracer display` to view a quick summary, or load the trace files into Chrome DevTools for deeper investigation.
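
A race file can be as small as the sketch below. Note the hedge: whether speedracer exposes `race` as a global or via an import is an assumption here, so the stub on the first line only exists to let the file also run standalone under plain Node.

```javascript
// races/sort.js — minimal race definition (sketch; how speedracer
// supplies `race` — global vs. import — is an assumption).
// The fallback stub lets this file run standalone under plain Node.
const race = globalThis.race || ((name, fn) => { fn(); });

race('sort 10k numbers', () => {
  // The work being measured: scripting time for a numeric sort.
  const data = Array.from({ length: 10000 }, () => Math.random());
  data.sort((a, b) => a - b);
});
```

With race files in place, `speedracer run` executes them in Chrome and writes the resulting artifacts into `.speedracer/`.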

Typical deployment

Add the tool to your CI pipeline, ensure Chrome is available on the build agents, and script speedracer run as part of performance regression checks. The generated JSON reports can be diffed programmatically, while the trace files can be archived or fed into custom analysis pipelines.
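
As a sketch of the "diffed programmatically" step, a small Node helper could flag regressions between an archived baseline report and the current run. The `name` and `duration` field names are assumptions, not the documented report schema; adapt them to the actual JSON emitted into `.speedracer/`.

```javascript
// compare-reports.js — sketch of diffing two speedracer JSON reports
// in CI. Field names (`name`, `duration`) are assumptions; adjust to
// the real report schema.
function diffReports(baseline, current, tolerance = 0.1) {
  // Index baseline durations by race name, then compare each current
  // race against its baseline plus the allowed tolerance.
  const byName = new Map(baseline.map(r => [r.name, r.duration]));
  return current
    .filter(r => byName.has(r.name))
    .map(r => ({
      name: r.name,
      before: byName.get(r.name),
      after: r.duration,
      regressed: r.duration > byName.get(r.name) * (1 + tolerance),
    }));
}

// Inline data standing in for two archived reports:
const baseline = [{ name: 'sort 10k numbers', duration: 12.4 }];
const current  = [{ name: 'sort 10k numbers', duration: 15.1 }];
console.log(diffReports(baseline, current));
```

A CI job would read the two JSON files from disk instead of inlining them, and fail the build when any entry has `regressed: true`.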

Highlights

Headless Chrome execution via DevTools protocol
Automatic generation of compressed trace files and JSON reports
Simple race definition using ES6/CommonJS modules
CLI commands `run` and `display` for quick workflows

Pros

  • Provides low‑level Chrome trace data for deep analysis
  • Integrates with existing JavaScript test suites
  • Minimal setup: just npm install and Chrome
  • Outputs both raw traces and summarized reports

Considerations

  • Unmaintained; no active updates or support
  • Headless mode limited to Chrome Canary on macOS
  • Lacks advanced analysis or visual dashboards
  • Requires manual handling of large trace files

Managed products teams compare with

When teams consider Speedracer, these hosted platforms usually appear on the same shortlist.

Blackfire Continuous Profiler

Low-overhead continuous profiling for app performance optimization.

Datadog Continuous Profiler

Always-on code profiling to cut latency and cloud costs.

Elastic Universal Profiling

Whole-system, always-on profiling with no instrumentation.

Looking for a hosted option? These are the services engineering teams benchmark against before choosing open source.

Fit guide

Great for

  • Developers needing precise performance metrics during CI
  • Teams performing regression benchmarks of front‑end code
  • Researchers exploring Chrome rendering and scripting timelines
  • Projects that can manage raw trace files for custom analysis

Not ideal when

  • Users requiring cross‑browser performance testing
  • Environments without Chrome or macOS Canary support
  • Teams seeking out‑of‑the‑box visual dashboards
  • Projects needing ongoing maintenance and feature updates

How teams use it

CI regression benchmark

Detect performance regressions between builds by comparing generated reports.

Micro‑benchmarking a rendering algorithm

Capture detailed scripting, rendering, and painting times to optimize the algorithm.

Custom trace analysis

Export .trace.gz files and load them into Chrome DevTools for manual investigation.
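
Assuming the `.trace.gz` artifacts are plain gzip-compressed trace JSON, a one-liner yields a file loadable in DevTools. The file name below is a stand-in created purely for illustration, not one speedracer produced.

```shell
# Create a stand-in artifact so the commands run anywhere; a real run
# writes its own trace files into .speedracer/.
mkdir -p .speedracer
printf '{"traceEvents":[]}' | gzip > .speedracer/demo.trace.gz

# Decompress to a .trace file (keeping the original .gz); the result
# can be loaded via Chrome DevTools > Performance > "Load profile...".
gunzip -c .speedracer/demo.trace.gz > .speedracer/demo.trace
```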

Performance education

Teach developers how Chrome’s rendering pipeline works using real trace data.

Tech snapshot

JavaScript 100%

Tags

performance-test, performance-analysis, performance-metrics, performance-testing, performance, runner, chrome-headless

Frequently asked questions

What browsers does Speedracer support?

It drives Google Chrome via the DevTools protocol; headless mode works with Chrome Canary on macOS.

How are performance results stored?

Each race produces a compressed .trace.gz file and a .speedracer JSON report saved in the .speedracer directory.

Can I compare runs over time?

Speedracer generates summary reports that can be programmatically diffed to identify faster or slower runs.

Do I need to write tests in a special format?

Races are defined with a simple `race(name, fn)` API using standard ES6 or CommonJS modules.

Is there a graphical UI?

No built‑in UI; reports are JSON and traces can be opened in Chrome DevTools’ Performance panel.

Project at a glance

Status: Dormant
Stars: 1,850
Watchers: 1,850
Forks: 23
License: MIT
Repo age: 8 years old
Last commit: 3 years ago
Primary language: JavaScript
