Metrics are a powerful and cost-efficient tool for understanding the health and performance of your code in production. But it's hard to decide what metrics to track and even harder to write queries to understand the data.
Autometrics provides a wrapper function and decorator to instrument functions, classes, and methods with the most useful metrics: request rate, error rate, and latency. It standardizes these metrics and then generates powerful Prometheus queries based on your function details to help you quickly identify and debug issues in production.
Learn more about Autometrics at autometrics.dev.
- ✨ `autometrics()` wrapper / `@Autometrics()` decorator instruments any function or class method to track its most useful metrics
- 🌳 Works in Deno, NodeJS and browser environments (*)
- 💡 Writes Prometheus queries so you can understand the data generated without knowing PromQL
- 🔗 Injects links to live Prometheus charts directly into each function's doc comments
- 🔍 Helps you to identify commits that introduced errors or increased latency
- 📊 Grafana dashboards work out of the box and visualize the performance of instrumented functions & SLOs
- ⚡ Minimal runtime overhead
- 🚨 Allows you to define alerts using SLO best practices directly in your source code (see the sketch below the quick example)

(*) Pushing metrics from client-side and FaaS environments is currently experimental.
A quick example of wrapping a function:

```typescript
import { autometrics } from "@autometrics/autometrics";

const createUserWithMetrics = autometrics(async function createUser(payload: User) {
  // ...
});

createUserWithMetrics();
```
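The SLO-based alerting mentioned in the feature list works by attaching an objective to the functions you instrument. A minimal sketch, assuming the `Objective`, `ObjectiveLatency` and `ObjectivePercentile` exports and the options-object form of the `autometrics` wrapper:

```typescript
import {
  autometrics,
  Objective,
  ObjectiveLatency,
  ObjectivePercentile,
} from "@autometrics/autometrics";

// Hypothetical SLO: 99% of calls succeed, and 99% complete within 250ms.
const API_SLO: Objective = {
  name: "api",
  successRate: ObjectivePercentile.P99,
  latency: [ObjectiveLatency.Ms250, ObjectivePercentile.P99],
};

// Functions wrapped with this objective are grouped under the "api" SLO,
// which the generated alerting rules and dashboards can then use.
const createUserWithSlo = autometrics(
  { objective: API_SLO },
  async function createUser(payload: User) {
    // ...
  },
);
```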
(See the recipes below for other setup scenarios.)
- Install the library

  ```sh
  npm install @autometrics/autometrics @autometrics/exporter-prometheus
  # or
  yarn add @autometrics/autometrics @autometrics/exporter-prometheus
  # or
  pnpm add @autometrics/autometrics @autometrics/exporter-prometheus
  ```
- Instrument your code using the `autometrics` wrapper or the `@Autometrics()` decorator
  ```typescript
  import { autometrics } from "@autometrics/autometrics";

  const createUserWithMetrics = autometrics(async function createUser(payload: User) {
    // ...
  });

  createUserWithMetrics();
  ```
  Or, using the decorator on a class method:

  ```typescript
  import { Autometrics } from "@autometrics/autometrics";

  class User {
    @Autometrics()
    async createUser(payload: User) {
      // ...
    }
  }
  ```
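  If you wrap an anonymous or arrow function, there is no function name for the metrics to be reported under. A minimal sketch, assuming the wrapper also accepts an options object with `functionName` and `moduleName` fields:

  ```typescript
  import { autometrics } from "@autometrics/autometrics";

  // Hypothetical example: an arrow function has no usable `name`, so the
  // function (and optionally module) name is passed explicitly.
  const getUser = autometrics(
    { functionName: "getUser", moduleName: "user" },
    async (id: string) => {
      // ...
    },
  );
  ```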
- Call `init()` to set up a Prometheus scrape endpoint

  This endpoint exposes the metrics from your application so that they can be scraped by Prometheus.
  ```typescript
  import { init } from "@autometrics/exporter-prometheus";

  init(); // starts the webserver with the `/metrics` endpoint on port 9464
  ```
- Run Prometheus locally to validate and preview the data

  You can use the open source Autometrics CLI to run an automatically configured Prometheus instance locally and see the metrics registered by your instrumented functions. See the Autometrics CLI docs for more information.

  Alternatively, you can configure Prometheus manually:
  ```yaml
  scrape_configs:
    - job_name: my-app
      metrics_path: /metrics # The default path for the Autometrics Prometheus exporter
      static_configs:
        - targets: ['localhost:9464'] # The default port for the Autometrics Prometheus exporter
      scrape_interval: 200ms
      # For a real deployment, you would want the scrape interval to be
      # longer, but for testing you want the data to show up quickly
  ```
See the docs for more Prometheus configurations.
- Install the IDE extension

  To get the charts in VSCode, download the Autometrics VSCode extension.

  If you're using any other IDE, you can install and add the TypeScript plugin directly:

  ```sh
  npm install --save-dev @autometrics/typescript-plugin
  ```

  Add the language service plugin to the `tsconfig.json` file:
  ```json
  {
    "compilerOptions": {
      "plugins": [
        {
          "name": "@autometrics/typescript-plugin",
          "prometheusUrl": ""
        }
      ]
    }
  }
  ```
Below are different recipes for using Autometrics with a server-side setup and edge/client-side setups. If you would like to see examples with specific frameworks, please have a look at the examples/ directory.
- Install the library

  ```sh
  npm install @autometrics/autometrics @autometrics/exporter-prometheus
  # or
  yarn add @autometrics/autometrics @autometrics/exporter-prometheus
  # or
  pnpm add @autometrics/autometrics @autometrics/exporter-prometheus
  ```
- Anywhere in your source code:
  ```typescript
  import { autometrics } from "@autometrics/autometrics";
  import { init } from "@autometrics/exporter-prometheus";

  init(); // starts the webserver with the `/metrics` endpoint on port 9464

  async function createUserRaw(payload: User) {
    // ...
  }

  const createUser = autometrics(createUserRaw);
  // ^ instrumented function
  ```
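  The wrapper is transparent to callers: the returned function keeps the original signature and records request rate, error rate and latency around each call. A hypothetical usage sketch (the payload shape is made up for illustration):

  ```typescript
  // Call the wrapped function exactly like the original `createUserRaw`.
  const user = await createUser({ name: "Alice" } as User);
  ```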
- Install the library

  ```sh
  npm install @autometrics/autometrics @autometrics/exporter-prometheus-push-gateway
  # or
  yarn add @autometrics/autometrics @autometrics/exporter-prometheus-push-gateway
  # or
  pnpm add @autometrics/autometrics @autometrics/exporter-prometheus-push-gateway
  ```
- Anywhere in your source code:
  ```typescript
  import { autometrics } from "@autometrics/autometrics";
  import { init } from "@autometrics/exporter-prometheus-push-gateway";

  init({ url: "https://<your-push-gateway>" });

  async function createUserRaw(payload: User) {
    // ...
  }

  const createUser = autometrics(createUserRaw);
  // ^ instrumented function
  ```
- Install the library

  ```sh
  npm install @autometrics/autometrics @autometrics/exporter-otlp-http
  # or
  yarn add @autometrics/autometrics @autometrics/exporter-otlp-http
  # or
  pnpm add @autometrics/autometrics @autometrics/exporter-otlp-http
  ```
- Anywhere in your source code:
  ```typescript
  import { autometrics } from "@autometrics/autometrics";
  import { init } from "@autometrics/exporter-otlp-http";

  init({ url: "https://<your-otel-collector>" });

  async function createUserRaw(payload: User) {
    // ...
  }

  const createUser = autometrics(createUserRaw);
  // ^ instrumented function
  ```
Issues, feature suggestions, and pull requests are very welcome!
If you are interested in getting involved:
- Join the conversation on Discord
- Ask questions and share ideas in the GitHub Discussions
- Take a look at the overall Autometrics Project Roadmap