Add build_info metric and use it in generated queries (#35)
* Add blurb to readme about identifying commits
* Remove "coming soon" from readme item on adding links to live Prom charts
* Initialize Prometheus Gauge for build_info
* Add updown counter for build info to otel tracker
* Implement set_build_info for OTEL and Prom, and call when we set the default tracker
* Move set_build_info call into create_tracker
* Update prometheus queries
* Update prometheus URL tests
* Add test for build_info gauge for prometheus tracker (skipped test for otel tracker)
* Update otel tracker and tracker tests after finding otel prometheus bug
* Ensure set_build_info is only called once
* Update changelog
* Add set_build_info to the TrackMetrics Protocol
* Fix build_info query based off of autometrics-dev/autometrics-shared#8
* Rename create_tracker to init_tracker
* Update pyright
* Update README to mention OpenTelemetry tracker does not work with build_info
---------
Co-authored-by: Brett Beutell <[email protected]>
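
For context, a minimal sketch of how the pieces named in these commits could fit together, assuming the `prometheus_client` API. `TrackMetrics` and `set_build_info` are names used in this PR; the gauge setup, the once-only guard, and the `PrometheusTracker` body are illustrative rather than the actual implementation:

```python
# Hypothetical sketch only; the real autometrics-py code differs.
# TrackMetrics and set_build_info come from the commit list above; the
# "build_info" gauge name and its labels come from the README section below.
from typing import Protocol

from prometheus_client import Gauge


class TrackMetrics(Protocol):
    def set_build_info(self, commit: str, version: str) -> None:
        """Record the service's version and commit on a build_info metric."""


# build_info is a gauge permanently set to 1; the interesting data lives
# in its labels, following the Robust Perception pattern linked below.
BUILD_INFO = Gauge("build_info", "Build information", ["commit", "version"])

_build_info_set = False  # "Ensure set_build_info is only called once"


class PrometheusTracker:
    def set_build_info(self, commit: str, version: str) -> None:
        global _build_info_set
        if not _build_info_set:
            BUILD_INFO.labels(commit=commit, version=version).set(1)
            _build_info_set = True
```

Keeping the gauge fixed at 1 and putting the data in the `version` and `commit` labels is what lets the generated queries join other metrics against it.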
README.md (+20 -2)
@@ -17,7 +17,8 @@ See [Why Autometrics?](https://github.com/autometrics-dev#why-autometrics) for m
 most useful metrics
 - 💡 Writes Prometheus queries so you can understand the data generated without
   knowing PromQL
-- 🔗 Create links to live Prometheus charts directly into each functions docstrings (with tooltips coming soon!)
+- 🔗 Create links to live Prometheus charts directly into each function's docstring
+- [🔍 Identify commits](#identifying-commits-that-introduced-problems) that introduced errors or increased latency
 - [🚨 Define alerts](#alerts--slos) using SLO best practices directly in your source code
 - [📊 Grafana dashboards](#dashboards) work out of the box to visualize the performance of instrumented functions & SLOs
 - [⚙️ Configurable](#metrics-libraries) metric collection library (`opentelemetry`, `prometheus`, or `metrics`)
@@ -112,7 +113,22 @@ def api_handler():
 Configure the crate that autometrics will use to produce metrics by using one of the following feature flags:
 
 - `opentelemetry` - (enabled by default, can also be explicitly set using the AUTOMETRICS_TRACKER="OPEN_TELEMETRY" env var) uses
-- `prometheus` -(using the AUTOMETRICS_TRACKER env var set to "PROMETHEUS")
+- `prometheus` - (using the AUTOMETRICS_TRACKER env var set to "PROMETHEUS")
+
+## Identifying commits that introduced problems
+
+> **NOTE** - As of writing, `build_info` will not work correctly when using the default tracker (`AUTOMETRICS_TRACKER=OPEN_TELEMETRY`).
+> This will be fixed once the following PR is merged on the opentelemetry-python project: https://github.com/open-telemetry/opentelemetry-python/pull/3306
+>
+> autometrics-py will track support for build_info using the OpenTelemetry tracker via #38
+
+Autometrics makes it easy to identify if a specific version or commit introduced errors or increased latencies.
+
+It uses a separate metric (`build_info`) to track the version and, optionally, git commit of your service. It then writes queries that group metrics by the `version` and `commit` labels so you can spot correlations between those and potential issues.
+
+The `version` is read from the `AUTOMETRICS_VERSION` environment variable, and the `commit` value uses the environment variable `AUTOMETRICS_COMMIT`.
+
+This follows the method outlined in [Exposing the software version to Prometheus](https://www.robustperception.io/exposing-the-software-version-to-prometheus/).
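
The env-var wiring the added README section describes might look roughly like this. `init_tracker`, `set_build_info`, and the environment variable names come from this PR; the function body and the empty-string defaults are assumptions:

```python
# Hypothetical sketch of reading AUTOMETRICS_VERSION / AUTOMETRICS_COMMIT
# and recording them at tracker initialization; not the actual implementation.
import os


def init_tracker(tracker: "TrackMetrics") -> "TrackMetrics":
    # Assumed behavior: fall back to empty strings when the env vars are unset.
    version = os.environ.get("AUTOMETRICS_VERSION", "")
    commit = os.environ.get("AUTOMETRICS_COMMIT", "")
    tracker.set_build_info(commit=commit, version=version)
    return tracker
```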
"Error Ratio URL": "http://localhost:9090/graph?g0.expr=sum%20by%20%28function%2C%20module%29%20%28rate%20%28function_calls_count_total%7Bfunction%3D%22myFunction%22%2Cmodule%3D%22myModule%22%2C%20result%3D%22error%22%7D%5B5m%5D%29%29%20/%20sum%20by%20%28function%2C%20module%29%20%28rate%20%28function_calls_count_total%7Bfunction%3D%22myFunction%22%2Cmodule%3D%22myModule%22%7D%5B5m%5D%29%29&g0.tab=0",
"Error Ratio URL": "http://localhost:9090/graph?g0.expr=sum%20by%20%28function%2C%20module%2C%20commit%2C%20version%29%20%28rate%20%28function_calls_count_total%7Bfunction%3D%22myFunction%22%2Cmodule%3D%22myModule%22%2C%20result%3D%22error%22%7D%5B5m%5D%29%20%2A%20on%20%28instance%2C%20job%29%20group_left%28version%2C%20commit%29%20%28last_over_time%28build_info%5B1s%5D%29%20or%20on%20%28instance%2C%20job%29%20up%29%29%20/%20sum%20by%20%28function%2C%20module%2C%20commit%2C%20version%29%20%28rate%20%28function_calls_count_total%7Bfunction%3D%22myFunction%22%2Cmodule%3D%22myModule%22%7D%5B5m%5D%29%20%2A%20on%20%28instance%2C%20job%29%20group_left%28version%2C%20commit%29%20%28last_over_time%28build_info%5B1s%5D%29%20or%20on%20%28instance%2C%20job%29%20up%29%29&g0.tab=0",