Sampling events
You can enable event sampling to reduce the number of events sent to Honeybadger. This is especially useful if you are hitting your daily event quota limit:
```python
from honeybadger import honeybadger

honeybadger.configure(events_sample_rate=50)  # Sample 50% of events
```

The `events_sample_rate` option accepts a whole percentage value between 0 and
100, where 0 means no events will be sent and 100 means all events will be sent.
The default is no sampling (100%).
Events are sampled by hashing the `request_id` if it is available in the event
payload; otherwise, a unique UUID is generated for that event. This deterministic
approach ensures that related events from the same request are consistently
sampled together.
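Conceptually, the sampling decision looks something like the following sketch. This illustrates hash-based sampling in general, not the library's exact implementation; the helper name and hash function are assumptions:

```python
import hashlib
import uuid

def should_sample(sample_rate, request_id=None):
    # Hash the request_id (or a random UUID when none is available) and keep
    # the event when the hash falls inside the sample-rate window. The same
    # request_id always produces the same decision.
    key = request_id or str(uuid.uuid4())
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % 100 < sample_rate
```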
Per-event sampling
The sample rate is applied to all events sent to Honeybadger Insights, including
automatic instrumentation events. You can also set the sample rate per event by
adding the `_hb` metadata with a `sample_rate` key:
```python
honeybadger.event("user_created", {
    "user_id": user.id,
    "_hb": {"sample_rate": 100},
})
```

The event sample rate can also be set within the `honeybadger.set_event_context`
function. This can be handy if you want to set an overall sample rate for a
thread or ensure that specific instrumented events get sent:
```python
# Set a higher sampling rate for this entire thread
honeybadger.set_event_context(_hb={"sample_rate": 100})

# Now all events from this thread, including automatic instrumentation,
# will use the 100% sample rate
```

Remember that event context is thread-local and applies to all events sent from
the same thread after the `set_event_context` call, until it's changed or the
thread terminates.
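For example, a background worker thread could opt in to full sampling without affecting events sent from other threads (a sketch; the worker function and event name are illustrative):

```python
import threading

from honeybadger import honeybadger

def worker():
    # Applies only to events sent from this thread
    honeybadger.set_event_context(_hb={"sample_rate": 100})
    honeybadger.event("job_started", {"queue": "default"})

threading.Thread(target=worker).start()
```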
Sample rate precedence
Sample rates are determined in the following order (highest to lowest priority):

1. Event data - `_hb` metadata passed directly to the event
2. Event context - `_hb` metadata set via `set_event_context()`
3. Global configuration - `events_sample_rate` config option
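For example, if all three are set, the `_hb` metadata on the event itself wins (a sketch; the event name and payload are illustrative):

```python
from honeybadger import honeybadger

# Global default: sample 25% of events
honeybadger.configure(events_sample_rate=25)

# Event context raises the rate to 50% for this thread
honeybadger.set_event_context(_hb={"sample_rate": 50})

# Event data takes precedence: this event is sampled at 100%
honeybadger.event("checkout_completed", {
    "order_id": "hypothetical-123",
    "_hb": {"sample_rate": 100},
})
```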
When setting sample rates below the global setting, be aware that this affects
how events with the same `request_id` are sampled. Since sampling is
deterministic based on the `request_id` hash, all events sharing the same
`request_id` will either all be sampled or all be skipped together. This ensures
consistency across related events.
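For example, two events emitted during the same request and tagged with the same `request_id` will share a sampling decision (a sketch; the event names and fields are illustrative):

```python
request_id = "req-hypothetical-123"

# Because both events carry the same request_id, they hash to the same
# value and are either both sent or both skipped.
honeybadger.event("cache_miss", {"request_id": request_id, "key": "user:42"})
honeybadger.event("db_query", {"request_id": request_id, "duration_ms": 12})
```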