[01:45:10] the initial interval is randomized between 0 and 60s so the chance of a short request being sampled is proportional to its runtime
[01:50:42] normal GET requests have a request timeout of 60s so the number of samples per request is normally 0 or 1
[01:54:27] the aim is to provide a uniform probability density over that interval
[15:01:21] https://techblog.wikimedia.org/2021/03/03/profiling-php-in-production-at-scale/#user-content-how-profiling-can-be-expensive
[15:01:22] https://gerrit.wikimedia.org/g/mediawiki/php/excimer/+/141ea6d3c873b63a7453983d602d98bfc2603e7b/excimer.c#652
[15:01:27] ori: ^ some details :)
[15:02:04] oops, wrong anchor, the thing about random start is in https://techblog.wikimedia.org/2021/03/03/profiling-php-in-production-at-scale/#user-content-at-last-we-collect-a-sample
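
A minimal sketch of the random-start idea discussed above (not Excimer's actual implementation; see the excimer.c link for that), assuming a POSIX timer: the first expiry is drawn uniformly from [0, 60s) and subsequent expiries fire every 60s, so a request lasting d seconds (d < 60) is sampled with probability d/60, i.e. proportional to its runtime.

```c
#include <signal.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#include <unistd.h>

#define PERIOD_NS 60000000000LL   /* 60 s sampling period, as in the log above */

static void on_sample(int sig)
{
    (void)sig;  /* a real profiler would record a stack trace here */
}

int main(void)
{
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_handler = on_sample;
    sigaction(SIGALRM, &sa, NULL);

    /* First expiry: uniform random offset in [0, 60s), so a request of
     * duration d < 60 s is hit with probability d / 60. */
    srand((unsigned)time(NULL));
    long long initial_ns = (long long)(rand() / (RAND_MAX + 1.0) * PERIOD_NS);
    if (initial_ns == 0)
        initial_ns = 1;  /* an it_value of zero would disarm the timer */

    struct sigevent ev;
    memset(&ev, 0, sizeof ev);
    ev.sigev_notify = SIGEV_SIGNAL;
    ev.sigev_signo  = SIGALRM;

    timer_t timer;
    timer_create(CLOCK_MONOTONIC, &ev, &timer);

    struct itimerspec its;
    memset(&its, 0, sizeof its);
    its.it_value.tv_sec     = initial_ns / 1000000000LL;  /* randomized start */
    its.it_value.tv_nsec    = initial_ns % 1000000000LL;
    its.it_interval.tv_sec  = PERIOD_NS / 1000000000LL;   /* then every 60 s */
    its.it_interval.tv_nsec = PERIOD_NS % 1000000000LL;
    timer_settime(timer, 0, &its, NULL);

    pause();  /* toy main loop: a 60 s web request would see at most one sample */
    return 0;
}
```

Because the offset is uniform over the 60s period, the sampling probability density is flat across the interval, which is the property described at [01:54:27].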