[08:32:53] 06Traffic, 10Data-Engineering (Q3 2025 January 1st - March 31st), 13Patch-For-Review: Migrate Benthos `webrequest_sampled_live` to feed from HAProxy data - https://phabricator.wikimedia.org/T390029#10685623 (10elukey) The upgrade went fine, but there is a big difference in behavior. From the Benthos graphs,...
[09:08:27] 06Traffic: Unify CDN ats/haproxy/varnish upgrade cookbooks - https://phabricator.wikimedia.org/T390094#10685679 (10Fabfur) Not much experience in writing cookbooks, but I like the approach of having an `__init__` with common code and minimal code in the specific cookbooks, but this should be well documented (or th...
[09:18:00] FIRING: PurgedHighEventLag: High event process lag with purged on cp4047:2112 - https://wikitech.wikimedia.org/wiki/Purged#Alerts - https://grafana.wikimedia.org/d/RvscY1CZk/purged?var-datasource=ulsfo%20prometheus/ops&var-instance=cp4047 - https://alerts.wikimedia.org/?q=alertname%3DPurgedHighEventLag
[09:18:00] FIRING: PurgedHighBacklogQueue: Large backlog queue for purged on cp4047:2112 - https://wikitech.wikimedia.org/wiki/Purged#Alerts - https://grafana.wikimedia.org/d/RvscY1CZk/purged?var-datasource=ulsfo%20prometheus/ops&var-instance=cp4047 - https://alerts.wikimedia.org/?q=alertname%3DPurgedHighBacklogQueue
[09:53:27] 06Traffic, 10Data-Engineering (Q3 2025 January 1st - March 31st): Migrate Benthos `webrequest_sampled_live` to feed from HAProxy data - https://phabricator.wikimedia.org/T390029#10685814 (10elukey) Worked nicely, no need to bump threads :)
[10:18:53] Hey! I have a question about edge caching and headers. Let's say PCS returns a response for a request (e.g. en.wikipedia.org/api/rest_v1/page/mobile-html/Earth) and a second response for the same request but with a different Accept-Language header; would both end up in the same cache entry at the edge?
[10:21:33] hi nemo-yiannis, let me check our vary header
[10:30:46] if I've searched correctly there's no vary on accept-language, which means the cache object *should* be the same for requests with different accept-language values
[10:31:04] `vary: Accept-Encoding,Cookie,Authorization` <- that's an example from curling en.w.o
[10:33:29] ok
[10:33:56] so in order to cache both responses we should send a vary on accept-language?
[10:34:49] if you have the exact same host/url that varies by language and you want separate cache objects, yes
[10:35:02] ok, makes sense, thanks for the clarification
[10:35:17] we have to do a little bit of testing though to check that nobody rewrites it "up the chain", but yes
[10:35:37] and when we purge a url, does that purge apply to all the cached variants of the same URL?
[10:38:01] that's something I still don't know, sorry; I'm looking it up, but someone who actually worked on the mw purge code is probably more reliable
[10:38:15] good question though
[10:40:17] i assume it purges variants too
[10:41:12] is this for a new service?
[10:49:18] no, this is for the PCS sunset from RESTBase. We are going to serve it behind the rest gateway and I am trying to figure out what we need to make sure language variants work
[11:17:17] what currently happens when we supply differing accept-language headers via restbase?
[11:25:33] this is what i am trying to figure out
[11:29:33] https://phabricator.wikimedia.org/P74492 i think when there are variants restbase returns `vary: accept-language`, but for example for enwiki it doesn't
[11:32:08] so we also need to add the vary header on PCS too
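The exchange above comes down to how the response's Vary header shapes the edge cache key: if `Accept-Language` is not listed in Vary, requests that differ only in that header collapse into one cache object. A minimal Python sketch of that idea, assuming a hypothetical `cache_key` helper rather than the actual Varnish/ATS keying logic:

```python
def cache_key(url: str, request_headers: dict, vary: str) -> tuple:
    """Build a cache key from the URL plus the request's values for every
    header named in the response's Vary header."""
    varied = tuple(
        (name.strip().lower(), request_headers.get(name.strip().lower(), ""))
        for name in vary.split(",")
        if name.strip()
    )
    return (url, varied)


url = "https://en.wikipedia.org/api/rest_v1/page/mobile-html/Earth"

# Current behaviour: Accept-Language is not in Vary, so both requests map
# to the same cache entry.
vary = "Accept-Encoding,Cookie,Authorization"
assert cache_key(url, {"accept-language": "sr-el"}, vary) == \
       cache_key(url, {"accept-language": "sr-ec"}, vary)

# With Accept-Language added to Vary, each language gets its own object.
vary = "Accept-Encoding,Cookie,Authorization,Accept-Language"
assert cache_key(url, {"accept-language": "sr-el"}, vary) != \
       cache_key(url, {"accept-language": "sr-ec"}, vary)
```

Anything not named in Vary never becomes part of the key, which is why, without `Vary: Accept-Language`, the second response would simply be served from (or overwrite) the first entry. Whether a purge for a URL also drops all of its variant entries is left open in the conversation above.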
[11:36:57] do the client requests set multiple accept-language values, do you know?
[11:37:22] yes they do
[11:37:40] looks like we ignore all but the first lang
[11:37:41] https://github.com/wikimedia/operations-puppet/blob/production/modules/varnish/templates/text-frontend.inc.vcl.erb#L299
[11:39:01] we have logic in PCS to parse multiple `accept-language` entries so i don't worry that much about it
[11:40:28] are you concerned about caching efficiency with some random weighted header like `accept-language: sr-el;q=0.8, sr-ec;q=0.2`?
[11:46:04] i am not sure if our main clients for ios/android set multiple languages
[11:46:10] i can check with the team
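For completeness, a rough sketch of the kind of weighted Accept-Language parsing mentioned above (PCS's actual implementation and the VCL behaviour may well differ; the function name here is illustrative only):

```python
from typing import Optional


def preferred_language(accept_language: str) -> Optional[str]:
    """Return the language tag with the highest q-value (default q is 1.0)."""
    choices = []
    for part in accept_language.split(","):
        part = part.strip()
        if not part:
            continue
        lang, _, params = part.partition(";")
        q = 1.0
        for param in params.split(";"):
            name, _, value = param.strip().partition("=")
            if name == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        choices.append((q, lang.strip()))
    choices.sort(key=lambda c: c[0], reverse=True)
    return choices[0][1] if choices else None


print(preferred_language("sr-el;q=0.8, sr-ec;q=0.2"))  # -> sr-el
print(preferred_language("sr"))                        # -> sr (implicit q=1.0)
```

If the iOS/Android clients only ever send a single language tag, the q-value handling is largely moot, which is what the team check mentioned at the end would confirm.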