[07:43:20] pfischer: if there's no emergency in getting this feature out we should let the train do its job imo
[07:44:01] we tend to backport only bugfixes
[08:39:55] looking at the last 14 days, "Search is currently too busy" errors are coming only from Commons (SpecialMediaSearch to be precise)
[08:46:22] and just realized that MediaSearch is also available on some other wikis (ptwikinews) https://pt.wikinews.org/wiki/Especial:MediaSearch?type=page
[09:46:20] hi folks, I've opened https://phabricator.wikimedia.org/T374916 re: categories lag/ping alerts, I'm not quite sure if I got the project tags right though, please adjust at will
[10:00:35] godog: thanks!
[10:06:24] lunch
[10:42:22] dcausse: sure np!
[13:48:52] \o
[13:48:58] o/
[14:19:01] godog: thanks! I'm adding Data Platform SRE as well
[14:57:07] Is it normal to see triple counts going down? I wonder if there's some type of purge happening? https://grafana.wikimedia.org/d/000000489/wikidata-query-service?orgId=1&var-cluster_name=wdqs&from=1724811131811&to=1726584934413
[14:58:34] Steady decrease over the last ~6 days. The pattern holds for wdqs-main as well, although not wdqs-scholarly, which probably lines up with some sort of non-scholarly data being purged
[15:01:08] Could it be related to the mul language code? Merging a bunch of identical labels across languages into one mul label would technically decrease the number of triples, but I don't know if it would be as many as the graph is showing... though I suppose "millions" is plausible.
[15:02:35] I imagine millions is likely; I don't think they would go through the trouble if it wasn't
[15:06:04] ryankemper: there have been a lot of discussions regarding the limitations of the infra since we announced the split (https://www.wikidata.org/wiki/Wikidata:Project_chat#Mass-import_policy), so I would not be surprised if the community started some cleanups and/or started to populate the mul language, as Trey suggested
[15:06:37] fun fact: the decrease is only visible on the main subgraph :)
[15:16:20] errand
[15:26:28] gehel: sweet, thank you!
[15:39:08] hmm, naming is hard... not a fan of SiteMatrixPrivateWikiClassifier, but I don't currently have better ideas :P
[15:58:59] workout, back in ~40
[16:54:06] back
[17:19:28] * ebernhardson mutters at package cycles... never obvious how to fix :P
[17:29:04] The triple count is decreasing. The databases are less overloaded. Nature is healing.
[17:34:34] lunch, back in ~40
[17:35:25] tax: :P
[18:30:25] inflatador: running 5' late
[18:30:50] ryankemper ACK, np
[18:40:01] ebernhardson I'm in pairing if interested. Nothing interesting, just looking at T374916
[18:40:02] T374916: Port Categories lag / ping checks to Prometheus/Alertmanager - https://phabricator.wikimedia.org/T374916
[18:40:19] inflatador: oh, I totally missed that while debugging something. Sec
[21:13:42] ryankemper +1'd https://gerrit.wikimedia.org/r/c/operations/alerts/+/1073533 but my OCD forced me to update the commit msg ;P
[22:48:31] * ebernhardson realizes after writing almost everything... that I have no clean way to delineate private-update-stream being required in the producer but optional in the consumer... for tomorrow :P
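
The triple-count decrease discussed at 14:57 could also be checked outside Grafana by hitting the Prometheus query_range API directly. The sketch below is minimal and assumes a hypothetical metric name `blazegraph_triples` and a hypothetical Prometheus endpoint; the real metric, label names, and host may differ.

```python
# Minimal sketch: quantify the triple-count change for a WDQS cluster
# over a time window, via the standard Prometheus HTTP API.
import requests

PROM = "https://prometheus.example.org/api/v1"  # hypothetical endpoint


def triple_count_range(cluster: str, start: int, end: int, step: str = "6h"):
    """Fetch the (assumed) triple-count series for one WDQS cluster."""
    resp = requests.get(f"{PROM}/query_range", params={
        # "blazegraph_triples" is an assumed metric name, not confirmed.
        "query": f'sum(blazegraph_triples{{cluster="{cluster}"}})',
        "start": start,
        "end": end,
        "step": step,
    })
    resp.raise_for_status()
    return resp.json()["data"]["result"]


# Same window as the Grafana link above (ms timestamps converted to seconds).
for series in triple_count_range("wdqs", 1724811131, 1726584934):
    values = [float(v) for _, v in series["values"]]
    print(f'{series["metric"]}: {values[0]:,.0f} -> {values[-1]:,.0f} '
          f'({values[-1] - values[0]:+,.0f})')
```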
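
On the mul hypothesis from 15:01: a toy illustration (not real Wikidata data) of why consolidating identical per-language labels into a single mul label shrinks the triple count.

```python
# One item whose label is spelled identically in several languages:
# each language-tagged label is its own triple in the RDF dump.
before = {
    ("wd:Q42", "rdfs:label", '"Douglas Adams"@en'),
    ("wd:Q42", "rdfs:label", '"Douglas Adams"@de'),
    ("wd:Q42", "rdfs:label", '"Douglas Adams"@fr'),
    ("wd:Q42", "rdfs:label", '"Douglas Adams"@es'),
}
# After consolidation, one mul-tagged label replaces all of them.
after = {
    ("wd:Q42", "rdfs:label", '"Douglas Adams"@mul'),
}
print(len(before) - len(after), "triples saved for one item")
# Multiplied across millions of items with duplicated labels, a drop of
# millions of triples on the main subgraph is plausible.
```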
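
For the 22:48 problem (private-update-stream required in the producer but optional in the consumer), one possible shape is a shared config with role-specific validation. This is a hypothetical sketch; the names and structure are illustrative and not taken from the actual codebase.

```python
# Hypothetical: one config shape shared by producer and consumer, with
# the required/optional distinction enforced per role at validation time.
from dataclasses import dataclass
from typing import Optional


@dataclass
class StreamConfig:
    name: str
    private_update_stream: Optional[str] = None  # optional by default


def validate(config: StreamConfig, role: str) -> None:
    """Enforce role-specific requirements on the shared config shape."""
    if role == "producer" and config.private_update_stream is None:
        raise ValueError("private-update-stream is required for producers")


validate(StreamConfig(name="updater"), role="consumer")  # passes
try:
    validate(StreamConfig(name="updater"), role="producer")
except ValueError as e:
    print(e)  # private-update-stream is required for producers
```

The tradeoff is that the requirement lives in runtime validation rather than the type itself; splitting into separate producer/consumer config types would move the check to construction time at the cost of some duplication.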