[09:24:03] dcaro: I noticed the runbook test in CI for alerts.git sometimes fails, looks like the runbooks linking to google docs sometimes 401 :( have you run into this before? https://integration.wikimedia.org/ci/job/alerts-pipeline-test/1354/console
[09:24:17] I'm not sure why it doesn't always fail tho
[09:26:13] I guess sometimes something changes on the google side and we get a 401 instead of 200
[09:26:22] godog: I have not seen it before, but yep, that doc is not public, I would expect it to fail every time
[09:27:46] mmhh ok, I'm tempted to check the final url and if it is docs.google.com then we can tolerate a 401
[09:28:07] interesting
[09:28:10] https://www.irccloud.com/pastebin/EiOnIRIJ/
[09:28:35] I know right? wild
[09:29:21] I am running to lunch, will read later
[09:29:37] dcaro: if you have other ideas/suggestions please let me know! I'll check the scrollback
[09:34:27] hmm, when it fails, the url it gets stuck on is different (it's the one that bitly redirects to):
[09:34:29] https://www.irccloud.com/pastebin/KPWhSkTM/
[09:34:44] while when it passes the url is already the sign-in page
[09:34:47] https://www.irccloud.com/pastebin/L5aH4PdR/
[09:36:06] yep, for some reason google sometimes replies with 301 or 401 for the first preview request:
[09:36:09] https://www.irccloud.com/pastebin/TjNGEXQK/
[09:38:28] my browser seems to always get 401 though
[09:40:52] hmm, the bitly response though already has a webpage in it, with some stuff
[09:51:33] it's google itself for sure, I made two requests (disallowing redirects) directly to the google url and I get that behavior, ~25% of the time I get 401. The headers don't help a lot that I can see
[09:51:36] https://www.irccloud.com/pastebin/KX5jrZqF/
[09:53:03] Given that the runbook is private, I'd recommend just skipping the check that the response is 200 and only failing on 404 for that specific one, maybe flag it as 'private' or similar
[09:53:13] * dcaro lunch
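A rough sketch of the tolerance godog floats above (check the final URL and accept a 401 when it lands on docs.google.com). This is not the actual alerts.git pipeline code, and the runbook URL below is a placeholder, not the real shortlink from the alert rules:

```bash
#!/usr/bin/env bash
# Sketch: treat 401/403 from a private Google Doc as "exists but not public"
# rather than as a broken runbook link.
url='https://bit.ly/PLACEHOLDER'  # placeholder, not the real runbook link

# Follow redirects and capture the final status code plus the final URL.
read -r code final <<< "$(curl -s -o /dev/null -L -w '%{http_code} %{url_effective}' "$url")"

if [[ "$code" == 200 ]]; then
    echo "OK: $url"
elif [[ "$final" == *docs.google.com* && ( "$code" == 401 || "$code" == 403 ) ]]; then
    # Google sometimes answers 401 for the first preview request instead of
    # redirecting to the sign-in page, so don't fail the check for that.
    echo "OK (private Google doc): $url"
else
    echo "BROKEN ($code): $url"
    exit 1
fi
```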
[10:30:03] Does the VPS serving beta cluster wikis have a stable IP address? Asking because we may try to tell WMF's Gmail config about it, to help with mail delivery problems.
[10:46:17] kostajh: the beta cluster has a mail server using a floating IP which should be relatively stable, but iirc beta has been misconfigured for years to not use it and to route mail using the generic cloud vps mail servers instead
[10:47:06] taavi: ok. any suggestions on what to do? enwiki beta seems to sometimes deliver mail to @wikimedia.org addresses, and other times not.
[10:49:27] kostajh: which domain is it sending from? beta.wmflabs.org?
[10:50:19] yeah, it is coming from wiki@wikimedia.beta.wmflabs.org
[10:50:21] (wikimedia.beta.wmflabs.org according to https://gerrit.wikimedia.org/g/operations/mediawiki-config/+/46ddca3b60a29a9a8a8b133858d2406662bcbb15/wmf-config/CommonSettings-labs.php#29)
[10:50:58] Some emails are making it to @wikimedia.org domains
[10:52:32] so beta.wmflabs.org has an SPF policy permitting a toolsbeta proxy host to send mail and softfailing everything else, wikimedia.beta.wmflabs.org does not have an SPF policy
[10:52:58] [Striker bug] Hello, internet! Who could review a Striker patch in https://phabricator.wikimedia.org/T348131 ?
[10:53:54] kostajh: my general advice would be to assign a floating IP to the mail server in deployment-prep, configure SPF records to pass traffic from that floating IP, and then fix the mail relay puppetization to use the correct mail server
[10:55:25] andre: me probably. I left a comment on the task
[10:55:41] * kostajh taavi: thanks. I have no idea how to do any of those things, unfortunately. Would you be able to put this into a task for someone to potentially work on?
[10:55:53] (oops, sorry for the italics)
[10:56:06] I don't understand why sometimes messages are delivered, and other times not.
[11:06:02] taavi, yay, thanks for the quick reply; I tried to elaborate over there
[11:10:13] kostajh: https://phabricator.wikimedia.org/T343925#9264468
[11:13:22] essentially mail delivery to big providers like google is a black box. there are some standards like SPF that generally help with any provider, and the big ones use them as part of their algorithms to decide which mail gets through and which does not
[11:14:28] (the smaller providers tend to have much more predictable and fixable rules, although beta doesn't seem to have even the bare minimum config that generally gets through most small providers' filters)
[11:16:40] thanks taavi
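The SPF situation taavi describes can be checked with plain DNS queries. The snippet below is only an illustration; the example SPF record and IP address at the end are made up, not the real deployment-prep values:

```bash
# Look up the published SPF policies (TXT records) for the two sender domains
# mentioned above; per the discussion, only beta.wmflabs.org has one.
dig +short TXT beta.wmflabs.org
dig +short TXT wikimedia.beta.wmflabs.org

# The fix described above would roughly amount to publishing a record like the
# following for the sending domain, where 203.0.113.10 stands in for the
# (hypothetical) floating IP assigned to the deployment-prep mail server:
#   wikimedia.beta.wmflabs.org.  TXT  "v=spf1 ip4:203.0.113.10 ~all"
```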
[11:18:16] dcaro: thank you for your help with debugging, I've bandaided the issue here https://gerrit.wikimedia.org/r/c/operations/alerts/+/967144
[12:34:20] !log tools.wikibugs Updated channels.yaml to: cd85fd51baf38429449ea72d88330a1c2d7c249f Send Fresnel notifications to Release Engineering
[12:34:23] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.wikibugs/SAL
[12:37:34] !log tools.wikibugs Updated channels.yaml to: cd3ece63c7cb70143297fea0343d1fa8e5284c9f Remove #wikimedia-quibble (already in #wikimedia-releng)
[12:37:36] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.wikibugs/SAL
[12:48:03] !log tools flush queued webgrid jobs that had been waiting in the queue since the nfs issues last week
[12:48:06] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL
[12:59:44] !log paws Bump urllib3 from 1.26.16 to 1.26.17
[12:59:48] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Paws/SAL
[14:02:04] !log tools.iw Restrict redirected hostnames (T345784)
[14:02:07] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.iw/SAL
[14:25:56] !log tools.canary disable tool. looks like a precursor of modern day toolschecker
[14:25:59] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.canary/SAL
[14:40:39] 38
[14:40:42] uff
[14:40:43] :)
[14:44:12] 39
[14:53:46] !log paws Bump urllib3 from 1.26.17 to 1.26.18
[14:53:49] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Paws/SAL
[15:37:04] !log paws Bump jupyterlab version T349203
[15:37:08] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Paws/SAL
[15:37:09] T349203: jupyterlab to 4.0.7 - https://phabricator.wikimedia.org/T349203
[18:38:49] Hello. I'm having an annoying problem with mysqlclient recently. I'm unable to pip install it. It fails with the error "Getting requirements to build wheel did not run successfully". I tried with Python 3.11, then 3.9, without success
[18:49:20] hm, the `/usr/lib/x86_64-linux-gnu/pkgconfig/mysqlclient.pc` pkg-config file for “mysqlclient” (actually mariadbclient with compat) is installed in the container
[18:49:23] but `pkg-config` isn’t
[18:50:04] I wonder why… did debian kick it out of build-essentials?
[18:53:08] you can work around it with `MYSQLCLIENT_CFLAGS='-I/usr/include/mariadb' MYSQLCLIENT_LDFLAGS='-L/usr/lib/x86_64-linux-gnu/' pip install mysqlclient`
[18:53:20] but it’s probably worth filing a phab task too
[19:01:11] Thank you, @lucaswerkmeister. Will do that right away!
[19:15:11] added a few more details to T349341
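A compact way to confirm the diagnosis and apply the workaround from the exchange above inside the affected container. The paths and environment variables are the ones quoted in the log; the apt-get line at the end is an assumption (it presumes root access in the image):

```bash
# Confirm the diagnosis: the .pc file is present, but pkg-config itself is not.
ls -l /usr/lib/x86_64-linux-gnu/pkgconfig/mysqlclient.pc
command -v pkg-config || echo "pkg-config is not installed"

# Workaround from the discussion: pass the MariaDB compiler/linker flags
# explicitly so mysqlclient's build does not need pkg-config at all.
MYSQLCLIENT_CFLAGS='-I/usr/include/mariadb' \
MYSQLCLIENT_LDFLAGS='-L/usr/lib/x86_64-linux-gnu/' \
pip install mysqlclient

# Alternative (assumes root/apt access in the image): install pkg-config
# so the build can discover the flags on its own.
# apt-get update && apt-get install -y pkg-config
```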