[08:00:17] Cteam: welcome to today 🦄! Don't forget to post your update in thread.
[08:00:17] Feel free to include:
[08:00:17] 1. 🕫 Anything you'd like to share about your work
[08:00:17] 2. ☏ Anything you'd like to get help with
[08:00:17] 3. ⚠ Anything you're currently blocked on
[08:00:17] (this message is from a toolforge job under the admin project)
[11:11:08] Done:
[11:11:08] * Nothing
[11:11:08] Doing:
[11:11:08] * Looking into the harbor DB growing without bound (https://phabricator.wikimedia.org/T356037); started cleaning things up, broke harbor login for a few minutes in the process. Almost finished.
[11:11:08] * Will try to rebuild the toolforge images to allow envvars on the lighttpd-based images
[11:11:09] * Continue working on the toolforge backlog
[11:11:09] Blockers:
[11:11:10] * Nothing
[11:11:57] Also doing: investigating connection reset errors from toolforge k8s for a few tools (https://phabricator.wikimedia.org/T356164)
[13:14:00] done:
[13:14:00] * toolforge mail things. I have patches in review that would make us meet all of the new Google requirements. https://phabricator.wikimedia.org/T354112
[13:14:00] doing:
[13:14:00] * seeing if I can make any sense of the reported toolforge k8s network issues
[13:14:00] blockers:
[13:14:00] * FOSDEM (in the sense that attending takes some time)
[16:21:52] Done:
[16:22:10] * T356177 debugged lima-kilo on my M1 Mac and (maybe) found a fix
[16:22:20] Next:
[16:22:27] * back to T344717 toolsdb replicas
[16:26:00] Today: another short day for me; I don't expect to get into much beyond correspondence and meetings. Still looking at designate logs to see why we leak records when deleting k8s workers (but not when deleting other VMs, somehow).
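
A note on the 11:11 Harbor item: Harbor keeps its metadata in PostgreSQL, so one quick way to see what is driving the growth in T356037 is to rank tables by on-disk size. A minimal sketch, assuming psycopg2 is available; the host, database name, and credentials below are placeholders, not the real Toolforge values.

```python
"""List the largest tables in a Harbor PostgreSQL database.

Debugging sketch for a "database keeps growing" investigation: ranks
user tables by total on-disk size (data + indexes + TOAST).
Connection settings are placeholders, not real Toolforge values.
"""
import psycopg2

QUERY = """
SELECT relname,
       pg_size_pretty(pg_total_relation_size(relid)) AS total_size
FROM pg_catalog.pg_statio_user_tables
ORDER BY pg_total_relation_size(relid) DESC
LIMIT 20;
"""


def main() -> None:
    conn = psycopg2.connect(
        host="harbor-db.example",  # placeholder
        dbname="registry",         # Harbor's default database name (assumption)
        user="harbor",             # placeholder
        password="changeme",       # placeholder
    )
    try:
        with conn.cursor() as cur:
            cur.execute(QUERY)
            for name, size in cur.fetchall():
                print(f"{size:>10}  {name}")
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```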
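
For the connection-reset reports mentioned at 11:11:57 and 13:14 (T356164), one low-effort way to gather data is a probe that hits a tool's public URL in a loop and logs how often the connection fails, so failures can be lined up against server-side logs. A minimal sketch using the requests library; the URL, interval, and attempt count are placeholders.

```python
"""Repeatedly probe a tool URL and count connection errors.

A crude way to quantify intermittent "connection reset" failures:
request the endpoint at a fixed interval and log every failure with a
timestamp. URL and timing values are placeholders.
"""
import time
from datetime import datetime, timezone

import requests

URL = "https://some-tool.toolforge.org/healthz"  # placeholder endpoint
INTERVAL_SECONDS = 2
ATTEMPTS = 500


def main() -> None:
    failures = 0
    for i in range(ATTEMPTS):
        stamp = datetime.now(timezone.utc).isoformat()
        try:
            resp = requests.get(URL, timeout=10)
            print(f"{stamp} attempt={i} status={resp.status_code}")
        except requests.exceptions.ConnectionError as exc:
            # Connection resets surface here (ECONNRESET wrapped by urllib3).
            failures += 1
            print(f"{stamp} attempt={i} CONNECTION ERROR: {exc}")
        time.sleep(INTERVAL_SECONDS)
    print(f"{failures}/{ATTEMPTS} attempts failed")


if __name__ == "__main__":
    main()
```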
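
On the 13:14 mail item (T354112): Google's bulk-sender rules require, among other things, that the sending domain publish SPF and DMARC records. The DNS-visible part of that can be self-checked with dnspython. A minimal sketch; the domain is a placeholder, and DKIM is not covered here because checking it requires knowing the selector.

```python
"""Check that a sending domain publishes SPF and DMARC TXT records.

Covers only the DNS-visible part of Google's sender requirements.
The domain below is a placeholder.
"""
import dns.resolver

DOMAIN = "example.org"  # placeholder sending domain


def txt_records(name: str) -> list[str]:
    """Return the TXT strings published at `name`, or [] if none."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []
    return [
        b"".join(rdata.strings).decode("utf-8", "replace")
        for rdata in answers
    ]


def main() -> None:
    spf = [r for r in txt_records(DOMAIN) if r.startswith("v=spf1")]
    dmarc = [
        r for r in txt_records(f"_dmarc.{DOMAIN}")
        if r.lower().startswith("v=dmarc1")
    ]
    print(f"SPF:   {'OK ' + spf[0] if spf else 'MISSING'}")
    print(f"DMARC: {'OK ' + dmarc[0] if dmarc else 'MISSING'}")


if __name__ == "__main__":
    main()
```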
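
And for the T344717 toolsdb replicas item at 16:22: when bringing up or watching a replica, the usual health signals are the replication threads and the lag. A minimal sketch with pymysql, using the long-standing `SHOW SLAVE STATUS` statement (newer MariaDB also accepts `SHOW REPLICA STATUS`); host and credentials are placeholders, not real ToolsDB values.

```python
"""Print replication health for a MariaDB/MySQL replica.

Reads SHOW SLAVE STATUS and reports whether the I/O and SQL threads
are running and how far the replica lags behind the primary.
Host and credentials are placeholders.
"""
import pymysql


def main() -> None:
    conn = pymysql.connect(
        host="toolsdb-replica.example",  # placeholder
        user="monitor",                  # placeholder
        password="changeme",             # placeholder
        cursorclass=pymysql.cursors.DictCursor,
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SHOW SLAVE STATUS")
            status = cur.fetchone()
            if status is None:
                print("not configured as a replica")
                return
            print("IO thread running: ", status["Slave_IO_Running"])
            print("SQL thread running:", status["Slave_SQL_Running"])
            print("Seconds behind:    ", status["Seconds_Behind_Master"])
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```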