[04:09:25] Done:
[04:09:26] • worked on https://phabricator.wikimedia.org/T383081 (persist maintain-harbor logs)
[04:09:26] Doing:
[04:09:26] • started working on https://phabricator.wikimedia.org/T317953 (add on-wiki edits of Toolforge tools to the toolviews report)
[04:09:26]   This already had some code written in Puppet. What I did was:
[04:09:26] • move the code from Puppet to the toolviews repo
[04:09:26] • re-architect the way it works and the data models so we can eventually expose the following endpoints in the toolviews API (or something similar):
[04:09:26]   * /api/v1/edits/wiki/all/tool/all/daily//
[04:09:27]   * /api/v1/edits/wiki/all/tool//daily//
[04:09:27]   * /api/v1/edits/wiki/all/tool/all/day/
[04:09:28]   * /api/v1/edits/wiki//tool/all/daily//
[04:09:28]   * /api/v1/edits/wiki//tool/all/day/
[08:00:11] Cteam: welcome to today 🦄! Don’t forget to post your update in thread.
[08:00:11] Feel free to include:
[08:00:11] 1. 🕫 Anything you'd like to share about your work
[08:00:11] 2. ☏ Anything you'd like to get help with
[08:00:11] 3.
⚠ Anything you're currently blocked on
[08:00:11] (this message is from a toolforge job under the admin project)
[14:18:49] * Got 80% of the way to ripping out labs-ip-aliaser and then discovered that we may still need it (at least in codfw1dev) T374129
[14:18:49] * Rebooted a bunch of NFS-suffering exec nodes
[14:18:49] * Got nova-api-metadata logs into ELK where they belong
[19:18:02] Done:
[19:18:02] * [intern] interview for the intern position
[19:18:02] * [ceph] finally added QoS to Ceph \o/; currently only cloudcephosd1040 and 1038 have it, will enable it fleet-wide on Monday
[19:18:03] * [ceph] drained cloudcephosd1012 to reimage it, but the reimage did not work :/
[19:18:03] * [toolforge,hypotheses] worked on the new hypotheses for the quarter
[19:18:03] Doing:
[19:18:03] * [code reviews] I'm sorry again Raymond :/, I'll spend a few hours Monday morning exclusively on them
[19:18:04] * [jobs-emailer] merge the patches to enable monitoring
[19:18:04] * [toolforge,bastion container] I have to retake this, now that we can load configs from the environment for the Toolforge CLIs
[19:18:07] * [openstack,nova-api-metadata] have to take a look at why there are no logs in Logstash for that service
[19:18:07] * [dell,ceph hard drives] we got a reply yesterday afternoon asking for some info; have to gather it and reply back
[19:18:07] * [nova-api-metadata] either investigate why the logs were not in Logstash or close the task on Monday
[19:18:55] * [nfs hiccup] finish adding logs there; I did not find any issues today when draining the cloudceph node, so no new info there
[19:18:58] Blockers:
[19:19:01] * None
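The endpoint scheme in the first update (wiki/tool/day segments where `all` acts as a wildcard) implies an aggregation over a (wiki, tool, day) keyed data model. A minimal sketch of that aggregation, with hypothetical data and function names that are not from the actual toolviews code:

```python
from datetime import date

# Hypothetical data model: (wiki, tool, day) -> count of on-wiki edits
# made by that Toolforge tool on that wiki that day.
EDITS = {
    ("enwiki", "mytool", date(2025, 1, 6)): 12,
    ("enwiki", "othertool", date(2025, 1, 6)): 3,
    ("frwiki", "mytool", date(2025, 1, 6)): 5,
}


def count_edits(wiki=None, tool=None, day=None):
    """Aggregate edit counts; a None argument plays the role of the
    'all' path segment in routes like /api/v1/edits/wiki/all/tool/all/day/<day>.
    """
    return sum(
        n
        for (w, t, d), n in EDITS.items()
        if (wiki is None or w == wiki)
        and (tool is None or t == tool)
        and (day is None or d == day)
    )
```

Under this sketch, `/api/v1/edits/wiki/all/tool/all/day/<day>` would serve `count_edits(day=...)`, and `/api/v1/edits/wiki/<wiki>/tool/all/day/<day>` would serve `count_edits(wiki=..., day=...)`; the `daily` variants would group the same query by day instead of summing it.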