[07:46:08] this is going to be patches-applied vs patches-unapplied, I suspect
[09:11:30] <_joe_> Emperor: uh wdym?
[09:11:49] <_joe_> I was curious if you had any idea what was working as not-expected :)
[09:12:08] <_joe_> (as designed, ofc, is possible, but I was also surprised)
[09:13:41] Broadly, if you clone some random package's repository, you might get "source code with Debian patches applied" or "source code without Debian patches applied" [or some other more obscure things]. This is why the Debian packaging docs on wikitech say "just use dgit clone". But the effect of this is that if you don't know which you got (and use the appropriate build runes) you may end up building the patched source or the unpatched source.
[09:14:27] The merit of the dgit-based approach written up on wikitech is that you don't have to care about this, nor about maintaining debian/patches, because dgit does all that for you
[09:15:51] There is tooling Debian developers use for managing the patch stack and switching between applied and unapplied, but honestly it's a pain and I long since stopped bothering for my packages, because I can store patches in git, so why also store them in the source tree separately and make pain for myself?
[09:16:01] [but this is not uncontroversial within Debian]
[09:18:14] <_joe_> I love quilt :D
[09:18:21] I've not yet looked at sukh.e's package (I'm aiming to get to it before their day starts), but I suspect that's what it is
[09:18:40] My love for quilt is unconfined. In the lower bound
[09:19:30] <_joe_> yeah I got that
[09:19:51] XD
[10:39:36] heads-up oncall: I'm going to start decommissioning the jobrunner (*not* the currently-used mw-jobrunner service) and videoscaler services. They both have paging disabled, but no guarantees about hidden gremlins
[10:40:06] <_joe_> 👺
[12:40:20] (having looked at the build, it looks like the source tree is already patched and gets built thus, but I may be missing something)
[12:43:30] Emperor: I was building some custom patches against OpenSSL 3.0.15 and they were applying correctly, when they should have been failing. That's how I discovered it.
[12:43:45] using the same repo on the build host, I get failures as expected
[12:44:05] in this case, I got a clean build because the patches in debian/ were not being applied, and the output does not reflect that, unless I am misreading it?
[12:45:23] I guess, shouldn't it explicitly say that the patches are being applied, as it does in, say, the varnish repo?
[12:45:59] https://gitlab.wikimedia.org/repos/sre/varnish/-/jobs/477093/raw
[12:46:03] dpkg-source: info: using patch list from debian/patches/series
[12:46:26] I'm confused, sorry. AFAICT the tip of the bookworm-wikimedia branch of openssl-ech has all the patches in debian/patches applied to the source code. So when it is built, it's the already-patched source code that's being built
[12:46:49] So no, I wouldn't expect the build to say "Applying patches", because the source code is already patched, which is what wmf-debci expects
[12:47:29] Emperor: I am confused too, that should be true for the varnish repo as well though?
[12:48:11] and also, I guess, I had a custom patch that was never applied, so I had a clean build when I should not have. I can get that up again; it's not in the source code
[12:49:25] I nuked that repo to start again, so that's on me, but it had a custom ech.patch that was failing on the build host
[12:49:28] https://gitlab.wikimedia.org/repos/sre/openssl-ech/-/jobs/479286/raw
[12:49:31] dpkg-buildpackage: info: source changed by Sukhbir Singh
[12:49:39] and no mention of the custom patch
[12:49:47] I can get it up again shortly and we can try again perhaps
[12:50:26] sukhe: the varnish repo looks to be patches-unapplied (which, err, is not how you're meant to do it)
[12:51:05] The point of the dgit-based workflow is that you only ever deal with patched source code and never have to worry about debian/patches
[12:52:04] So if you want to change the source yourself, you should just change the source and commit (cf https://wikitech.wikimedia.org/wiki/Debian_packaging_with_dgit_and_CI#Making_Changes )
[12:54:48] (nb again, here you're building from already-patched source code, so you wouldn't expect to see dpkg-source doing any patching when you build)
[12:57:33] Emperor: ok, I will get my ech example up again and revisit it. thanks for digging into it!
[13:06:03] NP; if the documentation could be improved to make this clearer, do feel free to make edits/suggestions :)
[13:08:14] Emperor: mostly it's me I think, carrying over the workflow from how I have built packages so far
[13:20:24] Sure. Aside: you don't need to worry about quilt patches for any changes you make, because we never build a source package (if we were to do that, dgit can do it for you); we just build binary .debs from the patched source
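A minimal sketch of the patches-applied flow described above, assuming standard dgit and dpkg-buildpackage usage; the package name, suite and edited file are illustrative, not taken from the wikitech page:

    # Clone via dgit: the working tree comes with debian/patches already applied
    dgit clone openssl bookworm
    cd openssl

    # Hack on the (already patched) source directly and just commit;
    # no quilt or debian/patches bookkeeping needed
    $EDITOR ssl/ssl_lib.c
    git commit -a -m 'local ECH tweak'

    # A binary-only build never generates a source package, so dpkg-source
    # has no patching to do at build time
    dpkg-buildpackage -b -us -uc

This matches the point above that only binary .debs are ever built from the already-patched tree.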
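One way to see which flavour a given CI build used is to look for the dpkg-source patching step in the raw job log; the job URLs are the ones pasted above, and the grep is just one possible check:

    # varnish (patches-unapplied): dpkg-source applies the series during the build
    curl -s https://gitlab.wikimedia.org/repos/sre/varnish/-/jobs/477093/raw \
        | grep 'using patch list from debian/patches/series'

    # openssl-ech (patches-applied): the line should be absent, because the tree
    # was already patched before the build started
    curl -s https://gitlab.wikimedia.org/repos/sre/openssl-ech/-/jobs/479286/raw \
        | grep 'using patch list from debian/patches/series' || echo 'no patching step'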
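For completeness, the patch-stack juggling that the dgit workflow avoids looks roughly like this in a patches-unapplied tree, using stock quilt with the usual Debian configuration; the patch and file names are made up:

    # Debian convention: the series file lives under debian/patches
    export QUILT_PATCHES=debian/patches

    quilt push -a                      # switch the tree to "patches applied"
    quilt new local-ech-change.patch   # start a new patch on top of the stack
    quilt add ssl/ssl_lib.c            # register the file before editing it
    $EDITOR ssl/ssl_lib.c
    quilt refresh                      # write the diff into debian/patches/
    quilt pop -a                       # switch back to "patches unapplied"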
[14:08:56] hello on-callers, as an FYI I just updated tegola on Wikikube (part of the maps cluster) with https://gerrit.wikimedia.org/r/c/operations/deployment-charts/+/1133142, so it uses a more resilient kafka config
[14:09:06] there is a cronjob that periodically contacts kafka etc.
[14:09:15] if any issues arise, lemme know
[15:23:28] just a heads-up, removing jobrunner as a service broke two spec tests, fixing now.
[15:25:09] oh, that fun again!
[15:28:41] <_joe_> hnowlan: have you ever tried to remove the limestone sediments from your shower? if so, this work must feel familiar :)
[15:29:44] hah
[15:30:26] here's the fix: https://gerrit.wikimedia.org/r/c/operations/puppet/+/1135056
[15:31:59] thanks!
[15:32:41] I am going to upgrade changeprop's jobqueue docker image in eqiad, the last one of the series
[15:33:00] all the other ones have been running fine for some days (nodejs20 + librdkafka bump)
[15:33:35] sgtm
[15:43:17] the usual turmoil in the metrics, but it looks to be working; we have some errors emitted when the pod starts (failed to connect to redis and unknown topics)
[15:43:27] but very few, and they stop in a few seconds
[15:43:52] I'll keep it monitored
[17:57:09] does anyone know if we create any custom Puppet facts from netbox?
[17:59:32] inflatador: I don't think so, but probably some more context would be helpful on what you're trying to do. Maybe there are other ways
[18:00:32] volans: I was looking at Jesse's ticket about the use of regexes in Puppet https://phabricator.wikimedia.org/T389932#10722979 ... I was thinking that if we could get row/rack info into facts (or some other place like etcd), maybe we could get rid of our gross regexes
[18:02:25] we do have rows and racks already in hiera
[18:03:02] profile::netbox::host::location
[18:04:55] nice! so if I import that profile to my role, will it automatically look up the rack/row from netbox?
[18:05:31] no, the data is exported from netbox into hiera; it's accessible like any other hiera key with lookup
[18:07:14] I do remember seeing the "Bare Metal host on site ${location['site']} and rack ${location['rack']}" verbiage on hosts. So it sounds like this has potential, thanks!
[18:08:05] check the usage of profile::netbox::host::location in puppet for some examples
[18:12:50] yes, when we added that data from netbox it was exactly to help remove all the hardcoded info for the rack/row-aware services
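As a rough illustration of the hiera data volans points at, the key can be resolved in a node's context from a puppetserver; this assumes puppet lookup works for it as for any other hiera key, and the hostname is purely an example:

    # Resolve the netbox-exported location data for one host
    sudo puppet lookup --node some-host1001.eqiad.wmnet \
        profile::netbox::host::location --render-as json

Inside a profile the same key is read with a normal hiera lookup, as the existing users of profile::netbox::host::location in the puppet repo show.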
[19:50:39] just a heads-up that I merged a change to regex.yaml: https://gerrit.wikimedia.org/r/c/operations/puppet/+/1134765 . Elastic/cirrussearch are happy, but ping me if you notice any problems
[21:40:38] heads up that ryankemper and I are starting the elastic-to-opensearch migration in CODFW. CODFW is completely disabled in mwconfig, so no impact is expected. more details in T388610 if interested
[21:40:39] T388610: Migrate production Elastic clusters to Opensearch - https://phabricator.wikimedia.org/T388610
[22:04:55] heads up that I got some diffs for wikikube-worker2142 when I ran the rename cookbook; it's been changed to 'failed' status in netbox
[22:26:21] We renamed elastic2087 -> cirrussearch2087 via the rename cookbook. The cookbook finished successfully, but we're still getting NXDOMAIN for cirrussearch2087.codfw.wmnet; any suggestions?
[22:27:14] I guess it's negative TTL... I can get a response from eqiad, but not from the CODFW resolver
[22:27:55] clear the caches if required: https://wikitech.wikimedia.org/wiki/DNS#Wipe_caches
[22:28:28] thanks sukhe, was about to ask if it was OK to run that
[22:29:28] that did the trick! Thanks
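On the NXDOMAIN above: the rename left a negative answer cached on the recursor, and negative answers stick around for the zone's negative TTL. A rough sketch of checking and clearing it, assuming the recursors are PowerDNS Recursor (rec_control is its standard control tool); the wikitech "Wipe caches" page linked above is the canonical procedure:

    # From an affected host: still NXDOMAIN until the cache expires or is wiped
    dig cirrussearch2087.codfw.wmnet

    # On the recursor itself, drop the cached entry for just that name
    sudo rec_control wipe-cache cirrussearch2087.codfw.wmnet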