[11:20:07] !log admin apply urpf strict filter to eqiad cloud-hosts vlan - T285461
[11:20:10] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Admin/SAL
[11:20:11] T285461: Review filtering for cloud-hosts on CR routers eqiad - https://phabricator.wikimedia.org/T285461
[15:17:52] Hi, I'm working on T304433 and have just created two new hosts in deployment-prep
[15:17:53] T304433: Upgrade event platform related VMs in deployment-prep to Debian bullseye (or buster) - https://phabricator.wikimedia.org/T304433
[15:18:03] neither of them can currently run puppet
[15:18:19] can you be more specific on which errors you're getting?
[15:18:24] ...
[15:18:33] self signed certificate in certificate chain for /CN=Puppet CA: deployment-puppetmaster03.deployment-prep.eqiad.wmflabs
[15:18:52] it's been years since I made new nodes in deployment-prep
[15:19:09] so I am not up to date on the current process (and yes, I know deployment-prep isn't officially maintained by Cloud Services :) )
[15:19:40] i don't know why it has deployment-puppetmaster03, afaict that node doesn't exist
[15:19:59] and, on deployment-puppetmaster04, sudo puppet ca list (is that even the right command anymore?) doesn't show my node needing to be signed
[15:20:05] that's happening because deployment-prep has its own self-hosted puppet master, you need to follow https://wikitech.wikimedia.org/wiki/Help:Standalone_puppetmaster#Step_2:_Setup_a_puppet_client starting on step 2
[15:20:16] one of the new nodes:
[15:20:23] deployment-eventstreams-2.deployment-prep.eqiad1.wikimedia.cloud
[15:20:58] right, and step one should be done in project puppet
[15:21:01] that hiera is set there
[15:21:16] ah okay, following step 2 then, ty
[15:21:51] yep, that re-creation of the certs (what taavi linked) should help there, yes
[15:22:28] ty, had done that before but didn't realize it was necessary in deployment-prep with the defaults there.
[15:22:41] it worked, much appreciated!
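For reference, the client-side fix discussed above (re-creating the certs against the project's standalone puppetmaster, per the linked Help:Standalone_puppetmaster step 2) boils down to a short command sequence. This is a sketch, not the exact wiki text; the SSL directory path and auto-signing behaviour are assumptions that may differ per setup:

```shell
# On the new instance, after step 1 has pointed it at the project's
# puppetmaster via hiera in project puppet:

# Remove the stale certs that were issued by the wrong (default) CA --
# this is what produces the "self signed certificate in certificate
# chain" error. Path assumed; some setups use /var/lib/puppet/ssl.
sudo rm -rf /var/lib/puppet/ssl

# Re-run the agent so a fresh client cert is requested from the
# project puppetmaster. Depending on its config, the cert may need to
# be signed manually on the master rather than auto-signed.
sudo puppet agent --test --verbose
```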
[15:28:47] 🎉
[18:42:39] !log tools.lexeme-forms deployed c6001bf897 (l10n updates; use pip-tools, includes some package updates such as Flask 2.0.2→2.1.0; clean up service.template)
[18:42:42] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.lexeme-forms/SAL
[20:10:56] hi taavi
[20:11:36] i was just starting to upgrade some OSes in deployment-pre
[20:11:37] p
[20:11:40] and came across this
[20:11:40] https://gerrit.wikimedia.org/r/plugins/gitiles/cloud/instance-puppet/+/cb613260a9df45b499991e12f85a6d30944b8ddd%5E%21/#F0
[20:12:18] which is fairly confusing, why did you add new kafka clusters to hiera that are copies of kafka clusters that already exist?
[20:23:54] https://phabricator.wikimedia.org/T304433#7819828
[20:24:02] going offline! feel free to reply on the ticket, thank you!
[21:30:25] on wikitech-static, there are some conflicts in the JavaScript stuff. There's a very old copy and a new one. It seems the imports we use overwrite and add to existing content, which means deletions of interface overrides etc. don't get applied, including e.g. Common.js
[21:30:30] https://wikitech-static.wikimedia.org/w/index.php?title=MediaWiki:Common.js&action=history vs https://wikitech.wikimedia.org/wiki/MediaWiki:Common.js
[21:30:45] not sure what, if anything, to do about it.
[21:30:54] maybe mass-delete ns_mediawiki before each import
[21:31:19] Or $wgUseSiteJs = $wgUseSiteCss = false;
[21:31:31] (might cause some templates to not look right, but seems fine?)
[21:45:52] Krinkle: let me find the code that does that import...
[21:46:01] (unless you're staring right at it)
[21:46:51] https://wikitech.wikimedia.org/wiki/Wikitech-static#Automatic_content_syncronization
[21:47:34] I can edit that on your behalf or get you login access
[21:51:54] I wasn't able to find that page or the script
[21:52:27] Ha, TIL about https://wikitech.wikimedia.org/dumps/
[21:52:29] that's neat
[21:53:30] Sometimes the old tech is the best
[21:54:05] It would appear maintenance/nukeNS.php might do the job
[21:54:09] it even defaults to NS_MEDIAWIKI
[21:54:20] * Krinkle tests locally
[21:57:28] oh wow, this thing goes deep.. purgeRedundantText() and nukePage.php, content slots all by hand-written SQL.
[21:57:33] That seems unwise.
[21:58:25] I can confirm it works though
[21:58:33] and the wiki still works fine afterwards
[21:58:45] and even if it did remove more, we're importing everything anyway
[21:59:11] but it is kinda nice, I suppose, to not have the wiki publicly empty every time before import, and as a quick way to access recently deleted content pages
[21:59:16] seems harmless enough to leave as-is
[21:59:46] So yeah, you'd run `php maintenance/nukeNS.php --ns 8 --all --delete`
[21:59:53] I'm trying to think how this affects history on wikitech-static, and if we care about history
[22:00:02] before importDump.php
[22:00:42] I'm confused by the "harmless to leave as-is" thing. Do you mean leaving the import script as-is?
[22:01:49] andrewbogott: what I mean is that as-is, wikitech-static accumulates content pages that were intentionally deleted on wikitech, e.g. create [[Moon_datacenter]] today, let the import happen once, then delete it; it'll remain on static forever since nothing deletes it, every import only adds (except pages that haven't changed).
[22:02:02] I'm suggesting to leave that as-is, except for ns=8, which is config/interface pages
[22:02:18] which the above would wipe clean before each import
[22:02:25] ah, I see.
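Putting the two maintenance scripts from the discussion together, the proposed change would have the sync job wipe NS_MEDIAWIKI (ns 8) before each import. A sketch only; the install path and dump filename are assumptions, not the actual wikitech-static sync script:

```shell
cd /srv/mediawiki                                 # install path assumed

# Wipe the MediaWiki: namespace (ns 8) so that interface-page deletions
# on wikitech actually propagate to wikitech-static. Content namespaces
# are deliberately left alone, per the discussion above.
php maintenance/nukeNS.php --ns 8 --all --delete

# Then re-import the latest dump, which re-creates every current ns 8
# page (and adds/overwrites everything else) as before.
php maintenance/importDump.php wikitech-dump.xml  # dump filename assumed
```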
[22:02:32] Ok, let me figure out how to log in there :)
[22:03:16] the observable effect of this is `Deployments [curr] [curr]`, as two different gadgets at different times did something like that and they're now both on static
[22:03:45] In theory this code is also in https://gerrit.wikimedia.org/r/admin/repos/operations/wikitech-static
[22:03:54] I'm checking to see if that's actually current on the server...
[22:09:21] Krinkle: a bit of a mess here, I'm going to need to make some preliminary patches. Will add you as reviewer when I get to the good part
[22:10:45] okay :) Thanks
[22:21:47] Krinkle: feel free to edit the patch description to add more detail
[22:59:10] andrewbogott: LGTM. I suppose the wiki page perhaps shouldn't look as official w.r.t. the source inlined there
[22:59:17] https://wikitech.wikimedia.org/wiki/Wikitech-static#Automatic_content_syncronization
[23:02:42] Yeah, I looked back at that page and it does mention the git repo, but maybe I should remove that cut-and-paste script entirely in favor of a link
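To make the accumulation behaviour from the conversation concrete, here is a toy model (hypothetical page names and versions, not the real importer) of a sync that only adds or overwrites pages: the deleted content page survives every import, while pre-wiping the MediaWiki: namespace clears the stale interface page.

```shell
#!/usr/bin/env bash
# Toy model of the wikitech-static sync: the import adds or overwrites
# pages but never deletes them. All page names below are made up.

# Current state of the static wiki.
declare -A static=( ["Main_Page"]="v1" ["Moon_datacenter"]="v1" ["MediaWiki:Common.js"]="old" )
# Next dump from wikitech: Moon_datacenter and the old Common.js were
# deleted upstream, so they simply don't appear in the dump.
declare -A dump=( ["Main_Page"]="v2" ["MediaWiki:Gadget-site.js"]="new" )

# Pre-import wipe of ns 8 (what `nukeNS.php --ns 8 --all --delete` does):
for p in "${!static[@]}"; do
  [[ $p == MediaWiki:* ]] && unset "static[$p]"
done

# The import itself: add or overwrite only, never delete.
for p in "${!dump[@]}"; do static[$p]=${dump[$p]}; done

# The deleted content page lingers forever (nothing removes it)...
[[ -v static["Moon_datacenter"] ]] && echo "Moon_datacenter still on static"
# ...but the stale interface page is gone thanks to the ns 8 wipe.
[[ -v static["MediaWiki:Common.js"] ]] || echo "stale Common.js removed"
```

This is why the fix targets only ns 8: wiping everything before each import would leave the wiki publicly empty mid-sync, whereas interface pages are few and fully re-created by the import.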