[02:31:09] How do I add a template?
[04:13:31] Disabling warmup for now. Things look rather stable, but it probably still needs some manual intervention to keep the queue from rising.
[04:39:20] I've got some scripts in my home directory for running jobs (they use a loop to make sure all jobs are run even if runJobs.php exits due to an error).
[07:16:12] We need to come up with a better solution asap then
[07:16:28] The cache needs to be warm when 1.41 is deployed
[08:19:49] @paladox can you work out what the top 100 wikis by db size are?
[08:19:55] And can we try excluding them
[08:20:06] We need to work out a plan
[08:22:37] _personally thinks it needs to be something we just do. As long as we can monitor and control the backlog, it is not going to impact anything to have a large number of jobs in their own queue, and reparsing all pages on read under a new parser will generate a lot of jobs_
[08:22:47] A large backlog alone is not a reason to be worried
[08:23:12] The job queue can recover from a fairly high number extremely quickly with a bit of support
[08:24:49] Afaics, at the point of rollback, it was on average dropping
[18:34:09] It was dropping, but one wiki hit 60k refreshLinks jobs and 40k cache-warming jobs simultaneously, and I wasn't sure how much the drop was down to the scripts I was running, so I decided it would be safer to turn it off while no one was monitoring.
[18:53:51] Turning off unsupervised programs that may potentially go very wrong is normally a good idea, yeah
[18:55:56] the warmup jobs have their own queue with their own runner
[19:33:16] Fair
[19:33:25] I've had quite a day
[19:33:41] I think I'm officially cancelling bonfire night
[19:34:03] But we can talk new strategies tomorrow
[19:57:33] I think the current strategy of doing it letter by letter seems to be working
[19:58:01] Certainly for now, though we'll see how well it holds up when everything is enabled
[19:58:51] IMO, something big like this was almost always going to have to be done by dividing the wikis up into clusters and enabling each cluster one by one.
[20:12:51] Didn't realise paladox was doing something
[20:13:05] I haven't been watching stuff today @orduin
[20:14:39] Graphs look good though, yeah
[20:14:51] There are some SMW jobs on dmlwikiwiki
[20:15:10] But nothing big for parsoid
[20:15:14] Thanks @paladox
[21:21:26] [1/2] db error
[21:21:27] [2/2] https://discord.com/channels/407504499280707585/407537962553966603/1170834606744936508
[21:29:50] i don't know why, but `curl -X HEAD http://127.0.0.1/sonicpediawiki/thumb/8/85/SX57_Knuckles_and_Cosmo.jpeg/800px-SX57_Knuckles_and_Cosmo.jpeg` just stalls. It works if i use either --head or HTTP 1.0. MediaWiki calls that url via swift-lb; i obviously replaced the domain because i'm trying to see why.
[21:30:13] https kinda works, in so far as it doesn't stall, but it does take its time (i think it hits a timeout).
[21:45:33] Weird
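
On the [21:29:50] stall: this looks like a known curl gotcha rather than a Swift problem (an assumption worth verifying, but it matches the symptoms). `-X HEAD` only overrides the request method string; curl still waits for the response body the Content-Length header promises, and per HEAD semantics the server never sends one, so the transfer hangs until a timeout. HTTP/1.0 "works" because the server closes the connection after the headers. `--head`/`-I` both sends a HEAD request and tells curl not to expect a body:

```bash
# -X HEAD: curl sends a HEAD request but still waits for Content-Length
# bytes of body that will never arrive, so it stalls.
# --head (-I) performs a HEAD request AND tells curl no body is coming:
curl --head "http://127.0.0.1/sonicpediawiki/thumb/8/85/SX57_Knuckles_and_Cosmo.jpeg/800px-SX57_Knuckles_and_Cosmo.jpeg"
```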
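
For the [04:39:20] scripts: the actual scripts aren't shown in the log, but a minimal sketch of the "loop so all jobs get run even if runJobs.php dies" idea might look like this. The wiki ID, the /srv/mediawiki path, and the use of showJobs.php to read the queue depth are placeholder assumptions, not the real script.

```bash
#!/bin/bash
# Minimal sketch: keep invoking runJobs.php until the queue is empty,
# so an error exit partway through doesn't strand the remaining jobs.
# WIKI and /srv/mediawiki are placeholder assumptions.
WIKI="examplewiki"
while :; do
    php /srv/mediawiki/maintenance/runJobs.php --wiki="$WIKI" --maxjobs=1000
    # showJobs.php prints the number of queued jobs for the wiki.
    remaining=$(php /srv/mediawiki/maintenance/showJobs.php --wiki="$WIKI")
    [ "$remaining" -eq 0 ] && break
done
```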
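
And for the [08:19:49] question, one way to rank wikis by database size from MariaDB/MySQL, assuming each wiki lives in its own schema (a sketch; credentials and host flags for `mysql` are omitted and would depend on the setup):

```bash
# Hypothetical query: rank wiki databases by data + index size, largest first.
mysql -e "
SELECT table_schema AS wiki,
       ROUND(SUM(data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM information_schema.tables
GROUP BY table_schema
ORDER BY size_mb DESC
LIMIT 100;"
```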