[00:53:33] 67,435,987,321,567.76 years
[03:01:50] When Ward Cunningham becomes President of the Foundation
[10:52:56] Who
[10:53:17] [1/2] I wanted to make the textbox for my wiki page, but it requires the permission
[10:53:17] [2/2] Where should I read it?
[10:56:39] #general and #support are the places to discuss something like this. I'm also not sure what the question means
[11:23:47] [1/2] MW servers at 100% utilization.
[11:23:47] [2/2] https://cdn.discordapp.com/attachments/1006789349498699827/1477627407136587917/image.png?ex=69a57342&is=69a421c2&hm=1a2242d73ce792674b68c2f783a12d03baabbc520cad677aefa193d79ce9064b&
[11:24:48] And mw201 is fine when I ssh into it and launch htop. TBH I didn't even know that I had access to mw- servers until I tried this time.
[11:25:22] Up to 100% on mw201 again. Just lots of php-fpm processes, which unfortunately I don't know how to interpret.
[11:31:41] [1/2] Might've been a request spike or some scraper going haywire. I do see some botted requests on Cloudflare, but the number of requests wasn't abnormal compared with the usual activity.
[11:31:41] [2/2] https://cdn.discordapp.com/attachments/1006789349498699827/1477629394011820032/image.png?ex=69a5751c&is=69a4239c&hm=a977aaf11907b0f88099671cda033936d19dc09b67a5438196aface87b2b8321&
[11:41:43] [1/2] This is not improving. I really need to sleep. Hopefully it resolves by itself or someone figures out a fix. The irregularity suggests it's probably some scraper traffic?
[11:41:44] [2/2] https://cdn.discordapp.com/attachments/1006789349498699827/1477631921814245427/image.png?ex=69a57777&is=69a425f7&hm=fe3ec28bb33d991fcf34eb22e7578ddfcb5e726a61b3a88abd267d92c50f41e5&
[15:45:07] likely was, see MM
[20:17:26] usually it's scrapers that spike php process usage
[22:31:28] How does MH handle regenerating databases.php across all the mw and mwtask servers?
I've made hacky workarounds for WO but uh they're not great lol. I know there's $wgManageWikiServers, but I don't think that applies to databases.php
[22:32:39] Actually, is it that CreateWiki doesn't play nice with Redis as a BagOStuff cache?
[22:34:46] There's an expiry key which is saved in the cache (Redis or Memcached, not sure right now), and IIRC it's checked on every request; if the local db lists are outdated, they are regenerated
[22:35:08] Okay, I think it might just be that Redis was never properly defined for my job runner server lmao
[22:35:34] I can't really tell, and I've just been forcefully regenerating it through a mix of cron jobs and hacky scripts, and it's questionable work from me
[22:35:37] Oh
[22:35:51] probably should've looked into it further before I started making hacky fixes lol
[22:36:49] 💀 it was that
[22:36:53] genuinely I am so stupid
[22:37:13] someone needs to trout me urgently
[22:38:01] it's all making sense now
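
The regeneration mechanism described at [22:34:46] can be sketched roughly as follows. This is a hedged Python illustration of the pattern, not CreateWiki's actual PHP code: all names (`WikiListCache`, `touch`, `get_databases`, the `"databases-timestamp"` key) are hypothetical stand-ins for a shared BagOStuff-style cache (Redis/Memcached) holding a "last changed" key that every request compares against its local copy of the database list.

```python
import time

class WikiListCache:
    """Sketch of a shared-cache invalidation scheme: a timestamp key in a
    shared cache tells each server whether its local db list is stale."""

    def __init__(self, shared_cache):
        # shared_cache stands in for the BagOStuff backend (Redis/Memcached).
        self.shared = shared_cache
        self.local_timestamp = 0.0   # timestamp of our local copy
        self.local_databases = []    # local analogue of databases.php

    def touch(self):
        # Called when a wiki is created/renamed/deleted:
        # bump the shared key so every server regenerates on its next request.
        self.shared["databases-timestamp"] = time.time()

    def get_databases(self, regenerate):
        # Checked on every request: if the shared timestamp is newer than
        # our local copy, rebuild the local list via the regenerate() callback.
        shared_ts = self.shared.get("databases-timestamp", 0.0)
        if shared_ts > self.local_timestamp:
            self.local_databases = regenerate()
            self.local_timestamp = shared_ts
        return self.local_databases
```

Under this model, the failure mode found above falls out naturally: if the shared cache backend is never defined on one server (e.g. the job runner), that server never sees the bumped timestamp, so its local list is never regenerated and hacky cron-based workarounds appear necessary.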