[06:28:21] [1/2] So this is what I found so far with CreateWiki atm:
[06:28:21] [2/2] I created the wiki, and while checking, accessing the wiki directly (`https://jumpking.miraheze.org/`) redirects to the root domain and shows a 404; other than that, going to the Main_Page the wiki is normal and the perms are given.
[09:24:25] I did the same as you did, but it redirected immediately to the Main page.
[09:33:05] Might just be the caches, and/or a temporary issue
[11:27:40] GitHub staff disabled the repo 😳
[11:40:58] @bluemoon0332 I was wondering, what's the status with CNAME/NS checks for RequestSSL?
[11:41:32] I think there's a few things that still need to be fixed with the PR?
[11:42:29] And also, it would be nice to have that wiki creation change (the new "needs more details" status) and 72 hours for the requestor to respond before automatic decline, if you have the time at some point
[11:42:31] only one actually
[11:42:47] the easiest one in fact, just a small change to extension.json
[11:43:01] I'd do the status myself as that's easy, but not sure about that 72h thing as that's beyond my limited dev capabilities
[11:43:10] Oh, that sounds good!
[11:44:16] only reason it is still pending is because I've gotten a bit sidetracked these days with sysadmin stuff and the CreateWiki API (and related messes found while doing that one...)
[12:48:16] I'm not sure that's the right move
[12:48:27] There's still old useful versions
[12:48:39] And there's still analysis to be done on how it was introduced
[17:14:41] @bluemoon0332 here's my attempt at the new status, though not sure if I did everything right with the echo notifications https://github.com/miraheze/CreateWiki/pull/500
[17:44:27] We already know all there is to know about that tho. The developer introduced malicious test files which are used to derive the exploit code and add malicious lines to liblzma's Makefile
[17:45:01] the exploit itself merits more analysis; we know it compromises OpenSSH servers on Debian/RPM distros, but maybe there's more
[17:45:33] there could also be more backdoors; this dev had write access to xz's repo on GitHub
[17:48:05] I'm hoping to see a professional audit of xz's code soon tho, it is installed on basically every Linux, nix, and BSD out there, so this is a pretty big deal
[17:49:04] What’s xz do again?
[17:49:11] compress stuff
[17:49:22] Hm
[17:49:30] I thought it sounded familiar in that regard
[17:58:45] FYI for everyone, the GH repo is down, but before that they had a self-hosted git server: https://git.tukaani.org/?p=xz.git;a=commitdiff;h=f9cf4c05edd14dedfe63833f8ccbe41b55823b00
[17:59:11] (sorry for linking directly to a commit, got the link from https://gitlab.alpinelinux.org/alpine/aports/-/merge_requests/63132)
[18:10:28] @reception123 you need to write i18n for the strings in the new echo notif
[18:11:13] other than that I see some rather strange warnings on the CI with your PR...
[18:14:08] It's because the class is called EchoRequestDeclinedPresentationModel
[18:14:15] Which is already defined
[18:15:11] oh, you're right
[18:15:25] @reception123 left a review on the PR
[18:15:58] It was 2 years in the planning. There's no saying other stuff wasn't done, or who exactly was behind it.
[18:16:48] oh right.. I knew I was forgetting something. Thanks for the review!
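For context on the class-name clash mentioned just above, here is a minimal sketch of what a distinctly named Echo presentation model for the new notification could look like, sidestepping the collision with the already-defined EchoRequestDeclinedPresentationModel. The class name, namespace, extra param, special page target, and message keys are all illustrative assumptions, not the PR's actual code.

```php
<?php
// Sketch only: a uniquely named presentation model for a "needs more details"
// notification. Names below (namespace, class, message keys, extra param,
// special page) are assumptions for illustration.

namespace Miraheze\CreateWiki\Notifications;

use EchoEventPresentationModel;
use SpecialPage;

class EchoRequestMoreDetailsPresentationModel extends EchoEventPresentationModel {
	public function getIconType() {
		// Reuse Echo's generic icon; a custom one could be registered in extension.json.
		return 'placeholder';
	}

	public function getHeaderMessage() {
		// These message keys would still need entries in i18n/en.json and qqq.json.
		return $this->msg( 'notification-createwiki-moredetails-header' );
	}

	public function getPrimaryLink() {
		// Assumed extra param carrying the request ID; the link is meant to send
		// the requester back to their request so they can add the missing details.
		$requestId = $this->event->getExtraParam( 'request-id' );

		return [
			'url' => SpecialPage::getTitleFor( 'RequestWikiQueue', (string)$requestId )->getFullURL(),
			'label' => $this->msg( 'notification-createwiki-moredetails-link' )->text(),
		];
	}
}
```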
[18:17:05] I'll fix the class name too
[18:17:21] _wins sloppy (and not very good at all in general) dev of the year award_
[18:19:31] oh, actually we also need the logentry-farmer stuff
[18:20:38] and related to my last comment in the review, this will need extra logic in the form submission function of the RequestWikiRequestViewer and in `WikiRequest`
[18:21:34] oh yeah, I didn't initially realise there was a separate function needed
[18:21:59] you can probably mostly duplicate `WikiRequest::onHold()` though
[18:22:27] yeah, that's what I did so far anyway heh
[18:23:19] You should see me introducing syntax errors every time I do a security PR through GHSA heh
[18:23:31] mostly because the CI doesn't run in those PRs
[18:24:14] I'll hopefully remember to run `php -l` before merging those in the next GHSA
[18:24:15] oh, syntax errors happen to me all the time, but my worst fault is that I always forget to change stuff appropriately. Like, you remember for RequestSSL I left a lot of unchanged stuff around
[18:24:28] but yes, you seem to have had a lot of GHSA PRs lately 😄
[18:25:56] @bluemoon0332 do you think it'll be difficult to implement the second part of this idea, where if after someone uses "moredetails" there's no user response in 72 hours the status is changed to "declined"?
[18:27:39] It is theoretically easy: check the timestamp and user rights of the last person to comment in a maintenance script, and have that script run from cron (a rough sketch follows after this log)
[18:28:04] (check the user rights because we don't want to close if the last person to comment is not a wiki creator)
[18:28:30] the problem would mostly be adding that to puppet, I don't yet know Miraheze's puppet all that well
[18:30:51] oh, if you mean adding the cron I can do that, it's fairly easy
[18:30:55] the hard part for me is the mw part heh
[18:31:20] well, gimme a sec then
[18:31:30] or a few secs
[18:41:11] @bluemoon0332 yay, looks like CI seems happy
[18:41:38] the main thing is notifications, as we need to be sure they work; the whole point is to get the user back to the request to provide the extra info
[18:49:54] yep, looks pretty good
[18:50:06] only missing the logging stuff
[18:54:27] @reception123 added a comment to the task with how to do the logs
[18:55:36] *PR, not task
[22:56:57] <.labster> https://bsky.app/profile/hannah.the-void.social/post/3koubpqy5fy2w
[23:03:15] <.labster> xz backdoor continues to freak people out
[23:03:35] rightfully so tbh
[23:03:47] this is like Heartbleed 2.0
[23:04:04] except worse
[23:18:32] <.labster> Hasn’t someone come up with a cool name for the bug?
[23:19:27] ULTRAKILLER9000
[23:19:52] <.labster> Like “Shellcrack”
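To make the 18:27:39 idea concrete, here is a minimal sketch of the kind of maintenance script that could be run from cron, assuming CreateWiki keeps requests in a `cw_requests` table with `cw_status` and `cw_timestamp` columns and exposes a `WikiRequest` class with a decline-style method. Those names, the system user, and the script's class name are assumptions for illustration rather than CreateWiki's actual code, and a real version would also verify that the last comment came from a wiki creator, as noted above.

```php
<?php
// Sketch only: auto-decline wiki requests stuck in the "moredetails" status
// for more than 72 hours. Table/column names (cw_requests, cw_status,
// cw_timestamp), the WikiRequest helper and its decline() method, and the
// system user name are assumptions, not CreateWiki's actual code.

require_once __DIR__ . '/../../../maintenance/Maintenance.php';

class AutoDeclineStaleRequests extends Maintenance {
	public function __construct() {
		parent::__construct();
		$this->addDescription( 'Declines wiki requests left in "moredetails" for over 72 hours.' );
		$this->requireExtension( 'CreateWiki' );
	}

	public function execute() {
		$dbr = $this->getDB( DB_REPLICA );
		$cutoff = $dbr->timestamp( time() - ( 72 * 3600 ) );

		// Assumed schema: one row per request, keyed by cw_id, with a status
		// column and a timestamp updated when the status last changed.
		$res = $dbr->select(
			'cw_requests',
			[ 'cw_id' ],
			[
				'cw_status' => 'moredetails',
				'cw_timestamp < ' . $dbr->addQuotes( $cutoff ),
			],
			__METHOD__
		);

		$systemUser = User::newSystemUser( 'CreateWiki Extension' );

		foreach ( $res as $row ) {
			// A real script would first confirm that the last comment on the
			// request came from a wiki creator, per the discussion above.
			// WikiRequest and decline() are hypothetical stand-ins here,
			// mirroring the WikiRequest::onHold() pattern mentioned earlier.
			$request = new WikiRequest( (int)$row->cw_id );
			$request->decline(
				'No response to the request for more details within 72 hours.',
				$systemUser
			);
			$this->output( "Declined stale request #{$row->cw_id}\n" );
		}
	}
}

$maintClass = AutoDeclineStaleRequests::class;
require_once RUN_MAINTENANCE_IF_MAIN;
```

On the puppet side, this would only need a periodic cron or systemd timer entry invoking the script, as discussed in the log above.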