[00:52:42] thanks for the notification, latest update installed
[00:58:41] Unrelated, but would love your help expanding the rfx helper to the reopening wikis SR page when time allows
[00:59:21] Template is pretty basic, but would benefit from automation
[01:33:03] Real shot in the dark here, but is there a resident Varnish wizard I could chat with? Noticed mention that Miraheze might(?) be looking to phase out its use in favor of Cloudflare's offerings, but I am very much interested in trying it! New territory, so I'm treading carefully about setting it up for a production environment. And yes, this is for a (public!) wiki external to Miraheze... tho ugh, I might as well ask 😊
[02:06:59] We decided not to phase it out completely and instead use it behind Cloudflare, combined with HAProxy
[02:11:15] Hm, glad to hear it'll stay in use, looks wonderful in theory.
[02:42:50] @cosmicalpha so, seems like a stress-free deployment of the new ManageWiki version.
[02:43:45] Yep, I'm surprised lol. I guess testing more than 500 times has its benefits lol
[02:45:59] I have a lot of feature patches that'll be deployed throughout this week also.
[02:46:48] Like finally dynamic extension disable toggling, so extensions that require others can be enabled together rather than needing to save them twice.
[02:48:23] > Like finally dynamic extension disable toggling, so extensions that require others can be enabled together rather than needing to save them twice.
This sounds eerily similar to my extension 🤨
[02:48:34] (not that that's an issue lol)
[02:49:17] I don't think I've ever even seen yours lol
[02:49:27] https://github.com/miraheze/ManageWiki/pull/563 anyway that's the PR for it
[03:12:15] (Deep down, OA is shaking his fist and plotting his revenge against Cosmic.)
[03:27:55] Hey @cosmicalpha, why don't we have a custom name for our maintenance script worker?
[03:28:13] wdym?
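(Aside: the "extensions that require others can be enabled together" behavior described above can be sketched roughly as follows. This is a hypothetical shell illustration, not ManageWiki's actual PHP code; the extension names and the `requires` map are made up.)

```shell
#!/bin/bash
# Hypothetical sketch of dependency-aware extension enabling: turning one
# extension on also emits everything it requires, in one pass, instead of
# needing two saves. All names below are invented for illustration.
declare -A requires=(
  [VisualEditor]="Parsoid Cite"
  [Parsoid]=""
  [Cite]=""
)

resolve() {
  local ext="$1"
  # Emit dependencies first, then the extension itself.
  local dep
  for dep in ${requires[$ext]}; do
    resolve "$dep"
  done
  echo "$ext"
}

enable_with_deps() {
  # De-duplicate while preserving dependency-first order.
  resolve "$1" | awk '!seen[$0]++'
}

enable_with_deps VisualEditor   # prints Parsoid, Cite, VisualEditor
```

The real feature lives in the ManageWiki PR linked below; this just shows the general idea of resolving a requires-graph before a single save.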
[03:28:16] Fandom had like ten, Gamepedia had two, we have the standard name still
[03:28:32] Oh, we do have a Miraheze maintenance script one but never use it lol
[03:28:37] It makes it easier to discern where a wiki comes from in case it forks to a different wiki farm.
[03:28:58] oh that
[03:29:02] It's very easy to know if a wiki was a Wikia, Gamepedia or Fandom wiki, but almost impossible to discern if it was Miraheze/WikiTide.
[03:29:57] This feels kind of like branding digital livestock...
[03:30:00] I know it's a picky point to make, I just think it would be cool to have the maintenance script have a custom name.
[03:30:01] For us it doesn't matter as much as it does for the others
[03:30:44] Well, it's always changeable with Special:RenameUser now if a wiki self-hosts.
[03:31:00] But for wiki-farm to wiki-farm transfers, it can help point out wikis from a common past wiki farm/host.
[03:31:04] We are meant to be more of a stock MediaWiki host, so it's a little different and we don't brand wikis like that.
[03:33:55] I guess, it's just something I kind of got used to from those three. I think ShoutWiki does it too, can't remember.
[03:53:13] refreshLinks is such a slow process.
[04:08:07] personally, i keep forgetting to set the user lol
[04:30:34] Howdy, can someone with divine powers on Beta approve my OAuth consumer proposal?
[04:46:34] No!!!!!
[04:46:38] ⛈️
[04:59:03] hai!
[05:01:23] https://meta.mirabeta.org/wiki/Special:Log?logid=738
[05:14:04] :pacman:
[11:44:12] disappointing
[11:44:17] you didn't say wawa
[11:45:00] i'm declining your application
[11:45:07] wait no
[11:45:25] anyways, Claire, do you wanna poke around WACA?
[11:45:30] I can tunnel my localhost
[11:45:33] done >:3
[11:46:09] wait
[11:46:14] https://meta.mirabeta.org/wiki/Special:Log?offset=20250421214600&limit=2
[11:46:30] sure
[11:46:39] maybe find an XSS by accident if i'm really lucky
[11:46:50] it does have an editable interface
[11:46:58] I'll make you a new admin account too
[11:47:31] my password is not gonna be awawa
[11:47:55] let's see if serveo is working today
[11:48:19] it is!
[11:48:34] oh neat
[11:48:51] may work better if i forwarded the right port
[11:49:09] should i connect my router up to my server using wireguard?
[11:49:18] and then host a socks server on said router
[11:49:27] so i can get a residential ip address if i have to use proxies xDD
[11:49:39] uuuuuuuu
[11:50:24] unrelated, but judge my shell script
```bash
#!/bin/bash
set -e
export GNUPGHOME=/usr/local/restic-backup-logger/keyring

CONTENTS=$(cat << EOF
Content-Type: text/plain; charset="UTF-8"

$(journalctl -u restic-backup -S today)
EOF
)

EML=$(cat << EOF
To: blankeclair@disroot.org
Subject: $(date -I): Restic backup logs
Content-Type: multipart/encrypted;
 boundary="!!!BOUNDARY!!!"; charset="UTF-8";
 protocol="application/pgp-encrypted"

This is an OpenPGP/MIME encrypted message (RFC 2440 and 3156)
--!!!BOUNDARY!!!
Content-Transfer-Encoding: 7bit
MIME-Version: 1.0
Content-Type: application/pgp-encrypted; charset="UTF-8"
Content-Description: PGP/MIME version identification

Version: 1
--!!!BOUNDARY!!!
MIME-Version: 1.0
Content-Type: application/octet-stream; name="encrypted.asc"
Content-Description: OpenPGP encrypted message
Content-Disposition: inline; filename="encrypted.asc"

$(gpg --armor --sign --encrypt -r 886653BFBBA31BC2490424846F0877E28B7FE933 <<< "$CONTENTS")
--!!!BOUNDARY!!!--
EOF
)

msmtp --file=/usr/local/restic-backup-logger/config -t <<< "$EML"
```
[11:50:42] pffft
https://cdn.discordapp.com/attachments/1006789349498699827/1363844390866387015/image.png?ex=68078291&is=68063111&hm=d896be90b198f114ef21f8f1f4c0f6bc1eb1f72c0e91491763205ca69437b182&
[11:50:46] hmmm lemme see
[11:52:38] idfk i'm not good at bash :Kek: don't see anything that looks too wrong (read: rm -rf)
[11:52:52] it feels cursed how i construct emails using heredocs
[11:52:56] especially the multipart part
[11:53:01] it's just, idk, it feels so cursed to me
[11:57:27] oh no totally
[11:58:21] erg, it's still trying to load the CSS styles from localhost
[11:58:26] gg
[11:58:35] localhOwOst
[11:58:39] how do i tell it to use the forward url
[12:00:17] did you set baseurl?
[12:00:59] i think baseurl wasn't even set to localhost before
[12:01:24] now it's complaining
[12:01:26] yay
[12:01:43] i can dm you the link and make an account while i try and mess with conf
[12:02:00] gg
[12:37:16] we cooking chat
[12:49:14] pixl, you don't have to crack so many eggs
[12:49:52] pardon
[12:50:00] what?
[12:52:46] :HMMM:
[12:53:20] hi aeywoo
[12:59:02] Hello.
[19:50:02] lmao I hit grafana on beta? What?
[19:53:50] wtf
[19:54:01] Also, is that bug still happening?
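(Aside: one way to sanity-check the heredoc-assembled message above before piping it to msmtp is to count the MIME parts. Per RFC 3156, a multipart/encrypted message has exactly two parts: the version-identification part and the encrypted payload. The helper below is a hypothetical sketch that reuses the script's `!!!BOUNDARY!!!` marker; the sample message is a stand-in, not the real `$EML`.)

```shell
#!/bin/bash
# Hypothetical sanity check for the heredoc-built email above: count the MIME
# body parts delimited by the script's boundary marker. A well-formed
# multipart/encrypted message (RFC 3156) should yield exactly 2.
count_parts() {
  awk -v b='--!!!BOUNDARY!!!' '$0 == b { n++ } END { print n + 0 }'
}

# Stand-in message with the same structure as $EML in the script above.
sample=$'To: someone@example.org\n--!!!BOUNDARY!!!\nVersion: 1\n--!!!BOUNDARY!!!\nencrypted payload\n--!!!BOUNDARY!!!--'

count_parts <<< "$sample"   # prints 2
```

Note that the closing delimiter `--!!!BOUNDARY!!!--` is intentionally not counted; only the part separators are.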
[19:56:44] @cosmicalpha hey, what's the process for sending emails from our domains via SMTP?
[19:57:01] For later reference, not immediately needed
[19:57:19] MediaWiki mail goes through a Google service for mass mail
[19:57:23] I think
[19:57:36] You looking at the AC tool?
[19:57:39] Been working on getting https://github.com/enwikipedia-acc/waca/tree/master up locally before CVT EXPLODES!!!!
[19:57:41] Yup
[19:57:48] Good progress going
[19:58:03] Main blocker currently is why the fuck is OAuth not working
[19:58:08] I get an error code
[19:58:10] But
[19:58:11] Uh
[19:58:20] I have zero idea where to use that to look up the trace
[19:58:22] :steamhappy:
[19:58:24] Help
[19:58:44] what's the code
[19:58:55] It's like the exception codes we get
[19:59:04] it's like a request id
[19:59:10] dunno if it's a request id tho
[19:59:23] I guess you want https://github.com/miraheze/mw-config/blob/5dedb36bac69c89848a95fc8aaec1e2f77b4f208/LocalSettings.php#L3108
[19:59:40] also, morning~
[19:59:46] Hey @blankeclair
[19:59:52] No, it's in WACA on my laptop
[19:59:56] Ask us then
[20:00:14] No, I mean in format
[20:00:16] Does WACA have a log file?
[20:00:20] It's not on Miraheze servers
[20:00:40] Possibly? I checked the logs I found, but that was HTTP requests coming in
[20:00:48] Hmm
[20:01:00] I think I found an error directory but it seemed empty
[20:01:07] But I wasn't in the docker container
[20:01:14] Did you turn logging on?
[20:01:36] I presume it is, since the docker dev config has the giant ugly deprecation warnings showing
[20:01:41] Ah
[20:01:49] my plan is to try and get in the application container and look in that error folder
[20:02:07] @blankeclair we have a nice morning treat for you
[20:02:14] What's the CVE this time
[20:02:23] I need eats
[20:02:29] they don't issue them that early
[20:02:32] We made one public today
[20:02:37] But not this one
[20:02:56] I thought yesterday
[20:03:21] Ye
[20:03:25] I've been on leave
[20:03:32] I have no sense of time
[20:03:39] MOOD
[20:05:06] I don't go back to work until Wednesday
[20:08:21] also, this burnout recovery has been annoying af
[22:15:35] We have another CVE today but a lot more less severe.
[22:16:51] :NepStare:
[22:16:55] More or less
[22:17:37] I found this one when I was testing something else lol
Anyway...
https://github.com/miraheze/ManageWiki/security/advisories/GHSA-ccrf-x5rp-gppr
[22:17:47] lmao whoops, less
[22:20:57] Gotcha
[22:21:57] also: question, what would be the process of deploying a fork of https://github.com/enwikipedia-acc/waca/ onto Miraheze? Would we need puppet nonsense?
[22:22:23] ManageWiki has completely changed how it handles conflicts. It doesn't save when there's a conflict; it shows errors for what conflicts and requires explicit disabling. I never liked ManageWiki randomly removing extensions when one that conflicts with it was enabled lol.
[22:22:56] Been messing around with it locally. CVT emails are getting to be unsustainable, so we need to try something else, and this is looking like a great solution, but wanted to consult with you on whether there'd be an issue with running it if CVT agrees
[22:22:57] We would need puppet stuff done, yeah.
[22:23:08] Ffffffffffff
[22:23:28] <- puppet not enjoyer
[22:23:39] We have no issue with running it though, we just need to find what VM to run it on.
[22:24:07] maybe a new one?
[22:24:24] Maybe
[22:24:54] I was going to propose we rename reports171 to tools171, and use that for these and any other sort of tool sites we need
[22:28:05] I don't know how much I can assist with the deployment itself, since unless we can put this on its own non-NDA'd server I wouldn't be able to do anything needed via shell (probably not much except later troubleshooting, because puppet?), and I can't understand puppet for heck, but for now I'll try and get it as close to functional as possible locally and figure out the quirks of this thing
[22:28:26] I do want to get consensus from CVT once I have a feel for the flow of it
[22:28:47] Some issues with OAuth I need to figure out :p
[22:28:55] There aren't many docs I can find for installing lol
[22:28:59] Once I find the damn error logs
[22:29:24] Yeah, the INSTALLING.md and docker md file have been pretty helpful in setup
[22:29:47] And the comments in the config files
[22:30:03] ARCHITECTURE.md doesn't look very fresh though lmao
[22:38:43] If there's consensus from CVT and you manage to get it working, I'll work with you and do the puppet part, but I'll need some help figuring out what you did to get it working.
[22:39:13] @pixldev ^
[22:40:24] K, note I'm using the docker container, not bare metal
[22:40:47] That's fine, I can still use what you did to install it and adapt it to how we do things.
[22:42:09] Yeah, the config is funky, but once i get it working it should work
[23:03:07] what the fuck is going on here
[23:09:58] @blankeclair so, my error log appears to indicate that the OAuth request I'm sending is erroring because the result it's receiving is apparently the full HTML of a Wikipedia article???
[23:10:54] Also may need to fork to remove the Wikipedia-specific stuff a little bit
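(Aside: the symptom described above, an OAuth response that turns out to be a full HTML page, typically means the consumer is pointed at a regular page URL instead of the OAuth endpoint, since an OAuth 1.0a token response is a URL-encoded `key=value` string, not HTML. The helper below is a hypothetical triage sketch; the sample strings are made up.)

```shell
#!/bin/bash
# Hypothetical triage helper for the symptom above: classify an OAuth response
# body as an HTML page (wrong endpoint URL), an OAuth 1.0a token response
# (e.g. oauth_token=...&oauth_token_secret=...), or something else.
classify_oauth_response() {
  case "$1" in
    '<!DOCTYPE'* | '<!doctype'* | '<html'*)
      echo "html" ;;     # got a web page back: endpoint URL is likely wrong
    oauth_token=*)
      echo "token" ;;    # looks like an OAuth 1.0a token response
    *)
      echo "unknown" ;;
  esac
}

classify_oauth_response '<!DOCTYPE html><html><head>...'          # prints html
classify_oauth_response 'oauth_token=abc&oauth_token_secret=def'  # prints token
```

Seeing "html" here would match the Wikipedia-article symptom and point at the configured endpoint URL rather than the OAuth credentials themselves.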