[01:51:50] btw @orduin once/if my PR to CW gets merged, do you want me to add the config to https://www.mediawiki.org/wiki/Extension:CreateWiki? I'm thinking now about whether any other parts need lots of updates. The table schema hasn't changed since 2020 I think, but idk how much they changed in the codebase itself. OS seems to be keeping it updated with the new
[01:51:51] WIP API
[01:52:52] not a ton of people use CW outside of us so not a big deal, but it can't hurt to stay up to date on docs, even if it's just to leave a good impression on the recent changes watchers lol
[03:14:26] <.guardianx.> [1/6] Good evening, new to having this much control in a wiki; I mainly did things in managed wikis. I'll likely mess up the terminology, so forgive me please LOL.
[03:14:26] <.guardianx.> [2/6] I'm looking to make a fairly large table and then parse that table out into various pages based on a filter. The reason for the main table is the goal of having users form-edit the table.
[03:14:27] <.guardianx.> [3/6] Is there a best practice for this?
[03:14:27] <.guardianx.> [4/6] Debating using an array, but I don't know much about them and am in the process of learning.
[03:14:27] <.guardianx.> [5/6] Filtering a large table into smaller tables seems to require some extensions I don't know if I really need.
[03:14:28] <.guardianx.> [6/6] Any suggestions more than welcome.
[04:43:35] create a template and make the table using wikitext, protect the template, insert the template
[04:43:44] there is no per-section protection possible
[09:03:54] @elliethepwincess do you want me to help so you can learn how to add to ManageWiki or not?
[09:05:06] I'm a bit confused on how those settings work, so sure
[09:06:40] okay but not yet, I have a huge headache and am on a 15 hr flight right now that I should try and get some sleep on.
[15:19:30] You want help doing config PRs?
I could try and help
[15:21:50] Side note @reception123 I had an idea about https://issue-tracker.miraheze.org/T10674
[15:22:26] just so you know, part of it is already done in ManageWiki
[15:22:30] it just needs the table part
[15:22:32] Dang
[15:22:33] like WikiDiscover
[15:22:43] How so
[15:23:00] Never mind I guess lol :(
[15:23:32] oh no, I mean the table is still fully needed for it to work
[15:24:00] oh, interesting
[15:24:01] https://github.com/miraheze/ManageWiki/pull/399
[15:24:50] I was thinking of a page that shows a list of wikis by any toggled setting, like extensions
[15:32:32] well yeah, basically like WikiDiscover
[16:43:33] https://github.com/miraheze/YouTube/pull/27
[16:44:05] I made some changes to the PR to fix some issues
[16:44:12] Would appreciate it if someone could review the changes
[17:03:43] Took way too long to find the request review button, dang
[17:16:11] CI is failing: syntax error in extension.json
[17:16:14] ah
[17:16:16] will check
[17:16:24] https://tenor.com/view/computer-works-for-me-meme-engineering-dvd-gif-27508941
[17:16:27] but will figure out what's wrong
[17:20:35] Dang, imagine working with an extension you can actually test on your machine
[17:20:39] Couldn't be me
[17:22:33] Side question, since I thought of it: would it be possible to grant shell access on test without access to the database (and by proxy prod)? This isn't me asking how the hell do I get shell; I'm just genuinely curious if that's physically possible. I doubt it would be, unless you removed www-data sudo access, and that kinda makes working with MediaWiki difficult
[17:24:07] Due to sql.php and eval.php/shell.php it probably isn't feasible as something we'd likely do, but it may be technically possible, just not 100% reliable.
[17:24:34] We may split beta into a different DB altogether maybe though
[17:25:20] It would be possible if it was a different db and password
[17:25:34] And we didn't store the prod creds on test at all
[17:25:55] I would probably firewall off the prod dbs too for sanity
[17:26:15] Yeah, that's probably the cleanest option and what Rhinos has mentioned a few times
[17:26:32] Although it's obviously not a priority considering everything else on tech's plate
[17:26:43] Yeah we would.
[17:26:48] Perhaps mirabeta could be set up so that IP addresses are not stored or are assigned random values
[17:26:53] And same for emails
[17:27:00] Because you don't really need real information on mirabeta
[17:27:05] The issue is beta uses the same db cluster as prod
[17:27:11] Ah I see
[17:27:23] We could pull a WMF and say beta has no privacy policy, I think
[17:28:10] The only immediate benefit I can think of would be removing the NDA from test shell and being proper, which really only benefits this idiot and maybe some other users who may want to help in development
[17:28:20] So hardly convincing reasoning to spend time on it
[17:28:45] Yeah, it won't be soon but it is on an eventual to-do
[17:29:10] Beta's privacy policy should be "at risk" tbh
[17:29:16] That's not no privacy policy
[17:29:21] To quote our fallen penguin
[17:29:23] I mean you can install a Linux server either in live mode or permanently to a USB; it wouldn't take much to set up MediaWiki as a dev environment
[17:29:24] But just be prepared for this to go wrong
[17:29:26] We'll do it soon™️
[17:29:36] We are only using about 50% of our resources right now, so we can do something with them, and putting more into beta would be good to streamline upgrades etc... also...
[17:29:45] Oh I have a local mw dev environment
[17:30:00] The issue is a wiki farm with CreateWiki and ManageWiki isn't as easy
[17:30:02] I run MediaWiki on my normal pc, not on a USB, just on Apache which I have installed
[17:30:05] Oh that's true
[17:30:14] I've never tested anything with CreateWiki
[17:30:14] is that on Windows or macOS?
[17:30:15] docker (mwcli) on WSL here
[17:30:18] Linux
[17:30:21] ok
[17:30:22] I use Ubuntu specifically
[17:30:28] are you using the same OS as production?
[17:30:36] I do need to see if I can find my CW/MW docker image as it would be good to have...
[17:30:40] Doubt it but don't know
[17:30:41] Our prod is Debian
[17:30:51] I'm not targeting a specific OS; I hope that what I code works on most OSes
[17:31:03] Ah, Ubuntu is Debian-based so it should be good
[17:33:19] https://jsonlint.com/ and https://jsonchecker.com/ say your extension.json is valid
[17:33:57] same for https://jsonformatter.org/
[17:34:00] I think it is valid actually, but the placement of the resource paths and package files configuration is wrong
[17:34:09] Yeah I know, I've been looking at it for a while and can't figure it out
[17:34:11] Hmm
[17:34:28] But not certain, I may be imagining things
[17:34:30] I'm not sure what could be wrong because it loads for me and looks right; I had asked in the MediaWiki support server about it because I couldn't get it working at first, but now it's working
[17:34:33] I'm not 100% sure
[17:34:58] usually when something tells me my modified JSON is broken, I use any of those to quickly locate the issue.
[17:35:19] For me, VSCodium usually tells me where the JSON formatting errors are
[17:35:33] That's a good practice. I usually spend an hour looking at it to realize I forgot a comma or something.
[17:35:35] that all being said, why are you electing not to use the EmbedVideo fork by the Star Citizen folks?
[17:35:47] rather not waste my time getting stressed out
[17:36:06] You know, that is kinda a good point; it would be one less extension for us to maintain...
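(Aside on the "locate the broken JSON" tip above: instead of pasting into online validators, the error position can be found locally. A minimal sketch using only Python's stdlib, where `check_json` and the sample snippet are hypothetical names for illustration; `json.JSONDecodeError` carries `lineno`/`colno` for exactly this.)

```python
import json

def check_json(text: str) -> str:
    """Return 'valid', or a message pointing at where parsing failed."""
    try:
        json.loads(text)
        return "valid"
    except json.JSONDecodeError as e:
        # e.msg describes the problem; e.lineno/e.colno locate it
        return f"invalid: {e.msg} at line {e.lineno}, column {e.colno}"

# A trailing comma, the classic extension.json mistake, gets a location:
broken = '{\n  "requires": {\n    "MediaWiki": ">= 1.35.0",\n  }\n}'
print(check_json(broken))
print(check_json('{"name": "YouTube"}'))  # prints "valid"
```

The same check is available on the command line as `python -m json.tool extension.json`, which exits non-zero and reports line/column on bad input.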
[17:36:46] Miraheze made an original YouTube extension a while ago; I just thought I should make a PR to improve the performance of the extension a bit, because I sometimes see the extension on Miraheze wikis messing up core web vitals
[17:36:47] it would be good, especially for you, to have less to maintain
[17:37:11] perhaps now is a good time to encourage a switch
[17:37:51] [1/4] what if you bump up the version for ```json
[17:37:51] [2/4] "requires": {
[17:37:52] [3/4] "MediaWiki": ">= 1.35.0"
[17:37:52] [4/4] },```
[17:37:57] If you embed a YouTube video directly (i.e.