[00:40:34] Any board members on Discord? [00:52:08] all three are in fact [04:04:46] <光纪#9056> hey guys I'm back. Sadly a font including all Unicode characters is too big for the server to load [04:05:17] <光纪#9056> about 45m [04:06:00] <光纪#9056> Now I know why wikipedia doesn't support this [04:10:56] <光纪#9056> a temporary solution is to link a local font on my computer or on others' which have it installed, since only I and my friends visit this wiki now😐 [04:28:57] do you get a time-out error when uploading? [04:31:59] You'll absolutely be getting one with 45mb. The server's not meant for that level of load for a font file. [04:34:04] actually Void just replied in your support thread that there has to be some update, so maybe you'll be able to upload the old file (unless it'll be the same size, then we'll have to look for other ways) [04:35:08] <光纪#9056> I'll wait with patience [04:35:55] <光纪#9056> 😄 [06:20:02] <光纪#9056> I compressed the font and finally it appears correctly [06:20:26] <光纪#9056> Thank you all here🥳 [06:25:05] <光纪#9056> But the mobile browser ignores the website's css, which annoys me [06:27:58] It shouldn't, unless you have the Minerva Neue skin being served to mobile users [06:28:04] in which case you can also just add the css to that skin too [06:28:11] or put it in common.css [06:33:33] <光纪#9056, replying to dorito#0001> Oh I forgot that [06:33:37] <光纪#9056> You are right [06:41:04] You could alternatively disable the MobileFrontend extension (but it's really only recommended if you use a mobile-friendly skin as your main one, like Cosmos/Timeless) [06:59:07] <光纪#9056> It's a little strange. Only when I touch the "desktop" link at the bottom on my phone does it appear, and if I switch back, it disappears. 
I even tried to use Vector for mobile but it still disappears on my phone [07:11:06] <光纪#9056, replying to dorito#0001> I went to the page for Minerva on mediawiki and found there should be a file called mobile.css lmao [07:11:22] <光纪#9056> I didn't hear anything of it before [07:12:27] Yes, some wiki skins make use of a mobile-specific CSS file; Minerva is one of those. [07:15:43] <光纪#9056> finally everything is ok [07:16:32] <光纪#9056> I'm gonna take a break [08:37:41] I sent you a DM [08:39:40] Just a quick general question, is copying text/sources from Wikipedia permitted? There are a few articles that have useful info for a wiki I’m working on. I’m not really familiar with how certain licenses work. If there are any info pages with more details someone could direct me to that would be lovely 🙏 [08:54:24] As long as you use CC-BY-SA 3.0 or higher (e.g. 4.0), you should be able to use Wikipedia content directly. https://enwp.org/WP:CC lists a simple version of the requirements for copying and remixing Wikipedia content. [08:54:31] Ermm [08:55:53] See above [09:04:58] Tysm! [12:16:12] I have a question: are these stats individual users, or can it be the same user opening the wiki multiple times? [12:46:56] It would be unique visits to my knowledge. [12:47:03] But I can't guarantee that 100% [13:25:25] Unique users, given Matomo tracks users based on their IP if cookies are disabled. [13:29:47] In today's world of CGNAT, IPs are hardly unique though. [13:30:53] I personally take the stats at Special:Analytics with a pinch of salt [13:59:39] Goodnight! 
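The font workaround discussed earlier (serving a compressed font file and styling both the desktop skin and Minerva's mobile.css) could be sketched in site CSS roughly as follows. This is a hedged illustration only: the font name, file path, and unicode-range are hypothetical placeholders, not the wiki's actual values.

```css
/* Hypothetical snippet for MediaWiki:Common.css (and, for Minerva
   mobile users, MediaWiki:Mobile.css). Prefers a locally installed
   copy of the font, falling back to a compressed subset upload. */
@font-face {
  font-family: "MyCJKFont";                    /* placeholder name */
  src: local("MyCJKFont"),                     /* reader's installed copy */
       url("/w/images/MyCJKFont-subset.woff2") /* placeholder path */
       format("woff2");
  unicode-range: U+4E00-9FFF; /* only fetch for CJK Unified Ideographs */
}
body {
  font-family: "MyCJKFont", sans-serif;
}
```

With `unicode-range`, browsers only download the face when matching characters actually appear on the page, which is one way a 45 MB pan-Unicode font can be cut down to something a shared server will accept.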
[14:07:24] the stats are interesting but shouldn't be taken too literally, and I'm pretty sure matomo does something to help anonymize for privacy but I'm not sure on that either [14:14:22] @Meta-Wiki Administrators please consider a conduct warning for TheCoolStranger45 based on recent messages in [[Requests for reopening wikis]] [14:14:22] https://meta.miraheze.org/wiki/Requests_for_reopening_wikis [14:14:31] I plan to reply on that page directly/shut down the request shortly [14:18:52] O_o yeah I think I can guess why said wiki was deleted just from that conversation alone xD [14:22:09] as I reiterated in the now-posted message closure [14:49:48] Has this been done yet? [14:50:03] sigh [14:50:50] reception is addressing [14:50:51] Nevermind I see @Reception123 did it [14:51:47] Yeah, they clearly aren't good faith so I've given them a first and final warning. [15:13:01] @Gummiel let’s continue here to not take over the support post lol [15:14:22] I hadn’t thought about just “ignoring” the data tag and just formatting other variables in its place, so many params to one cell isn't as big of a concern as I thought [15:19:32] [1/2] Well as soon as you start to use the format tag, you have to tell it where to put a parameter (though it has to be used within the data tag still, so you are not really ignoring it per se. 
But yeah, between that and the straight-up default tag, both of which take pure wikitext, including most parser functions and stuff, there really is very little you can't do (I am in [15:19:33] [2/2] clined to say there's nothing it can't do, but I don't want to exclude that there might be some obscure case of an obscure case that might not be doable) [15:20:39] Sure, at that point the PI isn't that simple anymore, but it is still a million times simpler than even the most simple wikipedia infobox [15:24:30] I do think pi for 99% of cases on miraheze is just plain better, plus it's simpler to handle and it's even hypothetically easier to switch from if you really just don't like it vs using wikipedia templates outright [15:24:53] and using wikipedia templates outright is the 'wrong' way, because as pinned in this channel there is actually a wp template system that is sanitized for general wiki use [16:05:20] [1/2] It still seems to me that the data tag is just a wrapper, since format is the one doing all of the display work by grabbing the necessary parameters itself. Doubt there's a better way with "xml" though so eh, can't complain. [16:05:20] [2/2] The "default" wikitable infobox is far simpler than PI imo but has barely any qol features built-in, so making that nice and pretty can become spaghetti real fast [16:07:31] My personal bias is to lua everything anyway but I'm sure the learning curve for that is higher than wikitext or PI so it's not a good recommendation :p [16:12:52] There is no default infobox when you make a wiki initially. 
And a wikitable does not really qualify as an infobox on its own [16:14:10] one thought I have is a) make a proper starter's infobox guide on meta, b) link to it prominently on the new wiki front page, and c) use it as a baseline for addressing the infobox problem going forward [16:14:33] It's not "just have default infoboxes", but it is a way to deal with it that doesn't force a single default that may not be suitable to a given wiki's needs [16:14:35] Ah my bad, thought Module:Infobox was available by default [16:14:56] there are really no infobox resources available by default, and worse, no easy way to point people in the right direction [16:14:56] Yeah personally I'm minded towards some kind of default infobox that one can opt for in their wiki request [16:15:02] like [checkbox] Default infobox package [16:15:13] opt-in preconfig for the wiki request is something that would be great to have in general [16:15:18] default package + other things [16:15:21] there is a task for it I believe [16:15:36] https://phabricator.miraheze.org/T9153 should encompass something like that [16:15:45] that would be the best answer of all but also not realistic until someone can finish it off at a technical level [16:16:38] Yeah an option for some default package would be ideal, though an easy-to-link-to guide would probably be far easier and faster to get made [16:16:49] I've added a note on the task for that [16:16:53] Pretty sure there are no Modules at all [16:17:07] no modules come at wiki creation [16:17:10] I mean a very basic guide that was made years ago was https://meta.miraheze.org/wiki/User:Void/Infoboxes [16:17:18] so maybe that can be developed [16:17:18] on paper, the guide is something anyone can make/finish [16:17:28] just never anyone's high enough priority to see progress [16:17:36] big cookies to the one who gets it done [16:17:46] the rest (link on wiki creation) is easy [16:18:00] yeah a link can be added to the default main page [16:18:05] which, while 
we're at it, needs a fresh look [16:18:23] the default Main Page has been there for 7 years and has been largely unchanged [16:18:24] Tbh, that is so basic I am not even sure I would call it a guide, more of a quick note really xD [16:19:46] another guide was attempted later but involved the terrible idea of just yanking from enwiki and playing whack-a-mole [16:19:51] forgot where it was exactly [16:20:14] needless to say, there is a need for a capable guide that gets people in the right direction [16:20:58] and during/after that, there should be import packages ready for wikis to take in, probably on `devwiki`, which could eventually be used as the basic import package for RequestWiki if that feature comes through [16:21:12] but we should design based on what's possible now and not necessarily wait for features to complete, that puts us behind years [16:24:31] There was a plan upstream to produce lightweight packs for useful templates [16:24:37] It didn’t go anywhere [16:25:05] That seems to be a theme around here sometimes xD [16:27:09] [1/4] What would the guide entail though? We know the end result is people want an infobox, but what are we teaching? 
[16:27:09] [2/4] - How to copy enwp infoboxes [16:27:09] [3/4] - Migrate existing infoboxes to PI [16:27:10] [4/4] - Create new infoboxes with PI/wikitext/lua [16:27:56] The third bullet point already has plenty of resources for PI, maybe some for wikitext [16:29:15] There are a number of very old phab tasks [16:29:19] Upstream [16:30:52] third point focusing on pi and a sanitized wikipedia import per pins here exclusively [16:31:15] Oldest task open is apparently https://phabricator.wikimedia.org/T45 [16:31:22] subsection for more advanced usage below to allow people to go for more complicated/lua based options if they wish, after it is clear those are not default or necessary in most cases [16:31:23] But I have a feeling that’s a lie [16:31:28] Because bugzilla [16:31:57] wow 45 xD, aren't most phab IDs in like the 4 or 5 digit range now? xD [16:32:01] encourage the easiest solution that works for most people, then get into other options including properly importing from enwiki after knowing that's usually not what they want [16:32:24] even Miraheze's phab is at 5 [16:32:30] it was weird to get used to at first [16:32:38] people want an infobox, they don't want a black box that requires the virtually endless time that enwiki boxes do for most cases [16:33:13] Aug 10 2004 I’ve found [16:33:22] https://phabricator.wikimedia.org/T2005 [16:33:24] we may as well do this ourselves if we have the people with interest and capability [16:33:34] Yeah this. 
I mean even for most of the more tech savvy people here, the wikipedia infoboxes are still black boxes I would expect [16:33:38] no point relying on upstream, or indeed hedging our bets on technical development of features if we don't have to [16:34:07] the only user at hand I'm aware of who even has true utility for this, let alone the experience to navigate them well, is @Ugochimobi [16:34:26] The fact they are wikimedia phab tasks born in the same year as me is worrying [16:34:41] if it's what you really really want he is the one for the job, otherwise a simpler solution is called for 99% of the time [16:35:05] RhinosF1: Why? [16:35:10] because at the end of the day doing things the enwiki way and doing them correctly has certain benefits in search, embeds and such [16:35:16] They too are volunteers right? [16:35:26] No [16:35:42] The vast majority of wikimedia core devs are paid [16:37:03] I see, so the guide that's required is one that pretty much entails everything [16:37:04] Core + bundled should always be owned by a team [16:37:08] that's news to me. 
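As a concrete aside on the PortableInfobox exchange earlier: the format tag lives inside a data tag and controls how a passed parameter is rendered, while default supplies wikitext when the parameter is empty. A rough sketch only; the parameter and label names here are made up for illustration:

```xml
<infobox>
  <title source="name" />
  <data source="height">
    <label>Height</label>
    <!-- shown when |height= is provided; plain wikitext is allowed -->
    <format>{{{height}}} cm</format>
    <!-- shown when |height= is empty or missing -->
    <default>Unknown</default>
  </data>
</infobox>
```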
[16:37:09] It’s a big should but [16:37:33] guess it's not really surprising, can't see someone running Wikimedia scale with only volunteers [16:37:35] If it’s bundled or wikimedia deployed code, a WMF team will handle it [16:37:50] Lots of small stuff gets done by volunteers [16:38:01] But major changes have to have WMF staff helping [16:38:13] it doesn't have to do everything in great detail, the minimum requirement is 'please don't start importing from wikipedia, have a look at this and this' in a way less savvy operators can follow [16:38:17] A lot aren’t paid enough and work way outside their scope to keep things moving [16:38:23] other options can be presented in the basics for more niche cases [16:38:32] /me coughs and quietly mentions Reedy, James, Martin & Amir [16:38:56] I've always been quite confused about salaries and WMF volunteers tbh [16:38:58] And then maybe end up with something along the lines of "don't be afraid to ask for more help on the discord" [16:39:09] like I know some are unpaid/completely volunteer and some are paid. I think? 
[16:39:22] yeah pretty much this is what I thought too [16:39:26] for sure, as even an excellently written guide won't necessarily be understood by everyone evenly, or there are other use cases requiring individual attention [16:39:54] Yeah what we want to avoid is repetitive/easy questions like those frequently asked on Discord now [16:39:57] which is why the guide is definitely necessary [16:40:06] to cover the basics and only direct to Discord if that's not enough [16:40:16] I would encourage the guide to start relatively simple and answer the most common problems, then refine it with more cases as long as the overall simplicity is maintained for the inexperienced operator to understand [16:41:52] the PortableInfobox extension isn't all that bad either [16:41:53] But also the most important part might be to make sure new wiki crats and admins see such a guide before they start looking for other solutions (those other solutions often ending up as wikipedia importing, as that may at first glance seem like the easy solution, when it ends up being the polar opposite) [16:42:11] yes [16:42:39] when the guide is in a good position, it can be added to the list of advice posted on the main pages of new wikis [16:43:17] I think that would help for a lot of cases at the least, and it would also be a good linkable reference when the question comes up on discord/community noticeboard support in the future [16:44:33] I expect it won't catch everyone but it would put us in a very good position while waiting for things like building 'include/start with x feature' options in requestwiki [16:51:18] They are volunteers too but most complete volunteers don’t do major stuff [16:51:36] Because major stuff requires staff co-ordination [16:51:44] Which happens in private places [17:21:21] See: my languishing new admin guide, several other projects in similar veins. 
[17:22:12] all noble efforts, I do want to see an infobox-themed guide come up somewhere soon [17:31:58] It's a veiled jab at a certain wiki hosting platform, right? [17:32:22] To various wiki hosts who all do the same [17:32:36] but now that I think about it, one of them is more guilty than the rest heh [17:33:19] We all know who it is. [17:33:38] well I don't [17:34:04] I would say even to a bunch of non-wiki sites as well for that matter xD [17:37:45] Yeah, like YouTube. [17:55:19] @The Morning Star see DM [17:55:28] yours was stuck in requests [19:37:09] @Owen Check your messages. Urgent. [20:18:25] where can i find a copy of an nda? [20:19:51] NDAs contain the PII of the folks they're prepared for, if you're looking for a sample blank one Owen may be able to assist. [20:21:54] [[Board/NDA]] hope I remember the link right [20:21:54] https://meta.miraheze.org/wiki/Board/NDA [20:21:58] ech [20:22:12] [[Board/NDAs]] [20:22:12] https://meta.miraheze.org/wiki/Board/NDAs [20:22:50] seems like you need to get in touch with owen specifically [20:22:57] he tends to handle them [20:23:21] shame there isnt a blank copy on meta :p [20:28:00] @Fiddlestix why do you need a copy of an NDA [20:29:12] might sign one :p [20:30:37] @Fiddlestix you don't need an NDA [20:52:47] i only just now realized that js pages need to be reviewed by fandom staff for every revision [20:52:47] i wasted 3 hours trying to figure out why my edits wouldnt work, just for a thing that wasnt shown unless i had the sidebar opened (thing with rail modules) [20:52:49] you've surely noticed that fandom is very locked down in a lot of areas [20:52:49] not all of them well or prominently documented [20:52:49] Fandom supremacy [20:52:49] <:fandomflame:872100256999952436> numbah 1!!! [20:52:49] fandom [20:52:50] fandom is better than miraheze‼️ [20:52:50] Today I learned that you can actually request for additional extensions to be enabled for your Fandom wiki. 
[20:52:50] no way [20:52:50] Miraheze is better than FANDOM [20:53:01] But of course the process is cumbersome and prone to human whim. [20:53:14] You can but they'll likely decline if it's too much work for their engineering team [20:53:20] It's rare for them to approve a new extension [20:53:32] ik we are being sarcastic here but wikia seo is actually killer [20:53:46] Oh for sure. That's the first, second and third order of business. [20:53:54] If you want perfect SEO then you'll have to make a lot of sacrifices [20:54:02] For example, their skins are designed in a way to ensure the best SEO [20:54:19] If you don't want skin flexibility then you can select SEO friendly skins [20:54:44] a reason why they don't provide a lot of features is primarily that they have SEO in mind when doing things hence so many limitations [20:55:24] Everything comes at a cost unfortunately [20:55:25] TIL their search extension is 100% proprietary homebrew. [20:55:35] no wonder it's so bad [20:56:07] Speaking of, is the move to CirrusSearch still dead in the water? [20:56:48] i knew this before i knew that js was disabled on every wiki and even if its enabled the revisions need to be manually reviewed by their internal review team [20:57:35] for some reason all of this is literally just because they "want to protect the community" [20:57:52] which honestly i call bullshit [20:57:53] "Fandom uses the JavaScript review process to enhance your security while using the network." [20:58:23] Well, that's their money-maker. Can't be having malicious or non-performant JS code putting off your consumers. [20:59:44] Pain in the butt, but I get it. [20:59:53] Security is a real concern, sure, but they didn't use to lock it down before, so I wonder what real benefit there is [21:00:59] It's planned for eventually if the fundraiser ever allows for it [21:01:22] Much like most big policy shifts, I'm sure someone did something sufficiently dumb/harmful to result in that decision being made. 
[21:02:45] I believe it was locked down because one of their staff members was once actually hacked while visiting a rogue wiki to investigate something, and due to their MediaWiki setup it was easy to compromise their account and make a mess on Community Central using WikiFactory to redirect and delete wikis [21:03:04] They had an issue one time where a staff member got their account compromised because JS on a rogue wiki hooked into the login form, before they added JS review [21:03:10] oh lol yeah [21:03:57] i feel like that's really just an issue with how they handle js, js being allowed to hook into their login form shouldnt be happening in the first place [21:04:01] Was that before or after they made the login a popup modal / separate page [21:04:16] Before they made it a separate page [21:04:22] and im pretty sure js should just, be sandboxed [21:04:58] I mean usually not much harm can be done if cross-site JS requests are disabled and you are using httponly secure cookies for session tokens [21:04:58] because clearly they arent sandboxing the js of wikis, causing the js to be able to touch stuff it shouldnt be touching [21:05:11] But since the login form was not separate from the wiki [21:05:28] They were able to log what was submitted into the login form on their wiki [21:05:32] fandom security skill issue clearly [21:07:03] Since then fandom also added a usergroup for staff that requires you to login with oauth (Google, twitter) so that there is no password to leak anymore lol [21:08:00] as long as those endpoints are not compromised, all good I suppose [21:09:02] I have safe mode enabled on all non-Meta wikis so no JS is loaded [21:09:43] but either way, if my account were compromised on one wiki, there's not much you can do, global groups don't have very destructive user rights in their global group, only in Meta [21:10:11] I use NoScript, only allow JS from Meta [21:10:34] Agent: sysadmins are an exception though, they have global (managewiki), I 
think [21:10:41] Technically they could use it on any wiki [21:12:08] [1/2] you could technically make important parts of js inaccessible by non-meta wikis, for example stuff used for security, accounts, etc [21:12:08] [2/2] unless you cant do that, im not too experienced in js to know. [21:12:32] You can use ManageWiki remotely on any wiki from meta [21:12:39] At the Special:ManageWiki root page [21:13:37] but sysadmins don't necessarily have to do it from Meta, using their global group they can use the interface on any wiki [21:13:45] It's still done at Meta for better logging [21:14:18] [1/4] and by this i meant [21:14:18] [2/4] we have value x [21:14:18] [3/4] value x is important, used in stuff [21:14:19] [4/4] make value x unable to be used by non-meta wikis' js pages to make it so those pages cant interfere with value x [21:14:42] That sounds easier said than done [21:14:45] " [2/2] unless you cant do that, im not too experienced in js to know." [21:14:50] I don't think that's a thing [21:14:55] Wiki JS is not loaded on sensitive pages such as preferences, login, etc [21:15:07] Stewards also have managewiki globally [21:15:08] yeah [21:15:16] that also works. [21:15:25] i also noticed that css isnt loaded on preferences [21:15:28] But ManageWiki on local wikis doesn't do anything [21:15:38] srsly? [21:15:44] The destructive ManageWiki features are restricted to Meta [21:15:49] like deleting and locking wikis [21:16:06] but overall it seems that miraheze has good security while having good freedom [21:16:25] I think it's a fair balance [21:16:59] I believe all destructive components are also restricted for any account/connection that isn't 2fa verified as well [21:17:20] i think the funny part in all of this is that fandom only just recently added 2fa [21:18:13] lol, shocking [21:18:23] 2FA has been a thing on most platforms for a while [21:19:24] 👀 [21:20:33] Thanks for tackling that! [21:21:04] what? 
[21:22:37] does the relay not show images [21:22:47] I sent a screenshot of the wiki tags thing I've been working on [21:23:20] no, no images here [21:23:33] https://media.discordapp.net/attachments/407537962553966603/1090747056978669598/firefox_t36Un2r3Yz.png [21:24:05] wait [21:24:25] you're Joritochip? [21:24:31] ye [21:25:25] looks pretty good to me [21:28:31] looks great [22:11:15] That’s incorrect. [22:11:45] Fandom has most of the extensions available on MediaWiki and obviously their own. The list of extensions in WikiConfig is huge. [22:12:03] Easily matches and probably outnumbers even Miraheze’s offering. [22:14:40] aracham was talking about the amount of extensions that they actually allow you to request to be enabled lol [22:14:43] on that front there's not that many [22:15:43] True. Although that’s largely now up to the discretion of Wiki Managers as far as I’m aware. At least when I was contracting there the guidance was “if it’s in use on a good number of wikis then you’re free to enable it on request” [22:16:15] that only really applies for the very small amount of big wikis that have wiki managers [22:16:27] everyone else has to go through cs, which is less likely to enable things for you in my experience [22:16:46] i suppose you can still contact wms that dont manage your wiki and they might do it, but generally that isnt how you're supposed to do it [22:17:44] Wiki Manager assignment on Fandom is greatly flawed, yes. Every wiki should have one. [22:18:08] i think the amount of wikis with assigned reps is actually decreasing [22:18:20] especially after pcj left, since he was rep for something like 600 wikis [22:18:41] All of PCJs wikis were redistributed. [22:19:00] Or at least 95% of them were. 
[22:19:01] yeah probably by now, it took them a long time to do that though [22:19:18] i am active on a wiki he repped and we didnt have anyone to contact for many months [22:19:28] Thanks for clarifying, my perspective is obviously not as expansive, this is what I could scrape together from user-facing documentation [22:19:57] Good to learn more about the other side of operations [22:20:09] One interesting thing about WikiFactory is the part which tells how many wikis are using an extension, probably something that would be neat for ManageWiki. [22:20:21] there is a magic word for that [22:20:29] {{NUMBEROFWIKISBYEXTENSION:visualeditor}} [22:20:29] https://meta.miraheze.org/wiki/NUMBEROFWIKISBYEXTENSION:Template:visualeditor [22:20:37] oops [22:20:41] A great portion of that is just their own proprietary extensions though, no? [22:21:03] I've seen how many in-house extensions they have enabled on their wikis and it's vast [22:21:30] yeah most are proprietary, at least in the installed-by-default set [22:21:38] Yeah but they also have most of the extensions you’d find on MW.org that are marked as stable. [22:21:54] Don’t forget they also have 95% of all of the extensions GP was using pre-UCP. [22:23:48] Oh really? So basically any big wiki with a WR can request almost any extension on mediawiki.org be enabled? [22:24:06] maybe technically, but i dont think in practice that is true [22:24:45] i had to borderline argue with fandom staff one time to explain why a wiki i am active on needed abusefilter before they agreed to enable it [22:24:48] Theoretically, if it’s stable and enough wikis are already using it. [22:25:26] enough wikis on Fandom? [22:25:29] or in general [22:25:33] On Fandom. 
[22:25:34] also abusefilter on fandom is kind of annoying because after 20 hits in a short period it throttles and allows the action, and there's a big "retry" button on the popup showing you hit a filter [22:25:37] so if people just click it 20 times [22:25:39] it goes through [22:25:49] you have to set up a secondary duplicate filter that blocks them if they try too many times or else it can be bypassed [22:26:09] Also WikiFactory compares everything with community central which is very annoying. [22:26:16] very interesting that Fandom has a secret amount of extensions which can technically be enabled but they basically restrict them to big wikis only on request [22:27:59] Thanks is always one I really liked and it got disabled [22:29:59] they are bringing it back on fandom as we speak actually [22:30:09] that is the first extension they are adding as a result of their community survey program [22:30:54] embarrassing that they had to do a community wishlist for that. [22:31:23] lol [22:31:48] really, the thanks extension? it's not even that complicated [22:31:59] It's going to become a MediaWiki default extension in 1.40 anyhow [22:32:04] thats what i said [22:32:12] but i think its ok cause they plan on adding a new one every 1-2 quarters iirc [22:32:17] Oh really? [22:32:23] Fandom will probably upgrade to that in 2056 [22:32:41] its not like fandom is too far behind with versions now [22:32:45] they are upgrading to 1.39 now [22:32:50] some wikis are already on it [22:33:21] is it just me or is this concerning? https://publictestwiki.com/w/index.php?title=TestWiki:Request_permissions&diff=50189&oldid=49605&diffmode=source [22:33:42] actually probably not the place [22:33:59] why is that concerning? 
its publictestwiki [22:34:02] the whole point is to ask for bureaucrat [22:34:17] no….never mind [22:34:36] not worth going into in a public channel [22:34:40] We're enabling Thanks globally when 1.40 comes [22:34:52] in a few months [22:35:15] Does Miraheze upgrade every version, not just LTS? [22:35:19] is it not already enabled? [22:35:27] we upgrade to every version [22:35:48] 1.40 upgrade coming in May, the leadup process has started [22:36:04] Is it a lot of work to keep it upgraded every version? Especially given how many wikis you host [22:36:42] the hard part is making sure all the extensions still work [22:36:53] it can be, we’re lucky to have a great technical team to help with the upkeep of all that [22:37:32] yup, believe we’re planning to start that soon but not 100% [22:38:06] Yeah, making sure extensions work is 100% the hardest part [22:38:28] next version will be painful [22:38:33] We don't employ any proprietary changes to MediaWiki so there's no real issue having to fix our MW base code every version [22:38:34] Extensions are always fun, some are very professional, others are completely broken and years out of date! And it's up to you to sort out what's what [22:38:49] Upgrading MediaWiki for all wikis takes about 2-3 hours [22:38:57] we remove out-of-date extensions [22:39:04] miraheze doesn't bother with extensions that are totally abandoned, particularly insecure or don't work in the first place [22:39:06] not sure if there’s been any lately but we’ve done so before [22:39:14] Sounds perfectly reasonable to me [22:39:22] a forum extension was just disabled because of security issues iirc [22:39:27] Also I like that you use the portal navigation template on Miraheze Meta; I developed it as a WMF intern in 2017 and it seems to have since been picked up in other locations [22:39:49] oh? when you say portal nav could you elaborate a bit? [22:39:58] which one [22:40:09] I did it in 30 minutes last time. 
[22:40:11] i’m not entirely sure what you’re referring to haha [22:41:16] You did, thank you for that! Extremely useful template! [22:41:38] oh the template haha i thought you meant an extension [22:41:51] thank you! [22:42:01] I wouldn't doubt if you said you did it in 5 minutes, you're quite fast lol [22:43:39] lol [22:50:17] in september of 2021 there were 3.1k fandom wikis with assigned reps, today there are 1,350 [22:50:42] it's not good [22:55:49] Yeah even internally we were never convinced the programme would be a permanent thing and it looks to be dying out. [22:56:11] Ouch, that's a shame. [22:56:23] not surprising [22:56:29] it's not a scalable system [22:56:34] unless you hire more wrs [22:57:49] GP managed it. [22:58:16] GP had a lot fewer wikis [22:58:44] Fandom doesn’t care about community, that’s why they aren’t bothered. [22:58:54] They just care about how big Brandon Rhea’s wage will be. [22:58:55] ironic for a website named fandom [23:05:16] generating a list rn of every extension enabled on every fandom wiki that has a rep [23:05:26] to see which ones are commonly enabled [23:07:14] they probably have a `Special:Version` [23:07:30] yes but not every wiki has every extension [23:07:38] so im using the special:version api on every wiki with a rep [23:07:45] and then combining the data from each [23:07:52] to get a list of extensions on all big wikis [23:09:40] [1/60] excluding all globally-enabled extensions, here is a list of extensions with how many wikis have each one (remember this is only wikis that have wiki reps, 1338 of them in the dataset) [23:09:40] [2/60] ```json [23:09:41] [3/60] { [23:09:41] [4/60] UploadNewImages: 1238, [23:09:41] [5/60] ArticleExporter: 1333, [23:09:42] [6/60] DiscussionPermissions: 1244, [23:09:42] [7/60] DiscussionModeration: 1244, [23:09:42] [8/60] DiscussionMaintenance: 1238, [23:09:43] [9/60] FandomDesktop: 1333, [23:09:43] [10/60] ParserHooks: 1238, [23:09:43] [11/60] Validator: 1238, [23:09:44] [12/60] 
PortableInfoboxBuilder: 1144, [23:09:44] [13/60] VisitSource: 1144, [23:09:45] [14/60] SpecialChangeFandomEmail: 1238, [23:09:45] [15/60] Achievements: 665, [23:09:46] [16/60] Variables: 274, [23:09:46] [17/60] Loops: 121, [23:09:47] [18/60] DynamicPageList3: 340, [23:09:47] [19/60] Arrays: 89, [23:09:48] [20/60] TabView: 390, [23:09:48] [21/60] DiscussionsAbuseFilter: 267, [23:09:49] [22/60] AbuseFilterBypass: 372, [23:09:49] [23/60] Gadgets: 142, [23:09:50] [24/60] GadgetOverride: 142, [23:09:50] [25/60] 'Abuse Filter': 372, [23:09:51] [26/60] Editcount: 75, [23:09:51] [27/60] MessageWall: 1044, [23:09:52] [28/60] 'Regex Fun': 14, [23:09:52] [29/60] VariablesLua: 33, [23:09:53] [30/60] SemanticMediaWikiHelpers: 15, [23:09:53] [31/60] 'Semantic Drilldown': 8, [23:09:54] [32/60] SemanticMediaWiki: 15, [23:09:54] [33/60] SemanticScribunto: 6, [23:09:55] [34/60] LuaCache: 24, [23:09:55] [35/60] PageForms: 30, [23:09:56] [36/60] ImageSizeInfoFunctions: 26, [23:09:56] [37/60] TemplateSandbox: 6, [23:09:57] [38/60] RegexFunctions: 20, [23:09:57] [39/60] Popups: 14, [23:09:58] [40/60] TextExtracts: 100, [23:09:58] [41/60] ScryfallLinks: 2, [23:09:59] [42/60] Mermaid: 4, [23:09:59] [43/60] SemanticResultFormats: 4, [23:10:00] [44/60] WikiHiero: 3, [23:10:00] [45/60] UserActivity: 1, [23:10:01] [46/60] NewWikis: 2, [23:10:01] [47/60] DefaultLinks: 10, [23:10:02] [48/60] TorBlock: 10, [23:10:02] [49/60] FounderProgressBar: 632, [23:10:03] [50/60] DeleteBatch: 4, [23:10:03] [51/60] UploadFields: 3, [23:10:04] [52/60] Widgets: 41, [23:10:04] [53/60] CustomLogs: 1, [23:10:05] [54/60] 'Highlight Links in Category': 1, [23:10:05] [55/60] PDFEmbed: 4, [23:10:06] [56/60] FlaggedRevs: 4, [23:10:06] [57/60] MagicNoCache: 1, [23:10:07] [58/60] Spoilers: 2, [23:10:07] [59/60] 'PvX Code': 1 [23:10:08] [60/60] }``` [23:10:08] so that gives you an idea of what extensions fandom offers (and not all of them are ones they will enable for you) [23:32:03] Wtf is Mermaid [23:36:42] lol good 
question [23:36:55] https://www.mediawiki.org/wiki/Extension:Mermaid [23:37:30] not going to even try to read that [23:37:32] makes my head hurt [23:37:34] hahah [23:59:31] Oh, it makes charts really easily! I was messing with it during betaheze testing
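The tally above was described as hitting the Special:Version data via the API on every repped wiki and combining the results. A minimal sketch of that approach, assuming the standard MediaWiki `action=query&meta=siteinfo&siprop=extensions` endpoint; the wiki URL in the usage comment is a placeholder, not a real dataset:

```python
# Sketch of the aggregation described above: query each wiki's
# extension list via the MediaWiki API and tally how many wikis
# have each extension enabled.
import json
from collections import Counter
from urllib.parse import urlencode
from urllib.request import urlopen

API_QUERY = {
    "action": "query",
    "meta": "siteinfo",
    "siprop": "extensions",
    "format": "json",
}

def fetch_extensions(api_url: str) -> list[str]:
    """Return the names of extensions enabled on one wiki."""
    with urlopen(f"{api_url}?{urlencode(API_QUERY)}") as resp:
        data = json.load(resp)
    return [ext["name"] for ext in data["query"]["extensions"]]

def tally(extension_lists: list[list[str]]) -> Counter:
    """Combine per-wiki extension lists into name -> wiki count."""
    counts = Counter()
    for names in extension_lists:
        counts.update(set(names))  # count each wiki at most once
    return counts

# Hypothetical usage:
# wikis = ["https://example.fandom.com/api.php"]  # placeholder list
# print(tally([fetch_extensions(url) for url in wikis]).most_common())
```

Deduplicating with `set(names)` before updating the counter is what makes the numbers "wikis with this extension" rather than raw occurrences.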