[00:00:56] oooo noted [00:01:11] you’d be better off doing it locally [00:01:42] I don't have much experience with this. Is there a Special page that lets you install fonts directly? [00:01:55] Thanks for the help though, appreciate it. [00:02:20] i've just been uploading them the same way you upload an image, then going to the font's page in the file list and grabbing the URL that starts with static. [00:02:46] Funny that we were synchronized on this issue lol [00:03:05] posted an example of common.css but blocked by this server, sorry [00:04:19] but yeah basically what you wrote, for the URL just paste the File link starting "static.*wikitide.net/[yourwiki]/x/x/[fontfilename].woff2" [00:04:41] it used to be miraheze.org and that change is what broke it for me today [00:05:18] IIRC it has to be a .woff2 font file? I may be wrong on that but that's what I've used [00:05:41] I think I used a cloud converter for that [00:05:51] links are blocked to non #verify ed [00:05:54] Just added woff3 as a file type for upload, so it might take some time to adjust in the backend [00:05:57] what i meant yea [00:06:54] [1/6] > @font-face { [00:06:54] [2/6] > font-family: "Proxima Nova Alt"; [00:06:55] [3/6] > src: [00:06:55] [4/6] > local("Proxima Nova Alt"), [00:06:55] [5/6] > url("[the url you got from the file you uploaded]") format("woff2"); [00:06:55] [6/6] > } [00:06:56] there we go! [00:08:33] Is the class.name coding required? [00:10:09] I don't have it? But I also barely know what I'm doing, so XD [00:10:17] I feel that [00:10:49] [1/3] I am trying to apply the font now to a separate page using some code I found online: [00:10:49] [2/3] TEST/span [00:10:49] [3/3] But that is unfortunately not working [00:10:58] Have you gotten it to apply to your pages? [00:13:01] @magebunkshelf [00:13:25] looks right to me? [00:13:43] Rip [00:14:10] Okay it ended up working, was just taking time to update [00:14:19] Well thanks for the help. Really appreciate that. 
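Side note on the .woff2 question above: you can sanity-check a converted font locally before uploading, since each container format starts with well-known magic bytes (WOFF2 files begin with `wOF2`). A minimal Python sketch, not from the original conversation; the file you point it at is whatever your converter produced:

```python
def font_flavor(path: str) -> str:
    """Identify a font container by its first four bytes (the magic number)."""
    with open(path, "rb") as f:
        magic = f.read(4)
    return {
        b"wOF2": "woff2",            # WOFF 2.0 -- what format("woff2") expects
        b"wOFF": "woff",             # WOFF 1.0
        b"\x00\x01\x00\x00": "ttf",  # TrueType sfnt
        b"OTTO": "otf",              # CFF-flavoured OpenType
    }.get(magic, "unknown")
```

If this reports `woff2`, the `format("woff2")` hint in the @font-face rule above matches the actual file contents, whatever the extension says.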
[00:14:47] ohh gotcha, yeah changes to common.css can sometimes take up to ten minutes to push through, might've been that? [00:14:55] though usually quicker [00:15:42] Ye, we're all good [00:15:58] Thanks, that was excellent timing to deal with that problem <3 [00:37:40] Why can't I find the Translate or DiscussionTools extensions in ManageWiki/extensions? I tried searching and they are not there. [00:39:45] jagoda123456769: DT is available as "Discussion tools" for some bizarre reason [00:39:53] i can see translate though [00:40:54] I see DT [00:41:03] But i can't see Translate [00:57:35] [1/2] is there a way to make the random page button exclude certain namespaces? it includes a bunch of talk pages and user pages etc. which means it never actually reaches the pages of actual information [00:57:36] [2/2] https://cdn.discordapp.com/attachments/407537962553966603/1307872300829511710/image.png?ex=673be27f&is=673a90ff&hm=48d8effa612146e3761d64b9eb0e631814b659a5a53cf41a07b06959edec3c4d& [01:00:12] I think you can have it only do mainspace [01:00:46] I’m not sure how, maybe Special:RandomPage/Main? [01:08:45] doesn't seem to work [01:08:46] odd [01:11:03] Is there a way to have a link point towards a specific category? Since the normal code just adds a category to the page you're on. [01:12:49] [1/2] oh. my issue is weird and specific that's why [01:12:50] [2/2] the wiki i'm working on was imported from fandom which includes a bunch of weird Fandom-specific namespaces that didn't get ported over [01:12:54] i'll add those and see if it works [01:13:39] <.guardianx., replying to ericbomb> `[[:Category:Cleric]]` [01:13:39] https://meta.miraheze.org/wiki/:Category:Cleric [01:15:15] <.guardianx.> relay bot immortalizing my mistake LoL [01:16:13] Oh ty! I didn't think about the fact that starting a wiki would feel similar to learning a new programming language with a whole bunch of syntax things, but it sure does! 
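On the random-page question above: if Special:RandomPage/Main doesn't cooperate, the MediaWiki Action API can restrict random picks to the main namespace with `list=random&rnnamespace=0`. A sketch that only builds the request URL (the hostname in the example is a placeholder, not a wiki from this chat):

```python
from urllib.parse import urlencode

def random_mainspace_url(api_base: str) -> str:
    """Build an API request returning one random page from the
    main (article) namespace only, skipping talk/user pages."""
    params = {
        "action": "query",
        "list": "random",
        "rnnamespace": 0,  # 0 = main namespace
        "rnlimit": 1,
        "format": "json",
    }
    return f"{api_base}?{urlencode(params)}"
```

Fetching that URL returns JSON with one random mainspace title; the `rnnamespace` filter is what keeps talk and user pages out of the results.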
[01:17:56] <.guardianx.> Oh ya, especially when you start trying to do modules in Lua and using JS. [01:18:38] <.guardianx.> I jumped in thinking, "It's fine, they have Lua and JS, I'll just use those to make "automation" of things easier!" Nooope lol [01:25:34] ✨ wikitext [01:26:10] If you want a reason to use Lua over wikitext I can definitely give ya one :MeinaWink: [01:26:53] sniffed them all out and it's working now. swag [01:27:28] What's the recommended way to make a gallery of images centered? Can the <gallery> tag allow you to horizontally center images? Is there an extension with a gallery-like function that can do it? Or should I just make a table? [01:28:09] [1/2] https://pandorastale.wiki/wiki/Pandora's_Tale_Wiki#Characters [01:28:09] [2/2] I want to move the two images in the "main characters" gallery to the center. [02:07:53] <.guardianx., replying to pixldev> Oh I love using languages over markup, the issue is I found out that a good portion of what I was looking to accomplish was fairly difficult to accomplish with my level of confidence. [02:11:44] You should be able to use
<center> tags [02:16:15] Returning to my fonts question, the font is now displaying on the PC site, but is not displaying on mobile. [02:16:53] Is there a way to fix that? [02:28:36] {{MediaWiki:Mobile.css}} [02:28:36] https://meta.miraheze.org/wiki/MediaWiki:Template:Mobile.css [02:28:37] [02:32:28] does anyone know why multiple instances of the image crop template on one page just...... screw up on mobile (or maybe that's just me) [02:41:16] Mobile caching is also served differently, as I recall. So it may take longer to display changes. [02:55:26] Thanks folks [02:55:33] Phew I have a lot to learn. [02:55:36] Really appreciate it [03:10:27] the google fonts broke for mobile lol, they're fine on desktop but wouldn't load on mobile [03:13:47] Next font-related question. Is there a way to apply a custom font to an article title? [04:13:51] And the most important thing you have to learn is to just accept that sometimes, trying to fully understand MediaWiki is an exercise in futility [04:16:26] hi [04:37:35] [1/8] > MediaWiki internal error. [04:37:35] [2/8] > [04:37:35] [3/8] > Original exception: [0e8eea964e220d685dec9b9d] 2024-11-18 04:36:09: Fatal exception of type "Wikimedia\Rdbms\DBQueryError" [04:37:36] [4/8] > [04:37:36] [5/8] > Exception caught inside exception handler. [04:37:36] [6/8] > [04:37:37] [7/8] > Set $wgShowExceptionDetails = true; at the bottom of LocalSettings.php to show detailed debugging information. [04:37:37] [8/8] … getting errors on some User: pages… on varying wikis… [04:40:02] I seem to have been logged out and can't log back in. [04:40:23] (on both electowiki.org and meta.miraheze.org) [04:41:22] yeah same im getting an error message [04:45:29] that’s me [04:45:31] one moment [04:47:58] [1/2] Recent Changes is no longer working. 
[04:47:58] [2/2] https://cdn.discordapp.com/attachments/407537962553966603/1307930276525244468/Screenshot_20241117_214716_Chrome.jpg?ex=673c187e&is=673ac6fe&hm=c75edac51c83389334cabbe7ceaf200215772f3487377ac2946da5eeebe71902& [04:48:00] https://cdn.discordapp.com/attachments/407537962553966603/1307930285081493524/IMG_5855.png?ex=673c1880&is=673ac700&hm=a366c6e982fa6787a24e079136d05d8e7ee4c27046297ee07512790a2441828c& [04:48:11] seems to be affecting all wikis [04:48:34] nice it looks like we got a server banner [04:48:35] … can still access some…? [04:48:38] once more, give it a moment [04:48:55] <.labster> Actually it's just some pages on some wikis [04:48:57] my wikis working fine [04:49:00] all pages including RC [04:49:57] <.labster> https://allthetropes.org/wiki/Special:RecentChanges works fine, get the missing centralauth error on https://allthetropes.org/wiki/Talk:Main_Page [04:50:47] all should work now [04:51:22] <.labster> so weird to see wikis down but only on some pages. [04:51:32] <.labster> it does work now [04:51:32] thanks agent :lyriaSalute: [04:57:35] …right as I made a task for it [04:57:41] welp you can close this I guess https://issue-tracker.miraheze.org/T12899 [06:01:00] [1/5] Still getting this error: [06:01:00] [2/5] MediaWiki internal error. [06:01:01] [3/5] Original exception: [9f7d2f09891b9e506e5789e0] 2024-11-18 06:00:06: Fatal exception of type "Wikimedia\Rdbms\DBQueryError" [06:01:01] [4/5] Exception caught inside exception handler. [06:01:01] [5/5] Set $wgShowExceptionDetails = true; at the bottom of LocalSettings.php to show detailed debugging information. [06:21:03] <.labster, replying to robc_34116> That's likely a different error. Can you link to the page you are trying to view? (You will need to #verify before posting links) [06:22:07] Hey Lobster. It's hit and miss. This one has the error at the moment: https://woldiangames.miraheze.org/wiki/Wold_History [06:22:25] Lol. @.labster . 
Not Lobster 🙂 [06:22:59] 🦞 [06:23:25] I can see that @rks.xyrinfe had the same error about 90 mins ago [06:23:54] I will check the logs real quick and see if I can find a quick fix but I am heading off for the night soon. [06:24:30] <.labster> Originally it came from my 9th grade biology teacher mispronouncing "lobster" which we were dissecting at the time. It was close enough to my name that I adopted "labster". It's been my nick for 25 years now. [06:25:47] <.labster> #iamveryold #isthisbratlikethekidssay [06:27:51] [1/4] @agentisai this seems to be an error related to virtual domains again: [06:27:52] [2/4] ``` [06:27:52] [3/4] Table 'woldiangameswiki.matomo' doesn't exist [06:27:52] [4/4] ``` well that is good, as it should be on mhglobal, but it doesn't seem to read that. [06:36:34] Thanks @cosmicalpha [09:29:17] Any update @agentisai? [09:45:23] [1/2] hey guys, I'm having a problem - I was logged out overnight and I'm trying to log back in but I keep getting this error: [09:45:23] [2/2] https://cdn.discordapp.com/attachments/407537962553966603/1308005122500198431/image.png?ex=673c5e32&is=673b0cb2&hm=b8452f85d49e6a17afebe9b737288556282139a22a164e6851148f9854e85bae& [09:45:38] What do I do? [09:46:06] Does it still happen? [09:46:15] Give me a second let me try something [09:47:17] (leaving home rn so please @ me if you have a solution!) [09:49:43] I think I made things worse... [09:53:22] ugh [09:57:30] Fixed @robc_34116 [10:01:04] checked and other people can log in; it's a problem on my end specifically. Any ideas? [10:01:29] try logging in in incognito or another browser, or clearing your cookies. [10:02:44] That worked, thank you so much! [10:02:54] No problem! [11:48:31] [1/2] any way to not do this? 
trying to make a redacted template where people can just add how many bars they need [11:48:32] [2/2] https://cdn.discordapp.com/attachments/407537962553966603/1308036113679777852/image.png?ex=673c7b0f&is=673b298f&hm=302ec7782acf80403283b51a09880cc40c6c66a398b26adaf52a15049782201d& [11:48:53] verdessence: i'd try a small bit of lua tbh [11:49:13] https://tenor.com/view/tigger-tigger-winnie-the-pooh-tigger-sad-disappear-gif-25009198 [11:49:18] there goes the days of being lua free.... [11:49:37] oh wait [11:49:41] there's an enwiki template for that [11:49:48] https://en.wikipedia.org/wiki/Template:Loop [11:49:55] that, however, requires importing from enwiki [11:50:41] [[mw:Extension:Loops]] can do it too [11:50:41] https://www.mediawiki.org/wiki/Extension:Loops [11:50:42] [11:53:30] Hello [11:54:16] meow :3 [12:00:11] [1/2] ```[[Redacted|{{#loop:||{{{bars|5}}}|▇}}]]``` [12:00:11] https://meta.miraheze.org/wiki/%23loop:Template: https://meta.miraheze.org/wiki/Redacted [12:00:11] [2/2] this should do it with extension:loops [12:00:55] i got stumped reading the documentation [12:01:43] Wikipedia templates tend to be like that [12:01:47] this should be able to replace everything you have in that screenshot [12:05:46] That said, you may be able to make something work using https://www.mediawiki.org/wiki/Extension:Loops Though it can easily become unwieldy and messy quick as well. Wikitext is really not suitable for actual programming [12:09:17] ohh wait, that was already mentioned, guess I read the whole thing a bit too fast xD [12:11:48] it's possible w/o Lua lol [12:12:07] how? (excluding the loops extension that i forgot about) [12:12:12] a template on my wiki doesn't even use liips [12:12:14] loops [12:12:55] albeit it's fancier in output [12:29:38] [15:37:33] Is Miraheze server down? 
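For anyone decoding the `{{#loop:}}` snippet above: it repeats the ▇ character `bars` times (defaulting to 5, per `{{{bars|5}}}`) and wraps the result in a link to the Redacted page. The same expansion written out in Python, purely as an illustration of what Extension:Loops is doing there:

```python
def redacted_bars(bars: int = 5) -> str:
    """Mimic [[Redacted|{{#loop:||{{{bars|5}}}|▇}}]] from the chat above:
    repeat the bar character `bars` times inside a wikilink."""
    return f"[[Redacted|{'▇' * bars}]]"
```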
[15:37:37] Just white color [15:40:20] im getting error 500s on everything i try now [15:48:03] Down here too [15:48:38] Didn't get a specific error page, just can't reach mira [15:52:50] down ? [15:53:07] <.guardianx.> Yeah parts seem down, I dunno, mine is up, could be auth is down again? [15:55:21] <.guardianx.> [1/3] On a totally unrelated topic... [15:55:21] <.guardianx.> [2/3] I'm trying to learn Wikibase to implement but one question is on my mind at the outset. Is the data I create for my wiki that is based in Wikibase mirrored to the Wikidata..database? [15:55:22] <.guardianx.> [3/3] I don't need that functionality at all so I'm wondering if it's gonna be overkill. [16:00:47] it seems to be just one server but due to session affinity, you're not getting served by other servers [16:01:00] probably a bad opcache [16:01:09] I tried to check Grafana but [16:01:16] My phone [16:01:17] Does not like it [16:01:28] it won't tell you anything [16:01:48] php-fpm was restarted so it should work now [16:10:25] Returned to normal for me [16:10:30] Lost my edits though :) [16:11:10] if you keep the tab open until the error gets fixed, you can just refresh and have your edit go through [16:11:18] it's never lost until you close the tab [16:12:38] Yeah I made the mistake of trying to alt+left the tab [16:16:29] Pro tip, head to preferences, there’s an edit recovery feature you can try [16:16:44] Saves your edit locally every 5 seconds [16:17:12] Lemme see [16:22:19] Turned it on, neat [17:03:16] That seems odd [17:03:37] Using memcached or redis is supposed to eliminate the need to do this kind of stuff [17:03:48] There's no need for sticky sessions in such contexts [17:05:16] yeah [17:05:24] it was set in the CF config to test something [17:06:45] All hail Fastly [17:06:51] Ugh i mean Cloudflare... 
[17:14:10] fastly is basically Varnish as a Service so I agree [18:32:59] There was a reason for session affinity [18:33:08] Can't remember why [18:34:06] Mattermost may say [18:45:48] no [18:46:24] not unless you set up a script to mirror them but that would be excessive and a waste of server resources lmao [18:46:50] <.guardianx., replying to zippybonzo> [1/2] Thank god because...I am already self-conscious about my edits and creates going to a channel here! [18:46:51] <.guardianx., replying to zippybonzo> [2/2] I feel like Wikibase makes dealing with a relational "Database" rough but at least I can change properties pretty dang easily even if those properties are in a non-sequential order...which is triggering as heck. LOL [18:48:09] <.guardianx.> Dealing with a batch of properties, all sequential... [18:48:12] <.guardianx.> https://tenor.com/view/ron-swanson-parks-and-rec-its-so-beautiful-gif-15644547 [18:49:03] <.guardianx.> move on to a new batch, halfway in...remember I need to add a new property to the last batch and now it's no longer sequential. [19:09:02] [1/4] How do I put fonts on the Cosmos skin, does it just not load unlike Monobook? [19:09:02] [2/4] I'm making my spare wiki look like Wikia until I find what to use it for [19:09:02] [3/4] https://cdn.discordapp.com/attachments/407537962553966603/1308146969960779856/Screenshot_20241118-110504.png?ex=673ce24d&is=673b90cd&hm=f27dc4b618a5c4cbf6fd4e2f84ad03210cfef6c2f389e489d459f0033da502f4& [19:09:03] [4/4] https://cdn.discordapp.com/attachments/407537962553966603/1308146970329747487/Screenshot_20241118-110638.png?ex=673ce24d&is=673b90cd&hm=66c0a71ab5ac8c2ed84adecd1305dbb58e0093282402d3102e99afdddf399e5f& [21:09:07] finally got the dump from fandom [21:09:42] 🎉 [21:16:53] :Partyheze: [21:17:44] Now I just gotta request admin and hopefully find other editors [21:17:50] and figure out what to do about the images [21:25:55] Hm? [21:26:13] Exporting images? 
[21:28:32] Yeah, I need to hold a local election first for rights I think? unless that can be skipped after a certain period of inactivity [21:29:03] Is it an existing wiki you’re taking over? [21:29:06] yeah, I don't think the fandom dumps include images? So that has to be done by hand [21:29:12] yeah [21:29:23] How many images? [21:29:50] If there’s any chance of there being local editors, would be a local election yes [21:30:06] at least a few thousand, some of them are probably not used though [21:30:20] If it’s definitely absolutely dead I think a direct appointment is possible but that’s not my purview [21:30:28] Ouch [21:30:40] I think I could help with that [21:31:46] [does this count as absolutely dead?]() [21:32:03] would be nice, I think trying to save them by hand would drive me insane [21:32:21] Looks to be activity [21:32:27] May I ask for the link? [21:34:12] https://regretevator.fandom.com/wiki/Special:ListFiles [21:34:24] Let’s see if I remember how this works [21:35:48] Also, how do I start a local election again? [21:35:52] Done! Running job `f701268d-3742-4b25-85be-649ca2276e60`. I will notify you when it updates. Use `/status` with the job ID or use the `Info` button to manually check the job. [21:36:09] That has a non-zero chance of not failing first try [21:36:24] sucks that a chunk of these aren't even used [21:36:26] I think we have a template somewhere [21:36:32] and probably just a waste of space [21:36:43] curse of roblox wikis [21:36:59] As an ambassador to a collection of Roblox wikis [21:37:02] Sounds bout right [21:37:25] @tedkalashnikov how many images did you guys clean out when moving? [21:37:39] Like the games can be good, but there's also a guaranteed mass of children [21:37:53] Yeah definitely [21:38:01] Idk off the top of my head [21:38:04] I don’t actively play but I can appreciate the games regardless [21:38:21] Was there a significant amount of images at all? 
I remember the phighting babies as the main one [21:38:25] https://cdn.discordapp.com/attachments/1016598848212320266/1263549461963997305/pat.gif [21:38:37] I'd say a couple hundred maybe [21:38:49] Wooooo boy [21:38:55] Hm [21:39:05] I downloaded our images by hand [21:39:12] Which was around 1500 or so [21:39:17] we have many more [21:39:22] I think like 7k or something [21:39:37] I wonder if it would be worth cobbling together a script that checks image names against the pages dump to find unused or nearly unused images [21:39:46] It can be done it just takes time, I have arthritis and managed it by pacing myself LOL but I swept and deleted unneeded images first [21:40:05] Yeah [21:40:11] I also had to rename and convert all of them [21:40:18] It was a pain but it was done [21:40:33] The very awesome digital fixed up the discord version of @WikiBot so there’s that to download at least [21:40:41] No need to migrate images that aren't actually used in articles [21:40:46] It's just a waste of time and space [21:41:18] Depending on what tooling you have, you’d save more time mass downloading and removing than manually downloading/filtering [21:41:30] There was no way to mass download when we migrated [21:41:40] At least not easily [21:41:46] Was gonna say [21:41:52] And I was trying not to alert Fandom I was scraping LOL [21:41:54] Wikiteam3 is years old [21:41:59] And a pain in the ass [21:42:06] Yeah but it wasn't openly provided as an option so [21:42:20] I chose to trim the fat so the MH wiki didn't start with tons of garbage [21:42:36] I'm essentially in charge of all of our files so that was my preference [21:42:53] They had a niche IRC bot you could request for a while. Digital was nice enough to finish the discord bot. 
As a bonus it also allows exporting XML from places that don’t like to provide it [21:43:12] Well the point of the bot is to preserve copies of wikis on the Internet Archive [21:43:20] Migration is technically a secondary use [21:43:33] [1/2] :trout: [21:43:34] [2/2] https://cdn.discordapp.com/attachments/407537962553966603/1308185858809266206/image.png?ex=673d0685&is=673bb505&hm=ce21af61fc0d77ff6f3b8cca8b073732045c39f08cfb6891ec219aa79149b33c& [21:43:37] I need to do a file sweep sometime soon [21:43:37] Round of applause to the wiki team over at archiveteam [21:43:48] :TechnoHeartBroken: [21:43:52] I've kept a good eye on PHWiki's files atm but I'm sure there's nonsense in there [21:44:08] If you find another phighting babies lmk [21:44:21] How much of that is used in mainspace? I would just focus on that [21:44:26] Idk if there's a way to see [21:44:43] I think you could bodge a script using grep to filter files with absolutely no uses [21:45:17] Going through the file names in the image folder and using a grep on the XML dump maybe [21:45:35] [[w:grep]] [21:45:35] https://en.wikipedia.org/wiki/grep [21:45:35] [21:46:00] I could try to make a proof of concept later if I finish everything I need to [21:46:11] I think this could be a useful tool in migrations for people [21:47:19] @digitaldragon would you by chance have any personal ideas on how to implement something to filter images in a dump that aren’t used in mainspace? 
My idea was grepping the XML raw, but curious if you might have a better idea :p [21:48:07] https://cdn.discordapp.com/attachments/407537962553966603/1308187006991339542/image.png?ex=673d0797&is=673bb617&hm=682539999d92e077969c768abe44c5d1f0abed10f03c0080b66e7449c30a6785& [21:49:11] [1/2] wtf is stinky pete but blue [21:49:11] [2/2] https://cdn.discordapp.com/attachments/407537962553966603/1308187272885047296/image.png?ex=673d07d6&is=673bb656&hm=a69d598d966c3d81cb466cd94dfd0b46f5b021c29139993d99884e2eb12a796b& [21:50:18] Huh [21:50:47] [1/2] why would you ever need this [21:50:47] [2/2] https://cdn.discordapp.com/attachments/407537962553966603/1308187677853745235/image.png?ex=673d0837&is=673bb6b7&hm=c840154e833f520c7195d3e022f0c5fbf830fd1b69fecb0522d15ffaf31c25c0& [21:52:13] god knows what percent of the images are spam [21:54:25] I’m not sure I want to know [21:54:29] Are they used in mainspace [21:56:38] mainspace? [21:56:46] it's motivational and inspiring, i literally just shed a tear [21:57:58] Articles [21:58:26] Not User: or Talk: pages [22:00:09] Ohh [22:05:29] https://cdn.discordapp.com/attachments/407537962553966603/1308191378081054881/image.png?ex=673d0ba9&is=673bba29&hm=22a5948edafb9198875c02a3aafffbd4268fbaf38dc07e36ddd5b19381f7& [22:07:23] I meant are the spam images used on articles [22:08:52] I don't think so [22:09:29] nope [22:09:34] just user pages [22:09:37] or nothing at all [22:10:13] Okay [22:10:20] Filtering nothing at all is the easy bit [22:15:29] :Partyheze: [22:17:05] I need to finish some posts and I’ll see about making a quick and dirty bash script to try and do this [22:18:38] Tyt! The fact that you’re helping at all is appreciated! 
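A rough cut of the "grep the dump" idea being discussed, sketched in Python rather than the bash mentioned above: normalise underscores to spaces, since MediaWiki treats them as interchangeable in titles, then flag filenames that never appear anywhere in the dump text. It deliberately ignores case folding and files referenced via templates, so treat hits as deletion candidates, not certainties:

```python
def unused_images(dump_text: str, image_names: list[str]) -> list[str]:
    """Return filenames that never occur in the dump's wikitext.

    Underscores and spaces are interchangeable in MediaWiki titles,
    so both the haystack and the needles are normalised to spaces.
    """
    haystack = dump_text.replace("_", " ")
    return [
        name for name in image_names
        if name.replace("_", " ") not in haystack
    ]
```

Restricting this to mainspace uses would additionally mean parsing each page's <ns> element out of the XML, which is where a real XML parser becomes worth the pain.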
[22:21:07] This is something I can see helping a good number of crats, and fewer files for us to host, so win-win [22:21:36] I want to maybe make a little “migrating admin’s toolbox” page so it may be good to make some stuff [22:36:52] Also do I just edit the bureaucrat's talk page to request an election or do the email option? [22:39:51] Yeah you probably want to grep the xml [22:40:22] Issue would be trying to filter by mainspace [22:40:23] Which [22:40:25] Says fix [22:40:31] Don’t [22:41:05] You can contact the crat and ask them to appoint you directly, or stewards can close an election [22:41:07] There might be MediaWiki API paths that expose the data too, which you could check the image list inside the dump against, but wikiteam3 doesn't use them so you'd have to do that yourself [22:42:42] [1/2] This information should be in the xml [22:42:42] [2/2] but then you have to parse the xml [22:42:57] and nobody wants to parse xml :p [22:43:01] Yeah [22:43:23] It also takes a lot more processing power if you’re doing this for thousands of images [22:43:34] what's a good amount of time to expect a response btw? [22:43:54] No idea tbh [22:44:01] Week max probably? [22:44:58] because this guy hasn't done anything since like May, so I'm slightly iffy on the odds of getting a response [22:45:05] Yea makes sense [22:48:13] Ok left something on their talk page, let's hope that works out [22:48:59] a week tends to be standard, if they've been gone since may then an email wouldn't hurt on top [22:50:28] Will definitely do that if I don't hear back [23:16:31] Thanks for your help @cosmicalpha ! 👍 [23:21:46] <.guardianx.> [1/3] I swear I dealt with this a while ago, does anyone recall why this won't parse the text in the array as a wikilink? [23:21:46] <.guardianx.> [2/3] ```{{#arraydefine:mg|{{#statements:MerchantGoods}}}} [23:21:46] https://meta.miraheze.org/wiki/%23arraydefine:Template:mg [23:21:47] <.guardianx.> [3/3] {{#arrayprint:mg|
|@@@@|[@@@@] }}``` [23:21:47] https://meta.miraheze.org/wiki/%23arrayprint:Template:mg [23:33:24] <.guardianx.> https://cdn.discordapp.com/attachments/407537962553966603/1308213499637268610/image.png?ex=673d2043&is=673bcec3&hm=d445326b6babe117a152814665227b867baf9a9c2fe96c65a3a1649be5ab38a2& [23:55:22] If you’re replacing/changing your wiki logo, do you just upload an SVG to the wiki media and use that link? [23:59:47] hey guys so how do I change the default font of my wiki, I've uploaded the woff file to my wiki but don't know how to use it, don't know what a "sitewide CSS using @font-face" is either