[04:48:52] so, my wiki is quite messed up right now, I'm thinking of starting over again, but with dumps exported first, of course [04:50:05] Is it actually possible? [06:45:25] what do you mean by quite messed up [07:20:28] [1/2] https://discord.com/channels/1088870295412146347/1092731722602381414/1184031921496408135 [07:20:29] [2/2] i thought it was one for like 3 months now lol [07:21:01] We've been battling with the taxman for 6 months now to get the certification heh [07:21:07] Technically it applies retroactively since July 9th, but we were only just actually approved [07:21:16] oh [07:21:24] does the no-tax status apply to all countries? [07:35:46] like, errors everywhere, mostly it might be me messing with the settings [07:36:22] I would say a dump and rebuild is not going to fix that [07:41:21] This is huge news. Congratulations on completing the process, and I look forward to seeing where the farm goes from here. There's a bright future ahead [07:49:06] I know, but I'm thinking about self-hosting my wiki instead [07:50:24] With Wikibase, that's probably the best choice while you figure out what's going on [07:51:17] but I only have a cPanel with 2 GB of storage lol [07:51:34] so I'm not sure if I can install 1.41 on that [07:52:03] I know about 1.40, but I want to try 1.41 first, for the MediaSpoiler extension [07:57:37] but even then, I can't do it without a dump [07:57:46] We can upgrade wikis to 1.41 whenever; if you want to upgrade your wiki to 1.41, we can do it. [07:58:06] oh I do [07:58:28] We have a way to put different wikis on different MediaWiki versions now [07:58:41] Well, 1.40 or 1.41, just to make the upgrade process easier. [07:59:01] I can handle upgrading your wiki tomorrow if you want [07:59:16] *wiki(s) [07:59:53] that's nice then [08:00:17] DM me the wikis you want upgraded so I don't forget [08:00:20] uh and... 
still, we have to figure out the issues with my wiki first [08:00:45] We can do that after the upgrade, and I can install MediaSpoiler for 1.41 also [08:01:01] oh, ok then- [08:03:02] Yep, I developed our versioning system at the very beginning of launch to make it so we can more easily upgrade and maintain things. There are options in our mediawiki-repos.yaml file for a lot of things also, including extensions to install only for specific versions and whatnot. [08:31:25] This is great news [11:36:24] [1/2] I have no wanted pages/categories etc. [11:36:24] [2/2] Also my wiki still has an internal error from yesterday. [11:37:06] Wanted Pages and Categories are disabled because Miser Mode has been enabled, if I remember correctly [11:38:09] Miser Mode is global, I forgot to add [11:38:34] And about the error, meh, you aren't alone [15:59:05] Please help with this when you have time @Site Reliability Engineers. Google Search Console is failing to index my wiki’s pages now. 😥 [16:00:38] Sorry about that. Should work now [16:01:11] On what pages do you see the error? [16:02:34] [1/2] You can see an example on: [16:02:35] [2/2] https://totaltennis.wikitide.org/wiki/Main_Page [16:02:41] Thank you! 😊 [16:07:24] At any time, when you are ready, can you also upgrade `lhmnwiki` to 1.41? 
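For the dump-and-rebuild idea discussed above, MediaWiki's core maintenance scripts cover the export/import cycle; a rough sketch, assuming shell access and a standard install layout (on cPanel shared hosting without shell access, Special:Export and Special:Import do the same job for smaller wikis):

```shell
# Export the current revision of every page as an XML dump
# (use --full instead to keep the whole revision history).
php maintenance/dumpBackup.php --current > pages.xml

# On the target wiki, import the dump...
php maintenance/importDump.php pages.xml

# ...then rebuild derived data afterwards.
php maintenance/rebuildrecentchanges.php
php maintenance/runJobs.php
```

Note that XML dumps carry pages and revisions only; uploaded files, users, and LocalSettings.php configuration have to be migrated separately.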
[16:14:46] Only my front page. I believe it's the Vietnam wiki which has also been getting a similar error over the last week [16:56:01] All the pages on my wiki aren't accessible; neither are yours [17:00:09] [1/16] ```Error: Call to undefined method FallbackContent::getText() [17:00:10] [2/16] Backtrace: [17:00:10] [3/16] from /srv/mediawiki/1.40/includes/content/TextContentHandler.php(245) [17:00:10] [4/16] #0 /srv/mediawiki/1.40/includes/content/ContentHandler.php(1747): TextContentHandler->fillParserOutput(FallbackContent, MediaWiki\Content\Renderer\ContentParseParams, ParserOutput) [17:00:10] [5/16] #1 /srv/mediawiki/1.40/includes/content/Renderer/ContentRenderer.php(47): ContentHandler->getParserOutput(FallbackContent, MediaWiki\Content\Renderer\ContentParseParams) [17:00:11] [6/16] #2 /srv/mediawiki/1.40/includes/Revision/RenderedRevision.php(259): MediaWiki\Content\Renderer\ContentRenderer->getParserOutput(FallbackContent, MediaWiki\Title\Title, integer, ParserOptions, boolean) [17:00:11] [7/16] #3 /srv/mediawiki/1.40/includes/Revision/RenderedRevision.php(232): MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached(FallbackContent, boolean) [17:00:11] [8/16] #4 /srv/mediawiki/1.40/includes/Revision/RevisionRenderer.php(242): MediaWiki\Revision\RenderedRevision->getSlotParserOutput(string, array) [17:00:12] [9/16] #5 /srv/mediawiki/1.40/includes/Revision/RevisionRenderer.php(164): MediaWiki\Revision\RevisionRenderer->combineSlotOutput(MediaWiki\Revision\RenderedRevision, array) [17:00:12] [10/16] #6 [internal function]: MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}(MediaWiki\Revision\RenderedRevision, array) [17:00:12] [11/16] #7 /srv/mediawiki/1.40/includes/Revision/RenderedRevision.php(199): call_user_func(Closure, MediaWiki\Revision\RenderedRevision, array) [17:00:13] [12/16] #8 /srv/mediawiki/1.40/includes/poolcounter/PoolWorkArticleView.php(87): MediaWiki\Revision\RenderedRevision->getRevisionParserOutput() [17:00:13] [13/16] #9 
/srv/mediawiki/1.40/includes/poolcounter/PoolWorkArticleViewCurrent.php(92): PoolWorkArticleView->renderRevision() [17:00:14] [14/16] #10 /srv/mediawiki/1.40/includes/poolcounter/PoolCounterWork.php(166): PoolWorkArticleViewCurrent->doWork()``` [17:07:44] is there any way of checking the status of a wiki after you close the tab? i forgot to bookmark it so I can't check it [it seems?] [17:08:39] If you go to https://meta.wikitide.org/wiki/Special:CentralAuth and search your user, you can see all wikis you've ever visited [17:10:44] Sorry, should have specified - I mean a wiki request's status [17:13:16] i.e. can i get back to the page it describes here [17:17:00] All requests have been processed, so you should be able to see whether it was approved or not at [[Special:Notifications]] [17:17:42] weeeeeeeirrrd [17:19:01] ig ill resubmit? [17:20:02] Oh, if you're looking for wiki requests there's another method [17:20:33] https://meta.wikitide.org/wiki/Special:RequestWikiQueue, then put in your user name, change status to all, then search [17:24:38] hm [17:25:36] is there any way to dispute a denial or no? [17:26:40] Depends on the cause, but you can edit the request description to reopen it for consideration and leave a comment with your rationale. [17:26:51] alr [17:27:09] Usually a reviewer will explain in comments why a request was declined [17:29:29] yeah i saw it [17:29:35] resub'd [17:39:41] so I get a new internal error on totaltennis.wikitide.org [19:10:38] It's not new; it's just that detailed traces have been enabled in our view [19:11:12] it's new on my wiki, as it hasn't shown that before [21:49:57] typically backtraces aren’t visible on production wikis; you would only get a more concise description of the error [22:52:20] [1/4] Is there any good, efficient way of referencing a table and performing a data pull similar to a vlookup? For example, let's say I have a table of 200 rows and 4 columns. 
Each row is for an individual item, and its particular features... [22:52:20] [2/4] Item | Value | Craftable | Use [22:52:21] [3/4] ThingA | 24 | No | Quest [22:52:21] [4/4] ThingB | 0 | Yes | Crafting [22:57:00] So are you looking for a Flagicon/Country data type template to pull from? [22:58:11] [1/2] https://en.m.wikipedia.org/wiki/2023_WTA_1000_tournaments [22:58:12] [2/2] Or something like this which will just pull sections [23:00:51] Possibly. The specific use case would be a list of RPG items: the specific stats of each item, the values to purchase and sell it, its crafting recipe, and which crafting recipes it's used in. Preferably a central table, e.g. wiki/Tables:AllItems [23:02:29] I don't want to call the full table if I don't have to, but I also don't want a module for each specific item, as that seems like it defeats the purpose [23:05:46] The eventual goal is to have infoboxes on each specific item's page that display useful information to the end user, while having a centralized repository for updating information, stats, etc., and having that info reflected on any pages that reference a given item [23:50:48] So the two paths to solve that are going to be complex Lua modules + templates or Semantic MediaWiki, neither of which is clean or fast to do. [23:51:23] Might also be doable with Cargo or Wikibase, but all of this pushes at the edge of my knowledge. [23:52:50] I do plan on a deep dive into SMW one of these days, but all wiki-related projects are on hold for me until June, once I'm out from under this master's program. 🙂
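On the Lua-module path mentioned above, the usual pattern is one central data module read through Scribunto's `mw.loadData`, plus a small lookup module; a minimal sketch, with hypothetical module names (`Module:ItemData`, `Module:ItemLookup`) and the columns from the example table:

```lua
-- Module:ItemData (hypothetical name): the central table, one entry per item.
-- mw.loadData caches this once per page render, so repeated lookups stay cheap.
return {
    ThingA = { value = 24, craftable = 'No',  use = 'Quest' },
    ThingB = { value = 0,  craftable = 'Yes', use = 'Crafting' },
}
```

```lua
-- Module:ItemLookup (hypothetical name): a vlookup-style field fetch.
local p = {}
local data = mw.loadData('Module:ItemData')

-- Wikitext usage: {{#invoke:ItemLookup|get|ThingA|value}} -> 24
function p.get(frame)
    local row = data[frame.args[1]]  -- arg 1: item name (the lookup key)
    if not row then
        return ''                    -- unknown item: render nothing
    end
    return tostring(row[frame.args[2]] or '')  -- arg 2: column name
end

return p
```

An infobox template can then invoke the lookup once per field, passing the page's item name through, so item pages stay in sync with the central table. Cargo or SMW would replace the hand-rolled table with declared fields and queries, at the cost of extra setup.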