[02:33:29] I'm not sure if this is a bug, but after importDump.php, I see "You might want to run rebuildrecentchanges.php to regenerate RecentChanges, and initSiteStats.php to update page and revision counts" and I run those! but even after that, "Recent changes" barely shows any of the content imported. I imported 164 pages (4,684 revisions) and most of the pages don't appear in recent changes.
[02:34:07] I am logged in as the main admin user
[02:35:49] I suppose that pages in the "Module" namespace aren't considered articles? But even so, 3 of the recent changes are exactly that: Module:Transclusion_count/data/R, Module:Transclusion_count/data/M, Module:Template_link_general
[02:36:50] oh actually, hmm... I see one of the entries appears before I even registered the domain for the website, so maybe it's because the timestamps are from older dates
[02:37:29] Aha! I see a bunch more when selecting 30 days. Is there a way I can see the entire history beyond 30 days/no limit in days?
[05:28:12] !wg RCMaxAge | ryzenda
[05:28:12] ryzenda: https://www.mediawiki.org/wiki/Manual:%24wgRCMaxAge
[05:28:30] by default recentchanges will keep data for the past 90 days; that config variable lets you tweak that
[05:29:45] note: what RC keeps in the database and what the UI lets you select are unrelated, however. If you want to enable > 30 days in the UI, tweak $wgRCLinkDays
[05:29:52] !wg RCLinkDays
[05:29:52] https://www.mediawiki.org/wiki/Manual:%24wgRCLinkDays
[05:56:49] How can I link, from a template, to the history of the page on which the template is transcluded?
[14:41:55] Hi Support Team, I just inherited a MediaWiki 1.4 install and need to move it to a new server. I would like to immediately update to a current MediaWiki version. Can you recommend a migration path?
[14:42:17] Step 1) Take a backup of everything you can
[14:43:33] I've done a mysql dump and saved all files via Windows copy
[14:43:57] Beyond that..
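Picking up the RecentChanges retention question answered above: both knobs live in LocalSettings.php. A minimal sketch, with illustrative values rather than recommendations ($wgRCMaxAge takes seconds; $wgRCLinkDays lists the day ranges the UI offers):

```php
// LocalSettings.php -- illustrative values, not recommendations.

// Keep recentchanges rows for 180 days instead of the default 90:
$wgRCMaxAge = 180 * 24 * 3600; // value is in seconds

// Let Special:RecentChanges offer ranges beyond the default max of 30 days:
$wgRCLinkDays = [ 1, 3, 7, 14, 30, 90, 180 ];
```

Note that raising $wgRCLinkDays beyond $wgRCMaxAge is pointless: the UI can only show what is still in the recentchanges table.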
In theory, you should be able to upgrade the wiki to MediaWiki 1.35 (which is an LTS and supported) and have it upgrade everything
[14:44:48] !upgrade
[14:44:49] http://www.mediawiki.org/wiki/Manual:Upgrading
[14:44:51] !export
[14:44:51] To export pages from a wiki, navigate to Special:Export on the wiki, type in the names of the pages to export, and hit "export". See for an example of this form. See also: !import
[14:44:53] !import
[14:44:54] To import a few pages, use Special:Import - you can also import pages directly from another wiki (see !importsources). For mass imports, use importDump.php - see for details. NOTE: when using content from another wiki, follow the LICENSE TERMS; especially, attribute source and authors!
[14:45:02] "If you are upgrading from MediaWiki 1.4 or older, you should upgrade to MediaWiki 1.5 first."
[14:45:10] lol
[14:45:29] What did we break?
[14:45:45] 1.35 still has updates back to 1.2...
[14:47:04] I guess it was due to 1.4 supporting latin-1
[14:59:00] ok, then I'll try that. Since the system comes from shadow IT, the web server has also not been maintained. Therefore, I will do it on the new server. Do you have any hot tips or advice on what I should look out for?
[14:59:30] Has anyone seen styling (RL) break on some pages on a site?
[14:59:55] ?debug=true works, but the default mode for RL is not getting CSS (and JS?)
[15:01:20] Forest: What web server is used currently? Generally we'd advise not to use IIS where possible
[15:02:13] it's an Apache
[15:32:12] thanks reedy and zabe. Have a nice day.
[16:16:34] hexmode[m]: sounds like some kind of an error in one or more modules... said error can usually be rather well /* hidden as a comment */ in a seemingly successful RL request, so it doesn't show up as an error in your browser's dev tools etc.
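The backup-then-upgrade advice above can be sketched roughly as follows. All paths, database names, and credentials are placeholders; run this against a copy first, and note the caveat above that a very old latin-1 era wiki may need an intermediate hop (e.g. via 1.5) before the final LTS upgrade:

```shell
# Rough sketch only -- every name and path here is a placeholder.

# 1. On the old server: back up the database and all wiki files.
mysqldump -u wikiuser -p wikidb > wikidb-backup.sql
tar czf wiki-files-backup.tar.gz /var/www/oldwiki

# 2. On the new server: unpack MediaWiki 1.35 (LTS), restore the database,
#    copy over LocalSettings.php and the images/ directory, then let the
#    updater migrate the schema:
cd /var/www/newwiki
php maintenance/update.php
```

Keep the untouched backups around until the upgraded wiki has been verified; update.php alters the schema in place.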
but rather you have to manually go through the requests (or enable the proper debug log channel, but I can't remember if there even *is* such a thing... maybe just $wgDebugLogGroups['resourceloader'] will do?)
[16:18:41] ashley: ty for the hint. I'll try DebugLogGroups. In the meantime, it looks like RL isn't verifying that the files it is serving are actually minified JS, since the minified JS is a mix of that and what looks like corrupted or compressed data.
[16:23:55] "fun" :-/
[16:53:32] ashley: Problem described here: https://pastebin.com/4inpdYHV -- the RL debug log group doesn't seem to show anything.
[17:05:34] Krinkle: You've done some RL work, right? Could you look at https://pastebin.com/4inpdYHV or tell me someone who could help?
[17:06:22] well *that* is... wild o___O don't think I've seen anything quite like that before. have you already managed to rule out hardware-level faults? somehow this smells more hardware-related than software to me
[17:07:11] it's in AWS, so there is no hardware to look at :P
[17:07:43] the file on disk is good
[17:07:51] the file served from the web is good
[17:08:07] somehow when RL is added to the mix, it goes crazy.
[17:10:45] hexmode: the gaps and distortions appear to correlate with would-be repeated patterns, e.g. "implement" and then "client" where "client" became "cli ___", thus removing the "ent" seen in "implement"; same for "jquery .. jQuery" becoming "jquery .. jQ@"
[17:11:00] hexmode: I assume the web server or proxy etc. has messed up compression, unrelated to load.php
[17:11:57] Krinkle: I think you'd be right, but when you include other modules, they show up properly compressed and then this.... 1s
[17:13:42] presumably it varies on which client races to request it first before it caches it. might be that you have a cache proxy serving a compressed response to an uncompressed curl or something
[17:24:07] Krinkle: https://pastebin.com/Hnfw9hcR -- does RL handle a mix of compressed and uncompressed somehow?
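One cheap way to test the "messed up compression" theory above is to check whether a saved response body starts with the gzip magic bytes (0x1f 0x8b) even though it was served without a Content-Encoding header. A sketch; the file name is a placeholder, and the first line just fabricates a sample body so the snippet is self-contained:

```shell
# Fabricate a gzip-compressed body standing in for a saved load.php response.
printf 'var x = 1; /* pretend this is load.php output */' | gzip -c > response.bin

# Inspect the first two bytes: gzip streams always start with 1f 8b.
magic=$(head -c 2 response.bin | od -An -tx1 | tr -d ' \n')
if [ "$magic" = "1f8b" ]; then
  echo "gzip-compressed"
else
  echo "plain"
fi
```

If a body fetched with `curl` (which sends no Accept-Encoding by default) shows the gzip magic, something between MediaWiki and the client is serving a cached compressed response to clients that never asked for one.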
[17:25:01] (had to chop it up because otherwise pastebin thought I was being nasty and censored me.)
[18:03:51] It does not, and this is not an RL problem as far as I can see.
[18:53:07] Krinkle: I can accept that this is not an RL problem, but I am curious how you would think about the mix of minified JS and garbage in that request. For example, what sources would you look at to figure out how it was getting garbage for one file consistently when the file on disk is OK?
[19:02:46] Maybe memcached or APCu is broken
[19:03:14] This can't be the only thing that's affected
[19:05:54] If you can reproduce it somewhere on a fresh install and/or have ruled out more of the infra (wipe or disable APCu and memc; bypass any proxy etc.). Or specify more of what's different in this setup from a plain single-wiki LAMP stack
[19:29:14] Reedy, is T303455 another caching issue?
[19:29:15] T303455: TypeError: Argument 1 passed to MediaWiki\Extension\Gadgets\GadgetRepo::getGadgetDefinitionTitle() must be of the type string, null given, called in /srv/mediawiki/php-1.38.0-wmf.25/extensions/Gadgets/includes/SpecialGadgets.php on line 114 - https://phabricator.wikimedia.org/T303455
[19:33:52] Do we need to bump the cache version for MediaWikiGadgetsDefinitionRepo?
[19:36:15] "must be of the type string, null given" is a very interesting error if it comes from changing a class's namespace
[19:39:12] https://github.com/wikimedia/mediawiki-extensions-Gadgets/blob/ba83b57d1d69d6b7802373b6670ef3210c35adcd/includes/SpecialGadgets.php#L114 especially if $gadget is not null but $gadget->getName() is
[19:40:07] that's indeed interesting
[19:40:23] zabe: was https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Gadgets/+/769434 intentionally not backported?
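The "bump the cache version" idea above refers to a common MediaWiki pattern: embed a version constant in the cache key, so entries written by older code (here, objects serialized before the class's namespace changed) become cache misses instead of being unserialized into a stale shape. A hypothetical sketch; the class and key names are invented for illustration:

```php
// Hypothetical sketch -- class and key names are made up for illustration.
class ExampleDefinitionRepo {
	// Bump this whenever the cached value's structure (or the classes it
	// contains) changes incompatibly:
	private const CACHE_VERSION = 2;

	private function getDefinitionCacheKey( WANObjectCache $cache ): string {
		// Entries keyed with the old version number are simply never read again.
		return $cache->makeKey( 'example-definitions', self::CACHE_VERSION );
	}
}
```

The old entries are left to expire on their own; the version bump only changes which key new code reads and writes.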
[19:42:10] kind of, I did not really think it was necessary since the class alias fixed the fatals on beta
[19:42:38] but that was something different
[19:46:05] the fatals were due to serialization; this looks like caching (tbh I am quite unfamiliar with the extension)
[19:58:47] I am also confused why the error doesn't show up on any group0 wiki or testwiki
[20:00:39] Krinkle: agreed that this isn't the only thing affected. Other RL URLs have garbage; this is the one I focused on. We changed to $wgUseFileCache = true, which, if I read ResourceLoader.php right, should bypass memcached, and the problem remained (although a comment with a timestamp for the cache was added).
[20:07:02] Use of file cache and memc don't relate in MW in general. One is for caching individual computations, the other is a basic CDN-like layer for entire HTTP responses, which applies to page views and RL responses alike
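For reference, the file cache discussed above is controlled by its own pair of settings; a minimal LocalSettings.php fragment, where the directory path is a placeholder that must be writable by the web server:

```php
// LocalSettings.php -- directory path is a placeholder.
// Caches entire rendered HTTP responses on disk (a basic CDN-like layer);
// it does not replace memcached/APCu, which cache individual computations.
$wgUseFileCache = true;
$wgFileCacheDirectory = "$IP/cache";
```

That distinction is why enabling the file cache, as tried above, does not bypass whatever object cache (memcached/APCu) might be serving corrupted values.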