[02:59:45] https://github.com/wikimedia/mediawiki/blob/d7889acf13777c0cdb22bab1bfb86fa43e1e85e0/includes/Rest/Handler/ParsoidHandler.php#L649
[02:59:53] shouldn't this be removed?
[10:13:48] Guest80 (in case they read the log): not as far as I know, and if I understand what you mean correctly, it's unlikely to be added due to https://www.mediawiki.org/wiki/Parsoid/Extension_API#No_support_for_sequential,_in-order_processing_of_extension_tags
[10:14:48] paladox: looks like there's a task for it 🤷 T339866
[10:14:49] T339866: Remove TemporaryParsoidHandlerParserCacheWriteRatio setting from MediaWiki - https://phabricator.wikimedia.org/T339866
[10:50:52] i have a curious error. after upgrading a mediawiki installation from 2019 (never updated *shudder*) to 1.39 LTS i ran upgrade.php, which ran fine.
[10:53:22] now i get this error: Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/includes/title/MediaWikiTitleCodec.php on line 92
[10:53:34] it's curious for 2 reasons:
[10:54:26] 1. the tried size is less than the limit. 2. i can't get more verbose output with the usual options in LocalSettings.php
[10:56:27] and yes i do understand that 1G of memory is nor good practice nor should increasing RAM be a solution to memory-exhausting scripts. but i tried, since MediaWikiTitleCodec.php line 92 seems to return errors, so i figured: maybe my error is too big?!
[10:56:56] s/nor/neither/
[10:58:14] That's probably some recursion or loop bug there
[10:58:45] Try disabling all extensions to rule out an extension causing this
[10:59:40] there are no extensions
[11:00:01] squeaky clean installation.
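A side note on the "tried to allocate 20480 bytes" confusion above: PHP's memory_limit caps a request's *cumulative* usage, so the allocation that finally trips the limit can be tiny. A minimal sketch of the arithmetic (Python, purely illustrative, not MediaWiki code; the allocation sizes are made up except for the 20480 from the error message):

```python
# PHP's memory_limit caps cumulative usage per request, so the
# allocation that fails can be far smaller than the limit itself.
LIMIT = 1_073_741_824  # 1 GiB, as in the fatal error above

used = 0
failed = None
# 1024 earlier allocations of 1 MiB fill memory right up to the limit,
# then a small 20 KiB request (as in the error message) pushes it over.
for size in [1_048_576] * 1024 + [20_480]:
    if used + size > LIMIT:
        failed = size
        break
    used += size

print(failed)  # 20480 -- the small final allocation is the one reported
```

An infinite recursion or loop, as suggested below, produces exactly this pattern: many moderate allocations until the limit, with an arbitrary small one reported at the end.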
just took the database from an old installation
[11:01:05] re "the tried size is less than the limit", that just means enough memory was allocated by other things during the request that this last allocation, though it's relatively small, pushed it over the limit (I assume)
[11:01:15] don't know how to debug this though, sorry :S
[11:01:38] the old install was FUBAR, since the "admin" just kept installing newer php over and over. ended up with 10+ versions with conflicting files, a ton of click-click this addon, click-click that addon, fiddling with phpmyadmin in between...
[11:03:57] but as far as i understand: i can just do a clean install, dump the old database, copy over the images folder, feed the db dump into the db server, run upgrade.php, and it should give me the old wiki's contents.
[11:04:17] the clean install worked fine.
[11:04:46] So, you started the install with no LocalSettings.php, then the installer detected the database was an existing install, and it just upgraded it? Then you used the provided LocalSettings.php from the installer?
[11:04:51] and i considered the version gap.
[11:05:06] Vulpix: no...
[11:05:14] i installed clean without db.
[11:05:32] I mean the failed install
[11:05:43] then i used that LocalSettings.php, transferred the old db/images, ran upgrade.php, error
[11:05:55] there was no failed install
[11:06:32] the clean install worked.
[11:07:01] btw: this is dockerhub's mediawiki:lts with mysql:stable
[11:14:26] Vulpix: i tried to find the installation process with an existing database... can you link or explain what you meant?
[11:18:15] noNamesLeft: If you do a normal install, when you write the name of the database, the installer will check if the database exists and contains some known tables. In that case, it will just try to upgrade the database and generate a LocalSettings.php with that database.
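The migration recipe described at 11:03:57, as a command sketch. Everything here is an assumption: database names, credentials, and paths are placeholders, and "upgrade.php" in the log presumably refers to MediaWiki's maintenance/update.php. Not a tested recipe for this setup, and it needs live database servers to actually run:

```shell
# Hedged sketch of the migration flow described above; db names,
# credentials and paths are placeholders, not taken from the log.
mysqldump --single-transaction -u root -p old_wiki > wiki.sql  # on the old host
mysql -u root -p new_wiki < wiki.sql                           # on the new host
cp -a /backup/images/. /var/www/html/images/                   # copy uploads over
php /var/www/html/maintenance/update.php                       # migrate the schema
```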
But it should be the same as you did (except it avoids creating a new database)
[11:37:14] i'll try anyways
[11:37:20] maybe it'll give me more insight
[11:38:00] the whole install/deployment is automated anyways, except for the in-browser install, so it's just a yank and paste of a few lines
[11:39:03] The idea is basically to start with a clean LocalSettings.php, because maybe an old LocalSettings.php contains some configuration that changed drastically in an incompatible way (for example, where it was a string before, it now expects an array, or similar), which may often cause weird bugs like that
[11:39:38] This is not common, but it has happened before, especially on old versions or with extensions
[11:43:55] i never transferred the old LocalSettings.php
[12:01:33] i asked over at #mysql but maybe you know as well. i used mysqldump on the old host. when using mysql < dump on the new one, it won't import the table: syntax error. so what are the steps to import before installation via browser...
[12:18:41] mysqldump is correct. I'm not aware of incompatibilities between mysql versions regarding sql dumps
[12:20:12] create table and insert syntax is pretty standard
[12:22:54] I've at least experienced incompatibility between mariadb and mysql before (https://dba.stackexchange.com/questions/248904/mysql-to-mariadb-unknown-collation-utf8mb4-0900-ai-ci)
[12:23:06] in principle I could imagine something like that happening between mysql versions too
[12:23:22] but probably only if you're trying to import into an older mysql version than what generated the dump
[12:25:30] yeah, weird collations can be a problem
[22:45:16] so today i checked some possibilities. the status quo is: an old working instance of mediawiki 1.33, unpatched. there are no extensions present that are absent in the newer version, and i compared the LocalSettings.php by hand. that just brought up a single config difference regarding external images.
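For the unknown-collation case from the stackexchange link above, a common workaround is to rewrite the MySQL-8-only collation name in the dump before importing. A sketch (the dump file name is a placeholder; to keep the block self-contained it writes a one-line stand-in dump first — with a real dump, make a backup copy before editing it in place):

```shell
# Stand-in for one line of a real mysqldump file, so this runs standalone.
printf 'ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;\n' > wiki.sql

# Replace the MySQL-8-only collation with one MariaDB/older MySQL understands.
sed -i 's/utf8mb4_0900_ai_ci/utf8mb4_general_ci/g' wiki.sql

cat wiki.sql   # now reads ... COLLATE=utf8mb4_general_ci;
```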
still getting the error: https://paste.sh/5_UoScgv#yaU9WcuzBsQl9rwFc8motuJS no matter what
[22:45:19] memory cap i set. even tried "-1". that just filled up the memory of the machine. i get that there is a problem with this title-generating script.
[22:47:56] all the other config takes effect, so LocalSettings is "successfully parsed" i would guess. but none of the debug flags mentioned here: https://www.mediawiki.org/wiki/Manual:How_to_debug produce any different output. except for the memory limit.
[22:49:02] and i am simply not proficient enough with php to tell anything about what's going on in that script on this line.
[22:55:40] all the other "surroundings" are very controlled. there is an ansible playbook with only a few lines. without the database and `images/` transfer, the installation is fine and just works with a new database. installed patched LTS ubuntu, set up as recommended by docker, container deployed as recommended by their hub pages. and the migration is done as detailed in both the
[22:55:42] hub pages and the mediawiki wiki.
[22:59:02] so except for a line number that tells me nothing and empty logs, my best conclusion is: something goes wrong with the database upgrade, triggering an uncaught error in the title generation(?!). any hints on how to proceed? except maybe for learning php and getting to know the db schema to fix this manually...
[23:01:07] both the install/upgrade scripts run without errors.
[23:16:39] You're sure the DB update went through ok?
[23:16:49] yes
[23:17:27] i used the graphical installer 2 times today, and the upgrade php script at least 20 times.
[23:17:38] does eval.php start? can you use the API?
[23:18:09] It sounds like you've potentially got an infinite recursion going on somewhere
[23:18:17] indeed
[23:18:18] installing debug may get a more useful error message
[23:18:20] ffs
[23:18:22] xdebug
[23:23:49] tbh: i would like to refrain from editing/auditing php code in order to do this.
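On the debug flags producing nothing: the usual LocalSettings.php settings from Manual:How_to_debug look like the fragment below. The settings shown are real MediaWiki configuration, but note they mostly route through MediaWiki's own error handling, and a hard fatal such as memory exhaustion can be reported by PHP before any of it runs, which would be consistent with the empty logs. The log file path is a placeholder:

```php
// Debug settings per https://www.mediawiki.org/wiki/Manual:How_to_debug
// (these go in LocalSettings.php; the log path is a placeholder).
error_reporting( -1 );
ini_set( 'display_errors', 1 );
$wgShowExceptionDetails = true;        // show full exception details in output
$wgDebugLogFile = '/tmp/mw-debug.log'; // verbose debug log destination
```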
if that's what it takes to do this "by the book", i will just try and grab the remote, and if that fails too, the project is done and the wishful old "admin" will have no good news next week.
[23:24:35] well, you're trying to get a useful error message for someone to be able to help you
[23:24:49] if you can't do that, and no one else can reproduce...
[23:25:42] Usually it'll be something simple/obvious
[23:26:08] as i said, i have no experience with php whatsoever. maybe i misinterpret what a debugger is in the realm of php.
[23:26:23] well, in this case, it works as an error handler
[23:26:38] You shouldn't (at least at this stage) need to try and use it as an interactive debugger
[23:27:02] http://xdebug.org/
[23:27:19] > Improvements to PHP's error reporting
[23:27:19] > An improved var_dump() function, stack traces for Notices, Warnings, Errors and Exceptions to highlight the code path to the error
[23:27:19] > Tracing
[23:27:19] > Writes every function call, with arguments and invocation location to disk. Optionally also includes every variable assignment and return value for each function.
[23:28:35] i need to install this into my running mediawiki installation??
[23:28:46] yeah
[23:28:51] you can install it like you do any other PHP extension
[23:29:46] https://packages.debian.org/bookworm/php-xdebug etc
[23:31:40] * noNamesLeft walks into a corner and reflects upon the fact that, although doing unix stuff since the 00s, they never installed a php extension... or interacted with a php install other than by deployment systems...
[23:33:06] nearly everything in PHP is extensions
[23:33:13] mysql, apcu, memcached...
[23:33:15] sounds like go...
[23:33:39] but not all compiled into some binary
[23:34:12] yeah, that's more pythony i figured.
[23:35:10] thanks for the hints.
[23:37:07] i'll go in for another round mingling with the abomination of the unmaintained service that needs an upgrade...
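Since the setup in question is the official Docker image rather than Debian, installing xdebug there would look roughly like the sketch below. This is an assumption-laden sketch, not verified against mediawiki:lts specifically: it relies on that image being built on the official php base image, where pecl and docker-php-ext-enable are available. It needs a compiler toolchain in the container and can't run outside one:

```shell
# Inside the mediawiki container (assumes the official php base image).
pecl install xdebug             # build and install the extension
docker-php-ext-enable xdebug    # write the ini snippet that loads it
# 'develop' mode enables the improved error reporting quoted above.
echo 'xdebug.mode=develop' > /usr/local/etc/php/conf.d/zz-xdebug-mode.ini
```

For a stack trace of the fatal, restarting the container after this and re-triggering the error should then print the code path instead of the bare "Allowed memory size exhausted" line.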