[00:37:46] Hi guys, we had a bot trying to scan for mediawiki.searchSuggest and mediawiki.less/mediawiki.mixins.less. Does anyone have any idea if that's looking for some kind of vulnerability, or what this might be about?
[00:44:37] Bots are dumb.
[00:44:40] That's the answer
[05:13:46] please provide me a repo link for WikiLambda
[09:30:30] where to connect to ask issues related to WikiLambda?
[09:33:48] Nikhil61: #wikipedia-abstract-tech
[09:33:55] Or https://t.me/abstract_wikipedia_tech
[09:34:13] See https://meta.wikimedia.org/wiki/Abstract_Wikipedia
[09:36:39] Thanks a lot
[11:51:31] Hi, I have a problem regarding MediaWiki on Docker.
[11:51:31] I want to enable uploads, but I get the error "Could not open lock file for "mwstore://local-backend/local-public/4/4f/image.jpg". Make sure your upload directory is configured correctly and your web server has permission to write to that directory. See https://www.mediawiki.org/wiki/Special:MyLanguage/Manual:$wgUploadDirectory for more information."
[11:51:32] The error leads me to believe that there is some sort of permission conflict in my folder, but I'm not quite sure where MediaWiki would put that. I used the following settings in my LocalSettings.php: https://dpaste.org/Jtp7W
[11:51:33] Would it have them relative to my system, relative to the mediawiki folder, or even somewhere else? Maybe someone has experience regarding that?
[11:57:18] Guest98: that's a permission issue. You should have mounted a volume read-write, with permissions for everyone to write to it. Uploads should be configured to point to the path where that volume is mounted
[12:22:20] Vulpix: thank you for the advice.
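A minimal compose sketch of the setup Vulpix describes, assuming the stock mediawiki Docker image (which keeps uploads under /var/www/html/images; the host path is a placeholder):

```yaml
# docker-compose.yml fragment: bind-mount a host directory for uploads
services:
  mediawiki:
    image: mediawiki
    ports:
      - "8080:80"
    volumes:
      - ./images:/var/www/html/images   # bind mounts are read-write by default
```

The host ./images directory then needs to be writable by the container's web server user (www-data, uid 33 in the official image), e.g. `sudo chown -R 33:33 ./images`. With that in place, the default `$wgUploadDirectory` ("{$IP}/images") needs no change; only `$wgEnableUploads = true;` is required in LocalSettings.php.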
[12:22:20] I used the standard setting from the mediawiki Docker image, which is the following: https://dpaste.org/SC9xD
[12:22:21] I tried overwriting it with docker-compose.override.yml (https://dpaste.org/XnWZw#L6), but unfortunately, with that addition my wiki now produces a "file not found" error whenever I open anything on the wiki
[16:32:16] hi, i just started poking around at appropedia.org; they seem to be having server errors on special pages on the wiki and the old irc channel on freenode appears empty. does anybody know where the project is now?
[18:07:48] Hi, I'm constantly getting the following PHP error after installing the LiteSpeedCache extension: Cannot modify header information - headers already sent by (output started at /includes/MediaWiki.php:1077) in /extensions/LiteSpeedCache/LiteSpeedCacheBase.php on line 200
[18:08:11] Can someone please help me figure out why I'm getting this error?
[18:10:14] Apart from this I'm also getting the following notice: PHP Deprecated: Use of Title::$mUrlform was deprecated in MediaWiki 1.37. [Called from LiteSpeedCache::getTags in /extensions/LiteSpeedCache/LiteSpeedCache_body.php at line 179] in /includes/debug/MWDebug.php on line 381
[18:10:39] What should I replace Title::$mUrlform with?
[18:11:59] Please help, thank you.
[18:35:56] I've enabled the PageImages extension and get this on my pages: https://i.imgur.com/MDlyPy6.png That's not normal, is it?
[18:36:03] I mean the duplication
[18:40:04] lol
[18:41:30] this happens on file description pages only, right?
That's definitely a bug
[18:42:07] and amusingly, the bug also happens on WMF wikis: https://en.wikipedia.org/wiki/File:2023_Gaziantep_Earthquake-Diyarbakir_1.jpg
[18:43:54] Vulpix: in this case it's a normal page, though generated with a big ol' template that generates whole pages, dunno if that could be relevant
[18:43:59] wait, that's easy to check
[18:44:24] nope, it also happens on this manually written page: https://bg3.wiki/wiki/Acid_Splash
[18:44:48] It's reported as T295521
[18:44:49] T295521: Duplicate og:image on Wikipedia and Wikidata (There are too many faces of Tom Hanks) - https://phabricator.wikimedia.org/T295521
[18:45:59] I'd have thought this would be fixed rather quickly because it affects WMF wikis, but the report is from Nov. 2021! :yikes:
[18:48:43] ugh... wonder if I should try
[18:49:57] I'll eventually be using WikiSeo to override the auto-detected images anyway, once virtually all our pages are generated via templates which define a main image
[19:31:04] is there a way to directly link to the visual editor rather than defaulting to the text editor via URL?
[19:31:13] (or changing the site-wide default)
[19:38:11] nsh: use the veaction URL parameter and set its value to "edit" (instead of using regular ?action=edit)
[19:40:22] taylan: oh btw, did you see that I cc'd you on a relatively quick-ish Theme patch of mine? :) would appreciate your thoughts on it; I did test the changes on my 1.39 box and they looked fine, but a bit of extra review never hurt, right? ;)
[19:42:47] taylan, Vulpix: hm, re: duplicate og: meta tags, Weird Gloop has this patch for the OpenGraphMeta ext. to seemingly fix just that: https://github.com/weirdgloop/mediawiki-extensions-OpenGraphMeta/commit/17f2bb1a4dc0bf72e48b464f58633d485e4db668 -- I don't recall seeing that particular issue w/ OpenGraphMeta but I'm wondering if it'd still be worth integrating their patch into the upstream version...
[20:00:33] Wasn't there a config var/pref that sets the default editor?
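The veaction suggestion above, spelled out (domain and page title are placeholders):

```
# default: wikitext editor
https://example.org/index.php?title=Some_Page&action=edit
# VisualEditor instead
https://example.org/index.php?title=Some_Page&veaction=edit
```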
[20:03:32] That WMF used to give VE to random users
[20:06:01] There's like a billion different editors and various prefs to control them
[20:06:46] I can't find a single one now
[20:11:06] * one now that I'm looking for it
[20:11:07] All I know is that there's no pref for disabling the MF editor on Minerva
[20:15:17] ashley: oh, I hadn't noticed that I had been explicitly pinged... just one thing: the legend on the recent changes page should have a dark background already from another rule, on line 1685. I'm not sure if the rule you added would even take effect, given the other rules.
[20:32:51] Apparently it's `$wgDefaultUserOptions['visualeditor-editor']`
[20:32:51] I think, at least
[20:33:40] There's also this, but no idea what it does: `$wgVisualEditorUseSingleEditTab = true;`
[20:33:40] https://mediawiki.org/wiki/Topic:T4xbb2kgafndzsuo
[20:34:01] * Apparently it's `$wgDefaultUserOptions['visualeditor-editor'] = 'visualeditor';`
[20:34:01] I think, at least
[20:38:26] Bruh, there seriously is no list of all user prefs?
[20:39:26] Well, it varies depending on what extensions you have installed
[20:39:31] i think you can get a list from the api
[20:39:38] gadgets are allowed to make new fake preferences too
[20:41:02] Well, each extension should explain its own prefs, I suppose
[20:41:02] You can get the list from the api, but that doesn't include possible values or what it even means
[20:42:34] Like `"mobile-editor": ""`
[20:42:35] Does this mean I can't edit on mobile, or what?
[20:46:23] Having a list of all core preferences would also encourage extension devs to not straight up ignore them
[20:47:47] Well, tbf, if you want an explanation, there is one on Special:Preferences ;)
[20:52:12] Hello everyone! I recently migrated my wiki to a new server on AWS, and I found that when searching for foreign characters like Chinese it is not showing any results; it was working on the old server before the migration.
Any idea what I could have done wrong? The database dump was created using the command "mysqldump --default-character-set=binary --user=wikidb_user --password=wikidb_userpassword wikidb > dump_of_wikidb.sql"; could it be the "default-character-set=binary"?
[20:52:37] BTW the old wiki was on 1.35 and was also newly upgraded to 1.39
[20:53:41] does the content show correctly, both in page titles and in content?
[20:53:54] taylan: works on my devbox(TM) ;) it didn't seem to have a dark or transparent bg, hence why I added it; that silly box is such a pain on all darker themes, the explicitly set #fff background-color on it is such an annoyance :-/
[20:53:57] is there a simple way to strip wiki text of all the markup? some way to just get plain text? the intent is to populate the meta description field with something sensible
[20:54:10] ashley: that's interesting...
[20:54:45] I had initially written the rules for it on an older MW version, so maybe the overkill CSS selector isn't needed anymore, but it was at least still working fine for me
[20:56:32] I'd be tempted to just nix the whole background-color rule in core, but contributing to core is such a mess these days...
[20:57:18] re: meta description, tried [[mw:Extension:ArticleMetaDescription]] yet? :) (full disclaimer: why yes, it's another of the gazillion things I maintain)
[21:02:17] google is such a dum dum nowadays, doesn't even return that for the query "mediawiki article meta description"
[21:02:47] wait, is it not documented on mediawiki.org?
[21:04:14] huh, apparently so... that'd explain it, yeah
[21:06:07] ashley: is this a fully automatic thing or is there a configuration? I cloned the repo but can't see a README
[21:06:38] and I wonder how it will combine with WikiSeo. assuming it's automatic, will WikiSeo override it?
[21:08:26] "my" thingies usually don't come with a README because those get outdated so quickly (but this one apparently didn't have an Extension: page either... working on fixing that :P) and other contributors often neglect the presence of such a file; it's automatic, but for the Main Page there's [[MediaWiki:Description]] that you can customize specifically and only for that particular page; for all other pages, it's automatic; not sure about WikiSeo interoperability, it's not an extension I'm familiar with or one that we'd use at ShoutWiki (IIRC we use ArticleMetaDescription + OpenGraphMeta (+ PageImages) + a fork of the HydraWiki SEO extension)
[21:10:35] hmm, for https://bg3.wiki/wiki/Die_Rolls it grabbed the description from what's after the first h2 (Karmic Dice) instead of the intro...
[21:12:25] huh, interesting
[21:38:41] taylan: so I think I sorta know what's going on in there - it's a definite bug in ArticleMetaDescription (on MW 1.39 at least): the parser output (=article text) is wrapped in a <div class="mw-parser-output">, *but* the ArticleMetaDescription "remove all divs" step ends up removing that, and also somehow the whole intro section (and ToC HTML), so the first "real" section ends up being the first section that ArticleMetaDescription sees
[21:40:05] ashley: could it be that it removes everything until the first <div> it encounters *within* mw-parser-output? in this case that would be the TOC...
[21:43:27] true, in a way: the final HTML from which it (currently; I have a very crappy but seemingly functional patch coming up if you wanna test it out ;) selects the description (for that page) *does* seem to contain the ToC HTML (as well as some ugly section edit links), but in all that HTML soup the first <p> paragraph tag just so happens to be right before "In Baldur's Gate 3, you can enable the option [--]" so that's presumably why that snippet gets chosen
[21:44:30] (PS1) Jack Phoenix: Filthy but potentially functional hack for selecting the first sentence(s) from the *intro* and not the first /actual/ section [extensions/ArticleMetaDescription] - https://gerrit.wikimedia.org/r/886944
[21:44:37] I warned you, it's not pretty :DDD
[21:45:52] Can't you just take everything before the first <h2>? 🤔
[21:52:19] Got a question relating to templates and recursion, a bit long to explain, so -> https://dpaste.org/9BGxe/raw
[21:56:43] GUEST_Shadow: you could easily simplify {{Tooltip|[[#Libital|Libital]]|{{RecursiveChem/Libital}}}} into {{Tooltip|Libital}}
[21:57:02] as for picking up pieces from the table
[21:57:10] you probably need something like Semantic
[21:58:26] So basically you want a table search function where you input an argument and get an output associated with it?
[22:00:26] Essentially, yeah. My *current* idea is that if I can pull the information from the recipe column, given the chem name (to find the row), I can essentially collapse the entire template hell into just a single template that takes the param for the recipe-to-find
[22:00:55] The way I did something like that was using Extension:Arrays
[22:00:55] I would have an array with keys and an array with values, then search for the argument inside `keys`, then output the `value` at the `index` where the `key` was found.
[22:00:55] But you can do the same but easier using Extension:Scribunto
[22:01:18] why not have an article per chem, then use the Popups extension?
[22:01:53] That's essentially what we have now.
[22:02:01] Give me a moment, I can get an example.
[22:02:50] you would just include a normal link: [[Libital]]
[22:02:56] look at wikipedia when you hover a link
[22:04:16] issue with that is that recursion f
[22:04:24] enter key, please.
[22:04:40] Issue is that doesn't work with the recursive tooltips.
[22:05:47] I guess the pages do exist and the tooltip is not necessarily the page content?
[22:06:12] Or it would be too many articles
[22:06:38] The tooltips Wikipedia has are just plaintext of the opening few sentences
[22:06:48] and here's an Actual Interactive version of what I'm explaining
[22:06:50] https://tgstation13.org/wiki/Guide_to_chemistry#Synthflesh
[22:07:12] If you pop into the source viewer and check the included templates, there are. A few.
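The Arrays/Scribunto suggestion above could look roughly like this as a Lua module — a sketch only: the module name and the recipe data are hypothetical placeholders, not the actual tgstation data:

```lua
-- Module:ChemLookup (hypothetical name): map a chem name to its recipe text.
-- Called from wikitext as {{#invoke:ChemLookup|recipe|Libital}}
local p = {}

-- Illustrative placeholder data; the real table would be generated from the
-- existing RecursiveChem/* templates or maintained directly in this module.
local recipes = {
	['Libital']    = 'recipe for Libital goes here',
	['Synthflesh'] = 'recipe for Synthflesh goes here',
}

function p.recipe(frame)
	local name = frame.args[1] or ''
	return recipes[name] or ''
end

return p
```

A single {{Tooltip}} template could then fetch a recipe via #invoke instead of needing one RecursiveChem/* template per chem.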
[22:09:12] each template for a given chem, i.e. {{RecursiveChem/Synthflesh}}, is just the recipe used to make that, with other RecursiveChem template inclusions as necessary
[22:25:32] It seems like Extension:DataTable2 might do what I need
[22:36:56] it might be easier with javascript
[22:37:18] something that installs an onclick handler on a tags, then duplicates the node
[22:57:38] ashley, ty
[23:12:08] Anyone can throw some light on this issue? "I recently migrated my wiki to a new server on AWS, and I found that when searching for foreign characters like Chinese it is not showing any results; it was working on the old server before the migration. Any idea what I could have done wrong? The database dump was created using the command "mysqldump --default-character-set=binary --user=wikidb_user --password=wikidb_userpassword wikidb > dump_of_wikidb.sql"; could it be the "default-character-set=binary"?"
[23:18:21] paulx: you didn't answer above
[23:18:28] 20:53:40 < Platonides> does the content show correctly both in page titles and in content?
[23:21:31] Does it not wor at all or only if the string is more than ~7 chinese characters?
[23:21:32] s/wor/work/
[23:25:05] I would expect it to fail if it's *less* than X characters, not more
[23:32:51] Well, I've seen that happen, but without any updates at all
[23:41:35] Platonides: Sorry, I thought it was replying to others' thread; you mean whether the keyword shows correctly both in page titles and in content?
[23:43:17] Archimedes1560[m]: it does not work at all, no matter how long the keyword I use to search
[23:44:06] but it is working fine on my old server; it just stopped working on the new server, after upgrading to 1.39
[23:47:20] I guess it's something different then; I doubt you will find anything though, as documentation of the default search doesn't exist
[23:52:08] the default search would be converting them
[23:52:13] but this might have changed
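Regarding the charset question above: --default-character-set=binary on the dump side matches what MediaWiki's backup manual recommends, precisely to avoid charset conversion mangling non-Latin text. The restore side has to be symmetric, though, and the search index needs rebuilding after a move. A sketch (credentials copied from the question; untested against this setup):

```shell
# Dump without charset conversion (as already done in the question):
mysqldump --default-character-set=binary --user=wikidb_user \
  --password=wikidb_userpassword wikidb > dump_of_wikidb.sql

# Restore with the SAME option so the bytes are loaded unmodified:
mysql --default-character-set=binary --user=wikidb_user \
  --password=wikidb_userpassword wikidb < dump_of_wikidb.sql

# MediaWiki's default (MySQL-backed) search reads the searchindex table;
# rebuild it on the new server from the wiki's root directory:
php maintenance/rebuildtextindex.php
```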