[08:27:49] Fair enough. :) Thanks for all the help everyone, especially Platonide!
[09:59:08] interNIC, blimey. that's a word i've not heard in a long time
[17:17:58] Hello, I'm starting to use the MediaWiki Action API to fetch wikitext data. I'd like to know if it's possible to request a specific language variant for the Chinese Wikipedia. I saw in the documentation some parameters like variant and the Accept-Language header, but those didn't work - are they only for the HTML output?
[17:19:44] https://www.wikidata.org/w/api.php
[17:19:54] there is a variant parameter... "Variant of the language. Only works if the base language supports variant conversion."
[17:21:40] Yeah, that's the one I saw, but it doesn't seem to work
[17:21:52] I mean, for the wikitext specifically
[17:22:16] Did you pass a language parameter too?
[17:22:30] oh wait, sorry
[17:22:35] you said on the Chinese Wikipedia
[17:22:47] I was wondering if it was something weird on Wikidata because of it being in en by default
[17:31:53] Do you know how I can find the answer to my question?
[17:44:06] hey, I opened an RfC to gauge interest in bridging this channel to the unofficial Discord server, please weigh in at https://www.mediawiki.org/wiki/Project:Requests#Bridge_MediaWiki_IRC_and_Discord
[17:49:02] I won't !vote because I'm not active here, but I approve of bridging in general.
[17:58:05] bridging is a thing that always seems good in theory and then becomes a bigger deal in reality, slightly skeptical
[17:59:37] but I do like to read that it's the type where every user gets their own user on the other side
[18:00:32] so since the matrix-method seems to be one that has been used before, sure, why not try it anyways
[18:23:31] why discord?
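(For context on the API usage being discussed: a minimal sketch of fetching a page's raw wikitext via the Action API's `action=query` / `prop=revisions` route. The function name and example title are placeholders; the point is that this route returns stored wikitext as-is, which is why the `variant` parameter has no effect on it.)

```python
from urllib.parse import urlencode

# Assumed endpoint; every MediaWiki wiki exposes the same api.php interface.
API = "https://zh.wikipedia.org/w/api.php"

def wikitext_request_url(title: str) -> str:
    """Build a URL fetching a page's raw wikitext (action=query, prop=revisions).

    Note: revision content is the stored wikitext, returned unchanged -
    language-variant conversion does not apply here.
    """
    params = {
        "action": "query",
        "format": "json",
        "formatversion": "2",
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",   # fetch the main content slot
        "titles": title,
    }
    return API + "?" + urlencode(params)

url = wikitext_request_url("地球")  # "地球" is just an example title
```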
[18:23:46] While I'm not against bridging in general, it is a really closed platform
[18:23:58] because there are supposedly 1000 users there
[18:24:19] of course it would be best to move them all to free alternatives, yeah
[18:24:30] but how realistic that is is another question
[18:25:12] marcelogp58: it's probably not working on Wikidata because the variants may only work on the wikis which have them configured
[18:25:26] it's a similar discussion to having Facebook groups
[18:26:43] Platonides: bridging will make it possible for people who don't want to use closed platforms to still interact with the conversations there (and vice versa)
[18:28:33] Platonides but doesn't https://zh.wikipedia.org/ have support for variants like zh-cn and zh-hk?
[18:29:13] Platonides the variant parameter works if I request HTML data, but I need wikitext, unfortunately
[18:31:03] MatmaRex: I don't know how ...
[18:31:37] I'm surprised that there is supposedly such a big userbase there
[18:31:44] welp, he's gone
[18:32:28] i was confused for a second when you mis-pinged me because i'm debugging an issue with language variants at the moment :D
[18:32:52] and you commented on that one too
[18:38:38] oops sorry, MatmaRex
[18:53:45] MatmaRex Do you know if it's possible to apply language variants directly to wikitext?
[18:55:13] hi :)
[18:55:43] marcelogp: i'm pretty sure it's not. as far as i know, the language conversion happens after parsing the wikitext into the HTML
[18:56:12] but maybe there's some API you can call to apply it on arbitrary text
[18:56:28] Ah, that's what I imagined. Thanks
[18:58:44] marcelogp: for example, this seems to work:
[18:58:47] https://zh.wikipedia.org/wiki/Special:ApiSandbox?uselang=en#action=parse&format=json&variant=zh-hans&text=中国&prop=text&formatversion=2
[18:58:50] https://zh.wikipedia.org/wiki/Special:ApiSandbox?uselang=en#action=parse&format=json&variant=zh-hant&text=中国&prop=text&formatversion=2
[18:59:07] note that the input 'text' is actually wikitext, so you might need to do something about special characters
[18:59:14] and the return 'text' is actually HTML
[18:59:28] but as you can see, it converts the language variant
[19:00:30] as far as i know, the system is not really designed for converting arbitrary inputs – it's just a postprocessing step on the wikitext parser. i don't know what you're trying to use this for
[19:01:40] Yes, I saw that it works for HTML. However, I need to manipulate the wikitext directly
[19:01:42] ^ explaining a bit about your use case would probably help, marcelogp
[19:03:38] Basically I need to remove some sections, lists, tables, etc. and generate a Markdown output. I found it easier to do that by parsing wikitext with a library called wtf_wikipedia
[19:11:19] we don't have an API for that
[19:11:42] i don't know what you need the markdown for, but i'd consider whether you might be okay with HTML as the output
[19:11:57] then you can probably just run the "filtered" wikitext right through that API i linked
[19:12:52] there are probably also HTML-to-markdown converters you can use ;)
[19:15:15] or maybe you could even use wikitext instead of markdown :D
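(The Special:ApiSandbox links in the log correspond to a plain api.php request; here is a minimal sketch of building that same `action=parse` call, under the assumption that a GET URL is acceptable. The function name is hypothetical; the parameters mirror the links, and `urlencode` handles the special-character escaping mentioned in the conversation.)

```python
from urllib.parse import urlencode

API = "https://zh.wikipedia.org/w/api.php"

def parse_variant_url(wikitext: str, variant: str) -> str:
    """Build an action=parse request that parses the given wikitext and
    returns variant-converted HTML, mirroring the ApiSandbox links above."""
    params = {
        "action": "parse",
        "format": "json",
        "formatversion": "2",
        "prop": "text",
        "text": wikitext,    # input is wikitext; urlencode percent-escapes it
        "variant": variant,  # e.g. zh-hans or zh-hant
    }
    return API + "?" + urlencode(params)

hans = parse_variant_url("中国", "zh-hans")
hant = parse_variant_url("中国", "zh-hant")
```

For anything longer than a snippet you would likely send `text` as a POST body rather than in the URL; either way, the returned `text` field is HTML, not wikitext, as noted in the log.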