[02:03:59] Default link prefix. Links from this namespace to other pages which do not have a prefix will be treated as if they were prefixed with this text. For example, if this was set to "User", a link in the given namespace [[Dogmaster3000]] would be rendered as if it was written [[User:Dogmaster3000|Dogmaster3000]]. The most common use for this is
[02:04:00] namespace-internal linking. For example, it could be assumed that any unprefixed link in the "Cookbook" namespace points to another recipe, rather than a page in the Main Namespace.
[02:04:00] https://www.mediawiki.org/wiki/Namespace_manager
[02:04:01] How can I enable this feature on my newly defined namespace on my mediawiki instance?
[02:04:01] I'm looking for an easier way to do namespace-internal linking.
[02:17:39] Rtnf: I don't think that feature actually exists
[02:21:53] I'm looking for an easier way to do namespace-internal linking. We at the Bluepages community have a discussion (https://www.bluepageswiki.org/wiki/User_talk:Reschultzed#Local_Language?) regarding how to split the project for each country. The consensus is: split this wiki into several "sub-wikis" by using namespaces.
[02:21:54] A local namespace is fine, as long as it provides the namespace-internal linking functionality. By using this, every wikilink that is made inside a namespace will not leak back to the main namespace.
[02:28:18] If you type `[[User:Foo|]]` it expands to `[[User:Foo|Foo]]`
[15:30:03] hi, is there a convenient way to change the color set of <syntaxhighlight>? The backend, Pygments, says it comes with some builtin styles
[15:39:17] self-answering: the colors are literals, no convenient way https://github.com/wikimedia/mediawiki-extensions-SyntaxHighlight_GeSHi/blob/master/modules/pygments.generated.css
[15:39:55] It's CSS... Can't you override it?
[15:41:07] I can... but I am too lazy to do that :)
[15:41:51] lazy people tend to write programs to fix the problems.
[15:42:08] (At least, I do)
[15:43:06] so, maybe you can write a program to help you pick the colors you want and produce a CSS replacement
[15:44:39] of course, if changing the colors is just an idle thought, then that isn't gonna happen. It only helps when it is something you really want, but are too lazy to do. :)
[15:45:20] I have found a script file named updateCSS.php which generates a CSS file in Extension:SyntaxHighlight, maybe I should create a bot to make changes on MediaWiki:Common.css or somewhere
[15:45:43] that's the spirit!
[16:04:34] lens0021: you can do things like this to override the styles -- https://www.mediawiki.org/wiki/Template:Codesample
[16:07:10] it's interesting... thank you, bd808!
[16:46:04] I'm actually looking for enwiki data dumps, not sure if I'm in the right channel
[16:47:02] Oh, I guess normally needs bt?
[16:49:17] hey! dumps related stuff is usually in wikimedia-tech but since you're here...
[16:49:42] there we go, that's the channel I was looking for ^^;;
[16:49:43] what dumps are you looking for? historical page content? metadata about current edits? other?
[16:49:45] thanks apergos :-)
[16:49:53] current page content is fine.
[16:50:06] I also thought there was an API call to get wikitext, but can't find it?
[16:50:09] well, the ones for the June 20th run are already available I think
[16:50:14] Amir1: addshore: https://gerrit.wikimedia.org/r/c/operations/mediawiki-config/+/701093/2/wmf-config/InitialiseSettings.php#b19134
[16:50:28] Do you know more about wgEntitySchemaShExSimpleUrl and what its impact may be?
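
On the 15:43-15:45 idea above of writing a program to produce a CSS replacement for Extension:SyntaxHighlight: a minimal Python sketch using Pygments directly. The `monokai` style, the `.mw-highlight` selector and the output filename are assumptions of mine, not something confirmed in the discussion.

```python
# Sketch: dump the CSS for a built-in Pygments style so it can be pasted into
# MediaWiki:Common.css as an override for Extension:SyntaxHighlight.
# "monokai" and the ".mw-highlight" selector are assumptions, not confirmed here.
from pygments.formatters import HtmlFormatter
from pygments.styles import get_all_styles

print(sorted(get_all_styles()))  # list the built-in style names Pygments ships

css = HtmlFormatter(style="monokai").get_style_defs(".mw-highlight")
with open("pygments-override.css", "w") as fh:
    fh.write(css)
```
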
[16:50:41] how to test it / whether it has caused issues
[16:50:54] there's a post/wikitext/{title}, but I can't find get/wikitext/{title} oddly ^^;;
[16:50:56] https://dumps.wikimedia.org/enwiki/20210620/ if you want everything but talk pages, get the "pages-articles" ones
[16:51:05] if you also want the talk pages, get the "pages-meta-current" ones
[16:51:23] what do they mean by multistream? :-)
[16:51:55] if you want to parallel process them and have the right tools, get the multistream ones (no talk pages, same as pages-articles but you have an index that points you to which bz2 "stream" the page is in)
[16:52:02] apergos, Ok, so those ARE the right dumps. I also found the torrents, which I guess is what you'd really want to use normally
[16:52:19] apergos, I see!
[16:52:24] I've not checked the torrents but if they have the same content, then yes
[16:52:24] Krinkle: https://www.wikidata.org/wiki/EntitySchema:E100 see the link that says `check entities against this Schema`
[16:52:34] afaik that is it
[16:52:36] so should be usable already
[16:52:52] I guess the next question is, how do you want to use them?
[16:52:54] addshore: TIL this namespace exists
[16:52:55] Krinkle: https://gerrit.wikimedia.org/g/mediawiki/extensions/EntitySchema/+/da6e9705290cf2196109c952e827a72585afac34/src/MediaWiki/Content/EntitySchemaSlotViewRenderer.php#226
[16:52:58] :D
[16:53:03] * addshore dinner
[16:53:28] apergos: thanks already :-)
[16:53:35] sure thing!
[16:53:53] if you want (waaaay too much) information on the dumps there's a pile of docs at
[16:53:57] Now I just wonder if one can automatically grab wikitext off the live wikis, or is that something that has been restricted?
[16:54:13] * kim listens first I guess
[16:54:14] https://meta.wikimedia.org/wiki/Data_dumps
[16:54:28] yes you can, although we'd like it if people didn't just scrape the whole wiki :-D
[16:54:43] apergos, ohhh, hooow?
[16:54:44] there is absolutely an api to get that
[16:54:56] apergos, yeah, I'm just confuzzled, because I couldn't quite find it
[16:56:07] sec, I need to find it again
[16:56:47] https://www.mediawiki.org/w/api.php?action=help&modules=query%2Brevisions read that
[16:56:51] it's got what you need!
[16:57:37] Since Adam is having dinner, ping me if you still have questions
[17:00:31] apergos, it does!
[17:00:38] there's got to be an easier way though ^^;;
[17:01:06] that's how apis work mostly: string a bunch of params together and get your data as one of the values
[17:01:13] otherwise you could, uh
[17:01:29] Oh, I found it already
[17:01:39] just tack ?action=raw to the end of a regular mw url
[17:01:51] have we not gotten rid of that yet?
[17:01:56] now why had I forgotten that?
[17:02:29] apergos, but why would you want to get rid of that? https://xkcd.com/1172/
[17:04:49] https://www.mediawiki.org/wiki/API:REST_API/Reference#Get_page_source well because https://xkcd.com/927/
[17:06:35] anyways you can definitely get it one way or another (but I would recommend against action=raw on any bulk basis)
[17:09:41] ((apropos xkcd: One day we'll be able to speak in pure XKCD https://youtu.be/4Bb-Af3Dm8E?t=51 https://en.wikipedia.org/wiki/Darmok#Tamarian_use_of_language )
[17:09:45] )
[17:11:04] apergos, Ahh, that does look nice. And I'll tell the person I'm telling this to that possibly the raw dumps are best to use, since hitting wp a squillion times a day might not be the best. :-P
[17:12:45] uh yeah please no
[17:13:03] they will wind up eating 415 responses and no one will be happy!
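
A minimal sketch of the query+revisions module linked at 16:56, fetching the current wikitext of a single page; the page title and User-Agent string are placeholders of mine, and for anything bulk the dumps discussed above remain the better route.

```python
# Fetch the current wikitext of one page via the Action API
# (action=query & prop=revisions). Title and User-Agent are placeholders.
import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Darmok",        # placeholder page title
    "rvprop": "content",
    "rvslots": "main",
    "format": "json",
    "formatversion": "2",
}
resp = requests.get(API, params=params,
                    headers={"User-Agent": "example-fetcher/0.1 (you@example.org)"})
resp.raise_for_status()
page = resp.json()["query"]["pages"][0]
wikitext = page["revisions"][0]["slots"]["main"]["content"]
print(wikitext[:200])
```
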
[17:13:32] for bulk, use the dumps; if you need updates, go to the api (one of them)
[17:14:18] there's probably a short api query you can do to check if the date on a page has changed.
[17:14:39] but if I tell them "just get dumps once per month" ... :-P
[17:22:01] twice a month
[17:22:22] we produce pages-articles dumps in the run on the 1st and the run on the 20th
[17:22:30] so a few days after the start they should be ready for en
[17:22:45] ALSO
[17:22:45] also
[17:22:47] it's good that the dumps actually complete again these days ;-)
[17:22:55] * kim listens also also ;-)
[17:22:59] we've been completing them for years :-P
[17:23:15] uh yes ALSO there are these so-called adds-changes dumps
[17:23:21] * kim quietly doesn't mention how many years I've sort of ... not ... been doung wiki stuff?
[17:23:26] these can help fill in between dumps
[17:23:52] they basically have all new revisions each day with content, it's not just current revisions
[17:23:57] but still might be helpful
[17:23:59] (I'll get keelhauled as a heretic. And/or congratulated. And/or be given condolences wrt my lapse)
[17:24:05] heh
[17:24:08] doing*
[17:24:26] well I'm at this job ... twice as long as max(previous jobs)? unbelievable
[17:24:46] it's addictive!
[17:24:54] https://dumps.wikimedia.org/other/incr/
[17:25:09] there should be a wikimaniacs anonymous
[17:25:17] so these aren't guaranteed to always be there every day and never break, though most of the time you can just pick them up
[17:25:35] might be helpful anyhow
[17:25:56] https://en.wikipedia.org/wiki/Wikipedia:Wikipediholic <- once a wikipediholic, always a wikipediholic
[17:26:05] not even gonna click through
[17:26:19] you cannot send me down that rabbit hole, I already have too many :-P
[17:26:41] This is the one exception, it's the diagnostics
[17:26:56] Though I get your https://xkcd.com/214/
[17:27:08] so between 2x/month dumps and daily (almost always) revision updates and api calls here and there, that should be pretty ok, with not too much load on our servers
[17:27:23] makes sense
[17:27:49] what's the project going to be anyways? I'm always curious to know what people do with the dumps
[17:29:24] (I'm trying to collect up all the use cases in fact)
[17:29:33] Hrrrrm, so far just introducing someone to the infrastructure
[17:29:39] Ahehehe
[17:30:52] Is it ok if I get back to you on that?
[17:31:15] sure! I mean there's no requirement, I'm just trying to find out how everyone uses them so we can see what else is needed etc
[17:31:32] Just introdocution so far. If something comes of it, I made a note to keep you in mind
[17:31:49] ok, you know where to find me!
[17:31:50] introduction* (how did I hit the keys that wrong?)
[17:32:04] heh
[17:34:26] apergos, and thank you so much for helping me out!
[17:41:36] sure thing! good luck!
[21:30:29] Is there a way (I'm guessing some method in the MagicWord or MagicWordFactory class) to get all magic words in a particular article / article ID / title?
[21:31:44] I see MagicWord::get and MagicWordFactory::get methods to instantiate a specific magic word object and check if that exists in content
[21:31:55] and i'd have to somehow iterate through all magic words
[21:32:03] i'm hoping for an easier approach
[21:32:20] something analogous to MagicWord::getAllForArticle()
[21:33:14] Probably need to look at the parser
[21:33:42] Hm
[21:35:22] The API doesn't expose "magic words on a page"... Which generally means it might be a harder problem
[21:45:13] Ah okay, thanks for that info Reedy
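
For the 17:14 suggestion of a "short api query you can do to check if the date on a page has changed": a sketch that asks only for the latest revision timestamp and compares it with a stored value. The title, the stored timestamp and the User-Agent are placeholders of mine.

```python
# Decide whether to refetch a page by comparing its latest revision timestamp
# with one saved from a previous run. All concrete values are placeholders.
import requests

API = "https://en.wikipedia.org/w/api.php"
last_seen = "2021-06-01T00:00:00Z"  # timestamp saved from a previous run

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Darmok",             # placeholder page title
    "rvprop": "timestamp",
    "rvlimit": "1",
    "format": "json",
    "formatversion": "2",
}
resp = requests.get(API, params=params,
                    headers={"User-Agent": "example-fetcher/0.1 (you@example.org)"})
resp.raise_for_status()
latest = resp.json()["query"]["pages"][0]["revisions"][0]["timestamp"]
if latest > last_seen:  # ISO 8601 UTC strings compare correctly as plain text
    print("page changed since last check; refetch it")
```
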