[19:09:03] oooh the tech channel
[19:09:18] copy pasta time
[19:09:20] hi hi, I'm looking at why miraheze wikibase installs don't have entity as the default content model for the item namespaces?
[19:09:20] example of a wikibase install https://addshore-alpha.wikibase.cloud/w/api.php?action=query&meta=siteinfo&siprop=namespaces|statistics&format=json that has "defaultcontentmodel": "wikibase-item",
[19:09:20] example of a miraheze one I found https://ortaturk.miraheze.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces|statistics&format=json "defaultcontentmodel": "wikitext",
[19:09:57] not totally sure why it ends up like this, but I'd love to figure it out and get it fixed so that more info from miraheze wikibases can end up on https://wikibase.world/ !
[19:19:41] addshore: https://github.com/miraheze/mw-config/blob/main/ManageWikiExtensions.php#L3553 would be the answer to that
[19:20:02] aaaah! nice, so that looks fixable :D
[19:20:05] The answer to why is quite possibly because we never realised
[19:20:27] addshore: feel free to send a PR
[19:21:16] will do!
[19:23:39] https://github.com/miraheze/mw-config/pull/5966
[19:23:57] addshore: a quick check of the Item namespace shows 12 wikis have wikibase-item set as the content model, 457 have wikitext
[19:24:26] interesting, is this stored in the db / a cache somewhere?
[19:26:19] addshore: yup, that's from the ManageWiki database
[19:26:26] I just pasted the query on your PR
[19:26:56] addshore: what's the impact of them being wikitext? Does it break wikibase, or do we need to do any migration of pages that could exist?
[19:28:04] it looks like wikibase handles it all just fine. If weird things do happen, it would likely be via other extensions, or via APIs,
[19:28:12] so you should be fine to just change the defaults now for all sites
[19:28:49] Okay
[19:33:07] addshore: wikibase is one of the extensions that we have very little internal SQEP on
[19:33:55] feel free to ping me if you ever have any questions (or also the telegram channels are pretty active)
[19:34:19] the really cool thing would be if you could get a query service set up, but that would certainly be a little more involved
[19:35:17] addshore: last time I spoke to the search team about WDQS, it basically came down to needing too many resources and too much effort
[19:35:40] yeah, I mean, I certainly wouldn't run it the way that they run it for multiple sites :)
[19:35:51] addshore: ye, that scared me off
[19:36:49] Ideally it needs to be something that takes minimal steps to make available automatically, and doesn't need to run on more than a VM or two
[19:36:54] wikibase.cloud though runs a single instance for 1000+ sites, on about 4GB of RAM https://github.com/wmde/wbaas-deploy/blob/main/k8s/helmfile/env/production/queryservice.values.yaml.gotmpl#L9-L15
[19:37:52] that sounds a lot more promising
[19:38:15] there is a very thin service that sits in front of it and basically redirects queries to a namespace within blazegraph, based on which wiki / domain the request is coming from
[19:38:33] What are the storage requirements?
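A minimal sketch of the "very thin service" described at 19:38:15, assuming a Host-header-to-namespace map and the usual Blazegraph `/namespace/{name}/sparql` endpoint layout; the real wikibase.cloud gateway (linked just below at 19:39:49) is JavaScript and looks the mapping up via an API call, so everything hardcoded here is an illustrative stand-in.

```python
# Thin query-service gateway sketch: redirect each wiki's SPARQL requests
# to its own namespace inside one shared Blazegraph instance, keyed on the
# Host header. Map, port, and endpoint layout are assumptions, not the
# real wikibase.cloud configuration.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlsplit

BLAZEGRAPH = "http://localhost:9999/bigdata/namespace"  # assumed layout
DOMAIN_TO_NS = {  # stand-in for the API lookup the real gateway performs
    "wiki-a.example.org": "wiki_a",
    "wiki-b.example.org": "wiki_b",
}

class Resolver(BaseHTTPRequestHandler):
    def do_GET(self):  # POSTed queries would need the same treatment
        host = self.headers.get("Host", "").split(":")[0]
        ns = DOMAIN_TO_NS.get(host)
        if ns is None:
            self.send_error(404, "unknown wiki domain")
            return
        query = urlsplit(self.path).query
        target = f"{BLAZEGRAPH}/{ns}/sparql" + (f"?{query}" if query else "")
        self.send_response(307)  # 307 preserves method and body on retry
        self.send_header("Location", target)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), Resolver).serve_forever()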
[19:38:49] Oh I see below, 80GB
[19:39:00] yeah, but that depends on how many wikibase entities your sites have, basically
[19:39:12] 80GB provisioned for them, no idea how much is used though
[19:39:26] I mean, less than 80GB for 1K sites is promising
[19:39:49] that redirecting service I talked about is implemented in https://github.com/wbstack/queryservice-gateway/blob/main/resolver.js#L79 for wikibase.cloud; it looks up the details from an API call, then does the redirect
[19:40:31] The final piece of the puzzle, and probably the complex and lame bit, is the updater, which takes wiki changes and writes them into blazegraph
[19:41:18] addshore: does the allowlist support wildcards like *.miraheze.org
[19:43:12] addshore: we already push changes to irc + discord via various methods, so broadcasting RC to blazegraph doesn't sound too difficult
[19:43:28] that could be something to investigate
[19:43:36] yeah, so it has to go into a different process, lemme try and find a useful link or 2
[19:43:45] I've always wanted to be proved wrong that WDQS was impossible
[19:43:55] This sounds like a method that might help us
[19:44:14] We'd probably want some engineering help though, cause our experience with wikibase is low and we're pretty stretched
[19:44:29] I see a * at the end https://github.com/wmde/wbaas-deploy/blob/main/k8s/helmfile/env/production/queryservice.values.yaml.gotmpl#L93 :D not sure about one at the start
[19:48:02] so, the custom cloud updater is at https://github.com/wbstack/queryservice-updater which is again made to work easily with multiple sites, and thus calls a dedicated API to get the changes to update
[19:48:14] addshore: on the namespace id question, look a few lines up
[19:48:19] That's the ID
[19:48:57] 860/1 for Item and 862/3 for Property
[19:50:08] right, so can't you do your update query using those? given https://github.com/miraheze/ManageWiki/blob/b3bec1d0fb304096f573889e70f1041b8a6e4485/sql/mw_namespaces.sql#L3 ?
[19:50:32] addshore: good point
[19:50:40] I could
[19:50:43] And that would work
[19:50:43] :D
[19:52:33] addshore: I can deploy that when my brain is working
[19:52:39] sounds good
[19:52:39] Probably tomorrow
[19:52:43] And thanks for the patch
[19:52:43] my brain is about to not be working :)
[19:52:46] np!
[19:53:29] I would love it if you guys could help us look at WDQS properly
[19:53:46] I'm glad to hear that the way the WMF do it is not the only way
[19:55:43] hehe, "you guys", I don't work here any more ;)
[19:56:04] addshore: since when?
[19:56:10] 2 years or so now ;D
[19:56:19] addshore: where are you now?
[19:56:35] best bet is probably to start a phab task for it, cc me, and I can try to help on that
[19:56:49] Currently https://lightbug.io/
[19:56:51] I can do that
[19:58:10] addshore: do you have an account on our Phorge?
[19:58:15] https://issue-tracker.miraheze.org/search/query/ddX8zJYzZEds/#R
[19:58:19] I expect not!
[19:59:02] The external service ("GitHub") you just authenticated with is not configured to allow registration on this server. An administrator may have recently disabled it. =[
[20:01:05] addshore: if you don't have an MH account, I can create you a Phorge account directly if you send me your email
[20:01:12] I'll make an MH one!
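A hedged sketch of the bulk fix floated at 19:50:08, using the namespace IDs from 19:48:57 (860 for Item and 862 for Property; the talk namespaces 861/863 stay wikitext). The table and column names are guesses from the linked mw_namespaces.sql, the connection details are placeholders, and this is not the query that was actually pasted on the PR; verify against the real schema before running anything like it.

```python
# Bulk-fix the content model for existing wikis' Item/Property namespaces
# in the ManageWiki database. Schema names are unverified assumptions.
import pymysql

FIXES = {860: "wikibase-item", 862: "wikibase-property"}

conn = pymysql.connect(host="localhost", user="managewiki",
                       password="...", database="managewiki")
try:
    with conn.cursor() as cur:
        for ns_id, model in FIXES.items():
            # Only rewrite rows still on the old wikitext default, so any
            # wiki that deliberately picked another model is left alone.
            cur.execute(
                "UPDATE mw_namespaces SET ns_content_model = %s "
                "WHERE ns_namespace_id = %s AND ns_content_model = 'wikitext'",
                (model, ns_id),
            )
            print(f"namespace {ns_id}: {cur.rowcount} rows set to {model}")
    conn.commit()
finally:
    conn.close()
```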
[20:01:55] I'm there now :)
[20:04:55] it's nice to see miraheze thriving after the whole orain thing, and after the recent wikitide thing :D
[20:06:25] Our last fundraiser was our best ever
[20:06:40] We could do with a bigger base of technical volunteers though
[20:07:08] Especially with some of the bigger changes coming, like SUL3 and Parsoid
[20:07:22] Debian upgrades this year too
[20:09:15] We have got some new friends in the mediawiki mines, but that's always true
[20:35:15] Looks like vocaloidlyricswiki will need several runs. Judging from the size of `Category:Japanese_songs`, this operation is about 20-25% complete after 2 runs, so about 8-10 more are needed. There is a shell script that runs in smaller chunks at the end of [the refreshlinks documentation](https://www.mediawiki.org/wiki/Manual:RefreshLinks.php), which might be helpful. Alternatively, if a null edit on a page is equivalent to running refreshlinks on it, I could have a bot run null edits on every page, which would achieve the desired effect?
[20:48:20] MacFan4000: ^
[20:48:33] If it keeps OOMing, the shell script mentioned is likely the way to go
[20:59:03] BlankEclair was running that, not me
[20:59:10] (Just failed today)
[20:59:47] ouchie
[21:00:08] the elusive b.e.d. though
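A rough sketch of the chunked approach recommended above for the refreshLinks OOM problem, in Python rather than the shell script from Manual:RefreshLinks.php. It assumes refreshLinks.php takes a positional start page_id and an `-e` end id, as described in the manual; END and CHUNK are placeholder values to tune for the wiki in question.

```python
# Run refreshLinks.php over small page_id ranges so that an OOM kill only
# loses one chunk instead of the whole pass.
import subprocess

END = 500_000   # highest page_id to refresh; an assumption, look it up
CHUNK = 5_000   # pages per run; smaller chunks mean less memory per run

for start in range(1, END + 1, CHUNK):
    end = min(start + CHUNK - 1, END)
    result = subprocess.run(
        ["php", "maintenance/refreshLinks.php", "-e", str(end), str(start)]
    )
    if result.returncode != 0:
        # Log and keep going: a failed chunk shouldn't abort the pass, and
        # the printed range shows where to resume or retry.
        print(f"chunk {start}-{end} exited with {result.returncode}")
```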