[09:54:19] [[Tech]]; 203.106.176.114; /* Sex video */ new section; https://meta.wikimedia.org/w/index.php?diff=22401833&oldid=22400636&rcid=20811256
[09:54:28] [[Tech]]; Hulged; Undid edits by [[Special:Contribs/203.106.176.114|203.106.176.114]] ([[User talk:203.106.176.114|talk]]) to last version by Wiki13; https://meta.wikimedia.org/w/index.php?diff=22401834&oldid=22401833&rcid=20811257
[13:54:31] hi, i'm trying to write a template which would expand a diff ID into date-time format, such as {{template_name|DIFF_ID}}: that'd query the wiki-en read-only replica database, get the date and time, and replace the template text with something like "04:32 2 Dec". i haven't written a template yet, and i'm not sure templates can do this task (since i can't find any existing solutions for this goal).
[14:08:49] I don’t think this is possible, at least not in standard MediaWiki + Scribunto
[14:09:07] (there might be some more or less obscure extension offering it)
[14:21:48] Lucas_WMDE: i see.
[14:22:32] i initially thought i'd need a bot or something to do this task. but not sure, maybe we'll wait for other users to address this.
[14:39:59] what is the process to get DB maintenance scripts run?
[14:44:42] probably add it to a backport+config window
[14:44:49] as long as it doesn’t take long to run
[14:45:07] (scripts that take longer than one hour require their own deployment window)
[14:49:28] but if this is about https://gerrit.wikimedia.org/r/c/mediawiki/extensions/ProofreadPage/+/703196/, that’ll first have to be merged and then also backported to the current train version
[15:03:11] i don't actually know how long it'll take to run
[15:03:34] i know it needs to be merged; I was just wondering what the process will be after that
[15:05:23] so, say it had been merged and we were doing it right now (it hasn't, we aren't and won't, but say), I would make a backport to wmf.9
[15:06:09] but if it gets merged before the wmf.12 branch cut, it would not need to be backported, right? it can just be run?
[15:08:48] yes, as long as you’re happy to wait until wmf.12 is deployed to the relevant wikis
[15:08:57] (which would be the group1 wikis in this case if I’m not mistaken)
[15:09:04] sure, this is not urgent
[15:09:14] let me look a bit closer at the script
[15:09:46] I also recommend doing a test run on the beta cluster first, before having it run on production
[15:10:09] good info, thanks ^_^
[15:10:38] we did indeed enable the change tagging on betaWS a while back, so it should be a good test
[15:11:26] I think I would like some kind of limit option in the script, so I can run e.g. a single batch (+1 for using getBatchSize() btw) and see how long it takes
[15:11:34] and estimate from there how long it would take for all revisions
[15:14:30] would you rather the limit was in terms of revs or batches?
[15:29:46] good question
[15:30:13] I’d probably slightly lean towards revisions? but if one is easier to implement than the other, I think that’s fine too
[15:30:26] batches is slightly easier I guess
[15:31:04] but it's not a lot harder
[15:50:31] there we go
[18:19:08] I have just been staring at RefreshLinksJob, trying to figure out when and how we write to the ParserCache when re-parsing pages in the background. I can't find it. To me it looks like we don't.
[18:19:20] I HAVE to be missing something.... right? RIGHT?
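
Since templates can't query the database, the bot approach floated in the 14:22 message is the usual route: resolve the diff ID through the Action API and substitute the formatted timestamp into the page text. A minimal sketch; the helper function is hypothetical, but action=query, prop=revisions, rvprop=timestamp and revids are the standard Action API parameters.

    // Hypothetical bot-side helper: resolve a diff (revision) ID to a
    // formatted timestamp via the Action API. A bot could then replace
    // {{template_name|DIFF_ID}} in the wikitext with the result.
    // A real bot should also set a descriptive User-Agent header.
    function diffIdToTime( int $revId ): ?string {
        $url = 'https://en.wikipedia.org/w/api.php?action=query&format=json'
            . '&prop=revisions&rvprop=timestamp&revids=' . $revId;
        $data = json_decode( file_get_contents( $url ), true );
        foreach ( $data['query']['pages'] ?? [] as $page ) {
            if ( isset( $page['revisions'][0]['timestamp'] ) ) {
                // API timestamps are ISO 8601, e.g. "2021-12-02T04:32:00Z"
                $ts = strtotime( $page['revisions'][0]['timestamp'] );
                return gmdate( 'H:i j M', $ts ); // e.g. "04:32 2 Dec"
            }
        }
        return null;
    }

For the limit option discussed in the 15:11 and 15:30 messages, a sketch of how a revision-based limit might look in a MediaWiki maintenance script. The class and the batch worker are hypothetical stand-ins for the ProofreadPage script; addOption(), setBatchSize()/getBatchSize(), output() and waitForReplication() are standard Maintenance helpers.

    require_once __DIR__ . '/Maintenance.php';

    class BackfillExampleTags extends Maintenance {
        public function __construct() {
            parent::__construct();
            $this->addDescription( 'Backfill change tags on old revisions, in batches' );
            // Optional option that takes a value: cap on total revisions processed.
            $this->addOption( 'limit', 'Stop after processing this many revisions', false, true );
            $this->setBatchSize( 200 );
        }

        public function execute() {
            $limit = (int)$this->getOption( 'limit', 0 ); // 0 = no limit
            $done = 0;
            do {
                $max = $this->getBatchSize();
                if ( $limit > 0 ) {
                    $max = min( $max, $limit - $done );
                }
                $processed = $this->processBatch( $max ); // hypothetical worker
                $done += $processed;
                $this->output( "$done revisions processed so far\n" );
                $this->waitForReplication();
            } while ( $processed > 0 && ( $limit === 0 || $done < $limit ) );
        }

        private function processBatch( $max ) {
            // ...select up to $max untagged revisions, tag them, return the count...
            return 0;
        }
    }

    $maintClass = BackfillExampleTags::class;
    require_once RUN_MAINTENANCE_IF_MAIN;

Running a single capped batch (e.g. --limit 200) gives a per-revision timing from which the full run time can be estimated, per the 15:11 plan.
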
[20:32:58] Sue
[20:34:04] duesen: If memory serves, there's a time cut-off where we don't write for quick parses where the only demand was cascading link updates, and then a write
[20:41:06] The refactoring around ParserOutputAccess and ContentHandler more recently may have changed or moved it. I've not been able to keep up
[20:50:26] There is also PageRenderer, which has added additional things I'm not too familiar with
[20:50:51] I was doing a search for callers to ->save, which is only a handful, but that didn't help
[20:51:36] Looking at RefreshLinksJob directly, I see the cache-miss case calls PageRenderer indeed. Last relevant change was: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/f588586e16c9c8043d6db1a262a00db4c12dc9a1%5E%21/includes/jobqueue/jobs/RefreshLinksJob.php
[20:52:37] This says that PageUpdater broke the ability of RefreshLinksJob to save new parser outputs (which it originally did for parses taking >3s)
[20:52:52] So the dead code was removed in 2019
[20:54:11] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/501444
[20:56:06] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/465157/
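
For reference, the behaviour the 20:52 messages reconstruct, as it roughly worked before the 2019 removal: RefreshLinksJob saved the output of a background re-parse to the ParserCache only when the parse was slow enough to be worth caching. An illustrative sketch under that assumption, with simplified names; this is not the actual removed core code.

    // Illustrative sketch of the pre-2019 RefreshLinksJob behaviour
    // described above (simplified; not the actual removed code).
    $parserCache = MediaWikiServices::getInstance()->getParserCache();

    $start = microtime( true );
    $output = $content->getParserOutput( $title, $revId, $parserOptions );
    $elapsed = microtime( true ) - $start;

    // Only persist expensive parses; the chat puts the old threshold at ~3s.
    // Quick parses triggered solely by cascading link updates were skipped.
    if ( $elapsed > 3.0 ) {
        $parserCache->save( $output, $page, $parserOptions );
    }

Per the linked gitiles change, PageUpdater took over cache writes on edit, which is why the threshold logic in RefreshLinksJob became dead code and was dropped.
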