[11:29:55] > first you hit the options row and you get the address and then you hit the actual HTML. As a result, misplacing 1/4th of the rows led to halving the parser cache hit rate. https://phabricator.wikimedia.org/T133523
[11:32:36] apergos: Amir1: coincidentally, the topic of hashing comes up here today in the context of ParserCache and its SqlBag class. What we did for WAN cache back then with the "coalesceScheme" for sister keys is absent from SqlBag; it looks like it could benefit from something similar (if it weren't for Parsoid obsoleting it)
[11:34:02] I wonder where else it might be absent from (that won't be replaced soon)
[11:34:03] SRE just added a fourth pool to parser cache, and while that displaces 1/4th of the keys, it effectively halves the hit rate because we currently cache via a level of indirection: key1 -> value1, key2(value1) -> realvalue
[11:35:00] The idea of sister keys isn't very common. But yeah, we have at least two that are conceptually similar.
[11:36:16] guess I should learn about the impacts of 'parsoid for read' sometime soon too
[11:58:33] I always wondered if that level of indirection was really needed
[12:45:12] taavi: regarding migration to virtual domains, for the updater you could do what this is doing: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/WikimediaMaintenance/+/972387/1/addWiki.php it's not great and I promise to implement the updater support for it ASAP
[21:00:40] Krinkle: Any idea who other than t.gr|away might be reasonably able to review the OAuth hack for the wikitech bug? https://gerrit.wikimedia.org/r/c/mediawiki/extensions/OAuth/+/972913
[21:01:24] git history makes me think that Gergo might be the only person left who cares :/
[21:04:08] https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+log/HEAD/includes/block/BlockManager.php
[21:04:21] This is where the recursion happens; it was recently refactored a fair bit by TimStarling
[21:05:15] I mentioned yesterday that I'd recommend a review from Tim or Gergo on whether we can fix core to revert the added isAllowed() call that caused the recursion we see on the wikitech wiki
[21:05:31] I see that the proposed patch actually isn't what we tried yesterday, but something in the OAuth extension instead
[21:06:15] The call was added in https://gerrit.wikimedia.org/r/c/mediawiki/core/+/966657/10/includes/user/User.php
[21:11:09] I've added some details to https://phabricator.wikimedia.org/T350836#9321070
[21:14:45] Krinkle: Thanks for the thoughts and task updates. I know OAuth has surfaced this kind of unstubbing recursion before, but I can't remember if we have generally figured out how to break the cycle in AuthManager or OAuth when it shows up. OAuth makes it obvious (once you see the stack traces) because of that nonce consumption feature.
[21:15:39] I feel compelled to follow this because of wikitech, but also unprepared to actually do the work myself.
[21:39:32] tgr|away: the main intention for extension.json was actually to introduce an abstraction layer over extensions setting $wg globals, so we could move away from globals (though the main benefits seem to have been realized in other areas). So there became a distinction between globals for extensions to register stuff and globals to configure stuff; extensions shouldn't be setting the latter, and should instead use the defined interface (hooks or sometimes subclassing) to adjust things. Hence why extensions shouldn't set "config" settings.
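(To make the registration-vs-configuration distinction above concrete, a minimal PHP sketch: $wgDefaultUserOptions stands in for a site "config" setting an extension shouldn't clobber, and UserGetDefaultOptions is a real core hook offering a defined interface for the same adjustment, though whether it suits a given extension will vary. A modern extension would register the handler under the Hooks key of its extension.json rather than via $wgHooks; the global form is used here only to keep the sketch self-contained.)

```php
<?php
// Discouraged: extension setup code overwriting a site "config" global,
// silently fighting with whatever LocalSettings.php chose.
$wgDefaultUserOptions['skin'] = 'myskin';

// Preferred: adjust behavior through a defined interface, such as a hook.
$wgHooks['UserGetDefaultOptions'][] = static function ( array &$defaultOptions ) {
	$defaultOptions['skin'] = 'myskin';
};
```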
[21:39:49] (sorry for the late reply, and we can follow up in person tonight or this weekend :))
[22:35:49] I'm looking
[22:42:58] ok, so bd808 is using the term "unstubbing" loosely. I was wondering how anything managed to get an actual stub user these days, and the answer is that it doesn't
[22:43:37] yeah, sorry, old brain words. it's really just loading the session in this case I guess?
[22:44:38] OAuth makes that sort of weird and similar to the old User unstub flow because of how it works
[22:46:55] it's a recursive call to WebRequest::getSession()
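(For readers following the traces: a self-contained PHP sketch of the cycle described above, using invented class, method, and right names rather than MediaWiki's actual SessionManager/OAuth code. Loading the session invokes an OAuth-style provider, consuming the nonce triggers a rights check, and the rights check re-enters session loading. A re-entrancy flag that falls back to an anonymous session is one common way to break such a cycle; whether core or OAuth should do so is exactly what the linked patches debate.)

```php
<?php
// Toy model of the recursion: getSession() -> provideSessionInfo()
// -> isAllowed() -> getSession(). Class, method, and right names are
// illustrative, not MediaWiki's real ones.
class FakeRequest {
	private bool $loadingSession = false;
	private ?string $session = null;

	public function getSession(): string {
		if ( $this->session !== null ) {
			return $this->session;
		}
		if ( $this->loadingSession ) {
			// Re-entrant call: hand back an anonymous placeholder
			// instead of recursing until the stack overflows.
			return 'anon-session';
		}
		$this->loadingSession = true;
		try {
			$this->session = $this->provideSessionInfo();
		} finally {
			$this->loadingSession = false;
		}
		return $this->session;
	}

	private function provideSessionInfo(): string {
		// OAuth-style provider: consuming the nonce checks a user
		// right (right name invented for the sketch).
		$this->isAllowed( 'mwoauth-use' );
		return 'oauth-session';
	}

	private function isAllowed( string $right ): bool {
		// The isAllowed() path added in the core change cited above
		// ends up loading the session again, closing the loop.
		$this->getSession();
		return true;
	}
}

echo ( new FakeRequest() )->getSession(), "\n"; // "oauth-session", no blown stack
```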