|
2026-01-05 14:43:32
|
<Lucas_WMDE>
|
ori: looking at https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1220661/2/includes/Config/SiteConfiguration.php, I wonder if you’d be interested in benchmarking array_key_exists() vs. \array_key_exists()?
|
|
2026-01-05 14:43:46
|
<Lucas_WMDE>
|
(I’m guessing that you already have a good setup to benchmark MW whereas I don’t ^^)
|
|
2026-01-05 14:44:30
|
<Reedy>
|
It would be good to have a policy on that and/or having use statements for these global functions...
|
|
2026-01-05 14:44:52
|
<Reedy>
|
Didn't Tim have a benchmark for this at one point?
|
|
2026-01-05 14:44:54
|
<Lucas_WMDE>
|
yeah I’ve been wondering about that in the back of my head for a while, but performance numbers on it would be a much better argument
|
|
2026-01-05 14:48:04
|
<Lucas_WMDE>
|
(for reference, https://github.com/php/php-src/commit/f5044a12dd and https://github.com/php/php-src/commit/0bbfebb6d9, via https://github.com/php/php-src/pull/3360, are the optimizations that could benefit from knowing that array_key_exists() isn’t a function in MediaWiki\Config\ or MediaWiki\)
|
|
2026-01-05 14:49:24
|
<Lucas_WMDE>
|
Reedy: I can’t find anything on Phabricator at least (but it doesn’t help with the search that “namespace” is a very overloaded word)
|
|
2026-01-05 14:49:42
|
<Reedy>
|
I can't remember if it was in a library...
|
|
2026-01-05 14:49:51
|
<Reedy>
|
Similarly searching for global functions isn't helpful either :D
|
|
2026-01-05 14:53:56
|
<taavi>
|
do we still sometimes load it from a symfony polyfill or is that old enough to be in all the versions we support?
|
|
2026-01-05 14:54:47
|
<Reedy>
|
array_key_exists
|
|
2026-01-05 14:54:47
|
<Reedy>
|
(PHP 4 >= 4.0.7, PHP 5, PHP 7, PHP 8)
|
|
2026-01-05 16:49:51
|
<ori>
|
Lucas_WMDE: that's so interesting, thanks for flagging it! Busy week at work but I'll try to benchmark it in the next couple of days. Please poke me if you don't hear back.
|
|
2026-01-06 02:59:20
|
<ori>
|
Lucas_WMDE: it made no difference (same benchmark, getAll() for enwiki on a deployment host with the change live-patched)
|
|
2026-01-06 03:11:59
|
<ori>
|
It does "work" though. I ran my benchmark script with `php -d zend_extension=opcache.so -d opcache.enable=1 -d opcache.enable_cli=1 -d opcache.opt_debug_level=0x20000` to dump the optimized generated opcodes. Without the leading slash, PHP emits:
|
|
2026-01-06 03:12:28
|
<ori>
|
0008 INIT_NS_FCALL_BY_NAME 2 string("MediaWiki\\Config\\array_key_exists")
|
|
2026-01-06 03:12:30
|
<ori>
|
0009 SEND_VAL_EX string("default") 1
|
|
2026-01-06 03:12:34
|
<ori>
|
With the leading slash, PHP emits:
|
|
2026-01-06 03:12:42
|
<ori>
|
0008 T5 = ARRAY_KEY_EXISTS string("default") CV0($thisSetting)
|
|
2026-01-06 03:16:12
|
<ori>
|
it's just that in this particular case (I906d5b3f2) the value of the optimization is in all the hairy tag-iteration logic that gets short-circuited. Making the array_key_exists() a shade faster doesn't meaningfully move the needle.
|
|
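(A minimal standalone sketch of the resolution difference being discussed, assuming plain PHP CLI with no MediaWiki code loaded — the function and namespace names here are illustrative, not from the actual patch. An unqualified call inside a namespace must be looked up at runtime, via `INIT_NS_FCALL_BY_NAME`, in case a namespaced `array_key_exists()` exists; a fully-qualified call resolves at compile time, letting opcache emit the specialized `ARRAY_KEY_EXISTS` opcode.)

```php
<?php
// Illustrative sketch: unqualified vs. fully-qualified calls to a global
// function inside a namespace. Names are hypothetical, not from I906d5b3f2.
namespace MediaWiki\Config;

function lookupUnqualified( array $setting ): bool {
	// Resolved at runtime: PHP must first check whether
	// MediaWiki\Config\array_key_exists() exists, then fall back to global.
	return array_key_exists( 'default', $setting );
}

function lookupQualified( array $setting ): bool {
	// Resolved at compile time: always the global function, so opcache
	// can replace the call with the dedicated ARRAY_KEY_EXISTS opcode.
	return \array_key_exists( 'default', $setting );
}

var_dump( lookupUnqualified( [ 'default' => 1 ] ) ); // bool(true)
var_dump( lookupQualified( [ 'other' => 1 ] ) );     // bool(false)
```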
2026-01-06 03:24:00
|
<ori>
|
standalone benchmark: https://phabricator.wikimedia.org/P86767 (LLM-generated).
|
|
2026-01-06 03:25:19
|
<ori>
|
On a production host, PHP 8.3.26, 10m iterations: array_key_exists(): 0.2389 seconds, \array_key_exists(): 0.1681 seconds (~30% faster)
|
|
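(The actual benchmark script is the one in P86767; the following is only a sketch of the general shape of such a microbenchmark, with the iteration count reduced from the 10M used for the quoted numbers. The timing gap only appears with opcache enabled, e.g. `php -d zend_extension=opcache.so -d opcache.enable_cli=1 bench.php`.)

```php
<?php
// Sketch of a namespaced microbenchmark: unqualified vs. fully-qualified
// array_key_exists(). Not the P86767 script; names here are illustrative.
namespace Bench;

const N = 100_000; // the quoted 0.2389s vs. 0.1681s used 10M iterations

$arr = [ 'default' => 1, 'other' => 2 ];

$start = \hrtime( true );
for ( $i = 0; $i < N; $i++ ) {
	array_key_exists( 'default', $arr ); // runtime namespace lookup
}
$unqualified = \hrtime( true ) - $start;

$start = \hrtime( true );
for ( $i = 0; $i < N; $i++ ) {
	\array_key_exists( 'default', $arr ); // compile-time resolution
}
$qualified = \hrtime( true ) - $start;

printf( "unqualified: %.4f ms\nqualified:   %.4f ms\n",
	$unqualified / 1e6, $qualified / 1e6 );
```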
2026-01-06 03:28:59
|
<ori>
|
well, but..
|
|
2026-01-06 03:31:30
|
<ori>
|
remembering now that the short-circuit only applies to a third of cases and the fallback code is also array_key_exists()-heavy, I fully-qualified the array_key_exists() calls in the fallback code as well
|
|
2026-01-06 03:32:28
|
<ori>
|
With that, getAll() for enwiki goes from 0.952 ms to 0.688 ms, ~40% faster!
|
|
2026-01-06 03:55:09
|
<ori>
|
Lucas_WMDE: please create a change, it's a nice win and should be credited to you. You can cite these numbers in your commit message if you like.
|
|
2026-01-06 11:35:42
|
<Lucas_WMDE>
|
ori: nice, thanks! I’ll try to put together the commit
|
|
2026-01-06 11:36:16
|
<Lucas_WMDE>
|
I was wondering if this would be something to enforce via phpcs, but if we do it across the whole code base I think it might end up being a wash performance-wise, since we’d be adding more code bytes to parse 🤔
|
|
2026-01-06 11:38:53
|
<Lucas_WMDE>
|
bah, `use array_key_exists;` doesn’t work? I thought we could declare the non-namespaced function that way :/
|
|
2026-01-06 11:45:05
|
<taavi>
|
Lucas_WMDE: `use function array_key_exists;` iirc?
|
|
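(For reference, the `use function` form taavi mentions imports the global function into the namespace at compile time, with the same effect as writing `\array_key_exists()` at every call site; plain `use array_key_exists;` imports a *class* name, which is why it doesn't do what was expected. A minimal sketch with an illustrative function name:)

```php
<?php
namespace MediaWiki\Config;

// Compile-time import of the global function; call sites below stay
// unqualified but skip the runtime namespace lookup.
use function array_key_exists;

function hasDefault( array $setting ): bool {
	return array_key_exists( 'default', $setting );
}

// Note array_key_exists() (unlike isset()) is true even for null values.
var_dump( hasDefault( [ 'default' => null ] ) ); // bool(true)
```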
2026-01-06 11:47:52
|
<Lucas_WMDE>
|
ori, Krinkle: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1223644
|
|
2026-01-06 11:47:58
|
<Lucas_WMDE>
|
taavi: ah thanks, https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1223645
|
|
2026-01-06 16:55:48
|
<A_smart_kitten>
|
hey all, genuine question - do folks think that there's value in having T410514 open as a general task for that specific PHP deprecation? asking because i'm planning to file tasks relating to specific pieces of code that need addressing (for that PHP8.5 deprecation); but i'm not currently 100% sure of its utility as a (parent) task in general,
|
|
2026-01-06 16:55:49
|
<stashbot>
|
T410514: Using null as array offset or as the key parameter for array_key_exists() is deprecated in PHP 8.5 - https://phabricator.wikimedia.org/T410514
|
|
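(A sketch of the pattern T410514 is about, based on the task title rather than any specific MediaWiki code: null as an array offset, or as the key passed to `array_key_exists()`, is deprecated in PHP 8.5; previously null was silently coerced to the empty-string key `""`. One possible fix is to make the coercion, or the null check, explicit:)

```php
<?php
// Illustrative data; the null-keyed access patterns below are what
// PHP 8.5 deprecates.
$settings = [ '' => 'empty-string key', 'real' => 'value' ];
$key = null;

// Deprecated in PHP 8.5 (previously null coerced silently to ""):
//   $settings[$key];
//   array_key_exists( $key, $settings );

// Possible fixes: coerce explicitly, or guard against null.
var_dump( array_key_exists( $key ?? '', $settings ) );            // bool(true)
var_dump( $key !== null && array_key_exists( $key, $settings ) ); // bool(false)
```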
2026-01-06 16:55:55
|
<A_smart_kitten>
|
given that (to me) this seems like something that might more need to be addressed on the basis of individual deprecation reports, as opposed to something that can (e.g.) easily be found & fixed by Codesearch.
|
|
2026-01-06 17:14:25
|
<Lucas_WMDE>
|
I think I would be tempted to just attach a bunch of changes directly to that general task, and not bother creating subtasks for each occurrence
|
|
2026-01-06 17:14:44
|
<Lucas_WMDE>
|
maybe with subtasks for broad groups like “core”, “WMF-deployed extensions”, “WMF-deployed skins” etc.
|
|
2026-01-06 17:15:03
|
<Lucas_WMDE>
|
but I don’t remember how tasks have been handled for other deprecations. I think various patterns have been used
|
|
2026-01-06 17:17:50
|
<Reedy>
|
There's no one size fits all answer ;)
|
|
2026-01-06 17:29:00
|
<A_smart_kitten>
|
fair fair
|
|
2026-01-06 17:31:34
|
<A_smart_kitten>
|
Lucas_WMDE: i guess the thing in my mind is that i could (e.g.) create a 'core' subtask, but then right now i only know about the failures in core that i'm seeing when running the tests locally
|
|
2026-01-06 17:31:48
|
<A_smart_kitten>
|
whereas there might be more that the tests aren't covering
|
|
2026-01-06 17:32:34
|
<A_smart_kitten>
|
so i guess it wouldn't be clear to me what the acceptance criteria for that task would be :p
|
|
2026-01-06 17:32:44
|
<Reedy>
|
at some point we just close it
|
|
2026-01-06 17:32:53
|
<Reedy>
|
because it's stale/superseded/whatever
|
|
2026-01-06 17:33:52
|
<A_smart_kitten>
|
Reedy: yeah i mean, that makes sense
|
|
2026-01-06 17:35:22
|
<A_smart_kitten>
|
anyways, there are 14 files in core that cause that specific error in tests (when I run them locally, I don't think they're visible on `check experimental` jobs yet bc of the order in which WMF CI runs the tests)
|
|
2026-01-06 17:36:32
|
<A_smart_kitten>
|
would it be better to file them as one task (for CI failures caused by this error in core) or separately? happy to do either but it does strike me that it might be a bit of an unwieldy task description if i try to put them all in the same one..
|
|
2026-01-06 17:39:52
|
<Reedy>
|
I don't think there's any real right answer...
|
|
2026-01-06 17:40:12
|
<Reedy>
|
They're all a similar class of problem, and if they're in the same codebase...
|
|
2026-01-06 17:43:02
|
<A_smart_kitten>
|
'I don't think there's any real right answer...' yeah, i see :/
|
|
2026-01-06 17:43:08
|
<A_smart_kitten>
|
i might file them separately, if only because i have no idea which ones might be more complicated to resolve 'properly' / which might involve more discussion than others.
|
|
2026-01-06 17:43:16
|
<A_smart_kitten>
|
ty for putting up with my questions :)
|
|
2026-01-06 17:43:42
|
<Reedy>
|
Yeah, I don't think anything is wrong with that
|
|
2026-01-06 17:43:59
|
<Reedy>
|
Maybe group together if it's in the same code file where it's occurring etc, but beyond that...
|
|
2026-01-06 18:14:20
|
<A_smart_kitten>
|
funnily enough that's actually what i've already done - i've (locally) sorted the deprecation errors by type & by file that they occur in. all that is left is to file the actual tasks about them :p
|
|
2026-01-06 20:06:01
|
<Reedy>
|
A_smart_kitten: do you want to update your parsoid patch? Sounds like we can get it merged (and backported!) pretty quickly with one of your proposed solutions
|
|
2026-01-06 20:35:09
|
<A_smart_kitten>
|
Reedy: yep, will hopefully do within the next few hours (and barring that, early-ish tomorrow) :) it's been on my mental list of things to do but i kept either getting sidetracked or having other stuff to do, apologies!
|
|
2026-01-06 21:18:21
|
<Reedy>
|
heh
|
|
2026-01-06 21:18:23
|
<Reedy>
|
rabbit holes
|
|
2026-01-06 21:20:40
|
<A_smart_kitten>
|
indeed indeed
|
|
2026-01-06 22:34:24
|
<AaronSchulz>
|
Krinkle: would be nice to have a 24.05.2 version of fresh that includes f9a996f .
|
|
2026-01-06 23:20:06
|
<ori>
|
Lucas_WMDE: enforcing this via phpcs would be very nice. It's fair to assume there are (and will be) other hot code paths using array_key_exists(). It's basically free.
|
|
2026-01-06 23:23:53
|
<ori>
|
Anyone want to +2 https://gerrit.wikimedia.org/r/c/mediawiki/extensions/EventStreamConfig/+/1220048 ? It got a +1 from Ottomata
|
|
2026-01-06 23:44:21
|
<Reedy>
|
lol
|
|
2026-01-06 23:44:39
|
<Reedy>
|
wikibase uses array_key_exists 160 times vs core's 187
|
|
2026-01-07 00:05:18
|
<ori>
|
thanks Reedy
|
|
2026-01-07 07:44:18
|
<Krinkle>
|
Amir1: there's quite a lot more under https://en.wikipedia.org/wiki/Special:LinkSearch?target=wikimedia.org than just wikis. I imagine we'll see a noticeable drop in db size once this is fully internalised by refreshLinks over time. (Assuming no optimize for most hosts, I expect this will only show up visually in your graphs as a lack of growth rather than actual shrinkage)
|
|
2026-01-07 07:44:45
|
<Krinkle>
|
e.g upload.wikimedia.org and mail.wikimedia.org as well
|
|
2026-01-07 10:16:23
|
<Krinkle>
|
https://grafana-rw.wikimedia.org/d/QLtC93rMz/backend-pageview-timing?orgId=1&from=now-90d&to=now&timezone=utc&var-platform=$__all&forceLogin=true&viewPanel=panel-43
|
|
2026-01-07 10:17:02
|
<Krinkle>
|
mobile latency regressed from ~300ms to ~600ms at p75
|
|
2026-01-07 10:17:09
|
<Krinkle>
|
desktop elevated as well but much less so
|
|
2026-01-07 10:17:37
|
<Krinkle>
|
cscott: ^ may be of interest, re: parsoid and output transform changes
|
|
2026-01-07 10:17:53
|
<Krinkle>
|
https://grafana-rw.wikimedia.org/d/QLtC93rMz/backend-pageview-timing?orgId=1&from=now-90d&to=now&timezone=utc&var-platform=$__all&forceLogin=true
|
|
2026-01-07 10:40:36
|
<Lucas_WMDE>
|
ori: see T413958 for enforcing this via phpcs
|
|
2026-01-07 10:40:37
|
<stashbot>
|
T413958: Enforce `use function` import for array_key_exists() (and other common PHP functions?) in namespaced files via PHPCS - https://phabricator.wikimedia.org/T413958
|
|
2026-01-07 11:45:20
|
<A_smart_kitten>
|
Reedy: at long last, i managed to avoid getting sidetracked for long enough to push an update to the parsoid patch :p
|
|
2026-01-07 12:42:06
|
<Amir1>
|
Krinkle: yup, I estimate it'll be dropping 200M rows in commons \o/
|
|
2026-01-07 16:37:21
|
<aarohi>
|
Hi! I've submitted a patch for T413938 (diff pages crashing in Chrome 65-66 due to timezone RangeError). Fixed DateFormatter to use 'Etc/GMT' instead of 'Etc/GMT-0' for zero offset. Patch: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1224121 - would appreciate a review!
|
|
2026-01-07 16:37:22
|
<stashbot>
|
T413938: Diff pages do not render on older browsers: Last modified bar throws exception meaning diff popup overlay doesn't render - https://phabricator.wikimedia.org/T413938
|
|
2026-01-07 16:43:00
|
<Reedy>
|
aarohi: Generally you don't need to go around asking for CR ~20 minutes after uploading a patch. You've correctly attached it to a bug, so the reporter will see that "soon" and usually would respond during their working day
|
|
2026-01-07 16:58:54
|
<A_smart_kitten>
|
Reedy: just to 100% clarify, in https://gerrit.wikimedia.org/r/c/mediawiki/vendor/+/1221225/comment/ba34be93_a05cd687/ are you saying i should rebase the parsoid patch (i.e. now, before it gets merged)?
|
|
2026-01-07 16:59:40
|
<Reedy>
|
There's no harm in doing so... Except when you start rebasing everytime another patch is merged into the repo (unless they conflict!)
|
|
2026-01-07 17:00:25
|
<A_smart_kitten>
|
fair enough :) i guess i'll find out if gerrit thinks the rebase is 'trivial' enough to keep the previous +1s :p
|
|
2026-01-07 17:00:37
|
<Reedy>
|
I think based on what has been merged.. it should be
|
|
2026-01-07 17:04:36
|
<A_smart_kitten>
|
you were right!
|
|
2026-01-07 17:05:07
|
<Reedy>
|
Like I say, I suspect we can get a +1 (or +2) from cscott in the next couple of days, which will get it merged
|
|
2026-01-07 17:08:33
|
<A_smart_kitten>
|
on a slight side note, it'd be nice if the CI infra could automagically load a parsoid patch into vendor/ without needing to also have an additional 3 DNM patches to get it to work. i admit that i don't know how feasible that would be, though
|
|
2026-01-07 17:12:46
|
<Reedy>
|
If the patch was merged, you could use dev-master#hash in the vendor repo
|
|
2026-01-07 17:12:53
|
<Reedy>
|
without needing to fiddle with the repo
|
|
2026-01-07 17:13:20
|
<Reedy>
|
But this is a very uncommon workflow to be wanting to test vendorised library patches in mw core before merge like this
|
|
2026-01-07 17:14:06
|
<Reedy>
|
composer uses github, and I guess the branches for patches pushed for review won't be replicated over there.. if we could point it at gerrit, there might be a way
|
|
2026-01-07 17:22:06
|
<A_smart_kitten>
|
i mean, to be fair i can easily believe you that this is a fairly uncommon scenario re testing. so maybe something like that wouldn't be worth the implementation effort :)
|
|
2026-01-07 17:37:31
|
<aarohi>
|
Thanks for letting me know, I'm a new contributor..
|
|
2026-01-07 21:22:12
|
<AaronSchulz>
|
Krinkle: so, regarding fresh, AFAIK, install-freshnode needs a bump to FRESH_VERSION, the SHA256 for fresh-node24 should be updated to match the current content, and the tag of the bumped version should be made pointing to the current master HEAD. Then the https://gerrit.wikimedia.org/g/fresh/+/%s/bin/%s?format=TEXT download and hash check should succeed. Manually putting fresh-node24 in ~/bin seems to work for me.
|
|
2026-01-07 21:23:18
|
<AaronSchulz>
|
Krinkle: I've been using it for running candidate take-home tests that use node (some need node 24).
|