[01:35:33] The maintainer of the CS1 module on enwiki agreed to make the change to avoid the redundant '0' => '0', '1' => '1', ... string replacements.
[01:36:14] he updates the modules at a quarterly(!!!!!!!) interval
[01:37:22] s/he/they/, I don't actually know their gender
[01:38:23] I actually love that though. It's very much https://xkcd.com/2347/
[01:39:10] I pointed out a possible race condition depending on the order the updates are applied to Module:CS1 and Module:CS1/Configuration
[01:39:40] "When I do the updates, I open a new browser window and then update each module that needs updating in its own tab and then publish the submodules one after the other as quickly as I can with Module:Citation/CS1 as the last to be published. Usually only a handful of articles get refreshed while the module suite is out of sync. I have an AWB script that can rapidly null edit those pages so
[01:39:42] the disruption is minimized."
[01:40:08] https://en.wikipedia.org/w/index.php?title=Help_talk%3ACitation_Style_1&diff=1334312753&oldid=1334309487
[01:41:03] what a badass
[01:52:26] disruption to users maybe
[01:52:31] what about the poor servers?!!?
[02:04:01] it's not unreasonable, they're one person, it's about five quintillion lines of Lua, and a bug could propagate to a lot of articles pretty quickly
[02:04:35] it's happened enough times with templates etc
[02:20:28] I assume articles are considered as "using" both of those pages via templatelinks, so the last edit should trigger a recursive refreshlinks job of its own that will chase/supersede the previous ones.
[02:21:36] I don't think they'll cancel out entirely since they do root at different pages, and in theory they can have a different dependency graph, so they probably just end up racing each other, but they shouldn't parse the same pages twice given that any parse after both edits will suffice and the next one will no-op (which we optimise for).
[02:21:49] basically two jobs working together.
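A minimal sketch of the fix discussed at 01:35, assuming the module keeps a digit-translation table and applies each pair as a string replacement; the names here are hypothetical illustrations, not the actual CS1 code:

```python
# Hypothetical digit-translation table: on a wiki whose local digits are
# already Western Arabic numerals, most entries are identity mappings.
digit_map = {'0': '0', '1': '1', '2': '2', '۰': '0', '۱': '1'}

def prune_identity(mapping):
    """Drop key == value pairs so the replacement loop never runs no-op substitutions."""
    return {k: v for k, v in mapping.items() if k != v}

def translate_digits(text, mapping):
    """Apply each remaining replacement (stands in for repeated gsub calls)."""
    for old, new in prune_identity(mapping).items():
        text = text.replace(old, new)
    return text

print(translate_digits('p. ۱۰', digit_map))  # only the two non-identity pairs do any work
```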
[02:22:07] (with the second one first catching up to re-parse/fixup the ones from the first so far)
[02:22:58] The second should be able to catch up because it'll start going very fast with no-ops once it reaches pages the first one parsed after both edits.
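A toy sketch of the "two jobs working together" reasoning above, under the assumption that any parse done after both edits satisfies both jobs and that later passes over such pages no-op; this is an illustration, not the actual refreshlinks implementation:

```python
# Toy model of two refresh jobs racing after two module edits, assuming each
# page's parse is cached against the newest revision it depends on.
# All names here are illustrative, not MediaWiki internals.
latest_rev = 0        # newest revision across the whole module suite
page_parsed_at = {}   # page -> revision its cached parse reflects

def edit_module():
    """An edit to either module bumps the suite-wide revision."""
    global latest_rev
    latest_rev += 1

def refresh(pages):
    """One pass of a refresh job: re-parse stale pages, no-op on fresh ones."""
    reparsed = 0
    for p in pages:
        if page_parsed_at.get(p, -1) < latest_rev:
            page_parsed_at[p] = latest_rev
            reparsed += 1
    return reparsed

pages = range(5)
edit_module()                # first module published
first_job = refresh(pages)   # job 1 parses all 5 pages
edit_module()                # second module published shortly after
second_job = refresh(pages)  # job 2 redoes them: job 1 parsed before both edits landed
catch_up = refresh(pages)    # any later pass no-ops, so it catches up fast
```

The third pass returning zero is the "going very fast with no-ops" behaviour: once a page has been parsed after both edits, every subsequent job skips it.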