[16:45:23] Krinkle: followup question to our discussion about configs - where does the builder for doc.wikimedia.org live, or some docs about it? I wanna experiment with adding a fancy page generated from schema
[16:46:40] A lot of it is in https://github.com/wikimedia/integration-docroot
[16:46:46] Combined with stuff in Jenkins jobs etc
[16:46:54] Pchelolo: it's a static apache server. data comes from jenkins post-merge jobs. can be built by anything you like. For mw core, we use maintenance/mwdocgen.php, which mostly just shells out to Doxygen. But you could attach other things to it, e.g. produce a markdown file based on a JSON file.
[16:47:10] we used to have one or two generated pages like that, and we experimented with generating hooks docs from hooks.txt that way
[16:47:18] but ended up going with hook interfaces, which made that obsolete
[16:47:34] ok, gotcha, thank you
[16:47:44] Is it still running on PHP 7.0?
[16:47:55] * Reedy suspects not - https://github.com/wikimedia/integration-docroot/blob/master/composer.json#L3
[16:48:07] Pchelolo: to clarify, basically we run that script, and then rsync mediawiki-core/docs/html to $docroot/:project/:branchOrTag
[16:48:22] with some proxy in between to avoid security issues
[16:49:12] doc1001 is a Wikimedia Documentation Server (doc)
[16:49:12] Debian GNU/Linux 9.13 (stretch)
[16:49:12] krinkle@doc1001:~$ php --version
[16:49:12] PHP 7.0.33-0+deb9u12 (cli)
[16:50:09] lol
[16:50:11] what a mess :)
[16:50:45] ok, thank you. I'll poke you when I have something (hopefully) nice to show
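The "markdown file based on a JSON file" idea above could be prototyped as a small standalone script whose output lands in the same docs/html tree that the post-merge job rsyncs to the docroot. A minimal PHP sketch, assuming a hypothetical schema.json with top-level properties — the file names and schema fields here are illustrative, not part of the actual pipeline:

```php
<?php
// Hypothetical sketch: turn a JSON schema into a Markdown page that the
// existing post-merge job could publish alongside the Doxygen output.
// "schema.json" and "docs/html/schema.md" are made-up paths for illustration.

$schema = json_decode( file_get_contents( 'schema.json' ), true );
if ( $schema === null ) {
	fwrite( STDERR, "Could not parse schema.json\n" );
	exit( 1 );
}

$out = "# " . ( $schema['title'] ?? 'Configuration schema' ) . "\n\n";
if ( isset( $schema['description'] ) ) {
	$out .= $schema['description'] . "\n\n";
}

// One section per top-level property, with type, default, and description.
foreach ( $schema['properties'] ?? [] as $name => $prop ) {
	$out .= "## `$name`\n\n";
	$out .= "- Type: `" . ( $prop['type'] ?? 'unknown' ) . "`\n";
	if ( isset( $prop['default'] ) ) {
		$out .= "- Default: `" . json_encode( $prop['default'] ) . "`\n";
	}
	if ( isset( $prop['description'] ) ) {
		$out .= "\n" . $prop['description'] . "\n";
	}
	$out .= "\n";
}

file_put_contents( 'docs/html/schema.md', $out );
```

Nothing here depends on the docroot itself: per the explanation above, anything written into mediawiki-core/docs/html gets rsynced to $docroot/:project/:branchOrTag, so a generated Markdown file would be published the same way as the Doxygen HTML.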
[16:51:50] Reedy: I vaguely recall updating the linters in that repo to 7.2+, and then reverting it after use of a php72ism produced a fatal, to prevent mistakes
[16:51:54] I probably missed the require field
[16:52:03] lol
[16:52:29] https://github.com/wikimedia/integration-docroot/blob/HEAD/shared/Page.php#L3
[16:52:37] https://phabricator.wikimedia.org/T247653
[16:52:45] "Open, Stalled,"
[16:53:09] "stalled" "I am not sure if there is actually a technical blocker here that keeps us from simply switching."
[16:53:30] I'm guessing it just needs someone to confirm it works fine and comment that on the task
[16:53:30] well, the data needs to be synced, and the producers need to know where to push to instead.
[16:53:59] and yeah someone needs to test it with an ssh proxy and host override or something
[16:54:24] or we can test it afterwards and keep the old one for a few days, it's fairly low stakes
[16:54:41] If the revert is relatively trivial
[16:54:47] And it's only documentation!
[16:55:11] if we make the producers use a cname and reduce the amount of other hardcoding, that would be trivial
[16:55:31] I wonder what we do today to keep the codfw one in sync, if anything
[16:55:51] https://gerrit.wikimedia.org/g/operations/puppet/+/11504e4a1f8f215b369216e7649b5f251426c00d/hieradata/common/scap/dsh.yaml#145
[16:56:24] ok, I guess nothing, it's not even used in most places.
[16:56:24] nothing I guess, as there wasn't a server in codfw previously
[16:56:24] ah okay, so that part is new
[16:56:35] that's for the integration/docroot scap, I don't see anything in puppet for moving the rsynced data to other nodes
[16:56:56] profile::doc has an rsync server for incoming data from contint*, but not for moving it to other nodes
[16:58:10] right
[16:58:24] and the task doesn't mention anything about this changing, so the codfw one is just a spare
[16:59:35] we probably want to at least have something that constantly syncs things from eqiad->codfw
[17:00:18] we do have backups though
[17:14:49] yeah, but restoring from backups still takes some effort, and setting up an rsync server module and a systemd timer to periodically sync it is not that complicated
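As a rough illustration of that last point, the eqiad->codfw pull could be as small as a oneshot service plus a timer on the codfw host. This is a sketch only: the rsync module name ("doc"), the destination path /srv/doc, and the unit names are assumptions, and doc1001.eqiad.wmnet is taken from the session above; in production this would presumably be templated through the profile::doc Puppet code rather than written by hand.

```ini
# /etc/systemd/system/doc-sync.service (hypothetical)
[Unit]
Description=Sync published docs from the active doc server

[Service]
Type=oneshot
# Pull from an assumed "doc" rsync module on the eqiad host;
# --delete keeps removals in sync as well.
ExecStart=/usr/bin/rsync -a --delete rsync://doc1001.eqiad.wmnet/doc/ /srv/doc/
```

```ini
# /etc/systemd/system/doc-sync.timer (hypothetical)
[Unit]
Description=Periodically sync published docs

[Timer]
OnCalendar=hourly
RandomizedDelaySec=300

[Install]
WantedBy=timers.target
```

Enabling it would then just be `systemctl enable --now doc-sync.timer` on the spare host, and failing back over stays a matter of pointing the producers' cname at the other server.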