[00:02:47] ^ ... using any of the APIs
[00:03:10] Special:ItemByTitle on Wikidata might be easiest
[00:04:43] or in the Action API, also on Wikidata, action=wbgetentities with sites= and titles=, e.g. https://m.wikidata.org/wiki/Special:ApiSandbox#action=wbgetentities&format=json&sites=enwiki&titles=Douglas%20Adams&formatversion=2
[00:05:21] (that one can do more than one title per request)
[00:06:01] lucaswerkmeister: Thanks, I'll try that out. I was asking because I want to make my web browser redirect articles in one Wikipedia language edition to another Wikipedia language (if such an article exists), so I'm planning to write a Tampermonkey script. If someone has a simpler way to accomplish the same result, please let me know.
[00:07:20] alternatively: https://en.m.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=pageprops&titles=Douglas%20Adams&formatversion=2&ppprop=wikibase_item
[00:07:23] sounds like you'd want prop=langlinks, then https://en.wikipedia.org/w/api.php?action=query&prop=langlinks&titles=Main%20Page&redirects=
[00:07:53] (if you're only concerned about other language editions and not the Wikidata entity itself)
[00:08:09] yeah, for that langlinks sounds better, e.g. if the interwiki link from Wikidata is overridden locally for some reason
[00:12:00] wg 3
[00:38:21] !log wikisp Sunsetting mars T364052
[00:38:24] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikisp/SAL
[00:38:25] T364052: Update wikisp instances on CloudVPS to Debian Bookworm - https://phabricator.wikimedia.org/T364052
[02:04:45] Here's the Tampermonkey script I wrote for redirecting es.wikipedia.org links to en.wikipedia.org, for those interested: http://web.archive.org/web/20240521020316/http://0x0.st/XPER.txt This is the first Tampermonkey script I've written, so bear with me.
[02:09:39] I decided to write it because most of the information I read is related to computer science and I prefer reading those topics in English. When I use search engines, results from the Spanish Wikipedia appear because I live in a Spanish-speaking country.
[05:39:48] Where do I specify the mount parameter?
[05:39:49] Is there a guide on how this mounting works? (re @wmtelegram_bot: For a build service app things are a bit more complicated as $HOME is not a persistent volume, so you would need to use...)
[05:49:59] Found it here: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Build_Service#Using_NFS_shared_storage
[05:50:00] If I understand correctly I can then:
[05:50:01] Clone my git repo.
[05:50:03] Generate the 59k files I need.
[05:50:04] Build the FastAPI container using the URL to the git repo, the build service, and mount=all.
[05:50:06] Run the webservice, and it should be able to access the generated files in the tool home.
[05:50:07] If I succeed with this I'll write it up in the wiki for other FastAPI users 😀
[05:54:04] I started using an Aptfile in my buildpack file because I needed access to ffmpeg, and man, this install time is incredibly slow. >15 minutes to install one package. Really hope I don't need to wait that long on each deploy.
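A userscript built on the prop=langlinks suggestion above could handle the es→en redirect roughly as follows. This is a minimal sketch in TypeScript, not the archived script linked at 02:04:45; the @match pattern, function name, and title handling are illustrative assumptions.

```typescript
// ==UserScript==
// @name   eswiki-to-enwiki redirect (sketch)
// @match  https://es.wikipedia.org/wiki/*
// @grant  none
// ==/UserScript==

// Ask the local wiki's Action API for the English interlanguage link only
// (lllang=en), resolving redirects along the way.
async function findEnglishTitle(esTitle: string): Promise<string | null> {
  const api = new URL("https://es.wikipedia.org/w/api.php");
  api.search = new URLSearchParams({
    action: "query",
    prop: "langlinks",
    titles: esTitle,
    lllang: "en",
    redirects: "",
    format: "json",
    formatversion: "2",
  }).toString();

  const data = await (await fetch(api)).json();
  const page = data.query?.pages?.[0];
  return page?.langlinks?.[0]?.title ?? null;
}

(async () => {
  // Current article title, e.g. "Douglas_Adams" from /wiki/Douglas_Adams.
  const esTitle = decodeURIComponent(location.pathname.replace("/wiki/", ""));
  const enTitle = await findEnglishTitle(esTitle);
  if (enTitle !== null) {
    // Redirect only when an English counterpart actually exists.
    location.replace(`https://en.wikipedia.org/wiki/${encodeURIComponent(enTitle)}`);
  }
})();
```

Because lllang=en filters the result to a single language, one request per page view is enough, and pages without an English counterpart are simply left alone.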
[07:49:42] !log copypatrol copypatrol-backend-prod-01 deploy 9300746..f61d2c0
[07:49:44] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Copypatrol/SAL
[08:18:42] !log admin stop neutron services on cloudnet1005 T364459
[08:18:48] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Admin/SAL
[08:18:48] T364459: Migrate eqiad1 cloudnets to Neutron OVS agent - https://phabricator.wikimedia.org/T364459
[10:26:54] !log bsadowski1@tools-bastion-13 tools.stewardbots Restarted StewardBot/SULWatcher because of a connection loss
[10:26:57] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stewardbots/SAL
[10:58:22] !log bsadowski1@tools-bastion-13 tools.stewardbots Restarted StewardBot/SULWatcher because of a connection loss
[10:58:25] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stewardbots/SAL
[11:12:21] heads up, network operations ongoing
[11:36:08] heads up, the critical part of the network operations is completed now
[11:36:13] no more downtime is expected
[17:01:42] !log multichill@tools-bastion-12 tools.multichill deployed infobox-missing-commons-category to empty out https://commons.wikimedia.org/wiki/Category:Uses_of_Wikidata_Infobox_missing_Commons_Category_statement
[17:01:45] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.multichill/SAL
[17:25:10] I had a question: is it possible to add additional disk space to a VM? Looking at 500 GB - 1 TB (so that we can run visual diff tests on different configurations without having to manually juggle disk space usage).
[17:29:40] subbu: yes, https://wikitech.wikimedia.org/wiki/Help:Adding_disk_space_to_Cloud_VPS_instances is the current solution for VPS storage.
[17:31:17] Default quota for a project is 80 GB, but more can be requested via the normal quota change process -- https://phabricator.wikimedia.org/project/view/2880/
[17:44:35] thanks! :)
[19:20:55] what would be the best Phabricator tag for something like "SREs in wmcs"?
[19:21:32] or "dealing with cloudcephosd servers"
[19:21:54] nothing more specific than just SRE?
[19:24:31] mutante: I added #cloud-services-team
[19:25:07] I suspect that falls under #data-services too
[19:25:44] you beat me to it. I hit save and you had done it :) thanks
[19:26:11] didn't see it at first because Phab autocompletes to other stuff when you type "cloud"
[22:33:52] When building my buildpack container for Toolforge I'm getting errors saying the image is too big. See https://gitlab.wikimedia.org/-/snippets/139 I believe this is because I'm installing `ffmpeg` using an `Aptfile`. Can I get my quota increased, or is there some way to save space?
[22:38:04] oh, I can run `build clean` to save some space, but I think I still need more quota
[22:39:39] derenrich: you can request a quota bump at . I don't know that we have processed a request for more registry space yet, but someone always has to be the first.
[22:40:12] thanks, will do. I've never run into this before, but this is the first time I've needed to depend on an apt package
[22:40:26] I was peeking at your Harbor repo to see if there were things you might clean up, but it looks like you beat me to that step.
[22:41:01] yeah, but I don't think there was much there. we first published yesterday
[22:41:49] "DENIED: adding 306.0 MiB of storage resource, which when updated to current usage of 928.4 MiB will exceed the configured upper limit of 1.0 GiB." made it sound like there might have been, but I have been fooled by those error messages before myself, so :shrug:
[22:43:13] FWIW I've requested a build service disk quota increase at https://phabricator.wikimedia.org/T355997 before :)
[22:45:44] ok, filed a request: https://phabricator.wikimedia.org/T365536
[22:47:49] digging around in the UI for our Harbor, it looks like we have things configured to try to keep the last 5 artifacts in each repo.
[22:48:21] ok, well I'm rebuilding right now (takes 20 min to build) and I'll see how much space one artifact takes
[22:48:29] derenrich: 5 GB is a huge amount of quota growth (5x the default)
[22:48:56] I mean, I don't have a sense of scale, but 5 GB seems very small to me
[22:50:14] my intuition was that 2 builds ate up 1 GB, I might do 6 deploys a day, and then I rounded up a little
[22:50:45] 5 GB * 3286 tools == lots :)
[22:51:23] well, it all depends on context. that's 1 HDD (but yeah, I have no idea what the infra looks like)
[22:54:16] per https://openstack-browser.toolforge.org/project/tools it looks like we have a 500G volume for all of tools-harbor.wmcloud.org at the moment.
[22:54:35] ok, so running clean -> build leaves my quota usage at 928.42Mi
[22:54:51] apparently ffmpeg pulls in a full graphics stack (libcairo2, libfontconfig1, x11-common)… :/
[22:55:09] I'm seeing some Wayland in there too
[22:55:16] yeah, it's bad news
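One way to see how much space each artifact takes, without digging through the Harbor UI, would be to query the registry's REST API directly. A minimal sketch in TypeScript, assuming the Harbor v2 API on tools-harbor.wmcloud.org permits anonymous reads; the project and repository names are placeholders, not values confirmed in the discussion above.

```typescript
// Sketch: list artifact sizes for one repository via the Harbor v2 REST API.
// Anonymous read access and the placeholder names below are assumptions.

interface HarborArtifact {
  digest: string;
  size: number;               // size in bytes
  push_time: string;
  tags?: { name: string }[];
}

async function listArtifactSizes(project: string, repository: string): Promise<void> {
  const base = "https://tools-harbor.wmcloud.org/api/v2.0";
  const url =
    `${base}/projects/${encodeURIComponent(project)}` +
    `/repositories/${encodeURIComponent(repository)}/artifacts?page_size=25`;

  const resp = await fetch(url, { headers: { Accept: "application/json" } });
  if (!resp.ok) {
    throw new Error(`Harbor API returned HTTP ${resp.status}`);
  }

  const artifacts: HarborArtifact[] = await resp.json();
  let total = 0;
  for (const a of artifacts) {
    total += a.size;
    const tags = a.tags?.map((t) => t.name).join(", ") ?? "<untagged>";
    console.log(`${(a.size / 1024 / 1024).toFixed(1)} MiB  ${tags}  pushed ${a.push_time}`);
  }
  console.log(`total: ${(total / 1024 / 1024).toFixed(1)} MiB across ${artifacts.length} artifacts`);
}

// Hypothetical usage: listArtifactSizes("tool-example", "tool-example");
```

With the retention policy keeping the last 5 artifacts per repo, the per-artifact size reported this way is what multiplies into the usage shown in the DENIED message above.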