[01:40:02] is there a known issue with quarry?
[01:41:26] ah, the error 500 on the stop button is already a ticket, nevermind
[18:44:28] Hi everyone! Not sure if this is the right channel to ask.
[18:44:28] I'm trying to download a list of all the files in [[Category:Images from Wiki Loves Monuments in 2011]] but one level down, and their direct category. For example, a subcategory would be [[Category:Images from Wiki Loves Monuments 2011 in Andorra]] and a file would be [[File:AND19 Caselles PM 52414.jpg]]. I would want to know the filename and that
[18:44:29] it's in Andorra (the category name would do).
[18:44:29] I launched this quarry query for this: https://quarry.wmcloud.org/query/61554 (but it's not going anywhere). Is it inefficient, or am I doing something wrong? Is there a more efficient way to get this data?
[21:20:07] effeietsanders: hello, I recommend trying petscan.wmcloud.org instead. It should be easier to use than raw SQL, and give you the result you need.
[21:23:19] urbanec: thanks, I have been looking at that, but didn't find a way of doing that which would also give me the country it's in. Am I overlooking some option? Unless I somehow call it as an API and loop, I guess...
[21:29:37] effeietsanders: I don't see a country mentioned in https://quarry.wmcloud.org/query/61554 either?
[21:29:40] or am i missing something?
[21:30:19] or, are you saying that you want the direct category with the country name in it?
[21:30:50] in that case, i recommend doing a two-step process: 1) look up the filenames via petscan, 2) get the direct categories either via the API, or via quarry
[21:31:12] yes, i would deduce country from the direct category ('subcat') :)
[21:31:35] in the 2nd step, you'd do something like `select cl_to from categorylinks where cl_to like 'Images_from_Wiki_Loves_Monuments_2011_%' and cl_from in (ids of the photos found in petscan)`
[21:32:11] hmm, how well does 'in' work when it's 200k IDs long?
[21:33:30] hmm, i did that before with a few hundred IDs there, 200k might be a bit too much
[21:33:54] ok, i guess API it is :)
[21:33:57] if you have shell access to toolforge, it should be a matter of a line in bash to run it as a dozen separate queries
[21:34:08] or, you can also try to optimize your SQL
[21:37:53] hmm... that's weird. i copied and pasted your query into a paws notebook, and it runs within a second: https://public.paws.wmcloud.org/User:Martin_Urbanec/wlm-per-year-example.ipynb
[21:38:35] I was already surprised it didn't, because a slightly different query was quite fast. I know there are problems with quarry though...
[21:39:57] Thanks for confirming it's not the query itself :) I should be able to use the notebook route instead.
[21:43:04] effeietsanders: it's probably something within quarry (you might want to file a task or ask in -cloud about that). Notebook works though (even with the 10k). Have a nice day.
[21:44:04] thanks! :)
[21:53:52] Quarry: Query in quarry runs into trouble but same query runs fine in PAWS - https://phabricator.wikimedia.org/T299292 (Effeietsanders)
[21:57:24] Quarry: Query in quarry seems not to finish but same query runs fine in PAWS - https://phabricator.wikimedia.org/T299292 (Aklapper)
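
For reference, a minimal sketch of the kind of query discussed at 18:44 and 21:37 (files one level below [[Category:Images from Wiki Loves Monuments in 2011]] together with the per-country subcategory they sit in), run from a PAWS notebook against the Commons Wiki Replica. This is not a reproduction of Quarry query 61554; the join shape, the connection parameters, and the use of the pre-migration `categorylinks.cl_to` column (holding plain category titles) are all assumptions.

```python
# Sketch only: assumes the classic categorylinks schema (cl_to = category title)
# and the standard Wiki Replica connection available inside PAWS.
import pymysql

conn = pymysql.connect(
    host="commonswiki.analytics.db.svc.wikimedia.cloud",  # Commons replica (assumed host)
    database="commonswiki_p",
    read_default_file="~/.my.cnf",   # replica credentials provided in PAWS/Toolforge
    charset="utf8",
)

# Subcategories (namespace 14) of the parent category, then files (namespace 6)
# directly inside each subcategory, returned with that subcategory's name.
SQL = """
SELECT f.page_title AS file_title, cl2.cl_to AS country_category
FROM categorylinks cl1
JOIN page cat ON cat.page_id = cl1.cl_from AND cat.page_namespace = 14
JOIN categorylinks cl2 ON cl2.cl_to = cat.page_title
JOIN page f ON f.page_id = cl2.cl_from AND f.page_namespace = 6
WHERE cl1.cl_to = 'Images_from_Wiki_Loves_Monuments_in_2011'
"""

with conn.cursor() as cur:
    cur.execute(SQL)
    rows = cur.fetchall()

print(len(rows), "files found")
```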
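And a sketch of the two-step route suggested at 21:31:35 and 21:33:57: export the page IDs from PetScan, then look up each photo's direct 'Images from Wiki Loves Monuments 2011 in ...' category from `categorylinks`, splitting the `IN (...)` list into batches rather than sending one 200k-element clause. `page_ids`, the batch size, and the reuse of `conn` from the previous sketch are placeholders, not anything taken from the log.

```python
# Second step of the PetScan route, batched instead of a single huge IN (...).
# `page_ids` would come from a PetScan export; BATCH_SIZE is an arbitrary guess.
BATCH_SIZE = 20_000

def batches(seq, size):
    """Yield consecutive slices of `seq` of at most `size` elements."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

direct_categories = {}  # page_id -> list of matching direct category titles
with conn.cursor() as cur:
    for batch in batches(page_ids, BATCH_SIZE):
        placeholders = ",".join(["%s"] * len(batch))
        # Literal % in the LIKE pattern is doubled because pymysql applies
        # %-style parameter substitution when args are passed.
        cur.execute(
            "SELECT cl_from, cl_to FROM categorylinks "
            "WHERE cl_to LIKE 'Images_from_Wiki_Loves_Monuments_2011_%%' "
            f"AND cl_from IN ({placeholders})",
            batch,
        )
        for cl_from, cl_to in cur.fetchall():
            direct_categories.setdefault(cl_from, []).append(cl_to)
```

Whether a single 200k-element `IN` clause would actually misbehave on the replicas is not verified here; the batching simply mirrors the "dozen separate queries" suggestion from the log.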