[04:27:35] * Krinkle updates https://grafana.wikimedia.org/d/000000208/edit-count to remove "bad_token" from edit failures, as that metric was apparently merged with "session_loss" in https://gerrit.wikimedia.org/r/c/mediawiki/core/+/699040
[04:27:58] TimStarling: I'm looking there for potential impact/improvement around fewer token failures/retries
[04:28:16] I didn't find any change, but I do notice a fairly consistent spike around midnight every day, going back a long while
[04:31:59] probably due to some backend error rather than expiry
[09:28:13] Hi again, thanks for your recent support. I have a couple more questions if anyone can help:
[09:28:14] 1. Is the best way to collect every image available on Wikimedia (CC BY-SA and PD) to use the commonswiki data dump ( https://dumps.wikimedia.org/commonswiki/20220720/ ), or is there a better dump aimed specifically at images? This would be for a one-off bulk ingestion, to avoid crawling the APIs and reduce our footprint on Wikimedia services.
[09:28:14] 2. Then, when working on a delta fetch using allimages, I can get the most recently uploaded images with something like https://commons.wikimedia.org/w/api.php?action=query&list=allimages&aisort=timestamp&aistart=2022-07-29T07:00:00Z&format=json , but wouldn't that only give me add operations? Is there an endpoint for deletes and changes
[09:28:15] in licenses, so that I can take down any image whose license changes from CC BY-SA/PD to something else, or an image that has been removed from Commons?
[09:28:15] Thanks for any help
[10:48:38] FabioQ: sounds like a lot of work. What is this in aid of?
[10:53:07] use those images as part of Amazon Alexa video device responses
[12:23:32] FabioQ: maybe you should have a chat with the enterprise team https://enterprise.wikimedia.com/
[13:25:19] as a note, Enterprise also told me that there's nothing image-specific right now, just text-focused offerings, so the available MediaWiki APIs should be my focus
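
A minimal sketch of the delta-fetch approach discussed above, assuming Python with the requests library. The allimages and logevents list modules, their parameters, and the File namespace number (6) are real MediaWiki API features; the script structure, the User-Agent string, and the 500-item page size are illustrative assumptions. As noted in the chat, additions come from allimages; deletions can be read from the deletion log. There is no dedicated feed for license changes, so one plausible approach (not confirmed in the chat) is to additionally watch recentchanges in the File namespace and re-parse the license templates on changed file description pages.

```python
import requests

API = "https://commons.wikimedia.org/w/api.php"  # endpoint from the chat
SESSION = requests.Session()
# Hypothetical identifying User-Agent; Wikimedia asks bots to set one.
SESSION.headers["User-Agent"] = "delta-fetch-sketch/0.1 (contact@example.org)"

def query(params):
    """Yield API result pages, following 'continue' tokens."""
    params = {**params, "action": "query", "format": "json", "formatversion": "2"}
    while True:
        data = SESSION.get(API, params=params).json()
        yield data
        if "continue" not in data:
            break
        params.update(data["continue"])

def new_images(since):
    """Images uploaded since the given ISO 8601 timestamp (adds only)."""
    params = {"list": "allimages", "aisort": "timestamp",
              "aistart": since, "aidir": "newer",
              "ailimit": "500", "aiprop": "timestamp|url"}
    for page in query(params):
        yield from page["query"]["allimages"]

def deletions(since):
    """Deletion log entries in the File namespace (6) since the timestamp."""
    params = {"list": "logevents", "letype": "delete",
              "lestart": since, "ledir": "newer",
              "lenamespace": "6", "lelimit": "500"}
    for page in query(params):
        yield from page["query"]["logevents"]

if __name__ == "__main__":
    since = "2022-07-29T07:00:00Z"  # timestamp from the chat example
    for img in new_images(since):
        print("ADD", img["name"])
    for ev in deletions(since):
        print("DEL", ev.get("title"))  # title may be suppressed on some entries
```

The generator-based `query` helper keeps continuation handling in one place; both list modules paginate the same way, by merging the returned "continue" object back into the request parameters.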