[00:40:38] [[Tech]]; SSethi (WMF); /* Question about non-Latin interwiki links */; https://meta.wikimedia.org/w/index.php?diff=25439099&oldid=25428347&rcid=28198999
[07:33:27] Platonides: want to add social.wikimedia.es to https://meta.wikimedia.org/wiki/Mastodon ?
[13:48:08] [[Tech]]; Greatder; /* Question about non-Latin interwiki links */ Grammar & Spelling, @খাত্তাব_হাসান I hope you won't mind; https://meta.wikimedia.org/w/index.php?diff=25442457&oldid=25439099&rcid=28204342
[18:19:32] I'm working on a tool that will show how large sets of articles change over time. One of the things we are using is edit counts between various time periods, which we're currently getting by first finding the current revision of a given article at each date we care about (for example, the first day of each month), and then using the REST API with the history/counts/edits endpoint to get the number of revisions between two rev_ids. We're hitting rate limits after about 1500 requests over a couple of minutes (well under 200 per second, which the docs note as a guideline). Any advice on working around this?
[18:20:16] we could make a tool to do SQL queries against the replica to get this data, probably, but I'd like to use existing APIs if possible.
[18:20:43] but I can't find any process for dealing with rate limit increases for the REST API (in contrast to the Action API)
[21:31:23] ragesoss: those limits exist to stop large-scale scraping from affecting performance for end-users
[21:31:45] so in general, if it can be got from a replica, that's better :)
[21:32:03] there is potentially some scope to increase it if we can identify your connections
[21:32:12] user-agent etc., source IP range
[21:33:09] it might be an idea to open a task on Phabricator with all that info, marked for SRE attention
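For reference, here is a minimal sketch of the edit-count lookup described in the 18:19 messages, assuming the MediaWiki REST API's page history/counts/edits endpoint with from/to revision IDs. The tool name, contact address, wiki host, revision IDs, and the 0.5 s client-side pause are illustrative assumptions, not details from the discussion; a descriptive User-Agent is included since that is one of the ways the operators say they could identify the traffic.

```python
import time
import requests

# Hypothetical identifiers for illustration; substitute your own tool name,
# contact address, and target wiki.
USER_AGENT = "ArticleChangeTool/0.1 (https://example.org/tool; tool-maintainer@example.org)"
WIKI = "https://en.wikipedia.org"

session = requests.Session()
session.headers.update({"User-Agent": USER_AGENT})


def edit_count(title: str, from_rev: int, to_rev: int) -> int:
    """Count edits to `title` between two revision IDs using the
    MediaWiki REST API history/counts/edits endpoint."""
    url = f"{WIKI}/w/rest.php/v1/page/{title}/history/counts/edits"
    resp = session.get(url, params={"from": from_rev, "to": to_rev}, timeout=30)
    resp.raise_for_status()
    return resp.json()["count"]


if __name__ == "__main__":
    # Placeholder title and revision IDs; replace with the revisions found
    # for each date of interest (e.g. the first day of each month).
    FROM_REV = 111111111
    TO_REV = 222222222
    print(edit_count("Earth", FROM_REV, TO_REV))
    # Throttle client-side so bursts stay well below the documented guideline;
    # the pause length here is an arbitrary, conservative choice.
    time.sleep(0.5)
```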
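And a sketch of the replica-based alternative floated at 18:20 and 21:31, assuming a Toolforge account with Wiki Replica access; the host name, credentials file, and the approach of counting rows in the revision table between two timestamps are assumptions about how such a tool might be built, not an existing process.

```python
import pymysql

# Assumed Toolforge conventions: replica credentials in ~/replica.my.cnf and
# the analytics replica host for English Wikipedia. Adjust for your setup.
conn = pymysql.connect(
    host="enwiki.analytics.db.svc.wikimedia.cloud",
    database="enwiki_p",
    read_default_file="~/replica.my.cnf",
    charset="utf8mb4",
)


def edit_count(title: str, start_ts: str, end_ts: str) -> int:
    """Count main-namespace revisions of `title` between two MediaWiki
    timestamps (YYYYMMDDHHMMSS), e.g. the first days of two months."""
    query = """
        SELECT COUNT(*)
        FROM revision
        JOIN page ON rev_page = page_id
        WHERE page_namespace = 0
          AND page_title = %s
          AND rev_timestamp BETWEEN %s AND %s
    """
    with conn.cursor() as cur:
        # Page titles are stored with underscores instead of spaces.
        cur.execute(query, (title.replace(" ", "_"), start_ts, end_ts))
        (count,) = cur.fetchone()
    return count


if __name__ == "__main__":
    print(edit_count("Earth", "20230101000000", "20230201000000"))
```

One side effect of going through the replicas is that rev_timestamp can be filtered directly, so the separate step of resolving the current revision ID at each date (needed for the REST API's from/to parameters) can be skipped entirely.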