[10:49:56] apergos: hi again! continuing my research into incremental wikipedia dumps. the goal is to create a full history of a wikipedia that can then be synced by only fetching revisions since a given timestamp. i have a couple of questions. #1 i'm trying to figure out what code generates e.g. enwiki-20211001-pages-articles-multistream1.xml. is this MediaWiki's handler for Special:Export, or is it
[10:49:58] done by a different script? #2 i imagine this would be a breaking change, because for incremental dumps to work we'd need to change the format so that changes are ordered not by article name but by revision number. i assume this would require publishing a new type of dump? would there be any interest in publishing something like this on dumps.wikipedia.org? having an official
[10:50:00] authority confirm the hashes would greatly simplify mirroring.
[18:28:57] https://wikitech.wikimedia.org/wiki/IP_and_AS_allocations doesn't appear to be up to date
[18:29:26] at least, not completely compared to https://bgp.he.net/AS14907#_prefixes
[18:31:15] wait, I just can't do CIDR math
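
(Note on the sync-since-a-timestamp idea in the first message: until a revision-ordered dump exists, the MediaWiki Action API's allrevisions list can enumerate changes after a cutoff. A minimal sketch in Python, assuming the public en.wikipedia.org endpoint and the requests library; the chosen properties and the final print loop are purely illustrative, not the dump scripts' actual mechanism:)

    # Sketch: fetch all revisions made after a given timestamp via the
    # MediaWiki Action API (list=allrevisions), oldest first.
    import requests

    API = "https://en.wikipedia.org/w/api.php"   # assumption: public Action API
    SINCE = "2021-10-01T00:00:00Z"               # sync point (UTC)

    def revisions_since(since, session=None):
        """Yield (title, revision) pairs newer than `since`, oldest first."""
        s = session or requests.Session()
        params = {
            "action": "query",
            "format": "json",
            "formatversion": 2,
            "list": "allrevisions",
            "arvdir": "newer",     # enumerate forward from `since`
            "arvstart": since,
            "arvlimit": "max",
            "arvprop": "ids|timestamp|sha1|size",
        }
        while True:
            data = s.get(API, params=params, timeout=30).json()
            for page in data.get("query", {}).get("allrevisions", []):
                for rev in page["revisions"]:
                    yield page["title"], rev
            if "continue" not in data:
                break
            params.update(data["continue"])  # arvcontinue resumes the listing

    for title, rev in revisions_since(SINCE):
        print(rev["revid"], rev["timestamp"], title)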
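
(Note on the CIDR-math aside at the end: Python's ipaddress module can check whether the allocations documented on the wikitech page cover the prefixes bgp.he.net shows announced for AS14907. A small sketch; the prefixes below are documentation-range placeholders, not real data from either page:)

    # Sketch: does a documented allocation cover an announced prefix?
    import ipaddress

    documented = [ipaddress.ip_network(p) for p in ["203.0.113.0/24"]]               # placeholder
    announced  = [ipaddress.ip_network(p) for p in ["203.0.113.128/25", "198.51.100.0/24"]]  # placeholder

    for prefix in announced:
        covered = any(prefix.subnet_of(parent)
                      for parent in documented
                      if parent.version == prefix.version)
        print(prefix, "covered" if covered else "NOT covered by documented allocations")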