[14:57:27] helllllllllllllllllllllllllo
[15:03:47] I'm working on the RFC more
[18:13:27] @posix_memalign This week I am presenting about Miraheze Communities and another similar wiki. Is there anything you'd like to describe or share about how your bot update process works?
[22:21:47] Anything you think is worth mentioning in Miraheze Monthly?
[23:21:59] I'm giving a talk at the MediaWiki Users and Developers Conference this week about using Wikibase to catalog MediaWiki sites. Miraheze Communities is one of the wikis, and Wikibase World (hosted on Wikimedia Germany's Wikibase Cloud) is the other.
[23:22:26] I think there isn't anything sophisticated about the approach. The rough process is:
1. Scrape data with public APIs
2. Save the data in a local SQLite DB
3. Update the items on Wikibase
Perhaps something like https://communities.miraheze.org/wiki/List_of_wikis_about_Roblox_games is somewhat interesting, because the numbers are derived solely from Wikibase and are updated automatically. But it is prone to hitting the expensive parser function limit and the "Number of Wikibase entities loaded" limit, while other extensions designed for this sort of use case (e.g. Bucket) can handle retrieving thousands of entries.
[23:23:27] I am really looking forward to learning more about Bucket this week
[23:25:09] Is your code in a git repository I could reference?
[23:31:58] If Harej is going on WikiTide Foundation money, that'll be worth mentioning for the sake of transparency. As for the presentations, I don't see any that concern Miraheze besides Harej's.
[23:32:16] Yep. It's in https://github.com/lihaohong6/Miraheze/tree/master/communities
[23:32:49] https://github.com/lihaohong6/Miraheze/tree/master/wiki_scanners has the code related to scraping
[23:34:25] The foundation didn't fund my trip. I guess I have the option to get registration reimbursed, but I probably won't.
[23:38:25] That's really cool! I am so excited to see MH represented. :pupCoffeeMH: I'd love to listen to any session recordings if the conference is documenting them!