[16:58:28] anyone know of a Python library I can use to parse an interwiki link into the web URL of the wiki?
[17:00:19] i.e. [[:de:blahblah]] returns https://de.wikipedia.org/wiki/test
[17:06:46] wikilinksbot https://github.com/jhsoby/telegram-wikilinksbot presumably has code for that and is written in Python, though I couldn't tell you where exactly that code is
[17:08:35] and it looks like it's not a library, just bespoke code in the bot (wikilinksbot.py, starting at findlinks() and then continuing with linkformatter(), link_normal(), and other functions)
[17:12:30] owuh: the code lucaswerkmeister is referencing basically has the Action API do the hard work, which is probably the right way to think about this unless you are planning on resolving thousands of links at a time. https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&meta=userinfo&iwurl=1&titles=%3Ade%3Ablahblah&redirects=1&formatversion=2
[17:13:57] jhsoby did a neat hack there where he uses `meta=userinfo` as a cheap way to get the interwiki expansion.
[18:35:33] on getting URLs from interwiki links in Python: with pwb you can do
[18:35:33] pywikibot.Page(pywikibot.Site("en"), "s:de:Test").full_url()
[18:35:34] (here assuming the links are from enwp.) it probably makes more or less the same requests, but it's cleaner
[18:58:41] it's not the requests I care about, it's basically for a clone of the wikibot that does the links in the Wikimedia Discord for a different community
[18:59:20] so pwb isn't ideal
[19:01:07] bd808: that might work actually, thanks
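[editor's note] The Action API approach suggested at 17:12:30 can be sketched as below. The parameters mirror the linked ApiSandbox query (`action=query` with `iwurl=1`, which makes the API include the resolved target URL for interwiki titles under `query.interwiki`). The function names and the User-Agent string are my own; this is a minimal sketch assuming the links should be resolved relative to enwiki, not a definitive implementation.

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Assumes interwiki links are resolved relative to English Wikipedia,
# matching the ApiSandbox example from the chat.
API = "https://en.wikipedia.org/w/api.php"


def query_params(title: str) -> dict:
    """Parameters matching the ApiSandbox link above (minus meta=userinfo)."""
    return {
        "action": "query",
        "format": "json",
        "formatversion": "2",
        "iwurl": "1",        # include full URLs for interwiki titles
        "redirects": "1",
        "titles": title,     # e.g. ":de:blahblah"
    }


def extract_interwiki_url(response: dict) -> "str | None":
    """Pull the resolved URL out of an action=query response.

    Interwiki titles are reported under query.interwiki (not query.pages),
    and each entry carries a "url" field when iwurl=1 was requested.
    """
    for entry in response.get("query", {}).get("interwiki", []):
        if "url" in entry:
            return entry["url"]
    return None


def interwiki_url(title: str) -> "str | None":
    """Resolve an interwiki link like ':de:blahblah' to its full URL."""
    req = Request(
        API + "?" + urlencode(query_params(title)),
        headers={"User-Agent": "interwiki-demo/0.1 (example)"},
    )
    with urlopen(req) as resp:
        return extract_interwiki_url(json.load(resp))
```

Splitting the request building from the response parsing keeps the parsing testable without network access; batching several titles into one `titles` parameter (pipe-separated) would amortize the request cost if more than a handful of links need resolving.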