[02:45:17] 1. Does Wikifunctions have access to the Wikidata Query Service? 2. Will it ever? 3. As a workaround, could a process query the query service, and then use that output as input into a Wikifunctions function?
[02:47:04] It's at least a good idea (re @harej: 1. Does Wikifunctions have access to the Wikidata Query Service? 2. Will it ever? 3. As a workaround, could a process query the ...)
[03:38:11] It seems like something that would have to be done carefully. It wouldn't be good if people could easily insert a SPARQL query that takes forever and/or returns huge amounts of data into a function that is called a lot (re @harej: 1. Does Wikifunctions have access to the Wikidata Query Service? 2. Will it ever? 3. As a workaround, could a process query the ...)
[03:53:26] Is the current Wikibase plug-in based on JS or PHP?
[04:48:11] The query service already has a 60-second timeout, which is within execution tolerance for Wikifunctions, I think (re @Nikki: it seems like something that would have to be done carefully. it wouldn't be good if people could easily insert a sparql query t...)
[04:57:12] I'm thinking of malicious users who could try to sneak calls to slow/expensive queries into other functions (re @harej: the query service already has a 60 second timeout, which is within execution tolerance for Wikifunctions I think)
[04:57:37] Would those not also time out?
[05:01:47] The point is that each time you call it, it's going to take a long time, and it will all add up, and it has the potential to affect a lot of things. If you have a function which uses a compromised function twice, that's already 120 seconds before it times out.
[05:11:49] Or say you have a list of 100 things and you call a compromised function on each entry: that's now 6,000 seconds, or 1 hour 40 minutes 😶
[05:17:48] There's presumably a limit on how many functions can be evaluated at the same time as well, which might mean you could block other functions from being able to run by clogging the system with queries waiting to time out, if the query service lets you run that many simultaneously. Or if it doesn't, you could at least break functions that want to do SPARQL queries by making the query
[05:17:50] service reject the requests.
[05:50:55] For many of the examples in its proposal, it has to be able to query something. Like "it's the nth largest city in". (re @harej: 1. Does Wikifunctions have access to the Wikidata Query Service? 2. Will it ever? 3. As a workaround, could a process query the ...)
[15:53:27] I am mainly interested in knowing whether Wikifunctions supports WDQS access either right now or in the next few months. If not, that's fine; I just want to know.
[15:55:54] Right now, no.
[15:55:55] In a few months, probably not.
[15:55:57] In a lot of months, probably (re @harej: I am mainly interested in knowing if Wikifunctions supports WDQS access either right now or in the next few months. If not that'...)
[15:56:06] Sounds reasonable. Thank you!
[15:57:34] I would work around this by having something else run the query, and then feed the JSON output of the query service into a function. That seems like the most direct path to turn WDQS data into a human-consumable answer.
[15:58:11] And in the Future™, instead of manually feeding input into the function, a magical API can pull the data and feed it into the function.
[20:56:51] How feasible is it to implement alternative data types in Wikifunctions at the moment?
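The compounding-timeout concern raised above (two chained calls = 120 s, 100 calls = 6,000 s) is usually mitigated with a shared deadline rather than a fresh per-call timeout. This is a minimal sketch of that idea, not anything Wikifunctions actually does; `call_with_budget` and the 60-second per-call cap are illustrative assumptions:

```python
import time

def call_with_budget(fn, deadline: float, *args):
    """Call fn only if wall-clock budget remains (sketch, not Wikifunctions
    behaviour). Each call gets at most the lesser of the remaining budget
    and a 60 s per-call cap, so N chained calls can never take N * 60 s."""
    remaining = deadline - time.monotonic()
    if remaining <= 0:
        # the overall budget is spent, so later calls fail fast
        # instead of each waiting out their own full timeout
        raise TimeoutError("overall budget exhausted")
    return fn(*args, timeout=min(remaining, 60.0))

# usage: every query in a composed function shares one 90 s budget
deadline = time.monotonic() + 90.0
```

With a shared budget, a list of 100 compromised calls fails after the budget expires instead of accumulating 100 separate 60-second waits.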
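The workaround described at [15:57:34] — have something else run the query, then feed the JSON output into a pure function — can be sketched in Python. The endpoint is the public WDQS one; the helper names, the example query ("nth largest city in Germany"), and the User-Agent string are illustrative assumptions, not anything Wikifunctions provides:

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# Illustrative query: cities (Q515) in Germany (Q183) by population (P1082)
QUERY = """
SELECT ?cityLabel ?population WHERE {
  ?city wdt:P31/wdt:P279* wd:Q515 ;
        wdt:P17 wd:Q183 ;
        wdt:P1082 ?population .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
ORDER BY DESC(?population)
LIMIT 10
"""

def run_query(sparql: str, timeout: float = 60.0) -> dict:
    """The 'something else': fetch JSON results from WDQS.
    The client timeout mirrors the service's own 60 s cap."""
    url = WDQS_ENDPOINT + "?" + urlencode({"query": sparql, "format": "json"})
    req = Request(url, headers={"User-Agent": "wdqs-to-wikifunctions-sketch/0.1"})
    with urlopen(req, timeout=timeout) as resp:
        return json.load(resp)

def nth_largest_city(results: dict, n: int) -> str:
    """The 'Wikifunctions side': a pure function over pre-fetched JSON,
    answering the 'nth largest city in' example from the chat."""
    bindings = results["results"]["bindings"]
    return bindings[n - 1]["cityLabel"]["value"]

# usage (requires network): nth_largest_city(run_query(QUERY), 2)
```

Because `nth_largest_city` only ever sees the already-fetched JSON, the expensive and potentially slow network step stays outside the function, which is the separation the workaround relies on.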