[19:42:10] hello hello
[19:43:09] I have a question with regard to MediaWiki API queries. Is someone here willing to help?
[19:43:41] It's easier to ask your question and then wait, rather than asking to ask
[19:55:58] So my problem is this. I would like to query my local Docker instance of Wikidata using a link such as the following,
[19:55:59] http:///w/api.php?callback=jQuery36008102725374903522_1664962412781&action=query&list=search&srsearch=Data:pages2.tab contentmodel:Tabular.JsonConfig&srnamespace=486&srlimit=10&format=json&_=1664962412804
[19:55:59] This link queries for a list of JSON tables present on the site that match the format "Data:pages2.tab". But I receive no responses, even though tables matching this format do exist on the site. The response format is as follows,
[19:56:00] I have noticed that the "batchcomplete" variable is not set for most of the queries that I build after referring to the MediaWiki API documentation, and that the response time for such queries is always somewhere between 270ms and 280ms.
[19:56:01] While the response time for querying JSON tables from Wikimedia Commons is always somewhere between 40 and 50ms, and I receive valid responses from there.
[19:56:01] I am not sure where I am going wrong. Can someone please help me make these queries for JSON tables on my local instance of Wikidata work?
[19:57:58] This sounds much more like a Wikidata question
[20:00:10] But aren't API queries the same across all MediaWiki subplatforms?
[20:00:24] Not necessarily
[20:00:45] The search one, for example, can be answered by a MySQL query, or from Elasticsearch
[20:00:55] depending on how you set things up...
[20:02:15] Also, comparing response times probably isn't conducive...
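The query URL above can be rebuilt without the jQuery JSONP wrapper. A minimal sketch, assuming a hypothetical `localhost:8080` host for the Docker instance (the `callback` and `_` parameters exist only for cross-domain browser requests; dropping them makes the API return plain JSON):

```python
from urllib.parse import urlencode

# Hypothetical base URL for the local Docker instance; adjust to your setup.
BASE = "http://localhost:8080/w/api.php"

params = {
    "action": "query",
    "list": "search",
    # Same search term as in the original URL.
    "srsearch": "Data:pages2.tab contentmodel:Tabular.JsonConfig",
    "srnamespace": 486,  # the Data: namespace used for tabular JsonConfig pages
    "srlimit": 10,
    "format": "json",
}

url = BASE + "?" + urlencode(params)
print(url)
```

Fetching this URL (e.g. with `curl`) returns the bare JSON object, which is easier to inspect than the JSONP-wrapped version.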
You're comparing a many-server cluster, with caching, with Docker on your local machine
[20:04:38] the query only communicates directly with the Wikidata instance and not directly with a database
[20:04:49] so even a link like the following gives the same result
[20:05:40] If your setup is the same
[20:05:55] And a Wikidata instance... is still a database underneath ;)
[20:06:24] I mentioned the response time because I notice that when I query other types of data whose namespace isn't as high as 486, I do seem to receive a response, and the response time is much faster
[20:07:04] along with the `batchcomplete` variable being marked as `true`
[20:08:17] If you're getting no response at all, you might want to enable some debugging, as a blank page sounds like a fatal error
[20:08:39] According to the documentation, the batchcomplete variable represents the following: "The API returns a batchcomplete element to indicate that all data for the current batch of items has been returned."
[20:10:06] I do receive a response. It is in the following format
[20:10:28] damn, for some reason IRC seems to not allow it to be visible
[20:10:35] " /**/jQuery36008102725374903522_1664962412781({"batchcomplete":"","query":{"searchinfo":{"totalhits":0},"search":[]}}) "
[20:10:35] use a pastebin
[20:10:42] works now
[20:11:38] sorry, didn't realise that the line representing the response to my query wasn't visible in the initial question I posted
[20:13:28] receiving an empty response after 270ms, considering that the JSON tables are present in the 486 namespace (which is after the namespace for all the items on Wikidata), led me to believe that there is possibly a timeout for the query response somewhere?
[20:14:44] Also, the Docker instance that I have running is within my local network. But even then it seems to take this much time.
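The response quoted at 20:10:35 can be unwrapped and inspected directly; a short sketch that strips the JSONP callback and parses the payload. Note that in the legacy `format=json` output, `batchcomplete` is present as an empty string when the batch is complete, so this response means the batch finished and the search simply matched nothing, not that the query failed:

```python
import json
import re

# The raw JSONP response quoted above: the JSON payload is wrapped in the
# jQuery callback, so it has to be unwrapped before parsing.
raw = ('/**/jQuery36008102725374903522_1664962412781('
      '{"batchcomplete":"","query":{"searchinfo":{"totalhits":0},"search":[]}})')

# Extract the JSON object between the callback's parentheses.
m = re.search(r'\((\{.*\})\)\s*$', raw)
data = json.loads(m.group(1))

# batchcomplete is present (empty string = true in the legacy format),
# and totalhits is 0: a complete batch with zero search results.
print("batchcomplete" in data)                    # True
print(data["query"]["searchinfo"]["totalhits"])   # 0
```

So the symptom here is an empty result set, not a truncated or timed-out batch.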
While Wikimedia Commons, which isn't on my local network, responds much more quickly
[20:17:49] Caching :)
[20:18:27] can you please elaborate
[20:19:08] Your local install is almost certainly not comparable in terms of performance, layers of caching, optimisation etc.
[20:19:34] It's unclear what you actually have set up beyond "docker"
[20:19:39] yes, that's possible. Is there some documentation that you can refer me to for this?
[20:20:02] Well, in Wikimedia production, we don't serve MediaWiki from/via Docker
[20:20:19] yes, that's true
[20:21:04] but does there exist a timeout somewhere to generate a response for these API queries?
[20:21:20] It depends what is serving your search results
[20:21:30] Many timeouts are not handled gracefully
[20:22:08] do you mean that there will be an error message somewhere if there is a timeout?
[20:22:16] Sometimes
[20:22:21] Not a pretty one
[20:22:27] yeah
[20:24:00] I will experiment around a little bit more and see what I can do
[20:24:18] thank you for responding :)
[20:35:20] Yes, I'm led to believe that it is not a problem with a timeout, and more a problem synonymous with the JsonConfig extension for MediaWiki and the right content negotiation
[20:36:24] I will reframe my question and pose it later on the Wikidata IRC channel :)
[20:37:05] You might have more luck during the working week, when the Wikidata developers are about
[20:37:34] Ok, yes, this makes more sense. :P
[20:38:46] Thanks!
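As noted in the exchange, `list=search` can be served either by the default MySQL search or by Elasticsearch via CirrusSearch, and the `contentmodel:` keyword in the original query is a CirrusSearch search feature. If the local Docker instance runs the default backend, that keyword may simply match nothing. A hedged sketch of building the query both ways (whether your instance runs CirrusSearch is an assumption you would need to verify):

```python
def search_params(term: str, namespace: int, cirrus: bool) -> dict:
    """Build list=search parameters, keeping CirrusSearch-only keywords
    out of the query when the wiki uses the default search backend."""
    srsearch = term
    if cirrus:
        # contentmodel: is a CirrusSearch keyword; a plain MySQL-backed
        # search will not understand it (assumption: verify locally).
        srsearch += " contentmodel:Tabular.JsonConfig"
    return {
        "action": "query",
        "list": "search",
        "srsearch": srsearch,
        "srnamespace": namespace,
        "srlimit": 10,
        "format": "json",
    }

# Fallback query for a non-CirrusSearch install: just the page-name term.
print(search_params("Data:pages2.tab", 486, cirrus=False)["srsearch"])
# Data:pages2.tab
```

Comparing the results of the two variants against the local instance would quickly show whether the keyword, rather than a timeout, is what produces the empty result set.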