[01:16:49] Hello (I know you probably get this a lot), but I'm struggling to merge a duplicate Wikidata entry. Is anyone willing to help?
[01:30:58] Guest15: [[Help:Merge]]
[01:30:59] [1] https://www.wikidata.org/wiki/Help:Merge
[11:23:06] Any idea why Q174385 is not showing up in https://w.wiki/3srd ?
[11:27:30] because Q174385 is not a direct instance of Q11424. You need to include its subclasses too: https://w.wiki/3srf
[11:31:17] acagastya: (you could also drop the "instance of movie" constraint and get equally interesting results, but that changes the semantics: https://w.wiki/3srh)
[11:32:23] Wait, I pasted the wrong link.
[11:33:02] No, that seems correct.
[11:33:48] Subclasses are where it becomes too hard for me to make any sense of it.
[11:34:01] Thanks, haansn08.
[13:20:26] hello
[13:20:48] is there a way to bypass the timeout? https://query.wikidata.org/#SELECT%20DISTINCT%20%3Fproperties_for_this_type%20%3Fproperties_for_this_typeLabel%20WHERE%20%7B%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%20%20%7B%20%3Fitem%20wdt%3AP3984%20%3Fsubreddit.%20%7D%0A%20%20%7B%20%3Fitem%20wdt%3AP31%20%3Finstance_of.%20%7D%0A%20%20%7B%20%3Finstance_of%20wdt%3AP1963%20%3Fproperties_for_this_type.%20%7D%0A%7D%0ALIMIT%2014
[14:00:21] is there a workaround for SPARQL queries timing out?
[14:00:29] other than LIMIT
[14:36:09] ping nikki
[14:39:07] town in China again has become a subclass of person: town -> community -> social group -> group of humans -> human -> person :/
[14:57:09] katpatuka: I've undone the edit with a message suggesting that the user ask on the project chat page if they need help
[14:58:50] nikki: thanks ;) - er, which edit (Q-id) was it exactly?
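[Editor's note: the subclass-inclusive pattern haansn08 describes is normally written with a SPARQL property path. This is a hedged sketch of what the https://w.wiki/3srf link presumably contains; the actual query behind the short link may differ.]

```sparql
# Items that are instances of film (Q11424) OR of any subclass of it.
# wdt:P31/wdt:P279* follows one "instance of" link and then zero or
# more "subclass of" links, so indirect instances are matched too.
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31/wdt:P279* wd:Q11424 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 10
```

With plain `?item wdt:P31 wd:Q11424` an item classed only as, say, a subclass of film would not match, which is why Q174385 was missing from the original query.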
[14:59:30] https://www.wikidata.org/w/index.php?title=Q16334295&action=history
[15:02:26] nikki: ok
[15:27:14] https://query.wikidata.org/#SELECT%20%20%3Fproperty%20%20%20WHERE%20%7B%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%20%20%7B%20%3Fitem%20wdt%3AP856%20%3Fofficial_website.%20%7D%0A%20%20%3Fitem%20%3Fproperty%20%3Fvalue%20.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22%20%7D%0A%7D
[15:27:19] i think i found a bug
[15:30:52] I don't understand what you're trying to do. what happens for you, and what were you expecting to happen instead?
[15:32:10] also no, there isn't a way to bypass the time limit - if there were, everyone would want to do it :)
[15:33:10] sometimes the query can be changed to be more efficient, but some queries are too resource-intensive
[15:37:17] the expected output would be a list of properties that are used alongside the property "official website". the actual output is a JSON input error
[15:37:51] sorry, i mean "Server error: Unexpected end of JSON input"
[15:40:17] for a list of properties which have an "official website" statement, you could do something like https://w.wiki/3sup
[15:40:20] is there a way of splitting one query into multiple smaller queries to avoid the timeout? if yes, how?
[15:42:48] (if that's not what you wanted, then I'm still not sure what you want to do)
[15:48:17] there's more than one way to split queries into multiple parts, e.g. using "with" or "union" (there are some examples using those on the example queries page), but you have to be able to split it in some way, and it's not guaranteed to be faster
[15:49:37] I want to get all the properties that have a sibling relationship to "official website": if an item has an official website, what other properties does it have?
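[Editor's note: a sketch of the "which properties co-occur with official website (P856)" question, under the assumption that counts per property are wanted. `wikibase:directClaim` maps a property entity to its `wdt:` predicate so the results can be labelled. Since over a million items carry P856, this scan will almost certainly hit the public endpoint's timeout unless further restricted.]

```sparql
# For items with an official website (P856), count how often each
# other property appears on them. Grouping is done in a subquery so
# the label service can be applied to the aggregated results.
SELECT ?property ?propertyLabel ?uses WHERE {
  {
    SELECT ?property (COUNT(*) AS ?uses) WHERE {
      ?item wdt:P856 ?website .
      ?item ?p ?value .
      ?property wikibase:directClaim ?p .
    }
    GROUP BY ?property
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
ORDER BY DESC(?uses)
```

The original 15:27:14 query binds `?property` to every predicate (including non-truthy and label triples) and never restricts it to Wikidata properties, which is one reason it returns far too much data and fails.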
I want to do the same for subreddit, Reddit username, Twitter usernames, and other social media properties, but I will do them in separate queries.
[15:50:32] or would it be possible to download certain parts of wikidata to be used locally
[15:53:49] do you want to find which other properties a single item has, or which properties are used in general?
[15:56:08] it is possible to download the data dumps (see https://www.wikidata.org/wiki/Wikidata:Database_download) but they're *huge*
[15:56:37] i want to find other properties for a select group of items
[16:02:30] as in the group of items that have an official website.
[16:04:25] that's a huge group of items, there's over 1.4 items with an official website statement, so I don't think you'll be able to do that via sparql
[16:04:33] err... 1.4 million items
[16:07:06] oh, https://wdumps.toolforge.org/ might help you export smaller dumps, I've never used it but it can apparently filter by whether the item has a particular property
[16:07:18] could I get around it by making multiple queries with offset and limit?
[16:08:08] thx
[16:08:13] i will give it a try
[16:08:52] there might be a way to use offset and limit, but I'm not sure how
[16:13:52] does wdumps have a rest api?
[16:15:38] I'm also wondering if we have that data somewhere already... the property suggester uses which properties are used together as part of how it generates suggestions, but I'm still looking to see if I can find anything
[16:20:26] oh! I think this is the data https://github.com/wmde/wbs_propertypairs
[16:20:33] and https://github.com/Wikidata-lib/PropertySuggester-Python seems to be the script which generates it
[16:22:02] oh, that's an old repo, https://github.com/wikimedia/wikibase-property-suggester-scripts seems to be the current one
[16:22:27] and I don't know if there's an api for wdumps, sorry
[16:37:58] does GraphiQL have the same time limits?
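[Editor's note: a sketch of the offset/limit splitting idea raised at 16:07:18, with the caveats behind nikki's hesitation made explicit. The page size of 1000 is an arbitrary illustration.]

```sparql
# Page 2 of the P856 items (page N uses OFFSET N * 1000).
# ORDER BY gives a stable ordering so pages neither overlap nor skip
# rows; without it the endpoint may return rows in any order. The
# catch is that the sort runs over the full 1.4M-row result set, so
# each individual page can still hit the per-query timeout.
SELECT ?item ?website WHERE {
  ?item wdt:P856 ?website .
}
ORDER BY ?item
LIMIT 1000
OFFSET 1000
```

This makes each response small, but not each query cheap, which is why splitting "is not guaranteed to be faster" and why a filtered dump (e.g. via wdumps) may be the more practical route for the full 1.4 million items.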