[09:52:28] I'm seeking reviewers for these relatively straightforward changes if anyone has time https://gerrit.wikimedia.org/r/c/operations/puppet/+/977593 https://gerrit.wikimedia.org/r/c/operations/puppet/+/977594
[09:58:13] godog: looking
[09:58:23] thank you taavi <3
[10:02:15] godog: done!
[10:03:52] thank you
[10:11:24] thank you volans too!
[10:11:46] np :)
[12:41:40] this may be me being silly, but in case someone gets bitten by this: do not put quotes on sudoers commands. sudo flattens them out, and then the match fails because the entry is not the same as the command as executed (e.g. for nrpe_command)
[13:51:22] Hi everyone,
[13:51:23] I'm an engineer in Wikimedia Enterprise, looking to find the SRE team responsible for the Wikidata project.
[13:51:23] Can anyone point me in the right direction?
[13:51:24] thanks
[13:56:05] REsquito-WMF: there is no team for Wikidata; we are divided by vertical slice. Maybe mention what kind of help you need, or send a ticket and the person on clinic duty (on topic) can guide you
[13:58:37] yes, it depends on what you're looking for
[13:59:24] jynus: Thanks!
[13:59:25] I'm looking into hitting their APIs to get JSON-LD; at the moment I'm getting around 500 ms...
[13:59:25] Question is: if we make it part of our product and start doing 20-60 calls per second against their APIs, will they hold? Any data on current traffic patterns?
[13:59:47] ok, with that data you will need to talk to 2 teams, probably
[13:59:53] traffic and service ops
[14:00:03] REsquito-WMF: the WDQS APIs?
[14:00:20] No, the MediaWiki Wikidata APIs
[14:00:31] 500ms? Oo
[14:00:35] What's the call?
[14:01:04] api 1: https://www.wikidata.org/wiki/Special:EntityData/%s.jsonld
[14:01:04] api 2: https://www.wikidata.org/w/api.php?action=wbgetentities&props=sitelinks&ids=%s&sitefilter=enwiki&format=json
[14:01:14] api 1 is sometimes taking us 500 ms
[14:02:03] What's in %s?
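(An aside on the sudoers note at 12:41:40 above: a minimal illustration of the pitfall, with a hypothetical check script and arguments — nothing here is from the actual puppet config. As jynus describes it, the quotes get flattened out of the sudoers entry, so the stored command spec no longer matches the command line NRPE actually executes, and sudo denies it.)

```
# Hypothetical NRPE sudoers entries -- paths and arguments are illustrative.

# BROKEN: the quotes are flattened when the sudoers entry is processed,
# so this spec never matches the command line as NRPE executes it:
nagios ALL = NOPASSWD: /usr/local/lib/nagios/plugins/check_foo --match "some_pattern"

# OK: write the arguments exactly as they will appear on the executed
# command line, without quotes:
nagios ALL = NOPASSWD: /usr/local/lib/nagios/plugins/check_foo --match some_pattern
```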
[14:02:22] a Wikidata QID... let me give you an example
[14:02:26] ty
[14:02:48] https://www.wikidata.org/wiki/Special:EntityData/Q1508958.jsonld
[14:04:27] REsquito-WMF: when using RDF outputs of Special:EntityData, add "&flavor=dump" to avoid dumping linked properties and entities
[14:05:12] gotcha, will give it a go
[14:06:43] REsquito-WMF: in any case, consider opening a ticket for communication, even if you don't require anything from us; that way people from different timezones can chime in if necessary
[14:08:46] Will do, can you point me to the right place to create it?
[14:08:48] thanks
[14:13:34] REsquito-WMF: I believe the sitelinks should be in the jsonld output, so there should be no need to call wbgetentities again to fetch the enwiki sitelinks?
[14:14:06] if you already have a ticket for what you are working on, you can just add #traffic and #serviceops and clarify that you want advice/awareness regarding the api
[14:14:49] or just create a new one with a summary of what you want to do at https://phabricator.wikimedia.org
[14:15:26] we usually create one when people from Amazon or others notify us
[14:16:03] jynus: tbh, that's a search api, so while we at serviceops would be more than happy to assist if it's an appserver slowdown issue or something to do with mw-on-k8s, if it's due to the api call itself, #wikidata (dcausse can correct me) should be tagged as well
[14:16:05] correct, I don't need that api call. Will adapt our stuff to do fewer api calls.
[14:16:13] claime: oh, sorry
[14:16:32] no worries
[14:16:41] @REsquito-WMF see what claime said, I didn't have the context
[14:17:33] to make it clear, I'm looking to get an assessment in terms of load, to see if I can integrate it in the future without bringing it down.
The dump flavour dropped the call to 270 ms; I can follow up on that another time
[14:18:08] yeah, for load, traffic is the first point of contact, and they will probably contact someone else down the pile
[14:18:11] on the appserver side, 20-60 calls/s we can take
[14:18:27] REsquito-WMF: feel free to ping us (#Discovery-Search in phab); we call Special:EntityData to fetch the RDF output after every edit, and I bet your use case is very similar
[14:19:15] Thanks claime and jynus
[14:19:15] Will do dcausse, thanks
[14:19:44] REsquito-WMF: one important thing is to follow https://www.mediawiki.org/wiki/API:Etiquette
[14:20:05] hopefully you are aware of that already, but open a ticket for more specifics
[14:21:15] Will have a look, thanks jynus
[14:24:29] in general, if you need to contact SRE and it is not an emergency, tagging with #SRE means that the clinic duty person will triage it and send it to the right person, without you needing to know who to contact :-D
[14:26:46] Thanks :)
[14:43:00] the unknowns are mine; they are known and should be fixed (hopefully) soon
[16:02:19] sukhe: ok to merge "Ssingh: dns4003: remove dns4003 from authdns_servers for reboot (87dbd64b56)"?
[16:02:47] yes please
[16:02:53] there was a lock so I couldn't merge mine
[16:03:23] thanks
[16:03:51] puppet-merge seems to be running slower than usual
[16:03:55] done sukhe
[16:05:26] <3
[19:10:13] jbond: still around?
[19:20:49] Puppet run failed for `prometheus6002.drmrs.wmnet,prometheus5002.eqsin.wmnet,prometheus3003.esams.wmnet,prometheus4002.ulsfo.wmnet` with a message about CR https://gerrit.wikimedia.org/r/c/operations/puppet/+/976273 from the 22nd; does puppet still need to be disabled on these hosts?
[20:46:04] mutante: remind me how to log in to RT, to view old tickets?
[21:13:03] jhathaway: the username and password are in pwstore, secret file "rt"
[21:13:18] moritzm: thanks
[22:00:51] jhathaway: extra info:
also try adding 2000 to the RT ticket number and opening that in Phabricator
[22:01:16] ah, interesting, thanks
[22:01:29] it should work for many (ops tickets), though unfortunately not for all queues
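(The RT-to-Phabricator mapping mutante describes is just a fixed offset, sketched below; the helper name is mine, and per the last message the +2000 convention only holds for some queues, mostly ops tickets.)

```python
def rt_to_phabricator(rt_number: int) -> str:
    """Map an old RT ticket number to its likely Phabricator task ID.

    Per the convention described above, migrated RT tickets (many of the
    ops ones) kept their number plus an offset of 2000; this does not
    hold for all queues, so the result is a guess, not a guarantee.
    """
    return f"T{rt_number + 2000}"

# e.g. RT ticket 1234 would be expected at
# https://phabricator.wikimedia.org/T3234
print(rt_to_phabricator(1234))
```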