[05:05:11] maybe this is a silly question, but for SMTP (https://wikitech.wikimedia.org/wiki/Help:Toolforge/Email#Sending_via_SMTP), is there supposed to be like, an authentication step?
[05:11:02] part 2: is there a way I can do local testing with the SMTP server, like using an SSH tunnel?
[14:06:07] legoktm: for part 1: no :(
[14:06:46] part 2: in theory yes, although you're better off using something like https://mailcatcher.me/
[14:08:12] nice tool
[14:08:51] oooh, TIL
[14:09:28] thanks, I'll expand the docs a bit later
[14:29:51] Do we have access to other metrics not directly in the `Kubernetes namespace` dashboard?
[14:47:48] !log tools.stewardbots Deploy 342e464
[14:47:51] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stewardbots/SAL
[14:53:02] !log tools.toolschecker $ qdel test-long-running-stretch
[14:53:04] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.toolschecker/SAL
[15:03:46] chicovenancio: do you have any specifics in mind? you can find all the stats directly on prometheus: https://tools-prometheus.wmflabs.org/tools/classic/graph
[15:11:54] awesome. That does solve my issue. I'm looking at CPU throttling metrics to decide if I should increase requests/limits for a tool
[15:15:05] feel free to open a task to request adding it to the dashboard if you find it useful 👍
[17:24:32] !log tools reboot tools-sgeexec-10-14
[17:24:35] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL
[17:45:56] Hi, why is this query returning 1373 rows: https://quarry.wmcloud.org/history/22525/831311/806949
[17:46:02] But this one returns only 308: https://quarry.wmcloud.org/query/22525
[17:46:07] I'm guessing that one of them is wrong, but which one exactly is wrong? :)
[17:46:35] It's supposed to return all redirects on Serbian Wikipedia under NS_CATEGORY and NS_CATEGORY_TALK?
[17:47:22] (the last message is not a question :D)
[17:48:14] I also tried to separate them: one query for NS_CATEGORY, and a second for NS_CATEGORY_TALK. The first returns 303, and the second one 308.
[17:48:26] Sorry, first 303 and second 5. = 308.
[17:48:44] What's confusing me exactly is why https://quarry.wmcloud.org/history/22525/831311/806949 is returning 1373.
[17:48:53] per https://mariadb.com/kb/en/operator-precedence/ `page_is_redirect=1 and page_namespace=14 or page_namespace=15` in the first query is interpreted as `(page_is_redirect=1 and page_namespace=14) or page_namespace=15`, which does not seem like what you want?
[17:49:16] ^
[17:49:59] What I originally wanted is `and page_namespace=14 and page_namespace=15`, but I think that separating them is better.
[17:50:40] Like I did:
[17:50:41] https://quarry.wmcloud.org/query/27051
[17:50:43] did you try `page_is_redirect=1 and (page_namespace=14 or page_namespace=15)`?
[17:50:45] https://quarry.wmcloud.org/query/31780
[17:51:07] Lofhi: Yup, it's here: https://quarry.wmcloud.org/query/22525
[17:51:34] Joined halfway; is this not what you want?
[17:52:14] I think I'll just stick to https://quarry.wmcloud.org/query/27051 and https://quarry.wmcloud.org/query/31780
[17:52:25] It should be more accurate. :)
[17:52:41] You want one request merging the two?
[17:52:52] I originally wanted to have one query for both.
[17:53:06] It's alright, I'll just use the separated ones, thank you!
[17:53:20] So the final result you want would be 303 rows?
[17:53:26] 308*
[17:54:44] Hi! Who should I poke to get the toolsadmin membership requests checked out?
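A minimal sketch for the SMTP question at the top of the log ([05:05:11]): sending mail from a tool with no authentication step, per the answer at [14:06:07]. The relay host and port (mail.tools.wmcloud.org:25) and the sender address are assumptions to be checked against the linked Help page, not taken from this log; for local testing ([05:11:02]), the same code can be pointed at a MailCatcher instance.

```python
# Hypothetical sketch: send a message from inside Toolforge without SMTP AUTH.
# The relay host/port and From address are placeholders -- verify them against
# Help:Toolforge/Email before relying on this.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "tools.mytool@toolforge.org"  # hypothetical tool address
msg["To"] = "someone@example.org"
msg["Subject"] = "Toolforge SMTP test"
msg.set_content("Hello from a Toolforge tool.")

# No smtp.login() call: per the answer above there is no authentication step.
with smtplib.SMTP("mail.tools.wmcloud.org", 25) as smtp:
    smtp.send_message(msg)

# For local testing, swap in MailCatcher (https://mailcatcher.me/), which by
# default accepts SMTP on 127.0.0.1:1025 and shows the mail on port 1080:
# smtplib.SMTP("127.0.0.1", 1025).send_message(msg)
```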
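For the CPU-throttling question around [14:29:51] and [15:11:54], a hedged sketch of querying the Prometheus instance linked above through its HTTP API. The API path under https://tools-prometheus.wmflabs.org/tools, the metric name (container_cpu_cfs_throttled_seconds_total, a standard cAdvisor metric), and the tool-mytool namespace label are assumptions; adjust them to whatever the graph UI actually exposes.

```python
# Hypothetical sketch: ask Prometheus how much CPU throttling a tool's
# containers are seeing, to decide whether to raise requests/limits.
# The base URL, metric name, and namespace label are assumptions.
import requests

PROM = "https://tools-prometheus.wmflabs.org/tools"  # assumed API base
QUERY = (
    'sum(rate(container_cpu_cfs_throttled_seconds_total'
    '{namespace="tool-mytool"}[5m]))'                 # hypothetical tool name
)

resp = requests.get(f"{PROM}/api/v1/query", params={"query": QUERY}, timeout=30)
resp.raise_for_status()
for result in resp.json()["data"]["result"]:
    print(result["metric"], result["value"])
```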
[17:54:49] I guess so, I was confused why it was returning 1373 here: https://quarry.wmcloud.org/history/22525/831311/806949
[17:55:06] oh okay; because this one looks fine: https://quarry.wmcloud.org/query/22525
[17:55:57] and taavi found your answer: operator precedence/boolean algebra :P
[17:56:11] easy to mess up when it's not explicit
[17:56:36] Understandable, thanks!
[17:56:58] by the way
[17:57:12] if you want to throw out the CASE, you could surely join the categories
[17:57:58] Honestly, ChatGPT gave me that, because I want(ed) to have the namespace name appended.
[17:58:05] to reduce the number of rows analysed by the db engine
[17:58:10] wait a second
[17:58:11] Like Category:Name or Category_talk:Name
[17:58:19] here: https://sql-optimizer.toolforge.org/
[17:58:46] but seems good enough anyway
[17:58:49] :-)
[17:59:08] Yup, sql-optimizer isn't returning any suggestions for me.
[17:59:39] another thing
[17:59:50] I think Quarry is getting deprecated, or something like that
[18:00:13] You can use https://superset.wmcloud.org/ as a replacement too
[18:00:32] https://phabricator.wikimedia.org/T169452
[18:00:47] ^ :)
[18:01:15] I didn't read the details, I just gave you the link so you can save the queries; I don't know if they can be exported or anything like that
[18:02:56] Thank you so much! Superset looks great to me, although I have some suggestions at this moment (to add more options for downloading results, like Quarry has), but that's alright.
[18:03:53] Don't know the tool
[18:04:01] But I'm sure you can download them at least
[18:04:24] I see CSV, JSON, Image, Excel
[18:04:43] I looked a bit at the API last week
[18:04:45] I mean that Quarry has all those options, but Superset has only CSV.
[18:05:03] I think you could even request with the API to execute you saved query and fetch the results, but NOT sure
[18:05:11] your*
[18:06:58] Kizule, in the query lab, save it as a dataset, then you can download it in various formats
[18:07:29] it opens a chart view with your new data source
[18:08:15] And you can just overwrite the data source independently of the data view, I think
[18:08:23] When you need to edit the query
[18:08:34] Can even switch, I guess...
[18:09:25] Ah nah, it just uses the query as the data source of the dataset :P
[18:09:52] Seems like a good tool to do ETL on wiki data
[18:10:49] the big pro is the dashboard sharing, it could be used in a lot of ways, even for the communities
[18:11:23] with Quarry, you could only share ugly rows with people who don't handle these tools very well; not very useful or expressive
[18:14:36] Lofhi: Oh, thanks for the tips! Let me give it a try.
[18:16:48] Nah, it's asking to sign in. Quarry can display results no matter what.
[18:19:12] how sad
[18:20:32] maybe they could exclude the authentication middleware on this set of pages, no idea, a bit sad
[18:21:09] i know a wikiway: create a Lux module eating the Superset JSON data
[18:21:14] *evil laughs*
[18:21:18] Lua*
[18:23:55] It's alright, I'm sure that Superset will get more improvements once the work on moving from Quarry to Superset actually starts. :D
[18:30:31] Currently reading https://wikitech.wikimedia.org/wiki/Help:Toolforge/Node.js. I want to be sure: is the recommended deployment method the one described under "Kubernetes Configuration"?
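To round off the operator-precedence thread (taavi's explanation at [17:48:53]): AND binds tighter than OR, so the unparenthesized filter also pulls in every NS 15 page, redirect or not, which is presumably where the 1373 rows came from. Below is a sketch of the combined query with an explicit IN(), run against the wiki replicas with pymysql; the replica host, database name, and ~/replica.my.cnf credentials file follow the usual Toolforge conventions but are assumptions here, not copied from the queries above.

```python
# Sketch of the corrected, combined query: redirects in NS 14 *or* 15 only.
# Host/database names and the replica.my.cnf path are assumed Toolforge
# conventions; adjust for your own setup.
import os
import pymysql

# The original WHERE clause
#   page_is_redirect=1 AND page_namespace=14 OR page_namespace=15
# is parsed as (page_is_redirect=1 AND page_namespace=14) OR page_namespace=15,
# i.e. it also matches non-redirect pages in NS 15. IN() avoids that.
QUERY = """
SELECT page_namespace, page_title
FROM page
WHERE page_is_redirect = 1
  AND page_namespace IN (14, 15)
"""

conn = pymysql.connect(
    host="srwiki.analytics.db.svc.wikimedia.cloud",  # assumed replica host
    database="srwiki_p",
    read_default_file=os.path.expanduser("~/replica.my.cnf"),
)
with conn.cursor() as cur:
    cur.execute(QUERY)
    print(cur.rowcount, "redirect pages in NS 14/15")
```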