[08:19:44] !log tools.apt-browser Updated from 51e89bc to a98c617
[08:19:47] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.apt-browser/SAL
[08:35:44] !log clouddb-services revoke puppet certs for now-deleted clouddb1003/4 (osmdb replicas) T323159
[08:35:48] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Clouddb-services/SAL
[08:35:48] T323159: Shut down osmdb.eqiad.wmnet (clouddb100[3-4])? - https://phabricator.wikimedia.org/T323159
[09:25:40] !log tools hard-reboot the 3 k8s control nodes
[09:28:36] arturo: Unknown project "tools"
[09:28:37] arturo: Did you mean to say "tools.tools" instead?
[09:28:56] no, I meant "tools" :-(
[09:28:58] !log tools hard-reboot the 3 k8s control nodes
[09:29:00] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL
[09:32:57] !log tools.stewardbots Restarted StewardBot stuck on IRC
[09:33:00] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stewardbots/SAL
[12:12:26] Can anyone provide me with a tutorial for hosting a React page on Toolforge?
[12:32:16] do I need to do anything special to receive tools.*@toolforge.org emails for tools I maintain?
[12:32:33] https://wikitech.wikimedia.org/wiki/Help:Toolforge/Email#Mail_to_a_Tool sounds like it should just work if I don’t create any special .forward* files
[12:32:47] but I sent some test email to tools.quickcategories earlier today and so far they haven’t arrived afaict
[12:34:08] dammit, nevermind, they arrived and went to my junk folder. sorry for the noise >.<
[12:34:23] (I remembered to check my mail server logs about two seconds after sending my first messages)
[16:02:35] Thanks for the help yesterday with exploring the Wikidata database. I now have a new question: how can I get the page text from the database? Reading the documentation, `content_address` should have `tt:<id>` where `<id>` is the `old_text_id` in the `text` table, but I do not see the text table in the database
[16:02:38] am I lost here?
[16:03:27] page text is not available via the replicas, you would need to use the API for that
[16:05:50] RPI2026F1: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database#Unavailable_tables
[16:06:53] I wanted access to the page text since there seems to be no easy way to query all statements for an item
[16:07:16] Guess I'm going to have to download a dump if I want to find items without statements
[16:07:38] Or make SPARQL queries?
[16:07:54] several options for finding items without statements were mentioned yesterday
[16:07:57] * bd808 does not know how to do this, but it seems like it should be possible
[16:08:54] Sorry, but I was unable to see those messages as I had classes and had to move between them
[16:09:12] SPARQL lags out every time I try it
[16:09:44] you likely need to narrow the scope of your query. SQL would probably have similar issues
[16:10:09] you can probably find the messages in the logs https://wm-bot.wmcloud.org/browser/index.php?start=04%2F03%2F2023&end=04%2F04%2F2023&display=%23wikimedia-cloud
[16:10:21] maybe you can share one of the queries you tried?
[16:10:37] I want to do a filter by "label starts with", but the problem is that I haven't been able to find out how to do that, and I think I once found an example but it would time out
[16:11:28] SQL might be the best option for that, terms tables for the label and page_props for wb-claims=0
[16:11:42] (assuming the label prefix is reasonably selective, i.e. there aren’t too many labels with that prefix)
[16:12:00] that's the thing, I wanted to find all items with "Category:%"
[16:12:16] that have no statements
[16:12:20] that's a big table scan :)
[16:12:23] Because I see a lot of these in recent changes
[16:12:47] Probably the best way to do this is to actually obtain a dump now that I think about it
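For reference, the replica-based approach suggested above (label lookup in the Wikibase term store plus `page_props` with `wb-claims`) could look roughly like the following sketch against `wikidatawiki_p`. This is an untested illustration, not a query from the chat: the term-store join, the replica host name, and the assumption that `pp_value` compares cleanly against `'0'` all come from the public schema documentation as I understand it, and a broad prefix like `Category:%` will still touch a lot of rows.

```python
# Rough sketch: prefix-match English labels in the Wikibase term store and keep
# only items whose page_props 'wb-claims' count is 0. Host name and schema
# follow the public Toolforge/Wikibase docs -- verify both before relying on it.
import os
import pymysql

QUERY = """
SELECT CONCAT('Q', wbit_item_id) AS item, wbx_text AS label
FROM wbt_item_terms
JOIN wbt_term_in_lang ON wbit_term_in_lang_id = wbtl_id
JOIN wbt_type         ON wbtl_type_id = wby_id AND wby_name = 'label'
JOIN wbt_text_in_lang ON wbtl_text_in_lang_id = wbxl_id AND wbxl_language = 'en'
JOIN wbt_text         ON wbxl_text_id = wbx_id
JOIN page             ON page_namespace = 0
                     AND page_title = CONCAT('Q', wbit_item_id)
JOIN page_props       ON pp_page = page_id
                     AND pp_propname = 'wb-claims'
                     AND pp_value = '0'
WHERE wbx_text LIKE 'Category:%'
LIMIT 50
"""

conn = pymysql.connect(
    host="wikidatawiki.analytics.db.svc.wikimedia.cloud",  # assumed replica host
    database="wikidatawiki_p",
    read_default_file=os.path.expanduser("~/replica.my.cnf"),
    charset="utf8mb4",
)
with conn.cursor() as cur:
    cur.execute(QUERY)
    for row in cur.fetchall():
        # replica columns are varbinary, so decode bytes for printing
        print(*(v.decode() if isinstance(v, bytes) else v for v in row))
conn.close()
```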
[16:13:56] hm, searching for "inlabel:Category -haswbstatement:*" only finds 210 results, that feels like not enough
[16:14:18] (haswbstatement is limited, as was mentioned yesterday, but in this case it should be fine since I assume you expect at least a P31 statement)
[16:14:31] that's correct
[16:15:28] I think for categories it might also make sense to search for sitelinks (wb_items_per_site table) rather than labels
[16:16:14] I still haven't figured out the query for getting items without descriptions in a language yet
[16:16:21] like provided I have the item ID
[16:17:13] I tried this SPARQL query: ```SELECT ?item ?itemLabel WHERE {
[16:17:14]   SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
[16:17:14]   ?item wikibase:statements 0 .
[16:17:15]   FILTER(strStarts(?itemLabel, 'Category:'^^xsd:string))
[16:17:15] }```
[16:17:19] but it isn't returning anything
[16:19:09] yeah, variables from the label service can’t be used in FILTER(), the label service runs very late in the query
[16:19:32] That explains why it never worked
[16:19:38] this works but is very slow (35 seconds for 10 results): https://w.wiki/6XyY
[16:20:38] I was thinking of incrementally doing this (getting all items with category, then filtering by description, then by statement count)
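Since label-service bindings are not visible to FILTER(), one workaround is to match `rdfs:label` directly and filter on that variable instead. A minimal sketch is below; it is not necessarily the query behind https://w.wiki/6XyY (whose text is not shown in the chat), and like that query it will be slow and may hit the WDQS 60-second timeout without further narrowing.

```python
# Rough sketch: same intent as the query from the chat, but filtering on an
# rdfs:label binding rather than a label-service variable.
import requests

SPARQL = """
SELECT ?item ?itemLabel WHERE {
  ?item wikibase:statements 0 .
  ?item rdfs:label ?itemLabel .
  FILTER(LANG(?itemLabel) = "en")
  FILTER(STRSTARTS(STR(?itemLabel), "Category:"))
}
LIMIT 10
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": SPARQL, "format": "json"},
    headers={"User-Agent": "label-prefix-example/0.1 (replace with your contact info)"},
    timeout=70,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["item"]["value"], row["itemLabel"]["value"])
```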
[17:23:34] !log admin resetting all three rabbitmq nodes and restarting all openstack services as per https://wikitech.wikimedia.org/wiki/Portal:Cloud_VPS/Admin/Rabbitmq#Resetting_the_HA_setup
[17:23:38] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Admin/SAL
[19:41:20] got a few "Invalid image? Could not read image dimensions for file: /mnt/nfs/labstore-secondary-tools-project/croptool/public_html/files/f1d3185ac920a2900f024ce4a7896b8ba0636f13.jpg. Refreshing the page might help in some cases."
[19:41:39] for croptool
[19:44:16] danmichaelo: ^ stemoc is reporting some possible cache corruption in croptool
[19:52:24] !log tools.croptool Deleted $HOME/public_html/files/f1d3185ac920a2900f024ce4a7896b8ba0636f13.jpg based on IRC error report. File was ascii text error message.
[19:52:27] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.croptool/SAL
[19:52:45] stemoc: I deleted the file. Maybe things will work if you try again?
[19:53:43] nope, happened to 2 other images too, odd..
[19:56:09] stemoc: https://wikitech.wikimedia.org/wiki/User:Danmichaelo is the maintainer of croptool if you want to try and get their attention.
[19:56:45] Per the tool's UI, https://github.com/danmichaelo/croptool/issues is the bug tracker
[20:33:14] hasn't fixed anything there since July 2022 lol
[23:33:09] Is there a traffic stats page or tool to see what, if any, traffic a tool is getting?
[23:37:12] SQLite: there is no pretty graphing UI (#someday), but https://toolviews.toolforge.org/api/ has the raw request data for all things Toolforge.
[23:37:55] * bd808 should try to nerd snipe someone into putting a UI on that again
[23:38:36] bd808: tyvm, I can make use of that! Debating whether to continue maintaining a tool, but I don't *think* it's been touched in years.
[23:41:52] bd808: I'm sorry to ping repeatedly. Even the example returns a 404. https://toolviews.toolforge.org/api/v1/unique/tool/toolviews/daily/2020-03-01/2020-03-31
[23:42:45] well that's not very helpful...
[23:42:47] * bd808 looks
[23:51:53] T333215
[23:51:53] T333215: Toolviews /daily API call returns 404 - https://phabricator.wikimedia.org/T333215
[23:52:10] SQLite: something I'm not quite understanding seems to be wrong with both the raw and unique daily range endpoints. The single day ones seem to be working?
[23:53:11] JJMC89: yeah. that seems to be the deal.
[23:53:45] Oh, my bad! Didn't know it was a known issue. TYVM JJMC89 / bd808
[23:53:59] I didn't know it was known either ;)
[23:54:03] heh
[23:54:34] I wrote the damn thing, but very many days ago
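Per the discussion above, the date-range `/daily/<start>/<end>` endpoints were returning 404 at the time (T333215) while single-day endpoints reportedly worked. A minimal sketch of hitting the API from Python follows; the exact single-day route used here is a guess modeled on the range example in the chat, so check the route index at https://toolviews.toolforge.org/api/ before using it.

```python
# Rough sketch: fetch per-day view counts for one Toolforge tool from the
# toolviews API. The route below is an assumption, not a documented path --
# consult https://toolviews.toolforge.org/api/ for the real endpoints.
import requests

tool = "toolviews"   # any Toolforge tool name (example value)
day = "2023-04-01"   # arbitrary example date

url = f"https://toolviews.toolforge.org/api/v1/tool/{tool}/daily/{day}"  # assumed route
resp = requests.get(url, timeout=30)
if resp.status_code == 404:
    print("Route not found; see https://toolviews.toolforge.org/api/ for valid endpoints")
else:
    resp.raise_for_status()
    print(resp.json())
```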