
Wikimedia IRC logs browser - #wikimedia-cloud


2024-04-29 08:50:10 <wm-bot> !log bsadowski1@tools-bastion-13 tools.stewardbots Restarted StewardBot/SULWatcher because of a connection loss
2024-04-29 08:50:13 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stewardbots/SAL
2024-04-29 09:53:29 <wm-bb> <Mennolt> I'm trying this now. When doing this without options I get `Could not find a public_html folder or a .lighttpd.conf file in your tool home.`
2024-04-29 09:53:29 <wm-bb> <Mennolt> When putting in the options used in the My first NodeJS OAuth tool, I get into a new separate shell, and then when I get to the "start the webservice" point, it says the webservice command doesn't exist. I tried just doing 'start' but that also doesn't exist (re @bd808: It sounds like you may have missed running webservice (some options here) start on login.toolforge.org to actually start the ser...)
2024-04-29 10:02:49 <wm-bb> <Mennolt> Ah I found the page https://wikitech.wikimedia.org/wiki/Help:Toolforge/Node.js and it looks like I need to put my code into a different folder and also I need to write a file named server.js which ... must open the file that I actually want to display on the server? I'm not certain about that
2024-04-29 10:09:57 <wm-bb> <Mennolt> also if I, as the tutorial suggests, run a git clone inside my home/www/js folder, the resulting package.json also doesn't seem to end up in the right folder, right? should I be supplying specific options to git clone to make it put the code right into the current folder rather than in a child folder named after the repo?
2024-04-29 10:33:18 <wm-bb> <Mennolt> alright my webserver is now serving hello world at all URLs that start with the name of my tool (see eg. https://shex-validator.toolforge.org/). Now I want it to serve a specific html file in my repo instead. I assume I need to edit my server.js for this but idk what it should be, does anyone else here know?
2024-04-29 10:39:37 <dhinus> Mennolt: there are multiple ways to achieve that, you could e.g. use ExpressJS, see https://expressjs.com/en/starter/static-files.html and all the "Getting Started" section in their docs
2024-04-29 10:40:41 <wm-bb> <Mennolt> Thanks dhinus! I'll take a look at that
2024-04-29 10:40:59 <dhinus> you can also take a look at the next section in the Node.js wikitech page: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Node.js#Deploying_a_Vue_JS_Application_using_Node_JS_and_Vite
2024-04-29 10:42:34 <wm-bb> <Mennolt> I had issues with step 3 of that section because at the moment my dependency tree is broken, but the stuff I need to work for my tool works anyways so I've been ignoring it
2024-04-29 13:02:51 <wm-bb> <Mennolt> Thanks for the help, I've now got my page up and running: https://shex-validator.toolforge.org/packages/shex-webapp/doc/shex-simple-improved.html
2024-04-29 13:06:09 <dhinus> Yay!
2024-04-29 13:08:20 <wm-bb> <Mennolt> I'll probably spend the rest of today changing the code that I uploaded to make more sense for wikidata (eg. changing the defaults away from my specific wikibase cloud instance), and then tomorrow I can spread the news of the launch
2024-04-29 13:46:10 <rdrg109> [Question] I am learning how to query Wikidata data from Quarry. I have noticed that "logging" is a table in the database "wikidatawiki_p"; that table is documented at https://www.mediawiki.org/wiki/Manual:Logging_table . I have also noticed that there exists a table "logging_logindex", but I couldn't find its documentation page at https://www.mediawiki.org/wiki/Manual:Database_layout ,
2024-04-29 13:46:12 <rdrg109> and I checked their columns using "DESCRIBE <<table>>"; they seem to have the same columns. My question is: What is the difference between the tables "logging" and "logging_logindex"?
2024-04-29 13:47:16 <taavi> rdrg109: https://wikitech.wikimedia.org/wiki/Help:Wiki_Replicas/Queries#Alternative_views
2024-04-29 14:13:50 <wm-bot> !log anticomposite@tools-bastion-13 tools.stewardbots SULWatcher/manage.sh restart # SULWatchers disconnected
2024-04-29 14:13:52 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stewardbots/SAL
2024-04-29 14:35:52 <Baskerville> !log wikisp Add Davevzla as project user
2024-04-29 14:35:54 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikisp/SAL
2024-04-29 15:27:40 <dcaro> !status doing an incident response drill, nothing affected
2024-04-29 15:32:08 <dcaro> !status OK
2024-04-29 15:58:58 <rdrg109> taavi: Thanks!
2024-04-29 16:50:08 <Rook> !log paws k8s to 1.26 T326985
2024-04-29 16:50:11 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Paws/SAL
2024-04-29 16:50:11 <stashbot> T326985: Test PAWS on k8s 1.25 - https://phabricator.wikimedia.org/T326985
2024-04-29 17:54:23 <Rook> !log paws jupyterlab to 4.1.8 T363596
2024-04-29 17:54:27 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Paws/SAL
2024-04-29 17:54:27 <stashbot> T363596: jupyterlab to 4.1.8 - https://phabricator.wikimedia.org/T363596
2024-04-29 17:54:41 <Rook> !log paws upgrade pywikibot T363131
2024-04-29 17:54:43 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Paws/SAL
2024-04-29 17:54:44 <stashbot> T363131: New upstream release for Pywikibot - https://phabricator.wikimedia.org/T363131
2024-04-29 18:07:30 <wm-bot> !log bd808@tools-bastion-12 tools.gitlab-webhooks Built new image from 1a65b00c
2024-04-29 18:07:32 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.gitlab-webhooks/SAL
2024-04-29 18:08:29 <wm-bot> !log bd808@tools-bastion-12 tools.gitlab-webhooks Restarted to pick up new image
2024-04-29 18:08:30 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.gitlab-webhooks/SAL
2024-04-29 18:11:29 <Sohom_Datta> A few of my tools use a VueJS frontend + python backend https://gitlab.wikimedia.org/toolforge-repos/wsstats is probably the most sane
2024-04-29 18:11:50 <Sohom_Datta> Feel free to steal any code you want
2024-04-29 18:23:57 <rdrg109> [Question] I am learning how to query Wikidata data on Quarry. I have written some queries: Get number of statements on a Wikidata item given its QID: https://quarry.wmcloud.org/query/82372 Get all revisions on a Wikidata item given its QID: https://quarry.wmcloud.org/query/82373 . Now, I want to write a query that gets all the statements that currently exist on a Wikidata item given its
2024-04-29 18:23:59 <rdrg109> QID. Does anyone know how to do this?
2024-04-29 18:27:55 <wm-bb> <Mennolt> I happen to have a query lying around that I think fits your needs:
```
SELECT ?item ?p_property ?p_propertyLabel ?statementLink ?simplevalue ?simplevalueLabel
WHERE
{
  wd:Q1065 ?property ?statementLink .
  ?statementLink ?simplevalueLink ?simplevalue .
  wd:Q1065 ?propdirect ?simplevalue .
  wd:Q1065 rdfs:label ?item .

  # find property labels (thanks to tagishsimon)
  ?p_property wikibase:claim ?property .

  # find only properties & values with the right namespace
  FILTER(STRSTARTS(STR(?propdirect), STR(wdt:)))
  FILTER(STRSTARTS(STR(?property), STR(p:)))
  FILTER(STRSTARTS(STR(?simplevalueLink), STR(ps:)))

  # label service fills ?p_propertyLabel / ?simplevalueLabel with English labels
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
  # only get English-language item names
  FILTER(LANGMATCHES(LANG(?item), "en"))
}
```
2024-04-29 18:28:29 <wm-bb> <Mennolt> Simply replace Q1065 with your QID
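[A query like Mennolt's can be sent to the Wikidata Query Service over HTTP. The helper below is an editor's sketch, not from the log; it only builds the request URL, using the WDQS GET endpoint with a `query` parameter and JSON results.]

```javascript
// Builds a Wikidata Query Service request URL for a SPARQL query,
// asking for JSON-formatted results.
function wdqsUrl(sparql) {
  return 'https://query.wikidata.org/sparql?format=json&query=' +
    encodeURIComponent(sparql);
}

// e.g. wdqsUrl('SELECT ?s WHERE { wd:Q1065 ?p ?s } LIMIT 1')
```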
2024-04-29 18:29:36 <rdrg109> Mennolt: Thanks for the help. Since I am learning how to query Wikidata data on Quarry, I'm more interested in learning how to do it with SQL, not with SPARQL. I am just being curious here. I know that the same data can be obtained conveniently using SPARQL.
2024-04-29 18:31:43 <wm-bb> <Mennolt> ah I see
2024-04-29 18:31:53 <wm-bb> <Mennolt> I have no experience with that
2024-04-29 18:50:39 <rdrg109> Mennolt: No problem. Thanks for the help.
2024-04-29 18:50:41 <rdrg109> For the record, I have created a topic with that question in Talk:Quarry here: https://www.mediawiki.org/wiki/Topic:Y3u6dz3ci6eqlura
2024-04-29 19:08:19 <wm-bb> <lucaswerkmeister> rdrg109: it’s basically not doable. it’s best to use SPARQL instead
2024-04-29 19:30:48 <rdrg109> lucaswerkmeister: Ok, thanks for the help!
2024-04-29 19:35:59 <rdrg109> I still have a question though: How is Wikibase able to know the existing statements on a given Wikibase item? I presume Wikibase stores that information somewhere in a table in a database. Isn't that information stored in any of the tables named wb_* or wbt_*?
2024-04-29 19:44:56 <rdrg109> ^ For the record, I opened a topic for this question in Talk:Wikibase: https://www.mediawiki.org/wiki/Topic:Y3u994hnsgkbgbz0
2024-04-29 19:46:26 <wm-bb> <lucaswerkmeister> it’s stored in a table that’s not available on the replicas
2024-04-29 19:46:56 <wm-bb> <lucaswerkmeister> see https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database#Unavailable_tables, the `text` table is the one with all the page contents
2024-04-29 20:06:22 <bd808> rdrg109: the part that may not be obvious here is that a Q and its statements are stored natively in MediaWiki as a json blob. That json blob lives in the "text" table. This is the same way that the wikitech source for a wikipedia article is stored. The Wiki Replica databases do not provide access to the "text" table basically because we do not have the storage space to do so.
2024-04-29 20:06:46 <bd808> *the wikitext source
2024-04-29 20:07:25 <rdrg109> lucaswerkmeister, bd808: Thanks for the help!
2024-04-29 20:07:55 <bd808> if you did have access to the text table you would not get much value from it in SQL as it is not a structured table. It is really just a blob with a bit of metadata
2024-04-29 20:09:52 <wm-bb> <lucaswerkmeister> yes, pretty much
2024-04-29 20:10:26 <wm-bb> <lucaswerkmeister> for “gets all the statements that currently exist on a Wikidata item” it would be *kinda* usable, as that would just be one JSON blob (though decoding it would still be a pain)
2024-04-29 20:10:53 <wm-bb> <lucaswerkmeister> but things like “get all the items with a particular statement” are only doable in SPARQL, the SQL database schema just isn’t optimized for that at all
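[To illustrate the JSON-blob point above: once the blob is decoded, "all statements on an item" is just a walk over its `claims` object. This is an editor's sketch, not from the log; the `entity` object only mimics the shape of a Wikidata entity document (the values shown are illustrative).]

```javascript
// A tiny stand-in with the same shape as a Wikidata entity JSON blob
// (the real thing is what Special:EntityData / wbgetentities return).
const entity = {
  id: 'Q1065',
  labels: { en: { language: 'en', value: 'United Nations' } },
  claims: {
    P31: [{ mainsnak: { snaktype: 'value', property: 'P31',
                        datavalue: { type: 'wikibase-entityid',
                                     value: { id: 'Q484652' } } } }],
  },
};

// Count statements per property: the "all statements on an item" question.
function statementCounts(entity) {
  const counts = {};
  for (const [prop, statements] of Object.entries(entity.claims || {})) {
    counts[prop] = statements.length;
  }
  return counts;
}
```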
2024-04-29 20:31:41 <rdrg109> bd808, lucaswerkmeister: I want to know what a JSON blob looks like for a given item (perhaps Q8487081, which contains very few statements and a few sitelinks) in the table "text" in Wikidata. Do you happen to know which database dump contains this information? I downloaded the smallest file from one of the mirrors called
2024-04-29 20:31:43 <rdrg109> wikidatawiki-20240401-pages-articles-multistream27.xml-p118185874p119393781.bz2 and saw the top of the file using $ head -n 10 because it is 12G, but all I can see is XML objects that describe a single revision.
2024-04-29 20:34:12 <wm-bb> <lucaswerkmeister> rdrg109: the <text> in there should be the JSON
2024-04-29 20:34:17 <wm-bb> <lucaswerkmeister> (but with tons of horrible `&quot;` in it)
2024-04-29 20:34:30 <wm-bb> <lucaswerkmeister> at least that’s what I see in https://dumps.wikimedia.org/wikidatawiki/latest/wikidatawiki-latest-pages-articles-multistream27.xml-p118185874p119393781.bz2
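[The `&quot;` noise lucaswerkmeister mentions is ordinary XML entity escaping inside the dump's `<text>` element. An editor's sketch of the decoding step, not from the log; it assumes only the basic XML entities appear (a real tool should use a proper XML parser on the dump).]

```javascript
// Undo basic XML entity escaping. &amp; must be handled LAST, otherwise
// a literal "&amp;lt;" would be wrongly turned into "<".
function unescapeXml(s) {
  return s.replace(/&quot;/g, '"')
          .replace(/&lt;/g, '<')
          .replace(/&gt;/g, '>')
          .replace(/&amp;/g, '&');
}

// A fragment shaped like the dump's <text> content (illustrative):
const textNode = '{&quot;id&quot;:&quot;Q8487081&quot;}';
const blob = JSON.parse(unescapeXml(textNode));
```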
2024-04-29 20:35:16 <wm-bb> <lucaswerkmeister> you can also get it from the API via the non-Wikidata-aware core actions https://www.wikidata.org/w/api.php?action=query&titles=Q42&prop=revisions&rvprop=content&rvslots=main&formatversion=2
2024-04-29 20:35:32 <wm-bb> <lucaswerkmeister> but please only look at this for curiosity and don’t use it, this format isn’t stable and has changed in the past :)
2024-04-29 20:36:21 <rdrg109> lucaswerkmeister: Got it. Thanks for the help!
2024-04-29 20:36:42 <wm-bb> <lucaswerkmeister> (`action=wbgetentities` and `Special:EntityData` is how you get it in a stable format :))
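[The two stable endpoints lucaswerkmeister names can be built mechanically from a QID. These helper functions are an editor's addition, not from the log; they only construct the URLs.]

```javascript
// Stable, Wikidata-aware ways to fetch an item's JSON, per the chat above.
function entityDataUrl(qid) {
  return 'https://www.wikidata.org/wiki/Special:EntityData/' + qid + '.json';
}

function wbgetentitiesUrl(qid) {
  return 'https://www.wikidata.org/w/api.php' +
    '?action=wbgetentities&format=json&ids=' + qid;
}
```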
2024-04-29 20:43:10 <bd808> rdrg109: I think one of the things you may be learning today is why we have poured so much time and energy into WDQS. It really is a magically transformative thing added to the core capabilities of Wikibase. Wikibase itself takes care of the very interesting bit of managing and applying a community curated ontology. WDQS then takes the data from that model and gives it new capabilities for finding correlations and connections.
2024-04-29 21:57:46 <wm-bot> !log bd808@tools-bastion-12 tools.gitlab-webhooks Restart to pick up PYTHONUNBUFFERED=1 envvar
2024-04-29 21:57:48 <stashbot> Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.gitlab-webhooks/SAL

This page is generated from SQL logs; you can also download static txt files from here