[04:36:44] I've just set up MediaWiki with Docker, but when I go to myip:8080 it instantly reroutes me to localhost:8080, which is a dead end. Where do I modify this?
[04:46:58] "Dear people, I have a possible..." <- See https://www.semantic-mediawiki.org/wiki/Professional_support and https://www.mediawiki.org/wiki/Hosting_services if you want SMW out of the box. Pro.Wiki has a free trial, not sure about the others.
[12:45:48] Hmm, is there really no way to set a connection variable for the DB when using a Simple LoadBalancer?
[12:48:59] Dear people, I have a possible research project, and I would like to base it on Semantic MediaWiki. Is there some test hosting that allows making a test wiki with the SMW extension? Or maybe you can advise something else, so I could test whether SMW is a suitable platform for the project? Thank you.
[12:50:39] (Got an answer from the log file.) https://www.mediawiki.org/wiki/Hosting_services — the table shows there's no free possibility to test SMW.
[12:51:49] PMZ: The vanilla MediaWiki community and the SMW community tend to be a bit separate; you might get a better answer asking on the Semantic MediaWiki mailing list or another forum dedicated to them
[12:52:17] You could also just run it on your laptop, if all you want to do is test it out to see if it's suitable
[12:52:21] I looked for an SMW chat, yet failed to find one.
[12:53:10] PMZ: allegedly the official one is https://app.element.io/#/room/#semantic-mediawiki:matrix.org
[12:54:12] Got it, many thanks.
[13:57:55] Interesting, in my (unrealistic) testing of SQLite, $wgMainCacheType = CACHE_DB seems better than $wgMainCacheType = CACHE_NONE;
[15:26:43] Hmm, maybe not, after I tried it a second time
[15:27:54] I don't really understand what we are doing with IMMEDIATE transaction mode on SQLite. I read through T93097, but I don't get why it's a benefit to start transactions as write ones immediately instead of at the first write statement.
Either way the DB could be locked
[15:27:55] T93097: MediaWiki should support SQLite 3.8 (or later) - https://phabricator.wikimedia.org/T93097
[15:30:26] > To recap, SQLite 3.8+ generates an error when a read transaction is upgraded to a write transaction, if another thread holds the write lock. The solution to this is to use BEGIN IMMEDIATE instead of BEGIN in order to force the transaction to be a write transaction from the start
[15:30:47] Ah, I just found https://phabricator.wikimedia.org/T89180, which goes into more depth
[15:31:24] * Remilia silently wishes all tables in the MediaWiki schema had primary keys
[15:31:46] Remilia: Do they not at this point? I thought they all got changed to have them
[15:31:49] ?
[15:32:20] bawolff: some do not
[15:32:36] So the answer to my question seems to be that, at least in SQLite 3.8.0, BEGIN IMMEDIATE waits for the lock, whereas automatic upgrade of a read transaction to a write transaction does not wait
[15:32:39] but keep in mind I am on PostgreSQL, so it might be different in MySQL land
[15:33:15] I know that it used to be not the case in MySQL land, but then the WMF DBAs got annoyed
[15:34:27] the most important ones have primary keys now. But plenty don't
[15:34:31] as an example: error: the trait bound `user_groups::PrimaryKey: sea_orm::IdenStatic` is not satisfied\nlabel: the trait `sea_orm::IdenStatic` is not implemented for `user_groups::PrimaryKey`
[15:34:59] oh hmm, I guess querycache does not (first example I could find)
[15:35:23] user_newtalk does not
[15:35:32] user_groups is supposed to have a primary key on (ug_user, ug_group)
[15:35:45] user_groups, user_former_groups, user_newtalk, transcode, updatelog, searchindex, revision, page, etc.
[15:36:12] bawolff: it does have a unique index on that
[15:36:37] but not a primary key
[15:37:44] revision should have rev_id as a primary key, page should have page_id
[15:38:00] oh wait
[15:38:00] Remilia: it is supposed to be a primary key https://github.com/wikimedia/mediawiki/blob/master/maintenance/postgres/tables-generated.sql#L335
[15:38:30] page seems fine, I commented it out because it depends on revision or something like that
[15:38:31] sorry
[15:39:20] bawolff: I wonder if the migrator does not convert uniques to primary keys
[15:39:33] the only core tables with no pk in tables.json are querycache, querycachetwo, user_newtalk, and searchindex
[15:40:07] AntiComposite: right, now I am starting to think it is something about the updater
[15:40:24] does postgres not do compound primary keys
[15:40:28] oh it's actually better than documented... seems oldimage is really the only one without a primary key
[15:41:04] oldimage is a mess
[15:41:14] I bet searchindex is weird because of all the fulltext search stuff
[15:41:22] "page_pkey" PRIMARY KEY, btree (page_id)
[15:41:38] yeah. that's been on the never-todo list since 2007...
[15:41:41] when in the schema bawolff linked it is PRIMARY KEY(pl_from, pl_namespace, pl_title)
[15:41:53] this looks weird
[15:41:57] I mean
[15:42:10] how did my schema end up this different when I used update.php religiously
[15:42:20] We did totally change how postgres works not all that long ago; it wouldn't surprise me if update.php doesn't fully update
[15:42:36] postgres definitely gets less love than the other DB backends
[15:42:44] true
[15:43:07] even though it is arguably the best one out of the available options :P
[15:43:35] poor postgres, now that MS-SQL support is gone, it is the least loved :P
[15:43:51] it's just that none of the developers use postgres themselves much, I think.
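[Editor's note] The claim at 15:39:33 (which core tables in tables.json lack a primary key) is easy to check mechanically. A hypothetical sketch: it assumes MediaWiki's abstract schema (maintenance/tables.json) is a JSON array of table objects where the primary key is declared under a "pk" key; verify that against the file shipped with your MediaWiki version. The inline sample data below is illustrative, not the real schema.

```python
import json

def tables_without_pk(schema_json: str) -> list[str]:
    """Return the names of tables that declare no primary key.

    Assumes the abstract-schema shape: a JSON array of objects, each with a
    "name" and an optional "pk" list (absent or empty means no primary key).
    """
    tables = json.loads(schema_json)
    return [t["name"] for t in tables if not t.get("pk")]

# Tiny inline sample mirroring the discussion; in practice you would pass
# the contents of maintenance/tables.json instead.
sample = json.dumps([
    {"name": "page", "pk": ["page_id"]},
    {"name": "user_groups", "pk": ["ug_user", "ug_group"]},
    {"name": "querycache"},
    {"name": "searchindex", "pk": []},
])
print(tables_without_pk(sample))  # ['querycache', 'searchindex']
```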
[15:44:08] setting up more than one DB backend is a PITA
[15:44:13] wait, I am an idiot and looked at the wrong place in the schema
[15:44:22] And I have to set up MySQL for Wikimedia, so might as well stick with it
[15:44:32] yeah, most core devs are targeting WMF prod
[15:44:52] * bawolff says that at the same time as spending all day playing with sqlite
[15:44:53] I am glad the issues I report (regularly) get fixed (quite fast)
[15:45:32] but now I really should investigate why my schema has so many tables without primary keys
[15:46:02] Our default options for sqlite really are quite terrible though
[15:47:43] I used to use sqlite a lot for Laravel development. It was really handy to write your whole app in the DB abstraction layer, without having a proper DB running, then deploy and stuff just worked in MySQL or postgres.
[15:48:14] I'm having fun testing it. Mostly I wanted to just learn a bit more about how sqlite works
[15:48:34] I kind of wish I had something resembling real traffic to test it on
[15:48:51] My current benchmark is running curl in a bash for loop in three terminal windows at the same time
[15:49:07] in an example of weirdness: on user_groups I have indices "ug_expiry" btree (ug_expiry), "user_groups_expiry" btree (ug_expiry), but no ug_group
[15:58:43] Hmm, I guess the ultimate answer to my question is https://www.mail-archive.com/sqlite-users@mailinglists.sqlite.org/msg90508.html . But our answer of just always holding write locks even when we don't need them seems sad
[16:00:34] * Remilia goes through tables and applies ALTER TABLE ADD PRIMARY KEY
[16:01:31] oh, I can do drop table profiling?
[16:01:45] Although I'm not sure if all this applies when in WAL mode
[16:02:01] Remilia: umm, I think so. I feel like that hasn't been used in a very long time
[16:02:12] "The table has been removed in MediaWiki 1.35 gerrit:545308.
" [16:02:33] I suspect the last time it was actually used for anything was like mediawiki 1.10 [16:08:47] bawolff: I feel like at least some of my primary index woes are because I am on 1.38 [16:09:15] 1.39 should be coming out pretty soon [16:09:24] aye [16:10:42] and 1.39 alters templatelinks in a major way it seems like [16:12:14] for example tl_target_id is NULL throughout in my table but 1.39 makes it not null, that will be interesting [16:18:54] yeah, that's the beginnings of the linktarget migration [16:20:12] turns out writing "License_template_tag" 88 million times makes the database unhappy [16:25:11] did this just now: https://paste.ee/p/HyJvc [16:25:25] might help it go faster hahaha [16:29:02] I guess i wonder how much of a difference it really makes. 20 bytes vs 8 bytes doesn't seem like that much. Although i suppose multiply by 88 million [16:29:29] and i suppose varbinary might be stored less efficiently than a bigint [16:30:57] MW schema really carries a lot of baggage [16:31:20] and there is a distinct lack of a decent way to check which tables can be dropped [16:31:38] shouldn't update.php drop them [16:32:35] oh I mean extensions-related stuff [16:32:53] oh yeah, that is hard [16:32:57] there are extensions that you cannot drop tables for even after removing the extension in question [16:33:11] (like Flow/StructuredDiscussions) [16:33:16] WMF kept around extensions tables for decades after extensions got disabled in some cases [16:35:18] my primary key complaints were mostly due to using an ORM against the MW schema ahaha, I am writing a small utility that scans the ConfirmAccount tables and shows me which ASNs/networks requests came from so I can mass-blacklist these on my load balancer [16:35:28] If the extension is disabled, I don't see why wouldn't you be able to drop those tables, since no code would be accessing those tables [16:36:40] Vulpix: I phrased it incorrectly in case of Flow, sorry [16:37:14] as you cannot disable Flow without 
performing some direct database UPDATEs
[17:37:16] "I bet searchindex is weird..." <- I imagine searchindex is weird because Wikimedia doesn't use it, so no one cares
[17:37:33] true dat
[17:40:26] tgr_: It's even still set to the MyISAM engine, even though InnoDB has supported full-text search since 5.6.4 (2011)
[17:41:23] "We did totally change how..." <- MediaWiki switched to abstract schemas (all schemas for all supported DBs being generated from the same JSON data) around 1.37; postgres support was an utter mess before because of insufficient maintenance. In theory we provide migrations from the old manual schemas to the new automatic ones, but see "insufficient maintenance". I'd probably just try exporting the data, recreating the tables from scratch and reimporting.
[17:42:56] The good thing about the abstract schemas, though, is that you can check whether your DB has the schema it should have. The abstract schemas are Doctrine-based; I think we have some automated tooling for detecting schema drift, but even if not, it should be easy to write.
[17:44:13] (There is https://github.com/Ladsgroup/db-analyzor-tools although that's Wikimedia-specific.)
[17:52:09] "as you cannot disable Flow..." <- AFAIK you can. You will lose Flow content, but that's how it generally is with extensions that define new types of content. There is a migration script to turn it into normal wikitext content, which might or might not work.
[17:52:41] It will be less disruptive than disabling content extensions generally is, because Flow doesn't use the revision table.
[17:54:26] The generic problem with disabling extensions is that you lose the i18n messages, which would still be needed for logs, user groups etc. Probably at some point we should create a stub extension which only contains those.
[17:55:12] But then what would the Commons community complain about?
[18:08:11] Extension:ImageMap has a `require` in its composer.json on a specific version of composer/installers.
This is throwing an error after I upgraded to 1.39.... (full message at )
[18:11:33] Extension:Variables as well. The PHP requirement can probably be removed from this one as well.
[18:12:49] What error?
[18:19:08] bawolff, legitimately or illegitimately? 🙂
[18:20:01] Regarding postgres, https://phabricator.wikimedia.org/T315396 exists.
[18:21:02] Izno[m]: https://i.kym-cdn.com/photos/images/original/000/538/731/0fc.gif
[18:22:11] > - Root composer.json requires composer/installers ~2.1, 1.*,>=1.0.1, found composer/installers[dev-main, v1.0.0, ..., 1.x-dev, v2.0.0-alpha1, ..., 2.x-dev (alias of dev-main)] but it does not match the constraint.
[18:39:08] IM GAY
[18:44:46] congratulations
[19:00:14] That feels like a throwback
[19:00:36] The homophobic spamming of "so-and-so is gay" was such an early-2000s thing
[19:04:25] That still happens in schools, you just don't notice it because you're not in school anymore.
[19:04:40] Well, you might notice it if you edit Wikipedia.
[19:04:49] And/or watch filters there.
[19:11:26] tgr_: if you disable Flow, any pages that had it will exception out
[19:12:14] at least in my experience, but it might be because of PostgreSQL or something
[19:12:53] Nah, turning off extensions tends to cause that kind of issue.
[20:24:35] Izno[m]: at the time I failed to find a way to switch pages back to wikitext (the solution from the extension's page did not work) and had to blank out the content via psql haha
[20:56:20] "> - Root composer.json..." <- So apparently different extensions you have require different versions (1.* and 2.*) of `composer/installers`. Maybe you can just upgrade those extensions to 2.*? I don't think the extension code actually interacts with Composer, so it should be a trivial change.
[20:57:01] s/.*/.\*/, s/.*/.\*/, s/.*?/.\*?/
[21:07:25] if it doesn't interact with Composer, it shouldn't even have the dependency.
I checked a few other extensions and none of them mentioned it
[21:12:04] Most extensions don't have composer/installers
[21:12:11] Some random ones do
[21:25:12] I don't really understand Composer beyond generating the vendor directory.
[21:36:14] "I don't really understand..." <- That's all you need to know 😁
[21:43:33] I've found a few other extensions (EmbedVideo, DynamicPageList) that have a dependency on composer/installers:
[21:43:33] ` "composer/installers": ">=1.0.1"`
[21:43:51] But I'm not convinced any of them need it. Could it just be old code that we can just delete?
[21:46:03] If you want to upset the people who think you should be able to use Composer to manage skins and extensions
[21:46:04] * Prod[m] sent a DynamicPageList code block: https://libera.ems.host/_matrix/media/v3/download/libera.chat/9223b79bebd0df76eee64e216845dbe390b8e086
[21:46:31] did a search through all the extensions in my folder; these are the only ones that mentioned it
[21:48:27] you're only in trouble if you get caught!
[21:48:42] https://phabricator.wikimedia.org/T249573 is relevant
[21:49:10] thnx
[21:49:29] I've seen more chatter more recently somewhere else too
[21:49:53] https://phabricator.wikimedia.org/T311321 probably
[21:50:24] I've been on the Composer 2.x line on REL1_35 as well, so I'm not sure why this is coming up now.
[21:50:59] https://phabricator.wikimedia.org/T250406 also relevant
[21:51:01] though I guess this is the vendor package, not the Composer I use for install
[21:51:50] I'll submit a change to remove it from the ones I'm impacted by... Hopefully there's a maintainer to approve
[22:04:13] "If you want to upset the..." <- Honestly... those people might just be right. Especially in an enterprise environment.
[22:06:01] Maybe, maybe not
[22:12:01] TabberNeue, WikiSEO, EmbedVideo (fork) do not need Composer.
It was probably copied from another extension as boilerplate
[22:12:54] some of them have PHP requirements as well, which seems unnecessary when it's branched
[22:13:53] Unless it's more restrictive than the version in the branch, probably not
[22:14:41] At least in the extensions I mentioned, since I am a maintainer of those, Composer is only there so CI and tests work correctly
[22:15:02] 😉
[22:15:20] The PHP requirements are actually lower than what MW requires; it is just leftover or dev dependencies
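[Editor's note] The conflict quoted at 18:22:11 arises because one extension pins `composer/installers` to `~2.1` while another still requires `>=1.0.1` resolved against the 1.x line, and Composer cannot satisfy both with a single installed version. A hedged sketch of one common fix (assuming the extension genuinely only needs composer/installers for install-path plumbing, not a specific API): widen the constraint in the extension's composer.json so either major version satisfies it.

```json
{
    "require": {
        "composer/installers": "~1.0|~2.0"
    }
}
```

The `|` operator in a Composer version constraint means "either range is acceptable", so this resolves against whichever composer/installers major version the rest of the install tree already uses. As discussed above, the cleaner fix for extensions that never interact with Composer at runtime is to drop the requirement entirely.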