[00:27:01] Hey folks! https://techbase.kde.org/File:Dialog-information.png claims it's from a "shared repository", but the links to https://commons.wikimedia.org/wiki/File:Dialog-information.png are a 404. I think the actual shared image repository is https://userbase.kde.org/File:Dialog-information.png. Where is this (mis-)configured? Is it $wgSharedUploadDirectory, $wgForeignFileRepos, or something else?
[00:28:26] Hmm, are mediawiki admins still using IRC or is everyone on Discord? https://www.mediawiki.org/wiki/Communication#Chat mentions both.
[21:42:23] I will let you know when I see skierpage and I will deliver that message to them
[21:42:23] @notify skierpage Yes, there are still folks on IRC, apparently just not when you were asking your question. :)
[21:44:14] Heh, they waited 90 seconds before their "is anyone here"
[21:55:20] I think `$this->getFile()->getDescriptionUrl()` is the place the URL comes from in ImagePage. So the answer skierpage sought was likely "descBaseUrl in $wgForeignFileRepos"
[22:22:11] hello, the StopForumSpam extension doesn't seem to work for one of my wikis, despite seeming to be installed and configured correctly
[22:23:31] I have `$wgMainCacheType = CACHE_ACCEL;`, `$wgSFSIPListLocation` set to a proper location with an automatically updated file and `$wgDebugLogGroups["StopForumSpam"]` set to a writable path
[22:24:33] yet spammers continue to create pages, despite being known by the StopForumSpam database and present in the file, and no log file is created in the specified location
[22:25:32] Is it actually enabled/loaded... not in report-only mode, and the threshold set to a reasonable value?
[22:25:43] it may be something specific to CACHE_ACCEL though
[22:26:19] other than the configs I listed, it is all defaults
[22:26:36] and yes, it's in Special:Version, so it's loaded
[22:26:56] weirdest part is I have a different wiki with seemingly the same setup that it works for
[22:27:05] it even runs under the same PHP process
[22:27:24] so I assume it isn't an issue with CACHE_ACCEL
[22:27:35] that said I can try the database cache option
[22:28:10] CACHE_ACCEL is known to have issues...
[22:28:19] any chance there's just not enough cache space for them both?
[22:28:42] It seems CACHE_ACCEL may have a non-persistent mode too
[22:29:19] we had issues in the past with CACHE_ACCEL effectively behaving like CACHE_NONE on setups. But I imagine a wiki using it as the main cache type would have bigger problems if that happened?
[22:29:41] ...some setups...
[22:29:47] Guess it depends how big/complex the wiki is in terms of views/edits/extensions etc.
[22:29:48] yeah no there's three wikis on this process (same wiki family) and they all run fine
[22:30:01] they are also all relatively small
[22:30:28] "any chance there's just not enough cache space for them both?" how do I check?
[22:31:24] https://www.php.net/manual/en/apcu.configuration.php
[22:31:37] it's 32M by default
[22:31:41] https://www.php.net/manual/en/apcu.configuration.php#ini.apcu.shm-size
[22:32:11] yeah I know how to configure that, I don't know how to check usage though
[22:32:42] in a file running under the same process
[22:33:19] https://www.php.net/manual/en/apcuiterator.gettotalsize.php maybe..
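For the "how do I check APCu usage" question left hanging above, here is a minimal sketch, assuming the stock APCu extension (which provides `apcu_sma_info()` and `apcu_cache_info()`). Drop it into a script served by the same web-server PHP process; the CLI runs its own, separate APCu instance, so its numbers won't match.

```php
<?php
// Minimal sketch: report APCu shared-memory usage from a script served by
// the same PHP process as the wiki (not from the command line).
$sma   = apcu_sma_info( true );   // shared-memory segments, summary only
$cache = apcu_cache_info( true ); // cache stats without the full entry list

$total = $sma['num_seg'] * $sma['seg_size'];
$free  = $sma['avail_mem'];

printf(
    "APCu: %d entries, %.1f of %.1f MiB used (%.1f MiB free)\n",
    $cache['num_entries'],
    ( $total - $free ) / 1048576,
    $total / 1048576,
    $free / 1048576
);
```

If the free figure is close to zero, raising `apc.shm_size` (32M by default, per the manual page linked above) would be the obvious next step.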
[22:38:17] maybe I'll go with the approach of trying to use the DB backend rather than trying to figure out if APC is working right lol
[22:38:37] apcu_cache_info gives a lot of information, but nothing obviously related to total usage
[22:38:57] APCUIterator::getTotalSize() is not static so I would need to get an APCUIterator first somehow
[22:39:09] so yeah I'll just do that
[22:39:43] if you can use memcached or something... that's likely to be more performant than using the DB
[22:43:15] I'd have to set that up, so I'll save that for the case the DB doesn't work at all
[22:52:52] no change
[22:53:13] I just put an IP of my phone into the list and successfully edited the wiki
[22:54:09] oh wait might've tested it wrong
[22:55:32] nope, it doesn't work
[22:55:54] log file still doesn't exist
[22:56:14] did you touch the log file?
[22:56:39] nope, am I supposed to?
[22:56:58] I'm pretty sure it just appeared by itself on the other wiki
[22:57:14] file permissions etc...
[22:57:23] it's hard to know exactly what will happen on some random system :)
[22:57:30] (it shouldn't harm)
[22:57:38] the directory is writable by the mediawiki user
[22:58:20] huuuh apparently the other wiki stopped working as well
[22:58:45] I was able to make an edit from my phone IP address
[22:58:58] which should be blocked
[23:02:04] last log is from 2025-04-10, but there hasn't been any spam since then
[23:02:21] 04-01*
[23:02:35] ohh I think I know what changed since then
[23:03:07] I updated MediaWiki on 04-03
[23:03:29] maybe I didn't run update.php?! nah I'm pretty sure I did
[23:03:47] at the very least I did today when I installed CheckUser
[23:04:54] well I reran it now for all wikis
[23:05:25] edits still come through
[23:06:51] extensions/StopForumSpam/maintenance/updateDenyList.php --check-ip 1.2.34
[23:06:53] extensions/StopForumSpam/maintenance/updateDenyList.php --check-ip 1.2.3.4
[23:07:12] ooh didn't know this existed
[23:09:16] "NOT Found!" well that would point to it NOT working :p
[23:09:42] try it without any parameters... see if it thinks it can load/cache
[23:09:43] so for some reason the IP addresses don't end up in the cache
[23:09:54] I did, it can
[23:10:04] I suspect the size of the IP list is non-trivial...
[23:10:21] "Done! Loaded 20142 IPs."
[23:10:33] *however*
[23:10:37] wc -l gives 77324
[23:10:43] so there's something wrong for sure
[23:10:55] didn't think to check that before
[23:11:27] truncation?
[23:12:28] why would it do that lol
[23:13:58] not enough space?
[23:14:04] certainly explainable with accel
[23:14:09] using the db...
[23:16:39] surely 87 GiB is enough space lol
[23:18:33] Are you able to vaguely check how many IPs are v4 vs v6?
[23:18:50] oh that might be it actually. let's see.
[23:21:00] nope. vast majority (73120) is IPv4
[23:22:06] Starting update of SFS deny-list in cache...
[23:22:06] Done! Loaded 40434 IPs.
[23:22:19] using https://www.stopforumspam.com/downloads/listed_ip_90_ipv46_all.gz
[23:22:49] I use https://www.stopforumspam.com/downloads/listed_ip_30_ipv46_all.gz
[23:23:24] hmmm
[23:23:37] the list you sent is 191193 addresses long
[23:23:42] 40k is a fraction of that
[23:23:45] so that would check out
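For reference, the setup being debugged above boils down to something like the following LocalSettings.php sketch. Only the variables actually mentioned in the conversation are shown; the paths are placeholders, and it assumes a cron job keeps a local, decompressed copy of the listed_ip_30_ipv46_all list up to date.

```php
// Sketch of the LocalSettings.php bits discussed above (paths are placeholders).
wfLoadExtension( 'StopForumSpam' );

$wgMainCacheType = CACHE_ACCEL;   // APCu-backed main cache

// Locally mirrored StopForumSpam deny list, refreshed by a cron job from
// https://www.stopforumspam.com/downloads/listed_ip_30_ipv46_all.gz
$wgSFSIPListLocation = '/var/lib/sfs/listed_ip_30_ipv46_all.txt';

// Dedicated debug log so the extension's decisions can be inspected.
$wgDebugLogGroups['StopForumSpam'] = '/var/log/mediawiki/stopforumspam.log';
```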
[23:23:58] how does the extension decide on which to use?
[23:24:17] which what?
[23:24:24] which addresses, of course
[23:24:39] I assumed it just loads everything
[23:24:44] but clearly it doesn't
[23:24:49] for both my and your case
[23:24:57] $ipData = str_getcsv( $line, ",", "\"", "\\" );
[23:24:57] $ip = (string)$ipData[0];
[23:24:57] $score = (int)$ipData[1];
[23:24:57] if ( $score && ( $score < $wgSFSIPThreshold ) ) {
[23:24:57] $scoreSkipped++;
[23:24:57] continue;
[23:24:59] }
[23:25:13] oh there's a check for $wgSFSIPThreshold
[23:25:29] I haven't configured that as I said
[23:25:49] it's not documented on the extension page
[23:25:51] https://www.mediawiki.org/wiki/Extension:StopForumSpam
[23:26:36] it's set to 5 by default
[23:26:41] whatever 5 means
[23:27:10] "the number of times that IP number been reported in the period for which the file was generated"
[23:27:15] according to StopForumSpam
[23:27:48] that seems to check out
[23:28:08] I think I'm just gonna set that to 1
[23:28:25] for 30 days that sounds reasonable
[23:28:34] or I could maybe get the 90 days list?
[23:31:06] I've documented the other variables onwiki now
[23:34:08] Reedy: I rewrote the explanation of $wgSFSIPThreshold to explain what the "score" is
[23:34:22] woo, collaboration :P
[23:34:35] we should probably copy these docs to the extension.json at some point
[23:38:49] ok I think I know why one wiki wasn't getting spam, but the other was
[23:39:03] I had QuestyCaptcha on one but not the other
[23:39:26] so I guess I'm gonna add a QuestyCaptcha to the other one rather than reducing $wgSFSIPThreshold
[23:39:36] as the default seems to work well in combination with it
[23:41:40] tfw you think something broke but it's actually working as intended (but is not documented)
[23:42:32] unrelated, but is there something like Nuke but for users?
[23:44:57] to use against a user? rather than for a user to use? :P
[23:45:08] against a user, I mean
[23:45:21] I thought Nuke let you do that?
[23:45:36] > Go to Special:Nuke in order to mass delete pages recently added by a user or IP address. If you don't want to filter by user, you can also just filter by namespace.
[23:46:24] mass delete pages *added by a user*
[23:46:35] it doesn't block users
[23:50:02] https://en.wikipedia.org/wiki/User:Timotheus_Canens/massblock.js type of thing?
[23:51:13] I never used userscripts on MW so idk if it would work for me
[23:51:19] I would definitely prefer a special page
[23:52:39] what I usually use is blockUsers.php
[23:52:53] but it would be good to give this capability to other admins too
[23:53:10] and you still need to make a list of users (which I do via the API)
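On the "make a list of users via the API" part: a rough sketch of the kind of script that could feed blockUsers.php, using the action API's `list=recentchanges` to collect the authors of recently created pages. The wiki URL and the limit are placeholders, and the exact options blockUsers.php accepts should be checked with its `--help` output.

```php
<?php
// Rough sketch: collect authors of recently created pages via the action API
// and print one username per line, e.g. for piping into a blocking tool such
// as maintenance/blockUsers.php (check its --help for exact usage).
// https://wiki.example.org is a placeholder for your own wiki.
$api = 'https://wiki.example.org/w/api.php?' . http_build_query( [
    'action'  => 'query',
    'list'    => 'recentchanges',
    'rctype'  => 'new',    // only page creations
    'rcprop'  => 'user',
    'rclimit' => 500,
    'format'  => 'json',
] );

$data  = json_decode( file_get_contents( $api ), true );
$users = [];
foreach ( $data['query']['recentchanges'] ?? [] as $rc ) {
    if ( !empty( $rc['user'] ) ) {
        $users[$rc['user']] = true;   // dedupe
    }
}
echo implode( "\n", array_keys( $users ) ), "\n";
```

The script only builds the candidate list; deciding which of those accounts are actually spammers still needs a human review before blocking.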