[05:51:32] Wtf
[05:54:41] Stay safe @paladox
[05:55:05] Is it possible to edit my wiki's robots.txt? I'm asking because I'd like to add the lines in this article that block ChatGPT's web scraper https://arstechnica.com/information-technology/2023/08/openai-details-how-to-keep-chatgpt-from-gobbling-up-website-data/
[06:03:41] Thanks x
[06:27:57] @paladox you over in the US?
[06:28:40] Add it to MediaWiki:Robots.txt - it may take up to 24 hours to take effect
[06:29:23] Thanks!
[06:33:57] Yeah, until next week
[06:35:06] Stay safe
[15:39:50] fellas, is the Nuke extension broken?
[15:40:26] trying to delete FANDOM trash, it doesn't list pages matching `title%`
[17:42:48] Nuke doesn't work on imports
[17:43:09] interesting
[17:43:34] You either have to delete manually, write a script, or send a list of pages to SRE to run through the deleteBatch script
[17:43:35] we had to go with DPLNuke in the end
[17:43:47] Or that, I guess :)
[19:49:09] https://github.com/wikimedia/mediawiki-extensions-Nuke/commit/0934e345005bf09b570e944b305da3abf90e00a1
[19:49:17] https://github.com/wikimedia/mediawiki-extensions-Nuke/blob/master/includes/SpecialNuke.php#L357C5-L357C5
[19:49:18] hmm
[19:52:01] https://github.com/wikimedia/mediawiki-extensions-Nuke/commit/46c369b58c9c9ee862403e198717105b7973c25f
[19:52:02] oh
[19:55:36] I made 2 wiki deletion requests on the Stewards' Noticeboard and I've been patient, I just need you to check it out @paladox
[20:02:00] paladox is SRE, stewards delete wikis
[20:06:59] And once again, you have been warned to be patient in the past.
[20:07:33] I've been patient, it's been days
[20:07:52] Well yes, but that's to be expected with the Steward shortage
[20:08:03] Great, there are people who have been waiting longer
[20:08:16] Ok
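
For reference, a minimal sketch of the robots.txt lines the linked Ars Technica article describes for blocking OpenAI's GPTBot crawler; per the advice above, they would go on the wiki's MediaWiki:Robots.txt page, whose contents get appended to the generated robots.txt:

```
# Block OpenAI's GPTBot crawler from the entire site
User-agent: GPTBot
Disallow: /
```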
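
As a rough sketch of the "send a list of pages" route mentioned above: MediaWiki ships a deleteBatch.php maintenance script that deletes every title listed in a text file, one per line. The titles and paths here are made-up placeholders, and the script has to be run from the wiki's install root by someone with shell access (which is why the list goes to SRE):

```shell
# 1. Put one page title per line in a plain text file
printf '%s\n' 'Imported Fandom page 1' 'Imported Fandom page 2' > /tmp/pages.txt

# 2. SRE would then feed it to the maintenance script, roughly:
#    php maintenance/deleteBatch.php -u Maintenance -r "Cleanup of imported Fandom pages" /tmp/pages.txt
```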