[08:34:50] This is now fixed btw.
[08:54:07] Oh yeah forgot to update here lol ... I posted the reason on the task when I fixed it also.
[09:00:46] Yeah I saw your response on Phorge and thought of notifying the original person. Most people don't check Phorge, so I try to notify people on Discord when something is resolved.
[21:19:24] I don't know if this is the right place to ask, but I'm viewing the analytics for a wiki I'm an admin on, and the numbers are being skewed by a Chinese bot that's visiting the same pages over and over. Is there any way to exclude particular hosts or countries from analytics, or to ban particular hosts or bots from the wiki itself? It doesn't appear to respect robots.txt.
[21:21:59] And you have a `NOINDEX` on these pages?
[21:22:51] Almost useless against badly behaving scrapers
[21:22:59] We blocked some the other day
[21:23:35] @posix_memalign was it you who suggested the block?
[21:24:28] Oh, I never turned that on
[21:24:31] I set the log up
[21:26:10] @themysteriouskm thanks for the reminder, I just switched it from log -> challenge
[21:26:35] They're all special pages, and the page source has noindex. I don't think it respects robots meta tags either.
[21:27:15] Alright, thanks 🙂
[21:28:12] Is there any way to remove a country from analytics, or to get the data with the bot's visits removed? I can take a guess which requests are due to the bot, but having the data would make it easier.
[21:29:02] That suggestion was almost definitely useless
[21:29:30] Not really, because the metric that was pointed out to us isn't well exposed
[21:30:47] The vast majority of the traffic is from China, though
[21:30:56] @agentisai might have thoughts
[21:31:09] Yeah. I brought it up and @pskyechology found a bad UA.
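(A minimal sketch of how a bad UA like the one mentioned above could be spotted: count requests per user agent in a combined-format access log and surface the heaviest hitters. The log format, sample lines, and threshold-free ranking here are assumptions for illustration, not the tooling actually used.)

```python
# Hypothetical sketch: rank user agents by request volume in a
# combined-format access log, where the UA is the last quoted field.
from collections import Counter
import re

# Matches the final double-quoted field on the line (the user agent).
UA_RE = re.compile(r'"([^"]*)"\s*$')

def top_user_agents(lines, limit=5):
    """Count requests per user agent and return the heaviest hitters."""
    counts = Counter()
    for line in lines:
        m = UA_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts.most_common(limit)

# Fabricated sample lines, just to show the shape of the input.
sample = [
    '1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET /wiki/Special:Foo HTTP/1.1" 200 123 "-" "BadBot/1.0"',
    '5.6.7.8 - - [01/Jan/2025:00:00:01 +0000] "GET /wiki/Main_Page HTTP/1.1" 200 456 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [01/Jan/2025:00:00:02 +0000] "GET /wiki/Special:Foo HTTP/1.1" 200 123 "-" "BadBot/1.0"',
]
print(top_user_agents(sample))  # BadBot/1.0 dominates the counts
```

At 11 million requests a day from a single UA, this kind of ranking makes the offender obvious at a glance.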
[21:31:56] 11.3 million requests or so in a day
[21:32:04] I'm now actually captchaing it
[21:32:37] Filtering might be something cool to add to Special:Analytics
[21:33:50] You have a feature request then 🙂
[21:45:39] 11 million is insane
[21:59:19] it's like my parents trying to talk to me
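(A minimal sketch of the kind of filtering proposed for Special:Analytics above: drop hits from excluded countries or user agents before aggregating pageviews. The record shape, the dict keys `page`/`country`/`ua`, and the sample data are assumptions, not MediaWiki's actual analytics schema.)

```python
# Hypothetical sketch: aggregate pageviews while excluding traffic
# from given countries or from user agents matching given substrings.
from collections import Counter

def pageviews(records, exclude_countries=(), exclude_ua_substrings=()):
    """Count views per page, skipping excluded countries/user agents."""
    views = Counter()
    for r in records:
        if r["country"] in exclude_countries:
            continue
        if any(s in r["ua"] for s in exclude_ua_substrings):
            continue
        views[r["page"]] += 1
    return views

# Fabricated sample hits, just to show the filtering behaviour.
hits = [
    {"page": "Main_Page", "country": "US", "ua": "Mozilla/5.0"},
    {"page": "Special:RecentChanges", "country": "CN", "ua": "BadBot/1.0"},
    {"page": "Special:RecentChanges", "country": "CN", "ua": "BadBot/1.0"},
]
print(pageviews(hits, exclude_countries=("CN",)))  # the bot traffic drops out
```

Either filter alone would answer the original question: exclude the bot's country, or exclude its user agent, and the remaining counts reflect real readers.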