[12:47:42] DemiMarie we use some of these features, but it's mainly for ACLs and TLS termination
[17:32:50] One thought on QUIC is that HAProxy has had a recent severe bug in this area. I can’t tell if it’s a buffer overflow (which would be remote code execution, so really, really bad) or “just” remote denial of service (still very bad).
[20:07:58] Hello,
[20:07:59] I just read this article: https://news.slashdot.org/story/25/04/04/2357233/wikimedia-drowning-in-ai-bot-traffic-as-crawlers-consume-65-of-resources
[20:07:59] tl;dr: bots are scraping Wikipedia and it is heavy on the infrastructure
[20:08:00] My idea to reduce the bot traffic would be to redirect anyone suspected of being a bot.
[20:08:00] The page the person/bot gets redirected to should have information about options for downloading Wikipedia without scraping.
[20:08:01] For example, here is a downloadable Wikipedia dump in the ZIM format: https://library.kiwix.org/#lang=eng&category=wikipedia
[20:08:01] I would hope that a dev sees this redirect and stops scraping, because a torrent download is faster than scraping.
[20:08:02] More about Wikipedia dumps: https://en.wikipedia.org/wiki/Wikipedia:Database_download
[20:08:02] As this is my first time in this chat and I can't see the history, this may already have been considered.
[20:08:03] If this is the wrong chat, please tell me.
[22:44:09] FIRING: LVSHighRX: Excessive RX traffic on lvs2013:9100 (eno12399np0) - https://bit.ly/wmf-lvsrx - https://grafana.wikimedia.org/d/000000377/host-overview?orgId=1&var-server=lvs2013 - https://alerts.wikimedia.org/?q=alertname%3DLVSHighRX
[22:49:09] RESOLVED: LVSHighRX: Excessive RX traffic on lvs2013:9100 (eno12399np0) - https://bit.ly/wmf-lvsrx - https://grafana.wikimedia.org/d/000000377/host-overview?orgId=1&var-server=lvs2013 - https://alerts.wikimedia.org/?q=alertname%3DLVSHighRX
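
For context on the 20:08 redirect idea: since the channel already uses HAProxy for ACLs, a minimal sketch of what such a redirect could look like there is below. The /bot-info path, the certificate path, the backend name, and the User-Agent patterns are all hypothetical placeholders, and a User-Agent substring match is only a crude stand-in for real crawler detection.

    frontend fe_web
        bind :443 ssl crt /etc/haproxy/site.pem   # hypothetical certificate path

        # Crude crawler heuristic: case-insensitive User-Agent substring match
        # against a few common crawler tokens (placeholder patterns).
        acl suspected_bot hdr_sub(User-Agent) -i bot crawler spider scrapy

        # Never redirect the info page itself, or the redirect would loop.
        acl info_page path_beg /bot-info

        # Suspected bots get a 302 to a page describing the dumps/torrents.
        http-request redirect location /bot-info code 302 if suspected_bot !info_page

        default_backend be_app

    backend be_app
        server app1 10.0.0.10:8080 check   # hypothetical application server

Note that a 302 only helps cooperative crawlers that follow redirects and read the page; scrapers that ignore it would need something like stick-table-based rate limiting instead.
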