[14:45:00] NBC reports that America's "Teamsters" labor union was hit by a ransomware attack demanding $2.5 million back in 2019.
[14:45:01] "But unlike many of the companies hit by high-profile ransomware attacks in recent months, the union declined to pay, despite the FBI's advice to do so, three sources familiar with the previously unreported cyberattack told NBC News."
[14:45:01] Personal information for the millions of active and retired members was never compromised, according to a Teamsters spokesperson, who also said that only one of the union's two email systems was frozen along with other data. Teamsters officials alerted the FBI and asked for help in identifying the source of the attack. They were told that many
[14:45:02] similar hacks were happening and that the FBI would not be able to assist in pursuing the culprit.
[14:45:02] The FBI advised the Teamsters to "just pay it," the first source said. "They said 'this is happening all over D.C. ... and we're not doing anything about it,'" a second source said.
[14:45:03] Union officials in Washington were divided over whether to pay the ransom — going so far as to bargain the number down to $1.1 million, according to the sources — but eventually sided with their insurance company, which urged them not to pony up... The Teamsters decided to rebuild their systems, and 99 percent of their data has been restored
[14:45:03] from archival material — some of it from hard copies — according to the union's spokesperson.
[14:45:04] The FBI's communications office did not reply to repeated requests for comment. The FBI's stance is to discourage ransomware payments.
[14:45:04] NBC News draws a lesson from the fact that it took nearly two years for this story to emerge. "An unknown number of companies and organizations have been extorted without ever saying a word about it publicly."
[14:45:27] spammer...
[16:16:05] friend wondering how big the article text of all of wikimedia is.
i'm sure we've got stats for that somewhere but I can't find it?
[16:28:34] MC8: I reposted your question in #wikimedia-databases. If anyone knows how to answer it, it is likely to be the DBAs. Also I expect the answer to come with multiple "it depends on how you ..." caveats. :)
[16:29:54] thanks :) I was expecting us to have a dashboard or something
[16:38:48] MC8: I'm sure there are some, but the story they tell will be complicated. The canonical storage is the https://wikitech.wikimedia.org/wiki/External_storage MariaDB cluster. In that location the data is stored using several different historical compression schemes, which makes getting a "raw content size" number non-trivial.
[16:41:31] * urbanecm feels the easiest answer is through dumps
[16:41:40] (full dumps, with all historical revisions)
[16:42:01] there is also the "which size" question. should all revisions be counted, or only the current head revision? should some namespaces be excluded? Flow boards? with or without template expansion?
[16:42:29] I apologise for the nerd snipe
[16:42:34] :)
[16:47:08] if you get good enough at it MC8, somebody might give you the phabricator badge :) https://phabricator.wikimedia.org/badges/view/14/
[16:48:04] lmao
[18:27:49] 😎
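The dump-based approach suggested above can be sketched roughly as follows. This is a minimal illustration, not a definitive implementation: it assumes a MediaWiki XML export (e.g. a pages-meta-history dump) where revisions appear oldest-first within each page, it does no decompression or namespace/Flow-board filtering, and it counts raw wikitext bytes without template expansion. The function name `dump_text_bytes` is made up for this sketch.

```python
import io
import xml.etree.ElementTree as ET

def dump_text_bytes(xml_stream, all_revisions=True):
    """Sum UTF-8 bytes of revision <text> elements in a MediaWiki XML dump.

    all_revisions=True  -> count every historical revision
    all_revisions=False -> count only each page's current (last) revision
    """
    total = 0
    page_last = 0
    for _event, elem in ET.iterparse(xml_stream, events=("end",)):
        # Real dumps use an xmlns, so tags look like "{...export-0.10/}text";
        # strip the namespace prefix before comparing.
        tag = elem.tag.rsplit("}", 1)[-1]
        if tag == "text":
            size = len((elem.text or "").encode("utf-8"))
            if all_revisions:
                total += size
            else:
                page_last = size  # revisions are oldest-first; keep the latest
        elif tag == "page":
            if not all_revisions:
                total += page_last
                page_last = 0
            elem.clear()  # keep memory flat when streaming multi-GB dumps
    return total

# Tiny inline example standing in for a real dump file:
sample = (
    "<mediawiki><page><title>A</title>"
    "<revision><text>abcd</text></revision>"
    "<revision><text>abcdefgh</text></revision>"
    "</page></mediawiki>"
)
print(dump_text_bytes(io.StringIO(sample)))                       # all revisions: 12
print(dump_text_bytes(io.StringIO(sample), all_revisions=False))  # head only: 8
```

Streaming with `iterparse` rather than loading the whole tree matters here, since full-history dumps of the larger wikis are far too big to hold in memory.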