[11:20:23] lunch
[14:18:46] \o
[14:27:22] .o/
[14:35:06] o/
[14:41:30] hmm, sync from hdfs to public is weird :S It created a second-level cirrus-search-index dir, and didn't copy any data files, only directories. https://dumps.wikimedia.org/other/cirrus_search_index/
[14:41:51] i was hoping i didn't need to understand how it worked and could just fill out the puppet bits :P
[14:42:36] :/
[14:44:05] yes, i looked briefly at what's inside hdfs and saw '=' but did not dig further to check whether that could have confused the sync script
[14:49:38] hmm, looks like dumps occur on clouddumps100[12].wikimedia.org, i can't ssh in, and don't see any logs in logstash. Should be interesting
[14:49:48] err, s/dumps/hdfs-rsync/
[16:36:26] d-causse it does look like `backend_roles` are important for OpenSearch RBAC, thanks for pointing that out
[18:25:08] dinner
[20:17:21] oh, silly me...i think the reason the data didn't export from hdfs for the dump is that it's not world-readable
[21:10:17] ryankemper: followup on yesterday's patch, missing trailing slash strikes again: https://gerrit.wikimedia.org/r/c/operations/puppet/+/1202289
[21:33:08] classic :P
[21:33:13] Back in 15 mins to merge
[22:00:22] ebernhardson: merged. working on the cleanup now
[22:01:06] should be simple, i also messed up the perms on the other side so the sync could only see the directory but not the contents
[22:01:56] i suppose in theory since it invokes with --delete it might actually clean up on its own at the next sync, not sure
[22:05:47] ebernhardson: I actually don't see anything in `/wmf/data/exports/` so it might have already cleaned up
[22:07:15] oh wait i was looking on clouddumps1001 but that might be the wrong place
[22:07:33] if i understand, clouddumps1001 and 1002 both mount the same dir to /srv/something
[22:07:35] umm, lemme check
[22:08:05] ryankemper: file:///srv/dumps/xmldatadumps/public/other/cirrus_search_index
[22:08:20] yeah, it looks like the `/wmf/data/exports` part is config for HDFS?
[22:08:48] yea, /wmf/data/exports is the path on hdfs, /srv/dumps/ is clouddumps
[22:09:44] OK, so right now we have `/srv/dumps/xmldatadumps/public/other/cirrus_search_index/cirrus-search-index/20251102`, so we need to remove one of `cirrus_search_index` or `cirrus-search-index`?
[22:09:57] the inner one, cirrus-search-index
[22:10:15] on next sync it will put the 20251102 dir one level up, hopefully with content this time
[22:10:42] alright, ran `sudo rm -rfv /srv/dumps/xmldatadumps/public/other/cirrus_search_index/cirrus-search-index/`
[22:11:06] should do it, thanks!
[22:13:12] oh goody, our "friend" who's hosing wdqs codfw is back https://grafana.wikimedia.org/goto/MHfJwtkDg?orgId=1
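
On the 16:36 note about `backend_roles` in the OpenSearch security plugin: a minimal sketch of how backend roles (groups supplied by the auth backend, e.g. LDAP) map users onto OpenSearch roles. The role and group names here are invented for illustration and are not the actual WMF configuration.

```bash
# Hypothetical excerpt of an OpenSearch security roles_mapping.yml;
# backend_roles are what the authenticating backend reports for the user,
# and the mapping grants them the named OpenSearch role.
cat <<'EOF' > roles_mapping.yml
cirrus_reader:
  reserved: false
  backend_roles:
    - "wmf-elasticsearch-ro"   # assumed backend/LDAP group name
  description: "read-only access granted via backend_roles"
EOF
```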
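
For the 20:17 theory that the export failed because the HDFS tree wasn't world-readable, a hedged sketch of checking and opening up permissions on the HDFS side. The export subdirectory name is an assumption; the log only confirms `/wmf/data/exports` as the HDFS root, and the actual job may manage permissions differently.

```bash
# Hypothetical export path under the HDFS root mentioned in the log.
EXPORT=/wmf/data/exports/cirrus_search_index

# Check whether 'other' has read on files and execute on directories.
hdfs dfs -ls -R "$EXPORT" | head

# Make the tree world-readable (r on files, rx on directories) so the
# syncing user can read file contents rather than only listing directories.
hdfs dfs -chmod -R o+rX "$EXPORT"
```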
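
A minimal sketch of the trailing-slash behaviour behind the 21:10 fix, using made-up local paths rather than the real puppet config, and assuming hdfs-rsync follows the same source-path convention as plain rsync (which the fix implies): without the trailing slash the source directory itself is copied into the destination, producing the extra nested cirrus-search-index level; with it, only the contents are copied. It also shows why `--delete` can clean up the stray directory on the next run, as speculated at 22:01.

```bash
# Illustration only; paths are hypothetical.
mkdir -p src/cirrus-search-index/20251102 dst
touch src/cirrus-search-index/20251102/part-00000.gz

# No trailing slash: the directory itself is copied,
# leaving dst/cirrus-search-index/20251102/... (the unwanted extra level).
rsync -a --delete src/cirrus-search-index dst/

# Trailing slash: only the contents are copied, giving dst/20251102/...
# Because of --delete, this run also removes the stray
# dst/cirrus-search-index/ left behind by the previous run.
rsync -a --delete src/cirrus-search-index/ dst/
```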