[12:33:42] hello folks! I am going to roll out PKI TLS certs for the AQS codfw cassandra instances, wanted to do it yesterday but got dragged into other tasks
[12:33:50] lemme know if you have anything against it
[12:33:59] Cc: btullis,brouberol --^
[12:36:19] today seems quiet so far
[12:36:57] elukey: that's fine by me. Thanks.
[12:40:28] ack thanks
[12:43:14] I also noticed a lot of warnings for certs about to expire on aqs nodes, good timing :D
[12:45:07] elukey: nothing against it, go ahead!
[12:51:04] first node upgraded, all good so far, will wait a second and then I'll roll restart the other nodes
[12:53:25] lovely, I see an alert
[12:54:37] the instances are up but they are failing the TLS check
[12:58:48] maybe I need to run puppet on alert1001
[13:02:36] yep, temporary, they are gone
[13:02:39] proceeding with the rest
[13:13:15] it will take a bit, please lemme know if you see anything weird
[14:05:58] ok lovely, the cassandra restart cookbook failed on aqs2008, sigh, checking
[14:08:10] the instances are good, maybe it took too long for them to come up
[14:08:13] restarting the cookbook
[15:05:17] restart done!
[16:06:26] anyone know if there is a way in Puppet to merge hashes of string=>list so that after the merge all the lists are concatenated?
simple .merge just keeps the last RHS for each key
[16:07:15] I've tried using reduce |x,y| {x.merge(y)} but that has the above result
[16:13:57] klausman: "merging" as a hash operation just keeps one value, by definition, but you could write your own reducer (or for loop, whatever) that does the thing you want instead
[16:14:20] (the exception is if you're doing a hiera lookup, where the thing also called "merge" has some more options for behavior)
[16:14:51] I've tried writing this as a reduce, but failed miserably :-/
[16:14:54] I don't know of a builtin way to do it though, so I think spelling out the logic is a non-crazy thing to do, if that's the question
[16:16:08] What I can't wrap my head around is how to access the items in the RHS of the merge step
[16:17:51] (omitting $ for brevity) so a reduce({}) |k,v| {merge(k, v)} doesn't work (see above), but k[???] = k[???]+v[???] probably won't work, since hashes can't be changed
[16:18:09] (not to mention that I don't know the ??? value)
[16:20:01] klausman: does wmflib::deep_merge do what you need?
[16:20:31] checking...
[16:22:44] From the docs yes, but of course the proof of the .pp is in the pcc'ing
[16:23:14] ah, no
[16:23:20] "When there is a duplicate key that is not a hash, the key in the rightmost hash will "win.""
[16:23:29] My RHS is an array
[16:24:41] I think you're looking at the one in stdlib, not the one in wmflib
[16:24:57] https://doc.wikimedia.org/puppet/puppet_functions_ruby3x/deep_merge.html I was looking at this
[16:25:33] aaah, the wmflib one is there, but further down
[16:26:01] good that I already started a PCC run before misreading it initially :)
[16:27:36] is this what you wanted klausman?
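[editor's note: the merge behavior discussed above can be sketched in Python as an analogue of Puppet's hash semantics; this is a hypothetical illustration, not Puppet code, and `concat_merge` is a made-up helper name:]

```python
from functools import reduce

a = {"r1": [1, 2], "r2": [9]}
b = {"r1": [3, 4]}

# Plain merge: the rightmost value wins for duplicate keys,
# so a["r1"] is silently discarded -- the problem described above.
plain = {**a, **b}
assert plain == {"r1": [3, 4], "r2": [9]}

# A concatenating reducer (hypothetical helper): duplicate keys
# get their list values appended instead of replaced.
def concat_merge(memo, rhs):
    out = dict(memo)
    for key, value in rhs.items():
        out[key] = out.get(key, []) + value
    return out

merged = reduce(concat_merge, [a, b], {})
assert merged == {"r1": [1, 2, 3, 4], "r2": [9]}
```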
[16:27:38] $butter = {
[16:27:41]   'r1' => [1,2,3],
[16:27:43]   'r2' => [1,2,3],
[16:27:45] }
[16:27:47] $bubbles = {
[16:27:49]   'r1' => [4,5,6],
[16:27:51]   'r2' => [4,5,6],
[16:27:53] }
[16:27:55] $bar = [$butter, $bubbles].reduce({}) |$memo, $r| {
[16:27:57]   $r.reduce($memo) |$memo, $v| {
[16:27:59]     if $v[0] in $memo {
[16:28:01]       $memo + { $v[0] => $memo[$v[0]] + $v[1] }
[16:28:03]     } else {
[16:28:05]       $memo + { $v[0] => $v[1] }
[16:28:07]     }
[16:28:09]   }
[16:28:11] }
[16:28:28] Probably, the wmflib deep_merge looks very similar
[16:28:43] cool, haven't looked at it yet
[16:29:36] aaaargh.... Error: Could not run: failed to allocate memory
[16:36:47] yeah, I am clearly using this wrong, leading to memory exhaustion somehow
[16:37:37] how big are the hashes?
[16:38:32] So I am creating a series of hashes where the keys are on the order of a dozen, and the right hand side is a one-element array with a v4 IP as string
[16:38:45] and then: `.reduce({}) |$mem, $val| {wmflib::deep_merge($mem, $val)}`
[16:39:12] https://gerrit.wikimedia.org/r/c/operations/puppet/+/1020194 Code is here
[16:41:55] your map is returning an array of one-element hashes, is that intended?
[16:42:04] yes
[16:42:30] and then I want to merge all those hashes such that when a key is duplicate its RHS arrays are concatenated
[16:43:42] using the stdlib merge I'd get only whatever RHS showed up last
[16:45:02] that seems like it should work
[16:46:16] both keys and values are strings or arrays of strings, so the memory exhaustion is rather puzzling
[16:46:26] if you were reducing on a hash like { a => [ip], a => [ip], b => [ip] } it would work. I think what's throwing a wrench in this is that it's actually [ { a => [ip] }, {a => [ip]}, ... ]
[16:46:40] not that the hash could have two a's, but I think that's the crux of the mismatch here
[16:47:00] it's always going to only operate on the last one
[16:47:03] but doesn't reduce do this one at a time?
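[editor's note: the nested reduce in the Puppet snippet at 16:27 iterates over each hash's (key, value) pairs, with `$v[0]` as the key and `$v[1]` as the value; a rough Python analogue of the same structure, as a sketch rather than a literal translation:]

```python
from functools import reduce

butter = {"r1": [1, 2, 3], "r2": [1, 2, 3]}
bubbles = {"r1": [4, 5, 6], "r2": [4, 5, 6]}

def merge_pair(memo, kv):
    # kv mirrors Puppet's $v: a (key, value) tuple, so
    # kv[0] is the key and kv[1] is the value.
    key, value = kv
    if key in memo:
        return {**memo, key: memo[key] + value}
    return {**memo, key: value}

# Outer reduce walks the list of hashes; the inner reduce walks
# each hash's items, threading the same memo through both.
bar = reduce(
    lambda memo, h: reduce(merge_pair, h.items(), memo),
    [butter, bubbles],
    {},
)
assert bar == {"r1": [1, 2, 3, 4, 5, 6], "r2": [1, 2, 3, 4, 5, 6]}
```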
[16:47:50] so you'd have a merge of a=>[1] and a=>[2], then a=>[1,2] + b=>[2] etc
[16:48:56] I wonder if I somehow managed to make an array contain itself or sth
[16:49:07] I think the $memo ends up in array form in your example?
[16:49:25] memo starts as a hash ({}) and should remain so
[16:49:33] ok
[16:50:26] are the keys meant to be one-item arrays too?
[16:50:40] No, they should be plain strings
[16:50:52] $k in the top part
[16:51:16] yeah was just noticing the comment at the top
[16:51:29] Note that the comment is the _intent_ :)
[16:54:54] what if at the end you do:
[16:55:11] }.reduce({}) |$mem, $val| { $mem.merge($val) }
[16:55:28] In addition? or instead of the deep merge?
[16:55:33] I mean it's not the right answer either
[16:55:51] but it's interesting from a debugging perspective about what's off the rails here
[16:56:04] With just plain merge, I get almost what I want: all the right k's, but the values are whatever was merged last
[16:56:13] right
[16:56:58] I'll try jesse's version instead of deep_merge
[16:57:07] yeah maybe
[16:57:39] the wmflib version uses the stdlib version in some way I don't quite understand
[16:58:05] I think you could replace the map and only do the reduce, https://phabricator.wikimedia.org/P60953
[16:58:41] oh neat, I will give that a whirl after this run
[17:04:41] the wmflib version uses the stdlib version to do the initial merge of keys and values, then checks for keys in hash1 whose value would have been overwritten by the merge and rewrites them with the combined value of hash1 and hash2, a little funky, but it should work
[17:17:15] well, it now compiles and gives me an empty hash...
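[editor's note: the two-step approach described at 17:04:41 — an initial plain merge, then rewriting the keys whose values were overwritten — can be sketched in Python; this is a hypothetical analogue of the behavior described, not the actual wmflib Ruby code:]

```python
def deep_merge_like(hash1, hash2):
    # Step 1: plain merge; duplicate keys take hash2's value.
    merged = {**hash1, **hash2}
    # Step 2: for keys in hash1 that the merge overwrote,
    # rewrite them with the combined value of both sides.
    for key in hash1:
        if key in hash2:
            merged[key] = hash1[key] + hash2[key]
    return merged

# Toy data in the shape described above: string keys mapping to
# one-element arrays of (made-up) v4 IP strings.
result = deep_merge_like({"a": ["10.0.0.1"]},
                         {"a": ["10.0.0.2"], "b": ["10.0.0.3"]})
assert result == {"a": ["10.0.0.1", "10.0.0.2"], "b": ["10.0.0.3"]}
```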
[17:17:40] oh wait nvm, I need to recheck sth
[17:22:26] looks good klausman
[17:22:50] yeah, currently waiting for the PCC run to complete for the actual _use_ of the darned thing :)
[17:23:25] bblack, Jesse & taavi: thank you all _so_ much, I've been banging my head against this for a day or two :D
[17:24:39] np, happy to, $cassandra_clusters ['eqiad_ml_cache_a'], need to remove the space
[17:25:05] all I did was ask dumb questions and say silly things, like a reverse version of https://en.wikipedia.org/wiki/Rubber_duck_debugging
[17:25:14] but happy to do so anytime :)
[17:25:20] A good rubber duck is still a good rubber duck :)
[17:28:25] [angel's choir] it works!
[17:29:12] woohoo
[17:30:43] lil' bit of cleanup, and I can send it for review. Phew.