[13:08:04] \o
[13:10:41] oh no... what happened?
[13:56:22] nothing terrible, it's just that the ticket gave an endpoint, so i put it in the patch, then ryan tested it before shipping and found the url was the UI and not the SPARQL api
[13:56:41] so basically... ryan fixed my laziness :P
[14:00:20] yay ryan!
[14:21:00] sometimes i wonder if we should be investigating these kinds of things, but i dunno... comment from a lucidworks co-founder: If someone searches for "blue shoes", issue that query to the "signals" (aggregate) collection to find products that were clicked on more often for similar (not necessarily exact) products and boost those. At least that's how we did it at Lucidworks.
[14:21:17] where "signals" is essentially aggregating user metrics
[14:22:06] but i dunno... it would either be a big batch thing, or you would almost want a streaming app that watches events come in and updates an index you can query
[14:23:11] perhaps an interesting aside: this lucidworks co-founder has also moved on (now working @ mongo)
[15:59:08] hmm... not sure if worth fixing. debian-glue fails for the opensearch plugins repo. Poked at it a little to get it going, but it ends up running the build without downloading the artifacts, so it still fails :S
[15:59:23] but i don't actually know that debian-glue does anything useful for us, it's always failed.
[16:03:03] my favorite CI error: Could not resolve host: gitlab.wikimedia.org
[16:08:00] hmm... i suspect it's intentional that the integration-agent-pkgbuilder-nnnn instances can't talk to the internet... but curiously the term 'pkgbuilder' is not found in integration/config, operations/puppet, or even just codesearch.wmcloud.org
[16:09:42] * inflatador is not familiar with debian-glue
[16:13:17] it's a generic CI pipeline that is put on repos that build debian packages.
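The "signals" idea above (aggregate click events, then boost products that were clicked for similar queries) can be sketched roughly like this. Everything here is illustrative: the function names, the token-overlap notion of "similar", and the toy events are assumptions, not Lucidworks' actual implementation.

```python
# Sketch: roll raw click events into a per-query "signals" aggregate,
# then compute boost weights for a new query from similar past queries.
from collections import Counter, defaultdict


def aggregate_signals(click_events):
    """Aggregate raw [(query, product_id), ...] clicks into per-query counts.
    In practice this would be the big batch job (or streaming app) mentioned above."""
    signals = defaultdict(Counter)
    for query, product_id in click_events:
        signals[query][product_id] += 1
    return signals


def boosts_for(query, signals):
    """Collect boost weights from every stored query sharing a token with `query`.
    Token overlap is a deliberately crude stand-in for 'similar, not exact'."""
    boosts = Counter()
    for past_query, counts in signals.items():
        if set(query.split()) & set(past_query.split()):
            boosts.update(counts)
    return boosts


events = [("blue shoes", "p1"), ("blue shoes", "p1"), ("blue sneakers", "p2")]
signals = aggregate_signals(events)
print(boosts_for("blue shoes", signals).most_common())  # p1 boosted above p2
```

The boost weights would then be fed into the ranking function of the real search query; that last step is elided here.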
We use debian-glue-non-voting because it never passed :P
[16:13:19] Oh nice, I found https://gerrit.wikimedia.org/g/integration/config/+/f352b04f40b4e1e16f914f393a635a30a8a86dac/jjb/operations-debs.yaml . It would be nice if we could get our packages to build in CI
[16:14:08] so curiously... when it runs on "integration-agent-pkgbuilder-1004" it has the error about not being able to look up the dns, but if i ssh in i can ping just fine. I wonder what magic the runner is using...
[16:22:52] "On the other hand, do not use eatmydata when you care about what software stores... The library is called libEAT-MY-DATA for a reason."
[16:23:25] (apparently we run some CI stuff with libeatmydata)
[16:34:22] Oh, I see what you mean... we do use the debian-glue checks in our plugins repo
[16:36:39] currently suspecting that pbuilder is chrooting to an env that doesn't have a good resolv.conf? I dunno, pulling random strings :P
[16:37:16] Assuming I can log in to the runners, I can take a look
[16:37:55] i can log in, the runners are actually in wmcloud and i'm a member (somehow) of the integration project
[16:39:32] i guess i've been collecting memberships for a while now, sometimes makes things easier :)
[16:44:02] not to get ahead of ourselves, but maybe it's time to revisit https://gitlab.wikimedia.org/repos/sre/wmf-debci . We could build the packages in CI... we'd still have to publish manually, but that's still one less step
[16:45:44] see also https://wikitech.wikimedia.org/wiki/Debian_packaging_with_dgit_and_CI
[16:52:11] hmm, probably more useful to spend the time on :)
[16:52:40] workout, back in ~40
[17:44:13] o/ I would like to publish a new docker image of flink (1.20.1) but I am struggling with docker-pkg. Is anyone familiar with it?
[17:44:56] pfischer I haven't used it in a while but I'm game to give it a shot
[17:45:35] inflatador: awesome, would you have a minute?
[17:46:31] https://meet.google.com/wtc-fjpo-hcy
[17:46:36] pfischer sure, OMW
[17:53:01] in theory... i guess that works? I'll have to do a little more cleanup though: https://gitlab.wikimedia.org/repos/search-platform/opensearch-plugins-deb/-/jobs/563758/artifacts/browse/WMF_BUILD_DIR/
[17:53:39] i also don't like how their system is configured... it only does the build after you merge to master, not in the merge requests. Can probably fix that, but not sure if there are good reasons they did it that way
[18:11:28] pfischer looks like it finished building, might take a few minutes to appear on the docker-registry website though
[18:12:16] ebernhardson damn, you work fast! Looks great so far
[18:27:20] I dunno too much about gitlab CI, but maybe they didn't want to make too many artifacts? Github lets you specify maximum age, compression level etc for non-release artifacts
[18:33:23] lunch, back in ~40
[19:01:45] i suppose i can add my own test that runs verify_commit instead of the full thing. seems reasonable enough
[19:22:35] proposed metrics for autocomplete analysis (1 vs 2 char fuzziness): https://phabricator.wikimedia.org/T397732#10980356
[19:54:22] sorry, been back awhile
[20:47:18] Slightly late to pairing
[20:51:32] np
[20:54:54] inflatador: thanks, it's there by now
[21:02:45] sigh... chrome still won't do google meet. It must have broken when i changed the thing that was supposed to let me screen share things that are not tabs...
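On the "1 vs 2 char fuzziness" question from the autocomplete analysis above: fuzziness is an allowed edit distance between the typed prefix and candidate titles, and widening it from 1 to 2 grows the match set (more typo tolerance, more noise). A minimal sketch with plain Levenshtein distance (Lucene-style fuzzy matching also counts transpositions as single edits, which is ignored here; the titles and query are made up):

```python
# Compare candidate sets at fuzziness 1 vs 2 using plain Levenshtein distance.
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein: insert/delete/substitute cost 1."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # delete from a
                           cur[j - 1] + 1,         # insert into a
                           prev[j - 1] + (ca != cb)))  # substitute (free if equal)
        prev = cur
    return prev[-1]


titles = ["chicago", "chicano", "chimera", "church", "cicada"]
query = "chicgo"  # typo for "chicago"
for fuzz in (1, 2):
    hits = [t for t in titles if edit_distance(query, t) <= fuzz]
    print(fuzz, hits)
```

Here fuzziness 1 matches only "chicago" (one insertion), while fuzziness 2 also pulls in "chicano"; the proposed metrics in the ticket are presumably about measuring whether those extra matches help or hurt users.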
[21:12:12] Just joined pairing... sorry I lost track of time ;(
[21:12:29] sec i'll join, i'm constantly restarting trying to make screen sharing happen, but chrome isn't playing nice :P
[21:16:58] https://gitlab.wikimedia.org/repos/search-platform/opensearch-plugins-deb/-/merge_requests/2
[21:20:54] ryankemper for context: https://wikitech.wikimedia.org/wiki/Debian_packaging_with_dgit_and_CI
[22:13:26] inflatador: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Tool_accounts#Joining
[22:13:35] tl/dr: contact the maintainer
[22:44:47] ACK
[22:45:18] https://gitlab.wikimedia.org/repos/releng/gitlab-trusted-runner/-/merge_requests/124 is ready after me spending too much time in git hell ;)