[07:18:19] 10Machine-Learning-Team, 10Wikilabels: Translations updates are blocked - https://phabricator.wikimedia.org/T282449 (10elukey) @Nikerabbit I removed the pull request setting, can you check if it works now? (the restrictions about who can push to master are still in place so I think it is enough!)
[09:11:03] I am testing the istio 1.6.2 install via helm and it doesn't work, going to try 1.6.14 (note that istioctl worked fine sigh)
[09:21:22] uff, same thing for 1.6.14
[10:55:37] 10Machine-Learning-Team, 10Wikilabels: Translations updates are blocked - https://phabricator.wikimedia.org/T282449 (10Nikerabbit) Works now: `lang=bash $ repomulti commit wiki-ai wiki-ai cd 'wiki-ai/wikilabels'; git add .; if ! git diff --cached --quiet; then git commit -m 'Localisation updates from https://t...
[11:27:41] 10Machine-Learning-Team, 10Wikilabels: Translations updates are blocked - https://phabricator.wikimedia.org/T282449 (10elukey) 05Open→03Resolved a:03elukey I think that we can close, please re-open if anything is missing!
[11:36:44] bbiab, need to buy groceries and lunch
[12:09:54] I tried to follow https://istio.io/latest/docs/setup/install/helm/ for istio 1.6.14, but it doesn't seem to work (helm errors out because some global values are nil, so their attributes can't be accessed)
[12:10:11] tried the same with 1.9.4, it went fine
[12:55:35] ok so I tried the helm charts shipped with 1.9.4 (base + istiod) with 1.6.14 images and they worked flawlessly
[13:04:56] 10Lift-Wing, 10Machine-Learning-Team, 10Patch-For-Review: Install Istio on ml-serve cluster - https://phabricator.wikimedia.org/T278192 (10elukey) @Theofpa so far I tried to follow your guidelines outlined in T278194#6964746, but I am wondering if there is any issue mixing something like the following: * is...
[18:33:53] awesome, looks like loading the model binary from external storage works for custom images using both the v1alpha2 and v1beta1 api
[18:35:05] we'll just have to pass STORAGE_URI as an environment variable to the container, but it seems to work well and also reduces our image size when autoscaling the inference services
[22:28:53] 10Lift-Wing, 10Machine-Learning-Team (Active Tasks), 10Patch-For-Review: Load outlinks topic model in to KFServing - https://phabricator.wikimedia.org/T276862 (10ACraze)
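
(Sketch, not from the log: the 12:55 workaround, installing the base and istiod charts shipped with 1.9.4 while pinning the 1.6.14 images, would look roughly like the following. It assumes the chart layout of the upstream istio-1.9.4 release tarball and the standard global.hub/global.tag overrides from the Helm install docs; the exact flags and values used on the ml-serve cluster are not recorded here.)

```lang=bash
# Fetch the 1.9.4 release, which bundles the Helm charts under manifests/charts/.
curl -L https://github.com/istio/istio/releases/download/1.9.4/istio-1.9.4-linux-amd64.tar.gz | tar xz
cd istio-1.9.4

kubectl create namespace istio-system

# Cluster-wide CRDs and shared resources (the "base" chart from 1.9.4).
helm install istio-base manifests/charts/base -n istio-system

# Control plane from the 1.9.4 chart, but pointing at the 1.6.14 images
# via the global hub/tag values (assumed overrides, not confirmed by the log).
helm install istiod manifests/charts/istio-control/istio-discovery \
  -n istio-system \
  --set global.hub=docker.io/istio \
  --set global.tag=1.6.14
```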
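
(Sketch, not from the log: the setup described at 18:33-18:35, a custom predictor image that pulls the model binary from external storage via a STORAGE_URI environment variable, corresponds roughly to a v1beta1 InferenceService like the one below. The image name, namespace, and storage path are placeholders, not values from the log.)

```lang=bash
kubectl apply -f - <<'EOF'
apiVersion: serving.kubeflow.org/v1beta1
kind: InferenceService
metadata:
  name: outlinks-topic
  namespace: kfserving-test        # placeholder namespace
spec:
  predictor:
    containers:
      - name: kfserving-container
        image: example.org/outlinks-topic-predictor:latest   # placeholder custom image
        env:
          # For custom containers, setting STORAGE_URI makes KFServing's
          # storage-initializer download the model into /mnt/models before
          # the container starts, so the model is not baked into the image.
          - name: STORAGE_URI
            value: "s3://example-bucket/models/outlinks-topic"   # placeholder path
EOF
```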