AgitatedDove14 - any thoughts on this? Would like to use profile / IAM roles as well.
Can I switch off git diff (change detection)?
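If it helps, I think the relevant clearml.conf switch is this one (setting name from memory, so it may be off):
```
# clearml.conf - sketch only, not verified against the docs
sdk {
  development {
    # when false, the uncommitted git diff should not be collected with the task
    store_uncommitted_code_diff: false
  }
}
```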
In this case I have data and then a set of pickles created from the data.
Nothing in mind, just wanted to know if there was one
Ok, but it doesn't work for me. Can you or AgitatedDove14 point me to the relevant code so that I can see what's wrong?
Thank you! Does this go as a root logging {} element in the main conf? Outside the sdk section, right?
In this case, particularly because of the pickle protocol version difference between Python 3.7 and 3.8.
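A minimal sketch of the workaround I have in mind - pinning the pickle protocol to one that both interpreters support (file name and data below are placeholders):
```python
import pickle

obj = {"example": [1, 2, 3]}  # placeholder for the real objects built from the data

# protocol 4 is readable by both Python 3.7 and 3.8, so pin it explicitly
# instead of relying on each interpreter's default
with open("artifact.pkl", "wb") as f:
    pickle.dump(obj, f, protocol=4)

with open("artifact.pkl", "rb") as f:
    restored = pickle.load(f)
```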
AgitatedDove14 - on a similar note, using this, is it possible to add to a task's requirements with task_overrides?
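Something along these lines is what I mean - note the script.requirements.pip override path is my assumption, I have not confirmed it is accepted:
```python
from clearml import PipelineController

pipe = PipelineController(name="demo-pipeline", project="examples", version="0.1")

pipe.add_step(
    name="train",
    base_task_project="examples",
    base_task_name="train task",
    # assumed field path for injecting extra pip requirements into the cloned task
    task_overrides={"script.requirements.pip": "pandas==1.3.5\nboto3"},
)
```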
forking and using the latest code fixes the boto issue at least
Think I will have to fork and play around with it
Yes, I have no experience with Triton - does it do lazy loading? I was wondering how it can handle tens or hundreds of models. If we load balance across a set of these engine containers with, say, 100 models, and all of these models get traffic but the distribution is not even, will each of those engine containers load all 100 models?
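For context, the closest thing to lazy loading I could find is Triton's explicit model-control mode, roughly like this (model name is a placeholder, and I don't know whether clearml-serving uses this mode):
```python
# server side (not Python): start tritonserver with
#   tritonserver --model-repository=/models --model-control-mode=explicit
# so nothing is loaded at startup; models are then loaded/unloaded on demand.
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

if not client.is_model_ready("my_model"):
    client.load_model("my_model")   # load only the model this container needs

# ... route inference traffic to "my_model" ...

client.unload_model("my_model")     # free memory so other models can be loaded
```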
I am also not understanding how clearml-serving handles versioning for models in Triton.
Yeah, please share some general active ones if you can - to discuss both the algo and engineering sides.
Like I said, it works, but then goes into the error loop.
Progress with boto3 added, but fails:
Only allowed to have an SSH key, not username/password.
As of now I am solving it by updating the git config locally before creating the task.
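i.e. something like this right before Task.init (remote name and URL are placeholders):
```python
import subprocess
from clearml import Task

# point the remote at the SSH form so the task records a repo URL that works
# with an SSH key instead of username/password
subprocess.run(
    ["git", "remote", "set-url", "origin", "git@github.com:my-org/my-repo.git"],
    check=True,
)

task = Task.init(project_name="examples", task_name="my experiment")
```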
AgitatedDove14 - are there cases when it tries to skip steps?
I also have a pipelines.yaml which I convert to a pipeline.
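Roughly like this, as a sketch - the YAML schema, project and task names below are made up for illustration:
```python
import yaml
from clearml import PipelineController

with open("pipelines.yaml") as f:
    spec = yaml.safe_load(f)

pipe = PipelineController(
    name=spec["name"], project=spec["project"], version=spec.get("version", "1.0")
)

# each YAML entry becomes a step cloned from an existing base task
for step in spec["steps"]:
    pipe.add_step(
        name=step["name"],
        base_task_project=step["project"],
        base_task_name=step["task"],
        parents=step.get("parents", []),
        parameter_override=step.get("parameters", {}),
    )

pipe.start(queue="default")
```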
Anything that is shown in git status as untracked? So ignore anything .gitignored, and maybe a param or config flag to say include untracked. Anyway, it's only a nice-to-have feature.
Found the custom backend aspect of Triton - https://github.com/triton-inference-server/python_backend
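For reference, the minimal model.py skeleton that python_backend expects looks roughly like this (tensor names are placeholders defined by the model's config.pbtxt):
```python
import numpy as np
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def initialize(self, args):
        # load the actual model here; args includes model_config, model_name, etc.
        pass

    def execute(self, requests):
        responses = []
        for request in requests:
            data = pb_utils.get_input_tensor_by_name(request, "INPUT0").as_numpy()
            out = pb_utils.Tensor("OUTPUT0", data.astype(np.float32))
            responses.append(pb_utils.InferenceResponse(output_tensors=[out]))
        return responses

    def finalize(self):
        pass
```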
Is that the right way?
AgitatedDove14 - it does have boto, but the clearml-serving installation and code refer to an older commit hash, and hence the task was not using them - https://github.com/allegroai/clearml-serving/blob/main/clearml_serving/serving_service.py#L217
Will debug a bit more and see what's up
Would be good to have frequent-ish releases if possible
AgitatedDove14 - apologies for the late reply. To give context, this is in a SageMaker notebook which has conda envs.
I use a lifecycle config like this to pip install a package (a .tar.gz downloaded from S3) in a conda env - https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/install-pip-package-single-environment/on-start.sh
In the notebook I can do things like create experiments and so on. Now the problem is in running the cloned experimen...