I am not sure this is related to the fact the model is not correctly converted to TorchScript
Because Triton only supports TorchScript (not plain torch models) 🙂
I think the limit is a few GB, I'm not sure, I'll have to check
And yes the oldest experiments will be deleted first (with the exception of published experiments, they will be deleted last)
Are you doing from keras import ... or from tensorflow.keras import ?
Exactly, that's my problem: I want to remove it to make sure it is reinstalled (because the version can change)
JitteryCoyote63 yes, this is definitely a pip bug... can you test with the latest pip version, maybe it was fixed? (i.e. git+https:// link)
PompousBeetle71 so basically exclude parameters that are considered "local" only, so that other people will not accidentally use them?
If we have the time maybe we could PR a fix?!
PompousBeetle71, what you are saying is that for some reason --gpus all will not configure the Nvidia drivers to use all the gpus when running bare metal (i.e. no docker). Did I understand you correctly?
error in my-package setup command:
Okay, this seems like an error in the setup.py you have in the "mypackage" folder
Try: task.update_requirements('\n'.join(["."]))
JitteryCoyote63 see here https://stackoverflow.com/questions/55385900/pip3-setup-py-install-requires-pep-508-git-url-for-private-repo Bottom line: you have to add package@ before the link, but if you do that and the package is already installed, pip will not reinstall it from the git repo; this is a pip issue. Since the agent installs everything from scratch, I think it should work for you. Wdyt?
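Purely for illustration (the package name and repo URL below are made up), this is what the PEP 508 "package@ before the link" form looks like when composed in Python:

```python
# Hypothetical example: building a PEP 508 direct reference for a
# private git repo. pip rejects a bare VCS URL in install_requires;
# PEP 508 expects the "name @ url" form instead.
repo_url = "git+https://github.com/example-org/my-private-pkg.git@v1.2"
package_name = "my-private-pkg"

requirement = f"{package_name} @ {repo_url}"
print(requirement)
# my-private-pkg @ git+https://github.com/example-org/my-private-pkg.git@v1.2
```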
Hi GrittyKangaroo27
Maybe check the TriggerScheduler , and have a function trigger something on k8s every time you "publish" a model?
https://github.com/allegroai/clearml/blob/master/examples/scheduler/trigger_example.py
IrritableGiraffe81 could it be the pipeline component is not importing pandas inside the function? Notice that a function decorated as a pipeline component becomes a stand-alone step; this means that if you need pandas you need to import it inside the function. The same goes for all the rest of the packages used.
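A minimal sketch of the idea, using a plain function and a stdlib module as stand-ins for the real pipeline decorator and pandas:

```python
# Stand-in sketch: in ClearML the function would carry the pipeline
# component decorator; a plain function is used here to illustrate the
# rule that each step must import its own dependencies inside the body.
def preprocess(n_rows):
    import json  # stand-in for "import pandas as pd" inside the component
    # the import travels with the function, so the stand-alone step
    # has everything it needs when it runs in its own environment
    return json.dumps({"rows": n_rows})

print(preprocess(3))  # {"rows": 3}
```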
When you are running with run_locally or debug_pipeline you are using your local env, as opposed to the actual pipeline, where a new env is created inside the repo.
Can you send the Entire p...
Hi JuicyDog96
The easiest way at the moment (apologies for the still-missing RestAPI documentation, it is coming :)
is actually the code (full docstring docs)
https://github.com/allegroai/trains/tree/master/trains/backend_api/services/v2_8
You can access it all with an easy Pythonic interface, for example:
from trains.backend_api.session.client import APIClient
client = APIClient()
tasks = client.tasks.get_all()
Hi RoughTiger69
Is the pipeline in question based on decorators or is it based on existing Tasks?
Could be nice to write some automation
@<1541954607595393024:profile|BattyCrocodile47> not restarting the container, restarting the Docker service (on Mac it's an app; I think there is an option in the Docker app to do that)
Sure π
BTW: clearml-agent will mount your host .ssh into the docker to /root/.ssh by default.
So no need to do that manually
PungentLouse55 could you test with 0.15.2rc0 see if there is any difference ?
Correct, but do notice that (1) task names are not unique, and you can change them after the Task was executed; (2) when you clone the Task, you can actually rename it. When an agent is running the Task, the init function is basically ignored, because the Task already exists. Make sense?
Yes, actually that might be it. Here is how it works:
It launches a thread in the background to do all the analysis of the repository, extracting all the packages.
If the process ends (for any reason), it will give the background thread 10 seconds to finish and then give up. If the repository is big, the analysis can take longer, and it will quit
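The shutdown behavior described above can be sketched with plain threading (the function name and the sleep are stand-ins, not the actual implementation):

```python
import threading
import time

results = []

def analyze_repository():
    # stand-in for the package-analysis work described above
    time.sleep(0.1)
    results.append("requirements collected")

t = threading.Thread(target=analyze_repository, daemon=True)
t.start()
# at process exit, wait at most 10 seconds for the background thread,
# then give up (a large repo may simply not finish in time)
t.join(timeout=10)
print(results if results else "analysis incomplete")
```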
just want to be very precise and concise about them
Always appreciated π
Hi RipeGoose2, all PRs are welcome, feel free to submit :)
I'm hoping we are ready to release
I find it quite difficult to explain these ideas succinctly, did I make any sense to you?
Yep, I think we are totally on the same wavelength π
However, it also seems to be not too prescriptive,
One last question, what do you mean by that?
Correct.
It starts with the initial script (entry point); if it is self-contained (i.e. does not interact with the rest of the repo) it will only analyze it, otherwise it will analyze the entire repo code.
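Purely to illustrate the idea (this is not the actual detection code), a "self contained" check could look at whether the entry script imports anything from the repo's own modules:

```python
import ast

# Hypothetical sketch: a script counts as self-contained when it imports
# nothing from the repo's local modules, so only the script itself
# needs to be analyzed; otherwise the whole repo is analyzed.
def is_self_contained(source: str, local_modules: set) -> bool:
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names = [alias.name.split(".")[0] for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            names = [(node.module or "").split(".")[0]]
        else:
            continue
        if any(name in local_modules for name in names):
            return False
    return True

print(is_self_contained("import numpy\nimport os\n", {"repo_utils"}))      # True
print(is_self_contained("from repo_utils import load\n", {"repo_utils"}))  # False
```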