Assuming TensorFlow (which would be an entire folder):
local_folder_or_files = model.get_weights_package()
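For completeness, a minimal sketch of pulling the packaged folder back on the consuming side (the model id is a placeholder):
from clearml import InputModel

# assumption: the id of the previously registered TensorFlow model
model = InputModel(model_id="<model_id>")
# get_weights_package() downloads the packaged archive and extracts it locally,
# returning the path(s) to the extracted weights folder/files
local_folder_or_files = model.get_weights_package()
print(local_folder_or_files)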
OK, I got it by modifying the .conf file and putting the credentials on node
Nice! 🙂
Glad to hear that! 🙂
How come the second one is one line?
GrievingTurkey78 in your clearml.conf do you have:
agent.package_manager.type: conda
Or
https://github.com/allegroai/clearml-agent/blob/73625bf00fc7b4506554c1df9abd393b49b2a8ed/docs/clearml.conf#L59
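i.e. something like this in the agent section of clearml.conf (a sketch; pip is the default):
agent {
    package_manager: {
        # supported values: pip, conda, poetry
        type: conda,
    }
}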
DepressedChimpanzee34
I have sad news, it is working just fine for me 🙂
Python 3.6, Ubuntu, hydra-core==1.1.0.dev6
ElegantKangaroo44 definitely a bug, will be fixed in 0.15.1 (release in a week or so)
https://github.com/allegroai/trains/issues/140
Hmmm, that is a good use case to have (maybe we should have --stop take an argument?)
Meanwhile you can do:
$ clearml-agent daemon --gpus 0 --queue default
$ clearml-agent daemon --gpus 1 --queue default
then to stop only the second one:
$ clearml-agent daemon --gpus 1 --queue default --stop
wdyt?
Hi DilapidatedDucks58
how to force-reinstall package from github in Installed Packages
You mean make sure that the agent installs it from github?
The "Installed packages" section is equivalent to "requirements.txt" anything you can put in requirements.txt, you can put there.
For example, adding to "Installed Packages":
git+
will make sure you install the latest clearml from GitHub.
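To make it concrete, assuming you want the latest clearml straight from GitHub, the full line would typically look like:
git+https://github.com/allegroai/clearml.git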
Notice that you cannot have two packages with the same name (just like with regular requirements.txt)...
Oh I do not think this is possible, this is really deep in a background thread.
That said, we can sample the artifacts and re-register the html as debug media:
url = Task.current_task().artifacts['notebook preview'].url
Task.current_task().get_logger().report_media('notebook', 'notebook', iteration=0, url=url)
Once the html is uploaded, it will keep updating on the same link so no need to keep registering the "debug media". wdyt?
does that mean that it will install my package last?
It will install last, but not because it was last in the list, rather because it is a local/repo package 🙂
Can I make the modifications to the tensorflow code in setup.py?
You mean like have the changes as part of the "uncommitted changes" section ?
You are correct, the agent will clone the git repository and install the requirements, as written in the task's installed packages section. Regarding the git branch, notice it will pull the specific commit id as stated in the execution section, and it will apply any uncommitted changes. You can edit the execution section and change the commit to the latest in a specific branch (you should probably also clear the uncommitted changes if you do that).
"what's the trains/trains-agent/trains-server versions ?" how can I check it?
trains/trains-agent are pip packages, so:
pip freeze | grep trains
trains-server you can check in the /profile page top left corner
Did you set up an agent on a machine? (See clearml-agent in the docs for details)
SubstantialElk6 when you say "Triton does not support deployment strategies" what exactly do you mean?
BTW: updated documentation already up here:
https://clear.ml/docs/latest/docs/clearml_serving/clearml_serving
shows that the trains-agent is stuck running the first experiment, not the trains_agent execute --full-monitoring --id a445e40b53c5417da1a6489aad616fee
Is the second trains-agent instance running inside the docker? If the task is aborted, this process should have quit...
Any suggestions on how I can reproduce it?
Is this example working for you?
https://github.com/allegroai/clearml/blob/master/examples/reporting/model_config.py
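In short, the pattern in that example boils down to something like this (a sketch; the dict contents are illustrative):
from clearml import Task

task = Task.init(project_name="examples", task_name="model configuration")
# connect a configuration dict (or a path to a configuration file);
# when executed by an agent, the returned object reflects any UI edits
model_config = {"num_layers": 4, "dropout": 0.25}
model_config = task.connect_configuration(model_config)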
LovelyHamster1
Also, you can use pip freeze instead of the static code analysis; on your development machines set:
detect_with_pip_freeze: true
https://github.com/allegroai/clearml/blob/e9f8fc949db7f82b6a6f1c1ca64f94347196f4c0/docs/clearml.conf#L169
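i.e. in the sdk.development section of your clearml.conf:
sdk {
    development {
        # use the pip freeze output instead of analyzing the script's imports
        detect_with_pip_freeze: true
    }
}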
So I can set output_uri = "s3://<bucket_name>/prefix" and the local models will be uploaded to the s3 bucket by ClearML?
Yes, magic 🙂
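In code that is just (bucket name and prefix are placeholders):
from clearml import Task

# every model stored by this task will be uploaded to the bucket
task = Task.init(
    project_name="examples",
    task_name="s3 output",
    output_uri="s3://<bucket_name>/prefix",
)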
Hi LovelyHamster1 ,
you mean totally ignore the "installed packages" section, and only use the requirements.txt ?
PompousBeetle71 so basically exclude parameters that are considered "local" only, so that other people will not accidentally use them?
Is it across the board for any Task?
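Meanwhile, one way to keep them out is to filter the dict before connecting it (a sketch; the "_local" suffix convention is just an illustration):
from clearml import Task

task = Task.init(project_name="examples", task_name="filtered parameters")
params = {"lr": 0.001, "batch_size": 32, "scratch_dir_local": "/tmp/cache"}
# connect only the parameters that make sense on other machines
task.connect({k: v for k, v in params.items() if not k.endswith("_local")})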
What would you expect to happen if you clone a Task that used the requirements.txt? Would you ignore the full "pip freeze" and use the requirements.txt again, or is this the time we want to use the "installed packages"?
While if I just download the right packages from the requirements.txt then I don't need to think about that
I see your point, the only question is how come these packages are not automatically detected?
Just making sure I understand: you want to upload your models with clearml to the Yandex-compatible s3 storage?
Make sure you have the S3 credentials in your agent's clearml.conf:
https://github.com/allegroai/clearml-agent/blob/822984301889327ae1a703ffdc56470ad006a951/docs/clearml.conf#L210
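Something along these lines (a sketch; the Yandex Object Storage endpoint host is an assumption, check your provider's docs):
sdk {
    aws {
        s3 {
            credentials: [
                {
                    # non-AWS S3-compatible endpoint
                    host: "storage.yandexcloud.net:443"
                    bucket: "my_bucket"
                    key: "<access_key>"
                    secret: "<secret_key>"
                    multipart: false
                    secure: true
                }
            ]
        }
    }
}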
It's the correct way to do it, right?
Yep 🙂 That said, this is not running as a service, so you will need to spin it up on your machine. You can definitely connect it with the free SaaS server, and spin up the serving on your machine with docker-compose.
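Roughly (a sketch; the exact file names follow the clearml-serving repo layout and may change, check its README):
$ git clone https://github.com/allegroai/clearml-serving.git
$ cd clearml-serving
# point the env file at your server (the free SaaS server works too),
# then spin up the serving containers locally
$ docker-compose --env-file docker/example.env -f docker/docker-compose.yml up -d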