task = Task.init(...)
if task.running_locally():
    # wait for the repo detection and requirements update
    task._wait_for_repo_detection()
    # reset requirements
    task._update_requirements(None)
🙂
GiddyTurkey39
BTW: you can always add the missing package via code: Task.add_requirements('torch', optional_version)
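A minimal sketch of that call (the project/task names are placeholders; note the requirement has to be added before Task.init so it gets recorded with the task):

```python
from clearml import Task

# Must be called before Task.init() so the extra requirement is stored
# with the task. The version argument is optional.
Task.add_requirements('torch')  # or: Task.add_requirements('torch', '1.8.0')

task = Task.init(project_name='examples', task_name='add requirements demo')
```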
Hi GiddyTurkey39
Are you referring to an already executed Task or the current running one?
(Also, what is the use case here? Is it because the "Installed packages" are inaccurate?)
GiddyTurkey39 can you ping the server-address
(just making sure, this should be the IP of the server not 'localhost')
GiddyTurkey39 do you mean to delete them from the server?
GiddyTurkey39 Okay, can I assume "Installed packages" contains the packages you need?
If so, you can set up trains-agent on a machine (see instructions on the GitHub)
And then clone the experiment, and enqueue it into the "default" queue (or any other queue your agent is connected to)
https://github.com/allegroai/trains-agent
GiddyTurkey39 what do you have in the Task itself
(i.e. git repo, uncommitted changes, installed packages)
Are you seeing the entire Jupyter notebook in the "uncommitted changes" section?
JoyousKoala59 what Trains server version do you have? The link you posted is for upgrading from v0.15 to v0.16, not from Trains to ClearML
Hmm, let me check, the link is definitely there but this is not a valid link
The bug was fixed 🙂
Hi ReassuredTiger98
I think DefiantCrab67 solved it 🙂
https://clearml.slack.com/archives/CTK20V944/p1617746462341100?thread_ts=1617703517.320700&cid=CTK20V944
Oh dear, I think your theory might be correct, and this is just the mongo preallocating storage.
Which means the entire /opt/trains just disappeared
And if this is the case, that would explain the empty elastic as well
Could it be it was never allocated to begin with ?
so if the node went down and then some other node came up, the data is lost
That might be the case. Where is the k8s running? A cloud service?
post_optional_packages: ["google-cloud-storage", ]
It will be installed last (i.e. after all the other packages), but only if it already appears in the "Installed packages" list
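For reference, a sketch of where that setting would sit in the agent configuration file (assuming the clearml.conf / trains.conf layout):

```
agent {
    package_manager {
        # installed after all other packages, and only if the package
        # already appears in the task's "Installed packages" list
        post_optional_packages: ["google-cloud-storage", ]
    }
}
```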
well, it's only when adding a "- name" entry to the template
Nonetheless it should not break it 🙂
at the end it's just another env var
It should work, GIT_SSH_COMMAND is used by pip
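As an illustration, you could point git (and therefore pip's git+ssh package installs) at a specific key via that env var; the key path here is a placeholder:

```shell
# hypothetical key path; git picks this up for every ssh clone/fetch,
# including the ones pip performs for git+ssh package URLs
export GIT_SSH_COMMAND="ssh -i ~/.ssh/my_deploy_key -o IdentitiesOnly=yes"
```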
You need to mount it to ~/clearml.conf
(i.e. /root/clearml.conf)
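For example, when running in docker the mount could look like this (the image name and host-side path are placeholders):

```shell
# mount the host-side config file to /root/clearml.conf inside the container
docker run -v $HOME/clearml.conf:/root/clearml.conf allegroai/clearml-agent
```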
SubstantialElk6 if you call Task.init with continue_last_task=&lt;task_id&gt; it will automatically add the last_iteration of the previous run to any logging/report, so you never overwrite the previous reports 🙂
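A minimal sketch of that call (project/task names are placeholders; the id is the task you want to continue):

```python
from clearml import Task

# continue a previously executed task: iterations/reports are offset by
# the previous run's last_iteration, so nothing gets overwritten
task = Task.init(
    project_name='examples',
    task_name='continued run',
    continue_last_task='<task_id>',  # id of the task to continue
)
```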
I have to commit the YAML with my AWS credentials to git.
CleanPigeon16 please do not 🙂
Either put them on the Task itself, or as OS environment variables on the machine/agent running the Task.
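For example, the credentials could be exposed to the agent machine as the standard AWS environment variables instead of living in the committed YAML (values are placeholders):

```shell
# set on the machine/agent that executes the Task, never committed to git
export AWS_ACCESS_KEY_ID="<your-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret>"
```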
Regarding where it is stored (I think the default is the DevOps project, need to look at the code)
Exactly, just pointing to the fact that, that machine is yours ;)