So if I want to train with a remote agent on a remote machine, I have to:

- spin up `clearml-agent` on the remote
- create a dataset using `clearml-data` and populate it with data from my local machine
- use `clearml-data` to upload the data to a Google `gs://` bucket
- modify my code so it accesses the data from the dataset, as described here: https://clear.ml/docs/latest/docs/clearml_data/clearml_data_sdk#accessing-datasets

Am I understanding right?
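If it helps future readers, the upload side of those steps might look roughly like this with the Dataset SDK. This is just a sketch based on the docs linked above; the project/dataset names and the `gs://` bucket path are placeholders, not real names from this thread:

```python
def create_and_upload(local_dir: str) -> str:
    """Create a dataset, stage local files, and push them to a GCS bucket.

    Sketch only: "my_project", "my_dataset", and the bucket URI below are
    placeholder names.
    """
    from clearml import Dataset  # imported lazily inside the helper

    ds = Dataset.create(
        dataset_name="my_dataset",
        dataset_project="my_project",
        output_uri="gs://my-bucket/datasets",  # where the file contents get stored
    )
    ds.add_files(path=local_dir)  # stage files from the local machine
    ds.upload()                   # push the staged files to the bucket
    ds.finalize()                 # lock this version so agents can consume it
    return ds.id
```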
A quick note for others who may visit this… it looks like you have to call `Task.force_requirements_env_freeze(force=True, requirements_file="requirements.txt")` to ensure any changes in `requirements.txt` are reflected in the remote venv.
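For anyone copying this: my understanding from the docs is that the call has to happen before `Task.init()`. A minimal sketch, with placeholder project/task names:

```python
def init_with_frozen_requirements():
    from clearml import Task  # imported lazily inside the helper

    # This must run before Task.init(); otherwise edits to requirements.txt
    # are not picked up when the agent builds the remote venv.
    Task.force_requirements_env_freeze(
        force=True, requirements_file="requirements.txt"
    )

    # Placeholder names; use whatever project/task you normally use.
    return Task.init(project_name="my_project", task_name="my_task")
```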
I see, so there’s no way to launch a variant of my last run (with, say, some config/code tweaks) via the CLI and have it re-use the cached venv?
Actually, with `base-task-id` it uses the cached venv, thanks for this suggestion! Seems like this is equivalent to cloning via the UI.
And I will look into the non-cli workflow you’re suggesting.
… but I have a feeling they will not give me the “instant venv activation” behavior I’m looking for.
Oh, I think I know what I missed. When I set `--project … --name …`, they did not match the names I used when I called `Task.init()` in my code.
I have a strong attachment to a workflow based on CLI, nice zsh auto-suggestions, Hydra and the like. Hence why I moved away from dvc 🙂
Yes, after installing, it listed the installed packages in the console, with the version of each.
I used `task.execute_remotely(queue_name=..., clone=True)` and indeed it instantly activates the venv on the remote. I assume `clone=True` is fine.
I would also be interested in a GCP autoscaler, I did not know it was possible/available yet.
Thanks for the quick response. Will look into this later; I think I understand.
AgitatedDove14 thanks, yes, I assume I would follow these instructions:
https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server_gcp
this is great… so it looks like it’s best to do it in a new dir
should I nuke the `.clearml/cache`?
I use a CLI arg `remote=True`, so depending on that it will run locally or remotely.
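Concretely, the dispatch logic is something like the function below. It is only a sketch: in my actual code the flag comes from a Hydra-style override (`remote=True`), and the project/task names here are placeholders:

```python
def maybe_execute_remotely(remote: bool, queue: str = "default") -> bool:
    """Enqueue the current script on a remote agent when `remote` is set.

    Returns True if the task was dispatched, False for a plain local run.
    Project/task names below are placeholders.
    """
    if not remote:
        return False  # local run, no ClearML dispatch

    from clearml import Task  # imported lazily; only needed when dispatching

    task = Task.init(project_name="my_project", task_name="my_task")
    # clone=True enqueues a copy of the task instead of the local one
    task.execute_remotely(queue_name=queue, clone=True)
    return True
```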
I mean it is in Pip mode, and the agent installs deps from the git repo that it pulls.
`Dataset.get` works fine from a Python script; it pulls the data into the cache. Just the CLI seems broken.
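For reference, the SDK route that works for me is roughly this (dataset/project names are placeholders):

```python
def fetch_dataset_copy(name: str, project: str) -> str:
    """Return the path of a local, read-only cached copy of a dataset."""
    from clearml import Dataset  # imported lazily inside the helper

    ds = Dataset.get(dataset_name=name, dataset_project=project)
    # Downloads into the local ClearML cache on first use, then reuses it.
    return ds.get_local_copy()
```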
Great, and this would show up in the description column in the dashboard?
Thanks, I guess I need to have a bucket under Cloud Storage?
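And for the SDK/agent to actually read and write that bucket, my understanding is you also point `clearml.conf` at your GCP credentials, roughly like this (section names are from the ClearML configuration reference; the values are placeholders):

```
sdk {
    google.storage {
        project: "my-gcp-project"
        credentials_json: "/path/to/service-account-credentials.json"
    }
}
```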