Oh if this is the case, then by all means push it into your Task's docker_setup_bash_script
It does not seem like it has to be done after the git clone; the only part I can see is setting the PYTHONPATH to the additional repo you are pulling, and that should work.
The main hurdle might be passing credentials to git, but if you are using SSH it should be transparent
wdyt?
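For reference, a minimal sketch of what such a docker_setup_bash_script could contain, assuming SSH access to the second repo (the repo URL and paths are hypothetical):
# pull the additional repo (SSH keys avoid credential prompts)
git clone git@github.com:my-org/second-repo.git /opt/second-repo
# make the extra repo importable by the task
export PYTHONPATH="/opt/second-repo:$PYTHONPATH"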
Thanks a lot. I meant running a bash script after cloning the repository and setting the environment
Hmm that is currently not supported 😞
The main issue in adding support is where to store this bash script...
Perhaps somewhere inside ClearML there is an order of startup actions that can be changed?
Not that I can think of,
but let's assume you could have such a thing, what would you put in the bash script? (Basically I want to see if there is a workaround for you, based on your script.)
See here, the docker_setup_bash_script argument: None
It will be executed (no need for the #!/bin/bash btw) before starting to set up the env inside the container, so apt-get and the like can be executed if needed. Notice that if this is something that always needs to be executed, you can put the same list of commands here: None
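For example, a couple of lines one might pass there (the package is just a placeholder), since it runs before the env setup:
# install system packages needed later by the task
apt-get update
apt-get install -y awscli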
Thank you for your response @<1523701205467926528:profile|AgitatedDove14>. I will definitely try the solutions you described above. Could you please advise whether it is possible to execute the "bash.sh" script directly before the environment setup stages when reproducing the experiment? The repository setup involves downloading resources from AWS. While creating a container that incorporates my requirements would help solve this problem, I am interested in a more flexible approach.
@<1523701205467926528:profile|AgitatedDove14> The bash script downloads the necessary resources from AWS and sets the environment variable:
aws s3 cp ..... --recursive
export PYTHONPATH=" "
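For illustration only, a filled-in version of those two lines might look like this (bucket name and paths are hypothetical):
# fetch the resources the repo needs
aws s3 cp s3://my-bucket/project-resources /opt/resources --recursive
# point the task at the extra code
export PYTHONPATH="/opt/second-repo:$PYTHONPATH"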
All commands can be added to the generated docker image, but you will have to change the project structure
Hi @<1524560082761682944:profile|MammothParrot39>
The traditional solution is git submodules: basically the main repo links to the other repos. This way the agent can fully reproduce the full env.
Another option is to install the second repo as a Python package, with a link to the repo and a commit.
And a third option is having the second repo as part of the docker image.
Regarding env variables, you can add '-e env=val' as part of the docker args section (rough sketches of all of these below).
Wdyt?
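For illustration, rough sketches of the options above (repo URL, commit, and variable names are placeholders):
# option 1: add the second repo as a git submodule of the main repo
git submodule add git@github.com:my-org/second-repo.git second_repo
# option 2: install the second repo as a package pinned to a specific commit (e.g. a line in requirements.txt)
pip install "git+ssh://git@github.com/my-org/second-repo.git@<commit-sha>"
# option 3: bake the second repo into the docker image used by the task
# env variables: append '-e MY_ENV_VAR=value' to the docker args section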