While creating GCP credentials using None
What values should I insert in the following step so that the autoscaler has access? As of now I have left this field blank.
I am uploading the dataset (for YOLOv8 training) as an artifact. When I download the artifact (.zip file) from the UI, the path to the images is something like /Users/adityachaudhry/.clearml/cache/......, but when I call .get_local_copy() I get the local folder structure where my images live on my system as the path. For running the pipeline remotely, I want the path to be like /Users/adityachaudhry/.clearml/cache/......
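For reference, a minimal sketch of the artifact round-trip (project/artifact names are placeholders): fetching the artifact back through the task downloads it into the ClearML cache and returns that cache path, which is the behaviour wanted for remote runs:

from clearml import Task

# producer side: register the zip as an artifact on the task
task = Task.init(project_name="yolo", task_name="upload-data")
task.upload_artifact(name="dataset", artifact_object="dataset.zip")

# consumer side (any machine): this downloads into ~/.clearml/cache/...
# and returns that cache path, not the original local folder
producer = Task.get_task(task_id=task.id)
local_zip = producer.artifacts["dataset"].get_local_copy()
print(local_zip)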
My git repo only contains the hash-ids, which are used to download the dataset onto my local machine.
One more thing: in my git repo there is a dataset folder that contains hash-ids, and these hash-ids are used to download the dataset. When I run the pipeline remotely, the files/images should be downloaded into the cloned git repo inside .clearml/venvs, but when I check inside that venvs folder there are no images present.
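If those hash-ids are ClearML dataset ids, a minimal sketch of resolving one into a cached local folder on whichever machine the step runs on (the id is a placeholder); note the returned path lives under the ClearML cache, not inside the cloned repo in .clearml/venvs:

from clearml import Dataset

# resolve a dataset id into a read-only cached copy of the files
ds = Dataset.get(dataset_id="<your-dataset-id>")
images_dir = ds.get_local_copy()  # a path under the ClearML cache
print(images_dir)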
I want to understand what's happening at the backend. I want to know how running the pipeline logic and the tasks on separate agents is going to sync everything up.
So I should clone the pipeline, run the agent and then enqueue the cloned pipeline?
Is there a way to clone the whole pipeline, just like we clone tasks?
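Since the pipeline controller is itself a task, one possible sketch (assuming you have the controller's task id) is to clone and enqueue it like any other task:

from clearml import Task

# clone the controller task; when an agent picks up the clone,
# the controller logic re-schedules all the pipeline steps
controller = Task.get_task(task_id="<pipeline-controller-task-id>")
cloned = Task.clone(source_task=controller, name="my pipeline (clone)")
Task.enqueue(cloned, queue_name="services")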
Ok, so here's what I want to do: I want to export Google Application credentials to my docker container. Here's what I have tried so far:
agent.extra_docker_shell_script: [
    "echo '{\"type\": \"xxx\", \"project_id\": \"xxx\", \"private_key_id\": \"xxx\", ....}' > google-api-key.json",
    "export GOOGLE_APPLICATION_CREDENTIALS=google-api-key.json"
]
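For reference, a hedged alternative that sidesteps the nested-quoting problem: keep the key file on the host and mount it into the container through the task's docker arguments (the host path here is an assumption):

from clearml import Task

task = Task.init(project_name="my-project", task_name="train")
# mount the host key file into the container and point the env var at it
# (assumes /home/user/google-api-key.json exists on the agent machine)
task.set_base_docker(
    docker_arguments="-v /home/user/google-api-key.json:/root/google-api-key.json "
                     "-e GOOGLE_APPLICATION_CREDENTIALS=/root/google-api-key.json",
)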
Heyy guys, I was able to run the pipeline using the autoscaler, thanks to @<1523701070390366208:profile|CostlyOstrich36> @<1523701087100473344:profile|SuccessfulKoala55> for all your help and suggestions!!
So what I want to do is import the custom packages into my remote execution
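If the custom packages are pip-installable, one hedged option is to declare them per pipeline step, e.g. via the packages argument of add_function_step (names and the package list are placeholders):

from clearml import PipelineController

def train_step():
    # placeholder body; the real step code would live in your repo
    import ultralytics  # noqa: F401

pipe = PipelineController(name="yolo-pipeline", project="yolo", version="1.0")
# each function step declares the packages its remote venv should install
pipe.add_function_step(
    name="train",
    function=train_step,
    packages=["ultralytics", "opencv-python-headless"],
)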
@<1523701205467926528:profile|AgitatedDove14> I was able to resolve that, but now I am having issues with fiftyone, it's showing me the following error
import fiftyone as fo
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/fiftyone/__init__.py", line 25, in <module>
    from fiftyone.public import *
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/fiftyone/public.py", line 15, in <module>
    _foo.establish_db_conn(config)
  File "/root/.clearml...
And one more thing: is there a way to make changes to the .bashrc which is present inside the docker container?
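One hedged way to do that: the setup bash script attached to a task runs inside the container before the task starts, so it can append to .bashrc (the exported variable is just an example):

from clearml import Task

task = Task.init(project_name="my-project", task_name="train")
# each line runs inside the container at startup, before the task body
task.set_base_docker(
    docker_setup_bash_script=[
        "echo 'export GOOGLE_APPLICATION_CREDENTIALS=/root/google-api-key.json' >> /root/.bashrc",
    ],
)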
Ok I was able to resolve the above issue, but now I am getting the following error while executing a task
import cv2
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/cv2/__init__.py", line 181, in <module>
    bootstrap()
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/cv2/__init__.py", line 153, in bootstrap
    native_module = importlib.import_module("cv2")
  File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _boots...
I provided the credentials while setting up the autoscaler instance; where can I look for the clearml.conf? When I ssh into the instance spun up by the autoscaler, I am not able to see the clearml.conf.
Ok, I'll try that out; enable_git_ask_pass: true is not working.
Just a follow up on this issue, @<1523701087100473344:profile|SuccessfulKoala55> @<1523701205467926528:profile|AgitatedDove14> I would very much appreciate it if you could help me with this.
Let me know if this is enough information or not
I have a pipeline which I am able to run locally. The pipeline has a pipeline controller along with 4 tasks: download data, training, testing, and predict. How do I execute this whole pipeline remotely so that each task is executed sequentially?
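For reference, a minimal controller sketch for that layout (project and task names are placeholders): chaining each step through parents forces sequential execution, and start() hands the controller itself to an agent queue:

from clearml import PipelineController

pipe = PipelineController(name="yolo-pipeline", project="yolo", version="1.0")
pipe.set_default_execution_queue("default")  # agents on this queue run the steps

# each step clones an existing base task; parents=[...] enforces the order
pipe.add_step(name="download_data", base_task_project="yolo", base_task_name="download data")
pipe.add_step(name="training", parents=["download_data"], base_task_project="yolo", base_task_name="training")
pipe.add_step(name="testing", parents=["training"], base_task_project="yolo", base_task_name="testing")
pipe.add_step(name="predict", parents=["testing"], base_task_project="yolo", base_task_name="predict")

pipe.start(queue="services")  # the controller runs remotely as its own task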
So for my project I have a dataset present on my local system. When I am running the pipeline remotely, is there a way the remote machine can access it?
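One way (a sketch, with placeholder names and paths) is to register the local folder as a ClearML dataset once, so any remote machine can pull it by name:

from clearml import Dataset

# one-off, on the machine that has the data
ds = Dataset.create(dataset_name="yolo-images", dataset_project="yolo")
ds.add_files(path="/Users/adityachaudhry/data/images")
ds.upload()
ds.finalize()

# later, inside a remotely executed step
data_dir = Dataset.get(dataset_name="yolo-images", dataset_project="yolo").get_local_copy()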
Is there a way to work around this?
When I am running the pipeline remotely, I am getting the following error message:
There appear to be 6 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
And also, I have a requirements file which I want installed when I run the pipeline remotely.
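A hedged sketch for that: as far as I know, Task.add_requirements accepts a path to a requirements file when called before Task.init, so the agent installs it into the remote venv (the path is an assumption):

from clearml import Task

# must run before Task.init for the agent to pick the file up
Task.add_requirements("requirements.txt")
task = Task.init(project_name="yolo", task_name="train")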
I am able to run the pipeline locally though
individual steps are failing
So inside /Users/adityachaudhry/.clearml/venvs-builds.1/3.10/task_repository/ I have my git repo. I have one component that makes a dataset directory inside this git repo, but when the other component starts executing, this dataset directory is not there.
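Since each component can run on a different agent with its own fresh clone of the repo, files written to the working directory in one step are not visible to the next; a sketch of handing the directory across steps as an artifact instead (names and the task id are placeholders):

from clearml import Task

# step A: after building dataset/, register the folder as an artifact
task_a = Task.current_task()
task_a.upload_artifact(name="dataset_dir", artifact_object="dataset/")

# step B: pull step A's artifact into this agent's local cache
producer = Task.get_task(task_id="<step-A-task-id>")
dataset_dir = producer.artifacts["dataset_dir"].get_local_copy()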