So what I want to do is import the custom packages into my remote execution environment
So for my project I have a dataset on my local system; when I run the pipeline remotely, is there a way the remote machine can access it?
I am able to run the pipeline locally though
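Something like this sketch is what I have in mind, assuming the standard clearml Dataset API (the dataset/project names and the path are placeholders):

```python
from clearml import Dataset

# on my local machine: register the local dataset so a remote machine can fetch it
ds = Dataset.create(dataset_name="my_dataset", dataset_project="my_project")
ds.add_files(path="/path/to/local/dataset")
ds.upload()
ds.finalize()

# inside the remote task: pull a local copy from the ClearML server / cache
ds = Dataset.get(dataset_name="my_dataset", dataset_project="my_project")
local_path = ds.get_local_copy()
```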
I have a pipeline which I am able to run locally; the pipeline has a pipeline controller along with 4 tasks: download data, training, testing, and predict. How do I execute this whole pipeline remotely so that each task runs sequentially?
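For reference, this is roughly what I think the controller should look like -- a minimal sketch assuming the four steps already exist as tasks (project/task names are placeholders):

```python
from clearml.automation import PipelineController

pipe = PipelineController(name="my_pipeline", project="my_project", version="1.0.0")
pipe.set_default_execution_queue("default")  # queue the step tasks are sent to

# chain the 4 existing tasks sequentially via `parents`
pipe.add_step(name="download_data", base_task_project="my_project",
              base_task_name="download data")
pipe.add_step(name="training", base_task_project="my_project",
              base_task_name="training", parents=["download_data"])
pipe.add_step(name="testing", base_task_project="my_project",
              base_task_name="testing", parents=["training"])
pipe.add_step(name="predict", base_task_project="my_project",
              base_task_name="predict", parents=["testing"])

# enqueue the controller itself; an agent picks it up and enqueues each step in order
pipe.start(queue="services")
```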
All I need to do is
pip install -r requirements.txt
pip install .
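Right now I'm assuming something like this at the top of each step's script would make the agent install from my requirements.txt (just a sketch; I'm not certain Task.force_requirements_env_freeze is the right call, and as far as I can tell it has to run before Task.init):

```python
from clearml import Task

# ask the agent to install from my requirements file instead of the
# auto-detected package list (assumption: must be called before Task.init)
Task.force_requirements_env_freeze(force=True, requirements_file="requirements.txt")

task = Task.init(project_name="my_project", task_name="training")
```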
@<1523701205467926528:profile|AgitatedDove14> I was able to resolve that, but now I am having issues with fiftyone; it's showing me the following error:
import fiftyone as fo
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/fiftyone/__init__.py", line 25, in <module>
    from fiftyone.public import *
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/fiftyone/public.py", line 15, in <module>
    _foo.establish_db_conn(config)
  File "/root/.clearml...
So I am running a pipeline (using tasks) remotely, and one of my tasks imports stuff from one of my local repositories, but it's giving me an error when I run the pipeline remotely
And I also have a requirements file which I want installed when I run the pipeline remotely
My git repo only contains the hash-ids which are used to download the dataset into my local machine
So I should clone the pipeline, run the agent and then enqueue the cloned pipeline?
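i.e. something along these lines, assuming an agent is already listening on the queue (e.g. clearml-agent daemon --queue services); the task id is a placeholder for the original pipeline controller:

```python
from clearml import Task

# clone the existing pipeline controller task and enqueue the clone for the agent
original = Task.get_task(task_id="...")  # id of the original pipeline controller, from the UI
cloned = Task.clone(source_task=original, name="cloned pipeline run")
Task.enqueue(cloned, queue_name="services")
```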
individual steps are failing
One more thing: in my git repo there is a dataset folder that contains hash-ids, and these hash-ids are used to download the dataset. When I am running the pipeline remotely, the files/images are downloaded into the cloned git repo inside .clearml/venvs, but when I check inside that venvs folder there are no images present.
Is there a way to change the paths inside the .txt file to the ClearML cache, because my images are stored only in the ClearML cache?
I am able to get the requirements installed for each task
Can you explain how running two agents would help me run the whole pipeline remotely? Sorry if it's a very basic question
Is there a way to clone the whole pipeline, just like we clone tasks?
The issue I am facing is that when I do get_local_copy() the dataset (used for training YOLOv8) is downloaded into the ClearML cache (my image dataset contains images, labels, .txt files which hold the paths to the images, and a .yaml file). The downloaded .txt files show the image files as being inside the git repo cloned under the ClearML venvs folder, but that path doesn't actually exist, and it is giving me an error
Is there a way to work around this?
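The only workaround I can think of is something like this sketch: pull a writable copy of the dataset and rewrite the prefixes inside the .txt files so they point at that copy (dataset/project names and the old prefix are placeholders for my actual values):

```python
from pathlib import Path
from clearml import Dataset

# pull a writable copy of the dataset (get_local_copy() is a read-only cache copy)
ds = Dataset.get(dataset_name="my_dataset", dataset_project="my_project")
local_root = Path(ds.get_mutable_local_copy("dataset_copy"))

# the non-existent prefix the .txt files currently point to (placeholder)
old_prefix = "/root/.clearml/venvs-builds/3.8/task_repository/my_repo/dataset"

# rewrite every .txt path list so the image paths point at the writable copy
for txt_file in local_root.rglob("*.txt"):
    content = txt_file.read_text()
    txt_file.write_text(content.replace(old_prefix, str(local_root)))
```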
When I am running the pipeline remotely, I am getting the following error message:
There appear to be 6 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
I want to know how to execute pip install . so that all the custom packages can be imported
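For context, what I mean by custom packages is a repo with a minimal setup.py along these lines (the package name is a placeholder), so running pip install . in the agent's venv would make them importable:

```python
# setup.py at the repo root
from setuptools import setup, find_packages

setup(
    name="my_custom_packages",   # placeholder name for my local package
    version="0.1.0",
    packages=find_packages(),    # picks up the custom packages in the repo
)
```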
I want to understand what's happening at the backend. I want to know how running the pipeline logic and the tasks on separate agents is going to sync everything up
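My current mental model is the sketch below: the controller task ends up in one queue and each step task in another, so one agent per queue keeps things moving (the queue names are the defaults I'm assuming; project/task names are placeholders):

```python
from clearml.automation import PipelineController

# agent 1: clearml-agent daemon --queue services -> runs the pipeline logic (controller)
# agent 2: clearml-agent daemon --queue default  -> runs the individual step tasks
pipe = PipelineController(name="my_pipeline", project="my_project", version="1.0.0")
pipe.set_default_execution_queue("default")   # step tasks are enqueued here

pipe.add_step(name="download_data", base_task_project="my_project",
              base_task_name="download data")
pipe.add_step(name="training", base_task_project="my_project",
              base_task_name="training", parents=["download_data"])

pipe.start(queue="services")                  # the controller itself is enqueued here
```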