Nice, I’ll try this out
TimelyPenguin76 As far as I remember, I closed all datasets right after uploading the data to the ClearML server.
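For context, this is roughly how I create and close the datasets (project/name/path are placeholders):
```python
from clearml import Dataset

# create a dataset, upload its files, then "close" it by finalizing
dataset = Dataset.create(dataset_project='my-project', dataset_name='my-dataset')
dataset.add_files('./data')   # path is a placeholder
dataset.upload()              # push the files to the ClearML server / fileserver
dataset.finalize()            # closes the dataset right after the upload
```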
I published the task through the web UI.
Yes, basically like this
Additional information
When I leave the packages argument at its default value (None) and use debug_pipeline to run the pipeline, everything works as expected.
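Roughly, the working debug setup looks like this (project and step names are placeholders):
```python
from clearml import TaskTypes
from clearml.automation.controller import PipelineDecorator

# packages left at its default (None), so requirements are auto-detected
@PipelineDecorator.component(cache=False, task_type=TaskTypes.data_processing)
def step_1():
    import os
    return os.getcwd()

@PipelineDecorator.pipeline(name='my-pipeline', project='my-project', version='0.0.1')
def run_pipeline():
    print(step_1())

if __name__ == '__main__':
    PipelineDecorator.debug_pipeline()  # run all steps locally, in-process
    run_pipeline()
```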
CostlyOstrich36 Sorry, I don’t understand what you meant by “with the same file”.
I face the same problem.
When running the pipeline, some tasks that use multiprocessing never complete.
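A minimal sketch of the kind of step that hangs for me (the actual work is replaced with a trivial map):
```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(cache=False)
def step_with_multiprocessing():
    # imports live inside the component so the step is self-contained
    from multiprocessing import Pool
    with Pool(4) as pool:
        results = pool.map(abs, range(-5, 5))  # trivial stand-in for the real work
    return results
```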
CostlyOstrich36
Great to hear that!
In short, we hope the ClearML server can act as a bridge connecting local servers and cloud infrastructure (local servers for development, cloud for deployment and monitoring).
For example,
- We want to deploy ClearML somewhere on the Internet.
- Then use this service to track experiments, orchestrate workflows, etc., on our local servers.
- After the experiments finish, we retrieve the returned artifacts and save them somewhere, e.g. on local disk or in the cloud.
- ...
```python
...
@PipelineDecorator.component(
    parents=['step_1'],
    packages='../requirements.txt',
    cache=False,
    task_type=TaskTypes.data_processing,
    repo='.',
)
def step_2():
    import os
    ...
```
CostlyOstrich36
AgitatedDove14 GreasyPenguin14 Awesome!
Awesome! Thanks! SuccessfulKoala55
CostlyOstrich36 I haven’t tried it.
The error above occurs when I try to build a pipeline with the decorator.
AgitatedDove14 Nice! I’ll try this out
CostlyOstrich36 Maybe because I don’t clearly understand how ClearML works.
When I use PipelineV1 (built on Task), by default all artifacts are uploaded to the ClearML fileserver if I configure output_uri=True. If I want to use an S3 bucket instead, I must “point” output_uri to the URI of that bucket.
Back in PipelineV2, I cannot find “a place” where I could put my S3 bucket URI.
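To illustrate, this is the PipelineV1-style behaviour I’m describing (project/bucket names are placeholders):
```python
from clearml import Task

# by default (output_uri=True) artifacts go to the ClearML fileserver:
task = Task.init(project_name='my-project', task_name='step-1', output_uri=True)

# ...whereas to use S3 I point output_uri at the bucket instead:
# task = Task.init(project_name='my-project', task_name='step-1',
#                  output_uri='s3://my-bucket/clearml-artifacts')
```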
AgitatedDove14 Sorry for the confusing question. I mean that I cannot use relative imports inside the “wrapping” function.
In detail, my project has this directory structure:
└── project
    ├── package1
    │   ├── build_pipeline.py
    │   ├── module1.py
    │   └── module2.py
    └── package2
        ├── __init__.py
        ├── module3.py
        ├── module4.py
        └── subpackage1
            └── module5.py
From build_pipeline.py, inside each “wrapping” function, I cannot import module...
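Concretely, this is the kind of import that fails inside a component (function and import names are simplified placeholders):
```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(cache=False)
def step_1():
    # fails when the step is executed as a standalone task, because
    # package2 is not available in the component's environment
    from package2.module3 import some_helper  # "some_helper" is a placeholder
    return some_helper()
```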
AgitatedDove14 This makes sense. I hope we can have this argument in the next ClearML version 😄
@<1523701205467926528:profile|AgitatedDove14> do you have any documentation and/or instructions for doing so?
@<1523701205467926528:profile|AgitatedDove14> The only reason I want to change the destination is an unforeseen mistake in the past. Now I must change the old destination (a private IP address) of my past datasets to the new alias (labserver-2) to be able to download them using the Python script.
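For reference, the download script is essentially just this (project/name are placeholders); it fails because the stored URL still points at the old private IP:
```python
from clearml import Dataset

dataset = Dataset.get(dataset_project='my-project', dataset_name='my-dataset')
local_path = dataset.get_local_copy()  # resolves the stored (old) destination URL
print(local_path)
```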
I’ll try to reproduce this scenario to confirm that no problem occurs during the generation of these datasets.
So, I have to package all the modules first and find a way to install that package at the beginning of pipeline execution to be able to use these modules, am I right?
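Something like the following is what I have in mind, assuming the packages argument accepts a pip-installable path to a built wheel (package name and version are placeholders):
```python
from clearml.automation.controller import PipelineDecorator

# assuming the project was first built into a wheel, e.g.
#   python -m build   ->   dist/project-0.1.0-py3-none-any.whl
@PipelineDecorator.component(
    cache=False,
    packages=['./dist/project-0.1.0-py3-none-any.whl'],  # placeholder wheel path
)
def step_1():
    from package2.module3 import some_helper  # placeholder, importable once installed
    return some_helper()
```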