GrittyKangaroo27
Moderator
12 Questions, 26 Answers
  Active since 10 January 2023
  Last activity 4 months ago

Reputation

0

Badges 1

24 × Eureka!
0 Votes 5 Answers 552 Views
I’ve played around with ClearML Data and spotted something weird. Basically, I’ve created 3 datasets: #1 has 9444 files added to it; #2 has #1 as parent and I’ve added ...
2 years ago
0 Votes 5 Answers 557 Views
Hi, I’m using PipelineDecorator.component to integrate the training task into a big pipeline. How could I turn off model logging when running this training ste...
2 years ago
0 Votes 4 Answers 536 Views
I have a question related to PipelineV2 (using decorator). Does PipelineV2 support uploading artifacts to S3 (instead of the fileserver)?
2 years ago
0 Votes 6 Answers 611 Views
2 years ago
0 Votes 1 Answer 584 Views
2 years ago
0 Votes 3 Answers 602 Views
2 years ago
0 Votes 13 Answers 506 Views
I get an AssertionError when passing my local requirements.txt file to packages argument of @PipelineDecorator.component() ... assert not packages or isinsta...
2 years ago
0 Votes 6 Answers 198 Views
Hi, I want to update the destination of some of my completed datasets. Is it possible to do so, and how? Thanks!
4 months ago
0 Votes 5 Answers 551 Views
Currently, I’m manually tracking multiple output models at the end of the training process model_paths = list(Path(checkpoint_dir).absolute().glob('*')) for ...
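The truncated loop above can be completed along these lines. The sketch below collects the checkpoint paths with the stdlib only; the ClearML registration step is shown as a comment, since the exact upload call (OutputModel and its arguments) is an assumption here, not something the thread confirms.

```python
from pathlib import Path
import tempfile

# Hypothetical checkpoint directory with a few saved weight files.
checkpoint_dir = Path(tempfile.mkdtemp())
for name in ("best.pt", "epoch_1.pt", "epoch_2.pt"):
    (checkpoint_dir / name).touch()

# Collect every checkpoint, as in the snippet above.
model_paths = sorted(Path(checkpoint_dir).absolute().glob("*"))

# Each path could then be registered with ClearML, e.g. (not executed here):
# from clearml import Task, OutputModel
# for p in model_paths:
#     OutputModel(task=Task.current_task(), name=p.stem).update_weights(
#         weights_filename=str(p))

print([p.name for p in model_paths])
```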
2 years ago
0 Votes 3 Answers 528 Views
Hi, our research team uses both local servers and cloud services to run an ML project. In detail, we do EDA, data preprocessing, conducting experiments and ot...
2 years ago
0 Votes 2 Answers 604 Views
2 years ago
0 Votes 4 Answers 629 Views
2 years ago
0 Hi, I’M Using

AgitatedDove14 This makes sense. Hope we can have this argument in the next ClearML version 😄

2 years ago
0 Hi, I’M Using

AgitatedDove14 Sure

2 years ago
0 Hi, Our Research Team Uses Both

CostlyOstrich36
Great to hear that!

In short, we hope ClearML server can act as the bridge to connect local servers and Cloud infrastructure. (Local servers for development and Cloud for deployment and monitoring.)

For example,

  • We want to deploy ClearML somewhere on the Internet.
  • Then use this service to track experiments, orchestrate workflow, etc. in our local servers.
  • After finishing the experiments, we retrieve the returned artifacts and save them somewhere, local disk or cloud for instance.
    -...
2 years ago
0 I Have A Question Related To Pipelinev2 (Using Decorator) Is Pipelinev2 Support Uploading Artifacts To S3 (Instead Of Fileserver)?

CostlyOstrich36 Maybe because I don’t clearly understand how ClearML works.

When I use PipelineV1 (using Task), by default all artifacts will be uploaded to the ClearML fileserver if I configure output_uri = True. If I want to use an S3 bucket instead, I must “point” output_uri to the URI of that bucket.

Back to PipelineV2, I cannot find “a place” where I could put my S3 bucket URI.
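The output_uri behaviour described here can be sketched as a small helper. The fileserver URL and bucket name below are illustrative assumptions; in real code the value is passed straight to clearml's Task.init(output_uri=...) rather than resolved locally.

```python
def resolve_output_uri(output_uri):
    """Mimic the PipelineV1 behaviour described above: True selects the
    default ClearML fileserver, a string selects an explicit storage URI."""
    if output_uri is True:
        return "https://files.clear.ml"  # illustrative default fileserver URL
    return output_uri

# In real code these values would go to clearml's Task.init(output_uri=...):
print(resolve_output_uri(True))
print(resolve_output_uri("s3://my-bucket/models"))
```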

2 years ago
0 I Get An

Additional information

When I leave the packages argument at its default None value and use debug_pipeline to run the pipeline, everything works as expected.

2 years ago
0 I Get An

Thanks!

2 years ago
0 I Get An

...
@PipelineDecorator.component(parents=['step_1'], packages='../requirements.txt',
                             cache=False, task_type=TaskTypes.data_processing,
                             repo='.')
def step_2():
    import os
    ...

CostlyOstrich36

2 years ago
0 I Get An

Hi CostlyOstrich36, do you have any update on this issue?

2 years ago
0 I Get An

It works with the full path.
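Assuming “full path” means resolving the relative requirements.txt to an absolute path before handing it to packages, a minimal sketch of that fix follows; the decorator itself is shown as a comment, since running it needs a ClearML setup, and the project directory is hypothetical.

```python
from pathlib import Path
import tempfile

# Hypothetical project directory with a requirements.txt next to the script.
project_dir = Path(tempfile.mkdtemp())
req = project_dir / "requirements.txt"
req.write_text("numpy\n")

# Resolve the relative path to an absolute one before passing it on:
packages = str(req.resolve())

# The component would then receive the absolute path (not executed here):
# @PipelineDecorator.component(parents=['step_1'], packages=packages,
#                              cache=False, repo='.')
# def step_2(): ...

print(Path(packages).is_absolute())
```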

2 years ago
0 I Get An

CostlyOstrich36 I haven’t tried.

The error above occurs when I try to build a pipeline with the decorator.

2 years ago
0 I Get An

CostlyOstrich36 Sorry, I don’t understand what you meant by “with same file”.

2 years ago
0 I’Ve Played Around With Clearml Data And Spotted Sth Weird Basically, I’Ve Created 3 Datasets

I’ll try to reproduce this scenario to confirm no problem occurs during the generation of these datasets.

2 years ago
0 I’Ve Played Around With Clearml Data And Spotted Sth Weird Basically, I’Ve Created 3 Datasets

TimelyPenguin76 As I remember, I’ve closed all datasets right after uploading the data to the ClearML server.

2 years ago
0 Is It Possible To Import User-Defined Modules When Wrapping Tasks/Steps With Functions And Decorators? As Far As I Know, When I Want To Define A Single “Step” In A Pipeline Using Function For Decorator, I Need To Import All Required Libs Inside This Wrapp

AgitatedDove14 Sorry for the confusing question. I mean I cannot use relative imports inside the “wrapping” function.

In detail, my project has this directory structure:
└── project
    ├── package1
    │   ├── build_pipeline.py
    │   ├── module1.py
    │   └── module2.py
    └── package2
        ├── __init__.py
        ├── module3.py
        ├── module4.py
        └── subpackage1
            └── module5.py

From build_pipeline.py, inside each “wrapping” function, I cannot import module...
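Since a decorated step is executed as a standalone script on the agent, outside its original package, relative imports inside the function body fail; only absolute imports work there. A minimal stdlib sketch of that pattern, where json stands in for a hypothetical package1.module1:

```python
def step_1():
    # Imports live inside the component body, and they must be absolute:
    # a relative import such as `from .module1 import helper` would fail
    # here, because the function runs as a standalone script without a
    # parent package.
    import json  # stdlib stand-in for e.g. `import package1.module1`
    return json.dumps({"ok": True})

print(step_1())
```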

2 years ago
0 Hi, I Want To Update The

@<1523701205467926528:profile|AgitatedDove14> Thanks, I'll give it a try

4 months ago
0 Hi, I Want To Update The

@<1523701205467926528:profile|AgitatedDove14> The only reason I want to change the destination is because of an unforeseeable mistake in the past. Now I must change the old destination (private IP address) of my past datasets to the new alias (labserver-2) to be able to download using the python script.

4 months ago
0 Hi, I Want To Update The

@<1523701205467926528:profile|AgitatedDove14> do you have any documents and/or instruction to do so?

4 months ago