SmugDolphin23
Moderator
0 Questions, 418 Answers
Active since 10 January 2023
Last activity 2 years ago

Reputation: 0
Hi! PipelineController has method:

Hi @<1523701240951738368:profile|RoundMosquito25>! Yes, you should be able to do that.

one year ago
Hi all

Hi @<1523701523954012160:profile|ShallowCormorant89>! This is not really supported, but you could use continue_on_fail to make sure you get to your last step.
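A minimal sketch of what that could look like (the project, task, and step names here are made up):

from clearml import PipelineController

pipe = PipelineController(name="my-pipeline", project="examples", version="1.0.0")
# If this step fails, the pipeline moves on instead of aborting
pipe.add_step(
    name="flaky_step",
    base_task_project="examples",
    base_task_name="flaky task",
    continue_on_fail=True,
)
# The last step still runs even if flaky_step failed
pipe.add_step(
    name="last_step",
    parents=["flaky_step"],
    base_task_project="examples",
    base_task_name="final task",
)
pipe.start()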

one year ago
Hi everyone, weird problem with Dataset.get_local_copy (both from SDK and from clearml-data): I have a dataset with a single file and lots of S3 links. Used to work perfectly until those files started becoming larger (or it is just a matter of bad timing…

Hi @<1523705721235968000:profile|GrittyStarfish67>! This looks like a boto3 error. You could try lowering sdk.aws.s3.boto3.max_multipart_concurrency in clearml.conf and setting max_workers=1 when calling Dataset.get_local_copy.
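A sketch of both changes together (the dataset ID is a placeholder):

# In clearml.conf (illustrative value):
# sdk.aws.s3.boto3.max_multipart_concurrency: 1

from clearml import Dataset

dataset = Dataset.get(dataset_id="<your_dataset_id>")
# max_workers=1 fetches the chunks one at a time
local_path = dataset.get_local_copy(max_workers=1)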

one year ago
Hi there, is there a way to upload/connect an artifact to a certain running/completed task, using a different scope other than the one that's running? (I mean, instead of using Task.upload_artifact, use Task.get_tasks(task_id=<some_task_id>) and then use this…

Hi @<1539417873305309184:profile|DangerousMole43>! You need to mark the task you want to upload an artifact to as running. You can use task.mark_started(force=True) to do so, then mark it back as completed using task.mark_completed(force=True).
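Put together, that flow might look like this (the task ID and artifact are placeholders):

from clearml import Task

task = Task.get_task(task_id="<some_task_id>")
task.mark_started(force=True)  # re-open the completed task
task.upload_artifact(name="my_artifact", artifact_object={"foo": "bar"})
task.flush(wait_for_uploads=True)  # assumption: wait until the upload finishes
task.mark_completed(force=True)  # set the status back to completed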

9 months ago
Hey guys! I would love to know how to integrate HPO inside ClearML pipelines. I have made a continuous learning pipeline with data ETL and model training, and as a next step it would be cool to add HPO. Most of the examples on the website create a new ta…

Hi @<1676400486225285120:profile|GracefulSquid84>! Each step is indeed a ClearML task. You could try using the step ID. Just make sure you pass the ID to the HPO step (you can do that by simply returning Task.current_task().id).
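One possible shape for the HPO step, assuming the training step returned its task ID and that the hyperparameter and metric names below actually exist in the training task:

from clearml.automation import HyperParameterOptimizer, UniformParameterRange

def run_hpo(training_task_id):
    optimizer = HyperParameterOptimizer(
        base_task_id=training_task_id,  # clone the training step's task
        hyper_parameters=[UniformParameterRange("General/lr", 0.0001, 0.1)],
        objective_metric_title="validation",
        objective_metric_series="accuracy",
        objective_metric_sign="max",
        max_number_of_concurrent_tasks=2,
    )
    optimizer.start_locally()
    optimizer.wait()
    optimizer.stop()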

10 months ago
I'm wondering if I've run into a bug, or am not understanding something correctly. In a…

Hi SteadySeagull18! The docs are correct.
How do you run the pipeline controller: remotely, locally, or locally with an agent? If you run os.path.exists(model.url[len("file://"):]), does it return True?
Can you provide a minimal example that could help us reproduce the issue?
Thank you

2 years ago
Hey all, hope you're having a great day. Having an unexpected behavior with a training task of a YOLOv5 model on my pipeline, I specified a task in my training component like this:

FierceHamster54 I understand. I'm not sure why this happens then 😕. We will need to investigate this properly. Thank you for reporting this, and sorry for the time wasted training your model.

2 years ago
Hi all, I have a question regarding multiple parents: I have a pipe that runs on multiple datasets, and the last step does something on the bulk of those sets (the thing itself is not important). Sometimes one of the parents fails or is skipped due to a prev…

Hi @<1639799308809146368:profile|TritePigeon86>! Please see continue_behaviour. You should be able to pass the parameter to your parent step. It is not documented yet, but it should be available in the latest version of clearml.

7 months ago
Hello! I have a dataset on a /mnt share. When I try to get a local copy, the dataset on the share is deleted. Is this correct behaviour? This is how I get the dataset:

Hi DeliciousKoala34. I was able to reproduce your issue. I'm now looking for a solution for your problem. Thank you

2 years ago
Hey, we run a pipeline using the PipelineController, when we do it from…

Hi @<1544853695869489152:profile|NonchalantOx99>! In your clearml.conf, try to set, at the end of the whole file, outside any curly brackets: agent.package_manager.pip_version: "23.1.2"
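In other words, something like this at the very bottom of clearml.conf (the version string is taken from the answer above):

# ... end of the api { ... } and sdk { ... } sections ...
agent.package_manager.pip_version: "23.1.2"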

one year ago
Hi, bug report. I was trying to upload data to S3 via the clearml.Dataset interface

Perfect! Can you please provide the sizes of the files of the other 2 chunks as well?

2 years ago
Hey all, is there a way to upload a FiftyOne dataset as an artifact in a ClearML pipeline? I am getting the following error when I try to upload it

Hi @<1610083503607648256:profile|DiminutiveToad80>! You need to somehow serialize the object. Note that we try different serialization methods and default to pickle if none work. If pickle doesn't work, then the artifact can't be uploaded by default. But there is a way around it: you can serialize the object yourself. The recommended way to do this is using the serialization_function argument in upload_artifact. You could try using something like dill, which can serialize more objects.
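A hedged sketch of that workaround (the artifact name and object are placeholders):

import dill
from clearml import Task

my_dataset = object()  # stand-in for the object pickle could not handle

task = Task.current_task()
task.upload_artifact(
    name="fo_dataset",  # hypothetical artifact name
    artifact_object=my_dataset,
    serialization_function=dill.dumps,  # custom serializer instead of pickle
)

When reading the artifact back, a matching deserializer (e.g. dill.loads) would have to be supplied on retrieval.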

8 months ago
Hi, I have noticed that Dataset has started reporting my dataset head as a txt file in "Debug Samples -> Metric: Tables". Can I disable it? Thanks!

You're correct. There are 2 main entries in the conf file: api and sdk. The dataset entry should be under sdk.
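For orientation, the top-level layout looks roughly like this (the actual keys inside each section are omitted here):

api {
    # server addresses and credentials
}
sdk {
    dataset {
        # dataset-related settings belong here
    }
}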

2 years ago
Hi team, I am trying to run a pipeline remotely using ClearML Pipeline and I'm encountering some issues. Could anyone please assist me in resolving them?

@<1657556312684236800:profile|ManiacalSeaturtle63> What clearml SDK version are you using? I believe there was a bug related to pipelines not showing in the UI, but that was fixed in clearml==1.14.1

11 months ago
Hi, I have an issue when running a pipeline controller remotely in Docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

Hi @<1570220858075516928:profile|SlipperySheep79>! What happens if you do this:

import yaml
import argparse
from my_pipeline.pipeline import run_pipeline
from clearml import Task

parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)

if __name__ == '__main__':
    config = None  # default when no config file is parsed
    # When executed remotely, Task.current_task() already exists,
    # so the command-line config is only parsed on the local run
    if not Task.current_task():
        args = parser.parse_args()
        with open(args.config) as f:
            config = yaml.load(f, yaml.FullLoader)
    run_pipeline(config)
one year ago
Hey folks, trying to use the Model class from the ClearML SDK and seeing some weird errors. I am loading a model this way and trying to see a metadata value for the model object.

Hi @<1523701132025663488:profile|SlimyElephant79>! Looks like this is a bug on our part. We will fix this as soon as possible.

one year ago
0 I’M Trying To Understand The Execution Flow Of Pipelines When Translating From Local To Remote Execution. I’Ve Defined A Pipeline Using The

Hi @<1533620191232004096:profile|NuttyLobster9>! PipelineDecorator.get_current_pipeline will return a PipelineDecorator instance (which inherits from PipelineController) once the pipeline function has been called. So

pipeline = PipelineDecorator.get_current_pipeline()
pipeline(*args)

doesn't really make sense. You should likely call pipeline = build_pipeline(*args) instead.

9 months ago
Hi, I have noticed that Dataset has started reporting my dataset head as a txt file in "Debug Samples -> Metric: Tables". Can I disable it? Thanks!

HandsomeGiraffe70 your conf file should look something like this:
{
# ClearML - default SDK configuration

storage {
    cache {
        # Defaults to system temp folder / cache
        default_base_dir: "~/.clearml/cache"
        # default_cache_manager_size: 100
    }

    direct_access: [
        # Objects matching are considered to be available for direct access, i.e. they will not be downloaded
        # or cached, and any download request will ...
2 years ago
Hello, I have a question regarding the usage of…

Hi JumpyDragonfly13! Try using get_task_log instead of download_task_log.
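A small sketch of that call, assuming the APIClient from the backend API is used (the task ID is a placeholder):

from clearml.backend_api.session.client import APIClient

client = APIClient()
log = client.events.get_task_log(task="<task_id>")
for entry in log.events:
    print(entry)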

2 years ago
Why is async_delete not working?

@<1590514584836378624:profile|AmiableSeaturtle81> Weren't you using HTTPS for the S3 host? Maybe the issue has something to do with that?

10 months ago