SmugDolphin23
Moderator
0 Questions, 418 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
0 Hello All, I Want To Clarify Something. In The

I think we should just have a new parameter

10 months ago
0 Hello All, I Want To Clarify Something. In The

No need, I think I will review it on Monday

10 months ago
0 Hi, I Am Struggling For Following Points. 1. Trying To Update Model Metadata Through

Hi @<1654294820488744960:profile|DrabAlligator92> ! The way chunk size works is:
the upload will try to create zips that are smaller than the chunk size, so it keeps adding files to the same zip until the chunk size is exceeded. Once the chunk size is exceeded, a new chunk (zip) is created. The first file in the new chunk is the file that caused the previous chunk's size to be exceeded (regardless of the fact that the file itself might exceed the size).
So in your case: an empty chunk is creat...
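For reference, a minimal sketch of where that chunk size is set, assuming a local data/ folder and an illustrative 100 MB chunk size (dataset name and project are placeholders):

from clearml import Dataset

ds = Dataset.create(dataset_name="chunking-demo", dataset_project="examples")
ds.add_files("data/")
# files are zipped together until a zip exceeds chunk_size (MB); the file that pushed it
# over the limit starts the next chunk, even if that file alone is larger than the limit
ds.upload(chunk_size=100)
ds.finalize()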

one year ago
0 I Tried Using

Hi @<1523708920831414272:profile|SuperficialDolphin93> ! What if you just do controller.start() (to start it locally)? The task should not quit in this case.
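A sketch of that suggestion, assuming the controller and its steps are defined elsewhere (the pipeline name, project and version here are hypothetical):

from clearml import PipelineController

controller = PipelineController(name="demo-pipeline", project="examples", version="1.0.0")
# ... add steps with controller.add_function_step(...) ...

# start the controller as suggested above; the surrounding process should not quit here
controller.start()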

one month ago
0 For Some Reason, When I Try To Load A Dataset (Dataset.Get), Method _Query Task Is Called And This Method Try To Call _Send Method Of Interfacebase Class. This Method May Return None And This Case Is Not Handled By The _Query_Task Method That Tries To Rea

Hello MotionlessCoral18. I have a few questions that might help us find out why you experience this problem:
Is there any chance you are running the program in offline mode?
Is there any other message being logged that might help? The error messages might include: Action failed, Failed sending, Retrying, previous request failed, contains illegal schema.
Are you able to connect to the backend at all from the program in which you are trying to get the dataset?
Thank you!
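For reference, a quick way to rule out offline mode from code (a sketch; the project and task names are placeholders):

from clearml import Task

# make sure the SDK is not running in offline mode before creating the task
Task.set_offline(offline_mode=False)
task = Task.init(project_name="examples", task_name="connectivity-check")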

2 years ago
0 Hi Team, I Am Trying To Run A Pipeline Remotely Using Clearml Pipeline And I’M Encountering Some Issues. Could Anyone Please Assist Me In Resolving Them?

@<1626028578648887296:profile|FreshFly37> can you please screenshot this section of the task? Also, what does your project's directory structure look like?

11 months ago
0 Hello, For Some Reason My Upload Speed To S3 Is Insanely Slow, I Noticed In Logs That It Uploads To /Tmp Folder. What Does That Mean? Why Tmp?

Hi @<1590514584836378624:profile|AmiableSeaturtle81> ! What function are you using to upload the data?

8 months ago
0 Hey, Just A Quick Question. I'M Trying To Create A Pipeline And In One Step I'M Passing A Model From The Previous Step. Is It Possible To Get Model By Name And Not By Index. More Concretely I Can Do

@<1531445337942659072:profile|OddCentipede48> Looks like this is indeed not supported. What you could do is return the ID of the task that returns the models, then use Task.get_task and get the model from there. Here is an example:

from clearml import PipelineController


def step_one():
    from clearml import Task
    from clearml.binding.frameworks import WeightsFileHandler
    from clearml.model import Framework

    WeightsFileHandler.create_output_model(
        "obj", "file...
one year ago
0 Can Steps Be Removed From Pipelines, And/Or Can Pipelines Be Generally Modified Other Than Adding Steps To Them?

@<1523701083040387072:profile|UnevenDolphin73> are you composing the code you want to execute remotely by copy-pasting it from various cells into one standalone cell?

11 months ago
0 Hey All, Hope You'Re Having A Great Day, Having An Unexpected Behavior With A Training Task Of A Yolov5 Model On My Pipeline, I Specified A Task In My Training Component Like This:

FierceHamster54
"initing the task before the execution of the file like in my snippet is not sufficient?"
It is not, because os.system spawns a whole different process than the one you initialized your task in, so no patching is done on the framework you are using. Child processes need to call Task.init because of this, unless they were forked, in which case the patching is already done.
"But the training.py has already a ClearML task created under the hood since its integratio...

2 years ago
0 Hey Everyone, As A Pro-Tier Saas User, I'M Experiencing A Very High Latency When Finalizing A Dataset, It Is Attached In A Big Dataset Version Hierarchy And Since Recently The

Hi @<1523702000586330112:profile|FierceHamster54> ! Looks like we pull all the ancestors of a dataset when we finalize. I think this can be optimized. We will keep you posted when we make some improvements

one year ago
0 Hello, Is There A Way To Disable Dataset Caching So That When

FreshParrot56 You could modify this entry in your clearml.conf to point to your drive: sdk.storage.cache.default_base_dir.
Or, if you don't want to touch your conf file, you could set the env var CLEARML_CACHE_DIR to your remote drive before you call get_local_copy. See this example:

import os
from clearml import Dataset

dataset = Dataset.get(DATASET_ID)
os.environ["CLEARML_CACHE_DIR"] = "/mnt/remote/drive"  # change the clearml cache, make it point to your remote drive
copy_path = dataset.get_loc...
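For the first option, a clearml.conf sketch (the path is illustrative):

sdk {
    storage {
        cache {
            default_base_dir: "/mnt/remote/drive/clearml_cache"
        }
    }
}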

2 years ago
0 Are There Any Resources On How I Can Implement Hyperparameter Optimisation Using Ray Tune On Clearml?

Hi @<1581454875005292544:profile|SuccessfulOtter28> ! You could take a look at how the HPO was built using optuna: None .
Basically: you should create a new class which inherits from SearchStrategy. This class should convert clearml hyper_parameters to parameters that Ray Tune understands, then create a Tuner and run the Ray Tune hyperparameter optimization.
The function Tuner will optim...
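For orientation only, a sketch of the Ray Tune side that such a strategy would end up driving (the objective and search space are made up; assumes ray[tune] is installed):

from ray import tune

def objective(config):
    # dummy objective; returning a dict reports the final metrics to Ray Tune
    return {"accuracy": 1.0 - config["lr"]}

tuner = tune.Tuner(
    objective,
    param_space={"lr": tune.loguniform(1e-4, 1e-1)},
    tune_config=tune.TuneConfig(metric="accuracy", mode="max", num_samples=8),
)
results = tuner.fit()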

8 months ago
0 Hello All! Is It Possible To Utilize Shared Memory In Clearml For Tasks Like Model Inference, Where Instead Of Transferring Images Over The Network (E.G., Http, Rpc), We Use A Shared Memory Extension? Please Refer To The Link Below:

Hi @<1657918706052763648:profile|SillyRobin38> ! If it is compatible with http/rest, you could try setting api.files_server to the endpoint or sdk.storage.default_output_uri in clearml.conf (depending on your use-case).
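A clearml.conf sketch for the files_server option (the endpoint is hypothetical):

api {
    files_server: "http://shm-gateway:8081"
}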

9 months ago
0 Hi, Is There A General Github Actions Workflow Just To Login Into Your Clearml App (Demo Or Server) So I Can Run Python Files Related To Clearml. I'Ve Seen Clearml-Actions-Train-Model And Clearml-Actions-Get-Stats And They Seem To Be Very Specific. Maybe

Indeed, pipelines that were started with pipe.start_locally cannot be cloned and run. We will change this behaviour ASAP so that you can use just 1 queue for your use case.

2 years ago
0 Cannot Upload A Dataset With A Parent - Seems Very Odd! Clearml Versions I Tried: 1.6.1, 1.6.2 Scenario: * Create Parent Dataset (With Storage On S3) * Upload Data * Close Dataset * Create Child Dataset (Tried With Storage On Both S3 Or On Clearml Serv

Hi RoughTiger69 ! Can you try adding the files using a python script such that we could get an exception traceback, something like this:
from clearml import Dataset

# or just use the ID of the dataset you previously created instead of creating a new one
parent_dataset = Dataset.create(dataset_name="xxxx", dataset_project="yyyyy", output_uri=" ")
parent_dataset.add_files("folder1")
parent_dataset.upload()
parent_dataset.finalize()

child_dataset = Dataset.create(dataset_name="xxxx", dat...

2 years ago
0 Hi, I Have An Issue, But Lets Start With The Description. This Is Snippet Of My Project'S Structure:

@<1554638160548335616:profile|AverageSealion33> Can you run the script with HYDRA_FULL_ERROR=1? Also, what happens if you run the script without clearml? Do you get the same error?

one year ago
0 Hi All, I Observed That When I Get A Dataset With

SmallGiraffe94 You should use dataset_version=2022-09-07 (not version=...). This should work for your use-case.
Dataset.get shouldn't actually accept a version kwarg, but it does because it accepts some **kwargs used internally. From now on, we will make sure to warn users when they pass values through **kwargs.
Anyway, this issue still exists, but in another form:
Dataset.get can't get datasets with a non-semantic version, unless the version is sp...
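A minimal sketch of that call (the project and dataset name are placeholders):

from clearml import Dataset

ds = Dataset.get(
    dataset_project="examples",
    dataset_name="my-dataset",
    dataset_version="2022-09-07",  # note: dataset_version, not version
)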

2 years ago
0 Hi All

That's unfortunate. Looks like this is indeed a problem 😕 We will look into it and get back to you.

one year ago
0 Hey There, I Am A New User Of Clearml And Really Enjoying It So Far! I Noticed That My Model Checkpoints Are Saved After Each Epoch. Instead I Would Like To Only Save The Best And Last Model Checkpoint. Is That Possible? I Could Not Find Something Regardi

Hi @<1547390464557060096:profile|NuttyKoala57> ! You can use wildcards in auto_connect_frameworks to filter your models. Check the docs under init: None. You might also want to check out this GH thread for another way to do this: None
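A sketch of the wildcard filtering mentioned above (the framework key and checkpoint patterns are illustrative):

from clearml import Task

task = Task.init(
    project_name="examples",
    task_name="checkpoint-filtering",
    # only auto-log PyTorch checkpoint files matching these patterns
    auto_connect_frameworks={"pytorch": ["best*.pt", "last*.pt"]},
)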

one year ago
0 Hello! I Can'T Seem To Be Able To Stop Clearml From Automatically Logging Model Files (Optimizer, Scheduler). It'S A Useful Feature But I'D Like To Have Some Control Over It, So That The Disk Space In My File Storage Isn'T Overused. I'M Using

Hi @<1523701345993887744:profile|SillySealion58> ! We allow finer grained control over model uploads. Please refer to this GH thread for an example on how to achieve that: None

6 months ago
0 I Configured S3 Storage In My Clearml.Conf File On A Worker Machine. Then I Run Experiment Which Produced A Small Artifact And It Doesn'T Appear In My Cloud Storage. What Am I Doing Wrong? How To Make Artifacts Appear On My S3 Storage? Below Is A Sample O

Hi again, @<1526734383564722176:profile|BoredBat47> ! I actually took a closer look at this. The config file should look like this:

        s3 {
            key: "KEY"
            secret: "SECRET"
            use_credentials_chain: false

            credentials: [
                {
                    host: "myendpoint:443"  # no http(s):// and no s3:// prefix, also no bucket name
                    key: "KEY"
                    secret: "SECRET"
                    secure: true  # ...
one year ago
0 Hi There. In A Clearml Pipeline Step With Docker, I Specify The Git Repo And Branch I Want To Use. How Can I Also Specify A Repos Optional Dependecies? It Uses Poetry For Deendency Management

Hi @<1688721797135994880:profile|ThoughtfulPeacock83> ! Make sure you set agent.package_manager.type: poetry in your clearml.conf. If you do, the poetry.lock or pyproject.toml will be used to install the packages. See None
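A clearml.conf sketch for the agent machine:

agent {
    package_manager {
        type: poetry
    }
}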

9 months ago