SmugDolphin23
Moderator
0 Questions, 433 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
0 Hi, Working With Clearml 1.6.4 What Is The Correct Way To List All The

Hi OutrageousSheep60! The list_datasets function is currently broken and will be fixed in the next release.

3 years ago
0 Why Is Async_Delete Not Working?

What about this script? (Replace with your credentials, and comment out the credentials in clearml.conf for now.)

from clearml import Task
from clearml.storage.helper import StorageHelper

task = Task.init("test", "test")
task.setup_aws_upload(
    bucket="bucket1",
    host="localhost:9000",
    key="",
    secret="",
    profile=None,
    secure=True
)
helper = StorageHelper.get("")  # the bucket URI was elided in the original message
one year ago
0 Hi, I'Ve Three Questions Regarding Clearml Pipelines.

Hi @<1523701504827985920:profile|SubstantialElk6> !
Regarding 1: pth files get pickled.
The flow is like this:

  • The step is created by the controller by writing some code to a file and running that file in python
  • The following line is run in the step when returning values: None
  • This is eventually run: [None](https://github.com/allegroai/clearml/blob/cbd...
2 years ago
0 I Configured S3 Storage In My Clearml.Conf File On A Worker Machine. Then I Run Experiment Which Produced A Small Artifact And It Doesn'T Appear In My Cloud Storage. What Am I Doing Wrong? How To Make Artifacts Appear On My S3 Storage? Below Is A Sample O

Hi again, @<1526734383564722176:profile|BoredBat47> ! I actually took a closer look at this. The config file should look like this:

        s3 {
            key: "KEY"
            secret: "SECRET"
            use_credentials_chain: false

            credentials: [
                {
                    host: "myendpoint:443"  # no http(s):// and no s3:// prefix, also no bucket name
                    key: "KEY"
                    secret: "SECRET"
                    secure: true  # ...
2 years ago
0 Anyone Here With Any Idea Why My Service Tasks Get Aborted When Going To Sleep?

Hi @<1523701868901961728:profile|ReassuredTiger98> ! Looks like the task somehow gets run by both an agent and locally at the same time, so one of them is aborted. Any idea why this might happen?

2 years ago
0 Hello, Im Having Huge Performance Issues On Large Clearml Datasets How Can I Link To Parent Dataset Without Parent Dataset Files. I Want To Create A Smaller Subset Of Parent Dataset, Like 5% Of It. To Achieve This, I Have To Call Remove_Files() To 60K It

otherwise, you could run this as a hack:

        dataset._dataset_file_entries = {
            k: v
            for k, v in dataset._dataset_file_entries.items()
            if k not in files_to_remove  # you need to define this set of paths
        }

then call dataset.remove_files with a path that doesn't exist in the dataset.
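The hack above boils down to a plain dictionary comprehension over the file entries. Here is a self-contained sketch of just the filtering step, with toy stand-ins (the paths and values below are hypothetical, not real ClearML entries):

```python
# toy stand-ins for dataset._dataset_file_entries and the removal set
file_entries = {
    "images/a.png": "entry_a",
    "images/b.png": "entry_b",
    "labels/a.txt": "entry_c",
}
files_to_remove = {"images/b.png"}

# keep only the entries whose path is not scheduled for removal
file_entries = {k: v for k, v in file_entries.items() if k not in files_to_remove}
print(sorted(file_entries))  # ['images/a.png', 'labels/a.txt']
```

In the real hack, `files_to_remove` would hold the 60K paths you want dropped, and the comprehension replaces the per-file remove_files calls with a single pass.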

one year ago
0 Hi! Is There A Way To

Hi @<1523707653782507520:profile|MelancholyElk85> ! I don't think this is possible at the moment 😕 Feel free to open a GH issue that proposes this feature tho

2 years ago
0 Hi, Following

Hi HandsomeGiraffe70! We found the cause of this problem; we will release a fix ASAP.

3 years ago
0 Hi, Bug Report. I Was Trying To Upload Data To S3 Via Clearml.Dataset Interface

Perfect! Can you please provide the sizes of the files of the other 2 chunks as well?

3 years ago
0 What Sort Of Integration Is Possible With Clearml And Sagemaker? On The Page

Hi @<1532532498972545024:profile|LittleReindeer37> @<1523701205467926528:profile|AgitatedDove14>
I got the session with a bit of "hacking".
See this script:

import boto3, requests, json
from urllib.parse import urlparse

def get_notebook_data():
    log_path = "/opt/ml/metadata/resource-metadata.json"
    with open(log_path, "r") as logs:
        _logs = json.load(logs)
    return _logs

notebook_data = get_notebook_data()
client = boto3.client("sagemaker")
response = client.create_...
2 years ago
0 Hello Everyone! I Ran A Test Experiment And Got An Error. I'M Running On An M1 Mac. Worker Local Without Gpu. Has Anyone Already Solved This Problem?

We used to have "<=20" as the default pip version in the agent. Looks like this default value still exists on your machine. But that version of pip doesn't know how to install your version of pytorch...

2 years ago
0 Hello! I Have The Following Error In The Task'S Console:

Btw, to specify a custom package, add the path to that package to your requirements.txt (the path can also be a github link for example).

2 years ago
0 Hi

Hi @<1546303293918023680:profile|MiniatureRobin9> The PipelineController has a property called id, so just doing something like pipeline.id should be enough

one year ago
0 Hi, We Have Recently Upgraded To

Regarding 1. , are you trying to delete the project from the UI? (I can't see an attached image in your message)

3 years ago
0 Hello, Im Having Huge Performance Issues On Large Clearml Datasets How Can I Link To Parent Dataset Without Parent Dataset Files. I Want To Create A Smaller Subset Of Parent Dataset, Like 5% Of It. To Achieve This, I Have To Call Remove_Files() To 60K It

Hi @<1590514584836378624:profile|AmiableSeaturtle81> ! Looks like remove_files doesn't support lists indeed. It does support paths with wildcards tho, if that helps.
As a workaround for now, I would remove all the files from the dataset and add back only the ones you need, or just create a new dataset.

one year ago
0 Hi! Is There Any Way To Add Git-Like Ignore File For Versioning Clearml Data? I Saw In Docs A Wildcard Argument When Files Are Added To A Dataset. How Can I Specify Ignoring Of Some File Types? For Example, I Want To Ignore Ipynb Checkpoints. How Can I Do

Hi @<1676038099831885824:profile|BlushingCrocodile88> ! We will soon try to merge a PR submitted via GitHub that will allow you to specify a list of files to be added to the dataset. So you will then be able to do something like add_files(set(glob.glob("*")) - set(glob.glob("*.ipynb")))
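As a sketch of that idea today: glob.glob returns lists, so convert to sets before subtracting, then pass the result on. (`files_to_add` is a hypothetical name; the call into add_files is only indicated in a comment since it depends on the PR landing.)

```python
import glob

# collect every file under the current directory, then drop notebook files
all_files = set(glob.glob("**/*", recursive=True))
notebooks = set(glob.glob("**/*.ipynb", recursive=True))
files_to_add = sorted(all_files - notebooks)

# each path in files_to_add could then be handed to dataset.add_files(...)
```

The same set-difference trick works for any other file type you want to ignore, e.g. swapping `*.ipynb` for `.ipynb_checkpoints/**`.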

one year ago
0 I Dont Exactly Know How To Ask For Help On This... Nor Have A Reproducible Minimal Example... I Downgraded Back To 1.15.1 From 1.16.2 And Have The Same Issue There. I Have A Pipeline That'S Repeatedly Failing To Complete. It Correctly Marks Things As Cach

Do you have the logs of the agent that is supposed to run your pipeline? Maybe there is a clue there. I would also suggest enqueuing the pipeline to some other queue, or even running the agent on your own machine if you don't already, and seeing what happens.

one year ago
0 Hello All, Although I Call Pipe.Wait() Or Pipe.Start(Wait=True), The Pipelinecontroller Does Not Wait In The Script Until The Pipeline Actually Terminates And Throws: Warning - Terminating Local Execution Process. Can Someone Please Help Me? Thanks A Lot

Oh, I see what you mean. start will enqueue the pipeline so that it can be run remotely by an agent. I think what you want to call is pipe.start_locally(run_pipeline_steps_locally=True) (and get rid of the wait).

2 years ago
0 Hi. I Have A Job That Processes Images And Creates ~5 Gb Of Processed Image Files (Lots Of Small Ones). At The End - It Creates A

Hi PanickyMoth78 ! This will likely not make it into 1.9.0 (this will be the next version we release, most likely before Christmas). We will try to get the fix out in 1.9.1

2 years ago
0 So From What I Can Tell Using

ShinyPuppy47 do you have a small example we could take a look at?

2 years ago
0 Hey Folks, Trying To Use The Model Class From The Clearml Sdk And Seeing Some Weird Errors. I Am Loading A Model This Way And Trying To See A Metadata Value For The Model Object.

Hi @<1523701132025663488:profile|SlimyElephant79> ! Looks like this is a bug on our part. We will fix this as soon as possible

2 years ago
0 Does Clearml Somehow

UnevenDolphin73 looks like we clear all loggers when a task is closed, not just the ClearML ones. This is the problem.

2 years ago
0 Hi Everyone

Hi @<1546303293918023680:profile|MiniatureRobin9> ! When it comes to pipelines from functions/other tasks, this is not really supported. You could, however, cut the execution short when your step is being run by evaluating the return values from other steps.

Note that you should however be able to skip steps if you are using pipeline from decorators

2 years ago
0 I Have An Environment Error When Running Hpo:

Hi @<1694157594333024256:profile|DisturbedParrot38> ! If you want to override the parameter, you could add a DiscreteParameterRange to hyper_parameters when calling HyperParameterOptimizer. The DiscreteParameterRange should have just one value: the value you want to override the parameter with.
You could try setting the parameter to an empty string in order to mark it as cleared.

one year ago
0 Hi Guys, Are There Any Ways To Suppress Clearml’S Console Messages? I’M Not Interested In Messages Like This, Especially About Uploading Models. I Tried Some Stuff With Loggers ” Logging.Basicconfig(Format=‘%(Name)S - %(Levelname)S - %(Message)S’, Level=

Hi @<1715900760333488128:profile|ScaryShrimp33> ! You can set the log level by setting the CLEARML_LOG_LEVEL env var before importing clearml. For example:

import os
os.environ["CLEARML_LOG_LEVEL"] = "ERROR"  # or str(logging.CRITICAL/whatever level) also works 

Note that the ClearML Monitor warning is most likely logged to stdout, in which case that message can't really be suppressed, but the model-upload-related messages should be.
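For the numeric form mentioned above, the value is just the stringified stdlib logging level; a quick illustration (nothing ClearML-specific runs here):

```python
import logging
import os

# either a level name or its numeric string is accepted;
# set it before importing clearml for it to take effect
os.environ["CLEARML_LOG_LEVEL"] = "ERROR"             # by name
os.environ["CLEARML_LOG_LEVEL"] = str(logging.ERROR)  # numeric: "40"
print(os.environ["CLEARML_LOG_LEVEL"])  # 40
```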

one year ago