SmugDolphin23
Moderator
0 Questions, 433 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
0 Hi, I'm Trying To Integrate Logger In My PipelineDecorator But I'm Getting This Error -

Your object is likely holding some file descriptor or something like that. The pipeline steps all run in separate processes (they can even run on different machines when running remotely), so you need to make sure that the objects you are returning are picklable and can be passed between these processes. You can check that the logger you are passing around is indeed picklable by calling pickle.dumps on it and then loading it in another run.
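For example, a quick picklability check might look like this (a sketch; my_logger is a stand-in for whatever object your step returns):

import pickle

# my_logger stands in for the object returned by the pipeline step (hypothetical name)
payload = pickle.dumps(my_logger)   # raises if the object holds an open file descriptor, socket, etc.
restored = pickle.loads(payload)    # load it back, e.g. in another run or process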
The best practice would ...

one year ago
0 Hi All, I Observed That When I Get A Dataset With

SmallGiraffe94 You should use dataset_version="2022-09-07" (not version=...). This should work for your use case.
Dataset.get shouldn't actually accept a version kwarg, but it does because it accepts some **kwargs used internally. We will make sure to warn the users in case they pass values to **kwargs from now on.
Anyway, this issue still exists, but in another form:
Dataset.get can't get datasets with a non-semantic version, unless the version is sp...
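For reference, the call suggested above would look roughly like this (dataset name and project are placeholders):

from clearml import Dataset

dataset = Dataset.get(
    dataset_name="my_dataset",        # placeholder
    dataset_project="my_project",     # placeholder
    dataset_version="2022-09-07",
)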

3 years ago
0 Hi Everyone! I Would Need Your Opinion About How To Proceed With Offline Mode When Serving Models. We Currently Have Some Serving Endpoints Which Are Constantly Running And We Report Images And Statistics. How Do You Recommend Dealing With Offline Mode In

Hi @<1817731756720132096:profile|WickedWhale51> ! ClearML is tolerant of network failures. Anyway, if you wish to upload the offline data periodically, you could zip the offline mode folder and import it:

from zipfile import ZipFile, ZIP_DEFLATED
from clearml import Task

# make sure the state of the offline data is saved
Task.current_task()._edit()
# create a zip file from the offline mode folder
offline_folder = Task.current_task().get_offline_mode_folder()
zip_file = offline_folder.as_posix() + ".zip"
with ZipFile(zip_file, "w", allowZip64=True, compression=ZIP_DEFLATED) as zf:
...
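Once the archive holds the offline folder's contents (the elided part above), it can then be imported back into the server, presumably with Task.import_offline_session — a minimal sketch, assuming zip_file points to the archive created above:

# hedged sketch: import the zipped offline session once the server is reachable again
Task.import_offline_session(session_folder_zip=zip_file)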
7 months ago
0 Hi Everyone, I Have A Question About Using

Hi @<1643060801088524288:profile|HarebrainedOstrich43> ! Could you please share some code that could help us reproduce the issue? I tried cloning, changing parameters and running a decorated pipeline, but the whole process worked as expected for me.

one year ago
0 Hi All

Thank you 😊

9 months ago
0 Hello. I Have A Question Regarding Pipeline Parameters. Is It Possible To Reference Pipeline Parameters In Other Fields Of The

Hi DangerousDragonfly8 ! At the moment, this is not possible, but we do have it in plan (we had some prior requests for this feature)

2 years ago
0 Hello! When Running This Code:

Please let me know if this works!

2 years ago
0 I Configured S3 Storage In My clearml.conf File On A Worker Machine. Then I Run Experiment Which Produced A Small Artifact And It Doesn't Appear In My Cloud Storage. What Am I Doing Wrong? How To Make Artifacts Appear On My S3 Storage? Below Is A Sample O

@<1526734383564722176:profile|BoredBat47> Yeah. This is an example:

s3 {
    key: "mykey"
    secret: "mysecret"
    region: "us-east-1"
    credentials: [
        {
            bucket: ""
            key: "mykey"
            secret: "mysecret"
            region: "us-east-1"
        },
    ]
}
# some other config
default_output_uri: ""
2 years ago
0 I Uploaded Direct Access File To Clearml Dataset System Like This One. How Can I Access The Link Of The Uploaded Item. Whenever I Try To Call

Hi @<1570583237065969664:profile|AdorableCrocodile14> ! get_local_copy will always copy/download external files to a folder. To get the external files, there is a property on the dataset called link_entries, which returns a list of LinkEntry objects. Each of these has a link attribute, and each such link should point to an external file (in this case, your local paths prefixed with file:// )
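A minimal sketch of reading those links (the dataset name and project are placeholders):

from clearml import Dataset

dataset = Dataset.get(dataset_name="my_dataset", dataset_project="my_project")  # placeholder names
for entry in dataset.link_entries:
    print(entry.link)  # e.g. file:///path/to/the/original/file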

2 years ago
0 Hello Everyone! I Can't Connect Clearml With Yandex Storage S3. I Have An Error With Keys And Permissions (See The Screenshots), But I Can Upload Model Weights On Yandex Storage S3 Without Clearml. Maybe I Have Problems With My Config? Could You Help Me, P

Hi @<1675675705284759552:profile|NonsensicalAnt77> ! How are you uploading the model weights without using the SDK? Can you please share a code snippet (might be useful in finding why your config doesn't work). Also, what is your clearml version?

one year ago
0 Hello. I Am Using Hydra As Configuration Manager And I Am Using A Decorator To Specify The File And The Folder It Is Contained In (Typical Hydra Syntax). The Code Now Runs Into This Error That Says, "Primary Config Directory Not Found. Set The Environment

Hi @<1715175986749771776:profile|FuzzySeaanemone21> ! Are you running this remotely? If so, you should work inside a repository such that the agent can clone the repository which should include the config as well. Otherwise, the script will run as a "standalone"

one year ago
0 Hi, I've Three Questions Regarding Clearml Pipelines.

Hi @<1523701504827985920:profile|SubstantialElk6> !
Regarding 1: pth files get pickled.
The flow is like this:

  • The step is created by the controller by writing some code to a file and running that file in python
  • The following line is run in the step when returning values: None
  • This is eventually run: [None](https://github.com/allegroai/clearml/blob/cbd...
2 years ago
0 Hi Everyone, I Have A Question About Using

Hi @<1643060801088524288:profile|HarebrainedOstrich43> ! Thank you for reporting. We will get back to you as soon as we have something

one year ago
0 For Some Reason, When I Try To Load A Dataset (Dataset.Get), Method _Query Task Is Called And This Method Try To Call _Send Method Of Interfacebase Class. This Method May Return None And This Case Is Not Handled By The _Query_Task Method That Tries To Rea

Hello MotionlessCoral18. I have a few questions that might help us find out why you experience this problem:
Is there any chance you are running the program in offline mode?
Is there any other message being logged that might help? The error messages might include: Action failed, Failed sending, Retrying, previous request failed, contains illegal schema.
Are you able to connect to the backend at all from the program in which you are trying to get the dataset?
Thank you!

3 years ago
0 Hi Team, I Am Trying To Run A Pipeline Remotely Using Clearml Pipeline And I'm Encountering Some Issues. Could Anyone Please Assist Me In Resolving Them?

Regarding pending pipelines: please make sure a free agent is bound to the queue you wish to run the pipeline in. You can check queue information by accessing the INFO section of the controller (as in the first screenshot).
Then, by clicking on the queue, you should see the worker status. There should be at least one worker with a blank "CURRENTLY EXECUTING" entry.

one year ago
0 Hey, We Run A Pipeline Using The PipelineController, When We Do It From

Basically, it looks like the agent installs an outdated pip version, and this should fix it and hopefully install your packages correctly.
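If the fix in question is pinning the pip version the agent installs, that is typically done in clearml.conf — a hedged sketch (treat the key and the version constraint shown as examples to verify against your agent version):

agent {
    package_manager {
        # assumed setting: pins the pip version installed into the task's virtualenv
        pip_version: "<23"
    }
}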

2 years ago
0 Hi Team, I Am Trying To Run A Pipeline Remotely Using Clearml Pipeline And I'm Encountering Some Issues. Could Anyone Please Assist Me In Resolving Them?

Oh, I see. I think there is a mismatch between some clearml versions on your machine. How did you run these scripts exactly (e.g. from the CLI, as in python test.py)?

Or if you ran it via an IDE, what is the interpreter path?

one year ago
0 Hi All, I've Been Experimenting Around With Automating The Data Sync. This Is Related To This Thread:

Hi @<1545216070686609408:profile|EnthusiasticCow4> ! I have an idea.
The flow would be like this: you create a dataset, the parent of that dataset would be the previously created dataset. The version will auto-bump. Then, you sync this dataset with the folder. Note that sync will return the number of added/modified/removed files. If all of these are 0, then you use Dataset.delete on this dataset and break/continue, else you upload and finalize the dataset.

Something like:

parent =...
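Filled out a bit, that flow might look like this (a rough sketch; project/dataset names and the folder path are placeholders, and the order of the counts returned by sync_folder should be verified against your clearml version):

from clearml import Dataset

parent = Dataset.get(dataset_project="my_project", dataset_name="my_dataset")  # placeholders
dataset = Dataset.create(
    dataset_project="my_project",
    dataset_name="my_dataset",
    parent_datasets=[parent.id],
)
# sync_folder reports how many files were added/modified/removed (order assumed here)
removed, added, modified = dataset.sync_folder("path/to/local/folder")
if removed == added == modified == 0:
    # nothing changed: drop the empty version
    Dataset.delete(dataset_id=dataset.id)
else:
    dataset.upload()
    dataset.finalize()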
2 years ago
0 I Get These Warnings Whenever I Run Pipelines And I Have No Idea What It Means Or Where It Comes From:

Hi @<1694157594333024256:profile|DisturbedParrot38> ! We weren't able to reproduce, but you could find the source of the warning by appending the following code at the top of your script:

import traceback
import warnings
import sys

def warn_with_traceback(message, category, filename, lineno, file=None, line=None):
    log = file if hasattr(file,'write') else sys.stderr
    traceback.print_stack(file=log)
    log.write(warnings.formatwarning(message, category, filename, lineno, line))
...
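The preview cuts off before the hook is installed; in the standard Python warnings recipe that this snippet follows, the missing last line would presumably be:

warnings.showwarning = warn_with_traceback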
one year ago
0 Hey All. Wanting To Log

Hi @<1674226153906245632:profile|PreciousCoral74> !

Sadly, Logger.report_matplotlib_figure(…) doesn't seem to log plots. Only the automatic integration appears to behave.

What do you mean by that? report_matplotlib_figure should work. See this example on how to use it: None .
If it still doesn't work for you, could you please share a code snippet that could help us track down...
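For reference, a minimal sketch of reporting a matplotlib figure manually (project/task names are placeholders):

import matplotlib.pyplot as plt
from clearml import Task

task = Task.init(project_name="examples", task_name="matplotlib report")  # placeholder names
plt.plot([1, 2, 3], [4, 5, 6])
task.get_logger().report_matplotlib_figure(
    title="manual plot", series="series A", figure=plt, iteration=0
)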

one year ago
0 Hi, I'm Trying To Integrate Logger In My PipelineDecorator But I'm Getting This Error -

Yes, passing custom objects between steps should be possible. The only condition is for the objects to be picklable. What are you returning exactly from init_experiment?

one year ago
0 Why Is Async_Delete Not Working?

Hi @<1590514584836378624:profile|AmiableSeaturtle81> ! To help us debug this: are you able to simply use the boto3 python package to interact with your cluster?
If so, what does that code look like? This would give us some insight into what the config should actually look like, or what changes need to be made.
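If it helps, a bare-bones boto3 check against a custom endpoint might look like this (the endpoint URL and credentials are placeholders):

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.my-cluster.example.com",  # placeholder endpoint
    aws_access_key_id="mykey",                         # placeholder credentials
    aws_secret_access_key="mysecret",
)
print(s3.list_buckets())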

one year ago
0 Hi, I'm Running

Hi OutrageousSheep60 ! We didn't release an RC yet; we will a bit later today, though. We will ping you when it's ready, sorry for the delay.

2 years ago
0 Hi, We Have Recently Upgraded To

Regarding 1., are you trying to delete the project from the UI? (I can't see an attached image in your message)

3 years ago