SmugDolphin23
Moderator
0 Questions, 425 Answers
Active since 10 January 2023
Last activity 2 years ago
Reputation: 0
Hi all, after upgrading to SDK 1.8.0 we are having an issue adding external files to a dataset from GCS. This is the code we use:

You could try this in the meantime if you don't mind temporary workarounds:
dataset.add_external_files(source_url=" ", wildcard=["file1.csv"], recursive=False)
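For context, a fuller sketch of that workaround (the project, bucket path, and file names below are placeholders, not from the original thread):

from clearml import Dataset

# hypothetical project / bucket names, for illustration only
dataset = Dataset.create(dataset_name="my-dataset", dataset_project="my-project")
# register the GCS objects as links without downloading them; the explicit
# wildcard plus recursive=False sidesteps the listing issue in 1.8.0
dataset.add_external_files(
    source_url="gs://my-bucket/data/",
    wildcard=["file1.csv"],
    recursive=False,
)
dataset.upload()
dataset.finalize()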

2 years ago
Hello, community, I hope this message finds you all well. I am currently working on a project involving hyperparameter optimization (HPO) using the Optuna optimizer. Specifically, I've been trying to navigate the parameters 'min_iteration_per_job' and 'm

Hi @<1523703652059975680:profile|ThickKitten19> ! Could you try increasing max_iteration_per_job and check if that helps? Also, any chance you are fixing the number of epochs to 10, either through a hyperparameter, e.g. DiscreteParameterRange("General/epochs", values=[10]), or is it simply fixed to 10 when you call something like model.fit(epochs=10)?
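For reference, a minimal sketch of where these knobs sit in an Optuna-based optimizer (the base task ID, queue, and metric names are placeholders):

from clearml.automation import DiscreteParameterRange, HyperParameterOptimizer
from clearml.automation.optuna import OptimizerOptuna

optimizer = HyperParameterOptimizer(
    base_task_id="<base-task-id>",               # placeholder: the task to clone
    hyper_parameters=[
        DiscreteParameterRange("General/epochs", values=[10, 20, 50]),
    ],
    objective_metric_title="validation",         # placeholder metric title
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=OptimizerOptuna,
    min_iteration_per_job=10,
    max_iteration_per_job=100,   # raise this if jobs seem to stop too early
    total_max_jobs=20,
    execution_queue="default",
)
optimizer.start()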

11 months ago
Hello all! Is it possible to utilize shared memory in ClearML for tasks like model inference, where instead of transferring images over the network (e.g., HTTP, RPC), we use a shared memory extension? Please refer to the link below:

Hi @<1657918706052763648:profile|SillyRobin38> ! If it is compatible with http/rest, you could try setting api.files_server to the endpoint or sdk.storage.default_output_uri in clearml.conf (depending on your use-case).
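For illustration, the corresponding clearml.conf entries might look like this (the endpoint URL is a placeholder; the keys follow the names mentioned above):

api {
    # point the files server at your shared-memory-compatible endpoint
    files_server: "http://shared-mem-gateway:8081"
}
sdk {
    storage {
        # or set a default output destination instead, depending on your use case
        default_output_uri: "http://shared-mem-gateway:8081"
    }
}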

11 months ago
Cannot upload a dataset with a parent - seems very odd! ClearML versions I tried: 1.6.1, 1.6.2. Scenario: * Create parent dataset (with storage on S3) * Upload data * Close dataset * Create child dataset (tried with storage on both S3 or on ClearML serv

Hi RoughTiger69 ! Can you try adding the files using a Python script so that we can get an exception traceback, something like this:

from clearml import Dataset

# or just use the ID of the dataset you previously created instead of creating a new one
parent_dataset = Dataset.create(dataset_name="xxxx", dataset_project="yyyyy", output_uri=" ")
parent_dataset.add_files("folder1")
parent_dataset.upload()
parent_dataset.finalize()

child_dataset = Dataset.create(dataset_name="xxxx", dataset_project="yyyyy",
                               parent_datasets=[parent_dataset.id], output_uri=" ")

2 years ago
Hello, for some reason my upload speed to S3 is insanely slow. I noticed in the logs that it uploads to the /tmp folder. What does that mean? Why tmp?

Hi @<1590514584836378624:profile|AmiableSeaturtle81> ! What function are you using to upload the data?

10 months ago
Hi, I tried this, but got an unexpected result when set

This is a bug; we will fix it ASAP.

2 years ago
How does one

Hi @<1654294828365647872:profile|GorgeousShrimp11> ! add_tags is an instance method, so you will need the controller instance to call it. To get the controller instance, you can do PipelineDecorator.get_current_pipeline() then call add_tags on the returned value. So: PipelineDecorator.get_current_pipeline().add_tags(tags=["tag1", "tag2"])
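A minimal sketch of that call from inside a decorated pipeline (pipeline name and project are placeholders):

from clearml import PipelineDecorator

@PipelineDecorator.pipeline(name="my-pipeline", project="examples", version="1.0")
def run_pipeline():
    # fetch the running controller instance and tag it
    PipelineDecorator.get_current_pipeline().add_tags(tags=["tag1", "tag2"])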

one year ago
Does ClearML somehow

Hi UnevenDolphin73 ! We were able to reproduce the issue. We'll ping you once we have a fix as well πŸ‘

2 years ago
Hi team, when clearml-agent is used to run the code, it will set up the environment. How does it determine the Python package versions?

Hi @<1533257278776414208:profile|SuperiorCockroach75> Try setting packages in your pipeline component to your requirements.txt, or simply add the list of packages (with their specific versions).
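A minimal sketch of both options on a decorated component (the file path and package versions are placeholders):

from clearml import PipelineDecorator

# option 1: point the component at a requirements file
@PipelineDecorator.component(packages="./requirements.txt")
def step_one():
    ...

# option 2: pin the packages inline
@PipelineDecorator.component(packages=["pandas==2.0.3", "scikit-learn==1.3.0"])
def step_two():
    ...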

2 years ago
Hi there, I am having issues executing a

@<1654294828365647872:profile|GorgeousShrimp11> Any chance your queue is actually named megan-testing and not megan_testing?

one year ago
If I ran a hyperparameter sweep and I wanted to create a graph where the x-axis was one of the hyperparameters, let's say the momentum term of the optimizer, and I wanted to plot that vs the min-loss over all epochs, is there a good way to do this with Cl

Hi @<1545216070686609408:profile|EnthusiasticCow4> ! Can't you just get the values of the hyperparameters and the losses, then plot them with something like matplotlib and report the plot to ClearML?
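A rough sketch of that idea, assuming the sweep tasks live in one project and report a validation loss scalar (project, parameter, and metric names are placeholders):

import matplotlib.pyplot as plt
from clearml import Logger, Task

task = Task.init(project_name="sweep-analysis", task_name="momentum-vs-min-loss")
tasks = Task.get_tasks(project_name="my-sweep-project")   # the sweep's project

xs, ys = [], []
for t in tasks:
    params = t.get_parameters()             # flat dict, e.g. {"General/momentum": "0.9"}
    metrics = t.get_last_scalar_metrics()   # nested dict with last/min/max per series
    try:
        xs.append(float(params["General/momentum"]))
        ys.append(metrics["validation"]["loss"]["min"])
    except (KeyError, TypeError, ValueError):
        continue                            # skip tasks missing the param or metric

plt.scatter(xs, ys)
plt.xlabel("momentum")
plt.ylabel("min validation loss")
Logger.current_logger().report_matplotlib_figure(
    title="momentum vs min loss", series="sweep", figure=plt, iteration=0
)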

one year ago
Hi, I'm trying to integrate a logger in my PipelineDecorator but I'm getting this error -

Your object is likely holding a file descriptor or something like that. The pipeline steps all run in separate processes (they can even run on different machines when running remotely), so the objects you return must be picklable so they can be passed between those processes. You can check that the logger you are passing around is indeed picklable by calling pickle.dumps on it and then loading it in another run.
The best practice would ...
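A quick way to run that check (a small helper, not part of the ClearML API):

import pickle

def assert_picklable(obj):
    # pickle.dumps raises (e.g. TypeError or pickle.PicklingError) if obj holds
    # a file handle, socket, thread lock, or similar non-picklable state
    return pickle.loads(pickle.dumps(obj))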

one year ago
Hi everyone! Could someone tell me how to use

Hi @<1569496075083976704:profile|SweetShells3> ! Can you reply with some example code on how you tried to use pl.Trainer with launch_multi_node ?

one year ago
Hi! Is there a way to

I left another comment today. It’s about something raising an exception when creating a set from the file entries

one year ago
Why is async_delete not working?

@<1590514584836378624:profile|AmiableSeaturtle81> if you wish for your debug samples to be uploaded to S3, you have 2 options: you can either use this function: Logger.set_default_upload_destination
or you can change the api.files_server entry to your S3 bucket in clearml.conf. This way you wouldn't need to call set_default_upload_destination every time you run a new script.
Also, in clearml.conf, you can change sdk.deve...
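A minimal sketch of the first option (the project and bucket path are placeholders):

from clearml import Logger, Task

task = Task.init(project_name="examples", task_name="debug-samples-to-s3")
# route debug samples to your own bucket instead of the default files server
Logger.current_logger().set_default_upload_destination("s3://my-bucket/debug-samples")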

one year ago
Hi, trying to report a matplotlib figure with

@<1566596968673710080:profile|QuaintRobin7> not for now. Could you please open a GH issue about it? Maybe we can fit this in a future patch.

one year ago
Hello, is there a way to disable dataset caching so that when

Hi FreshParrot56 ! This is currently not supported πŸ™

2 years ago
Hello, I am testing my Hydra/OmegaConf with ClearML and I have a general question. Why is it necessary to indicate that I want to edit the configuration (setting

Hi @<1603198134261911552:profile|ColossalReindeer77> ! The usual workflow is that you modify the fields of your remote run in either the Hyperparameters section or the Configuration section, but not usually both (as in Hydra's case). When using CLI tools, people mostly modify the Hyperparameters section, so we chose to set allow_omegaconf_edit to False by default for parity.

one year ago
Hi, I have noticed that Dataset has started reporting my dataset head as a txt file in "Debug Samples -> Metric: Tables". Can I disable it? Thanks!

HandsomeGiraffe70 your conf file should look something like this:

{
    # ClearML - default SDK configuration

    storage {
        cache {
            # Defaults to system temp folder / cache
            default_base_dir: "~/.clearml/cache"
            # default_cache_manager_size: 100
        }

        direct_access: [
            # Objects matching are considered to be available for direct access, i.e. they will not be downloaded
            # or cached, and any download request will ...
2 years ago
Hi! I'm currently considering switching to ClearML. In my current trials I am using up the API calls very quickly though. Is there some way to limit that? The documentation is a bit sparse on what uses how many API calls. Is it possible to batch them for

FlutteringWorm14 we do batch the reported scalars. The flow is like this: the task object creates a Reporter object, which spawns a daemon in another child process that batches multiple report events. The batching happens after a certain amount of time in the child process, or the parent process can force the batching after a certain number of report events are queued.
You could try this hack to achieve what you want:

from clearml import Task
from clearml.backend_interface.metrics.repor...

2 years ago
Reporting NoneType scalars.

By default, they are reported as 0 values.

7 months ago
Hi ClearMLers, I'm trying to create a dataset with tagged batches of data. I first create an empty dataset with dataset_name = 'name_dataset', and then create another tagged dataset with the first batch and with parent_datasets=['name_dataset']. It's

Hi @<1668427950573228032:profile|TeenyShells80> , the parent_datasets should be a list of dataset IDs or clearml.Dataset objects, not dataset names. Maybe that is the issue
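For illustration, the fix might look like this (project and dataset names are placeholders carried over from the question):

from clearml import Dataset

base = Dataset.get(dataset_project="my-project", dataset_name="name_dataset")
# pass dataset IDs (or Dataset objects) as parents, not dataset names
tagged = Dataset.create(
    dataset_name="name_dataset_batch1",
    dataset_project="my-project",
    parent_datasets=[base.id],
)
tagged.add_tags(["batch-1"])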

one year ago
Hi, is there a way to abort a task (not reset, not delete) from code?

Hi @<1523701240951738368:profile|RoundMosquito25> ! Try using this function
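A sketch of one way to do this, assuming the function in question is Task.mark_stopped (an assumption; verify against your SDK version):

from clearml import Task

task = Task.current_task()
# assumption: mark_stopped() flags the task as aborted without resetting
# or deleting it
task.mark_stopped()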

2 years ago