CostlyOstrich36
Moderator
0 Questions, 4213 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Hi all, I hope I'm in the right channel, we(

StaleButterfly40, I'm trying to get an estimate of what you have, because if the content is too large the preview isn't shown....

3 years ago
Hi folks, any idea why I am getting this strange error from clearml-data:

Reproduces for me as well. Taking a look what can be done 🙂

3 years ago
Hi, anyone know how to report scalars with TensorFlow? Thanks

Do you mean reporting scalars with TensorFlow, or having the reported TensorFlow scalars show up in ClearML?

3 years ago
Hi, anyone know how to report scalars with TensorFlow? Thanks

Yeah, but how are iterations marked in the script?

3 years ago
I wanted to ask, how do I run pipeline steps conditionally? E.g. if a step returns a specific value, exit the pipeline or run another step instead of the sequential step

VexedCat68, what if you simply add pipe.stop()? Does it not stop the pipeline? Can you maybe add a print to verify that during the run the value is indeed -1? Also, looking at your code, it looks like you're comparing 'merged_dataset_id' to -1
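The suggestion above can be sketched as a small guard around the step's return value. This is a minimal sketch, not the questioner's actual code: `pipe` is assumed to be a clearml PipelineController, and the -1 sentinel and variable names are placeholders taken from the discussion.

```python
# Hypothetical sketch: stop a pipeline when a step returns a sentinel value.
# 'pipe' is assumed to be a clearml PipelineController; the -1 sentinel and
# the names below are placeholders.

def should_stop(merged_dataset_id):
    """Return True when the step signalled failure with -1."""
    # Print during the run to verify what the step actually returned
    print(f"merged_dataset_id = {merged_dataset_id!r}")
    return merged_dataset_id == -1

def continue_or_stop(pipe, merged_dataset_id):
    """Stop the pipeline instead of running the next sequential step."""
    if should_stop(merged_dataset_id):
        pipe.stop()
        return False
    return True
```

The comparison is against the integer -1, not the string "-1"; printing the value during the run makes it easy to spot which of the two the step actually returned.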

3 years ago
Hi folks, how does ClearML figure out the virtual environment that it needs to use for the task? Is there any way to override the default virtual environment that is picked?

Hi @<1523704207914307584:profile|ObedientToad56> , the virtual env is constructed using the packages detected when the task runs locally. You can certainly override that, for example with Task.add_requirements.

There are also a few additional configuration options in the agent section of clearml.conf that I would suggest going over.
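As a minimal sketch of the Task.add_requirements approach: the package names and versions below are placeholders, and add_requirements must be called before Task.init to take effect. The Task class is passed in as a parameter here only so the loop is easy to exercise on its own.

```python
# Minimal sketch of overriding the auto-detected environment.
# The packages/versions are placeholders; Task.add_requirements must be
# called *before* Task.init for the override to take effect.

REQUIREMENT_OVERRIDES = [
    ("torch", "2.1.0"),  # pin an exact version
    ("pandas", None),    # no pin: latest available
]

def apply_overrides(task_cls):
    """Apply the overrides; task_cls is expected to be clearml.Task."""
    for name, version in REQUIREMENT_OVERRIDES:
        task_cls.add_requirements(name, version)
```

In a real script you would call `apply_overrides(Task)` and then `Task.init(...)` as usual; the agent installs the overridden packages when it rebuilds the environment remotely.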

2 years ago
Hi, I'm trying to use pipelines in the free version and encountered this: is it because I'm using the free version, or code-based?

Hi IrritableJellyfish76 , it looks like you need to create the services queue in the system. You can do it directly through the UI by going to Workers & Queues -> Queues -> New Queue

3 years ago
Help please. I have my ClearML server running in a Docker container. Now I am training my ML models in another Docker container. I want to track these models with my ClearML server located in the first container. What configuration do I need to do?

Hi @<1673501397007470592:profile|RelievedDuck3> , you simply need to integrate ClearML into your code:

from clearml import Task
task = Task.init(...)

More info in the ClearML documentation.

one year ago
Is there any way to change the x-axis on the charts for scalars, to say e.g. "epochs" instead of "iterations"? Or is that hardcoded?

I'm not sure. Maybe @<1523703436166565888:profile|DeterminedCrab71> might have some input

2 years ago
Hi, what would be the recommended way to add/track arbitrary models to/with OutputModels? Currently hacking it by using joblib dump and subsequently deleting unwanted "local" files. Arbitrary in this case just means extensions to some scikit-learn classes.

If you set Task.init(..., output_uri=<PATH_TO_ARTIFACT_STORAGE>) everything will be uploaded to your artifact storage automatically.
Regarding models: to skip the joblib dump hack, you can simply connect the models manually to the task with this method:
https://clear.ml/docs/latest/docs/references/sdk/model_outputmodel#connect
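A minimal sketch of that approach, assuming the documented OutputModel API; the model name, framework string, and weights path are placeholders, and the clearml import is kept inside the function so the sketch stands alone.

```python
# Sketch: register an arbitrary (e.g. scikit-learn derived) model on a task
# without the joblib-dump-and-delete workaround. All names are placeholders;
# 'task' is assumed to be a clearml Task.

def register_model(task, weights_path, name="my-model"):
    from clearml import OutputModel  # lazy import, placeholder usage

    output_model = OutputModel(task=task, name=name, framework="ScikitLearn")
    output_model.connect(task)  # attach the model object to the task
    # Upload the weights file to the task's output_uri storage
    output_model.update_weights(weights_filename=weights_path)
    return output_model
```

Combined with `Task.init(..., output_uri=...)` from the answer above, the weights end up in your artifact storage rather than as stray local files.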

3 years ago
Second: is there a way to take internally tracked training runs and publish them publicly, e.g. for a research paper? "Appendix A: training runs can be found here, feel free to explore them and look at the loss curves", for example

Regarding this one, there is actually a way. If you work on http://app.clear.ml you can share an experiment for other users to see. However, people receiving the link would need to sign up to view the experiment. Making shared experiments completely public and open could be a pretty cool feature, so maybe open a feature request for that.

3 years ago
Hi folks, I have a problem I can't understand. Plots are not shown when experiments are executed from the UI. For example, if I run the code on my laptop and go to the experiment page, I can see the plots correctly: but if I then clone the task, and ex

(the API keys are exposed through environment variables)

Where are the env variables pointing? I'm interested in all CLEARML-related env vars, if you could add them here 🙂

3 years ago
Hi, I'm setting a

I played a bit with it and got to the value. OutrageousSheep60 , please tell me if this helps you 🙂

>>> task.set_user_properties(x=5)
True
>>> y = task.get_user_properties()
>>> y
{'x': {'section': 'properties', 'name': 'x', 'value': '5'}}
>>> y["x"]["value"]
'5'

3 years ago
Hello,

GrittyCormorant73 , with a K8s deployment it will be easier to spin up agent instances to run the tasks 🙂

3 years ago
Hi all, is there an easy way to ping the server programmatically? I'm just trying to see what the configured default server is, and whether it is responsive

The highlighted line is exactly that. Instead of client.tasks.get_all(), I think it would be along the lines of client.debug.ping()
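As a minimal sketch of that idea, using the APIClient from the clearml SDK; note that client.debug.ping() mirrors the client.tasks.get_all() call mentioned above but is an assumption based on the server's debug.ping endpoint, not something verified here.

```python
# Sketch: ping the currently configured ClearML server via the APIClient.
# The clearml import is kept lazy so this loads without the SDK installed;
# client.debug.ping() is an assumed call mirroring client.tasks.get_all().

def ping_server():
    from clearml.backend_api.session.client import APIClient

    client = APIClient()  # talks to the server configured in clearml.conf
    return client.debug.ping()
```

The APIClient picks up the same server settings (clearml.conf or environment variables) as the SDK, so a successful call also confirms which default server is configured.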

4 years ago
Hello! Tell me please, is it intended that NaN values are converted to 0 when logging? Upd: I see NaN in TensorBoard, and 0 in ClearML. Upd2: using v1.1.*

CheerfulGorilla72 , can you point me to where in the script the reported scalars are?

I think this might be happening because you can't report a non-numeric value with Logger.report_scalar(), so the auto-logging assigns it some default value: 0. What is your use case? If the value of the scalar is NaN, why log it at all?
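If the goal is simply to keep NaN out of the ClearML scalars, one option is a small guard in front of the report call. This is a sketch, not ClearML's own behaviour: `logger` is assumed to expose the documented Logger.report_scalar API.

```python
import math

# Sketch: skip non-finite values instead of letting them be coerced to 0.
# 'logger' is assumed to be a clearml Logger (task.get_logger()).

def report_if_finite(logger, title, series, value, iteration):
    """Report the scalar only when it is a finite number.

    Returns True when the value was reported, False when it was skipped.
    """
    if value is None or not math.isfinite(value):
        return False
    logger.report_scalar(title=title, series=series, value=value, iteration=iteration)
    return True
```

This also filters out inf/-inf, which would otherwise distort the plot's y-axis.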

3 years ago
Is this the right way to add a tag to an output model artifact of a task? torch.save(model, '

Please try setting the tags like this: model.tags = ['Test'], rather than using append.
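As a tiny sketch of the difference (the tag value is a placeholder, and `model` is assumed to be a clearml OutputModel):

```python
# Sketch: assign model tags as a whole list. Replacing the list goes through
# the property setter, which is what propagates the change; mutating the
# existing list in place (model.tags.append) may bypass it.

def set_model_tags(model, tags):
    model.tags = list(tags)  # assign, don't append
    return model.tags
```

The same assign-the-whole-list pattern applies when adding a tag to an existing set: build the new list first, then assign it.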

3 years ago