AgitatedDove14
Moderator
49 Questions, 8122 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0

Badges (1): 25 × Eureka!
0 Hey, What Is The Recommended Approach To Speed Up The Spin Up Of A Task In A Gcp Autoscaled Instance ? It Takes 20Mins To Build The Venv Environment Needed By The Clearml-Agent To Run It, Would Providing A Vm Image With Preinstalled Pip Packages On It Hel

I think it's inside the container since it's after the worker pulls the image

Oh, that makes more sense. I mean it should not build from source, but that makes sense.
To avoid building from source:
Add to the "Additional ClearML Configuration" section the following line:
agent.package_manager.pip_version: "<21"
You can also turn on venv caching
Add to the "Additional ClearML Configuration" section the following line:
agent.venvs_cache.path: ~/.clearml/venvs-cache
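Putting the two together, the "Additional ClearML Configuration" field would then contain something like (same keys as above, a sketch):
agent.package_manager.pip_version: "<21"
agent.venvs_cache.path: ~/.clearml/venvs-cache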
I will make sure w...

2 years ago
0 Hi, If I Am Starting My Training With The Following Command:

now realise that the ignite events callbacks seem to not be fired

So this is an ignite issue ?

3 years ago
0 Hi! Can Someone Show Me An Example Of How

BTW: I think an easy fix could be:
if running_remotely():
    pipeline.start()
else:
    pipeline.create_draft()

3 years ago
0 Hi! I Have Local Minio Setup, Via Minio Browser I Can Upload 50-100 Mb Per Second As Its Local. But When I Try To Use Task.Upload_Artifact It Uploads 500 Kb Per Second. Does Anyone Have An Idea About This?

When I give my Minio to output_uri argument, it uploads 500 KB /sec as before.

But it worked well when using StorageManager and uploading to the minio directly, is that correct?

.. I give my Minio to output_uri argument

How long did it take to run the demo code I posted?
(The one you mentioned took 0.16s to run locally)
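For context, a minimal sketch of the two upload paths being compared (the MinIO endpoint, bucket, and file names are illustrative; S3/MinIO credentials are assumed to be configured in clearml.conf):

from clearml import Task, StorageManager

# route task outputs through the MinIO endpoint via output_uri
task = Task.init(
    project_name="demo",
    task_name="minio-upload-test",
    output_uri="s3://127.0.0.1:9000/bucket",
)
task.upload_artifact(name="data", artifact_object="large_file.bin")

# direct upload via StorageManager (the path that reportedly ran at local speed)
StorageManager.upload_file(
    local_file="large_file.bin",
    remote_url="s3://127.0.0.1:9000/bucket/large_file.bin",
)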

5 years ago
0 When I Run An Experiment (Self Hosted), I Only See Scalars For Gpu And System Performance. How Do I See Additional Scalars? I Have

BoredHedgehog47 you need to make sure "<path here>/train.py" also calls Task.init (again no need to worry about calling it twice with different project/name)
The Task.init call will make sure the auto-connect works.
BTW: if you use os.fork, then there is no need for the Task.init; the main difference is that Popen starts a whole new process, and we need to make sure the newly created process is auto-connected as well (i.e. calling Task.init)
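A minimal sketch of the Popen case described above (file and project names are illustrative; the spawned script calls Task.init itself so auto-logging connects in the new process):

# parent.py
import subprocess
from clearml import Task

task = Task.init(project_name="demo", task_name="parent")
# Popen starts a whole new process, so the spawned script must call Task.init on its own
subprocess.Popen(["python", "train.py"]).wait()

# train.py (the spawned script)
from clearml import Task
# calling Task.init again here is fine, even with a different project/name
task = Task.init(project_name="demo", task_name="train")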

2 years ago
0 Hi! Can Someone Show Me An Example Of How

So I think it makes more sense in this case to work with the former.

Totally !

3 years ago
0 Hi Guys, Do You Support Pipenv And Pipfile.Lock As Deps List Instead Of Requirments.Txt?

If you set the package_manager to poetry then it will only use the lock files
https://github.com/allegroai/clearml-agent/blob/21c4857795e6392a848b296ceb5480aca5f98e4b/docs/clearml.conf#L53
If you clear the "Installed Packages" section, it will just use the "requirements.txt" in the repository itself.
What's the specific use case, and the problem we are trying to solve?
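For reference, a sketch of the clearml.conf setting referred to above (the key path follows the linked config file):

agent {
    package_manager {
        # use the repository's lock file instead of the task's "Installed Packages"
        type: poetry,
    }
}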

4 years ago
0 Hi Everyone, I Have A Question About Using

The other order (with the custom decorator above the pipeline decorator) fails - just for your info

This is on purpose: the pipeline decorator has to be the top decorator.
Glad it works!
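A minimal sketch of the ordering described above (the custom decorator is hypothetical; the pipeline decorator stays on top):

from clearml.automation.controller import PipelineDecorator

def my_decorator(func):
    # hypothetical custom decorator, for illustration only
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@PipelineDecorator.pipeline(name="example pipeline", project="demo", version="0.1")
@my_decorator  # custom decorators must sit below the pipeline decorator
def pipeline_logic():
    pass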

one year ago
0 Hi There!

Would it also be possible to query based on

multiple

user properties

Multiple key/value properties are, I think, currently not that easy to query,
but multiple tags are quite easy to do

tags=["__$all", "tag1", "tag2],
2 years ago
0 Hi Everyone, Has Someone Of You Tried To Track Your Shap Plots With Clearml? Somehow In My Dashboard The Tracked Plots Are Empty. Might They Be Too Complex Or Something? Br Sophie

Notice that in your example you have

plt.figure()

This actually clears the matplotlib figure, which is why we first get a white image and then the actual plot;
once I removed it I got a single plot (no need for the manual reporting)

from sklearn.datasets import make_regression

X, y = make_regression(
    n_samples=100,     # Number of samples
    n_features=10,     # Number of features
    noise=0.1,         # Standard deviation of the gaussian noise
    random_state=42    # For reproducibility
)

# Convert to DataFrame for better f...
8 months ago
0 Hi Everyone! I'M A Clearml Newbie Trying It Out In My Local Environment With The Docker Compose Installation Described Here:

Hi @<1668065560107159552:profile|VivaciousPenguin20>
I think you are looking at the wrong experiment - this is a 3-year-old experiment? It does not seem to be your currently executed experiment, right?

one year ago
0 Hi All, I Have An Issue With The Way Hyper Parameters Are Logged Under Configuration, The Values That Are Stored Seem To Add Unnecessary Escape Characters To The Original Values.. Is It A Known Issue? Is There A Way To Change It? Thanks

Hmm DepressedChimpanzee34, my bad, it seems the loading is done via a YAML loader, but the dumping is straightforward str casting...
https://github.com/allegroai/clearml/blob/6e6271fb91f2aeb2aa7a13c6d07d4e635baaa670/clearml/backend_interface/task/task.py#L934
What would you expect to get? (BTW, "value\blah" is not a safe string assignment in Python, since \b is interpreted as an escape character; it should be "value\\blah", which translates into the text value\blah)

4 years ago
0 Any Ideas Why This Is Happening? It Was Fine Yesterday

TenseOstrich47 this looks like elasticsearch is out of space...

4 years ago
0 Looking At Clearml-Serving - Two Questions - 1, What’S The Status Of The Project 2. How Does One Say How A Model Is Loaded And Served Etc? For Example, If I Have A Spacy Ner Model, I Need To Specify Some Custom Code Right?

'config.pbtxt' could not be inferred. please provide specific config.pbtxt definition.

This basically means there is no configuration on how to serve the model, i.e. the size/type of the lower (input) layer and the output layer.
You can either store the configuration on the creating Task, like is done here:
https://github.com/allegroai/clearml-serving/blob/b5f5d72046f878bd09505606ca1147d93a5df069/examples/keras/keras_mnist.py#L51
Or you can provide it as standalone file when registering the mo...

4 years ago
0 I’M Trying To Use

Hi LazyTurkey38
What do you mean the git repo is not recognized? After execute_remotely exits, you should see on the task a reference to the git repo with the exact commit ID you have locally pulled. Do you see it under the Execution tab?
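For reference, a minimal sketch of the execute_remotely flow being discussed (project and queue names are illustrative):

from clearml import Task

task = Task.init(project_name="demo", task_name="remote-run")
# stops the local run and enqueues the task for an agent; the task's Execution tab
# should then show the repo and the exact commit ID pulled locally
task.execute_remotely(queue_name="default")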

4 years ago
0 Hello, I Would Like To Use Spot Instances Together With The Aws Autoscaler To Train Models With Pytorch/Ignite And I Am Wondering How To Support Interruptions During The Training (In Case The Instance Is Terminated By Aws). Is There Anything Already Built

JitteryCoyote63

somehow the previous iterations, not sure yet if it’s coming from my code, ignite or clearml

ClearML will automatically continue reporting from the previous iteration (i.e. if before continuing the Task the last iteration was 100, then the next report with iteration =0 will actually be 101)

task.set_initial_iteration(engine.state.iteration)

Basically it is called automatically by ClearML (obviously only when you continue an aborted Task)
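A minimal sketch of continuing an aborted Task as described above (project/task names are illustrative; set_initial_iteration is shown only for the explicit case):

from clearml import Task

# continue reporting into the previously aborted task
task = Task.init(project_name="demo", task_name="train", continue_last_task=True)
# normally not needed - ClearML applies the iteration offset automatically when continuing -
# but it can also be set explicitly, e.g. from the ignite engine state:
# task.set_initial_iteration(engine.state.iteration)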

4 years ago
0 Another Question, I Have Written A Code That Includes A Task Scheduler That Calls A Function. That Function Watches A Folder And If There Are Sufficient Images, It Creates And Publishes The Dataset, After Which It Clears The Folder. Problem, For Some Rea

Hi VexedCat68

The scheduler is set to run once per hour but even now I've got around 40+ anonymous running tasks.

Based on the screenshots these are the Datasets (which are also a Task with specific type etc).
I would actually name the Datasets you are creating
You need to specify the parent version (i.e. how would it know it is a child dataset changeset)
I'm assuming they are all uploading everything, hence still running?
BTW: you can use the argument single_instance=True maki...
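A minimal sketch of naming the Dataset and specifying its parent, as suggested above (names and the parent id are illustrative):

from clearml import Dataset

dataset = Dataset.create(
    dataset_name="incoming-images-batch",       # give the dataset an explicit name
    dataset_project="demo",
    parent_datasets=["<parent_dataset_id>"],    # so it is registered as a child changeset
)
dataset.add_files(path="/path/to/watched/folder")
dataset.upload()
dataset.finalize()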

3 years ago
0 Is It Possible To Give The Agent Access To Install Private Pip Packages (Needs To Be Installed From The Repo)?

This means that in your "Installed packages" you should see the line:
Notice that this is not a pypi artifactory (i.e. a server to add to the extra index url for pip), this is a direct pip install from a git repository, hence it should be listed in the "installed packages".
If this is the way the package was installed locally, you should have had this line in the installed packages.
The clearml agent should take care of the authentication for you (specifically here, it should do nothing).
If ...

4 years ago
0 Hey Guys, Do You Have Any Plans To Add Functionality To Export Training Config With All Hyperparameters To The Different Formats, Such As Training Command Line Command, Yaml, Etc.?

DilapidatedDucks58 if you have so many parameters, why don't you use the
task.connect_configuration(dict)
It will put it in the artifacts, as an editable JSON-like string.
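A minimal sketch of the suggestion above (the configuration dict is illustrative):

from clearml import Task

task = Task.init(project_name="demo", task_name="config-example")
config = {
    "lr": 0.001,
    "batch_size": 32,
    "model": {"layers": 4, "hidden": 256},
}
# stored with the task as a single editable configuration object
# instead of many separate hyperparameters
config = task.connect_configuration(config)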

5 years ago
0 Hi Everyone, I Have A Question About Using

Hmm that is odd. Let me take a look and ask the guys. Thank you for quickly testing the RC! I'm hoping a new RC with a fix will be there tomorrow, if we can quickly replicate

one year ago
0 Hello, I Am Using Clearml In Docker Mode. I Have A Simple Script That Runs Locally, Runs On The Target Machine Running The Same Tensorflow Container, But Doesn'T Run When I Deploy It Using Clearml. Here'S The Log Of The Error:

It runs directly but leads to the above error with clearml

Both manually (i.e. calling Task.init and running it without an agent) and with an agent? Same exact behavior?

2 years ago
0 Hey! Did Anyone Try Hpo On Yolov5 Model According To The Following Tutorial:

Hi CheekyFox58
If you are running the HPO+training on your own machine, it should work just fine in the Free tier

The HPO with the UI and everything, is designed to run the actual training on remote machines, and I think this makes it a Pro feature.

2 years ago
0 Hi, If I Am Starting My Training With The Following Command:

Thanks JitteryCoyote63 , once we have a reproducible example the fix should be very quick to push (with these things reproducing it is the challenge)

3 years ago
0 Hi All, I'Ve Successfully Run A Task Locally, And Now I'M Trying To Clone It And Send It To A Queue. It Looks Like The Environment Is Built Successfully, But It Hangs Here:

@<1724960464275771392:profile|DepravedBee82> I just realized, the agent is Not running in docker mode, correct? (i.e. venv mode)
If this is the case, how come it is running as root? (could it be it is running inside a container? how was that container spun up?)

one year ago
0 Post_Packages:

GentleSwallow91 notice that on the Task you have "Installed Packages"; this is the equivalent of requirements.txt. You can edit it and add a missing package, or programmatically add it in code (though usually directly imported packages are automatically registered, how come this one is missing?)

to add a package in code:
Task.add_requirements(package_name="my_package", package_version=">=1")
task = Task.init(...)

base docker image but clearML has not determined it during the script ru...

3 years ago
0 Hi Everybody, I'M Running Experiments Inside A Docker Which Includes Multiple Python Instances, Some Of Them Are Inside Conda Environments. How Can I Specify The Agent To Use A Specific Conda Environment Inside The Docker?

The agent is using Bash (but when you add command line to the docker run, .bashrc is not executed, hence no conda in PATH)
Maybe add the full path to the conda executable:
docker_setup_bash_script=[
    "export PATH=/workspace/miniconda/bin:$PATH",
    "export LOCAL_PYTHON=/workspace/miniconda/bin/python3",
    "/workspace/miniconda/bin/conda activate /PATH_GOES_HERE",
]
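Assuming the list above is meant as the docker_setup_bash_script argument (e.g. of Task.set_base_docker), a sketch with illustrative paths and image name:

from clearml import Task

task = Task.init(project_name="demo", task_name="conda-in-docker")
task.set_base_docker(
    docker_image="nvidia/cuda:11.8.0-runtime-ubuntu22.04",   # illustrative image
    docker_setup_bash_script=[
        "export PATH=/workspace/miniconda/bin:$PATH",
        "export LOCAL_PYTHON=/workspace/miniconda/bin/python3",
        "/workspace/miniconda/bin/conda activate /PATH_GOES_HERE",
    ],
)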

3 years ago