AgitatedDove14
Moderator
49 Questions, 8124 Answers
  Active since 10 January 2023
  Last activity one year ago

0 Hi, I am planning to use ClearML to train a YOLO model on an AWS EC2 instance. I am new to ClearML, could someone please point me to the steps involved or any article to get started with

Hi CheekyElephant36
First you need to run it once on your machine; once this is done (only a few steps is enough), you can clone it and enqueue it. Then, to actually connect the AWS autoscaler (the part that spins up machines and runs tasks), go to Applications and select the AWS autoscaler.
BTW I think the next video will be about YOLO + autoscaler
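
A minimal sketch of that flow with the ClearML SDK (project, task, and queue names here are hypothetical placeholders):

from clearml import Task

# 1) Run the training script once on your own machine so ClearML records it
task = Task.init(project_name="YOLO", task_name="yolo-training")  # hypothetical names
# ... run a few training iterations, then stop ...

# 2) Clone the recorded task and enqueue it on the queue that the
#    AWS autoscaler application is monitoring (queue name is a placeholder)
cloned = Task.clone(source_task=task, name="yolo-training (remote)")
Task.enqueue(cloned, queue_name="aws-autoscaler-queue")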

2 years ago
0 Hello! I'm trying to test the (unpublished) feature that should help me to deal with running cloned pipelines from different commits/branches. I found this commit:

Hi CleanPigeon16
Put the specific git into the "installed packages" section
It should look like:
... git+ ... (no need for the specific commit, you can just take the latest)
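
For reference, a generic entry of this kind follows pip's VCS requirement syntax; the repository and branch below are placeholders, not the actual ones from the thread:

git+https://github.com/<organization>/<repository>.git@<branch-or-tag>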

4 years ago
0 Encountered an odd bug. Upon attempting to write images to ClearML (3D projected, matplotlib),

Hmm can you test with the latest RC?
pip install clearml==0.17.6rc1

4 years ago
0 Hi guys, I managed to set up a Kubernetes cluster and install Trains into it. While testing my set-up I ran the test_reporting.py example

So why is it trying to upload to "//:8081/files_server:" ?
What do you have in the trains.conf on the machine running the experiment ?

4 years ago
0 Hi! For

And can I store models with no attachment to tasks?

Assuming you have the model ID:
from clearml import InputModel
model = InputModel(model_id='aabbcc')
local_file_or_folder = model.get_weights()
Is this what you are looking for?

3 years ago
0 Hi friends! I'm trying to upgrade the

I don't have the compose file, or at least can't seem to find it in /opt

you can manually take down all the docker containers with:
docker ps
then docker stop <container id> for each container id

4 years ago
0 Hi. Question about dataset upload errors: when uploading a

PanickyMoth78 quick update the fix is already being tested, I'm hoping an RC tomorrow 🙂

2 years ago
0 Hello guys, I read about Trains some days ago and think it is exactly what I was looking for, so I ran the Docker image and started thinking of what I would like to do and the processing steps I would like to automate which I currently run manually trigge

Hi WickedGoat98
This sounds like a great design (obviously you have scale in mind 😉). Feel free to ask "stupid" questions; based on what you already wrote I doubt they will be.
A few questions that come to mind (probably a few others after):
You mentioned FS synchronization, from where? i.e. what is the single source of truth? K8s (Rancher 2.0 is basically a k8s manager) can take care of mounting volumes, so no need to sync, is this a valid solution?

BTW : (you can drag and drop an i...

4 years ago
0 Hello everyone, I am using a self-hosted ClearML server on EC2 (ClearML community AMIs). This EC2 instance is attached to S3 with an IAM role. Now if I create or upload data from the client side, I want it to be uploaded to S3. There is a way mentioned for mentio

can I mount the s3 bucket as file system on place where

you need to mount it where the file server is storing its files, correct (notice, not the DBs, just the file server)

one year ago
0 Hi all, I have a Python file build_pipeline that contains a PipelineController with one step only. When I try to run the file I get 'build_pipline.py': [Errno 2] No such file or directory on the WebUI. What am I doing wrong? Thanks!

SparklingElephant70, let me make sure I understand: the idea is to make sure the pipeline will launch a specific commit/branch, and that you can control it? Also, are you using the pipeline add_step function or are you decorating a function with PipelineDecorator?

3 years ago
0 Hi All

CooperativeFox72 a bit of info on how it works:
In "manual" execution (i.e. without an agent)

path = task.connect_configuration(local_path, name=name)

path = local_path, and the content of local_path is stored on the Task

In "remote" execution (i.e. agent)

path = task.connect_configuration(local_path, name=name)

"local_path" is ignored, path is a temp file, and the content of the temp file is the content that is stored (or edited) on the Task configuration.
Makes sense?
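
A small sketch of that behavior (file name and project/task names here are hypothetical):

from clearml import Task

task = Task.init(project_name="examples", task_name="config demo")  # hypothetical names

# Manual run: returns the same local path and stores its content on the Task.
# Remote run (under an agent): returns a temp file whose content is whatever
# is stored/edited in the Task's configuration section.
config_path = task.connect_configuration("config.yaml", name="my_config")
with open(config_path) as f:
    config_text = f.read()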

4 years ago
0 Hi! I am researching different MLOps libraries / platforms. I don't want to use platform-as-a-service solutions. Could you suggest what the main differences are between ClearML and MLflow? What are the advantages of using ClearML?

Hi RoundMosquito25
This is a bit old but probably a good start:
https://clear.ml/blog/stacking-up-against-the-competition/
tl;dr
ClearML advantages (at least a few I can think of)
Scales way better
Enables out-of-the-box experiment orchestration (i.e. remote execution etc.)
Data management
Nicer UI
Full REST API
Full MLOps platform
Model serving
Query-able model repository
Probably more 🙂

2 years ago
0 Hi all! When I set a list as a task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the following snippet so that you can reproduce it.

eval built-in. wdyt?

eval is never recommended, as basically you could do Args/float='os.system("rm ...")' 🙂
In theory the type is stored on the hyper-parameter (this is a relatively new feature the backend supports).
The casting, though, is done based on the original value type, which means Task.connect needs to be called with the original dict. Is there a specific reason for using get_parameters instead of task.connect?
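
A minimal sketch of the difference (the parameter names are made up; the 'General/' prefix reflects the default section name):

from clearml import Task

task = Task.init(project_name="examples", task_name="params demo")  # hypothetical names

params = {"layers": [64, 128, 256], "lr": 0.01}
# connect() keeps a reference to the original dict, so edited values
# can be cast back to the original Python types (list, float, ...)
task.connect(params)

# get_parameters() has no original object to cast against,
# so every value comes back as a string
raw = task.get_parameters()  # e.g. {'General/layers': '[64, 128, 256]', ...}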

3 years ago
0 Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by

. However, despite having imported the required types from the typing library in the script where the function decorated with PipelineDecorator.component is defined, later in the generated script the typing library is not imported outside the scope of the function

Actually the typing part is not passed to the "created step", because there are no global imports, for example:
def step(a: pd.DataFrame):
    import pandas as pd
    ...
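
One possible workaround, as a sketch only (not necessarily the thread's resolution): keep the heavy import inside the component and make the annotation a string, so the generated standalone script does not need the import at module level.

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(cache=True)
def step(a: "pd.DataFrame"):  # string annotation, not evaluated at module import time
    import pandas as pd  # imported inside the step, so the generated script has it in scope
    return a.describe()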

3 years ago
0 There is some specificity with the way we set up our environment at my company that prevents me from using the full features of

I want to inject a bash command after the repo has been cloned (and maybe even after the venv has been installed).

LazyTurkey38 the created venv inherits from the system environment, so in theory you can do all the installation on the system python and the created venv will just inherit the packages, no?
(btw: just to clarify, there is only one entry point for the custom bash script and that is before everything, so users can configure the container before the agent starts)

4 years ago
0 Hi there, I have a problem with PyJWT: I am using

Sure. JitteryCoyote63 so what was the problem? can we fix something?

4 years ago
0 Hello, does anybody here have much experience in creating sub-tasks or sub-pipelines? I'm not sure the concept is particularly well established but the docs mention:

using caching where specified but the pipeline page doesn't show anything at all.

What do you mean by "the pipeline page doesn't show anything at all"? Are you running the pipeline? How?
Notice PipelineDecorator.component needs to be top level, not nested inside the pipeline logic, like in the original example

@PipelineDecorator.component(
    cache=True,
    name=f'append_string_{x}',
)
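
For context, a minimal sketch of what "top level" means here (the names and the trivial step logic are made up):

from clearml.automation.controller import PipelineDecorator

# Component defined at module (top) level, outside the pipeline function
@PipelineDecorator.component(cache=True, name="append_string")
def append_string(x: str) -> str:
    return x + "_suffix"

@PipelineDecorator.pipeline(name="example pipeline", project="examples", version="1.0")
def pipeline_logic():
    print(append_string("hello"))

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # execute everything locally for a quick test
    pipeline_logic()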
2 years ago
0 I’m trying to use

LazyTurkey38
The last part makes sense; not sure I get the "if clone": we are calling execute_remotely, so I'm assuming we do not need to clone ourselves, but send the current Task.
Other than that yes, makes sense (BTW, assuming you have upgraded the server to >=1.0 you can just do mark_stopped, no need to reset)
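
Roughly, the calls being discussed (project, task, and queue names are placeholders):

from clearml import Task

task = Task.init(project_name="examples", task_name="remote run")  # hypothetical names

# execute_remotely() enqueues the current task and (by default) exits the
# local process, so there is no need to clone it manually first
task.execute_remotely(queue_name="default")

# On a server >= 1.0, a task can later simply be marked stopped instead of reset:
# task.mark_stopped()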

4 years ago
0 Hi there, I am running a clearml-agent in services mode (with Docker) on a machine with two disks: one with the OS (8GB, 91% space used) and one for the data (100GB, 40% space used). When executing the auto-scaler task in this agent, I get the following e

Maybe there is a setting in docker to move the space used to a different location?

Not that I know of...

I can simply increase the storage of the first disk, no problem with that

probably the easiest 🙂

But as you described it looks like an edge case, so I don’t mind 🙂

4 years ago
0 Unrelated problem (or is it?) ClearML's built-in cleanup service fails

Very odd, I still can't reproduce. This is just the cleanup service running without anything else ?
What's the clearml version it is using ?

3 years ago
0 Hello, I have been using ClearML interactive sessions for more than 3 months and I am facing random SSH disconnection errors in VSCode once in a while after creating the session. Sometimes reconnecting works; if it does not work I reconnect the clear

@<1699955693882183680:profile|UpsetSeaturtle37> can you try with the latest clearml-session (0.14.0)? I remember a few improvements there

The remote machine is in Azure behind the load-balancer, we are using docker images, so directly connecting to pods.

yeah, an LB in the middle might be introducing SSH hiccups. First upgrade to the latest clearml-session, it better configures the SSH client/server to support longer connection timeouts; if that does not work try the --keepalive=true
Le...

one year ago
0 ClearML plots question. There is a tiny problem with the experiment pages where the plots we create in the notebook are not saved as they were made. For example, we have a scatter plot with a red line y=x on top of the scatter plot, but in ClearML it is bl

Hi @<1541229818828296192:profile|HurtHedgehog47>

plots we create in the notebook are not saved as they were made.

I'm assuming these are matplotlib plots ?
Notice that ClearML tries to convert the plot into an interactive plot; in that process, sometimes colors and legends are lost (become generic).
You can however manually report the plot, and force it to store it as non-interactive:

task.logger.report_matplotlib_figure(
    title="Manual Reporting", series="Just a plot", ite...
2 years ago
0 Hi all! I might have found an issue with the migration guide.

Hi @<1556450111259676672:profile|PlainSeaurchin97>

While testing the migration, we found that all of our models had their MODEL URL set to the IP of the old server.

Yes all the artifacts/models/debug-samples are stored "as is" , this means that if you configured your original setup with IP, it is kind of stuck there, this is why it is always preferred to use host-name ...

you apparently also need to rename all model URLs

Yes 😞

2 years ago
0 Hi, is there a way to list all agents running on a host? I do not find a relevant one in clearml-agent -h.

And the agent continues running.

oh just kill all the processes with clearml-agent in the cmd line

pkill -9 -f clearml-agent
2 years ago