SweetBadger76
Moderator
1 Question, 239 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
Badges: 1
4 × Eureka!
0 Votes
8 Answers
2K Views
hello TartSeagull57. This is a bug introduced with version 1.4.1, for which we are working on a patch. The fix is currently in testing, and should be released ver...
3 years ago
0 Hi Everyone, Quick Question Regarding Minio And Logging:

oops yes, you are right. output_uri is used for the artifacts
for the logger it is https://clear.ml/docs/latest/docs/references/sdk/logger#set_default_upload_destination

btw what do you get when you do task.get_logger().get_default_upload_destination() ?
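If it helps, here is a minimal sketch of setting it from code (the s3 path below is just a placeholder):

` from clearml import Task

task = Task.init(project_name='examples', task_name='logger upload destination')

# placeholder destination; replace with your own bucket / fileserver
task.get_logger().set_default_upload_destination('s3://my-bucket/debug-samples')

print(task.get_logger().get_default_upload_destination()) `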

3 years ago
0 I’m Trying To Get A Copy Of A Model Through Clearml Which Is Stored In S3:

Hi BeefyHippopotamus73
did you manage to get rid of your issue ?

3 years ago
0 Hi Folks, Is There A Way To Force Clearml-Agent With --docker To

can you try to create an empty text file and provide its path to Task.force_requirements_env_freeze(force=True, requirements_file=your_empty_txt_file) ?

3 years ago
0 Hi, Bug Report. I Was Trying To Upload Data To S3 Via Clearml.Dataset Interface

Hi,
It would be great if you could also send your clearml package version 🙂

3 years ago
0 Hi, I Am Trying To Use The ParameterSet For Hyper-Parameter Tuning With Dependencies, An Example Of How I Use It: ParameterSet([{"Prm1": 1, "Prm2": 1}, {"Prm1": 2, "Prm2": 2}]) But I Get A Warning :

Just to keep you updated, as promised 🙂
we have found the bug and will release a fix ASAP. I will keep you updated on that too 🙂

3 years ago
0 Hi, I Have A Local Package That I Use To Train My Models. To Start Training, I Have A Script That Calls

You can force the agent to install only the packages that you need by using a requirements.txt file. Type in what you want the agent to install (pytorch, and possibly clearml). Then call this function before Task.init :
Task.force_requirements_env_freeze(force=True, requirements_file='path/to/requirements.txt')
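For instance, a hypothetical requirements.txt could contain just torch==1.10.0 and clearml, and the script would then look roughly like this (paths and names are placeholders):

` from clearml import Task

# must run before Task.init so the agent installs only what is listed in the file
Task.force_requirements_env_freeze(force=True, requirements_file='path/to/requirements.txt')

task = Task.init(project_name='examples', task_name='train with pinned requirements') `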

3 years ago
0 Hi, I Am Having A Problem With Clearml Running On Our Private Server. This Error Occurred On An Older Version Of Clearml And Server. Now After Update And Purge Of All Old Database With

Hey Igor
I am not an expert on this topic, but someone who knows it better will get back to you right after his meeting 🙂

3 years ago
0 Hey Guys, Is There An E2E Working Example Of Writing A Pipeline With 2-3 Tasks? Just A Hello World. I Am The First One Who Tries To Make Clearml Pipeline To Work, I Wasn't Able To Make It:

check that your tasks are enqueued in the queue the agent is listening to.
from the webUI, in your step's task, check the default_queue in the configuration section.
when you fire the agent, you should see a log line that also specifies which queue the agent is assigned to.
finally, in the webApp, you can check the Workers & Queues section. There you can see the agent(s), the queues they are listening to, and which tasks are enqueued in which queue
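For a hello-world reference, here is a rough sketch of a two-step pipeline (project and queue names are placeholders, and it assumes an agent is listening to the 'default' queue):

` from clearml import PipelineController

def step_one():
    # first step: produce something trivial
    return 42

def step_two(value):
    # second step: consume the output of step one
    print(f'got {value}')

pipe = PipelineController(name='hello-world-pipeline', project='examples', version='1.0.0')
pipe.set_default_execution_queue('default')  # the queue your agent listens to

pipe.add_function_step(name='step_one', function=step_one, function_return=['value'])
pipe.add_function_step(name='step_two', function=step_two,
                       function_kwargs={'value': '${step_one.value}'},
                       parents=['step_one'])

# pipe.start_locally(run_pipeline_steps_locally=True)  # to debug everything locally
pipe.start(queue='services')  # enqueue the pipeline controller itself `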

3 years ago
0 Hi Community, Is There A Way To Download All The Logged Scalars/Plots Using Code Itself?

hey TenderCoyote78
Here is an example of how to dump the plots to jpeg files

` from clearml.backend_api.session.client import APIClient
from clearml import Task
import plotly.io as plio

task = Task.get_task(task_id='xxxxxx')

client = APIClient()

t = client.events.get_task_plots(task=task.id)

for i, plot in enumerate(t.plots):
    fig = plio.from_json(plot['plot_str'])
    plio.write_image(fig=fig, file=f'./my_plot_{i}.jpeg') `
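For the scalars part of the question, a short sketch along the same lines (same placeholder task id); get_reported_scalars returns a nested dict that you can dump however you like:

` import json
from clearml import Task

task = Task.get_task(task_id='xxxxxx')

# roughly {graph_title: {series_name: {'x': [...], 'y': [...]}}}
scalars = task.get_reported_scalars()

with open('./my_scalars.json', 'w') as f:
    json.dump(scalars, f) `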

3 years ago
0 Hey,

hi WickedElephant66
you can log your models as artifacts on the pipeline task, from any pipeline step. Have a look here :
https://clear.ml/docs/latest/docs/pipelines/pipelines_sdk_tasks#models-artifacts-and-metrics
I am trying to find you an example, hold on 🙂
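In the meantime, a rough sketch of the idea, based on the PipelineController class methods described on that page (method names and signatures should be double-checked there; names and paths below are placeholders):

` from clearml import PipelineController

def training_step():
    # ... train and save a model locally (placeholder path) ...
    # called from inside a pipeline step, these log to the parent pipeline task
    PipelineController.upload_model(model_name='my_model', model_local_path='./model.pkl')
    PipelineController.upload_artifact(name='training_stats', artifact_object={'loss': 0.1}) `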

3 years ago
0 Hey,

To provide an upload destination for the artifact, you can:
add the parameter output_uri to Task.init ( https://clear.ml/docs/latest/docs/references/sdk/task#taskinit )
set the destination in clearml.conf : sdk.development.default_output_uri ( https://clear.ml/docs/latest/docs/configs/clearml_conf#sdkdevelopment )
To enqueue the pipeline, you simply call it, without run_locally or debug_pipeline.
You will have to provide the parameter execution_queue to your steps, or defau...
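A minimal sketch of the first option (the bucket path is a placeholder; output_uri=True would fall back to the clearml.conf default):

` from clearml import Task

task = Task.init(
    project_name='examples',
    task_name='artifact upload destination',
    output_uri='s3://my-bucket/clearml',  # placeholder destination
) `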

3 years ago
0 Hi, I Have A Local Package That I Use To Train My Models. To Start Training, I Have A Script That Calls

Hi
could you please share the logs for that issue (without the cred 🙂 ) ?

3 years ago
0 Since V1.4.0, Our

this is because the server is treated as a bucket too, the root to be precise. Thus you will always have at least one subfolder created in local_folder, corresponding to the bucket found at the server root

3 years ago
0 Since V1.4.0, Our

Hi UnevenDolphin73
I am going to try to reproduce this issue, thanks for the details. I'll keep you updated

3 years ago
0 Hi We Are Getting The Following Error When We Are Trying To Run A Task On Our On Premis

btw can you screenshot your clearml-agent list and UI please ?

3 years ago
0 Hey, So I'm Trying To Upload An Artefact To Clearml's Fileserver (I Have A Self Hosted Clearml Server Running), I've Uploaded The File Using StorageManager.upload_file(path, url) And Giving The Url As “

you can specify the destination of the upload like this :
when you initiate a task, you can set the parameter output_uri. If you set it to True, then the model will be uploaded to the uri specified in your conf file. You can also directly specify a url, or you can use OutputModel.set_default_upload_uri or set_upload_destination ( https://clear.ml/docs/latest/docs/references/sdk/model_outputmodel#outputmodelset_default_upload_uri or https://clear.ml/docs/latest/docs/references/sdk/model_...
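A rough sketch of the OutputModel route mentioned above (the destination URL is a placeholder; see the two links for the exact signatures):

` from clearml import Task, OutputModel

task = Task.init(project_name='examples', task_name='model upload destination')

output_model = OutputModel(task=task)
# placeholder destination; could be your fileserver or an s3:// bucket
output_model.set_upload_destination('http://my-clearml-server:8081') `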

3 years ago
0 Need

i don't know if it will help, but here is what i would test:
temporarily remove the task init in the controller
use name and project parameters that don't have spaces in their names
don't use services as the default queue

3 years ago
0 Hi. When Using The Logger'S

In the meantime, it is also possible to create a figure that contains 2+ histograms and then report it to the logger using report_plotly.
You can have a look there :
https://plotly.com/python/histograms/#overlaid-histogram
https://plotly.com/python/histograms/#stacked-histograms

` import numpy as np
import plotly.graph_objects as go

# assumes an existing ClearML task, e.g. task = Task.init(...)
log = task.get_logger()

x0 = np.random.randn(1500)
x1 = np.random.randn(1500) - 1

fig = go.Figure()

fig.add_trace(go.Histogram(y=x0))
fig.add_trace(go.Histogram(y=x1))

# overlay both histograms in a single figure
fig.update_layout(barmode='overlay')

# report the combined figure to the ClearML logger (title/series names are arbitrary)
log.report_plotly(title='histograms', series='overlaid', iteration=0, figure=fig) `

3 years ago
0 Hey,

can you share your logs ?

3 years ago
0 Hi, I Have A Local Package That I Use To Train My Models. To Start Training, I Have A Script That Calls

you can freeze your local env and thus get the full list of installed packages. With pip (on linux) it would be something like this :
pip freeze > requirements.txt
(doc here https://pip.pypa.io/en/stable/cli/pip_freeze/ )

3 years ago
0 Hello Community! How I Can Add S3 Credentials To S3 Bucket In Example.Env For Clearml-Serving-Triton? I Need To Add Bucket Name, Keys And Endpoint

hi AbruptHedgehog21
which s3 service provider will you use ?
do you have a precise list of the variables you need to add to the configuration to access your bucket ? 🙂

3 years ago
0 Hey,

regarding the file extension, it should not be a problem

3 years ago
0 Hi, I Am Trying The TriggerScheduler To Catch When A User Adds A Specific Tag To A Task. I Used The Below Code But The schedule_function Is Not Called When Adding Tags To A Task (It Seems The task.last_update Is Not Modified After Adding A Tag)

hey ApprehensiveSeahorse83
can you please check that the trigger is correctly added ? Simply retrieve the return value of add_task_trigger
res = trigger.add_task_trigger( .....
print(f'Trigger correctly added ? {res}')
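For reference, a rough sketch of the full setup this check fits into; the parameter names (trigger_on_tags in particular) and the callback signature are from memory, so please double-check them against the TriggerScheduler docs:

` from clearml.automation import TriggerScheduler

def schedule_function(task_id):
    # assumed callback signature: receives the id of the task that fired the trigger
    print(f'tag added on task {task_id}')

trigger = TriggerScheduler(pooling_frequency_minutes=3)

res = trigger.add_task_trigger(
    name='tag-trigger',                    # placeholder trigger name
    schedule_function=schedule_function,
    trigger_project='examples',            # placeholder project
    trigger_on_tags=['ready'],             # placeholder tag list (assumed parameter name)
)
print(f'Trigger correctly added ? {res}')

trigger.start() `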

3 years ago
0 I’m Trying To Get The Meta-Information About The Code (Section Execution) To Be Auto-Filled, However When I Run The Script With The PyCharm Test Runner, It Is Missing. If I Use

it is a bit old - i recommend you to test again with the latest version, 1.4.1
can you please give me some more details about what you intend to do ? it would then be easier to reproduce the issue

3 years ago
0 Since V1.4.0, Our

the fact that the minio server is called a "bucket" in the doc is for sure confusing. i will check the reason for this choice, and also why we don't begin to build the structure from the bucket (the real one).
i'll keep you updated

3 years ago
0 Hi, I Have A Local Package That I Use To Train My Models. To Start Training, I Have A Script That Calls

hey H4dr1en
you just specify the packages that you want to be installed (no need to specify the dependencies), and the version if needed.
Something like :

pytorch==1.10.0

3 years ago
0 Upload_Artifact Not Working With Minio

Also, change this line of the conf file to false:

development {
    # Development-mode options

    # dev task reuse window
    task_reuse_time_window_in_hours: 72.0

    # Run VCS repository detection asynchronously
    vcs_repo_detect_async: true      # <== change to false
}
3 years ago