RoundElephant20
Moderator
0 Questions, 25 Answers
  Active since 12 September 2024
  Last activity 5 months ago

Reputation: 0

0 Hello, I Am Using The Clearml Integration With Ultralytics. I Have Very Simple Code

The automagic integration will register the original model as an artifact, so the model that gets registered is the original one. You can upload the model to your task as a model (so it will get a model ID, like in this example) or as a regular artifact (like in this example).
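
For reference, a minimal sketch of the two options with the ClearML Python SDK (the project, task, and file names below are placeholders):

    from clearml import Task, OutputModel

    task = Task.init(project_name="examples", task_name="manual model upload")

    # Option 1: upload as a model, so it gets its own model ID in the model registry
    output_model = OutputModel(task=task, name="my_model")
    output_model.update_weights(weights_filename="my_model.pt")

    # Option 2: upload as a regular artifact attached to the task
    task.upload_artifact(name="model_file", artifact_object="my_model.pt")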

3 months ago
0 Hello, Is It Possible To Upload Artifacts Using The Rest Api? It Seems Like

Can I suggest using the SDK? It will do both: log the artifact to the task and upload it to any storage you want, like in this example
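
A minimal sketch of that flow, assuming the upload destination is set through output_uri (the bucket and file names are placeholders):

    from clearml import Task

    # output_uri sets where uploads go: the ClearML file server, S3, GCS, Azure, ...
    task = Task.init(
        project_name="examples",
        task_name="upload artifact via SDK",
        output_uri="s3://my-bucket/artifacts",  # placeholder bucket
    )

    # Registers the artifact on the task and uploads the file in one call
    task.upload_artifact(name="results", artifact_object="results.csv")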

5 months ago
0 Hi All, Just Started Tinkering With Clearml And Wondering If This Is Suitable To Deploy An Etl/Elt Pipeline Using The Agent On My 2Nd Machine? Or Is Clearml Designed To Pick Up After The Data Has Been Loaded And Cleaned? I Cant See Any Mention Of Feature

Hi @<1773158059758063616:profile|PanickyParrot17> ,

You can do that with ClearML pipelines. Step 1 will pull the data, step 2 will store the data, and step 3 will create the dataset.
The pipeline controller can take parameters such as which data to pull, the name of the created dataset and more, so you can run it again from the UI and just change the data source.

All the steps can run with the clearml agent, and you can also specify using o...
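
A rough sketch of such a pipeline with PipelineController.add_function_step (the step bodies, names, default data source, and queue are placeholders, not a working ETL):

    from clearml import PipelineController

    def pull_data(source: str):
        # placeholder: pretend we pulled some rows from `source`
        return [1, 2, 3]

    def store_data(raw):
        # placeholder: pretend we cleaned/stored the data
        return raw

    def create_dataset(stored):
        print(f"creating dataset from {len(stored)} rows")

    pipe = PipelineController(name="etl-pipeline", project="examples", version="1.0.0")
    pipe.add_parameter(name="data_source", default="s3://my-bucket/raw")

    pipe.add_function_step(
        name="pull", function=pull_data,
        function_kwargs={"source": "${pipeline.data_source}"},
        function_return=["raw"],
    )
    pipe.add_function_step(
        name="store", function=store_data,
        function_kwargs={"raw": "${pull.raw}"},
        function_return=["stored"],
    )
    pipe.add_function_step(
        name="dataset", function=create_dataset,
        function_kwargs={"stored": "${store.stored}"},
    )

    # the controller and its steps are executed by clearml-agent(s)
    pipe.start(queue="default")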

2 months ago
0 Hi Folks! I Am Creating A Pipeline Using Decorators. The First Step Of The Pipeline Is Downloading And Processing The Dataset. Inside The Step Function, There Is A Clearml Import: From Clearml Import Dataset. However, For Some Reason In The Created Envir

Hi @<1790552666904989696:profile|GlamorousDog61> ,

Can you update clearml (pip install -U clearml) to the latest version?
Are you using add_function_step to add steps to the pipeline controller? If so, did you specify the packages with the packages parameter?
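
For context, a minimal sketch of passing packages to add_function_step (the step function and dataset name are placeholders):

    from clearml import PipelineController

    def load_dataset(dataset_name: str):
        # the import happens inside the step, so `clearml` must be available
        # in the step's environment, hence the `packages` argument below
        from clearml import Dataset
        return Dataset.get(dataset_name=dataset_name).id

    pipe = PipelineController(name="demo", project="examples", version="1.0.0")
    pipe.add_function_step(
        name="load",
        function=load_dataset,
        function_kwargs={"dataset_name": "my_dataset"},  # placeholder dataset name
        function_return=["dataset_id"],
        packages=["clearml"],  # explicit requirements for the step's environment
    )
    pipe.start_locally(run_pipeline_steps_locally=True)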

one month ago
0 Hi, I'M Working On Our Ml Project Using Clearml For Pipeline Management. I Have A Separate Function For Data Preparation That I'D Like To Use In The Clearml Pipeline. To Keep The Pipeline Script Clean, I Prefer Not To Define This Function Directly Within

Hi @<1669152726245707776:profile|ManiacalParrot65> ,

Yes, you can wrap the separate function with a decorator so the function will run as a separate step in the pipeline, and you can even cache the step across multiple runs.

You can also add the function, without a decorator, as a step in the pipeline with PipelineController.add_function_step().

You can read about it [here](https://clear.ml/docs/latest/docs/pip...
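
A minimal sketch of the decorator approach (the function body, names, paths and the caching choice are placeholders):

    from clearml import PipelineDecorator

    @PipelineDecorator.component(cache=True, return_values=["prepared_path"])
    def prepare_data(raw_path: str):
        # your existing data-preparation logic goes here; this body is a placeholder
        return raw_path + ".prepared"

    @PipelineDecorator.pipeline(name="data-prep", project="examples", version="1.0.0")
    def run_pipeline(raw_path: str = "data/raw.csv"):
        prepared = prepare_data(raw_path)
        print("prepared:", prepared)

    if __name__ == "__main__":
        PipelineDecorator.run_locally()  # drop this line to run the steps on agents
        run_pipeline()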

5 months ago
0 When Running A Data Processing Task With Joblib.Parallel, The Print Statements Don'T Appear In The Clearml Console. Also The

Hi @<1774245260931633152:profile|GloriousGoldfish63> ,

Which clearml version are you using? Can you add a snippet of the code? I tried to reproduce it but didn't succeed.

one month ago
0 Do You Know How To Pass Args To Python Script Through Clearml-Task Without --Args? Because I Am Using "Click" Library For Parsing Args And When I Write Clearml-Task --Script Main.Py --Args "Input_Path=/Home" I See That Clearml Launches That As "Running Ta

Hi @<1742355077231808512:profile|DisturbedLizard6> ,

Currently, only argparse arguments are supported for clearml-task; click is also supported, but for now only through the Python SDK.
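
For context, a minimal sketch of the SDK route with click, assuming a recent clearml version where click parameters are auto-captured (the project/task names and option are placeholders):

    import click
    from clearml import Task

    @click.command()
    @click.option("--input-path", default="/home", help="where to read the input from")
    def main(input_path: str):
        # Task.init picks up the click parameters, so they appear in the task's
        # configuration and can be overridden when the task is cloned and enqueued
        task = Task.init(project_name="examples", task_name="click-args demo")
        print("input path:", input_path)

    if __name__ == "__main__":
        main()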

4 months ago
0 Hi, Is It A Well Known Issue That Once You Upload An Artifact With The Prefix Of "Data_" To A Task, You Cannot Fetch The Task Since Clearml Sees It As A Data Logging?

Hi @<1594863230964994048:profile|DangerousBee35> ,

I ran this:


    from clearml import Task
    import pandas as pd
    task = Task.init(project_name='examples', task_name='Artifacts with data_')

    df = pd.DataFrame(
        {
            'num_legs': [2, 4, 8, 0],
            'num_wings': [2, 0, 0, 0],
            'num_specimen_seen': [10, 2, 1, 8]
        },
        index=['falcon', 'dog', 'spider', 'fish']
    )

    # Register Pandas object as artifact to watch
    # (it will be mon...
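
For reference, fetching the task back and reading the artifact would look roughly like this (the artifact name below is a placeholder, since the registration call above is cut off):

    from clearml import Task

    fetched = Task.get_task(project_name="examples", task_name="Artifacts with data_")
    df_back = fetched.artifacts["data_frame"].get()  # placeholder artifact name
    print(df_back.head())
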
one month ago
0 Hi, Is It A Well Known Issue That Once You Upload An Artifact With The Prefix Of "Data_" To A Task, You Cannot Fetch The Task Since Clearml Sees It As A Data Logging?

Hi @<1594863230964994048:profile|DangerousBee35> , I can reproduce the issue, will keep you posted about it

one month ago
0 I'M Setting

Hi @<1613344994104446976:profile|FancyOtter74> , it's high on our list, will keep you posted once it's released

one month ago
0 Hi Everyone, Documentation Stands That

@<1691983266761936896:profile|AstonishingOx62> agreed, will push it forward

2 months ago
0 Hello, In My Teams Workspace (In The Web Ui) If I Remove A User/Delete It, Do I Lose The Data That User Created In That Workspace (Experiments, Artifacts, Etc). Thanks

Hi @<1523701295830011904:profile|CluelessFlamingo93> , no, all the data will stay the same, but the user will be deactivated and won't have access to the workspace (including the credentials, experiments, artifacts, datasets and all the other parts)

one month ago
0 I'M Setting

Hi @<1613344994104446976:profile|FancyOtter74> , I’m getting the same, will keep you posted once a fixed version is out

2 months ago
0 Hi There, I’Ve Run Into The Following Error When Trying To Enqueue A New Task:

Hi @<1745616566117994496:profile|FantasticGorilla16> , it looks like this error is coming from Elasticsearch. Can you check the disk space you allocated and see if it's full?

one month ago
0 Hello, Any Idea How To Log Tables With Hyperlinks? When Logging Dataframes Using

Hi @<1774245260931633152:profile|GloriousGoldfish63> , checking it

one month ago
0 Hi, I Have A Question About The Model Registry. Here'S My Situation: I'M Using K8S_Example And Struggling With Uploading A Model. Should Models Be Uploaded To The Fileserver, Or Should I Create Another S3 Bucket As Mentioned In The Documentation?

Hi @<1742355077231808512:profile|DisturbedLizard6> , not sure I get that, did you use torch.save (like in here) or some other command to save the models? When running with the clearml-agent, you get a printout of all the configurations at the beginning of the log; can you verify your values are as you configured them?

Additionally, which version of clearml, clearml-agent and `...
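
As background, the upload destination for saved models is usually picked with the output_uri argument; a minimal sketch, assuming torch.save and framework auto-logging, with a placeholder bucket:

    from clearml import Task

    task = Task.init(
        project_name="examples",
        task_name="model upload destination",
        output_uri="s3://my-bucket/models",  # or output_uri=True for the ClearML file server
    )

    # torch.save() calls after this point are auto-captured and uploaded to output_uri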

5 months ago
0 Hello, I Am Using The Clearml Integration With Ultralytics. I Have Very Simple Code

Hi @<1644147961996775424:profile|HurtStarfish47> ,

ClearML will automatically upload your model with the original name and data; if I'm not mistaken, best.pt is the default name given by the train function.

You can rename it after the training and upload it, something like:

import shutil

# Rename best model checkpoint after training
shutil.move("runs/train/my_model/weights/best.pt", "my_model.pt")

# upload with the StorageManager
model_path = "my_model.pt"
# Define your...
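
The snippet above is cut off; as a labeled guess at the upload step it is heading toward, StorageManager.upload_file could be used like this (the destination URI is a placeholder):

    from clearml import StorageManager

    uploaded_url = StorageManager.upload_file(
        local_file="my_model.pt",
        remote_url="s3://my-bucket/models/my_model.pt",  # placeholder destination
    )
    print("uploaded to:", uploaded_url)
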
3 months ago
0 Hi, I'Ve Run Into A Problem And Would Appreciate Some Help. I Installed Clearml Locally. When I Run A New Task On A Remote Server And In The Python Training Code I Set It To Only Train On One Gpu. Everything Works Fine And I See All The Scalars Automatica

Hi @<1779681046892122112:profile|EnviousHare17> and @<1774969995759980544:profile|SmoggyGoose12> ,

I ran this code example:

# ClearML - Example of pytorch with tensorboard>=v1.14
#
from __future__ import print_function

import argparse
import os
from tempfile import gettempdir

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torchvision import datasets, transforms
from torch.autograd import Variable
from torch.utils.tensorboard im...
one month ago
0 Hi, I'Ve Run Into A Problem And Would Appreciate Some Help. I Installed Clearml Locally. When I Run A New Task On A Remote Server And In The Python Training Code I Set It To Only Train On One Gpu. Everything Works Fine And I See All The Scalars Automatica

@<1774969995759980544:profile|SmoggyGoose12> how do you report the scalars? With the TensorBoard SummaryWriter?

In the UI, if you click the eye symbol, do you only see the monitoring options?
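
For context, this is roughly the reporting pattern being asked about: scalars written through a TensorBoard SummaryWriter while a task is active are picked up automatically (a minimal sketch; names are placeholders):

    from clearml import Task
    from torch.utils.tensorboard import SummaryWriter

    task = Task.init(project_name="examples", task_name="tb scalars")
    writer = SummaryWriter(log_dir="runs/demo")

    for step in range(10):
        writer.add_scalar("train/loss", 1.0 / (step + 1), global_step=step)

    writer.close()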

one month ago
0 My Current Training Setup Is A Hyperparameter Optimization Using The Tpesampler From Optuna. For Configuration We Use Hydra. There Is A Very Nice Plugin That Let'S You Define The Hyperparameters In The Config Files Using The

Hi @<1577468611524562944:profile|MagnificentBear85>

Thank you for bringing this up! We’re always excited to see contributions from the community, especially around areas like hyperparameter configuration. We’d be happy to consider a PR if you’re open to working on it! Our team encourages contributions 🙂

Did you check the relevant examples from our docs?
[None](https://c...

3 months ago
0 Hi Everyone, Documentation Stands That

Hi @<1691983266761936896:profile|AstonishingOx62> , I think it's an issue related to the schema generated by the SDK; can you try passing _allow_extra_fields_ as True with the same call?

2 months ago
0 Hello, Is It Possible To Upload Artifacts Using The Rest Api? It Seems Like

With the API you can register an artifact on a task, but the upload is done separately with the ClearML SDK (the SDK wraps the registration and upload, with some other things, inside the upload_artifact function).

4 months ago
0 Hello, Is It Possible To Upload Artifacts Using The Rest Api? It Seems Like

Hi @<1747066118549278720:profile|WhoppingToad71> , can you share the use case? Do you want to upload the file to some storage, or attach it to a task?

5 months ago