SweetBadger76
Moderator
1 Question, 239 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0

Badges (1): 4 × Eureka!

0 Votes 8 Answers 1K Views
Hello TartSeagull57. This is a bug introduced in version 1.4.1, for which we are working on a patch. The fix is currently in test and should be released ver...
2 years ago
0 Hi Everyone! Does Anyone Know If It Is Possible To Change The

Hi NonsensicalWoodpecker96
you can use the SDK 🙂

from clearml import Task

task = Task.init(project_name=project_name, task_name=task_name)
task.set_comment('Hi there')  # sets the task's comment / description field

2 years ago
0 Hi Everyone, I’ve Been Using Clearml For A While Now And I Wanted To Add The Option To Execute My Code Remotely As A Command Line Argument. I Have The Clearml Agents And Queues Set Up, And That Seems To Be Working Correctly (Cloning And Running Experiment

Hi SteepDeer88
I wrote this script to try to reproduce the error. I am passing more than 50 parameters and so far everything works fine. Could you please give me some more details about your issue, so that we can reproduce it?

from clearml import Task
import argparse

'''
COMMAND LINE:
python -m my_script --project_name my_project --task_name my_task --execute_remotely true --remote_queue default --param_1 parameter...
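
For reference, here is a minimal sketch of what such a reproduction script could look like. It is illustrative, not the original script (which is truncated above); the argument names match the command line shown, the defaults are placeholders:

from clearml import Task
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--project_name", default="my_project")
parser.add_argument("--task_name", default="my_task")
parser.add_argument("--execute_remotely", default="false")
parser.add_argument("--remote_queue", default="default")
parser.add_argument("--param_1", default="parameter_1")
# ...add as many extra parameters as needed
args = parser.parse_args()

# Task.init captures the argparse arguments automatically as task parameters
task = Task.init(project_name=args.project_name, task_name=args.task_name)

if args.execute_remotely.lower() == "true":
    # stop local execution and enqueue the task on the requested queue
    task.execute_remotely(queue_name=args.remote_queue)

print("running from here, either locally or on the agent")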

2 years ago
0 Hi I'm Looking Into How Clearml Supports Datasets And Dataset Versioning And I'm A Bit Confused. Is Dataset Versioning Not Supported At All In The Non-Enterprise Or Is Versioning Available By A Different Mechanism? I See That

Hi PanickyMoth78
There is indeed a versioning mechanism available in the open source version 🎉

Datasets keep track of their "genealogy", so you can easily access the version you need through its ID.

In order to create a child dataset, you simply have to use the "parent_datasets" parameter when you create your dataset: have a look at
https://clear.ml/docs/latest/docs/clearml_data/clearml_data_sdk#datasetcreate

You can also alternatively squash datasets together to create a c...
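
A minimal sketch of that child-dataset workflow (the project name, dataset name and paths are placeholders):

from clearml import Dataset

# first version of the dataset
parent = Dataset.create(dataset_project="my_project", dataset_name="my_dataset")
parent.add_files("./data/v1")
parent.upload()
parent.finalize()

# child version that inherits the parent's content
child = Dataset.create(
    dataset_project="my_project",
    dataset_name="my_dataset",
    parent_datasets=[parent.id],
)
child.add_files("./data/v2")
child.upload()
child.finalize()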

2 years ago
0 Need

I don't know if it will help, but here is what I would test:
- temporarily remove the task init in the controller
- use name and project parameters that don't contain spaces
- don't use services as the default queue

2 years ago
0 Hey Guys! Has Anyone Ever Seen An Error Like This? I'm Using My Code In A

Hey SmugSnake6
Can you give some more details about your configuration, please? (clearml, agent, and server versions)
Also, if you have some example code to share, it could help us reproduce the issue and thus help you a lot faster 🙂 (script, command line for firing your agent)

2 years ago
0 Need

I see some points that you should fix:
- in the train step, you return 2 items but only one is declared in its decorator: add mock to the declared return values (see the sketch below)
- do you really need to init a task in the pipeline controller? You will automatically get one when executing the pipeline
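
A hedged sketch of what that decorator fix could look like; the step and return-value names come from the thread, everything else is illustrative:

from clearml.automation.controller import PipelineDecorator

# declare every returned item in return_values, in the same order as the return statement
@PipelineDecorator.component(return_values=["model", "mock"])
def train(n_samples):
    model = {"weights": [0.0] * n_samples}  # stand-in for the real model object
    mock = {"n_samples": n_samples}         # stand-in for the second returned item
    return model, mock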

2 years ago
0 Hey Guys, Is There An E2E Working Example Of Writing A Pipeline With 2-3 Tasks? Just A Hello World. I Am The First One Who Tries To Make Clearml Pipeline To Work I Wasn't Able To Make It:

You are in a regular execution - I mean not a local one - so the different pipeline tasks have been enqueued. You simply need to fire an agent to pull the enqueued tasks. I would advise you to specify the queue in the steps (parameter execution_queue). You then fire your agent:
clearml-agent daemon --queue my_queue
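
A minimal sketch of how the execution_queue parameter can be set on pipeline steps (the pipeline, step functions and queue name below are placeholders, not the original code):

from clearml import PipelineController

def prepare(n):
    return list(range(n))

def train(values):
    return sum(values)

pipe = PipelineController(name="my_pipeline", project="examples", version="1.0")
pipe.add_function_step(
    name="prepare",
    function=prepare,
    function_kwargs={"n": 10},
    function_return=["values"],
    execution_queue="my_queue",  # the agent listening to my_queue will run this step
)
pipe.add_function_step(
    name="train",
    function=train,
    function_kwargs={"values": "${prepare.values}"},
    execution_queue="my_queue",
)
pipe.start()  # the controller itself is enqueued on the default services queue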

2 years ago
0 Hey,

When you spin up a container, you map a host port to a container port using the -p parameter:
docker run -v ~/clearml.conf:/root/clearml.conf -p 8080:8080 -e CLEARML_SERVING_TASK_ID=<service_id> -e CLEARML_SERVING_POLL_FREQ=5 clearml-serving-inference:latest
Here you map your computer's port 8080 to the container's port 8080. If port 8080 is already in use on your machine, you can use another one, for example -p 8081:8080

2 years ago
0 Hey,

Can you share your logs?

2 years ago
0 Hey,

Can you run docker ps to check whether there are running containers that already bind the port?

2 years ago
0 Hey,

I managed to import a custom package the same way you did: I added the current dir path to my system path.
I have a 2-step pipeline:

  1. Run a function from a custom package. This function returns a DataLoader (built from torchvision.MNIST).
  2. This step receives the DataLoader built in the first step as a parameter; it shows random samples from it.

There was no error returning the DataLoader at the end of step 1 and importing it in step 2. Here is my code:

from clearml import Pi...
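
For reference, a minimal sketch of such a 2-step decorator pipeline passing a DataLoader between steps; it is an illustrative reconstruction (not the original, truncated code) and assumes torch/torchvision are installed:

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["dataloader"])
def step_one(batch_size):
    # build a DataLoader from torchvision.MNIST
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms
    dataset = datasets.MNIST(root="./data", train=True, download=True, transform=transforms.ToTensor())
    return DataLoader(dataset, batch_size=batch_size, shuffle=True)

@PipelineDecorator.component()
def step_two(dataloader):
    # show a few samples from the DataLoader returned by step_one
    images, labels = next(iter(dataloader))
    print(images.shape, labels[:8])

@PipelineDecorator.pipeline(name="mnist_pipeline", project="examples", version="0.1")
def run_pipeline(batch_size=32):
    dataloader = step_one(batch_size)
    step_two(dataloader)

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # run all steps in the local process for a quick test
    run_pipeline()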

2 years ago
0 Since V1.4.0, Our

Do you think you could send us a bit of code so that we can better understand how to reproduce the bug? In particular, how you use dotenv...
So far, something like the following works normally with both clearml 1.3.2 & 1.4.0:

from clearml import Task, StorageManager
import os

task = Task.init(project_name=project_name, task_name=task_name)

img_path = os.path.normpath("**/Images")
img_path = os.path.join(img_path, "*.png")

print("==> Uploading to Azure")
remote_url = "azure://****.blob.core.windows.net/*****/"
StorageManager.uplo...

2 years ago
0 Since V1.4.0, Our

On the other hand, when you browse your MinIO console, all the buckets are shown as directories, right? There is no file in the root dir. So we used the same logic and decided to reproduce that very same structure: when you browse the local_folder, you will have the same structure as shown in the console.

2 years ago
0 Hi, I Am Getting This Error When Using The Aws Auto_Scaler Service (With The Pro Version):

Hi,
We are going to try to reproduce this issue and will update you asap

2 years ago
0 Since V1.4.0, Our

Interesting. We are opening a discussion to weigh the pros and cons of those different approaches; I'll of course keep you updated.
Could you please open a GitHub issue about that topic? 🙏
http://github.com/allegroai/clearml/issues

2 years ago
0 Hey Guys, Is There An E2E Working Example Of Writing A Pipeline With 2-3 Tasks? Just A Hello World. I Am The First One Who Tries To Make Clearml Pipeline To Work I Wasn't Able To Make It:

Check that your tasks are enqueued in the queue the agent is listening to:
- from the web UI, in your step's task, check the default_queue in the configuration section
- when you fire the agent, you should see a log line that specifies which queue the agent is assigned to
- finally, in the web app, you can check the Workers & Queues section. There you can see the agent(s), the queue they are listening to, and which tasks are enqueued in which queue

2 years ago
0 Hi, Having Problems During Credentials Verification In Clearml-Init. Server Installed On A Separate Machine, Web Ui Is Accessible Where I Generated Credentials For Api. Here Is The Log. Don't Know Why I'm Getting Http 403. Detected Credentials Key="Wrdif3

I am also speaking with another user this morning who has the very same issue.
Can you give me some more details about your config and share your error logs, please?

2 years ago
0 Hi All! Is There A Function/Command That Returns Number Of Tasks In A Queue?

One agent is assigned to one queue, so it will execute one task at a time, sequentially, according to their rank in the queue. But you can create as many queues as you want and assign an agent to each one. You simply fire each agent from a terminal with a command like:
clearml-agent daemon --queue my_queue_i
If you have more than one GPU, you can also choose which GPU(s) to allocate to each agent.

2 years ago
0 Hi All! I Trying To Organize My Workflow With Clearml, And I Found Out About Datasets. I Like The Concept And I Wonder If I Can Connect A Dataset To A Task / Experiment? Currently The Dataset Appears As Another Task In The Project Page. Thanks!

You can initiate your task as usual. When a dataset is used in it - for example, if the task starts by retrieving it using Dataset.get - the dataset will be registered in the task's Info section (check in the UI) 😊
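
A minimal sketch of that pattern (project and dataset names are placeholders):

from clearml import Task, Dataset

task = Task.init(project_name="my_project", task_name="training")

# retrieving the dataset inside the task registers it under the task's Info section
dataset = Dataset.get(dataset_project="my_project", dataset_name="my_dataset")
data_dir = dataset.get_local_copy()  # local, read-only copy of the dataset files
print(data_dir)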

2 years ago
0 Hi, I Have A Local Package That I Use To Train My Models. To Start Training, I Have A Script That Calls

You can freeze your local env and thus get all the installed packages. With pip (on Linux) it would be something like this:
pip freeze > requirements.txt
(docs here: https://pip.pypa.io/en/stable/cli/pip_freeze/ )

2 years ago
0 Hi, That I'm Running The Line Dataset = Clearml.Dataset.Get (Dataset_Project = 'Datasets', Dataset_Tags = ....) I Get: File "/Root/.Clearml/Venvs-Builds/3.8/Lib/Python3.8/Site-Packages/Clearml/Datasets/Dataset.Py", Line 1534, In Get Dataset_Id = Cls

What bothers me is that it worked until yesterday and you didn't change your code. So the only thing I can think of is a bug introduced with the new SDK version that was released yesterday. I am investigating with the SDK team and will keep you updated asap! 🙂

2 years ago
0 Hi, That I'm Running The Line Dataset = Clearml.Dataset.Get (Dataset_Project = 'Datasets', Dataset_Tags = ....) I Get: File "/Root/.Clearml/Venvs-Builds/3.8/Lib/Python3.8/Site-Packages/Clearml/Datasets/Dataset.Py", Line 1534, In Get Dataset_Id = Cls

Is it a task or a dataset that you are trying to access? If you need to retrieve a task, you should use Task.get_task().

If I do this:
ds = Dataset.create(dataset_project='datasets', dataset_name='dataset_0')
it will result in the creation of 2 experiments:
- results page: the task that corresponds to the script that launched the dataset creation - it will be in PROJECTS/datasets/.datasets/dataset_0
- dataset page: the dataset itself - it will be in DATASETS/dataset_0
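
A minimal sketch of the Task.get_task retrieval mentioned above (the ID, project and task names are placeholders):

from clearml import Task

# retrieve an existing task either by its ID...
task = Task.get_task(task_id="<your_task_id>")
# ...or by project and name
task = Task.get_task(project_name="my_project", task_name="my_task")
print(task.id, task.get_status())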

2 years ago
0 Hi We Are Getting The Following Error When We Are Trying To Run A Task On Our On Premis

I am not sure I get you here.
Pip-installing clearml-agent does not fire any agent. The procedure is that, after installing the package, if there isn't any config file you run clearml-agent init and enter the credentials, which are stored in clearml.conf. If there is a conf file, you simply edit it and manually enter the credentials. So I don't understand what you mean by "remove it".

2 years ago