TimelyPenguin76
Administrator, Moderator
0 Questions, 711 Answers
Active since 10 January 2023
Last activity one year ago

Reputation: 0
0 Hi, I Have This Use Case.

With this scenario, your data should be updated when running the pipeline

3 years ago
0 Hi, We Have Clearml On K8 Setup. Using The Below, We Run Dynamic Pods On The Cluster.

Hi DeliciousBluewhale87

Can you share the version you are using? Did you get any other logs, maybe from the pod?

3 years ago
0 Two Simple Lineage Related Questions:

Task B is a clone of Task A. Does B store the information that it was cloned from A somewhere?

You can add any user properties you like to any task, so maybe an "origin": <task_id> property will do the trick?
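For example, a minimal sketch that records the origin on the clone (the task ID below is a placeholder):

    from clearml import Task

    # Clone Task A; the clone starts out as a draft copy of it
    task_a = Task.get_task(task_id="<task_a_id>")   # placeholder ID
    task_b = Task.clone(source_task=task_a, name="Task B")

    # Record the lineage explicitly as a user property on the clone
    task_b.set_user_properties(origin=task_a.id)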

3 years ago
0 I Was Running The Hyperparameter Optimization Code In Clearml. I Understand Iteration Is One Single Experiment But What Does The Other Parameters Refer To ? Like Compute Time, Iterations And Jobs And Time , What Do They Mean ?

Hi DeliciousBluewhale87

compute time: the maximum compute time in minutes (sum of the execution time on all machines). When this limit is exceeded, all jobs are aborted.
jobs: the maximum number of experiments (jobs) launched for the optimization.
time: the maximum time limit for the whole optimization process.
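A minimal sketch of where these limits plug in when building an optimizer (the base task ID, queue name and parameter range below are placeholders):

    from clearml.automation import (
        HyperParameterOptimizer,
        RandomSearch,
        UniformIntegerParameterRange,
    )

    optimizer = HyperParameterOptimizer(
        base_task_id="<template_task_id>",   # placeholder: experiment to clone per job
        hyper_parameters=[
            UniformIntegerParameterRange("General/epochs", min_value=5, max_value=20),
        ],
        objective_metric_title="validation",
        objective_metric_series="accuracy",
        objective_metric_sign="max",
        optimizer_class=RandomSearch,
        execution_queue="default",
        total_max_jobs=20,        # "jobs": maximum number of experiments launched
        compute_time_limit=120,   # "compute time": minutes, summed over all machines
    )

    # "time": limit for the whole optimization process, in minutes
    optimizer.set_time_limit(in_minutes=90)
    optimizer.start()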

3 years ago
0 Let'S Say I Have A Project Call Proj1 To Store Datasets With Type "Data Process".. What Is The Best Practice To Get The Latest Datasets ? Example, I Start The First Data (A). Then Using Clearml-Data, I Add Another Dataset (B) As Child To The Previous On

Hi DeliciousBluewhale87 ,

You can just get a local copy of the dataset with ds.get_local_copy(); this will download the dataset from the dataset task (using the cache) and return a path to the downloaded files.

Now, in this path you'll have all the files that are in the dataset; you can go over them with ds.list_files() (or ds.list_files()[0] if you have only one file) and get the one you want.

maybe something like:

ds_path = ds.get_local_copy()
iri...
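A minimal sketch along these lines (the dataset project and name are placeholders):

    import os

    from clearml import Dataset

    ds = Dataset.get(dataset_project="dataset-project", dataset_name="dataset-task-name")
    ds_path = ds.get_local_copy()   # download (or reuse the cached) dataset files

    # list_files() returns the files' paths relative to the dataset root
    for rel_path in ds.list_files():
        print(os.path.join(ds_path, rel_path))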

3 years ago
0 Let'S Say I Have A Project Call Proj1 To Store Datasets With Type "Data Process".. What Is The Best Practice To Get The Latest Datasets ? Example, I Start The First Data (A). Then Using Clearml-Data, I Add Another Dataset (B) As Child To The Previous On

Hi DeliciousBluewhale87 ,

You can get the latest dataset by calling Dataset.get:

from clearml import Dataset
ds = Dataset.get(dataset_project="dataset-project", dataset_name="dataset-task-name")

This will return the latest dataset from the project.

3 years ago
0 Is It Possible To Make A Callback To When A Task Is Grabbed By A Trains-Agent Worker So I Could Edit Its Configuration Before The Creation Of The Tasks Starts? (I Want To Change The Base Docker Path Depending On The Worker That Grabbed The Task)

Hi SmarmySeaurchin8 ,

You can configure each trains-agent to run with a different image. In your trains.conf file, under the agent.default_docker.image section, specify the image you want the trains-agent to run with. When this section has a value and the task's BASE DOCKER IMAGE field is left empty, the agent falls back to it, so you can avoid changing it manually.

Can this do the trick?

4 years ago
0 Hi, I'M Getting Permission Errors When Trying To Read A File Using Trains-Agent, Which User Does It Use To Access Files? How Can I Grant It Permissions? (With My Own User Everything Works Smoothly)

Hi SmarmySeaurchin8 ,

By default, the trains-agent uses the ~/trains.conf file for credentials. Can you verify the api section in this file?

3 years ago
0 Hey, How Can I Point Trains To Look For It'S Train.Conf File In A Different Path Than ~/Trains.Conf?

Hi SmarmySeaurchin8

You can set the TRAINS_CONFIG_FILE env var to the conf file you want to run with. Can this do the trick?
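For example, a minimal sketch, assuming the custom file lives at /path/to/custom_trains.conf (a placeholder path) and that the variable is set before the SDK is imported:

    import os

    # Point trains at a non-default configuration file; must be set
    # before the trains SDK loads its configuration
    os.environ["TRAINS_CONFIG_FILE"] = "/path/to/custom_trains.conf"

    from trains import Task

    task = Task.init(project_name="examples", task_name="custom conf")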

4 years ago
0 Hey, How Can I Point Trains To Look For It'S Train.Conf File In A Different Path Than ~/Trains.Conf?

You can configure env vars in your docker compose, but what is your scenario? Maybe there are some other solutions

4 years ago
0 Hey, How Can I Point Trains To Look For It'S Train.Conf File In A Different Path Than ~/Trains.Conf?

For the trains-agent, you have the option to specify the trains.conf file you want it to run with: just start the trains-agent with trains-agent --config ~/trains_agent.conf (where ~/trains_agent.conf takes the place of ~/trains.conf for the agent run).

how could I configure this in the docker compose?

Do you mean via env vars?

4 years ago
0 Hi. One Question Regarding Instantiation Of Tasks. The Docu States That Providing

Hi ThankfulOwl72 ,

You can create only one main execution Task. In the code you wrote, you are trying to have two tasks, which is causing the exception. You can read more about the task object here: https://allegro.ai/docs/task.html#trains.task.Task

Setting reuse_last_task_id=False will always create a new task; the default behavior of https://allegro.ai/docs/task.html#task.Task.init is to reuse (override) the last task.

What is your use case? Maybe I can help with that.

BTW, you can use Task.init...
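A minimal sketch of forcing a fresh task on every run (the project and task names are placeholders):

    from trains import Task

    # reuse_last_task_id=False always creates a brand new task instead of
    # reusing/overriding the previous run with the same project/task name
    task = Task.init(
        project_name="examples",
        task_name="my experiment",
        reuse_last_task_id=False,
    )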

4 years ago
0 Hello All, I Am Trying To Make Clearml Store The Output(Model) To Minio By Setting The Output_Uri.

Hi OddShrimp85 ,

What's the clearml version you are using? Do you have boto3 installed?

3 years ago
0 Hi Guys, Needed A Bit Of Clarification. Every 15 Minutes, The Scheduler Prints, "Syncing Scheduler, Sleeping For 15 Minutes Until Next Sync". Can You Guide Me As To It Is Syncing With What?

👍
This is a message about configuration sync.

It allows you to change the scheduler at runtime by editing the Task configuration object.
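For example, a minimal sketch of a scheduler that re-reads its configuration every 15 minutes (the task ID, queue and schedule below are placeholders):

    from clearml.automation import TaskScheduler

    # sync_frequency_minutes controls how often the scheduler re-reads its
    # configuration object from its backing Task (the "syncing scheduler" message)
    scheduler = TaskScheduler(sync_frequency_minutes=15)

    # Clone and enqueue an existing task every day at 07:30
    scheduler.add_task(
        schedule_task_id="<template_task_id>",   # placeholder task to clone
        queue="default",
        hour=7,
        minute=30,
    )

    scheduler.start()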

2 years ago
0 How Can I Register A Json File I'M Creating As An Artifact

yes -

task.upload_artifact('local json file', artifact_object="/path/to/json/file.json")

2 years ago
0 Is It Possible To Write Text File And See It In Results Of The Experiment? I Want To Use It To Version Data As In Keeping A Track Of What Images Have Been Trained On. Or Is There A Better Way Of Data Versioning In Clearml?

Hi VexedCat68

Is it possible to write text file and see it in results of the experiment?

You can upload any file as an artifact to your task, try:

task.upload_artifact(name="results", artifact_object="text_file.txt")

I want to use it to version data as in keeping a track of what images have been trained on. Or is there a better way of data versioning in ClearML?

You can use https://clear.ml/docs/latest/docs/clearml_data/ for making the data accessible from every machine...
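A minimal sketch of versioning data with clearml-data from Python (the project, dataset name and local folder are placeholders):

    from clearml import Dataset

    # Create a new dataset version and attach the local files to it
    ds = Dataset.create(dataset_project="dataset-project", dataset_name="images-v1")
    ds.add_files(path="/path/to/images")   # placeholder local folder
    ds.upload()      # upload the files to the storage backend
    ds.finalize()    # close this version; child versions can build on it

    # Later, from any machine, fetch the same version and get a local copy
    ds = Dataset.get(dataset_project="dataset-project", dataset_name="images-v1")
    local_path = ds.get_local_copy()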

3 years ago
0 Is It Possible To Write Text File And See It In Results Of The Experiment? I Want To Use It To Version Data As In Keeping A Track Of What Images Have Been Trained On. Or Is There A Better Way Of Data Versioning In Clearml?

Is it possible to write text file and see it in results of the experiment?
You can upload any file as an artifact to your task, try:

task.upload_artifact(name="results", artifact_object="text_file.txt")

Notice the max preview for an artifact is 65k, and it is suggested to have one file like this (and not for every iteration for example)

3 years ago
0 I Just Getting This In My Agent Run Task. Would Appreciate If Someone Can Advise Where I Externalrequirement Is Pointing At.

You can always prefix the command with CLEARML_CONFIG_FILE=<YOUR clearml.conf file path>
and then run the command as usual.

For the agent, you can add --config-file with the path to the configuration file.

3 years ago