TimelyPenguin76
Administrator Moderator
0 Questions, 711 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
Hi, I've recently upgraded to 0.15.1 from 0.14.2, and for some reason code that previously worked, in which I'm getting the tags of a model using...

Hi PompousBeetle71 ,

Can you please share some more information? Where do you see the tags on the server? Do you mean in the web app? Do you see the tags under the task or under the model?

4 years ago
Hi, I've recently upgraded to 0.15.1 from 0.14.2, and for some reason code that previously worked, in which I'm getting the tags of a model using...

PompousBeetle71, can you try the following and tell me if it's still empty?

```
from trains import InputModel

# paste the model ID string copied from the UI
print(InputModel("<model id copied from the UI>").tags)
```

I can't reproduce this issue, and I just want to be sure it's not a new model.

The model ID can be found as shown in the picture, after clicking the ID mark.

4 years ago
Hey, I use the clearml-agent. In the code I have the two lines I need. When I run the program it shows that my program is running on the demo trains page. But I want it to run on my own server. I tried to register and then use the following page:

How can I check if it is loaded?

When a task is starting, the configuration will be printed first.

It worked with trains-agent init.

Do you have two configuration files, ~/trains.conf and ~/clearml.conf?
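A quick way to check for both files (a minimal sketch, not from the original thread):

```
from pathlib import Path

# check which of the two configuration files actually exist on this machine
for name in ("trains.conf", "clearml.conf"):
    path = Path.home() / name
    print(f"{path}: {'found' if path.exists() else 'missing'}")
```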

3 years ago
Looking at the docs... I couldn't find a way to clean up the experiments... only archive them... I also noticed

You can loop over the tasks you want to delete, based on the cleanup service:

```
import logging

from trains.backend_api.session.client import APIClient

client = APIClient()

# you can get the tasks you want to delete with client.tasks.get_all;
# this example takes all the tasks in a project, but other filters are available too
tasks = client.tasks.get_all(project=["<your project id>"])
for task in tasks:
    try:
        # try to delete a task from the system
        client.tasks.delete(task=task.id)
    except Exception as ex:  # the original snippet was truncated here; handling is illustrative
        logging.warning("Could not delete task %s: %s", task.id, ex)
```

4 years ago
Hey, about the dependency propagation of pipeline components: if I call a vanilla Python function from a component, do the dependencies specified in the internal imports propagate to this function call too? And additionally, if that function is in anoth...

Hi FierceHamster54,

I think the Task.force_store_standalone_script() option you asked about ("And is this compatible with the Task.force_store_standalone_script() option?") is causing the issue: you are storing the entire script as a standalone, without any git information, so once you try to import other parts of the repo it fails. BTW, any specific reason for using it in your pipeline?

2 years ago
Hello, I always get a "ModuleNotFoundError: No module named..." error when I run a pipeline and import a local .py file. I'm using a decorator pipeline; how can I import those local .py files? My pipeline code and the .py files I want to import are already in...

Hi MoodyCentipede68,

You mentioned: "I specify the repo for each step by using the 'repo' argument from PipelineDecorator.component. Here is my reference."

Do you see the repo under the EXECUTION tab?
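For reference, a minimal sketch of pinning a step to a repo via the component decorator (the repo URL, branch, project, and function names are placeholders; the keyword arguments are assumed from the 'repo' argument mentioned above):

```
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    repo="https://github.com/your-org/your-repo.git",  # placeholder repo URL
    repo_branch="main",                                 # placeholder branch
)
def preprocess(value: int) -> int:
    # local .py imports made inside this step are resolved against the repo above
    return value * 2

@PipelineDecorator.pipeline(name="repo-per-step demo", project="examples", version="1.0")
def run_pipeline():
    print(preprocess(21))

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # debug all steps in the local process
    run_pipeline()
```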

2 years ago
Hi, I have this use case.

In this scenario, your data should be updated when the pipeline runs.

3 years ago
I have a question regarding "imitating" an agent pulling some task for debugging purposes. I am trying to do something like: creating a task on the server

DepressedChimpanzee34, how do you generate the task that's running remotely? Once the agent pulls the task, this is your running configuration (it will pull the same configuration from the server as you see in the UI).

3 years ago
We're trying to use the AWS autoscaler and have managed to get it up and running with spinning up instances. However, it does not seem to pull any of the tasks for the remote instances. We see it gets

Hi UnevenDolphin73,

Try to re-run it; a new instance will be created. Under this specific instance's Actions you have "Monitoring and troubleshoot", where you can select "Get system logs".
I want to verify your scaler doesn't have any failures in this log.

3 years ago
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using

When connecting a nested dict, the keys will take the form period/end and period/start, so those are the keys you need to change, in addition to the section name (General, if no name is given).

For example, this should work in your case:

```
from clearml import Task

template_task = Task.get_task(task_id="<your template task id>")

cloned_task = Task.clone(source_task=template_task,
                         name=template_task.name + ' for params',
                         parent=template_task.id)

# put the new values back into the cloned task
# (the original snippet is truncated here; the value is a placeholder)
cloned_task.set_parameters({"General/period/start": "<new start value>"})
```

3 years ago
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using

I just tried it and everything works.

I ran this for the template task:

```
from clearml import Task

task = Task.init(project_name="Examples", task_name="task with connected dict")

period = {"start": "2020-01-01 00:00", "end": "2020-12-31 23:00"}

task.connect(period, name="period")
```

and this for the cloned one:

```
from clearml import Task

template_task = Task.get_task(task_id="<Your template task id>")
cloned_task = Task.clone(source_task=template_task,
                         name=template_task.name + ' for params',  # completed from the snippet above
                         parent=template_task.id)
```
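As a possible follow-up (not from the original thread; the queue name is a placeholder), the clone can then be sent to an agent queue:

```
# continues the snippet above; "default" is a placeholder queue name
Task.enqueue(cloned_task, queue_name="default")
```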

3 years ago
Hi - what is the difference between

Hi TeenyFly97 ,

With task.close() the task will do a full shutdown process. This includes repo detection, logs, metrics and artifacts flush, and more. The task will not be the running task anymore and you can start a new task.

With task.mark_stopped(), the task logs will be flushed and the task will mark itself as stopped, but it will not perform the full shutdown process, so current_task will still be this task.

For example:

```
from trains import Task

task = Task.in...
```
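A short sketch of the contrast described above (project and task names are placeholders):

```
from trains import Task

# close(): full shutdown - flush logs, metrics and artifacts; a new task can be started afterwards
task = Task.init(project_name="examples", task_name="close demo")
task.close()

# mark_stopped(): flush logs and set the status to stopped, but stay the current task
task = Task.init(project_name="examples", task_name="mark_stopped demo")
task.mark_stopped()
print(Task.current_task() is task)  # True - still the current task
```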

4 years ago
What does it mean to publish a model or a dataset?

You are definitely right! We will fix this issue, Thanks 🙂

3 years ago
Hi guys, suppose I have the following script:

Hi GiganticTurtle0,

All the packages you are using should be under the "Installed packages" section in your task (in the UI). ClearML analyzes them, and the full report should be under this section.

You can add any package you like with Task.add_requirements('tensorflow', '2.4.0') for tensorflow version 2.4.0 (or Task.add_requirements('tensorflow', '') for no version limit).

If you don't want the package analyzer, you can configure it in your ~/clearml.conf file: sdk.development.detect_with_...
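As a small illustration of the above (project/task names are placeholders; the add_requirements calls go before Task.init):

```
from clearml import Task

# register requirements before Task.init so they end up in "Installed packages"
Task.add_requirements('tensorflow', '2.4.0')  # pin tensorflow to 2.4.0
Task.add_requirements('pandas', '')           # no version limit

task = Task.init(project_name="examples", task_name="requirements demo")
```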

3 years ago
Hey, how can I point trains to look for its trains.conf file in a different path than ~/trains.conf?

You can configure env vars in your docker-compose, but what is your scenario? Maybe there are other solutions.

4 years ago
Hey, how can I point trains to look for its trains.conf file in a different path than ~/trains.conf?

Hi SmarmySeaurchin8,

You can configure the TRAINS_CONFIG_FILE env var with the conf file you want to run with. Can this do the trick?
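As a small illustration (the path and names are placeholders; the variable must be set before the SDK loads its configuration):

```
import os

# point trains at a custom configuration file (placeholder path)
os.environ["TRAINS_CONFIG_FILE"] = "/path/to/custom_trains.conf"

from trains import Task  # imported after setting the env var so the custom conf is picked up

task = Task.init(project_name="examples", task_name="custom config demo")
```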

4 years ago