AgitatedDove14
Moderator
49 Questions, 8060 Answers
  Active since 10 January 2023
  Last activity 9 months ago

Reputation: 0
Badges: 25 × Eureka!
0 Hi, I Faced With A Silly Error, When I Run The Python Script With task = Task.init(project_name='My Project', task_name='My Task'). The Task Goes To The Trains Server, But In The Trains Server, In Installed Packages Part One Of The Line

Yes, I mean trains-agent. Actually I am using 0.15.2rc0. But I am using local files; I mean I cloned the trains and trains-agent repos and installed them. Their versions are 0.15.2rc0.

I see, that's why we get the git ref, not the package version.

4 years ago
0 Hi Everybody. I Have Problem When Logging Model In A Specific Case. If Model Has Parameter That Is A Dict Then It Is Not Saved To Clearml Even Though It Is Saved In A Model Folder Normally. I Have Also Attached Example When This Is Happening As A Snippet. D

Thanks for pinging OutrageousGiraffe8
I think I was able to reproduce.

model is saved to the clearml as an output model when b is not a dictionary.

How did you make the example work with the automagic ?
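
For context, a minimal sketch of the kind of repro I had in mind (ToyModel and the parameter name b are hypothetical stand-ins; assumes PyTorch and the torch.save automagic hook):

from clearml import Task
import torch
import torch.nn as nn

class ToyModel(nn.Module):
    def __init__(self, b):
        super().__init__()
        self.b = b                 # the reported issue: when b is a dict, the saved model is not registered
        self.fc = nn.Linear(4, 2)

task = Task.init(project_name="examples", task_name="dict-param repro")
model = ToyModel(b={"hidden": 8})
torch.save(model, "model.pt")      # this call is normally auto-logged as an OutputModel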

2 years ago
0 Question About

That sounds about right to me 🙂

3 years ago
0 This Message Is For The Clearml Team. I've Found A Bug. I Think It's Reproducible. Basically, When Dealing With Bools Inside Args, I Think What You Guys Do Is Just Cast It To Bool Since All The Args Are Stored As Strings If I'm Correct. Only Issue Is, Boo

Hi VexedCat68
Can you supply more details on the issue? (Probably the best is to open a GitHub issue and have all the details there, so we have better visibility.)
wdyt?
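
For context, the classic Python pitfall here is that casting the stored string back with bool() keeps every non-empty string truthy, e.g.:

import argparse

parser = argparse.ArgumentParser()
# type=bool does not parse "False" as False -- bool("False") is True
parser.add_argument("--use-gpu", type=bool, default=False)
args = parser.parse_args(["--use-gpu", "False"])
print(args.use_gpu)  # prints True, because any non-empty string is truthy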

3 years ago
0 Hi Everyone. I'm New To Trains. I Do Not Have Sudo Access To My Departmental Servers. Can I Still Use Trains Beyond The Demo Server?

Hmm, you will have to set up the trains-server on a machine somewhere; it can be any machine (Windows / Mac / Linux).

4 years ago
0 I'm Trying To Understand How Clearml Serving Works And Trying To Set It Up. I Have An Agent Listening To The Serving Queue And I'm Trying To Set Up Clearml Serving To Launch On The Serving Queue. Do I Need To Have Clearml-Serving Installed On The Machine

can you tell me what the serving example is in terms of the explanation above and what the triton serving engine is,

Great idea!

This line actually creates the control Task (2)
clearml-serving triton --project "serving" --name "serving example"
This line configures the control Task (the idea is that you can do that even when the control Task is already running, but in this case it is still in draft mode).
Notice the actual model serving configuration is already stored on the crea...

3 years ago
0 Any Ideas Of Using Label Studio With Clearml Datasets - Base Dataset, Load To Label Studio, Annotate, Child Annotated Dataset Is The Kind Of Flow

I assume so 🙂 Datasets are kind of agnostic to the data itself; for the Dataset it's basically a file hierarchy.
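
A rough sketch of that flow (project/dataset names and the export folder are hypothetical):

from clearml import Dataset

base = Dataset.get(dataset_project="data", dataset_name="raw-images")
files_for_annotation = base.get_local_copy()    # hand these files to Label Studio

child = Dataset.create(
    dataset_project="data",
    dataset_name="raw-images-annotated",
    parent_datasets=[base.id],
)
child.add_files("/path/to/labelstudio/export")  # the annotations exported from Label Studio
child.upload()
child.finalize()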

3 years ago
0 Hi, I'm Trying To Get An Understanding Of How

just got the pipeline to run

Nice!

using the default queue okay?

Using the default queue is fine. The other queue is the "services" queue; by default the trains-server runs an agent that pulls jobs from it.
In "services" mode an agent pulls jobs one right after the other (not waiting for the previous job to finish), as opposed to a regular queue (any other), where the trains-agent pulls a job only after the previous one has completed.

4 years ago
0 Is There Any Way To Clear The Installed Packages Of A Task Programmatically? (I.E. Using The Python Sdk And Not The Ui)

It was set to true earlier, I changed it to false to see if there would be any difference but doesn't seem like it

I would actually just add:

Task.add_requirements('google.cloud')

before the Task.init call (notice, it has to be before the init call).
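
i.e. something like this (the project/task names here are just placeholders):

from clearml import Task

# must be called before Task.init() so the requirement is recorded on the task
Task.add_requirements('google.cloud')
task = Task.init(project_name='examples', task_name='requirements demo')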

4 years ago
0 Good Day! I Ran Into A Problem When Running Two Or More Identical Nodes In A Pipeline (multi_instance_support=True), Only One Of Them Uses An Already Created Venv From Cache For This Task. And The Other Node Starts To Re-Create The Same Virtual Envi

Hi @<1598487094601191424:profile|MysteriousCow84>

only one of them uses an already created venv from cache for this task. And the other node starts to re-create the same virtual environment.

Just to be clear: the second one is running, but it does not use the same venv as the other one (that is running in parallel), is that correct?

one year ago
0 Is It Possible To Give The Agent Access To Install Private Pip Packages (Needs To Be Installed From The Repo)?

-e

:user/private_package.git@57f382f51d124299788544b3e7afa11c4cba2d1f#egg=private_package

Is this the correct link to the repo and a valid commit id?

Can you post a few more lines from the agent's log?
Something is failing to install, I'm just not sure what.

3 years ago
0 Is It Possible To Give The Agent Access To Install Private Pip Packages (Needs To Be Installed From The Repo)?

The agent installs the "Installed Packages" section of the Task (think of it as a requirements.txt).
And again, what do you have there? Is it the outcome of the Task.init auto populating it?

3 years ago
0 Hello, Can I Get Somehow Json Files Of Plots For The Given Task? I Know There Is The "Download Json" Button Near The Plots In Your Web Ui, But I Need To Do It Programmatically (There Are Many Plots And Many Tasks).

If this is the case then the easiest is:

from clearml.backend_api.session.client import APIClient
client = APIClient()
res = client.events.get_task_plots(task="<task-id>")

We should definitely have a nice interface 🙂
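
If you then want the JSON on disk, something along these lines should work (this assumes the response exposes a plots list whose items carry the plot JSON in a plot_str field; field names may differ slightly between server versions):

for i, plot_event in enumerate(res.plots):
    # assumption: each plot event carries the plotly JSON as a string under "plot_str"
    plot_str = plot_event["plot_str"] if isinstance(plot_event, dict) else plot_event.plot_str
    with open("plot_%d.json" % i, "wt") as f:
        f.write(plot_str)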

3 years ago
0 I Have Another Small Technical Question, I Am Trying To See The Workers Status Programmatically Using The Following:

Hmm yes we should probably provide metrics:
client.workers.get_stats(..., items=[dict(key='cpu_usage'), dict(key='gpu_usage')])
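
A rough sketch of a full call (parameter names follow the REST API: from_date/to_date are epoch seconds and interval is the aggregation window in seconds; double-check them against your server version):

from time import time
from clearml.backend_api.session.client import APIClient

client = APIClient()
res = client.workers.get_stats(
    from_date=time() - 3600,   # last hour
    to_date=time(),
    interval=60,               # one sample per minute
    items=[dict(key="cpu_usage"), dict(key="gpu_usage")],
)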

3 years ago
0 Is There Any Way To Clear The Installed Packages Of A Task Programmatically? (I.E. Using The Python Sdk And Not The Ui)

GiddyTurkey39
BTW: you can always add the missing package via code:
Task.add_requirements('torch')  # optionally pass the version string as a second argument

4 years ago
0 Hey, Great Product! I'Ve Installed Trains Agent On A Python3 Venv, But When I Run A Script On The Worker, It Calls Python2 Instead Of Python 3. How To Change It?

VivaciousWalrus99
Yes this is odd:

1608392232071 spectralab:gpu0 DEBUG New python executable in /cs/usr/gal.hyams/.trains/venvs-builds/3.7/bin/python2

So it thinks it has python v3.7 but it is using python2 in the venv...
In your trains.conf file, set agent.python_binary to the python3.7 binary. It should be something like:
agent.python_binary=/path/to/python/python3.7

4 years ago
0 Hi, Guys, I Have A Problem With Clearml-Serving. When I Try To Request Sklearn Model (I Try To Reproduce An Example

LOL 🙂
Make sure that when you train the model or create it manually you set the default "output_uri"

task = Task.init(..., output_uri=True)

or

task = Task.init(..., output_uri="s3://...")
2 months ago
0 When Launching A Task To Trains Agent, I'M Having Trouble Getting The Imports From Other Files Working Correctly. For Instance, If My Task Imports A Function From Another File Within The Same Git Repo [

Hi GiddyTurkey39
First, yes, you can just edit the "installed packages" section and add any missing package (this is equivalent to requirements.txt).
I wonder why trains failed to detect the "bigquery" package in the first place... Any thoughts?

4 years ago
0 Hi, I Have Most Probably A Beginner Question About Loading The Data In Pycharm And Later On In Google Colab From A Dataset From Clearml. I Used From Page:

If i point directly to the data.yaml the training starts without any problem

what do you mean? how do you know where the extracted file is?
basically:

data_path = Dataset.get(...).get_local_copy()

then you should be able to open your file with open(data_path + "/data.yaml", "rt")
does that work?
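
Putting it together (the project/dataset names are placeholders):

from clearml import Dataset

data_path = Dataset.get(dataset_project="my-project", dataset_name="my-dataset").get_local_copy()
with open(data_path + "/data.yaml", "rt") as f:
    print(f.read())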

one year ago
0 [Clearml With Pytorch-Based Distributed Training} Hi Everyone! Is The Combination Of Clearml With

It should actually work the same; if you find out it fails to properly register, let me know (and then I guess a GitHub issue is the next step).

one year ago
0 Currently Trying To Figure Out How To Extend Clearml'S Automagical Reporting To Joeynmt.

Hi SmallDeer34
ClearML automagical logging works on the current python process. But in your example, your Bash is running another python script (that has nothing to do with the original notebook), hence the clearml automagic is not aware of it (i.e. it cannot "patch" the tensorboard calls).
In order to make it work, you should do something like:

from joeynmt import train
train.main(...)

Or something similar 🙂
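
A slightly fuller sketch of the same idea (the config path is hypothetical, and the exact argument that train.main() expects depends on your joeynmt version):

from clearml import Task
from joeynmt import train

task = Task.init(project_name="examples", task_name="joeynmt training")
# calling the training entry point from the same process lets the automagic patch the tensorboard calls
train.main("configs/my_experiment.yaml")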
Make sense?

3 years ago