AgitatedDove14
Moderator
48 Questions, 8048 Answers
  Active since 10 January 2023
  Last activity 5 months ago

Reputation: 0
Badges: 25 × Eureka!
0 Hey

Okay, I think this might be a bit of overkill, but I'll entertain the idea 🙂
Try passing the user as the key and the password as the secret?

10 months ago
0 What Happens To Files That Are Downloaded To A Remote_Execution Via Storagemanager? Are They Removed At The End Of The Run, Or Does It Continuously Increase Disk Space?

UnevenDolphin73 following the discussion https://clearml.slack.com/archives/CTK20V944/p1643731949324449 , I suggest this change in the pseudocode:
```python
from clearml import Task, StorageManager

# task code
task = Task.init(...)

if not task.running_locally() and task.is_main_task():
    # pre-init stage: running remotely, fetch the prepared files
    StorageManager.download_folder(...)  # prepare local files for execution
else:
    # running locally: upload the files (repeated for as many files as needed),
    # then relaunch the task on a remote agent
    StorageManager.upload_file(...)
    task.execute_remotely(...)
```
Now when I look at it, it kind of makes sense to h...

2 years ago
0 Hello, I Have An Error While Installing Git Dependencies Of Local Package: So Far I Used Task.

Oh dear 😞 If that's the case I think you should open an issue on pypa/pip; I'm not sure what we can do other than that ...

3 years ago
0 Hey

You described getting a secret key pair from the UI and feeding it back into the compose file. Does this mean it's not possible to seed the secrets in the compose file, starting from clean state? If so, that would explain why I can't get it to work.

Long story short, no. That would basically mean you have pre-built credentials in the docker image, which sounds dangerous 🙂
I'm not sure I'm following the use case here, what exactly are we trying to do?
(or maybe I missed something?)

10 months ago
0 Hi Guys! How Do You Handle Tasks With A Complex Parametrization? For Example, A Script That Trains A Machine Learning Model, Where You Want To Parametrize Model Name, Hyperpars, Preprocessing Steps, Etc. So A Nested Configuration With Many Parameters Do I
  1. Yes, they will! This is exactly the idea :)
  2. Yes, it will store it as a text file (as-is, raw text). Notice the return value is the file you should open, because when running via an agent the returned file will contain the configuration from the UI (see the sketch below). Make sense?
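A minimal sketch of that flow, assuming the nested configuration lives in a local config.yaml (the file name and the project/task names here are hypothetical):
```python
from clearml import Task

task = Task.init(project_name="examples", task_name="nested-config")  # hypothetical names

# Attach the local config file; always open the *returned* path, because when
# the task runs via an agent it points at the configuration edited in the UI.
config_path = task.connect_configuration("config.yaml", name="my_config")
with open(config_path) as f:
    raw_config = f.read()  # as-is raw text; parse with yaml/json as needed
```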
4 months ago
0 Can You Please Tell Me How To Make The Agent Use The Docker Env By Default? Instead Of Creating Venv It Already Has All The Necessary Environment And Libraries Installed

Can you please tell me how to return the folder where the script should run?

Add it to the Python path:

PYTHONPATH="/src/project"
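The same effect from inside Python, as a sketch (the path /src/project is taken from the line above and stands in for your repo root):
```python
import os
import sys

# make modules under /src/project importable in the current process
sys.path.insert(0, "/src/project")

# and propagate the same setting to any child processes the script spawns
os.environ["PYTHONPATH"] = "/src/project"
```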
one year ago
0 Hey! I Would Like To Connect To Same Task From Multiple Consumer And Upload Debug Image. Is It Possibile? It Seems Like I Can Connect To The Task. Get The Logger But Nothing Is Uploaded.

FranticCormorant35 As far as I understand, what you have is a multi-node setup that you manage yourself, something like Horovod, Torch distributed, or any MPI setup. Trains supports all of the above standard multi-node setups. The easiest way is to do the following:

On the master node, set the OS environment variable:
OMPI_COMM_WORLD_NODE_RANK=0
Then on any client node:
OMPI_COMM_WORLD_NODE_RANK=unique_client_node_number
In all processes you can call Task.init - with all the automagic kicking in...
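A minimal sketch of that setup (the project/task names and the image file are hypothetical; in practice the rank usually comes from your launcher):
```python
import os
from clearml import Task

# rank 0 on the master node; a unique node number on each client node
os.environ.setdefault("OMPI_COMM_WORLD_NODE_RANK", "0")

# every process calls Task.init and the multi-node automagic kicks in;
# all of them can then report debug images to the same task
task = Task.init(project_name="examples", task_name="multi-node-debug-images")
task.get_logger().report_image(
    "debug", "sample", iteration=0, local_path="sample.png"  # hypothetical image file
)
```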

4 years ago
0 Hi! Is There Something Happening With The

This is what I just used:
```python
import os
from argparse import ArgumentParser

from tensorflow.keras import utils as np_utils
from tensorflow.keras.datasets import mnist
from tensorflow.keras.layers import Activation, Dense, Softmax
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import ModelCheckpoint

from clearml import Task

parser = ArgumentParser()
parser.add_argument('--output-uri', type=str, required=False)
args = parser.parse_args()
...
```

3 years ago
0 Hi! Is There Something Happening With The

"Fix TF 2.4 keras load/save model"

3 years ago
0 Hi, When It First Asks Me To Enter My Full Name, It Fails To Perform The Request (Timed Out). Checked Server Side And Receiving This Error

Seems like something is not working with the server, i.e. it cannot connect to one of the docker containers.
May I suggest carefully going through all the steps here and making sure nothing was missed:
https://github.com/allegroai/trains-server/blob/master/docs/install_linux_mac.md

Especially step (4).

4 years ago
0 Hi Everyone, Yesterday I Pushed An Experiment To The

Let me check, it was supposed to be automatically aborted

3 years ago
0 Is The App/Ui/Backend Customizable? Any Tutorials For That?

CleanWhale17 nice ... πŸ™‚
So the answer is: Trains supports the pipeline / automation of it, but lacks that dataset integration (which is basically up to you to manage, with either artifacts or any other method).
The Allegro Enterprise version allows you to rerun the code on a new version of the dataset from the UI (or automation) without changing a single line of code 🙂

4 years ago
0 Hi, Is It Possible To Change The Visibility Of The Projects On The Dashboard So That Only Specific Users Can See The Projects?

Hi SoreDragonfly16
Sadly no, the idea is to create full visibility for all users in the system (basically saying: share everything with your colleagues).
That said, I know the enterprise version has permission / security features; I'm sure it covers this scenario as well.

3 years ago
0 Hi, I’M Currently Running Clearml With Pytorch And Everytime I Run Into

PompousHawk82 unfortunately this is kind of binary: either you have full tracking of load/save operations or you do not.
This warning message will disappear in the next version, as we will be able to log multiple models under the same Task :)

3 years ago
0 Hey

A CI for the VSCode extension? So spin up a server + agent and connect to it as part of the CI?

10 months ago
0 Trains Seems To Fail To Capture My Conda Environment, Any Idea? Os: Window 10

And still a difference between A/B, one detecting the repo while the other does not?

4 years ago
0 Hey, My Name Is Ido, And I Am A New Clearml User. My Goal Is To Monitor The Accuracy Of My Llm Outputs In Production. I Understand That I Can Log Each Iteration With A Binary Output (0 For Incorrect And 1 For Correct), But This Approach Makes The Visual G

I prefer serving my models in-house and only performing the monitoring via ClearML.

clearml-serving is an infrastructure for you to run models 🙂
To clarify, clearml-serving runs on your end (meaning this is not SaaS where a 3rd party is running the model).

By the way, I saw there is a project dashboard app which might support the visualization I am looking for. Is it suitable for such use case?

Hmm interesting, actually it might, it does collect metrics over time ...
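For the original 0/1 logging question, a minimal sketch of reporting a smoothed accuracy scalar instead of raw binary values (the project/task names and window size are assumptions):
```python
from collections import deque
from clearml import Task

task = Task.init(project_name="llm-monitoring", task_name="output-accuracy")  # hypothetical names
logger = task.get_logger()
window = deque(maxlen=100)  # rolling window; the size is an arbitrary choice

def report_prediction(correct: bool, iteration: int) -> None:
    # report a rolling mean so the scalar plot is readable instead of a 0/1 sawtooth
    window.append(1.0 if correct else 0.0)
    logger.report_scalar(
        title="accuracy", series="rolling", value=sum(window) / len(window), iteration=iteration
    )
```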

2 months ago
0 What Is The Method For Packages Exploration When Using Conda? Agent Is Set To 'Conda' Mode. We Upload A Task From A Local Conda Env That (Obviously) Has Some Pip Packages As Well. When We Enqueue The Task To Run Remotely, Not All Conda Packages Are Instal

Hi CrookedWalrus33

When we enqueue the task to run remotely, not all conda packages are installed,

Yes, it actually lists all the python packages inside "installed packages" regardless of whether they are coming from pip / conda. Internally it holds the conda part in a separate section (maybe we should present it?!)

and the task is failing (they

Can you provide the log for the Task executed by the agent?

2 years ago
0 Are There Any Particular System Dependencies Needed To Enable

I still don't get resource logging when I run in an agent.

NuttyLobster9 there should be no difference ... are we still talking about <30 sec? Or a sleep test? (no resource logging at all?)

have a separate task that is logging metrics with tensorboard. When running locally, I see the metrics appear in the "scalars" tab in ClearML, but when running in an agent, nothing. Any suggestions on where to look?

This is odd and somewhat consistent with actu...

5 months ago
0 Hi! I Am Currently Using Clearml (With Remote Execution), To Train An Object Detection Model With

clearml - WARNING - Could not retrieve remote configuration named 'hyperparams'

What's the clearml-server version you are working with?

In both logs I see the following (even in the single-GPU log it seems you "see" two GPUs, is that correct?):
GPU 0,1 Tesla V100-SXM2-32GB (arch=7.0)

Last question: this is a relatively old clearml version (0.17.5); can you test with the latest version (1.1.1)?

3 years ago
0 What Is The Method For Packages Exploration When Using Conda? Agent Is Set To 'Conda' Mode. We Upload A Task From A Local Conda Env That (Obviously) Has Some Pip Packages As Well. When We Enqueue The Task To Run Remotely, Not All Conda Packages Are Instal

I'm not sure if it matters but 'kwcoco' is being imported inside one of the repo's functions and not on the script's header.

Should work.
When you run pip freeze inside the same env, what are you getting?
Also, is there any other import that is missing? (Basically 'clearml' tries to be smart and checks whether the script, even though inside a repo, is actually importing anything from the repo itself; if it is not, it will only analyze the original script. Basically...

2 years ago
0 Hi All, Looking For Some Help When Executing Pipelines With Custom Docker Images. I Have A Component Defined And I Expect Its Python Runtime Environment To Be Managed By A Custom Docker Image (

Hi WickedStarfish97

As a result, I don’t want the Agent to parse what imports are being used / install dependencies whatsoever

Nothing to worry about here; even if the agent detects the python packages, they are installed on top of the preexisting packages inside the docker. That said, if you want to override it, you can also pass packages=[] (see the sketch below).
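A sketch of what that might look like for a decorator-based component (the docker image name and the function body are hypothetical; this shows only the component definition, not the full pipeline):
```python
from clearml.automation.controller import PipelineDecorator

# packages=[] tells the agent to skip package installation and rely entirely
# on whatever is already baked into the custom docker image
@PipelineDecorator.component(
    docker="my-registry/my-training-image:latest",  # hypothetical image
    packages=[],
    return_values=["result"],
)
def train_step(data_path: str):
    # the runtime environment comes from the docker image itself
    result = f"trained on {data_path}"
    return result
```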

2 years ago