AgitatedDove14
Moderator
0 I Have A Bunch Of Python Modules With Clearml Tasks. They Are Using 3Rd-Party Libraries But No Module Uses Code From Another Module. When I Run Such A Task Remotely - Then Clearml Deduces The Dependencies From Imports, Which Works Fine. Now I Decided To T

FiercePenguin76 in your git repo it should detect only clearml as a required python package
Basically the steps are:
  • Decide if the initial python entry script is a standalone script (i.e. no local imports) in the git repo (in your example "task_with_deps.py").
  • If this is a "standalone script", only look for imports inside the calling python script, and list those packages under "installed packages".
  • If this is not a standalone script, go over all the python files inside the repository, look for "i...
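For illustration, a minimal sketch of what a "standalone script" means here (file name taken from the example above, the numpy import is hypothetical): only packages imported directly in this file would be listed under "installed packages".

# task_with_deps.py -- a standalone entry script: no local (in-repo) imports,
# so only clearml and numpy would end up under "installed packages"
import numpy as np  # hypothetical third-party dependency
from clearml import Task

task = Task.init(project_name="examples", task_name="task_with_deps")
print(np.zeros(3))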

3 years ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

Do note that the needed module is just a local folder with scripts.

Oh, that is the issue. Is it in the git repo?

2 years ago
0 Hi, We Have A Use Case That We Would Like To Upload A Local Folder Into The Cloud

Hi OutrageousSheep60

AS-IS

  • without compressing or breaking it up into chunks.

So for that I would suggest manually archiving it and uploading it as an external link?
Or are you saying you want to control the compression used by the Dataset class?
https://github.com/allegroai/clearml/blob/72d9b22e0d27f317a364acfeacbcf5c70f852e8c/clearml/datasets/dataset.py#L603
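Something along these lines, as a rough sketch (the bucket path and names are hypothetical; check the Dataset docs for the exact signatures):

import shutil
from clearml import Dataset, StorageManager

# archive the folder yourself, so ClearML never compresses or chunks it
archive_path = shutil.make_archive("my_folder_asis", "zip", root_dir="/data/my_folder")

# upload the archive to your own storage (hypothetical bucket)
remote_url = StorageManager.upload_file(archive_path, "s3://my-bucket/datasets/my_folder_asis.zip")

# register the already-uploaded archive as an external link in a dataset version
ds = Dataset.create(dataset_name="my_folder_asis", dataset_project="examples")
ds.add_external_files(source_url=remote_url)
ds.upload()
ds.finalize()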

2 years ago
0 Hi

You can only edit it on the clone (copy) of the original experiment. Make sense?

3 years ago
0 Hi, Is There Any Way To Get Experiment Debug Images Programmatically?

Hi HandsomeCrow5 .
Remember, the debug images are events with links to the actual images, so you first have to get the events, and then you can download the images with https://allegro.ai/docs/examples/examples_storagehelper/#storagemanager (which by definition has the credentials, because it was able to upload them 🙂).
To get the events:
from trains.backend_api.session.client import APIClient
client = APIClient()
client.events.debug_images(task='aabbcc')
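And a rough sketch of the full flow; the traversal of the response (metrics/iterations/events and the url field) is an assumption on my side, so inspect the returned object on your server version:

from trains.backend_api.session.client import APIClient
from trains.storage import StorageManager

client = APIClient()
response = client.events.debug_images(task='aabbcc')

# assumed response layout: metrics -> iterations -> events, each event holding a 'url'
for metric in response.metrics:
    for iteration in metric.iterations:
        for event in iteration.events:
            local_copy = StorageManager.get_local_copy(remote_url=event['url'])
            print(local_copy)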

5 years ago
0 I'D Like The Console In A Clearml Run To Show Only The Stdout/Stderr As It Does Now, But I'D Also Like Clearml To Capture Debug Level Logs. Is There An Easy Around This? It Would Be Nice If One Could E.G. Set

No it will not 😞 the closer is closer to the actual print.
That said, I'm sure it would not be complicated to add.
But I have to wonder: this will really create a mess in the console log, so if someone wants it, it would have to be global (i.e. also in the visible console, not only in the backend). The case where the console on the machine itself is "clean" but the backend log is full of debug output is not clear to me.

3 years ago
0 Hi - Quick Question. I Am Using The Pipelinecontroller With Abort_On_Failure Set To False. I Have A Pipe With A First Task That Branch Out In 3 Branches.

if the first task failed, then the remaining tasks are not scheduled for execution, which is what I expect.

agreed

I'm just surprised that if the first task is aborted instead by the user,

How is that different from failed? The assumption is that if a component depends on another one it needs its output; if it does not, then they can run in parallel. What am I missing?
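For reference, a minimal sketch of the setup as I picture it (step names and functions are hypothetical): one first step and three branches that declare it as their parent, with abort_on_failure=False, so a failed or aborted "first" means none of the branches get scheduled while unrelated steps could still run.

from clearml import PipelineController

def first_step():
    return 1

def branch_step(x=0):
    return x + 1

pipe = PipelineController(name="branching-pipe", project="examples",
                          version="1.0", abort_on_failure=False)

pipe.add_function_step(name="first", function=first_step)
# each branch depends on "first"; if "first" fails or is aborted,
# these branches are never scheduled
for i in range(3):
    pipe.add_function_step(name=f"branch_{i}", function=branch_step,
                           parents=["first"])

pipe.start_locally(run_pipeline_steps_locally=True)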

one year ago
0 Hi Team, How To Configure Gerrit Details In Clearml So That Tasks Or Pipeline Will Be Executed Depends On Gerrit?

Hi @<1542316991337992192:profile|AverageMoth57>
Not sure I follow what you have in mind regarding the Gerrit integration.
Sounds interesting ...
wdyt?

2 years ago
0 What Is The Suggested Way Of Running Trains-Agent With Slurm? I Was Able To Do A Very Naive Setup: Trains-Agent Runs A Slurm Job. It Has The Disadvantage That This Slurm Job Is Blocking A Gpu Even If The Worker Is Not Running Any Task. Is There An Easy Wa

but I need to dig deeper into the architecture to understand what exactly we need from the k8s glue.

Once you do, feel free to share. Basically there are two options: use the k8s scheduler with dynamic pods, or spin up the trains-agent as a service pod and let it spin the jobs.

5 years ago
0 Hi, V1 Of Agent Seems To Have Removed Agent.Package_Manager.Force_Repo_Requirements_Txt. Is This Still Available In Other Forms?

SubstantialElk6 This seems to be the issue
cp: failed to access '/root/default_clearml.conf': Permission denied
clearml_agent: ERROR: Could not find task id=024a421c0e174650a1c7ff64af756c26 (for host: )
Notice it seems it just cannot read the clearml.conf, wdyt?

4 years ago
0 Hi Friends! I'M Trying To Upgrade The

I don't have the compose file, or at least can't seem to find it in /opt

you can manually take down all docker containers with:
docker ps
then docker stop <container id> for each container id

4 years ago
0 Hi All, I'M Trying To Use The Relatively New Jupyter Preview Feature But For Some Reason I Have The Notebook Artifact Under Artifacts But The Preview Is Unavailable.. Am I Missing Some Needed Steps? Thanks!

RipeGoose2

The HTML file is not standalone and has some dependencies that require networking..

Really? I thought that when jupyter converts its own notebook it packages everything into a single HTML file, no?

4 years ago
0 Hello! Since Today I Get

Okay, this is very close to what the agent is building. Could you start a new conda env, then install cudatoolkit=11.1, then run:

conda env update -p <conda_env_path_here> --file the_env_yaml.yml

4 years ago
0 Hey, Trying To Figure Out How To Create An

Hi FierceHamster54

Do I need to instantiate a task inside my component? Seems a bit redundant...

Yes, so the idea is that the Task (along with the code) will be automatically linked with the output model, for better traceability.
That said, you can "import" a model into the system (i.e. it was created somewhere else and you want to register it) with InputModel.import_model:
https://clear.ml/docs/latest/docs/clearml_sdk/model_sdk#importing-models
I guess "Input" from that perspecti...
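For example, a minimal sketch of registering an externally created model (the URL and name are hypothetical; see the link above for the full argument list):

from clearml import InputModel

# register weights that were created outside ClearML so tasks can consume them
model = InputModel.import_model(
    name="my-external-model",                      # hypothetical display name
    weights_url="s3://my-bucket/models/model.pt",  # where the weights already live
)
print(model.id)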

3 years ago
0 Hello, Are There Any Plans To Add Support For Pdm Package Manager? It'S What We Use Since Poetry Dependcy Solver Is Quite Slow And It Would Be Neat To Have Direct Support In Clearml-Agents As Well. Thanks!

Hi @<1547028074090991616:profile|ShaggySwan64>
I have to admit that personally I do not know pdm. Could you share links and help us understand what the value is over pip/poetry/conda?

one year ago
0 Hi! I Noticed A Bug Related To Reusing The Same Component In A Pipeline. I Have Prepared A Mock Example So That You Can Reproduce It:

... these nested components are not tagged with 'pipe: <pipeline_task_id>'. I assume this should not be like that, right?

Helper functions are not "components"; they are actually files that will be accessible when running the component itself.
am I missing something ?

4 years ago
0 Hi I Have A Most Probably A Beginer Question Abour Loading The Data In Pycharm And Later On In Google Colab From An Dataset From Clearml. I Used From Page:

error [Errno 13] Permission denied:

Seems like a permission issue?
Try to remove your entire clearml cache folder.

one year ago
0 Hi, I Am Getting Following Error While Trying To Checkout A Gut Hub Rep. Error: Rpc Failed; Curl 56 Gnutls Recv Error (-54): Error In The Pull Function. Fatal: The Remote End Hung Up Unexpectedly Fatal: Early Eof Fatal: Index-Pack Failed Repository Cloni

BTW: the cloning error is actually the wrong branch. If you take a look at your initial screenshot, you can see on the line before last branch='default', which I assume should be branch='master'. (The error itself is still weird, but I assume this is what git is returning.)

5 years ago
0 Hi All, I Am Getting A Bunch Of This Kind Of Log Messages "Clearml.Storage - Info - Starting Upload: /Tmp/.Clearml.Upload_Model_6Ou50Pb1.Tmp =>" I Am Pretty Sure They Happen As A Part Of The Model Initialization About 10 Of Those, My Guess Is That Every T

RipeGoose2 models are automatically registered
i.e. added to the models artifactory, but it only points to where the files are stored
They will only actually be uploaded if you are passing the output_uri argument to Task.init.
If you want to disable this behavior you can pass
Task.init(..., auto_connect_frameworks={'pytorch': False})
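A minimal sketch contrasting the two behaviors (the bucket URL is hypothetical):

from clearml import Task

# register AND upload framework-created models to your own storage
task = Task.init(project_name="examples", task_name="train",
                 output_uri="s3://my-bucket/models")

# or: disable automatic pytorch model logging entirely
# task = Task.init(project_name="examples", task_name="train",
#                  auto_connect_frameworks={'pytorch': False})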

4 years ago
0 Hi, I'Ve Got A Quick Question About

Does Task.connect send each element of the dictionary as a separate API request? Has anyone else encountered this issue?

Hi SuperiorPanda77
the task.connect ends up as a single call, with all the data being sent on a single request.
That said, maybe the connect dict is not the best solution for a thousand-key dictionary ...
Maybe an artifact or connect_configuration is better suited?
wdyt?
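A minimal sketch of both alternatives (names are hypothetical):

from clearml import Task

task = Task.init(project_name="examples", task_name="big-dict")
big_dict = {f"key_{i}": i for i in range(10_000)}

# option 1: a configuration object -- one request, shown as a configuration blob
task.connect_configuration(big_dict, name="big_config")

# option 2: an artifact -- stored as a downloadable object, not as parameters
task.upload_artifact(name="big_dict", artifact_object=big_dict)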

3 years ago