AgitatedDove14
Moderator
49 Questions, 8124 Answers
Active since 10 January 2023
Last activity one year ago

Reputation: 0
Badges: 25 × Eureka!
0 I Am Using Pipelines (Just Starting) And I Am Checking Different Options For Overriding Parts Of Configuration Of The Base Task (Step Of My Pipeline). In The Docs For parameter_override One Can Find:

Hi UpsetTurkey67

"General/my_parameter_name" so that only this part of the configuration will be updated?

I'm assuming this is a hyperparameter, not a configuration object (i.e. task.connect, not task.connect_configuration); if this is the case then yes 🙂
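For reference, a minimal sketch of what that override looks like on a PipelineController step (the project, task, step, and parameter names here are hypothetical, and the base task is assumed to have connected its parameters with task.connect, so they live under the "General" section):

from clearml.automation import PipelineController

pipe = PipelineController(name="my-pipeline", project="examples", version="1.0.0")
pipe.add_step(
    name="train",
    base_task_project="examples",
    base_task_name="base step",
    # only this hyperparameter is changed on the cloned step; everything else stays as-is
    parameter_override={"General/my_parameter_name": 0.5},
)
pipe.start_locally(run_pipeline_steps_locally=True)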

2 years ago
0 Hi All, Are There Any Alternatives To Storing User Credentials In

Hi @<1687653458951278592:profile|StrangeStork48>
I have good news, v1.0 is out with hashed passwords support.

4 years ago
0 Hello All, We’re Trying To Use

Hmm, might be, check if your files server is running and configured properly

2 years ago
0 I'm Trying To Use

Follow up: I see that if I move an Experiment to a new project, it does not copy the associated model files and must be done manually. Once I moved the models to the new project, the query works as expected.

Correct 🙂
Nice catch!

4 years ago
0 Hi, I Have A Question About

I think it would be nicer if the CLI had a subcommand to show the content of ~/.clearml_data.json.

Actually, it only stores the last dataset id at the moment, so there is not much in it 🙂
But maybe we should have a command line that just outputs the current dataset id; that would make it easier to grab and pipe.
WDYT?
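In the meantime, a minimal sketch for inspecting the file from Python (assuming it exists and is plain JSON; its exact layout is an internal detail, so we just pretty-print whatever is there):

import json
from pathlib import Path

state_file = Path.home() / ".clearml_data.json"
# show whatever clearml-data stored locally (currently just the last dataset id)
print(json.dumps(json.loads(state_file.read_text()), indent=2))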

4 years ago
0 Hi All

CooperativeFox72 a bit of info on how it works:
In "manual" execution (i.e. without an agent):

path = task.connect_configuration(local_path, name=name)

The returned path is just local_path, and the content of local_path is stored on the Task.

In "remote" execution (i.e. with an agent):

path = task.connect_configuration(local_path, name=name)

local_path is ignored, the returned path is a temp file, and the content of that temp file is what is stored (or edited) on the Task configuration.
Make sense?
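A minimal usage sketch of that pattern (the config file name is hypothetical): always read from the returned path, so the same code works in both modes:

from clearml import Task

task = Task.init(project_name="examples", task_name="config demo")
# locally: config_path == "config.yaml"; under an agent: config_path is a temp file
# holding whatever is stored (or edited) on the Task configuration
config_path = task.connect_configuration("config.yaml", name="my config")
with open(config_path) as f:
    config_text = f.read()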

4 years ago
0 Hello! Getting Credential Errors When Attempting To Pip Install Transformers From Git Repo, On A Gpu Queue.

Okay, now let's try the final lines:

$LOCAL_PYTHON -m virtualenv /root/venv
/root/venv/bin/python3 -m pip install git+

4 years ago
0 Hi Everyone, I'm Using The

AttractiveCockroach17 can I assume you are working with the hydra local launcher?

3 years ago
0 With

I find it quite difficult to explain these ideas succinctly, did I make any sense to you?

Yep, I think we are totally on the same wavelength 🙂

However, it also seems to be not too prescriptive,

One last question, what do you mean by that?

4 years ago
0 I Am Using Pipeline From Decorators. In The Pipeline, There Is A Training Step That Returns A Model (I Want This Model To Also Be Uploaded As An Artifact On Clearml). But This Results In The Following Error:

Hi DilapidatedCow43
I'm assuming the returned object cannot be pickled (which is ClearML's way of serializing it).
You can upload it as a model with:

uploaded_model_url = Task.current_task().update_output_model(model_path="/path/to/local/model")
...
return uploaded_model_url

wdyt?
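For context, a minimal sketch of how that could look inside a decorator-based pipeline step (the function body and model path are hypothetical); the step returns the uploaded model URL instead of the unpicklable model object:

from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["model_url"])
def training_step():
    model_path = "/path/to/local/model"  # produced by your training code
    # upload the file as an output model and return its URL (which pickles fine)
    model_url = Task.current_task().update_output_model(model_path=model_path)
    return model_url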

2 years ago
0 Hi, I Try To Execute Pipeline With PipelineController And Define It Like This: pipe = PipelineController(

yes thanks, but if I do this, the packages will be installed for each step again, is it possible to use a single venv?

Notice that the venv is cached on the clearml-agent host machine (if this is the k8s glue, make sure to set up the cache as a PV to achieve the same).
This means there is no need to worry about it, and this is stable.
That said, if you have an existing venv inside the container, just add docker_args="-e CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=/path/to/bin/python"
Se...
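A minimal sketch of that docker_args approach on a function step (the image name, interpreter path, and step function are hypothetical):

from clearml.automation import PipelineController


def train():
    # hypothetical step body
    return 42


pipe = PipelineController(name="my-pipeline", project="examples", version="1.0.0")
pipe.add_function_step(
    name="train",
    function=train,
    docker="my-image:latest",
    # reuse the python environment already baked into the image instead of building a new venv
    docker_args="-e CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=/path/to/bin/python",
)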

6 months ago
0 Sorry For Always Posting Such Cryptic Problems. I Managed To Create A Docker-Compose File That Runs Clearml

Hi @<1541954607595393024:profile|BattyCrocodile47>
This looks like a docker issue running on mac m2
wdyt?

one year ago
0 , This Is A Great Tool For Visualizing All Your Experiments.

Thank you @<1523720500038078464:profile|MotionlessSeagull22> always great to hear 🙂
btw, if you feel like sharing your thoughts with us, consider filling out our survey, it should not take more than 5 min

5 years ago
0 Hi All, I'm Updating My Code To Use Hydra, And Facing An Issue: When I Try To Init A Task In Offline Mode I'm Getting The Following:

Hi RipeGoose2
I just tested the hydra example; it seems to work when you add the offline call right after the import:

from clearml import Task
Task.set_offline(True)

4 years ago
0 With

Looking at the supervisor method of the base AutoScaler class, where are the worker IDs kept? Is it in the class attribute queues?

Actually the supervisor is passing a fixed prefix, then it asks the clearml-server for workers starting with this name.
This way we can have a fixed init script for all agents, while still being able to differentiate them from the other agent instances in the system. Make sense?

4 years ago
0 Hi There

Also, for a single parameter you can use:

cloned_task.set_parameter(name="Args/artifact_name", value="test-artifact", description="my help text that will appear in the UI next to the value")

This way, you are not overwriting the other parameters, you are adding to them.
(Similar to update_parameters, only for a single parameter.)
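Putting it together, a minimal sketch (the template task id and queue name are hypothetical):

from clearml import Task

template = Task.get_task(task_id="<template_task_id>")
cloned_task = Task.clone(source_task=template, name="clone with custom artifact name")
# update a single parameter without touching the rest
cloned_task.set_parameter(
    name="Args/artifact_name",
    value="test-artifact",
    description="my help text that will appear in the UI next to the value",
)
Task.enqueue(cloned_task, queue_name="default")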

5 years ago
0 Hey, Using K8S With Trains 0.16.1-320, All Of A Sudden The Entire Data (i.e. Experiments, Tasks, API Creds) Is Not Showing In The UI Anymore. All Logs Seem To Be Fine As Far As I Can Tell... Any Idea What Went Wrong?

so if the node went down and then some other node came up, the data is lost

That might be the case. Where is the k8s running? A cloud service?

4 years ago
0 Hi, Can You Help Me Pls, I Got: Environment Setup Completed Successfully Starting Task Execution: Traceback (Most Recent Call Last): File "agro_api.py", Line 13, In From help_models.consts Import urls ImportError: No Module Named 'help_models'

PlainSquid19 No worries 🙂
btw: if you could check whether the mangling of the working dir / script path happens with the latest trains, that would be appreciated, because if you were running the script from "stages/" in the first place, then trains should have caught it ...

5 years ago
0 Just Getting Started With Clearml, Any Recommended Videos On How To Get A Sample Project Up? I Am Using The One On Their Youtube Channel Right Now But I Am A Bit Confused As How To Use The Demoapp

however setting up the interpreter on PyCharm is different on Mac for some reason, and the video just didn't match what I see

MiniatureCrocodile39 Are you running on a remote machine (i.e. PyCharm + remote ssh) ?

4 years ago
0 Let's Say That I Specify The

I guess that was never the intention of the function, it just returns the internal representation. Actually my question would be, how do you use it, and why? :)

4 years ago
0 It Is A Good Practice To Call A Function Decorated By

Thanks GiganticTurtle0!
I will try to reproduce with the example you provided. Regardless, I already took a look at the code, and I'm pretty sure I know what the issue is. We will be pushing a few fixes after the weekend; I'm hoping this one will be included as well 🙂

3 years ago
0 Hi, I Was Getting A Really Weird Error Due To Mismatch On The Versions Between The Installed Libraries In My Environment And The Ones Ran In The Node (I Manually Changed The Installed Packages And Everything Worked). How Can I Force Trains To Use Exactly

Hi GrievingTurkey78
How are you getting a different version than what is used at run time? It analyzes the PYTHONPATH just as python does. How can I reproduce it?
Currently you can use Task.add_requirements(package_name, package_version=None). This will not force it though, it is a recommendation (used if it fails to find the package itself). Maybe we can add a force option?! What do you think?
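A minimal sketch (the package name and version are hypothetical); note the call goes before Task.init so the requirement is recorded on the task:

from clearml import Task

# recommend a specific version for the auto-detected requirements
Task.add_requirements("some_package", package_version="1.2.3")
task = Task.init(project_name="examples", task_name="pinned requirement")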

4 years ago