AgitatedDove14
Moderator
49 Questions, 8076 Answers
  Active since 10 January 2023
  Last activity 9 months ago

Reputation: 0
Badges: 25 × Eureka!
0 Unrelated Problem (Or Is It?) The ClearML Built-In Cleanup Service Fails

Yes I do have a GOOGLE_APPLICATION_CREDENTIALS environment variable set, but nowhere do we save anything to GCS. The only usage is in the code which reads from BigQuery

Are you certain you have no artifacts on GCS?
Are you saying that if GOOGLE_APPLICATION_CREDENTIALS is set and clearml.conf contains no "project" section, it crashes when starting?
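If it helps, here is a minimal sketch (the task ID is a placeholder) for checking whether any of a task's registered artifacts actually point at a gs:// bucket:

from clearml import Task

# hypothetical task id, purely for illustration
task = Task.get_task(task_id="<your-task-id>")
for name, artifact in task.artifacts.items():
    # anything stored on Google Cloud Storage will show a gs:// url here
    print(name, artifact.url)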

3 years ago
0 Hi! I Have A Question About The Data Management Part Of ClearML. Does ClearML Support Data Versioning Like In LakeFS? Is It Similar? Maybe There Are Some Interesting Pros And Cons?

You mean, is one solution better than combining, maintaining, and automating 3+ solutions (DVC/LakeFS + MLflow + Kubeflow/Airflow)?
Yes, I'd say it is. BTW, if you already have Airflow running for other automations, you can very easily combine it with ClearML and have a single Airflow automation for everything; the main difference is that Airflow then only launches logic, never the actual compute/data (which are launched and scaled via ClearML).
Does that make sense?
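For reference, a minimal sketch of ClearML's dataset versioning (project and dataset names are placeholders); each new version is created on top of its parent, so lineage is tracked much like in DVC/LakeFS:

from clearml import Dataset

# create a new version on top of an existing one
parent = Dataset.get(dataset_project="my_project", dataset_name="raw_data")
ds = Dataset.create(
    dataset_project="my_project",
    dataset_name="raw_data",
    parent_datasets=[parent.id],
)
ds.add_files("data/new_batch/")  # only the difference vs. the parent is uploaded
ds.upload()
ds.finalize()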

8 months ago
0 Hi, I Try To Run Locally

Okay this seems correct...
Can you share both YAML files (server & serving) and the env file?

2 years ago
0 Is It Possible To Give The Agent Access To Install Private Pip Packages (Needs To Be Installed From The Repo)?

I’m not sure if https will work because I want to use ssh keys for creds.

BTW: I was not aware GitHub provides a PyPI-like registry the way Artifactory does, do they?
Regarding SSH keys, they are passed from the host machine (i.e. in venv mode it will use the SSH keys of the user running the agent, and in docker mode they are automatically mapped into the container)
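For example, one way to wire this up (the repo URL is hypothetical, so treat the exact lines as an assumption); the key point is that a git+ssh requirement will be installed by the agent using those mapped SSH keys:

from clearml import Task

# assuming requirements.txt contains a line like (hypothetical repo):
#   git+ssh://git@github.com/acme/private-pkg.git
# the agent clones and installs it using the SSH keys mapped from the host
Task.add_requirements("requirements.txt")  # must be called before Task.init

task = Task.init(project_name="examples", task_name="private package test")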

3 years ago
0 Hi All, I'm Updating My Code To Use Hydra, And Facing An Issue: When I Try To Init A Task In Offline Mode I'm Getting The Following:

Hmm, let me check. I think we changed offline mode to use the latest API version (because by definition it cannot know what the server's version is).
Let me check if you can override it
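For context, a minimal offline-mode sketch (project/task names and the zip path are placeholders); everything is recorded locally and can be imported to the server later:

from clearml import Task

Task.set_offline(offline_mode=True)  # record everything locally, no server calls
task = Task.init(project_name="examples", task_name="offline hydra run")
# ... training code ...
task.close()

# later, from a machine with server access, import the zip the offline run printed:
# Task.import_offline_session("/path/to/offline_session.zip")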

3 years ago
0 Hi, I'm Trying To Use

HappyDove3 where are you running the code?
(the upload is done in the background, but it seems the python interpreter closed?!)
You can also wait for the upload:
task.upload_artifact(name="my artifact", artifact_object=np.eye(3,3), wait_on_upload=True)
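Putting it together, a short sketch (the artifact name is arbitrary) that either waits on the single upload or flushes all pending uploads before the interpreter exits:

import numpy as np
from clearml import Task

task = Task.init(project_name="examples", task_name="artifact upload")
# option 1: block until this specific artifact finishes uploading
task.upload_artifact(name="my artifact", artifact_object=np.eye(3, 3), wait_on_upload=True)
# option 2: upload in the background, then wait for all pending uploads before exiting
task.flush(wait_for_uploads=True)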

3 years ago
0 Hi, A Question About Dataset Storage Suppose I Create A Dataset Like This

Hi MelancholyElk85
So the way datasets now work is that they are actually an entity (a folder) inside a project, all under the hidden .datasets sub-project.
This is so data and tasks are both in the same project, but at the same time will not intersect with sub-projects of the same name. Does that make sense?

2 years ago
0 Hi, A Question About Dataset Storage Suppose I Create A Dataset Like This

I still have name my_name, but the project name is my_project/.datasets/my_name rather than my_project/.datasets

Yes, this is the expected behavior

And I don't see any new projects / subprojects where that dataset creation Task is stored

They are marked "hidden", hence by default you cannot see them in the UI (they will only appear in the Datasets page);
you can turn on the UI hidden flag by going to your settings page and selecting "Con...
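To make the layout concrete, a small sketch (project and dataset names are placeholders) showing where the backing task ends up:

from clearml import Dataset

ds = Dataset.create(dataset_project="my_project", dataset_name="my_name")
ds.add_files("data/")
ds.upload()
ds.finalize()
# the backing task lives under the hidden sub-project "my_project/.datasets/my_name",
# so it appears on the Datasets page rather than in the regular project view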

2 years ago
0 Hi Again, I Was Wondering What Would Be A Good Practice With Respect To Saving Different Datasets (While Preprocessing It In Several Steps/Stages). Mainly With The Use Of Remove_Files(). Is It Ok To Delete Raw Data After Preprocessing For Example? In That

Hi CostlyElephant1
What do you mean by "delete raw data"? Data is always fetched into cached folders and ClearML takes care of cache cleanup.
That said, notice that a mutable copy is written to a target folder you specify; in that case you should definitely delete it after usage. Wdyt?
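A minimal sketch of the two access patterns (project/dataset names and the target folder are placeholders):

from clearml import Dataset

ds = Dataset.get(dataset_project="my_project", dataset_name="raw_data")

# cached, read-only copy: managed by clearml, do not delete it yourself
cached_path = ds.get_local_copy()

# mutable copy written to a folder you choose: safe (and expected) to delete after use
work_path = ds.get_mutable_local_copy(target_folder="/tmp/raw_data_work")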

2 years ago
0 Hi All

CooperativeFox72 could you expand on "not working"?
If you have a yaml file, I would do:
import yaml  # PyYAML

# local_path = './my_config.yaml'
path = task.connect_configuration(local_path, name=name)
if task.running_locally():
    with open(local_path, "r") as config_file:
        my_params_dict = yaml.load(config_file, Loader=yaml.FullLoader)
    my_params_dict['change_me'] = 'new value'
    my_params_text = yaml.dump(my_params_dict)

store back the change, my_params assumed to be the content of the param file (tex...

3 years ago
0 Please Tell Me What RAM Metric Is Tracked By ClearML? What I See In Htop And On The Board Don't Match Even Though It's The Same Server: 20 GB Vs 70 GB

Hi @<1523702932069945344:profile|CheerfulGorilla72>

Please tell me what RAM metric is tracked by ClearML?

The reported free RAM is the free RAM of the entire machine.
Yeah, htop shows odd numbers because it doesn't "count" allocated buffers.
specifically you can see the code here:
None
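To see the discrepancy yourself, here is a small sketch using psutil (which, as far as I know, is what the resource monitoring relies on under the hood); "free" excludes buffers/cache while "available" adds back what the OS can reclaim, which is roughly why two tools can disagree on the same machine:

import psutil

vm = psutil.virtual_memory()
# "free" excludes memory held in buffers/cache, "available" includes reclaimable memory
print(f"total     {vm.total / 1e9:.1f} GB")
print(f"free      {vm.free / 1e9:.1f} GB")
print(f"available {vm.available / 1e9:.1f} GB")
print(f"used      {vm.used / 1e9:.1f} GB")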

one year ago
0 Hi Everyone, I Have A Question About Using

Hi @<1643060801088524288:profile|HarebrainedOstrich43>
try this RC and let me know if it works 🙂

pip install clearml==1.13.3rc1
one year ago
0 Tracking From Experiments To Datasets

Hi AmiableFish73

Hi all - is there an easier way to track the set of datasets used by a particular task?

I think the easiest is to give the Dataset an alias, it will automatically appear in the Configuration section:
Dataset.get(..., alias="train dataset")
wdyt?
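For example (project and dataset names are placeholders), requesting the dataset with an alias registers it under the task's Configuration section:

from clearml import Dataset, Task

task = Task.init(project_name="examples", task_name="train")
# the alias makes this dataset show up in the task's Configuration section
ds = Dataset.get(dataset_project="my_project", dataset_name="train_data", alias="train dataset")
local_path = ds.get_local_copy()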

2 years ago
0 Is It Possible To Disable Vcs-Cache? I Tried To Change The Value From True To False In The Trains.Conf, But It Does Not Affect Anything. I Want To Disable It, Because It Gives An Error When I Run A Project First On Docker And Then On Venv.

Hi MysteriousBee56 ,
Yes, this is a permissions issue: the docker creates all folders as root (since it is the root user running inside the docker). Then, when you execute in venv mode, you are running as your own user, which obviously cannot change root-created folders.

4 years ago
0 Hi, I Am Trying To Use Agent With A Sample, Very Simple Task. But It Gets Stuck And The Task Does Not Finish. In The UI Console I See What I Pasted In The Image. Do You Know What I Might Be Doing Wrong? Agent Is Run In Virtual Env Mode

RoundMosquito25 do notice the agent is pulling the code from the remote repo, so you do need to push your local commits, but the uncommitted changes ClearML will apply for you. Make sense?

2 years ago
0 Heya, Is There Any Plan For Clearml To Leverage The New

Hi FierceHamster54
This is already supported; unfortunately the open-source version only supports static allocation (i.e. you can spin up multiple agents and connect each one to a specific set of GPUs), while the dynamic option (where a single agent allocates jobs across multiple GPUs / slices) is only part of the enterprise edition.
(there is the hidden assumption there that if you spent so much on a DGX you are probably not a small team 🙂 )
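For reference, a rough sketch of what static allocation looks like in the open-source version: spin one agent per GPU (or GPU group) and attach each to its own queue (queue names here are arbitrary):

clearml-agent daemon --detached --queue single_gpu_0 --gpus 0
clearml-agent daemon --detached --queue single_gpu_1 --gpus 1
clearml-agent daemon --detached --queue dual_gpu --gpus 2,3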

2 years ago