MagnificentBear85
Moderator
6 Questions, 21 Answers
  Active since 08 June 2023
  Last activity 19 days ago

Reputation: 0
Badges (1): 19 × Eureka!
0 Votes 4 Answers 896 Views
Is there some example of how to develop an HPO in a pipeline setup where each hyperparameter setup is its own step again? Should we first mimic a base task ...
one year ago
0 Votes 3 Answers 541 Views
Hi there, is there a way to save a model simply to the fileserver such that the MODEL URL will point there and not to a local disk (I am running in docker co...
6 months ago
0 Votes 5 Answers 69 Views
Hi guys, I have a (potentially very stupid) but important problem. I moved the server to a new machine and hooked up the fileshare that we use for storage. I...
19 days ago
0 Votes 2 Answers 96 Views
one month ago
0 Votes 13 Answers 95 Views
Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from here, ...
one month ago
0 Votes 3 Answers 78 Views
My current training setup is a hyperparameter optimization using the TPESampler from Optuna. For configuration we use Hydra. There is a very nice plugin that...
22 days ago
0 Is there some example of how to develop an HPO in a pipeline setup where each hyperparameter setup is its own step again? Should we first mimic a base task for example?

I'm now thinking I need some main process that first runs a base_template task so that everything gets initialized properly. In the same process, start the HPO, which will add subtasks to the queue. This main process (also a task) will then wait until all other tasks (i.e. the hyperparameter setups) have completed before wrapping up and reporting back.
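
A rough sketch of this controller pattern, assuming ClearML's HyperParameterOptimizer is used to clone a previously executed base task and enqueue one subtask per hyperparameter setup. The project, task, queue, and parameter names here are placeholders, not taken from the thread:

from clearml import Task
from clearml.automation import (
    HyperParameterOptimizer,
    UniformParameterRange,
    DiscreteParameterRange,
)

# The controller is itself a task, so it shows up in the UI and can report back.
controller = Task.init(project_name="HPO", task_name="hpo_controller")

# Reuse an already-executed base_template task as the blueprint to clone.
base_task = Task.get_task(project_name="HPO", task_name="base_template")

optimizer = HyperParameterOptimizer(
    base_task_id=base_task.id,
    hyper_parameters=[
        UniformParameterRange("General/learning_rate", min_value=1e-4, max_value=1e-1),
        DiscreteParameterRange("General/batch_size", values=[16, 32, 64]),
    ],
    objective_metric_title="validation",
    objective_metric_series="rmse",
    objective_metric_sign="min",
    max_number_of_concurrent_tasks=4,
    execution_queue="default",
)

optimizer.start()   # subtasks (one per hyperparameter setup) are enqueued from here
optimizer.wait()    # block until all subtasks have completed
print([t.id for t in optimizer.get_top_experiments(top_k=3)])
optimizer.stop()    # wrap up before the controller task finishes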

one year ago
0 Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from ...

Oh yeah, one more thing. The initial link you sent me contains the snippet that is written to file using cat, but for me it only works with a simple echo on a single line. If I copy from the website, it inserts weird end-of-line characters that mess it up (at least that's my hypothesis) - so you might want to consider putting a warning on the website or updating to the instruction below:

echo 'db.model.find({uri:{$regex:/^http:\/\/10\.0\.0\.12:8081/}}).forEach(function(e,i) { e.uri = e.uri.r...
one month ago
0 Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from ...

OK, even weirder now - the model paths seem updated to 172., but I also have the CSVs as artifacts that are still at 10.
Any clues @<1722061389024989184:profile|ResponsiveKoala38> ?

one month ago
0 Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from ...

Awesome, thanks very much for this detailed reply! This indeed seems to have updated every URL.
One note - I had to pass the mongo host as --mongo-host None

one month ago
0 Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from ...

Could it be that here, in Failed getting object 10.0.0.12:8081/Esti/, the URL is without the 'http' part? Do I also have to replace all those occurrences?

one month ago
0 Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from ...

Of course, you can see it in the error message that I already shared - but here is another one just in case.

.venv/bin/python -c "from clearml import Dataset; Dataset.get(dataset_project='Esti', dataset_name='bulk_density')"
2024-10-09 18:56:03,137 - clearml.storage - WARNING - Failed getting object size: ValueError('Failed getting object 10.0.0.12:8081/Esti/.datasets/bulk_density/bulk_density.f66a70c6cda440dd8fdaccb52d5e9055/artifacts/state/state.json (401): UNAUTHORIZED')
2024-10-09 ...
one month ago
0 Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from ...

Thanks for the quick and helpful answer @<1722061389024989184:profile|ResponsiveKoala38> ! It works. At least, in the sense that I can see my artifacts are updated. However, my datasets are still at the wrong address. How do I update those as well?

one month ago
0 Hi everyone, I am updating the self-hosted server to a public IP. However, all my datasets cannot be downloaded anymore. I followed instructions from ...

Ah okay, this Python script is meant to replace all the other scripts? That makes sense then 🙂

one month ago
0 My current training setup is a hyperparameter optimization using the TPESampler from Optuna. For configuration we use Hydra. There is a very nice plugin that lets you define the hyperparameters in the config files using the ...

Yeah, both of them. The HPO, though, requires everything to be defined in Python code. The Hydra config is parsed and stored nicely, but it isn't recognized as describing an HPO.

20 days ago
0 Is there some example of how to develop an HPO in a pipeline setup where each hyperparameter setup is its own step again? Should we first mimic a base task for example?

Thanks for responding quickly. For this specific use case I need an sklearn regression model (trained with 10-fold CV) that I want to hyper-optimize using Optuna. As my datasets are updated regularly, I'd like to define all of this in a pipeline so that I can easily run everything again once the data changes.
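
For what it's worth, a minimal sketch of that kind of objective, assuming a RandomForestRegressor as the sklearn model and synthetic placeholder data instead of the real datasets (parameter names and ranges are illustrative only):

import optuna
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Placeholder data; in the real pipeline this would come from the versioned dataset.
X, y = make_regression(n_samples=500, n_features=20, noise=0.1, random_state=0)

def objective(trial):
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 50, 400),
        "max_depth": trial.suggest_int("max_depth", 2, 16),
        "min_samples_leaf": trial.suggest_int("min_samples_leaf", 1, 10),
    }
    model = RandomForestRegressor(random_state=0, **params)
    # 10-fold CV; negated MSE so that larger is better for the maximizing study.
    scores = cross_val_score(model, X, y, cv=10, scoring="neg_mean_squared_error")
    return scores.mean()

study = optuna.create_study(direction="maximize", sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=20)
print(study.best_params)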

one year ago