files_server:
://genuin-ai/
should be:
files_server:
AbruptWorm50 my apologies, I think I misled you. Yes, you can pass generic arguments to the optimizer class, but specifically for optuna this is disabled (not sure why)
Specifically to your case, the way it works is:
your code logs to tensorboard, clearml catches the data and moves it to the Task (on clearml-server), optuna optimization is running on another machine, trial values are manually updated (i.e. the clearml optimization pulls the Task reported metric from the server and updat...
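For context, a minimal HyperParameterOptimizer setup with the Optuna backend might look like the sketch below; the base task ID, parameter name, and metric title/series are placeholders, not values from this thread:
```python
from clearml.automation import HyperParameterOptimizer, UniformParameterRange
from clearml.automation.optuna import OptimizerOptuna

optimizer = HyperParameterOptimizer(
    # the Task whose tensorboard scalars ClearML captured (placeholder ID)
    base_task_id="<base_task_id>",
    hyper_parameters=[
        UniformParameterRange("General/learning_rate", min_value=1e-4, max_value=1e-1),
    ],
    # the optimizer pulls this reported metric from the clearml-server
    objective_metric_title="validation",
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=OptimizerOptuna,
    max_number_of_concurrent_tasks=2,
)
optimizer.start()
optimizer.wait()
optimizer.stop()
```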
Found it, definitely a bug in the callback, it has no effect on the HPO process itself
I put two models in the same endpoint, then only one was running,
without providing a version number, you are overriding the models (because this is the same endpoint)
I started another docker container with a different port number, and then the curls to the new model endpoint (with the new port) started working
Seems like misconfiguration on the first one?
, which apparently I can't specify when I establish the model endpoint, but I need to re-compose the docker container by...
Hi FierceHamster54
Dataset download is already multi-threaded
But yes, get_local_copy() is thread / process safe
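As a rough sketch (the dataset project/name are placeholders), fetching the same dataset from several threads is safe:
```python
from concurrent.futures import ThreadPoolExecutor
from clearml import Dataset

def fetch_dataset():
    # placeholder project/name; get_local_copy() downloads (multi-threaded)
    # into a read-only cache folder and is safe to call concurrently
    ds = Dataset.get(dataset_project="examples", dataset_name="my_dataset")
    return ds.get_local_copy()

with ThreadPoolExecutor(max_workers=4) as pool:
    paths = list(pool.map(lambda _: fetch_dataset(), range(4)))
print(paths)
```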
Hi @<1523704157695905792:profile|VivaciousBadger56>
No, these are 3 different ways of building pipelines.
Creating from decorators is recommended when each component can be easily packaged into a single function (every function can have an accompanying repository).
The idea here is that it is very easy to write complex execution logic; basically the automagic does the serialization/deserialization, so you can write pipelines the same way you would write Python code (see the sketch after this message).
Creating from Tasks is a good match if you need to ...
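For the decorator approach, a minimal sketch might look like this (the project, pipeline, and function names are placeholders):
```python
from clearml.automation.controller import PipelineDecorator

# each decorated function becomes a pipeline component Task
@PipelineDecorator.component(return_values=["data"])
def load_data(source_url):
    import pandas as pd
    return pd.read_csv(source_url)

@PipelineDecorator.component(return_values=["accuracy"])
def train(data):
    # ... train something and return a metric ...
    return 0.9

@PipelineDecorator.pipeline(name="demo pipeline", project="examples", version="0.1")
def run_pipeline(source_url):
    data = load_data(source_url)
    print("accuracy:", train(data))

if __name__ == "__main__":
    # run locally for debugging; drop this call to enqueue components on agents
    PipelineDecorator.run_locally()
    run_pipeline(source_url="https://example.com/data.csv")
```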
Hi @<1727497172041076736:profile|TightSheep99>
Yes it can, it will upload the meta-data as well as the files (it will also do de-dup and will not upload files that already exist in the dataset, based on the hash of the file content)
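A rough sketch of that flow (project/dataset names and the local path are placeholders); files already present in the parent version are detected by content hash and skipped on upload:
```python
from clearml import Dataset

# placeholder project / dataset names and local path
parent = Dataset.get(dataset_project="examples", dataset_name="my_dataset")
child = Dataset.create(
    dataset_project="examples",
    dataset_name="my_dataset",
    parent_datasets=[parent],
)
# add_files() hashes the content; unchanged files are de-duplicated
child.add_files(path="/data/new_batch")
child.upload()    # only new/changed files are actually uploaded
child.finalize()
```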
suspect permissions, but not entirely sure what and where
Seems like it.
Check the config file on the agent machine
https://github.com/allegroai/clearml-agent/blob/822984301889327ae1a703ffdc56470ad006a951/docs/clearml.conf#L18
https://github.com/allegroai/clearml-agent/blob/822984301889327ae1a703ffdc56470ad006a951/docs/clearml.conf#L19
Hmm, let me see if you can somehow "signal" to the subprocess that it should not use the main process Task. (btw: are you forking or spawning a subprocess?)
What do you mean by "modules first and find a way to install that package" ?
Are those modules already in wheels? Are they part of a git repository?
(the pipeline component can also start inside a git repository it clones)
DepressedFox45
you can just copy/add this section
https://github.com/allegroai/clearml-agent/blob/e43f31eb80f9399da01dc5432cdacdf81c1bd084/docs/clearml.conf#L15
Hi LovelyHamster1
Could you think of a toy code that reproduces this issue ?
I want pipeline / task dispatch to be reported and monitored outside of clearml. For example, I might want to log the dispatch event in some non-clearml system and then monitor the health of the pipeline and alert if it is pending for too long.
Hmm interesting, so like a callback?!
I'm thinking a callback that is executed after the Pipeline is sent, but once the callback is done, the pipeline process leaves?
Does that make sense ?
I might want to dispatch other jobs from within the same p...
maybe worth updating the main Readme.md in the GitHub repo.. if someone tries to follow the instructions there it breaks
Hmm, I thought we already did. Yes, you are absolutely correct, I'll make sure we do
FiercePenguin76 in the Task's execution tab, under "script path", change it to "-m filprofiler run catboost_train.py".
It should work (assuming the "catboost_train.py" is in the working directory).
GiganticTurtle0 what's the Dataset Task status?
I did change the
instead of 8080?
So this is the issue
SubstantialElk6
Regrading cloning the executed Task:
In the pip requirements syntax, "@" is a hint that tells pip where to find the package if it is not preinstalled.
Usually when you find the @ /tmp/folder it means the package was preinstalled (usually pre-installed in the docker).
What is the exact scenario that caused it to appear? (this was always the case, before v1 as well)
For example, the zipp package is installed from pypi by default and not from a local temp file.
Your fix b...
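For illustration, the two forms look like this in a requirements file (the package names and paths below are made up, not taken from this thread):
```
# resolved from PyPI as usual
zipp==3.6.0
# "@" direct reference: pip is told exactly where to get the package
some_package @ file:///tmp/some_package-1.0-py3-none-any.whl
other_package @ git+https://github.com/example/other_package.git@v1.0
```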
Thanks, yes you are correct, the color is derived from the series name, so I guess the issue is that the name+ID is not kept in full screen
So why is it trying to upload to "//:8081/files_server:" ?
What do you have in the trains.conf on the machine running the experiment ?
where people can do @'s for experiments/projects/tasks and even comparisons ...
ohhh I like that! For me this throws me directly to Slack integration.
I think my main question is, "is the discussion ephemeral?" In other words, is this an ongoing discussion that later no one will care about, or are we creating some "knowledge base" that we want to share later?
Also, by "address bar at the top", i assume you mean address url right?
yes... apologies for the phrasing, it was w...
I think I found something relating to the issue on the subprocess not logging. Let me check if we can share something quickly
In fact, I assume we need to write our own custom HyperParameterOptimizer, am I right?
Yes exactly! it should be very easy
Just inherit from RandomSearch and change create_job
https://github.com/allegroai/clearml/blob/d45ec5d3e2caf1af477b37fcb36a81595fb9759f/clearml/automation/optimization.py#L1043
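A minimal sketch of that idea (the subclass name and the extra logic are placeholders):
```python
from clearml.automation import RandomSearch

class MyRandomSearch(RandomSearch):
    def create_job(self):
        # draw the next candidate the same way RandomSearch does
        job = super().create_job()
        if job is None:
            return None
        # placeholder for custom logic, e.g. inspect or adjust the sampled
        # hyper-parameters before the job is enqueued
        return job
```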
AttractiveCockroach17 could it be Hydra actually kills these processes?
(I'm trying to figure out if we can fix something with the hydra integration so that it marks them as aborted)
DeliciousBluewhale87
node.base_task_id is the base task, which will always be in draft mode. Instead we should use node.executed, which references the currently executed node.
YES, maybe we should add that into the example, so it is clearer? WDYT?
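For the example, a post-execute callback could look like this sketch (the project, task names, and the metric printing are placeholders for illustration):
```python
from clearml import Task
from clearml.automation import PipelineController

def step_completed(pipeline, node):
    # node.base_task_id is the draft template; node.executed is the ID of the
    # Task that actually ran for this step
    executed_task = Task.get_task(task_id=node.executed)
    print(node.name, executed_task.get_last_scalar_metrics())

pipe = PipelineController(name="demo", project="examples", version="0.1")
pipe.add_step(
    name="train",
    base_task_project="examples",
    base_task_name="train task",          # placeholder base task
    post_execute_callback=step_completed,
)
pipe.start()
```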
load_model will get a link to a previously registered URL (i.e. it searches for a model pointing to the specific URL; if it finds it, it will return the Model object)
HealthyStarfish45
No, it should work
Hi VexedCat68
Are we talking youtube videos? docs? courses?
