I remember being told that the ClearML.conf on the client will not be used in a remote execution like the above, so I think this was the problem.
SubstantialElk6 the configuration should be set on the agent's machine (i.e. clearml.conf that is on the machine running the agent)
- Users have no way of defining their own repo destination of choice.
In the UI you can specify a different destination for the models/artifacts in the "Execution" tab, under Output "destination". Is this...
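If I understand correctly, the same can also be set per task in code via output_uri. A rough sketch (project/task names and the bucket URI are placeholders; any supported storage URI such as s3://, gs://, azure:// or a shared folder should work):
from clearml import Task

# output_uri overrides the default files server for this task's models/artifacts
task = Task.init(
    project_name="examples",
    task_name="custom output destination",
    output_uri="s3://my-bucket/models",
)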
SmallBluewhale13 in your code, what are you getting when you print the version:
from clearml import __version__
print(__version__)
from clearml import TaskTypes
That will only work if you are using the latest from the GitHub, I guess the example code was modified before a stable release ...
For example, store inference results, explanations, etc. and then use them in a different process. I currently use a separate database for this.
You can use artifacts for complex data and then retrieve them programmatically.
Or you can manually report scalars / plots etc. with the Logger class; you can also retrieve them with task.get_last_scalar_metrics
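Something along these lines, as a rough sketch (project/task names and the artifact contents are placeholders):
from clearml import Task

task = Task.init(project_name="examples", task_name="store results")

# store a complex object (e.g. inference results) as an artifact
task.upload_artifact(
    name="inference_results",
    artifact_object={"ids": [1, 2, 3], "scores": [0.9, 0.7, 0.8]},
)

# manually report a scalar with the Logger
task.get_logger().report_scalar(title="accuracy", series="val", value=0.91, iteration=0)

# later, from a different process, retrieve both
other = Task.get_task(project_name="examples", task_name="store results")
results = other.artifacts["inference_results"].get()
metrics = other.get_last_scalar_metrics()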
I see that you guys have made a lot of progress in the last two months! I'm excited to dig in
Thank you!
You can further di...
Thanks RipeGoose2 !
clearml logging starts from n+n (that's how it seems) for non-explicit
I have to say it looks like the expected behavior, I think.
Basically matching the TB, no?
okay, let me see if I can nail down the issue
Anyhow, from your response, is it safe to assume that mixing clearml
code with the core ML task code has not occurred to you as something problematic to start with?
Correct! Actually we believe it makes it easier, as worst case scenario you can always run clearml in "offline" mode without the need for the backend, and later, if needed, you can import that run.
That said, regarding (3), the "mid" interaction is always the challenge, clearml will do the auto tracking/upload of the mod...
Building the pipeline at runtime from external configuration is very cool!!
I think nested components is exactly the correct solution, and it is a great use case.
Hi GrittyKangaroo27
Is it possible to import user-defined modules when wrapping tasks/steps with functions and decorators?
Sure, any package (local included) can be imported, and will be automatically listed in the "installed packages" section of the pipeline component Task
(This of course assumes that on a remote machine you could do "pip install <package>")
Make sense ?
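For example, a rough sketch with a hypothetical local package my_local_package (the agent would still need to be able to pip-install or otherwise find it on the remote machine):
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["cleaned"], packages=["my_local_package"])
def preprocess(data):
    # importing inside the component ensures it is picked up for the component Task
    from my_local_package import clean
    return clean(data)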
WickedGoat98 Same for me, let me ask the UI guys, I think this is a UI bug.
Also maybe before you post the article we could release a fix to both, what do you think?
EDIT:
Never mind, I just saw the Medium link, very cool!!!
Hmm I think the approach in general would be to create two pipeline tasks, then launch them from a third pipeline or trigger externally? If on the other hand it makes sense to see both pipelines on the same execution graph, then the nested components makes a lot of sense. Wdyt?
The problem comes from ClearML thinking it starts from iteration 420, and then adding the iteration number again (421), so it starts logging from 420+421=841
JitteryCoyote63 Is this the issue ?
Okay, progress.
What are you getting when running the following from the git repo folder:
git ls-remote --get-url origin
Thanks ReassuredTiger98 , yes that makes sense.
What's the python version you are using ?
Also, the IDs as an entry in the Configuration will not be clickable in the web interface, right?
No, but on the other hand, it will be editable if you clone the Task.
Which brings me to a different scenario,
In the original one, the Main Task created the Dataset, i.e. Output Dataset (and stored it both ways).
I could think of a situation where the Task is using the Dataset as input (say preprocessing or training); then we might want to enable users to clone and change the Input dataset. wdyt?
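As a rough sketch of what I mean (the dataset ID and names are placeholders), the input dataset could be exposed as a connected parameter so a cloned Task can simply point at a different one:
from clearml import Task, Dataset

task = Task.init(project_name="examples", task_name="training")

# connected parameters are editable in the UI when the Task is cloned
params = {"input_dataset_id": "abc123"}  # placeholder ID
task.connect(params)

dataset_path = Dataset.get(dataset_id=params["input_dataset_id"]).get_local_copy()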
MuddySquid7 I might have found something, and this is very very odd, it seems it will Not upload any new images post the history size, which is very odd considering the number of users actively using this feature...
Do you want to try a hack to see if it solved your issue ?
Hi GaudySnake67
Task.create is designed to create an external Task, not one from the current running process.
Task.init is for creating a Task from your current code, and this is why you have all the auto_connect parameters. Does that make sense ?
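Roughly, as a sketch (the repo/script below are just illustrative placeholders):
from clearml import Task

# Task.init: instruments the current running process (with the auto_connect_* controls)
task = Task.init(project_name="examples", task_name="my experiment")

# Task.create: registers an external Task (e.g. a script in some repo) without running it here
external = Task.create(
    project_name="examples",
    task_name="external task",
    repo="https://github.com/allegroai/clearml.git",
    script="examples/reporting/scalar_reporting.py",
)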
Hi UnsightlySeagull42
Do you mean how to pass user/pass (user/token) to the clearml-agent so it can clone your repository ?
https://github.com/allegroai/clearml-agent/blob/a2db1f5ab5cbf178840da736afdc370cfff43f0f/docs/clearml.conf#L18
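i.e. on the agent's machine, something like this in clearml.conf (the values are placeholders; for most git hosts a personal access token goes in git_pass):
agent {
    git_user: "my-git-username"
    git_pass: "my-personal-access-token"
}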
I understand, but how do you launch the clearml-agent itself:
clearml-agent daemon --detached --queue default --docker
DeliciousBluewhale87
node.base_task_id is the base task, which will always be in draft mode. Instead we should use node.executed, which references the currently executed node.
YES, maybe we should add that into the example, so it is clearer ? WDYT?
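For example, a rough sketch of reading it from a post-execution callback (project/task/step names are placeholders):
from clearml import Task
from clearml.automation.controller import PipelineController

def post_step(pipeline, node):
    # node.executed is the ID of the Task that actually ran for this step
    executed_task = Task.get_task(task_id=node.executed)
    print(node.name, executed_task.get_last_scalar_metrics())

pipe = PipelineController(name="my pipeline", project="examples", version="1.0.0")
pipe.add_step(
    name="train",
    base_task_project="examples",
    base_task_name="train task",
    post_execute_callback=post_step,
)
pipe.start()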
Hi SubstantialElk6
I can't see that it was removed, could you send the full log ?
Isn't that risky? not knowing you need a package ?
How do you actually install it on the remote machine with the agent ?
Basically use the template, we will deprecate the override option soon
it knows itβs a notebook and automatically adds the notebook as an artifact right?
correct
and the uncommitted changes become the notebook converted to a script?
correct
In one case I am seeing actual git diff coming in instead of the notebook.
it might be that there is both a git repository and a notebook, and the git diff is shown before the notebook is detected and displayed instead ? (there is a watchdog refreshing the notebook every 30 sec or so)
Shouldn't this be a real value and not a template
you mean value being pulled to the pod that failed ?
This doesn't seem to be running inside a container...
What's the clearml-agent launch command you are using ? (i.e. do you have --docker flag)
So it makes sense it installs v8.0.1
(maybe originally you provided no version and it installed the latest one)
This is basically pip doing the package version resolving