Would be good to have frequent-ish releases if possible 🙂
It might be better suited than execute_remotely for your specific workflow
Exactly
Is there a published package version for these?
More like testing, especially before a pipeline
Hmm yes, that makes sense.
Any chance you can open a github issue on it?
Let me see if I understand: basically, do not limit the clone on execute_remotely, right?
When did this PipelineDecorator come in? Looks interesting
A few days ago (I think)
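For anyone following along, the decorator-based pipeline roughly looks like the sketch below. The step/pipeline names and values are made up for illustration, and the exact decorator arguments may differ between clearml versions:
```python
# Hedged sketch of the decorator-based (v2) pipeline API; the step and
# pipeline names below are illustrative, not from this thread.
from clearml.automation.controller import PipelineDecorator


@PipelineDecorator.component(return_values=["data"])
def step_one(source_url):
    # runs as its own Task when the pipeline is executed remotely
    return {"url": source_url, "rows": 100}


@PipelineDecorator.component(return_values=["accuracy"])
def step_two(data):
    # 'data' arrives as a lazily resolved object proxy of step_one's output
    return 0.9


@PipelineDecorator.pipeline(name="demo pipeline", project="examples", version="0.1")
def my_pipeline(source_url="s3://bucket/data"):
    data = step_one(source_url)
    print("accuracy:", step_two(data))


if __name__ == "__main__":
    # run the controller and all steps in the local process for debugging
    PipelineDecorator.run_locally()
    my_pipeline()
```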
It is very cool! Check out the full object-proxy interaction on the actual pipeline logic; this might be better for your workflow: https://github.com/allegroai/clearml/blob/c85c05ef6aaca4e07e739ba53d13f16e6a994b05/clearml/backend_interface/task/populate.py#L489
create_task_from_function
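Roughly what that looks like in practice (a sketch only; the train function, project/task names and the s3 URL are illustrative, and the argument names are read off the populate.py linked above, so verify them against the clearml version you have installed):
```python
# Hedged sketch: turn a plain function into a standalone Task and enqueue it,
# rather than calling execute_remotely() on the running notebook task.
# Argument names (a_function, function_kwargs, function_return, ...) follow
# the linked populate.py source; double-check against your installed version.
from clearml import Task
from clearml.backend_interface.task.populate import CreateFromFunction


def train(dataset_url, epochs=3):
    # stand-in training function; its return value is stored as an artifact
    return {"epochs": epochs, "accuracy": 0.9}


task = CreateFromFunction.create_task_from_function(
    a_function=train,
    function_kwargs={"dataset_url": "s3://bucket/data", "epochs": 5},
    function_return=["metrics"],
    project_name="examples",
    task_name="train from function",
)
Task.enqueue(task, queue_name="default")
```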
I was looking at options to implement this just today, as part of the same remote debugging I was talking about in this thread
Any comments/ideas on how to make it better will be more than welcomed 🙂
Any updates on the trigger and schedule docs?
I think examples are already pushed, docs still in progress.
BTW: pipeline v2 examples are also out:
https://github.com/allegroai/clearml/blob/master/examples/scheduler/trigger_example.py
https://github.com/allegroai/clearml/blob/master/examples/pipeline/full_custom_pipeline.py
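For the impatient, the trigger example boils down to something like the sketch below. This is written from memory, not copied from the example, so parameter names such as schedule_task_id / schedule_queue should be verified against trigger_example.py and the docstrings:
```python
# Rough sketch of the TriggerScheduler flow shown in trigger_example.py.
# Parameter names (schedule_task_id, schedule_queue, trigger_project, ...)
# are quoted from memory; verify them against the linked example.
from clearml.automation import TriggerScheduler

trigger = TriggerScheduler(pooling_frequency_minutes=3.0)

# when a model in the 'examples' project is published, clone the given task
# and push it into the 'default' queue
trigger.add_model_trigger(
    name="retrain on published model",      # illustrative trigger name
    schedule_task_id="<task-id-to-clone>",  # placeholder task id
    schedule_queue="default",
    trigger_project="examples",
    trigger_on_publish=True,
)

# keep the scheduler itself alive on the services queue
trigger.start_remotely(queue="services")
```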
Hi TrickySheep9,
So running a local notebook with execute_remotely kills the kernel? What version of clearml are you running?
Meanwhile, check out CreateFromFunction.create_task_from_function(...)
It might be better suited than execute_remotely for your specific workflow 🙂
I think RC should be out in a day or two, meanwhile: pip install git+https://github.com/allegroai/clearml.git
More like testing, especially before a pipeline
Any chance you can open a github issue on it?
Will do!
do not limit the clone on execute_remotely,
Yes
When did this PipelineDecorator come in? Looks interesting 🙂
I am providing a helper to run a task in a queue after running it locally in the notebook
Is this part of a pipeline process, or just part of the workflow?
(reason for asking is that if this is a pipeline thing we might be able to support it in v2)
In order to clone the Task it needs to complete syncing, which implies closing it. I guess the use case of execute_remotely while the Task is still running was not considered. How / why is this your workflow? Specifically, how does Jupyter get into the picture?
I am providing a helper to run a task in a queue after running it locally in the notebook
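i.e. something along these lines (a sketch of the idea, not the actual helper; the clone/exit_process flags are exactly the part being discussed above):
```python
# Sketch of the kind of notebook helper described here (not the actual code):
# run/debug the Task locally first, then push a clone to a queue without
# killing the Jupyter kernel. clone=True together with exit_process=False is
# the combination the clone limitation above is about.
from clearml import Task

task = Task.init(project_name="examples", task_name="notebook experiment")

# ... run / debug the experiment interactively in the notebook ...

def run_in_queue(queue_name="default"):
    # enqueue a clone of the current task; exit_process=False keeps the
    # local kernel alive instead of terminating the process
    task.execute_remotely(queue_name=queue_name, clone=True, exit_process=False)
```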