Hi RoundMosquito25
Hmm, I remember this is tricky... What's the clearml version? Also, where is the line you had to hack?
Hi GiganticTurtle0
Is there a simple way to make Task.init compatible with a Dask.distributed client?
Please tell me more 🙂
I think Dask is trying to pickle your Task object (which is not picklable).
You can however create the Task once with Task.init, pass the Task ID to the child processes, and then call Task.init(..., continue_last_task=task_id_here) in each of them.
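Something like this rough sketch (the project/task names and the train() function are made up for illustration):

from clearml import Task
from dask.distributed import Client

def train(task_id, chunk):
    # re-attach to the Task created in the main process instead of pickling it
    task = Task.init(project_name="examples", task_name="dask example",
                     continue_last_task=task_id)
    task.get_logger().report_text("processing chunk {}".format(chunk))
    return chunk * 2

if __name__ == "__main__":
    # create the Task once, in the main process
    task = Task.init(project_name="examples", task_name="dask example")
    client = Client()  # local Dask cluster
    futures = client.map(train, [task.id] * 4, range(4))
    print(client.gather(futures))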
wdyt?
JitteryCoyote63 could you test with rc3?
Thanks! @<1792364603552829440:profile|TestyBeetle31> I'll pass it to the maintainers
Hi ScaryLeopard77
Could that be solved with this PR?
https://github.com/allegroai/clearml/pull/548
Any insight will help, if you can provide the log of the Task that did get stuck, that would be a good start
Oh I see
but now I'm confused: if this is from code, why aren't you copying the Pipeline ID from the UI?
Regarding the query, it should be something like:
task_to_schedule = Task.get_task(project_name='MyProject/.pipelines/PipelineName', task_name='PipelineName')
ContemplativePuppy11
Yes, nice move. My question was to make sure that the steps are not run in parallel, because each one builds upon the previous one.
if they are "calling" one another (or passing data) then the pipeline logic will deduce they cannot run in parallel π basically it is automatic
So my takeaway is that if the funcs are class methods, the decorators won't break, right?
In theory, but the idea of the decorator is that it tracks the return value so it "knows" how t...
My apologies, let me rephrase:
if you are using pip as the package manager and not running in docker mode, trains-agent cannot touch the cuda/cudnn driver libraries (the actual .so files).
If you want to verify, you can check: echo $LD_LIBRARY_PATH
SmarmyDolphin68 if you can reproduce the behavior in a standalone script , it will really accelerate fixing this issue
Go to https://demoapp.trains.allegro.ai/profile
You should see something like 0.16.2-123
Could you amend the original snippet (or verify that it also produces plots in debug samples)?
(Basically I need something that I can run 🙂)
I figured out the problem...
Nice!
Unfortunately, the hyperparameters in the configuration object seem to take precedence over the hyperparameters in the Hyperparameters section
Hmm, what do you mean by that? How did you construct the code itself? (you should be able to "prioritize" one over the other)
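Just to make sure we are talking about the same two mechanisms, a minimal sketch (names are made up): task.connect() populates the Hyperparameters section, while task.connect_configuration() creates a configuration object:

from clearml import Task

task = Task.init(project_name="examples", task_name="params vs config")

# this populates the Hyperparameters section (editable field by field in the UI)
params = {"lr": 0.001, "batch_size": 32}
params = task.connect(params)

# this creates a configuration object (stored as a single configuration blob)
model_config = {"layers": 4, "dropout": 0.1}
model_config = task.connect_configuration(model_config, name="model_config")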
Hmm, as a quick solution you can use the custom example and load everything manually:
https://github.com/allegroai/clearml-serving/blob/219fa308df2b12732d6fe2c73eea31b72171b342/examples/custom/preprocess.py
But you have a very good point: I'm not sure how one could know what the correct xgboost class is, do you?
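A rough sketch of what loading manually could look like in such a custom Preprocess class (method names follow the linked example, but exact signatures may differ between clearml-serving versions, and the xgboost Booster load is only an assumption):

from typing import Any
import numpy as np
import xgboost as xgb  # assuming the served model is an xgboost Booster

class Preprocess(object):
    # the class must be named Preprocess, as in the linked custom example
    def __init__(self):
        self._model = None

    def load(self, local_file_name: str) -> Any:
        # load the model file manually instead of relying on framework auto-detection
        self._model = xgb.Booster()
        self._model.load_model(local_file_name)
        return self._model

    def preprocess(self, body: dict, *args, **kwargs) -> Any:
        # turn the request payload into a DMatrix
        return xgb.DMatrix(np.atleast_2d(body.get("features", [])))

    def process(self, data: Any, *args, **kwargs) -> Any:
        return self._model.predict(data)

    def postprocess(self, data: Any, *args, **kwargs) -> dict:
        return {"predictions": data.tolist()}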
Hi SubstantialElk6
If you are using boto to access anything that is not AWS S3, you have to add both the address and the port, and make sure you configure the "secure" flag.
See the example in clearml.conf:
https://github.com/allegroai/clearml-agent/blob/176b4a4cdec9c4303a946a82e22a579ae22c3355/docs/clearml.conf#L247
aws {
    s3 {
        credentials: [
            {
                host: "my-minio-host:9000"
                key: "12345678"
                secret: "12345678"
                secure: false
            }
        ]
    }
}
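On the Python side, a host configured this way is then referenced by address and port in the URI, something like this sketch (the bucket name is made up):

from clearml import Task

# artifacts/models go to the MinIO bucket behind the host configured above
task = Task.init(
    project_name="examples",
    task_name="minio output",
    output_uri="s3://my-minio-host:9000/my-bucket",
)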
others from the local environment and this causes a conflict when importing the attr module
Inside the docker? "local environment"?
This is all under "root" no?
Hi SquareFish25
Sure, here are a few:
HPO:
https://github.com/allegroai/trains/blob/master/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py
Pipeline:
https://github.com/allegroai/trains/blob/master/examples/pipeline/pipeline_controller.py
Automation:
https://github.com/allegroai/trains/blob/master/examples/automation/task_piping_example.py
No, should be fine... Let me see if I can get a Windows box 🙂
why are all defined components shown in the UI Results/Plots/PipelineDetails/ExecutionDetails section? Wouldn't it make more sense to show only the ones that are used in that pipeline?
They are listed there (because of the decorator, you basically "say" these are steps, so they are listed); the actual resolving (i.e. which steps are actually being called) is done in "real time".
Make sense?
EnviousStarfish54 thanks again for the reproducible code, it seems this is a Web UI bug, I'll keep you updated.
We are planning an RC later this week, I'll make sure this fix is part of it
Hi NastyFox63 could you verify the fix works?
pip install git+
Yes, I think we just found out it breaks clearml 🙂
could you test with the latest stable, just in case ?
(I'll make sure we have an RC that supports the hydra dev version)
Okay that seems to explain it. Now the question is why it installed it in the wrong place.
Hi SubstantialElk6
noted that clearml-serving does not support Spacy models out of the box and
So this is a good point.
To add any missing package to the preprocessing docker, you can just add it in the following environment variable here: https://github.com/allegroai/clearml-serving/blob/d15bfcade54c7bdd8f3765408adc480d5ceb4b45/docker/docker-compose.yml#L83
EXTRA_PYTHON_PACKAGES="spacy>1"
Regarding a custom engine, basically this is supported with --engine custom
you c...
However, are you thinking of including this callbacks features in the new pipelines as well?
Can you see a good use case? (I mean, the infrastructure supports it, but sometimes too many arguments are just confusing, no?!)