UPD: works on 1.7.0 as well; the bug was introduced in 1.8.0
Thanks JitteryCoyote63, just to be clear: is this only in the comparison view, or also on individual Tasks?
LudicrousDeer3 when using Logger you can provide the 'iteration' argument; is this what you are looking for?
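A minimal sketch of what that looks like. The helper, the recorder class and the loss values are made up for illustration; only the `report_scalar(title, series, value, iteration)` call follows ClearML's Logger API, where `iteration` sets the x-axis position of each reported point.

```python
def report_losses(logger, losses):
    """Report one scalar per training step; `iteration` controls the x-axis value."""
    for iteration, loss in enumerate(losses):
        # In a real run, `logger` would be task.get_logger()
        logger.report_scalar(title="train", series="loss", value=loss, iteration=iteration)

# Stand-in recorder so the sketch runs without a ClearML server:
class RecordingLogger:
    def __init__(self):
        self.calls = []

    def report_scalar(self, title, series, value, iteration):
        self.calls.append((title, series, value, iteration))

logger = RecordingLogger()
report_losses(logger, [0.9, 0.5, 0.3])
```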
if they're mission critical, but rather the clearml cache folder?
hmmm... they are important, but only when starting the process. Any specific suggestion?
(and they are deleted after the Task is done, so they are temp)
Thanks DilapidatedDucks58! We ❤ suggestions for improvements 🙂
Did you try to print the page from the browser (I think they can all save it as a PDF these days)? Yes I agree, it would 🙂 we have some thoughts on creating plugins for the system, I think this could be a good use-case. Wait a week or two ;)
UnevenDolphin73 since in the end plotly is doing the presentation, I think you can provide the extra layout here:
https://github.com/allegroai/clearml/blob/226a6826216a9cabaf9c7877dcfe645c6ae801d1/clearml/logger.py#L293
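Something along these lines, assuming the `extra_layout` argument on the Logger report methods (which is merged into plotly's layout); the recorder class and the data points are placeholders so the sketch runs standalone:

```python
# Layout overrides handed straight to plotly (keys follow plotly's layout schema)
extra_layout = {
    "yaxis": {"type": "log"},        # log-scale y axis
    "legend": {"orientation": "h"},  # horizontal legend
}

def report_curve(logger, points):
    # With ClearML this would be task.get_logger().report_scatter2d(...)
    logger.report_scatter2d(
        title="accuracy", series="val", iteration=0,
        scatter=points,
        extra_layout=extra_layout,
    )

# Stand-in logger so the sketch runs without a server:
class RecordingLogger:
    def report_scatter2d(self, **kwargs):
        self.kwargs = kwargs

logger = RecordingLogger()
report_curve(logger, [(0, 0.2), (1, 0.5), (2, 0.8)])
```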
Could not install packages due to an EnvironmentError: [Errno 2] No such file or directory: '/tmp/build/80754af9/attrs_1604765588209/work'
Seems like pip failed to create a folder.
Could it be you are out of space?
So the agent installed okay. It's the specific Task that the agent is failing to create the environment for, correct?
If this is the case, what do you have in the "Installed Packages" section of the Task (see under the Execution tab)?
The idea of queues is, on the one hand, not to give users too much freedom, and on the other to allow maximum flexibility & control.
The granularity offered by K8s (as you specified) is sometimes way too detailed for a user. For example, I know I want 4 GPUs, but 100GB disk-space? No idea, just give me 3 levels to choose from (if any; actually I would prefer a default that is large enough, since this is by definition temp cache only). And the same argument goes for the number of CPUs.
Ch...
Yeah I can write a script to transfer it over, I was just wondering if there was a built in feature.
unfortunately no 😞
Maybe if you have a script we can put it somewhere?
How can I make sure it will traverse only the current package?
Just making sure there is no bug in the process: if you call Task.init in your entire repo (serve/train), do you end up with an "installed packages" section that contains all the required packages for both use cases?
I have separate packages for serving and training in a single repo. I don't want serving requirements to be installed.
Hmm, it cannot "know" which is which, because it doesn't really trace all the import logs (this w...
As you said, you just need to clone; right click, clone?
I only saw this "publish" option for models, not for pipelines; is this a new feature?
Kind of hidden in the UI (not sure if on purpose), but if you click on the pipeline then go to details, in the new tab (of the pipeline Task) you can publish the Task (aka the pipeline)
In this example:
https://github.com/allegroai/clearml-actions-train-model/blob/7f47f16b438a4b05b91537f88e8813182f39f1fe/train_model.py#L14
replace with something like:
` task = Task.get_tasks(project_name="pipel...
What's the host you have in the clearml.conf ?
is it something like " http://localhost:8008 " ?
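For reference, the relevant section of `~/clearml.conf` for a local docker-compose deployment typically looks like this (hosts and ports shown are the usual defaults; adjust to your setup):

```
api {
    api_server: http://localhost:8008
    web_server: http://localhost:8080
    files_server: http://localhost:8081
}
```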
any idea why I cannot select text inside the table?
Ichh, seems again like plotly 😞 I have to admit it is quite annoying to me as well... I would vote here: None
Back to the error:
clearml_agent: ERROR: Failed getting token (error 401 from
): Unauthorized (invalid credentials) (failed to locate provided credentials)
See here:
https://github.com/allegroai/clearml-server/blob/3f2b96266bc51bfce680bd759c7fa9d635ae36d3/docker/docker-compose.yml#L131
You need to provide an access key so it can actually "talk" to the server next to it.
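Roughly, the agent-services entry in that compose file takes the credentials via environment variables; the host and key values below are placeholders (you generate the keys in the ClearML web UI):

```yaml
  agent-services:
    environment:
      CLEARML_API_HOST: http://apiserver:8008
      # Credentials generated in the web UI (Settings -> Workspace -> Create new credentials)
      CLEARML_API_ACCESS_KEY: <your-access-key>
      CLEARML_API_SECRET_KEY: <your-secret-key>
```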
Hi LazyFish41
Could it be some permission issue on /home/quetalasj/.clearml/cache/ ?
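A quick way to check, assuming the default cache location (substitute the path from your setup):

```shell
# Verify the ClearML cache folder exists and is writable by the current user
CACHE_DIR="${HOME}/.clearml/cache"
mkdir -p "$CACHE_DIR"
ls -ld "$CACHE_DIR"
# If it ended up owned by root (e.g. created from inside a docker container), reclaim it:
#   sudo chown -R "$USER" "$CACHE_DIR"
```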
The pipeline stores the state of its previous run, specifically the executed steps.
In our case the executed step was reset (I assume), so it cannot find the output model you are referring to, hence the crash.
CleanPigeon16 make sense?
Yes please 🙂
BTW: I originally thought the double quotes (in your PR) were also a bug, this is why I was asking, wdyt?
SmugOx94 could you please open a GitHub issue with this request, otherwise we might forget 🙂
We might also get some feedback from other users
I mean manually you can get the results and rescale, but not through the UI.
WhimsicalLion91 I guess import/export is going to be more challenging, doable though. You will need to get all the Tasks, then collect all the artifacts, then collect all the reported logs (console/plots/etc). Then import everything back to your own server...
Exporting a single Task: task.export_task and Task.import_task
If you need all the scalars: task.get_reported_scalars(...)
And the console logs: Task.get_reported_console_output
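A rough sketch of the round trip such a transfer script would do: export each task to a JSON file on the source server, move the file, import on the destination. The task payload here is a made-up placeholder standing in for what `task.export_task()` would return; only the plain-Python file shuffling actually runs.

```python
import json
import os
import tempfile

# Placeholder for the dict task.export_task() would return on the source server
exported = {"name": "my-task", "project": "demo", "script": {"entry_point": "train.py"}}

# Dump it to a file you can copy between servers...
path = os.path.join(tempfile.mkdtemp(), "task.json")
with open(path, "w") as f:
    json.dump(exported, f)

# ...then on the destination, load it back (and hand it to Task.import_task)
with open(path) as f:
    task_data = json.load(f)
```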
Hi MiniatureCrocodile39
Which packages do you need to run the viewer? I suppose a dicom reader is a must?
Interesting!
Wouldn't Dataset (class) be a good solution ?
Check the links that are generated in the UI when you upload an artifact or model.
I see TightElk12
You can always set up the OS environment variables CLEARML_API_HOST, CLEARML_WEB_HOST and CLEARML_FILES_HOST with the correct configuration, or you can simply set CLEARML_NO_DEFAULT_SERVER=1, which will prevent any usage of the default demo server. wdyt?
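In shell form (the hostnames are placeholders for your own deployment):

```shell
# Point the SDK at your own server...
export CLEARML_API_HOST="http://my-server:8008"
export CLEARML_WEB_HOST="http://my-server:8080"
export CLEARML_FILES_HOST="http://my-server:8081"
# ...or just block any fallback to the default demo server
export CLEARML_NO_DEFAULT_SERVER=1
```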
, but what I really want to achieve is to share this code:
You mean to share the code between them? Unless this is a "preinstalled" package in the container, each endpoint has its own separate set of modules / files.
(This is on purpose, so you could actually change them, just imagine diff versions of the same common.py file.)
doing some extra "services"
What do you mean by "services"? (From the system perspective, any Task that is executed by an agent running in "services-mode" is a service; there are no actual limitations on what it can do 🙂)
Hmm that makes sense. BTW, the PYTHONPATH set by the agent would be the working dir listed under the Task, but if you set agent.force_git_root_python_path
the agent will also add the root of the git repo to the python path.
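In `clearml.conf` on the agent machine that would look like:

```
agent {
    # add the git repository root to PYTHONPATH, in addition to the working dir
    force_git_root_python_path: true
}
```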