Just to confirm AgitatedDove14 - ClearML doesn't do any "magic" in this regard for TensorFlow, PyTorch, etc., right?
Cool, I didn't know it was disabled. This exact reason is why I created a wrapper over ClearML for my use, so that people never accidentally talk to the demo server
If I publish a keras_mnist model and experiment over and over, each one gets pushed as a separate Model entity, right? Even though there's really only one unique model with multiple different versions of it
I think I mean whether it supports the first one
More like testing, especially before a pipeline
I am also not understanding how clearml-serving does the versioning for models in Triton.
This is the command that is running:
` ['docker', 'run', '-t', '-e', 'NVIDIA_VISIBLE_DEVICES=none', '-e', 'CLEARML_WORKER_ID=clearml-services:service:c606029d77784c69a30edfdf4ba291a5', '-e', 'CLEARML_DOCKER_IMAGE=', '-v', '/tmp/.clearml_agent.72r6h9pl.cfg:/root/clearml.conf', '-v', '/root/.clearml/apt-cache:/var/cache/apt/archives', '-v', '/root/.clearml/pip-cache:/root/.cache/pip', '-v', '/root/.clearml/pip-download-cache:/root/.clearml/pip-download-cache', '-v', '/root/.clearml/cache:/clea...
Thanks AgitatedDove14 - I get the gist of what you are saying. I still have to get the glue set up, which I couldn't fully understand, but that's a different topic 🙂
The first line of it gets a dataset, and it's failing with "no project name"
As in: I am cloning a task and running it, and inside that task, without doing any Task.init, I am trying to get the task that is currently running
For now that's a quick test, but for actual use I will need a proper model (pkl) and the .py file
As in: clone the task from the UI and run it
I am providing a helper to run a task in a queue after running it locally in the notebook
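For context, the helper is just a thin wrapper around that flow. A minimal sketch of the pattern, assuming an env var controls the switch (the names `run_local_or_enqueue` and `CLEARML_QUEUE` are my own, not ClearML API):

```python
import os

def run_local_or_enqueue(run_fn, enqueue_fn, queue_env="CLEARML_QUEUE"):
    """Hypothetical helper: run in the notebook unless a queue name is set.

    run_fn     -- callable that runs the task in the current process
    enqueue_fn -- callable that hands the task off to a named remote queue
    """
    queue = os.environ.get(queue_env)
    if queue:
        # A queue was requested: enqueue instead of executing here.
        return enqueue_fn(queue)
    return run_fn()
```

In practice `enqueue_fn` would wrap whatever enqueues the cloned task; the point is just that notebooks call one function either way.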
Is this a known issue?
And the exact output after the package install and so on:
```
Environment setup completed successfully
Starting Task Execution: None
```
If there's a post-task script hook, I can add a way to zip and upload the pip cache etc. to S3 - i.e., do any caching I want without needing first-class support in ClearML
In params:
`parameter_override={'General/dataset_url...` - what's the `General` for?
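From what I can tell, `General` is just the default hyperparameter section name, so override keys take the flattened `Section/param` form. A quick illustration of the key format (this flatten helper is my own sketch, not a ClearML API):

```python
def flatten_params(params, section="General"):
    """Illustrative only: map plain hyperparameter names to the
    'Section/name' keys expected by parameter_override."""
    return {f"{section}/{name}": value for name, value in params.items()}

overrides = flatten_params({"dataset_url": "s3://bucket/data.csv", "epochs": 5})
# {'General/dataset_url': 's3://bucket/data.csv', 'General/epochs': 5}
```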
This is my code, but it’s pretty standard
I guess it won't, due to the nature of services?
The image to run is empty essentially
There's also the CLI to create tasks: https://github.com/allegroai/clearml/blob/master/docs/clearml-task.md
Updating to 1.1.0 gives this error:
```
ERROR: Could not push back task [e55e0f0ea228407a921e004f0d8f7901] to k8s pending queue [c288c73b8c434a6c8c55ebb709684b28], error: Invalid task status (Task already in requested status): current_status=queued, new_status=queued
```
Anything that is shown in git status as untracked? So ignore anything .gitignored, and maybe add a param or config option to say "include untracked". Anyway, it's only a nice-to-have feature.
Progress with boto3 added, but it still fails: