
Is there any example of how to use clearml-data?
Hi JitteryCoyote63 ,
Oh, you have something. Nice!
I will look into that document, thanks!
GrumpyPenguin23 Hi, thanks for your instructions!
Putting some metadata into the model sounds nice.
I was wondering exactly how to take care of labels, and I was afraid I would have to handle them as a dataset even at inference time.
We have a web server which accepts various requests and manages database resources.
This web server arranges the request and creates a task on the ClearML API server, which runs on an internal network.
By the way, we found that when I added the labels param and posted a tasks.create request, it worked.
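For reference, here is a rough sketch of that request flow using Python's requests library. The auth.login and tasks.create endpoints are part of the ClearML REST API, but the server URL, the credentials, and especially the shape of the labels field are my assumptions and may differ between server versions:
```python
import requests

API_SERVER = "http://localhost:8008"      # assumption: internal api server address
ACCESS_KEY, SECRET_KEY = "KEY", "SECRET"  # assumption: credentials from the web UI

# Exchange the key pair for a session token via auth.login
token = requests.post(
    f"{API_SERVER}/auth.login", auth=(ACCESS_KEY, SECRET_KEY)
).json()["data"]["token"]

# Create the task; name/project/script/type are the params mentioned in this
# thread. The label enumeration field is a guess (execution.model_labels) --
# check the tasks.create schema of your server version.
resp = requests.post(
    f"{API_SERVER}/tasks.create",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "name": "remote training",
        "project": "<project-id>",
        "type": "training",
        "script": {"repository": "https://github.com/org/repo.git",
                   "entry_point": "train.py"},
        "execution": {"model_labels": {"cat": 0, "dog": 1}},  # assumed field
    },
)
resp.raise_for_status()
print("created task", resp.json()["data"]["id"])
```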
Can you run this one - ?
Do you get the labels for both local and clearml-agent run?
Okay, I did the example.
For the local run, I got the labels.
For the agent run, I did not get the labels.
As for the versions:
root@120eb0cddb60:~# pip list | grep clearml
clearml          0.17.5
clearml-agent    0.17.1
I mean, the output model comes with the labels that were posted.
For the agent run, I posted only the following params to the tasks.create endpoint and let an agent pick it up: name, project, script, type.
No, I have checked it on the web frontend, following the model link and opening the LABELS tab.
I confirmed that it works if the task is not started by an agent.
Now I am going AFK.
Thanks for your support!
Yeah, now that I know that, the CLI looks much more familiar to me.
I tried clearml.model.InputModel and successfully downloaded a model.
Is this the expected way to consume a trained model for inference?
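In case it helps, this is roughly what I did (the model id is a placeholder copied from the web UI):
```python
from clearml import InputModel

# Look up a registered model by its id (placeholder below)
model = InputModel(model_id="<model-id>")

weights_path = model.get_weights()  # downloads the weights file locally
labels = model.labels               # label enumeration stored on the model
print(weights_path, labels)
```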
Even though I called task.connect_label_enumeration, there is no data shown on the output model.
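This is the call I mean; a minimal sketch (the project/task names and the enumeration itself are placeholders):
```python
from clearml import Task

task = Task.init(project_name="examples", task_name="label demo")

# Attach the label enumeration to the task; ClearML is expected to copy it
# onto output models registered by this task.
task.connect_label_enumeration({"background": 0, "person": 1})
```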
Maybe I should have cloned the repo with https instead of ssh.
Um..
and if you clone the local task run and enqueue it to the agent?
It failed, saying: "Could not read from remote repository."
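If switching to HTTPS is the way to go, the agent can be given HTTPS credentials in its clearml.conf; a sketch, assuming a personal access token (the key names are from the clearml-agent configuration reference, the values are placeholders):
```
# ~/clearml.conf on the agent machine
agent {
    # clone over HTTPS with these credentials instead of SSH keys
    git_user: "my-git-user"
    git_pass: "my-git-token"
    # do not rewrite https:// repository urls to ssh://
    force_git_ssh_protocol: false
}
```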
Are you talking about the public demo server?
If so, it says: "This server is reset daily at 24:00 PST."
BTW, why use the API calls and not the ClearML SDK?
Because the training part is only a subsystem of our whole system, and the Python stuff does not face the web, where the training requests come from.
Does this task (started by an agent) have some limitation?
Like being unable to connect labels?
Hm, clearml-data looks very much like git.
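For example, the basic flow maps pretty directly onto git concepts (the project/dataset names and the data path below are placeholders):
```
clearml-data create --project "examples" --name "my-dataset"
clearml-data add --files ./data    # stage files, similar to git add
clearml-data close                 # finalize and upload, similar to git commit/push
```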