So far I have taken one MNIST image, and done the following:
```python
from PIL import Image
import numpy as np

def preprocess(img, format, dtype, h, w, scaling):
    # convert to single-channel grayscale
    sample_img = img.convert('L')
    # PIL's resize expects a (width, height) tuple
    resized_img = sample_img.resize((w, h), Image.BILINEAR)
    resized = np.array(resized_img)
    resized = resized.astype(dtype)
    return resized

# load the PNG image file
img = Image.open('./7.png')
# preprocess into an FP32-formatted numpy array (format and scaling are unused here)
img = preprocess(img, None, "float32", 28, 28, None)
```
...
```python
logger.report_media(
    title=name_title,
    series="Nan",
    iteration=0,
    local_path=fig_nan,
    delete_after_upload=delete_after_upload,
)
clearml_task.upload_artifact(
    name=name_title,
    artifact_object=fig_nan,
    wait_on_upload=True,
)
```
Hi CostlyOstrich36
I added this instruction at the very end of my postprocess function: `shutil.rmtree("~/.clearml")`
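As an aside, one likely pitfall here (not confirmed in the thread, just a standard Python behavior): `shutil.rmtree` does not expand `~` itself, so a literal `"~/.clearml"` path would not point at the home-directory cache. A minimal sketch of the expanded version:

```python
import os
import shutil

# shutil.rmtree does not expand "~"; passing "~/.clearml" literally would
# look for a directory actually named "~" in the current working directory.
# Expand it first, and only delete it if it exists.
cache_dir = os.path.expanduser("~/.clearml")
if os.path.isdir(cache_dir):
    shutil.rmtree(cache_dir)
```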
Sent it to you via DM!
Same thing SuccessfulKoala55 😞
I have this inside my pipeline defined with decorator
```
(base) emilio@unicorn:~$ docker version
Client: Docker Engine - Community
 Version:           19.03.13
 API version:       1.40
 Go version:        go1.13.15
 Git commit:        4484c46d9d
 Built:             Wed Sep 16 17:02:36 2020
 OS/Arch:           linux/amd64
 Experimental:      false
(base) emilio@unicorn:~$ docker-compose --version
docker-compose version 1.17.1, build unknown
```
Yeah, that would be nice!
I'm just interested in actually running a prediction with the serving engine and all,
some documentation for understanding the parameters, flags, options etc.
I want my serving controller to go into a particular project instead of the default DevOps, for example
Other things might be programmatically adding endpoints instead of using the CLI (this is because I potentially have hundreds of models to serve, so I can't do this manually)
SuccessfulKoala55 I can't get it to work... I tried using the pip conf locally and it works, but the agent doesn't seem to be able to install the package
Can you elaborate a bit on the token side? I'm not sure exactly what would be a bad practice here
using clearML agent
Right, and why can't a particular version be found? How does it try to find Python versions?
Is this a possible future feature? I have used cometML before and they have this. I'm not sure how they do it though...
Hey AgitatedDove14 , thanks for the answer. What does that mean? In any case I think it would be a nice to have feature.
Ah, so you're saying I can write a callback for stuff like `train_loss`, `val_loss`, etc.
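A minimal sketch of what such a metric callback could look like (pure Python, all names hypothetical — the real hook signature depends on the training framework being used):

```python
class MetricCallback:
    """Hypothetical callback invoked at the end of every training epoch."""

    def __init__(self):
        self.history = []

    def on_epoch_end(self, epoch, metrics):
        # metrics would be a dict like {"train_loss": ..., "val_loss": ...}
        self.history.append((epoch, dict(metrics)))
        # here each metric would be forwarded to the experiment logger,
        # e.g. one report_scalar call per (name, value) pair


# simulated training loop driving the hook
cb = MetricCallback()
for epoch in range(3):
    cb.on_epoch_end(
        epoch,
        {"train_loss": 1.0 / (epoch + 1), "val_loss": 1.2 / (epoch + 1)},
    )
```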
Absolutely, I could try but I'm not sure what it entails...
Hey AgitatedDove14 , did you get a chance to look at this?
And then you'll hook it