ElegantCoyote26
Moderator
34 Questions, 126 Answers
Active since 10 January 2023
Last activity 8 months ago

Reputation: 0
Badges: 1
125 × Eureka!
0 What Could Be Causing This?

is that what you mean?

2 years ago
0 What Could Be Causing This?

unicorn is an alias for the IP of the machine the server is hosted on

2 years ago
0 Is There A Way To Get A Task's Docker Container ID/Name? I'm Generally Interested In Resource Profiling Of Each Container, So I Noticed I Can Use

I see, ok!
I will try that out.
Another thing I noticed: none of my pipeline tasks are reporting these graphs, regardless of runtime. I guess this line would also fix that?

2 years ago
0 What Could Be Causing This?

it says method not allowed when i try to access that url

2 years ago
0 Is It Possible To Add Extra Arguments To

I think the issue is that the host is not trusted... it looks like it looks into the index

2 years ago
0 I've Been Seeing This Message And Similar Messages A Lot In Some Of My Tasks Lately... Any Ideas?

Ok, going to ask the server admins, will keep you posted, thanks!

2 years ago
0 What Could Be Causing This?

i don't think the conf is an issue. it's been deployed for a long time and working. models from yesterday correctly display the url

2 years ago
0 Has Anyone Used

fp32 seems to be 32-bit floating point, so my preprocessing seems wrong

3 years ago
0 Is It Generally Recommended To Close API Client Sessions? Like If I Open A Client Like This:

can you elaborate a bit on the token side? i'm not sure exactly what would be a bad practice here

2 years ago
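On the session-closing question, the usual pattern is a context manager, which guarantees the session (and whatever token it holds) is released even when the body raises. This is a minimal sketch with a hypothetical `ApiClient` wrapper, not any specific library's API:

```python
class ApiClient:
    """Hypothetical client: holds an open session/token that should be released."""

    def __init__(self):
        self.closed = False  # stands in for an open HTTP session + auth token

    def close(self):
        # release the underlying session so the token isn't kept alive
        self.closed = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.close()  # runs even if the with-body raised
        return False


with ApiClient() as client:
    pass  # make API calls here

# once the block exits, client.closed is True
```

A plain `try/finally` around `client.close()` achieves the same thing when the client class doesn't implement the context-manager protocol.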
0 Does

where should I look for this folder?
I am talking about serving

2 years ago
0 If I Have 1 Machine With A GPU, Can I Put A Worker On It With GPU And Two Workers With

yeah, that's fair enough. is it possible to assign cpu cores? I wasn't aware

2 years ago
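On assigning CPU cores: at the OS level (Linux only), a process can be pinned to specific cores with `os.sched_setaffinity`. A minimal sketch, where the core choice is illustrative and only the calling process (e.g. one worker) is affected:

```python
import os

# Linux-only: query which cores this process may run on, then pin it
# to the first one. A real setup would give each worker distinct cores.
if hasattr(os, "sched_setaffinity"):
    available = sorted(os.sched_getaffinity(0))
    os.sched_setaffinity(0, {available[0]})
    print(os.sched_getaffinity(0))
```

The same effect is available from the shell with `taskset`, which is handy when the worker is launched by a service manager rather than your own code.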
0 What Could Be Causing This?

how do I check?

2 years ago
0 What Could Be Causing This?

yes, in the corresponding task

2 years ago
0 Has Anyone Used

i'm also not sure what this is
-H "Content-Type: application/octet-stream" -H 'NV-InferRequest: batch_size: 1 input { name: "dense_input" dims: [-1, 784] } output { name: "activation_2" cls { count: 1 } }'

3 years ago
0 Has Anyone Used

i'm just interested in actually running a prediction with the serving engine and all

3 years ago
0 Has Anyone Used

i'm probably sending the request all wrong + i'm not sure how the model expects the input

3 years ago
0 Has Anyone Used

well, i have run the keras mnist example that is in the clearml-serving README. Now I'm just trying to send a request to make a prediction via curl

3 years ago
0 What Could Be Causing This?

but it's been that way for over 1 hour.. I remember I can force the task to wait for the upload. how do i do this?

2 years ago
0 What Could Be Causing This?

cheers, let me try this

2 years ago
0 Has Anyone Used

i don't know ahahaha

3 years ago
0 Has Anyone Used

So far I have taken one mnist image, and done the following:

```python
from PIL import Image
import numpy as np

def preprocess(img, format, dtype, h, w, scaling):
    # (format and scaling are accepted but unused here)
    # convert to single-channel grayscale
    sample_img = img.convert('L')

    # resize with bilinear interpolation; note (1, w*h) produces a
    # 1-wide, 784-tall image rather than a (w, h) one
    resized_img = sample_img.resize((1, w * h), Image.BILINEAR)
    resized = np.array(resized_img)
    resized = resized.astype(dtype)

    return resized

# png img file
img = Image.open('./7.png')

# preprocessed img, FP32 formatted numpy array
img = preprocess(img, format, "float32", 28, 28, None)
```

...

3 years ago
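For what it's worth, the NV-InferRequest spec quoted earlier in the thread declares `dense_input` with dims `[-1, 784]`, so the model presumably expects a flattened batch rather than a 28x28 (or 784x1) array. A hedged sketch of that reshape, with a zero array standing in for the real image:

```python
import numpy as np

# stand-in for the preprocessed 28x28 FP32 image
img = np.zeros((28, 28), dtype=np.float32)

# flatten into a batch of one 784-element row, matching dims [-1, 784]
batch = img.reshape(1, 28 * 28)
print(batch.shape)  # (1, 784)
```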
0 Can I Change The ClearML-Serving Inference Port? 8080 Is Already Used By My Self-Hosted Server... I Guess I Can Just Change It In The Docker-Compose, But I Find It A Little Weird That You Are Using This Port If The Self-Hosted Server Web UI Is Hosted On It...

And this is what I get with the curl inference example on the README.md
```
(prediction_module) emilio@unicorn:~/clearml-serving$ curl -X POST " " -H "accept: application/json" -H "Content-Type: application/json" -d '{"x0": 1, "x1": 2}'

<html>
<head><title>405 Not Allowed</title></head>
<body>
<center><h1>405 Not Allowed</h1></center>
<hr><center>nginx/1.20.1</center>
</body>
</html>
```

2 years ago
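That 405 page is served by nginx, which suggests the request hit the web UI on port 8080 rather than the serving endpoint. As a sketch, the same JSON POST can be built in Python to double-check method, headers, and port before sending; the endpoint URL and port below are assumptions, not the actual deployment's values:

```python
import json
import urllib.request

# hypothetical serving endpoint, assumed remapped off the web UI's port 8080
url = "http://localhost:8085/serve/test_model"

payload = json.dumps({"x0": 1, "x1": 2}).encode("utf-8")
req = urllib.request.Request(
    url,
    data=payload,
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    method="POST",
)

# inspect before actually sending with urllib.request.urlopen(req)
print(req.get_method(), req.full_url)
```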
0 How Do I Disable

yep, setting it to -1 is still caching envs..

one year ago
0 What Could Be Causing This?

see this for example

2 years ago