Answered

Hi, when I save a model using tf.keras.save_model or using ModelCheckpoint, the model is not saved as an artifact. The output URI is set to a Google Cloud bucket. When reporting with the logger, everything is stored correctly. Do you maybe have any idea why this would not work?
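For context, a minimal sketch of the two saving paths described here (not the poster's actual code); it assumes a ClearML Task was already initialized with output_uri pointing at the GCS bucket, and the model shape and file paths are placeholders.

import tensorflow as tf
from tensorflow.keras import layers

# Assumes Task.init(..., output_uri="gs://...") was called beforehand, as described above.
inputs = layers.Input(shape=(784,))
x = layers.Dense(128, activation="relu")(inputs)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Path 1: explicit save call
tf.keras.models.save_model(model, "my_model")

# Path 2: checkpoint callback during training (placeholder data, fit call commented out)
ckpt = tf.keras.callbacks.ModelCheckpoint("checkpoint.h5", save_weights_only=False)
# model.fit(x_train, y_train, callbacks=[ckpt])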

  
  
Posted 2 years ago

Answers 7


OutrageousGiraffe8 this sounds like a bug, how can we reproduce it?
Maybe add another layer here?
https://github.com/allegroai/clearml/blob/a47f127679ebf5912690f7c3e60791a2daa5c984/examples/frameworks/tensorflow/tensorflow_mnist.py#L40

  
  
Posted 2 years ago

I found out that if I work with one model, that model is saved, but if I work with a different model, that model is not saved, everything else being the same. Both models use the Functional API.

Edit: I have further pinpointed the problem to a ReLU layer. If I use
layers.ReLU() it does not work, but if I use
layers.Activation("relu") it works.

  
  
Posted 2 years ago

Hi OutrageousGiraffe8

when I save a model using tf.keras.save_model

This should create a new Model in the system (not an artifact); models have their own entity and UID.
Are you creating the Task with output_uri="gs://bucket/folder" ?
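For reference, a minimal sketch of what is being asked about here; the project and task names and the bucket path are placeholders.

from clearml import Task

# Models saved through the framework hooks are registered as output models
# of this task and uploaded to the destination given by output_uri:
task = Task.init(
    project_name="examples",
    task_name="keras-save-model",
    output_uri="gs://bucket/folder",
)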

  
  
Posted 2 years ago

OutrageousGiraffe8 so basically replacing it with:
self.d1 = ReLU()

  
  
Posted 2 years ago

self.d1 = Dense(128, dtype=tf.float32)
self.d1_a = ReLU()
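A hedged sketch of how that change might look in a subclassed model along the lines of the linked tensorflow_mnist.py example (simplified here; the surrounding layers and call order are assumptions, not the example's exact code):

import tensorflow as tf
from tensorflow.keras import layers, Model

class MyModel(Model):
    def __init__(self):
        super().__init__()
        self.flatten = layers.Flatten()
        self.d1 = layers.Dense(128, dtype=tf.float32)  # activation moved out of Dense
        self.d1_a = layers.ReLU()                      # separate ReLU layer, as above
        self.d2 = layers.Dense(10, activation="softmax")

    def call(self, x):
        x = self.flatten(x)
        x = self.d1(x)
        x = self.d1_a(x)
        return self.d2(x)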

  
  
Posted 2 years ago

Hi OutrageousGiraffe8
I was not able to reproduce 😞
Python 3.8, Ubuntu, TF 2.8
I get both the metrics and the model stored and uploaded
Any idea?

  
  
Posted 2 years ago

I'm sorry for the late response. You could probably replicate it by using a new ReLU layer after Dense instead of activation="relu". Or if that does not do it, extract part of the model into a separate Sequential model, for example:
Sequential([Dense(128), BatchNormalization(), ReLU()])
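Based on that suggestion, a hedged reproduction sketch: nest a Sequential block (Dense -> BatchNormalization -> ReLU) inside a Functional model and save it; the shapes and the output path are placeholders.

import tensorflow as tf
from tensorflow.keras import layers, Sequential

inputs = layers.Input(shape=(784,))
block = Sequential([
    layers.Dense(128),
    layers.BatchNormalization(),
    layers.ReLU(),
])
x = block(inputs)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Saving here is the step that reportedly fails to register the model with ClearML:
tf.keras.models.save_model(model, "repro_model")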

  
  
Posted 2 years ago