OutrageousGiraffe8 this sounds like a bug, how can we reproduce it?
Maybe add another layer here?
https://github.com/allegroai/clearml/blob/a47f127679ebf5912690f7c3e60791a2daa5c984/examples/frameworks/tensorflow/tensorflow_mnist.py#L40
Hi OutrageousGiraffe8
I was not able to reproduce 😞
Python 3.8 Ubuntu + TF 2.8
I get both metrics and model stored and uploaded
Any idea?
I found out that if I work with one model, that model is saved, but if I work with a different model, that model is not saved, everything else being the same. Both models use the Functional API.
Edit: I have further pinpointed the problem to a ReLU layer. If I use layers.ReLU()
it does not work, but if I use layers.Activation("relu")
it works.
self.d1 = Dense(128, dtype=tf.float32)
self.d1_a = ReLU()
OutrageousGiraffe8 so basically replacing to:
self.d1 = ReLU()
I'm sorry for the late response. You could probably replicate it by using a new ReLU layer after Dense instead of activation="relu". Or if that does not do it, extract part of the model into a separate Sequential model, for example:
Sequential([
    Dense(128),
    BatchNormalization(),
    ReLU(),
])
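A minimal sketch of the repro described above, assuming MNIST-style input and output shapes (the model names and layer sizes here are illustrative, not the reporter's exact code):

```python
import tensorflow as tf
from tensorflow.keras import Sequential, layers

# Variant that reportedly fails to be saved/uploaded by ClearML:
# ReLU used as a standalone layer after Dense.
model_relu_layer = Sequential([
    layers.Input(shape=(28 * 28,)),
    layers.Dense(128),
    layers.BatchNormalization(),
    layers.ReLU(),
    layers.Dense(10),
])

# Variant that reportedly works: the activation baked into Dense
# (equivalent to layers.Activation("relu") after a plain Dense).
model_activation = Sequential([
    layers.Input(shape=(28 * 28,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(10),
])
```

Both variants compute the same function; the only difference is whether ReLU appears as its own layer object in the model graph.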
Hi OutrageousGiraffe8
when I save the model using tf.keras.save_model
This should create a new Model in the system (not artifact), models have their own entity and UID.
Are you creating the Task with output_uri="gs://bucket/folder" ?