How to use a saved model in TensorFlow 2.x | by Rahul Bhadani | Jan, 2021


A tutorial on saving and reusing TensorFlow-trained models

Rahul Bhadani

In my earlier article, I wrote about model validation, regularization, and callbacks using TensorFlow 2.x. In the machine-learning pipeline, creating a trained model is not enough. What are we going to do with the trained model once we have finished training, validating, and testing it with the portion of data set aside? In practice, we would like to import such a trained model so that it can be used in practical applications. For example, let's say I trained a model on camera images to recognize pedestrians. Eventually, I would like to use the trained model to make real-time predictions of pedestrians with a camera mounted on a self-driving car. Additionally, training a model also requires saving the model as checkpoints, especially when you are training a model on a really large dataset or the training time is on the order of hours. Model saving is also helpful in case your training gets interrupted for some reason, such as a flaw in your programming logic, your laptop battery dying, an I/O error, and so on.


There are a few choices to make when saving a model. Do we want to save the model weights and training parameters at every iteration (epoch), every once in a while, or once training has finished? We can use built-in callbacks, just as we saw in my earlier article, to automatically save the model weights during the training process. Alternatively, we can save the model weights and other necessary information once training has finished.

There are two main formats for saved models: one native to TensorFlow, and the other the HDF5 format, since we are using TensorFlow through the Keras API.

An example of saving the model during the training process:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.losses import BinaryCrossentropy
from tensorflow.keras.callbacks import ModelCheckpoint

model = Sequential([
    Dense(128, activation='sigmoid', input_shape=(10,)),
    Dense(1)])
model.compile(optimizer='sgd', loss=BinaryCrossentropy(from_logits=True))

checkpoint = ModelCheckpoint('saved_modelname', save_weights_only=True)
model.fit(X_train, y_train, epochs=10, callbacks=[checkpoint])

You can see that I used the ModelCheckpoint class to create an object checkpoint that takes an argument which will be used as the filename to save the model. Since save_weights_only=True is used, only the weights will be saved; the network architecture will not be. Finally, we pass callbacks=[checkpoint] to the fit function.

If instead of 'saved_modelname' we pass 'saved_modelname.h5', then the model will be saved in HDF5 format.

To load the previously saved weights, we call the load_weights function.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(128, activation='sigmoid', input_shape=(10,)),
    Dense(1)])
model.load_weights('saved_modelname')
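Note that load_weights restores only the parameters, so the model you load into must be built with exactly the same architecture as the one that was saved. A minimal self-contained sketch of this round trip follows; the random placeholder data and the saved_modelname.weights.h5 filename (which newer Keras releases require for weights-only files) are my assumptions here, not part of the example above:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

def build_model():
    # the architecture must match the saved model exactly
    return Sequential([
        Dense(128, activation='sigmoid', input_shape=(10,)),
        Dense(1)])

model = build_model()
model.save_weights('saved_modelname.weights.h5')

# a fresh model with the identical architecture can take the saved weights
restored = build_model()
restored.load_weights('saved_modelname.weights.h5')

# both models now produce the same outputs on the same inputs
X = np.random.rand(4, 10).astype('float32')
assert np.allclose(model.predict(X, verbose=0), restored.predict(X, verbose=0))
```

If the architectures differ (say, a different layer width), load_weights raises an error instead of silently mismatching parameters.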

Saving the model weights manually without callbacks

We can also save the model manually by calling save_weights at the end of training:

model = Sequential([
    Dense(128, activation='sigmoid', input_shape=(10,)),
    Dense(1)])
model.compile(optimizer='sgd', loss=BinaryCrossentropy(from_logits=True))
model.fit(X_train, y_train, epochs=10)
model.save_weights("saved_modelname")

You can also inspect the directory where the model is saved:

total 184K
-rw-r--r-- 1 ivory ivory   61 Jan 12 01:08 saved_modelname
-rw-r--r-- 1 ivory ivory 174K Jan 12 01:08 saved_modelname.data-00000-of-00001
-rw-r--r-- 1 ivory ivory 2.0K Jan 12 01:08 saved_modelname.index

Here, you can see that the actual weights are stored in saved_modelname.data-00000-of-00001 and the rest of the files contain metadata.

Until now, we were only saving the model weights. However, saving the entire model is very easy: just pass save_weights_only=False while instantiating the ModelCheckpoint class.

checkpoint_dir = 'saved_model'
checkpoint = ModelCheckpoint(filepath=checkpoint_dir,
                             save_freq='epoch',
                             save_weights_only=False,
                             verbose=1)
model.fit(X_train, y_train, callbacks=[checkpoint])

In this case, a new directory is created with the following content:

total 128
drwxr-xr-x 2 ivory ivory   4096 Jan 12 01:14 assets
-rw-r--r-- 1 ivory ivory 122124 Jan 12 01:14 saved_model.pb
drwxr-xr-x 2 ivory ivory   4096 Jan 12 01:14 variables

In this case, the main model is saved in the file saved_model.pb and the other files are metadata.

Finally, we can use the saved model as follows:

from tensorflow.keras.models import load_model
model = load_model(checkpoint_dir)

If we want to save the model once the training process is finished, we can call the save function as follows:

model.save("mysavedmodel")

If you use model.save("mysavedmodel.h5"), then the model will be saved as a single file, mysavedmodel.h5.
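As a quick sanity check, here is a hedged round-trip sketch: saving to a single .h5 file and loading it back with load_model reproduces the original model's predictions. The random X_new data is a stand-in for real inputs, and the 'mse' loss is just a placeholder so the model can be compiled:

```python
import numpy as np
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(128, activation='sigmoid', input_shape=(10,)),
    Dense(1)])
model.compile(optimizer='sgd', loss='mse')
model.save("mysavedmodel.h5")  # architecture, weights, and optimizer state in one HDF5 file

# load_model rebuilds the full model from the single file; no need to
# reconstruct the architecture by hand as with load_weights
restored = load_model("mysavedmodel.h5")
X_new = np.random.rand(4, 10).astype('float32')
assert np.allclose(model.predict(X_new, verbose=0), restored.predict(X_new, verbose=0))
```

This is the key practical difference from the weights-only checkpoints above: the .h5 file is self-describing, so the loading code does not need to know the architecture.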

The saved model can then be used to make predictions on a brand-new data set:

model.predict(X_test)

A more descriptive example is given in my GitHub repo at https://github.com/rahulbhadani/medium.com/blob/master/01_12_2021/Saving_Model_TF2.ipynb.

References

This article is motivated by the author's learning from the TensorFlow 2 Coursera course https://www.coursera.org/learn/getting-started-with-tensor-flow2/. Readers may find similarities between the presented examples and examples from the Coursera course.
