Tensorflow 2.x: save and load keras model, and their customization

Table of Contents

    • A Keras model contains:
    • Basic save and load methods:
    • Saving the entire model
      • SavedModel format (default)
      • H5 format (a lightweight alternative to SavedModel)
      • SavedModel vs. H5:
    • Saving only the model architecture
    • Saving only the model weights
    • tf.train.Checkpoint
    • Summary

A Keras model contains:

  1. architecture, which specifies what layers the model contains and how they are connected.
  2. weights
  3. an optimizer
  4. losses and metrics

Basic save and load methods:

# Saving a Keras model:
model.save('path/to/location')

# Loading the model back:
from tensorflow import keras
model = keras.models.load_model('path/to/location')

Saving the entire model

There are two formats you can use to save an entire model to disk: the TensorFlow SavedModel format, and the older Keras H5 format. The recommended format is SavedModel, and it is the default when you use model.save().
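For instance, a minimal sketch of picking the format explicitly (the filenames here are illustrative, and model is any compiled Keras model like the ones below):

# Filenames below are illustrative, not from the original article.
model.save("my_model")                       # SavedModel folder (the default)
model.save("my_h5_model.h5")                 # HDF5, inferred from the .h5 suffix
model.save("my_model_h5", save_format="h5")  # HDF5, forced via the save_format argument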

SavedModel format (default)

The default format used by model.save() is SavedModel, e.g.:


import numpy as np
import tensorflow as tf
from tensorflow import keras

def get_model():
    # Create a simple model.
    inputs = keras.Input(shape=(32,))
    outputs = keras.layers.Dense(1)(inputs)
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mean_squared_error")
    return model

def init_data():
    test_input = np.random.random((128, 32))
    test_target = np.random.random((128, 1))
    return test_input, test_target

def test():
    model = get_model()
    # Train the model.
    test_input, test_target = init_data()
    model.fit(test_input, test_target)
    # Save the model
    # Calling `save('my_model')` creates a SavedModel folder `my_model`.
    model.save("my_model")
    # Load the model:
    # It can be used to reconstruct the model identically.
    reconstructed_model = keras.models.load_model("my_model")
    # Let's check:
    np.testing.assert_allclose(
        model.predict(test_input), reconstructed_model.predict(test_input)
    )
    # The reconstructed model is already compiled and has retained the optimizer
    # state, so training can resume:
    reconstructed_model.fit(test_input, test_target)

if __name__ == '__main__':
    test()

The result is a folder named my_model containing a saved_model.pb (the architecture, optimizer, losses, and metrics) and a variables/ folder (the weights).

The model architecture and training configuration (including the optimizer, losses, and metrics) are stored in saved_model.pb. The weights are saved in the variables/ directory.
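As a quick check, here is a small sketch (assuming the my_model folder created by the example above) that lists what actually ends up on disk:

import os

# Inspect the SavedModel folder created by model.save("my_model") above.
for root, dirs, files in os.walk("my_model"):
    for name in files:
        print(os.path.join(root, name))
# Typically this prints saved_model.pb plus the weight files under variables/
# (variables.index and variables.data-*); an assets/ folder may also be present.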

H5 format (a lightweight alternative to SavedModel)

Besides the default SavedModel format, a Keras model can also be saved in H5 format.
(H5 is a lightweight alternative to SavedModel: it stores the architecture, weights, and compile() information, but not externally added losses/metrics or the computation graph of custom objects; see the comparison below.)
Keras also supports saving a single HDF5 file containing the model’s architecture, weights values, and compile()
information. It is a light-weight alternative to SavedModel.

model.save("my_h5_model.h5")
reconstructed_model = keras.models.load_model("my_h5_model.h5")

SavedModel vs. H5:

Compared to the SavedModel format, there are two things that don’t get included in the H5 file:

  • External losses & metrics added via model.add_loss() & model.add_metric() are not saved (unlike SavedModel).
    If you have such losses & metrics on your model and you want to resume training, you need to add these losses back yourself after loading the model. Note that this does not apply to losses/metrics created inside layers via self.add_loss() & self.add_metric() . As long as the layer gets loaded, these losses & metrics are kept, since they are part of the call method of the layer.
  • The computation graph of custom objects such as custom layers is not included in the saved file. At loading time, Keras will need access to the Python classes/functions of these objects in order to reconstruct the model. See Custom objects; a minimal loading sketch follows this list.
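For example, a minimal sketch of the custom-object case (the CustomDense class and the filename are illustrative): the Python class must be available and passed via custom_objects at load time.

from tensorflow import keras

# Hypothetical custom layer (just a renamed Dense) so there is something custom to rebuild.
class CustomDense(keras.layers.Dense):
    pass

model = keras.Sequential([keras.Input((4,)), CustomDense(2)])
model.save("my_custom_model.h5")

# Without custom_objects, load_model would not know what "CustomDense" refers to.
reconstructed = keras.models.load_model(
    "my_custom_model.h5", custom_objects={"CustomDense": CustomDense}
)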

Saving only the model architecture

If you only want to save the model architecture, without the weights or other state, there are two ways to do it.
Note: this only applies to models defined using the Functional or Sequential APIs, not subclassed models.

1) get_config() and from_config()
2) model.to_json() and tf.keras.models.model_from_json()

The difference is that the first approach keeps the config as an in-memory Python object, while the second serializes it to a JSON string (which can be written to a file). Examples:

First approach:

# Layer example:
layer = keras.layers.Dense(3, activation="relu")
layer_config = layer.get_config()
new_layer = keras.layers.Dense.from_config(layer_config)

# Sequential model example
model = keras.Sequential([keras.Input((32,)), keras.layers.Dense(1)])
config = model.get_config()
new_model = keras.Sequential.from_config(config)

# Functional model example
inputs = keras.Input((32,))
outputs = keras.layers.Dense(1)(inputs)
model = keras.Model(inputs, outputs)
config = model.get_config()
new_model = keras.Model.from_config(config)

Second approach:

# This is similar to get_config / from_config , except it turns the model into a JSON string, which can then be loaded without the original model class. 
# It is also specific to models, it isn't meant for layers.

model = keras.Sequential([keras.Input((32,)), keras.layers.Dense(1)])
json_config = model.to_json()
new_model = keras.models.model_from_json(json_config)

Saving only the model weights

Basic APIs (a quick sketch follows this list):

  • tf.keras.layers.Layer.get_weights() : Returns a list of numpy arrays.

  • tf.keras.layers.Layer.set_weights() : Sets the model weights to the values in the weights argument

  • model.save_weights(filepath)

  • model.get_weights()
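For instance, a quick sketch of what get_weights() returns and what set_weights() expects, assuming a freshly built Dense layer:

from tensorflow import keras

layer = keras.layers.Dense(3)
layer.build((None, 4))             # create the kernel and bias variables
weights = layer.get_weights()      # a list of numpy arrays: [kernel, bias]
print([w.shape for w in weights])  # [(4, 3), (3,)]
layer.set_weights(weights)         # accepts the same list of numpy arrays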

Examples:

  1. Transferring weights from one layer to another, in memory
def create_layer():
	layer = keras.layers.Dense(64, activation="relu", name="dense_2")
	layer.build((None, 784))
	return layer

layer_1 = create_layer()
layer_2 = create_layer()

# Copy weights from layer_1 to layer_2
layer_2.set_weights(layer_1.get_weights())
  2. Transferring weights from one model to another model with a compatible architecture, in memory
# Create a simple functional model
inputs = keras.Input(shape=(784,), name="digits")
x = keras.layers.Dense(64, activation="relu", name="dense_1")(inputs)
x = keras.layers.Dense(64, activation="relu", name="dense_2")(x)
outputs = keras.layers.Dense(10, name="predictions")(x)
functional_model = keras.Model(inputs=inputs, outputs=outputs, name="3_layer_mlp")

# Define a subclassed model with the same architecture
class SubclassedModel(keras.Model):
	def __init__(self, output_dim, name=None):
		super(SubclassedModel, self).__init__(name=name)
		self.output_dim = output_dim
		self.dense_1 = keras.layers.Dense(64, activation="relu", name="dense_1")
		self.dense_2 = keras.layers.Dense(64, activation="relu", name="dense_2")
		self.dense_3 = keras.layers.Dense(output_dim, name="predictions")
	def call(self, inputs):
		x = self.dense_1(inputs)
		x = self.dense_2(x)
		x = self.dense_3(x)
		return x
	def get_config(self):
		return {"output_dim": self.output_dim, "name": self.name}


subclassed_model = SubclassedModel(10)
# Call the subclassed model once to create the weights.
subclassed_model(tf.ones((1, 784)))
# Copy weights from functional_model to subclassed_model.
subclassed_model.set_weights(functional_model.get_weights())
assert len(functional_model.weights) == len(subclassed_model.weights)
for a, b in zip(functional_model.weights, subclassed_model.weights):
	np.testing.assert_allclose(a.numpy(), b.numpy())

Just as with the architecture, we can also save the weights to disk. The supported formats are:

  • TensorFlow Checkpoint (the default format used by model.save_weights())
  • HDF5

There are two ways to select the weights file format (TensorFlow Checkpoint / HDF5):

  1. Pass the save_format argument ("tf" or "h5") to save_weights().
  2. Let it be inferred from the suffix of the path passed to save_weights('path/to/location'): .h5 or .hdf5 means HDF5, any other suffix means TensorFlow Checkpoint.

Example:

# Runnable example
sequential_model = keras.Sequential(
	[
		keras.Input(shape=(784,), name="digits"),
		keras.layers.Dense(64, activation="relu", name="dense_1"),
		keras.layers.Dense(64, activation="relu", name="dense_2"),
		keras.layers.Dense(10, name="predictions"),
	]
)

sequential_model.save_weights("weights.h5")
sequential_model.load_weights("weights.h5")
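And a sketch of the TensorFlow Checkpoint case, with the format forced explicitly (the ckpt_weights prefix is just illustrative):

# Writes checkpoint files (ckpt_weights.index, ckpt_weights.data-*) instead of a single .h5 file.
sequential_model.save_weights("ckpt_weights", save_format="tf")
sequential_model.load_weights("ckpt_weights")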

tf.train.Checkpoint

Similar to tf.train.Saver in TensorFlow 1.x, this is a very powerful tool that lets you choose exactly which variables to save.

Basic usage:

# save
ckpt_path = tf.train.Checkpoint(
	my_param_1=my_param_1_value, my_param_2=my_param_2_value...
).save("ckpt")

# restore
tf.train.Checkpoint(
	my_param_1=my_param_1_value, my_param_2=my_param_2_value...
).restore(ckpt_path).assert_consumed()
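A minimal runnable sketch of this pattern with a single tf.Variable (my_var and toy_ckpt are hypothetical names):

import tensorflow as tf

v = tf.Variable(3.0)
ckpt_path = tf.train.Checkpoint(my_var=v).save("toy_ckpt")

v.assign(0.0)  # overwrite the value
tf.train.Checkpoint(my_var=v).restore(ckpt_path).assert_consumed()
print(v.numpy())  # 3.0 again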

Example:

# Create a subclassed model that essentially uses functional_model's first
# and last layers.
# First, save the weights of functional_model's first and last dense layers.
first_dense = functional_model.layers[1]
last_dense = functional_model.layers[-1]
ckpt_path = tf.train.Checkpoint(
	dense=first_dense, kernel=last_dense.kernel, bias=last_dense.bias
).save("ckpt")

# Define the subclassed model.
class ContrivedModel(keras.Model):
	def __init__(self):
		super(ContrivedModel, self).__init__()
		self.first_dense = keras.layers.Dense(64)
		self.kernel = self.add_weight("kernel", shape=(64, 10))
		self.bias = self.add_weight("bias", shape=(10,))
	def call(self, inputs):
		x = self.first_dense(inputs)
		return tf.matmul(x, self.kernel) + self.bias
		
model = ContrivedModel()
# Call model on inputs to create the variables of the dense layer.
_ = model(tf.ones((1, 784)))
# Create a Checkpoint with the same structure as before, and load the weights.
tf.train.Checkpoint(
	dense=model.first_dense, kernel=model.kernel, bias=model.bias
).restore(ckpt_path).assert_consumed()

Summary

From the file type alone you should be able to tell what was saved:

  1. SavedModel (assets + variables + saved_model.pb): the entire model
  2. json: the model architecture only
  3. h5: either the architecture plus weights (model.save("model.h5")) or just the weights (model.save_weights("weights.h5"))
  4. ckpt: a TensorFlow Checkpoint, i.e. the model weights
