How to load a local model with TensorFlow Hub, and how to reach a model over the public internet

TensorFlow 2.x

Take small_bert/bert_en_uncased_L-2_H-128_A-2/1 as an example:

https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1

Option 1: the tfhub.dev handle can be rewritten into a URL that points to the module's compressed TGZ archive. The key point is that https://storage.googleapis.com is reachable from the public internet.

https://storage.googleapis.com/tfhub-modules/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1.tar.gz
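As a minimal sketch of this rewrite (assuming the handle follows the usual https://tfhub.dev/<publisher>/<model>/<version> pattern, as in the example above; the helper name is hypothetical):

def tfhub_to_gcs_url(handle):
    # Rewrite a tfhub.dev handle into the publicly reachable
    # storage.googleapis.com TGZ URL shown above.
    path = handle.replace("https://tfhub.dev/", "")
    return "https://storage.googleapis.com/tfhub-modules/" + path + ".tar.gz"

print(tfhub_to_gcs_url(
    "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1"))
# -> https://storage.googleapis.com/tfhub-modules/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1.tar.gz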

Option 2: download and extract the archive to the local file system (or a cloud file system) and load it from there directly, e.g.:

C:\bert\small_bert_bert_en_uncased_L-2_H-128_A-2_1
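A hedged sketch of the download-and-extract step in Python (the URL and target directory are the ones from this example; adjust paths for your environment):

import tarfile
import urllib.request

tgz_url = ("https://storage.googleapis.com/tfhub-modules/"
           "tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1.tar.gz")
archive_path = "bert_en_uncased_L-2_H-128_A-2_1.tar.gz"
extract_dir = "C:\\bert\\small_bert_bert_en_uncased_L-2_H-128_A-2_1"

urllib.request.urlretrieve(tgz_url, archive_path)   # download the TGZ archive
with tarfile.open(archive_path, "r:gz") as tar:     # unpack into the target directory
    tar.extractall(extract_dir)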

The following is working code:

print("=========Load Remote hub model:hub.KerasLayer =============")
#tfhub_bert="https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1"
#tfhub_bert="https://storage.googleapis.com/tfhub-modules/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1.tar.gz"
#bert_layer    = hub.KerasLayer(tfhub_bert,trainable=False)
print("=========Load Remote hub model:hub.KerasLayer, success!=============")

print("=========Load local hub model:hub.KerasLayer =============")
model_dir="C:\\bert\\small_bert_bert_en_uncased_L-2_H-128_A-2_1\\"
bert_layer = hub.KerasLayer(model_dir,trainable=False)
print("=========Load local hub model:hub.KerasLayer, success!=============")

 

Official documentation:

Hub loading functions:

hub.load(handle, tags=None, options=None)

hub.resolve(handle)

hub.KerasLayer(
    handle, trainable=False, arguments=None, _sentinel=None, tags=None,
    signature=None, signature_outputs_as_dict=None, output_key=None,
    output_shape=None, load_options=None, **kwargs
)
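A short, hedged illustration of how these three functions relate; the handle below is the local directory from the working code above, but any supported handle type works the same way:

import tensorflow_hub as hub

handle = "C:\\bert\\small_bert_bert_en_uncased_L-2_H-128_A-2_1\\"

local_path = hub.resolve(handle)                 # resolve the handle to a directory on disk
obj = hub.load(handle)                           # load the underlying SavedModel object
layer = hub.KerasLayer(handle, trainable=False)  # wrap it as a Keras layer

print("Resolved module directory:", local_path)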

Handle types supported by the Hub loading functions:

1) A hosted URL handled by smart URL resolvers such as tfhub.dev, e.g. https://tfhub.dev/google/nnlm-en-dim128/1.

2) A directory on a file system supported by TensorFlow that contains the module files. This may be a local directory (e.g. /usr/local/mymodule) or a Google Cloud Storage bucket (gs://mymodule).

3) A URL pointing to a TGZ archive of a module, e.g. https://example.com/mymodule.tar.gz. All three forms are illustrated in the sketch below.
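As a hedged sketch, the three handle forms look like this when passed to hub.KerasLayer (or hub.load); only one would be used at a time, and the local path is the example directory above:

import tensorflow_hub as hub

handles = [
    "https://tfhub.dev/google/nnlm-en-dim128/1",                    # 1) hosted URL
    "C:\\bert\\small_bert_bert_en_uncased_L-2_H-128_A-2_1\\",       # 2) local / cloud directory
    "https://storage.googleapis.com/tfhub-modules/"
    "tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1.tar.gz", # 3) TGZ archive URL
]

# Any of these strings can be passed as the handle argument, e.g.:
layer = hub.KerasLayer(handles[1], trainable=False)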

Reference: https://www.tensorflow.org/hub/api_docs/python/hub

 
