
Hello,

I have a CNN-LSTM model and would like to convert it into the OpenVINO IR format. I have saved the trained model in an HDF5 format with model checkpoint also in the same HDF5 format.

The following code is used to convert the model using Model Optimizer:

python3 mo_tf.py --input_model <directory>/cnn_lstm_model.hdf5 --output_dir <directory>/disaster --input_shape [60,160,160,3] --input_checkpoint <directory>/model_checkpoint.hdf5

However, the conversion was not successful, and I got this error:

[ FRAMEWORK ERROR ] Cannot load input model: Unable to open table file <directory>/model_checkpoint.hdf5: Data loss: not an sstable (bad magic number): perhaps your file is in a different file format and you need to use a different restore operator?

I am using OpenVINO version 2021.4 and TensorFlow version 2.1.0. Any help would be great.

Regards,

nat98


Hi nat98,


For a model in the HDF5 format, you have to load it using TensorFlow 2 and serialize it in the SavedModel format. Here is an example of how to do it:


import tensorflow as tf

model = tf.keras.models.load_model('model.h5')

tf.saved_model.save(model,'model')
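
If you want to double-check the export before running the Model Optimizer, here is a minimal sketch (assuming the SavedModel was written to the 'model' directory as above) that reloads it and prints the serving signature:

import tensorflow as tf

# Reload the exported SavedModel and print its serving signature to confirm
# the expected input shape and dtype were preserved.
loaded = tf.saved_model.load('model')
print(loaded.signatures['serving_default'].structured_input_signature)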


Then convert the SavedModel format into Intermediate Representation (IR) using the Model Optimizer:

python3 mo_tf.py --saved_model_dir <SAVED_MODEL_DIRECTORY> --output_dir <OUTPUT_MODEL_DIR>
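
If it helps, a quick way to confirm the conversion produced a loadable IR (a minimal sketch, assuming the Model Optimizer wrote saved_model.xml and saved_model.bin to the output directory; adjust the file names to match your run):

from openvino.inference_engine import IECore

# Read the generated IR and list its input and output names.
ie = IECore()
net = ie.read_network(model='saved_model.xml', weights='saved_model.bin')
print(list(net.input_info.keys()), list(net.outputs.keys()))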


For more information, you can refer to the Convert TensorFlow 2 Models documentation.



Regards,

Peh


Thanks for your reply.

I have tried the method from your reply: loading the model using TensorFlow 2 and serializing it in the SavedModel format.

import tensorflow as tf
model = tf.keras.models.load_model('cnn_lstm_model.h5')
tf.saved_model.save(model,'model')

I got an error when I ran the above code. Attached is the output of the terminal from the conversion. The error is exactly the same whether the model file is cnn_lstm_model.h5 or cnn_lstm_model.hdf5.

For your information, the second attachment is my model summary.

Regards,

nat98

Hi nat98,


Based on your first attachment, the error you encountered might be due to h5py version 3.0.0, which causes issues with loading Keras models in TensorFlow 2.1.0.


Please try to downgrade the h5py package with the following command:

pip install 'h5py==2.10.0' --force-reinstall
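
You can confirm that the downgrade took effect with a quick check (a one-line sketch):

import h5py
print(h5py.__version__)  # should now report 2.10.0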


Here are two similar discussions that you can refer to:

h5py==3.0.0 causes issues with keras model loads in tensorflow 2.1.0

Does Any one got "AttributeError: 'str' object has no attribute 'decode' " , while Loading a Keras Saved Model



Regards,

Peh


Hi Peh,

Thanks for your suggestion. I have downgraded the h5py package to version 2.10.0, and this solved the "AttributeError" issue. Thanks for that.

I loaded the model using TensorFlow 2, serialized it in the SavedModel format, and that conversion was successful.

Now, when I convert to the OpenVINO IR using the following commands:

python3 mo_tf.py --saved_model_dir <SAVED_MODEL_DIRECTORY> --output_dir <OUTPUT_MODEL_DIR>

python3 mo_tf.py --saved_model_dir <SAVED_MODEL_DIRECTORY> --output_dir <OUTPUT_MODEL_DIR> --input_shape [60,160,160,3]

I got the following errors:

[ ERROR ] Cannot infer shapes or values for node "StatefulPartitionedCall_1".
[ ERROR ] Expected DataType for argument 'dtype' not <class 'str'>.
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function tf_native_tf_node_infer at 0x7f39a20a4620>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "StatefulPartitionedCall_1" node.
For more information please refer to Model Optimizer FAQ, question #38. ( https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?question=38#question-38 )

Attached is the screenshot of the terminal.

Regards,

nat98


Thanks for your response.

The links to my models in both HDF5 and SavedModel formats are as follows:

HDF5: https://drive.google.com/file/d/1ePuQrVXNA1diNEOvNlqvSYZ7RTOzncIZ/view?usp=sharing

SavedModel: https://drive.google.com/file/d/1ehp6dJ9uNEY13yIzI_OtVsXiSbxYDsdd/view?usp=sharing

Note that I could not attach my models to this reply due to the maximum attachment size allowed.

Regards,

nat98

Thanks for sharing your models with us.

I also encountered the same error when converting your shared SavedModel files into IR. I then serialized the HDF5 model into the SavedModel format and successfully converted it into IR. I noticed that my converted saved_model.pb file is 1727KB while yours is just 559KB.

Here are my steps:

1. Install the prerequisites for TensorFlow 2:

cd /opt/intel/openvino_2021.4.582/deployment_tools/model_optimizer/install_prerequisites

./install_prerequisites_tf2.sh
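
Optionally, you can confirm that the environment now uses a TensorFlow 2 release; a minimal check:

import tensorflow as tf
print(tf.__version__)  # should report a 2.x release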

2. Serialize the HDF5 format model into SavedModel format:

import tensorflow as tf
from tensorflow.keras.models import load_model

# Load the Keras HDF5 model and export it in the SavedModel format.
model = load_model('path/cnn_lstm_model.hdf5')
tf.saved_model.save(model, 'model')
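
As a quick check (a sketch, assuming the SavedModel was written to the 'model' directory as above), you can compare the size of the exported saved_model.pb against the roughly 1727KB file I obtained:

import os
print(os.path.getsize('model/saved_model.pb'), 'bytes')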

3. Convert SavedModel format into IR:

python3 mo_tf.py --saved_model_dir <path_SavedModel> --input_shape [1,60,160,160,3]
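
As a final sanity check, here is a minimal inference sketch with a dummy input of the converted shape [1,60,160,160,3] (assuming the IR files are named saved_model.xml and saved_model.bin; adjust the names to match your output directory):

import numpy as np
from openvino.inference_engine import IECore

# Load the IR, compile it for CPU, and run one inference on zero-filled data.
ie = IECore()
net = ie.read_network(model='saved_model.xml', weights='saved_model.bin')
exec_net = ie.load_network(network=net, device_name='CPU')
input_blob = next(iter(net.input_info))
dummy = np.zeros((1, 60, 160, 160, 3), dtype=np.float32)
result = exec_net.infer(inputs={input_blob: dummy})
print({name: out.shape for name, out in result.items()})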

Link to SavedModel format and IR:

https://drive.google.com/file/d/114MxoDn7GjJ4GqWsZ9A4cgIB4cAZU1kF/view?usp=sharing

Regards,

Peh

Hi Peh,

I followed your steps, and indeed, after installing/upgrading the TensorFlow 2 prerequisites, the converted saved_model.pb file is now 1.8 MB.

I am also able to convert the model into the OpenVINO IR format.

Thanks a lot for your time in solving my issues.

Regards,

nat98
