
I have trained a model in TensorFlow (v2.0) Keras locally. I now need to upload this pretrained model into Google Datalab to make predictions on a large batch of data. The TensorFlow version available on Datalab is 1.8, but I assume backward compatibility.

I have uploaded the saved model (model.h5) onto Google Cloud Storage. I tried to load it into a Jupyter Notebook in Datalab like so:

%%gcs read --object gs://project-xxx/data/saved_model.h5 --variable ldmodel
model = keras.models.load_model(ldmodel)

This throws the following error:

---------------------------------------------------------------------------
UnicodeDecodeError                        Traceback (most recent call last)
<ipython-input-18-07c40785a14b> in <module>()
----> 1 model = keras.models.load_model(ldmodel)

/usr/local/envs/py3env/lib/python3.5/site-packages/tensorflow/python/keras/_impl/keras/engine/saving.py in load_model(filepath, custom_objects, compile)
    233     return obj
    234 
--> 235   with h5py.File(filepath, mode='r') as f:
    236     # instantiate model
    237     model_config = f.attrs.get('model_config')

/usr/local/envs/py3env/lib/python3.5/site-packages/h5py/_hl/files.py in __init__(self, name, mode, driver, libver, userblock_size, swmr, **kwds)
    267             with phil:
    268                 fapl = make_fapl(driver, libver, **kwds)
--> 269                 fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)
    270 
    271                 if swmr_support:

/usr/local/envs/py3env/lib/python3.5/site-packages/h5py/_hl/files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
     97         if swmr and swmr_support:
     98             flags |= h5f.ACC_SWMR_READ
---> 99         fid = h5f.open(name, flags, fapl=fapl)
    100     elif mode == 'r+':
    101         fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/h5f.pyx in h5py.h5f.open()

h5py/defs.pyx in h5py.defs.H5Fopen()

h5py/_errors.pyx in h5py._errors.set_exception()

UnicodeDecodeError: 'utf-8' codec can't decode byte 0x89 in position 29: invalid start byte

Any help will be appreciated!


2 Answers

2 votes

I wouldn't bet on backward compatibility. Here are more details on it.

In addition, your version is old: 1.8 was released in April 2018, while the latest 1.x release (1.15) came out last month.

Finally, Keras was not well integrated into TensorFlow 1.x. V2 changed all of this, and your issue stems from that incompatibility.
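
As a quick sanity check, it can help to confirm exactly which TensorFlow and bundled Keras versions the Datalab kernel is running before trying to load the file; a minimal sketch, assuming it is run in the same py3env notebook as the question:

import tensorflow as tf
from tensorflow import keras

# On the Datalab py3env this should print something like '1.8.0',
# well behind the 2.0 release the model was saved with.
print(tf.__version__)
print(keras.__version__)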

0 votes

I solved this issue by loading the pretrained model in .h5 format from GCS into a TensorFlow 2 notebook on the Google Cloud AI Platform.
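
For reference, a minimal sketch of that approach, assuming a TensorFlow 2.x notebook on AI Platform and the bucket path from the question; load_model expects a file path, so the object is copied to local disk first rather than read into a variable:

import tensorflow as tf

GCS_PATH = 'gs://project-xxx/data/saved_model.h5'
LOCAL_PATH = '/tmp/saved_model.h5'

# Copy the HDF5 object from GCS to local disk, then load it by path.
tf.io.gfile.copy(GCS_PATH, LOCAL_PATH, overwrite=True)

model = tf.keras.models.load_model(LOCAL_PATH)
model.summary()

# Batch predictions, e.g. on a NumPy array shaped like the training inputs:
# predictions = model.predict(input_batch)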