
Save Model: TypeError: ('Not JSON Serializable:', Dimension(2048)) #9342

Closed

Description

@Offpics

I made a Stack Overflow question about this because it seems people have had a similar problem in the past: https://stackoverflow.com/questions/48691449/typeerror-not-json-serializable-dimension2048.

When I try to save this particular model, it gives TypeError: ('Not JSON Serializable:', Dimension(2048)).

I can save other models without a problem and I don't understand why this one doesn't work.

I tried to save it on Windows 10 with python_ver = 3.6, tensorflow_ver = 1.6-rc0, and on Ubuntu 16.04 with python_ver = 3.6, tensorflow_ver = 1.3.

I created a model and trained it using the code below.

from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import InputLayer
from tensorflow.python.keras.layers import Dense


# Declare variables for model.
transfer_len = 2048
num_classes = 3


# Model creation.
model = Sequential()
# Input layer of shape 2048.
model.add(InputLayer(input_shape = (transfer_len,)))
# Fully connected 1024.
model.add(Dense(1024, activation='relu'))
# Output layer.
model.add(Dense(num_classes, activation='softmax'))


from tensorflow.python.keras.optimizers import Adam

optimizer = Adam(lr=1e-3)

model.compile(optimizer=optimizer,
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x=transfer_values_train,
          y=labels_train,
          epochs=20, batch_size=100, verbose=0)

When I try to save it, it gives the following error.

output_path = "model.keras"
model.save(output_path)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-22-6a252d3d7102> in <module>()
----> 1 model.save(output_path)

~\Anaconda3\envs\gpu\lib\site-packages\tensorflow\python\keras\_impl\keras\engine\topology.py in save(self, filepath, overwrite, include_optimizer)
   1044     """
   1045     from tensorflow.python.keras._impl.keras.models import save_model  # pylint: disable=g-import-not-at-top
-> 1046     save_model(self, filepath, overwrite, include_optimizer)
   1047 
   1048   def save_weights(self, filepath, overwrite=True):

~\Anaconda3\envs\gpu\lib\site-packages\tensorflow\python\keras\_impl\keras\models.py in save_model(model, filepath, overwrite, include_optimizer)
    131             'config': model.get_config()
    132         },
--> 133         default=get_json_type).encode('utf8')
    134 
    135     model_weights_group = f.create_group('model_weights')

~\Anaconda3\envs\gpu\lib\json\__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
    236         check_circular=check_circular, allow_nan=allow_nan, indent=indent,
    237         separators=separators, default=default, sort_keys=sort_keys,
--> 238         **kw).encode(obj)
    239 
    240 

~\Anaconda3\envs\gpu\lib\json\encoder.py in encode(self, o)
    197         # exceptions aren't as detailed.  The list call should be roughly
    198         # equivalent to the PySequence_Fast that ''.join() would do.
--> 199         chunks = self.iterencode(o, _one_shot=True)
    200         if not isinstance(chunks, (list, tuple)):
    201             chunks = list(chunks)

~\Anaconda3\envs\gpu\lib\json\encoder.py in iterencode(self, o, _one_shot)
    255                 self.key_separator, self.item_separator, self.sort_keys,
    256                 self.skipkeys, _one_shot)
--> 257         return _iterencode(o, 0)
    258 
    259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,

~\Anaconda3\envs\gpu\lib\site-packages\tensorflow\python\keras\_impl\keras\models.py in get_json_type(obj)
    113       return obj.__name__
    114 
--> 115     raise TypeError('Not JSON Serializable:', obj)
    116 
    117   from tensorflow.python.keras._impl.keras import __version__ as keras_version  # pylint: disable=g-import-not-at-top

TypeError: ('Not JSON Serializable:', Dimension(2048))
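For reference, the failure mechanism can be reproduced with the standard library alone: json.dumps raises exactly this kind of TypeError for any object it has no encoding for. A minimal sketch, using a stand-in class in place of TensorFlow's Dimension so it runs without TensorFlow installed:

```python
import json

# Stand-in for tensorflow.python.framework.tensor_shape.Dimension,
# used here only to reproduce the failure without TensorFlow.
class Dimension:
    def __init__(self, value):
        self.value = value
    def __repr__(self):
        return 'Dimension(%s)' % self.value

# A model config containing a Dimension, like the one Keras builds
# internally when input_shape was given a Dimension instead of an int.
config = {'batch_input_shape': [None, Dimension(2048)]}

try:
    json.dumps(config)
except TypeError as e:
    print(e)  # json has no encoder for Dimension, so it raises TypeError
```

This is why model.save fails only for this model: its config contains a Dimension where an int was expected.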

Activity

Offpics (Author) commented on Feb 9, 2018

Tried with tensorflow_ver = 1.5.0, python_ver = 3.5.0 and it gives the same error.
Also changed the model to Functional and it raises the same error.

from tensorflow.python.keras.layers import Input, Dense
from tensorflow.python.keras.models import Model

inputs = Input(shape=(transfer_len,))
net = Dense(1024, activation='relu')(inputs)
net = Dense(num_classes, activation='softmax')(net)
outputs = net
model = Model(inputs=inputs, outputs=outputs)
Offpics (Author) commented on Feb 9, 2018
Okay, so the transfer_len variable was of type 'tensorflow.python.framework.tensor_shape.Dimension'.
Changed it to int and it saves normally.
Now the question is whether it is possible to fix this in the save method.

ghost commented on Apr 24, 2018

Hello!

I was able to fix this by changing get_json_type here:
tensorflow/python/keras/_impl/keras/engine/saving.py.

This is the function that's called from the \tensorflow\python\keras\_impl\keras\models.py file that raised your error.

Looks like the original function essentially if/else's the classes to find a JSON-friendly representation and raises a TypeError if it can't.

The following lines make it happy, but I'm sure there's a better, more principled fix. Does anyone know the best place to put code like this? I am more than happy to take a stab at it, just wondering where it should go.
It seems like standalone Keras fixed this when it marshal.dump(...)s the object; I'm not sure what the tf.keras equivalent is.

# ... earlier get_json_type code
# NOTE: Hacky fix to serialize Dimension objects.
from tensorflow.python.framework.tensor_shape import Dimension
if type(obj) == Dimension:
  return int(obj)
# original error raised here
raise TypeError('Not JSON Serializable:', obj)
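The same idea also works without patching Keras, as a default hook passed to json.dumps. A minimal sketch with a stand-in Dimension class (the real one lives in tensorflow.python.framework.tensor_shape), showing the serialization succeed once Dimension is mapped to int:

```python
import json

# Stand-in for TF's Dimension, for illustration only.
class Dimension:
    def __init__(self, value):
        self.value = value
    def __int__(self):
        return self.value

def get_json_type(obj):
    # Serialize Dimension objects as plain ints; mirror Keras's
    # behavior of raising TypeError for anything else.
    if type(obj) == Dimension:
        return int(obj)
    raise TypeError('Not JSON Serializable:', obj)

config = {'batch_input_shape': [None, Dimension(2048)]}
print(json.dumps(config, default=get_json_type))
# {"batch_input_shape": [null, 2048]}
```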
ArchieMeng commented on Jun 10, 2018

Since a Dimension object's value can be None, as in Dimension(None), a better fix is:

from tensorflow.python.framework.tensor_shape import Dimension
if type(obj) == Dimension:
  return int(obj.value or 0)
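The difference matters for unknown dimensions: int() on a Dimension whose value is None would fail, while obj.value or 0 maps None to 0. A sketch with a stand-in Dimension class (the real one is TensorFlow's; mapping None to 0 is this comment's convention, not necessarily the ideal encoding):

```python
import json

# Stand-in for TF's Dimension; .value may be None for unknown dims.
class Dimension:
    def __init__(self, value):
        self.value = value

def get_json_type(obj):
    if type(obj) == Dimension:
        # None (unknown dimension) becomes 0 instead of crashing int().
        return int(obj.value or 0)
    raise TypeError('Not JSON Serializable:', obj)

print(json.dumps([Dimension(2048), Dimension(None)], default=get_json_type))
# [2048, 0]
```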
ialhashim commented on Sep 3, 2018

I still need to add the hacks suggested by @Clck and @ArchieMeng. Why isn't this issue fixed?

gillesdegottex commented on Sep 19, 2018

The fix from @Clck and @ArchieMeng definitely works. It seems get_json_type() is in engine/network.py now, though.

The question is how to make from tensorflow.python.framework.tensor_shape import Dimension backend-independent, which might explain why there has been no PR about this.
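One backend-independent option (a hypothetical sketch, not an actual Keras patch) is to avoid importing Dimension at all and duck-type on int conversion, so any backend's dimension type that supports __int__ is handled:

```python
def get_json_type(obj):
    # Anything that converts cleanly to int (e.g. a TF Dimension or a
    # NumPy integer) is serialized as a plain int; no TF import needed.
    # Note this is looser than an isinstance check: it also accepts
    # numeric strings, which may or may not be acceptable.
    try:
        return int(obj)
    except (TypeError, ValueError):
        raise TypeError('Not JSON Serializable:', obj)

# Hypothetical stand-in for any backend's dimension type.
class FakeDimension:
    def __init__(self, value):
        self.value = value
    def __int__(self):
        return self.value

print(get_json_type(FakeDimension(2048)))  # 2048
```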

cottrell commented on Mar 27, 2019

Possibly related: #12473

sandippaul7204 commented on Sep 15, 2020

I am having the same problem. I have added the code lines in Keras as suggested by you. It returns the same error.

ArchieMeng commented on Oct 18, 2020

> I am having the same problem. I have added the code lines in Keras as suggested by you. It returns the same error.

Well, I just came back to Keras and deep learning recently, and I used the model.save method as well. However, it doesn't raise the error I got years ago; it works fine now.
So, how did you create your model?

          Save Model: TypeError: ('Not JSON Serializable:', Dimension(2048)) · Issue #9342 · keras-team/keras