manicman1999/StyleGAN2-Tensorflow-2.0

TypeError: Dimension value must be integer or None or have an __index__ method, got 256.0

Opened this issue · 4 comments

I tried running this on Google Colab, and it gives the following error when I try to run the model.

TypeError: Dimension value must be integer or None or have an __index__ method, got 256.0

TypeError                                 Traceback (most recent call last)
<ipython-input-35-1135957cf48f> in <module>()
      1 if __name__ == "__main__":
----> 2     model = StyleGAN(lr = 0.0001, silent = False)
      3     model.evaluate(0)
      4 
      5     while model.GAN.steps < 1000001:

14 frames
<ipython-input-34-e6decac87252> in __init__(self, steps, lr, decay, silent)
    342 
    343         #Init GAN and Eval Models
--> 344         self.GAN = GAN(steps = steps, lr = lr, decay = decay)
    345         self.GAN.GenModel()
    346         self.GAN.GenModelA()

<ipython-input-34-e6decac87252> in __init__(self, steps, lr, decay)
    161         #Init Models
    162         self.discriminator()
--> 163         self.generator()
    164 
    165         self.GMO = Adam(lr = self.LR, beta_1 = 0, beta_2 = 0.999)

<ipython-input-34-e6decac87252> in generator(self)
    240         x = Reshape([4, 4, 4*cha])(x)
    241 
--> 242         x, r = g_block(x, inp_style[0], inp_noise, 32 * cha, u = False)  #4
    243         outs.append(r)
    244 

<ipython-input-34-e6decac87252> in g_block(inp, istyle, inoise, fil, u)
    104     out = LeakyReLU(0.2)(out)
    105 
--> 106     return out, to_rgb(out, rgb_style)
    107 
    108 def d_block(inp, fil, p = True):

<ipython-input-34-e6decac87252> in to_rgb(inp, style)
    125     size = inp.shape[2]
    126     x = Conv2DMod(3, 1, kernel_initializer = VarianceScaling(200/size), demod = False)([inp, style])
--> 127     return Lambda(upsample_to_size, output_shape=[None, im_size, im_size, None])(x)
    128 
    129 def from_rgb(inp, conc = None):

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    920                     not base_layer_utils.is_in_eager_or_tf_function()):
    921                   with auto_control_deps.AutomaticControlDependencies() as acd:
--> 922                     outputs = call_fn(cast_inputs, *args, **kwargs)
    923                     # Wrap Tensors in `outputs` in `tf.identity` to avoid
    924                     # circular dependencies.

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/core.py in call(self, inputs, mask, training)
    886     with backprop.GradientTape(watch_accessed_variables=True) as tape,\
    887         variable_scope.variable_creator_scope(_variable_creator):
--> 888       result = self.function(inputs, **kwargs)
    889     self._check_variables(created_variables, tape.watched_variables())
    890     return result

<ipython-input-34-e6decac87252> in upsample_to_size(x)
     75 def upsample_to_size(x):
     76     y = im_size / x.shape[2]
---> 77     x = K.resize_images(x, y, y, "channels_last",interpolation='bilinear')
     78     return x
     79 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/backend.py in resize_images(x, height_factor, width_factor, data_format, interpolation)
   2829   else:
   2830     output_shape = (None, new_height, new_width, None)
-> 2831   x.set_shape(output_shape)
   2832   return x
   2833 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in set_shape(self, shape)
    637     # We want set_shape to be reflected in the C API graph for when we run it.
    638     if not isinstance(shape, tensor_shape.TensorShape):
--> 639       shape = tensor_shape.TensorShape(shape)
    640     dim_list = []
    641     if shape.dims is None:

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in __init__(self, dims)
    769         self._dims = [as_dimension(dims)]
    770       else:
--> 771         self._dims = [as_dimension(d) for d in dims_iter]
    772 
    773   @property

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in <listcomp>(.0)
    769         self._dims = [as_dimension(dims)]
    770       else:
--> 771         self._dims = [as_dimension(d) for d in dims_iter]
    772 
    773   @property

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in as_dimension(value)
    714     return value
    715   else:
--> 716     return Dimension(value)
    717 
    718 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in __init__(self, value)
    198             TypeError("Dimension value must be integer or None or have "
    199                       "an __index__ method, got {!r}".format(value)),
--> 200             None)
    201       if self._value < 0:
    202         raise ValueError("Dimension %d must be >= 0" % self._value)

/usr/local/lib/python3.6/dist-packages/six.py in raise_from(value, from_value)

TypeError: Dimension value must be integer or None or have an __index__ method, got 256.0

Hey,
I was getting the same problem, and changing this line in the upsample_to_size function:
y = im_size / x.shape[2]
to
y = im_size // x.shape[2]
solved it for me (with the original / division, the result was a float rather than an integer, which TensorFlow rejects as a dimension).
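For reference, here is a minimal sketch of the patched helper, assuming the im_size global and the Keras backend import K from the original notebook:

# upsample_to_size with integer division: the resize factor stays an int,
# so K.resize_images / set_shape no longer receives a float dimension.
def upsample_to_size(x):
    y = im_size // x.shape[2]  # e.g. 256 // 4 = 64 (int) instead of 256 / 4 = 64.0 (float)
    x = K.resize_images(x, y, y, "channels_last", interpolation='bilinear')
    return x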

Thank you so much for the reply. This worked!

I stumbled on model.load(), though, with the same problem appearing when it loads the JSON. I tried changing some dtypes in gen.json to int32, and also turning all float32 to int32, but neither works.
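
One possible workaround, not from this thread and only a sketch: the saved gen.json still contains the Lambda layer with the old float division baked in, so instead of going through model_from_json you could rebuild the generator from the patched code and load only the HDF5 weights. The path below just follows the pattern visible in loadModel (with num = 28 from the call above), the other sub-models (dis, sty, genMA, ...) would need the same treatment, and this may still fail if the rebuilt graph differs from the saved one:

# Hypothetical workaround: re-create the architecture with the fixed
# upsample_to_size, then load weights only, bypassing the stale JSON config.
model = StyleGAN(lr = 0.0001, silent = False)   # rebuilds GAN.G with the patched function
model.GAN.G.load_weights("/content/drive/My Drive/GAN/Models/gen_28.h5")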

TypeError                                 Traceback (most recent call last)
<ipython-input-26-31b4c636623b> in <module>()
      6     #    model.train()
      7 
----> 8 model.load(28)
      9 n1 = noiseList(64)
     10 n2 = nImage(64)

17 frames
<ipython-input-20-0bebf3a8d64a> in load(self, num)
    614         self.GAN.D = self.loadModel("dis", num)
    615         self.GAN.S = self.loadModel("sty", num)
--> 616         self.GAN.G = self.loadModel("gen", num)
    617 
    618         self.GAN.GE = self.loadModel("genMA", num)

<ipython-input-20-0bebf3a8d64a> in loadModel(self, name, num)
    595         file.close()
    596 
--> 597         mod = model_from_json(json, custom_objects = {'Conv2DMod': Conv2DMod})
    598         mod.load_weights("/content/drive/My Drive/GAN/Models/"+name+"_"+str(num)+".h5")
    599 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/saving/model_config.py in model_from_json(json_string, custom_objects)
    114   config = json.loads(json_string)
    115   from tensorflow.python.keras.layers import deserialize  # pylint: disable=g-import-not-at-top
--> 116   return deserialize(config, custom_objects=custom_objects)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/serialization.py in deserialize(config, custom_objects)
    107       module_objects=globs,
    108       custom_objects=custom_objects,
--> 109       printable_module_name='layer')

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/utils/generic_utils.py in deserialize_keras_object(identifier, module_objects, custom_objects, printable_module_name)
    371             custom_objects=dict(
    372                 list(_GLOBAL_CUSTOM_OBJECTS.items()) +
--> 373                 list(custom_objects.items())))
    374       with CustomObjectScope(custom_objects):
    375         return cls.from_config(cls_config)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/network.py in from_config(cls, config, custom_objects)
    985     """
    986     input_tensors, output_tensors, created_layers = reconstruct_from_config(
--> 987         config, custom_objects)
    988     model = cls(inputs=input_tensors, outputs=output_tensors,
    989                 name=config.get('name'))

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/network.py in reconstruct_from_config(config, custom_objects, created_layers)
   2027       if layer in unprocessed_nodes:
   2028         for node_data in unprocessed_nodes.pop(layer):
-> 2029           process_node(layer, node_data)
   2030 
   2031   input_tensors = []

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/network.py in process_node(layer, node_data)
   1975     if input_tensors is not None:
   1976       input_tensors = base_layer_utils.unnest_if_single_tensor(input_tensors)
-> 1977       output_tensors = layer(input_tensors, **kwargs)
   1978 
   1979       # Update node index map.

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    920                     not base_layer_utils.is_in_eager_or_tf_function()):
    921                   with auto_control_deps.AutomaticControlDependencies() as acd:
--> 922                     outputs = call_fn(cast_inputs, *args, **kwargs)
    923                     # Wrap Tensors in `outputs` in `tf.identity` to avoid
    924                     # circular dependencies.

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/core.py in call(self, inputs, mask, training)
    886     with backprop.GradientTape(watch_accessed_variables=True) as tape,\
    887         variable_scope.variable_creator_scope(_variable_creator):
--> 888       result = self.function(inputs, **kwargs)
    889     self._check_variables(created_variables, tape.watched_variables())
    890     return result

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/core.py in upsample_to_size(x)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/backend.py in resize_images(x, height_factor, width_factor, data_format, interpolation)
   2829   else:
   2830     output_shape = (None, new_height, new_width, None)
-> 2831   x.set_shape(output_shape)
   2832   return x
   2833 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in set_shape(self, shape)
    637     # We want set_shape to be reflected in the C API graph for when we run it.
    638     if not isinstance(shape, tensor_shape.TensorShape):
--> 639       shape = tensor_shape.TensorShape(shape)
    640     dim_list = []
    641     if shape.dims is None:

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in __init__(self, dims)
    769         self._dims = [as_dimension(dims)]
    770       else:
--> 771         self._dims = [as_dimension(d) for d in dims_iter]
    772 
    773   @property

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in <listcomp>(.0)
    769         self._dims = [as_dimension(dims)]
    770       else:
--> 771         self._dims = [as_dimension(d) for d in dims_iter]
    772 
    773   @property

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in as_dimension(value)
    714     return value
    715   else:
--> 716     return Dimension(value)
    717 
    718 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py in __init__(self, value)
    198             TypeError("Dimension value must be integer or None or have "
    199                       "an __index__ method, got {!r}".format(value)),
--> 200             None)
    201       if self._value < 0:
    202         raise ValueError("Dimension %d must be >= 0" % self._value)

/usr/local/lib/python3.6/dist-packages/six.py in raise_from(value, from_value)

TypeError: Dimension value must be integer or None or have an __index__ method, got 256.0

@cuakevinlex Hi. I ran into the same problem. I believe there is a compatibility issue with newer TensorFlow versions; I tested with TensorFlow 2.1 and 2.2, and both gave the same error.

The author stated that he used TensorFlow 2.0, so I tried that out. It gave a different error regarding CPU/GPU: it turns out TF 2.0 expects a different CUDA version. So I downloaded this Docker image, which should have compatible CUDA and TF versions, and it ran perfectly.
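
If you do pin TensorFlow 2.0 (via that Docker image or otherwise), a trivial sanity check at the top of the notebook, just a suggestion, confirms which version actually gets imported:

# Confirm the runtime is on TF 2.0.x before building the model.
import tensorflow as tf
print(tf.__version__)
assert tf.__version__.startswith("2.0."), "newer TF versions hit the Dimension/float issue above"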