philipperemy/keras-tcn

tcn_full_summary

jjfaj opened this issue · 4 comments

jjfaj commented

Describe the bug
When I use your sample code, I get the error below...

Paste a snippet
Calling `tcn_full_summary(model=m, expand_residual_blocks=False)` raises:


```
AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>()
      1
----> 2 tcn_full_summary(model=m, expand_residual_blocks=False)
      3 # tcn_full_summary(model=m)
      4

/usr/local/lib/python3.7/dist-packages/tcn/tcn.py in tcn_full_summary(model, expand_residual_blocks)
    457
    458 def tcn_full_summary(model: Model, expand_residual_blocks=True):
--> 459     layers = model._layers.copy()  # store existing layers
    460     model._layers.clear()  # clear layers
    461

AttributeError: 'Sequential' object has no attribute '_layers'
```

Dependencies
I am using Colab with TensorFlow 2.5.

@jjfaj TF 2.5 broke compatibility with this function. Try 2.4 in the meantime while we find a way to fix it.

Commenting to watch, as I have the same issue.

I'll do my best to provide a version compatible with tensorflow 2.5+.

So guys, after many attempts, I could not get anything satisfactory. I think we should all move to TensorBoard.

Something like this is more visual:

*(screenshot: TensorBoard graph view of the model)*

I posted an example of how to use it: https://github.com/philipperemy/keras-tcn/blob/master/tasks/tcn_tensorboard.py.
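If you can't open the script, here is a minimal sketch of the idea. A plain Dense model stands in for a TCN model here (the callback usage is identical); the layer sizes and `logs` directory are arbitrary choices for illustration:

```python
import numpy as np
import tensorflow as tf

# Any Keras model works; swap the Dense layers for a TCN layer in practice.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(4, activation='relu'),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

# write_graph=True logs the model graph during fit();
# inspect it afterwards with: tensorboard --logdir logs
tb = tf.keras.callbacks.TensorBoard(log_dir='logs', write_graph=True)
x = np.random.rand(8, 3).astype('float32')
y = np.random.rand(8, 1).astype('float32')
model.fit(x, y, epochs=1, callbacks=[tb], verbose=0)
```

Then open TensorBoard and switch to the Graphs tab to browse the model visually.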

I will deprecate the function tcn_full_summary().

For reference, here is the code where I tried to get it working with tensorflow 2.5+:

```python
from tensorflow.keras import Model, Sequential

from tcn import TCN
from tcn.tcn import ResidualBlock


def tcn_full_summary(model: Model):
    existing_layers = list(model.layers)  # store existing layers
    all_layers = []

    # Expected parameter counts for the reference model:
    # {'embedding/embeddings:0': 2560000,
    #  'tcn/residual_block_0/conv1D_0/kernel:0': 49152, x
    #  'tcn/residual_block_0/conv1D_0/bias:0': 64, x
    #  'tcn/residual_block_0/conv1D_1/kernel:0': 24576, x
    #  'tcn/residual_block_0/conv1D_1/bias:0': 64, x
    #  'tcn/residual_block_0/matching_conv1D/kernel:0': 8192, <- missing?
    #  'tcn/residual_block_0/matching_conv1D/bias:0': 64, <- missing?
    #  'tcn/residual_block_1/conv1D_0/kernel:0': 24576, x
    #  'tcn/residual_block_1/conv1D_0/bias:0': 64, x
    #  'tcn/residual_block_1/conv1D_1/kernel:0': 24576, x
    #  'tcn/residual_block_1/conv1D_1/bias:0': 64, x
    #  'dense/kernel:0': 64,
    #  'dense/bias:0': 1}
    # [2560000, 49152, 64, 24576, 64, 8192, 64, 24576, 64, 24576, 64, 64, 1]
    for existing_layer in existing_layers:
        if isinstance(existing_layer, TCN):
            # Walk the sub-layers tracked inside the TCN layer.
            for tcn_layer in existing_layer._self_tracked_trackables:
                if isinstance(tcn_layer, ResidualBlock):
                    for res_layer in tcn_layer.layers:
                        if not hasattr(res_layer, '__iter__'):
                            all_layers.append(res_layer)
                        else:
                            print('PASS', res_layer)
                    if hasattr(tcn_layer, 'matching_conv1D'):
                        all_layers.append(tcn_layer.matching_conv1D)
                else:
                    if not hasattr(tcn_layer, '__iter__'):
                        all_layers.append(tcn_layer)
        else:
            all_layers.append(existing_layer)

    # Suffix each name with its index to avoid duplicate layer names
    # in the temporary Sequential model below.
    for i, a in enumerate(all_layers):
        a._name = a.name + '__' + str(i)

    Sequential(layers=all_layers).summary()
```
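For context on the root cause: as the traceback shows, the old implementation cleared and restored the private `Model._layers` list, which is gone in TF 2.5+. The public `model.layers` accessor still works, which is why the code above starts from it. A minimal check on a toy model (the layer here is just an example):

```python
import tensorflow as tf

m = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(1),
])

# The public accessor is stable across TF versions; the private
# `_layers` attribute the old tcn_full_summary mutated is not.
names = [layer.name for layer in m.layers]
print(names)
```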

I will close this issue, but don't hesitate to revisit it if you find the magical solution ;)