BrikerMan/Kashgari

AttributeError: 'DoubleBLSTMModel' object has no attribute 'processor'

ahmad-alismail opened this issue · 2 comments

Hello,
I am trying to reimplement the customized model example from the text-labeling documentation:

from typing import Dict, Any

from tensorflow import keras

from kashgari.tasks.labeling.abc_model import ABCLabelingModel
from kashgari.layers import L

import logging
logging.basicConfig(level='DEBUG')

class DoubleBLSTMModel(ABCLabelingModel):
    """Bidirectional LSTM Sequence Labeling Model"""

    @classmethod
    def default_hyper_parameters(cls) -> Dict[str, Dict[str, Any]]:
        """
        Get hyper parameters of model
        Returns:
            hyper parameters dict
        """
        return {
            'layer_blstm1': {
                'units': 128,
                'return_sequences': True
            },
            'layer_blstm2': {
                'units': 128,
                'return_sequences': True
            },
            'layer_dropout': {
                'rate': 0.2
            },
            'layer_time_distributed': {},
            'layer_activation': {
                'activation': 'softmax'
            }
        }

    def build_model_arc(self):
        """
        Build the model architecture.
        """
        output_dim = len(self.processor.label2idx)
        config = self.hyper_parameters
        embed_model = self.embedding.embed_model

        # Define your layers
        layer_blstm1 = L.Bidirectional(L.LSTM(**config['layer_blstm1']),
                                       name='layer_blstm1')
        layer_blstm2 = L.Bidirectional(L.LSTM(**config['layer_blstm2']),
                                       name='layer_blstm2')

        layer_dropout = L.Dropout(**config['layer_dropout'],
                                  name='layer_dropout')

        layer_time_distributed = L.TimeDistributed(L.Dense(output_dim,
                                                           **config['layer_time_distributed']),
                                                   name='layer_time_distributed')
        layer_activation = L.Activation(**config['layer_activation'])

        # Define tensor flow
        tensor = layer_blstm1(embed_model.output)
        tensor = layer_blstm2(tensor)
        tensor = layer_dropout(tensor)
        tensor = layer_time_distributed(tensor)
        output_tensor = layer_activation(tensor)

        # Init model
        self.tf_model = keras.Model(embed_model.inputs, output_tensor)

model = DoubleBLSTMModel()
model.fit(sentences_train, labels_train)

However, I get the following error message:

Preparing text vocab dict: 100%|██████████| 2418/2418 [00:00<00:00, 168482.88it/s]
2021-01-31 04:24:53,929 [DEBUG] kashgari - --- Build vocab dict finished, Total: 1695 ---
2021-01-31 04:24:53,930 [DEBUG] kashgari - Top-10: ['[PAD]', '[UNK]', '[CLS]', '[SEP]', '.', 'the', ',', 'and', 'a', 'to']
Preparing text vocab dict: 100%|██████████| 2418/2418 [00:00<00:00, 250094.37it/s]
2021-01-31 04:24:53,946 [DEBUG] kashgari - --- Build vocab dict finished, Total: 4 ---
2021-01-31 04:24:53,946 [DEBUG] kashgari - Top-10: ['[PAD]', 'O', 'B_A', 'I_A']
Calculating sequence length: 100%|██████████| 2418/2418 [00:00<00:00, 1627379.18it/s]
2021-01-31 04:25:36,860 [DEBUG] kashgari - Calculated sequence length = 42
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-36-928aa2c1f3d0> in <module>()
----> 1 model.fit(sentences_train, labels_train)

3 frames
<ipython-input-35-b8fc908f6695> in build_model_arc(self)
     41         Build the model architecture.
     42         """
---> 43         output_dim = len(self.processor.label2idx)
     44         config = self.hyper_parameters
     45         embed_model = self.embedding.embed_model

AttributeError: 'DoubleBLSTMModel' object has no attribute 'processor'

Environment

  • OS: macOS
  • Python Version: 3.6 (Google Colab)

Could you help me solve this problem, please?

Thanks in advance!

I think the documentation is from the legacy version as well?
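For what it's worth, the traceback points at a 1.x-era attribute: `self.processor.label2idx` existed on Kashgari 1.x models, while the 2.x labeling models appear to expose the label vocabulary through `self.label_processor.vocab_size` instead (the `label_processor` name is my reading of the 2.x source, so treat it as an assumption, not a confirmed fix). A stand-alone sketch of the rename, using the 4-entry label dict from the debug log above and stub classes in place of the real library:

```python
# Stub mimicking the Kashgari 1.x processor: exposes a label2idx dict.
class LegacyProcessor:
    def __init__(self, label2idx):
        self.label2idx = label2idx

# Stub mimicking the (assumed) Kashgari 2.x label processor:
# exposes the vocabulary size directly.
class LabelProcessor:
    def __init__(self, label2idx):
        self.vocab_size = len(label2idx)

# Label vocabulary from the debug log: ['[PAD]', 'O', 'B_A', 'I_A']
labels = {'[PAD]': 0, 'O': 1, 'B_A': 2, 'I_A': 3}

old_dim = len(LegacyProcessor(labels).label2idx)  # 1.x-style lookup
new_dim = LabelProcessor(labels).vocab_size       # 2.x-style lookup
print(old_dim, new_dim)  # both should give the same output_dim
```

If that assumption holds, the failing line in `build_model_arc` would become `output_dim = self.label_processor.vocab_size` under the 2.x API.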

stale commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.