sksq96/pytorch-summary

AttributeError: 'BertForSequenceClassification' object has no attribute 'layers'

furkaano opened this issue · 0 comments

from tensorflow.keras.initializers import TruncatedNormal
from tensorflow.keras.layers import Input, Dropout, Dense
from tensorflow.keras.losses import BinaryCrossentropy
from tensorflow.keras.metrics import AUC
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam
from transformers import BertForSequenceClassification

transformer_model = BertForSequenceClassification.from_pretrained(
    MODEL_NAME,
    output_hidden_states=False,
)

bert = transformer_model.layers[0]  # this is the line that raises the AttributeError below

input_ids = Input(shape=(MAX_LENGTH,), 
                  name='input_ids', 
                  dtype='int32')
inputs = {'input_ids': input_ids}

bert_model = bert(inputs)[0][:, 0, :]  # take the [CLS] token representation from the last hidden state

dropout = Dropout(config.dropout, name='pooled_output')
pooled_output = dropout(bert_model, training=False)
output = Dense(
    units=train_labels.shape[1],
    kernel_initializer=TruncatedNormal(stddev=config.initializer_range), 
    activation="sigmoid",  # Choose a sigmoid for multi-label classification
    name='output'
)(pooled_output)

model = Model(inputs=inputs, 
              outputs=output, 
              name='BERT_MultiLabel')
model.summary()

I'm trying to do multi-label classification using BERT and TensorFlow, but I run into a problem when I run this code. The output shows:

Some weights of the model checkpoint at dbmdz/bert-base-turkish-128k-cased were not used when initializing BertForSequenceClassification: ['cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight']
- This IS expected if you are initializing BertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of BertForSequenceClassification were not initialized from the model checkpoint at dbmdz/bert-base-turkish-128k-cased and are newly initialized: ['classifier.weight', 'classifier.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Input In [63], in <module>
      6 from tensorflow.keras.optimizers import Adam
      8 transformer_model = BertForSequenceClassification.from_pretrained(
      9                                                                 MODEL_NAME, 
     10                                                                 output_hidden_states=False
     11                                                                 )
---> 13 bert = transformer_model.layers[0]
     16 input_ids = Input(shape=(MAX_LENGTH,), 
     17                   name='input_ids', 
     18                   dtype='int32')

File ~\AppData\Roaming\Python\Python39\site-packages\torch\nn\modules\module.py:1177, in Module.__getattr__(self, name)
   1175     if name in modules:
   1176         return modules[name]
-> 1177 raise AttributeError("'{}' object has no attribute '{}'".format(
   1178     type(self).__name__, name))

AttributeError: 'BertForSequenceClassification' object has no attribute 'layers'

What should I do at this point?
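
For context, I suspect the problem is that BertForSequenceClassification is the PyTorch class from transformers (a torch.nn.Module), which has no Keras .layers attribute; only the TF/Keras classes do. A minimal sketch of what I think the TensorFlow side would look like, assuming TFBertForSequenceClassification is the right class here and the checkpoint ships TF weights (otherwise from_pt=True might be needed):

from transformers import TFBertForSequenceClassification

# The TF class is a tf.keras.Model, so .layers is available.
# Assumption: the dbmdz checkpoint provides TF weights; if not, add from_pt=True.
transformer_model = TFBertForSequenceClassification.from_pretrained(
    MODEL_NAME,
    output_hidden_states=False,
)

# layers[0] should then be the underlying TFBertMainLayer.
bert = transformer_model.layers[0]

But I'm not sure whether this is the intended fix, or whether I should be loading TFBertModel directly instead.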