Error "TypeError: __init__() missing 1 required positional argument: 'output_dim'" when instantiating a convolutional neural network object

I took a course on neural networks applied to natural language processing, and now I'm trying to instantiate an object based on the class we built in the course (with some modifications, because my data is different).

The goal is to classify three sentiments, all based on text.

When I instantiate the class below, I get the following error:

TypeError: __init__() missing 1 required positional argument: 'output_dim'

I didn't see output_dim in the Keras documentation for the class I inherit from, nor did I put this parameter in my constructor method __init__. Could someone help me with this?

import tensorflow as tf
from tensorflow.keras import layers

class DCNN(tf.keras.Model):
  def __init__(self, 
               vocab_size,
               emb_dim= 128,
               nb_filters = 50,
               ffn_units=512,
               nb_classes = 3,
               dropout_rate = 0.1,
               training = False,
               name = 'dcnn'):
    super(DCNN, self).__init__(name = name)

    self.embedding = layers.Embedding(vocab_size, emb_dim = 128)
    self.bigram = layers.Conv1D(filters = nb_filters, kernel_size=2, 
                                   padding = 'same', 
                                   activation = 'relu')
    self.trigram = layers.Conv1D(filters = nb_filters, kernel_size=3, 
                                   padding = 'same', 
                                   activation = 'relu')
    self.fourgram = layers.Conv1D(filters = nb_filters, kernel_size=4, 
                                   padding = 'same', 
                                   activation = 'relu')

    self.pool = layers.GlobalMaxPool1D()

    self.dense_1 = layers.Dense(units = ffn_units, activation= 'relu')

    self.dropout = layers.Dropout(rate = dropout_rate)

    self.last_dense = layers.Dense(units = nb_classes, activation= 'softmax')

  def call(self, inputs, training):
    x = self.embedding(inputs)  # start of the forward pass: embedding
    x_1 = self.bigram(x)
    x_1 = self.pool(x_1)
    x_2 = self.trigram(x)
    x_2 = self.pool(x_2)
    x_3 = self.fourgram(x)
    x_3 = self.pool(x_3)

    merged = tf.concat([x_1, x_2, x_3], axis = -1)  # concatenates the pooling outputs
    merged = self.dense_1(merged)  # feeds the concatenation into the dense layer
    merged = self.dropout(merged, training)  # zeroes out a fraction of the neurons
    output = self.last_dense(merged)  # connects the dropout layer to the output layer

    return output 

# instantiation

dcnn = DCNN(vocab_size = 3276, emb_dim = 100, nb_filters = 100,
            ffn_units = 256, nb_classes = 3, 
            dropout_rate = 0.2)

TypeError: __init__() missing 1 required positional argument: 'output_dim'

  • Welcome to the site. I suggest posting the complete error; as it stands, it's hard to know which part of the code generates it. I also suggest naming the tools you're using: where does layers come from? From tensorflow.keras import layers?

  • I believe the error is in the call to layers.Embedding. See here

  • Thank you very much, Paulo. I changed emb_dim to output_dim. Another user had already pointed this out to me. Thanks.

1 answer



The error points to this line:

--->  self.embedding = layers.Embedding(vocab_size, emb_dim = 128)

Checking the Keras documentation on the Embedding layer, we can see that it takes two mandatory parameters: input_dim and output_dim. Your code passes only input_dim (as vocab_size), but not output_dim. That's why you get the error

Missing 1 required positional argument: 'output_dim'

Also per the documentation, there is no parameter called emb_dim, which leads me to believe you renamed output_dim to emb_dim, and that the correct call would be:

self.embedding = layers.Embedding(input_dim=vocab_size, output_dim=128)
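The mechanism behind the message is plain Python rather than anything Keras-specific: a keyword argument with an unrecognized name cannot fill a required positional parameter. A minimal sketch using a hypothetical stand-in (embedding_stub, not the real layer) with the same two required parameters as layers.Embedding:

```python
# Hypothetical stand-in with the same required parameters as layers.Embedding.
def embedding_stub(input_dim, output_dim, **kwargs):
    return (input_dim, output_dim)

# Reproduces the asker's call: emb_dim is swallowed by **kwargs,
# so output_dim is never supplied and Python raises a TypeError.
try:
    embedding_stub(3276, emb_dim=128)
except TypeError as err:
    print(err)  # ... missing 1 required positional argument: 'output_dim'

# The fix: supply output_dim explicitly.
print(embedding_stub(input_dim=3276, output_dim=128))  # (3276, 128)
```

Note too that in the DCNN constructor, writing output_dim=emb_dim instead of hard-coding 128 would also let the emb_dim=100 passed at instantiation actually take effect.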

  • You nailed it! hehehe. Thank you, that's right. That was real surgery on the code. I had already tried everything; it worked with your adjustment.
