I'm working in Keras (TensorFlow 2). I'd like to multiply each element of a tensor by its own trainable weight. Say my input tensor is 1D with 10 elements; I define the input as a Keras Input tensor, the weights as a tf.Variable, and combine them with the Keras Multiply layer, like so:

import tensorflow as tf
inputs = tf.keras.layers.Input(shape=(10,), name='inputs')     # batch dimension is implicit (None)
weights = tf.Variable(tf.random.normal([10]), name='weights')  # one trainable weight per element
outputs = tf.keras.layers.Multiply()([inputs, weights])

Now, when I inspect the shapes, they are:

inputs: shape=(None, 10)
weights: shape=(10,)
outputs: shape=(10, 10)

The inputs tensor has a None dimension for the batch size, which is what I expect and want. However, I expected outputs to have shape=(None, 10); instead, the batch dimension seems to have taken a fixed size of 10. How can I correct this?

1 Answer

You need weights to broadcast along dimension 0, the batch dimension. For that to happen, the dimension must be present explicitly and have size 1.

That is, weights must have the shape (1, 10), not (10,).

This can be done using:

weights = tf.Variable(tf.random.normal([1, 10]), name='weights')

or

weights = tf.Variable(tf.random.normal([10]), name='weights')
...
weights = tf.expand_dims(weights, axis=0)  # shape becomes (1, 10)
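
Putting it together, here is a minimal sketch of the fixed graph, assuming the same TF 2 setup as in the question; the shape check at the end is only for illustration:

import tensorflow as tf

# weights carries an explicit leading dimension of size 1,
# so it broadcasts across the batch dimension of inputs
inputs = tf.keras.layers.Input(shape=(10,), name='inputs')
weights = tf.Variable(tf.random.normal([1, 10]), name='weights')
outputs = tf.keras.layers.Multiply()([inputs, weights])

print(outputs.shape)  # expected: (None, 10)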