
Is it possible, using Keras, to create hidden layers with different activation functions that are both connected to the input layer but not to each other?

For example, a hidden layer with 10 neurons where 5 have a ReLU activation and the other 5 have a sigmoid activation. I want to build a slab-architecture neural network.


1 Answer


You can create two separate Dense layers that both take the same tensor as input; that's the simplest way of doing it.

Separate layers:

from keras.layers import Input, Dense, Concatenate
from keras.models import Model

#model's input and the basic functional-API syntax for creating layers

inputTensor = Input(some_shape)
outputTensor = SomeLayer(some_params)(inputTensor)
outputTensor = AnotherLayer(other_params)(outputTensor)


#keep creating layers like the previous ones
#when you reach the point where you want to split:

out1 = Dense(5, activation='relu')(outputTensor)
out2 = Dense(5, activation='sigmoid')(outputTensor)


#you may concatenate the two branches back into a single tensor:
outputTensor = Concatenate()([out1, out2])


#keep creating more layers....


#create the model
model = Model(inputTensor, outputTensor)
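
For the exact setup in the question (a 10-neuron hidden layer split into 5 ReLU and 5 sigmoid neurons, with both halves connected only to the input layer), a minimal runnable sketch could look like the following; the input width of 20 and the single sigmoid output unit are assumptions for illustration, not taken from the question:

from keras.layers import Input, Dense, Concatenate
from keras.models import Model

#assumed input width of 20 features (illustration only)
inputTensor = Input(shape=(20,))

#two 5-neuron slabs, each connected directly to the input and not to each other
relu_slab = Dense(5, activation='relu')(inputTensor)
sigmoid_slab = Dense(5, activation='sigmoid')(inputTensor)

#concatenating them gives the 10-neuron "hidden layer" from the question
hidden = Concatenate()([relu_slab, sigmoid_slab])

#assumed single-unit output for a binary task (illustration only)
outputTensor = Dense(1, activation='sigmoid')(hidden)

model = Model(inputTensor, outputTensor)
model.summary()

Since both Dense layers are applied to inputTensor directly, neither slab receives the other's output; Concatenate only stacks their activations into a single 10-unit tensor for the layers that follow.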