
I need to store the concatenation values for offline use in my model.

I need to save, load, and loop through the CNN concatenation features.

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers


class DCNN(tf.keras.Model):
    def __init__(self, nb_filters=50, FFN_units=512, nb_classes=2, dropout_rate=0.1, name="dncc"):
        super(DCNN, self).__init__(name=name)

        self.bert_layer = hub.KerasLayer(
            "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/1",
            trainable=False)

        self.feature_size = nb_filters * len([2, 3, 4])
        self.num_filters_total = nb_filters * len([2, 3, 4])

        # self.features_before = tf.placeholder(tf.float32, [None, 3, self.feature_size], name="features_before")
        self.features_before = []  # K.placeholder(shape=(None, 3, self.feature_size), name="features_before")

        self.bigram = layers.Conv1D(filters=nb_filters,
                                    kernel_size=2,
                                    padding='valid',
                                    activation='relu')

        self.trigram = layers.Conv1D(filters=nb_filters,
                                     kernel_size=3,
                                     padding='valid',
                                     activation='relu')

        self.fourgram = layers.Conv1D(filters=nb_filters,
                                      kernel_size=5,
                                      padding='valid',
                                      activation='relu')

        self.pool = layers.GlobalMaxPooling1D()

        self.dense1 = layers.Dense(units=FFN_units, activation='relu')

        self.dropout = layers.Dropout(rate=dropout_rate)

        if nb_classes == 2:
            self.last_dense = layers.Dense(units=1, activation='sigmoid')
        else:
            self.last_dense = layers.Dense(units=nb_classes, activation='softmax')

    def embed_with_bert(self, all_tokens):
        # first axis: sentences; second axis: token inputs -> ids: 0, masks: 1, segments: 2
        _, embds = self.bert_layer([all_tokens[:, 0, :],
                                    all_tokens[:, 1, :],
                                    all_tokens[:, 2, :]])
        return embds

    def call(self, inputs):
        x = self.embed_with_bert(inputs)

        x_1 = self.bigram(x)
        x_1 = self.pool(x_1)  # dim = batch_size x nb_filters

        x_2 = self.trigram(x)
        x_2 = self.pool(x_2)  # dim = batch_size x nb_filters

        x_3 = self.fourgram(x)
        x_3 = self.pool(x_3)  # dim = batch_size x nb_filters

        merged = tf.concat([x_1, x_2, x_3], axis=1)  # batch_size x 3*nb_filters = batch_size x 150

        h_pool_flat = tf.reshape(merged, [-1, self.num_filters_total])

        # features_before: list of 3D tensors [batch_size, timestep_size, feature_size]
        t = tf.math.log(tf.expand_dims(h_pool_flat, axis=1))
        self.features_before.append(t)

        merged = self.dense1(merged)
        merged = self.dropout(merged)
        output = self.last_dense(merged)

        return output

    def inference(self):
        return ft.stack(self.features_before)

I tried this: Making a list and appending to it in TensorFlow

but I get the following error:

ValueError: Tensor("dncc/Log:0", shape=(None, 1, 96), dtype=float32) must be from the same graph as Tensor("dncc/Log:0", shape=(None, 1, 96), dtype=float32).

What should I do to fix this error?

Can you please clarify your requirement? Do you want to create a list and keep adding values to it in every iteration, or do you want to save a variable to disk? - Tensorflow Warrior

I want to do both: after every epoch I want to store the values, and then I want to create a file from the stacked variables returned by inference. - Rabab Alkhalifa

Can you change ft.stack to tf.stack, and can you please share the complete code or a reproducible example for the error? - Tensorflow Warrior

Here is the code: github.com/Rababalkhalifa/BERT. In bert_utils.py you can see the other (commented-out) model that I am trying to rewrite. - Rabab Alkhalifa

There seems to be an indentation error when the code was copied to Colab: File "<ipython-input-1-4f3571338aef>", line 106, super(DCNN, self).__init__(name=name) ^ IndentationError: expected an indented block. Can you please create a Google Colab file with the code and the error and share the link? - Tensorflow Warrior

1 Answer


You can use the callbacks functionality of model.fit(). A custom callback is a powerful tool for customizing the behavior of a Keras model during training, evaluation, or inference, including reading or changing the Keras model.
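
For orientation, here is a minimal skeleton of a custom callback; the hook names come from the tf.keras.callbacks.Callback API, and the bodies are just placeholders you fill in:

import tensorflow as tf

# Minimal custom-callback skeleton: override only the hooks you need.
class MyCallback(tf.keras.callbacks.Callback):
    def on_epoch_begin(self, epoch, logs=None):
        pass  # runs before each epoch; self.model is the model being trained

    def on_epoch_end(self, epoch, logs=None):
        pass  # runs after each epoch; logs contains that epoch's metrics

    def on_train_end(self, logs=None):
        pass  # runs once when model.fit() finishes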

In the program below, I have created a simple model and capture the weights of model.layers[2] before every epoch begins. I have created a list called my_list and use the on_epoch_begin hook of a custom callback to append that layer's current weights to the list at the start of each epoch. At the end, the list is converted to an ndarray for simplicity.

Note: You can download the dataset I am using in the program from here.

Code -

%tensorflow_version 1.x
# MLP for Pima Indians Dataset saved to single file
import numpy as np
import tensorflow as tf
print(tf.__version__)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import model_from_json

# load pima indians dataset
dataset = np.loadtxt("/content/pima-indians-diabetes.csv", delimiter=",")

# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]

# define model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Model Summary
model.summary()

my_list = []

# Define the Required Callback Function
class ListAppend(tf.keras.callbacks.Callback):
    def on_epoch_begin(self, epoch, logs={}):
      weights = model.layers[2].get_weights()[0]
      my_list.append(weights)

listappend = ListAppend() 

# Fit the model
model.fit(X, Y, epochs=150, batch_size=10, verbose=0, callbacks = [listappend])

# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# Convert the list of per-epoch weights to an ndarray of shape (epochs, input_dim, units)
my_list = np.asarray(my_list)
print("my_list Array has the shape:",my_list.shape)

Output -

1.15.2
Model: "sequential_8"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_24 (Dense)             (None, 12)                108       
_________________________________________________________________
dense_25 (Dense)             (None, 8)                 104       
_________________________________________________________________
dense_26 (Dense)             (None, 1)                 9         
=================================================================
Total params: 221
Trainable params: 221
Non-trainable params: 0
_________________________________________________________________
acc: 78.26%
my_list Array has the shape: (150, 8, 1)
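
Since your actual goal is to store the concatenated CNN features rather than layer weights, the same callback pattern can be pointed at activations instead. The sketch below reuses the toy Pima model from above; FeatureCapture and feature_list are hypothetical names, and it assumes you can build a feature-extractor sub-model that outputs the intermediate layer you care about:

feature_list = []

# Sub-model that outputs the activations of the hidden Dense(8) layer.
feature_extractor = tf.keras.Model(inputs=model.input,
                                   outputs=model.layers[1].output)

class FeatureCapture(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        # Run a forward pass over the data and store this epoch's activations.
        feature_list.append(feature_extractor.predict(X))

# Continue training for a few epochs with the new callback attached.
model.fit(X, Y, epochs=5, batch_size=10, verbose=0, callbacks=[FeatureCapture()])
print(np.asarray(feature_list).shape)  # (5, number_of_samples, 8)

Note that slicing out a sub-model like this relies on a functional/Sequential layer graph; a subclassed model such as your DCNN has no such graph, so the usual workaround is to have call() (or a dedicated method) also return the concatenated tensor and evaluate that on a batch inside the callback, rather than appending symbolic tensors to a Python list during call().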

You can refer to this official TensorFlow link to learn more about the different methods available in tf.keras.callbacks.Callback, and to this official TensorFlow link for an example of Keras custom callbacks.
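
Finally, to keep the captured values for offline use, you can write the accumulated array to disk with NumPy and read it back later. A small sketch (the file name features_before.npy is just an example):

# Save the per-epoch values to disk, then reload and loop through them offline.
np.save("features_before.npy", np.asarray(my_list))

restored = np.load("features_before.npy")
for epoch_idx, epoch_values in enumerate(restored):
    print(epoch_idx, epoch_values.shape)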

Hope this answers your question. Happy Learning.