keras fit_generator with multiple input layers


Try this generator:

def generator_two_img(X1, X2, y, batch_size):
    # gen is an ImageDataGenerator instance created beforehand;
    # the shared seed keeps both flows augmenting and shuffling in the same order
    genX1 = gen.flow(X1, y, batch_size = batch_size, seed = 1)
    genX2 = gen.flow(X2, y, batch_size = batch_size, seed = 1)
    while True:
        X1i = genX1.next()
        X2i = genX2.next()
        yield [X1i[0], X2i[0]], X1i[1]
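
A minimal usage sketch, assuming `gen` is an ImageDataGenerator created beforehand and `model` is an already compiled two-input model (the array names are illustrative):

batch_size = 32
model.fit_generator(
    generator_two_img(X1_train, X2_train, y_train, batch_size),
    steps_per_epoch = len(X1_train) // batch_size,
    epochs = 10)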

Generator for 3 inputs:

def generator_three_img(X1, X2, X3, y, batch_size):
    genX1 = gen.flow(X1, y, batch_size = batch_size, seed = 1)
    genX2 = gen.flow(X2, y, batch_size = batch_size, seed = 1)
    genX3 = gen.flow(X3, y, batch_size = batch_size, seed = 1)
    while True:
        X1i = genX1.next()
        X2i = genX2.next()
        X3i = genX3.next()
        yield [X1i[0], X2i[0], X3i[0]], X1i[1]

EDIT (a generator that yields an image, a numpy array of additional data, and the target)

# X1 is an image array, y is the target, X2 is a numpy array - other data input
def gen_flow_for_two_inputs(X1, X2, y):
    genX1 = gen.flow(X1, y, batch_size = batch_size, seed = 666)
    # X2 is passed in place of the labels so it is batched in the same shuffled order
    genX2 = gen.flow(X1, X2, batch_size = batch_size, seed = 666)
    while True:
        X1i = genX1.next()
        X2i = genX2.next()
        # Assert arrays are equal - this was for peace of mind, but slows down training
        # np.testing.assert_array_equal(X1i[0], X2i[0])
        yield [X1i[0], X2i[1]], X1i[1]

I have an implementation of multiple inputs for TimeseriesGenerator that I have adapted (I have not been able to test it, unfortunately) to this example with ImageDataGenerator. My approach was to build a wrapper class around the multiple generators by subclassing keras.utils.Sequence and then implementing its base methods __len__ and __getitem__:

from keras.preprocessing.image import ImageDataGenerator
from keras.utils import Sequence


class MultipleInputGenerator(Sequence):
    """Wrapper of 2 ImageDataGenerator"""

    def __init__(self, X1, X2, Y, batch_size):
        # Keras generator
        self.generator = ImageDataGenerator(rotation_range = 15,
            width_shift_range = 0.2,
            height_shift_range = 0.2,
            shear_range = 0.2,
            zoom_range = 0.2,
            horizontal_flip = True,
            fill_mode = 'nearest')

        # Real-time multiple-input data augmentation
        # (a shared seed keeps both flows shuffled in the same order)
        self.genX1 = self.generator.flow(X1, Y, batch_size = batch_size, seed = 42)
        self.genX2 = self.generator.flow(X2, Y, batch_size = batch_size, seed = 42)

    def __len__(self):
        """It is mandatory to implement it on Keras Sequence"""
        return self.genX1.__len__()

    def __getitem__(self, index):
        """Getting items from the 2 generators and packing them"""
        X1_batch, Y_batch = self.genX1.__getitem__(index)
        X2_batch, Y_batch = self.genX2.__getitem__(index)

        X_batch = [X1_batch, X2_batch]

        return X_batch, Y_batch
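
A minimal usage sketch of the wrapper, assuming a compiled two-input model `model` and arrays X1, X2, Y already in memory (the names are illustrative):

train_gen = MultipleInputGenerator(X1, X2, Y, batch_size = 32)

model.fit_generator(generator = train_gen,
    steps_per_epoch = len(train_gen),
    epochs = 10)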

Suggestion : 2


I am trying to implement a custom data generator for a model with 3 inputs and a single output that deals with textual data as follows:

# dummy model
input_1 = Input(shape = (None, ))
input_2 = Input(shape = (None, ))
input_3 = Input(shape = (None, ))
combined = Concatenate(axis = -1)([input_1, input_2, input_3])
...
dense_1 = Dense(10, activation = 'relu')(combined)
output_1 = Dense(1, activation = 'sigmoid')(dense_1)

model = Model([input_1, input_2, input_3], output_1)
print(model.summary())

#Compile and fit_generator
model.compile(optimizer = 'adam', loss = 'binary_crossentropy')

train_data_gen = Generator([x1_train, x2_train, x3_train], y_train, batch_size)
test_data_gen = Generator([x1_test, x2_test, x3_test], y_test, batch_size)

model.fit_generator(generator = train_data_gen, validation_data = test_data_gen, epochs = epochs, verbose = 1)

I found the data generator code here; I wonder how to modify it to accept multiple input tensors.

import math

import numpy as np
from keras.utils import Sequence


class Generator(Sequence):
    # Class is a dataset wrapper for better training performance
    def __init__(self, x_set, y_set, batch_size = 256):
        self.x, self.y = x_set, y_set
        self.batch_size = batch_size
        self.indices = np.arange(self.x.shape[0])

    def __len__(self):
        return math.floor(self.x.shape[0] / self.batch_size)

    def __getitem__(self, idx):
        inds = self.indices[idx * self.batch_size:(idx + 1) * self.batch_size]
        batch_x = self.x[inds]
        batch_y = self.y[inds]
        return batch_x, batch_y

    def on_epoch_end(self):
        np.random.shuffle(self.indices)

All you need to do is modify the Generator class as follows.

class Generator(Sequence):
    # Class is a dataset wrapper for better training performance
    def __init__(self, x_set, y_set, batch_size = 256):
        # x_set is now a list of arrays, one per model input
        self.x, self.y = x_set, y_set
        self.batch_size = batch_size
        self.indices = np.arange(self.x[0].shape[0])

    def __len__(self):
        return math.floor(self.x[0].shape[0] / self.batch_size)

    def __getitem__(self, idx):
        inds = self.indices[idx * self.batch_size:(idx + 1) * self.batch_size]
        batch_x = [self.x[0][inds], self.x[1][inds], self.x[2][inds]]
        batch_y = self.y[inds]
        return batch_x, batch_y

    def on_epoch_end(self):
        np.random.shuffle(self.indices)
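
If the number of inputs is not fixed at three, the indexing in __getitem__ can be generalized with a list comprehension; this is a sketch built on the answer above, not part of the original code:

    def __getitem__(self, idx):
        # Slice the same index window out of every input array so the inputs stay aligned
        inds = self.indices[idx * self.batch_size:(idx + 1) * self.batch_size]
        batch_x = [x[inds] for x in self.x]
        batch_y = self.y[inds]
        return batch_x, batch_y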

Suggestion : 3

This model will include all layers required in the computation of b given a. For evaluate and predict, x is a Numpy array of test data, or a list of Numpy arrays if the model has multiple inputs; if all inputs in the model are named, you can also pass a dictionary mapping input names to Numpy arrays. A ValueError is raised in case of mismatch between the provided input data and the model's expectations, or in case a stateful model receives a number of samples that is not a multiple of the batch size.

In the functional API, given some input tensor(s) and output tensor(s), you can instantiate a Model via:

from keras.models import Model
from keras.layers import Input, Dense

a = Input(shape = (32, ))
b = Dense(32)(a)
model = Model(inputs = a, outputs = b)

In the case of multi-input or multi-output models, you can use lists as well:

model = Model(inputs = [a1, a2], outputs = [b1, b2, b3])

compile

compile(self, optimizer, loss = None, metrics = None, loss_weights = None, sample_weight_mode = None, weighted_metrics = None, target_tensors = None)

evaluate

evaluate(self, x = None, y = None, batch_size = None, verbose = 1, sample_weight = None, steps = None)

predict

predict(self, x, batch_size = None, verbose = 0, steps = None)
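
For a multi-input model, x is passed as a list of Numpy arrays (or a dictionary keyed by input name). A minimal sketch, assuming a compiled two-input model `model` with input shapes (32,) and (128,); the data here is random and purely illustrative:

import numpy as np

x1_test = np.random.random((100, 32))
x2_test = np.random.random((100, 128))
y_test = np.random.random((100, 1))

loss = model.evaluate(x = [x1_test, x2_test], y = y_test, batch_size = 32)
preds = model.predict([x1_test, x2_test], batch_size = 32)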

Suggestion : 4


For example, we may define a simple sequential neural network as:

model = Sequential()
model.add(Dense(8, input_shape = (10, ), activation = "relu"))
model.add(Dense(4, activation = "relu"))
model.add(Dense(1, activation = "linear"))

We can define the sample neural network using the functional API:

inputs = Input(shape = (10, ))
x = Dense(8, activation = "relu")(inputs)
x = Dense(4, activation = "relu")(x)
x = Dense(1, activation = "linear")(x)
model = Model(inputs, x)

To see the power of Keras' functional API, consider the following code where we create a model that accepts multiple inputs:

# define two sets of inputs
inputA = Input(shape = (32, ))
inputB = Input(shape = (128, ))

# the first branch operates on the first input
x = Dense(8, activation = "relu")(inputA)
x = Dense(4, activation = "relu")(x)
x = Model(inputs = inputA, outputs = x)

# the second branch operates on the second input
y = Dense(64, activation = "relu")(inputB)
y = Dense(32, activation = "relu")(y)
y = Dense(4, activation = "relu")(y)
y = Model(inputs = inputB, outputs = y)

# combine the output of the two branches
combined = concatenate([x.output, y.output])

# apply a FC layer and then a regression prediction on the
# combined outputs
z = Dense(2, activation = "relu")(combined)
z = Dense(1, activation = "linear")(z)

# our model will accept the inputs of the two branches and
# then output a single value
model = Model(inputs = [x.input, y.input], outputs = z)
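
Training such a two-branch model is then a matter of passing one array per input, in the order the inputs appear in the Model constructor. A minimal sketch, assuming arrays trainA of shape (N, 32), trainB of shape (N, 128) and targets trainY already exist (the names are illustrative):

model.compile(optimizer = "adam", loss = "mean_squared_error")

model.fit([trainA, trainB], trainY,
    validation_split = 0.1,
    epochs = 10,
    batch_size = 8)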

And from there you can download the House Prices dataset via:

$ git clone https://github.com/emanhamed/Houses-dataset

Let’s take a look at how today’s project is organized:

$ tree --dirsfirst --filelimit 10
.
├── Houses-dataset
│   ├── Houses Dataset [2141 entries]
│   └── README.md
├── pyimagesearch
│   ├── __init__.py
│   ├── datasets.py
│   └── models.py
└── mixed_training.py

3 directories, 5 files

Suggestion : 5


Syntax:

fit(object, x = NULL, y = NULL, batch_size = NULL, epochs = 10,
   verbose = getOption("keras.fit_verbose",
      default = 1),
   callbacks = NULL, view_metrics = getOption("keras.view_metrics",
      default = "auto"), validation_split = 0, validation_data = NULL,
   shuffle = TRUE, class_weight = NULL, sample_weight = NULL,
   initial_epoch = 0, steps_per_epoch = NULL, validation_steps = NULL,
   ...)

Understanding a few important arguments:

-> object: the model to train.
-> x: our training data. Can be a vector, array or matrix.
-> y: our training labels. Can be a vector, array or matrix.
-> batch_size: any integer value or NULL; by default it is set to 32. It specifies the number of samples per gradient update.
-> epochs: an integer, the number of epochs we want to train the model for.
-> verbose: specifies the verbosity mode (0 = silent, 1 = progress bar, 2 = one line per epoch).
-> shuffle: whether we want to shuffle our training data before each epoch.
-> steps_per_epoch: the total number of steps taken before one epoch has finished and the next one starts. By default its value is set to NULL.

How to use Keras fit:

model.fit(Xtrain, Ytrain, batch_size = 32, epochs = 100)
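
With multiple input layers, the same call simply takes a list of arrays, one per input, in the order the inputs were defined; a minimal sketch with illustrative array names:

model.fit([X1train, X2train], Ytrain, batch_size = 32, epochs = 100)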