trainable_variables returns zero after calling the models through a method


I have multiple functional API CNN models. I define them inside a method and append them one by one to a list so I can access them after calling the method. Inside the method the models' trainable_variables are not zero, but when I call the method and store the returned list, the models' trainable_variables are zero.
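To make "not zero inside the method" concrete: while a model is still inside define_discriminator (shown next), a check like the one below reports a non-empty list of variables. The print call is only an illustration and is not part of my original code; model refers to the variable of the same name in the method below.

# hypothetical check inside define_discriminator, right after model = Model(in_image, out_class)
print(len(model.trainable_variables))   # reports a non-zero count at this point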

This is my method:

# ClipConstraint, MinibatchStdev and add_discriminator_block are custom helpers defined elsewhere in my script
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, LeakyReLU, Flatten, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.utils import plot_model

def define_discriminator(n_blocks, input_shape=(4,4,1)):
    # weight initialization
    init = tf.keras.initializers.HeNormal()
    # weight constraint
    # const = max_norm(1.0)
    const = ClipConstraint(0.1)
    model_list = list()
    # base model input
    in_image = Input(shape=input_shape)
    # conv 1x1
    conv1 = Conv2D(128, (1,1), padding='same', kernel_initializer=init, kernel_constraint=const)(in_image)
    lk1 = LeakyReLU(alpha=0.2)(conv1)
    # conv 3x3 (output block)
    mb = MinibatchStdev()(lk1)
    conv2 = Conv2D(128, (3,3), padding='same', kernel_initializer=init, kernel_constraint=const)(mb)
    lk2 = LeakyReLU(alpha=0.2)(conv2)
    # conv 4x4
    conv3 = Conv2D(128, (4,4), padding='same', kernel_initializer=init, kernel_constraint=const)(lk2)
    lk3 = LeakyReLU(alpha=0.2)(conv3)
    # dense output layer
    flat = Flatten()(lk3)
    out_class = Dense(1)(flat)
    # define model
    model = Model(in_image, out_class)
    # print(model.summary())
    # plot graph
    plot_model(model, to_file="multilayer_perceptron_graph.png")
    # compile model
    # model.compile(loss=wasserstein_loss, optimizer=RMSprop(lr=0.000002, epsilon=10e-8))
    # store the base model twice: [straight-through model, fade-in model]
    model_list.append([model, model])
    # create submodels for the higher resolutions
    for i in range(1, n_blocks):
        # get prior model without the fade-in
        old_model = model_list[i - 1][0]
        # old_model.summary()
        # create the pair of models for the next resolution
        models = add_discriminator_block(old_model)
        # print(models[0].summary())
        # print(models[1].summary())
        # store model
        model_list.append(models)
    return model_list

model_list contains 6 CNN models. When I print model.summary() for each of them there is no problem, but after I call the method, the models' trainable_variables come back as zero (empty):

d_models = define_discriminator(6)
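When I then inspect the returned list, every model reports zero trainable variables. The loop below is only an illustration of the check (straight_model and fadein_model are names I use here, not variables from my code); the unpacking assumes each entry of d_models is a two-element [straight, fade-in] pair, which is how the list is built in define_discriminator above.

# hypothetical check on the returned list
for straight_model, fadein_model in d_models:
    print(len(straight_model.trainable_variables),
          len(fadein_model.trainable_variables))   # unexpectedly reports 0 for every model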