adam

Adam

QQuantLib.qml4var.adam.adam_optimizer_loop(weights_dict, loss_function, metric_function, gradient_function, batch_generator, initial_time=0, **kwargs)
Parameters:
  • weights_dict (dict) – dictionary with the weights to fit

  • loss_function (function) – function for computing the loss function

  • metric_function (function) – function for computing the metric function

  • gradient_function (function) – function for computing the gradient of the loss function

  • batch_generator (function) – function for generating batches of the training data

  • initial_time (int) – Initial time step

  • kwargs (keyword arguments) – arguments for configuring the optimizer. For ADAM:

  • store_folder (kwargs, str) – Folder for saving results. If None, results are not saved

  • epochs (kwargs, int) – Maximum number of iterations

  • tolerance (kwargs, float) – Tolerance to achieve

  • n_counts_tolerance (kwargs, int) – Number of consecutive iterations in which the tolerance must be achieved before stopping

  • print_step (kwargs, int) – Number of iterations between prints of the training evolution

  • learning_rate (kwargs, float) – Learning rate for ADAM

  • beta1 (kwargs, float) – beta1 for ADAM

  • beta2 (kwargs, float) – beta2 for ADAM
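The stopping behaviour described by the epochs, tolerance, n_counts_tolerance and print_step keywords can be illustrated with a minimal, self-contained sketch. This is plain gradient descent with the documented stopping logic, not the QQuantLib implementation; the function name toy_optimizer_loop and the toy quadratic problem are invented for illustration.

```python
# Illustrative sketch only: a minimal gradient-descent loop mimicking the
# stopping logic documented above (epochs, tolerance, n_counts_tolerance,
# print_step). It is NOT the QQuantLib implementation.

def toy_optimizer_loop(weights, loss_function, gradient_function,
                       epochs=100, tolerance=1e-6, n_counts_tolerance=3,
                       print_step=10, learning_rate=0.1):
    """Iterate until the loss improvement stays below `tolerance`
    for `n_counts_tolerance` consecutive iterations (or `epochs` runs out)."""
    previous_loss = loss_function(weights)
    counts = 0
    for t in range(epochs):
        grads = gradient_function(weights)
        weights = {k: w - learning_rate * grads[k] for k, w in weights.items()}
        loss = loss_function(weights)
        if t % print_step == 0:
            print(f"iteration {t}: loss = {loss:.6f}")
        # Count consecutive iterations with improvement below the tolerance
        counts = counts + 1 if abs(previous_loss - loss) < tolerance else 0
        if counts >= n_counts_tolerance:
            break
        previous_loss = loss
    return weights

# Toy quadratic problem with minimum at w = 3
weights = toy_optimizer_loop(
    {"w": 0.0},
    loss_function=lambda p: (p["w"] - 3.0) ** 2,
    gradient_function=lambda p: {"w": 2.0 * (p["w"] - 3.0)},
)
```

In the real adam_optimizer_loop the plain gradient step would be replaced by the ADAM update, and metric_function / batch_generator would supply the validation metric and the mini-batches.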

QQuantLib.qml4var.adam.initialize_adam(parameters)

Initialize the ADAM moment estimates for the given parameters

QQuantLib.qml4var.adam.save_stuff(weights, weights_names, t_, loss_, metric_mse_=None, file_to_save=None)

Save training artifacts (the weights, the time step, the loss and, optionally, the MSE metric) to file

QQuantLib.qml4var.adam.update_parameters_with_adam(x, grads, s, v, t, learning_rate=0.01, beta1=0.9, beta2=0.999, epsilon=1e-08)

Perform one ADAM update step on the parameters, given the gradients and the moment estimates
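For reference, the standard ADAM rule that initialize_adam and update_parameters_with_adam expose can be sketched as follows. The argument names (x, grads, s, v, t) mirror the signatures above, but the bodies are assumptions based on the textbook algorithm, not the library source; here s is taken as the first-moment and v as the second-moment estimate, and the library may use the opposite convention.

```python
# Illustrative sketch of the textbook ADAM update, not the QQuantLib source.

def initialize_adam_sketch(parameters):
    """Both moment estimates start at zero, one entry per parameter."""
    s = {key: 0.0 for key in parameters}
    v = {key: 0.0 for key in parameters}
    return s, v

def adam_update_sketch(x, grads, s, v, t, learning_rate=0.01,
                       beta1=0.9, beta2=0.999, epsilon=1e-08):
    """One bias-corrected ADAM step at time step t (t >= 1)."""
    new_x = {}
    for key in x:
        # Exponential moving averages of the gradient and its square
        s[key] = beta1 * s[key] + (1.0 - beta1) * grads[key]
        v[key] = beta2 * v[key] + (1.0 - beta2) * grads[key] ** 2
        # Bias correction compensates for the zero initialization
        s_hat = s[key] / (1.0 - beta1 ** t)
        v_hat = v[key] / (1.0 - beta2 ** t)
        new_x[key] = x[key] - learning_rate * s_hat / (v_hat ** 0.5 + epsilon)
    return new_x, s, v

x = {"w": 1.0}
s, v = initialize_adam_sketch(x)
x, s, v = adam_update_sketch(x, {"w": 4.0}, s, v, t=1, learning_rate=0.1)
```

At t = 1 the bias correction makes the first step size close to learning_rate regardless of the gradient magnitude, which is why ADAM is less sensitive to gradient scaling than plain gradient descent.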