myqlm_workflows

This module contains the myQLM workflows required for the different PQC evaluations needed during a training process.

QQuantLib.qml4var.myqlm_workflows.cdf_workflow(weights, x_sample, **kwargs)

This function builds the stack required for evaluating a PQC (provided in the kwargs) for the given weights and x_sample, and executes it using the stack_execution function, thus computing the corresponding CDF value for the given weights and x_sample.

Parameters:
  • weights (np array) – array with the weights of the PQC

  • x_sample (np array) – input sample to provide to the PQC

  • kwargs (keyword arguments) – same kwargs as the ones provided to the stack_execution function

  • qpu_info (kwargs, dictionary) – Python dictionary with the information for configuring a QPU

Returns:

results – Value of the CDF, computed using the input PQC, for the given weights and x_sample.

Return type:

float
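
As a minimal sketch (not taken from the library's own examples), the snippet below evaluates cdf_workflow on a toy one-qubit PQC built with myQLM. The parameter names weight_0 and feature_0 are chosen for illustration, and the content of the qpu_info dictionary is a placeholder whose exact schema depends on the QPU configuration of your installation:

    import numpy as np
    from qat.lang.AQASM import Program, RY
    from qat.core import Observable, Term
    from QQuantLib.qml4var.myqlm_workflows import cdf_workflow

    # Toy PQC: one qubit, one weight parameter and one feature parameter.
    prog = Program()
    qbits = prog.qalloc(1)
    weight = prog.new_var(float, "weight_0")
    feature = prog.new_var(float, "feature_0")
    prog.apply(RY(weight * feature), qbits[0])

    # Observable measured on the PQC (a single Z term here).
    observable = Observable(1, pauli_terms=[Term(1.0, "Z", [0])])

    kwargs = {
        "pqc": prog,                      # QLM Program implementing the PQC
        "observable": observable,         # QLM Observable
        "weights_names": ["weight_0"],    # PQC parameters acting as weights
        "features_names": ["feature_0"],  # PQC parameters acting as features
        "nbshots": 0,                     # number of shots
        "qpu_info": {"qpu_type": "c"},    # placeholder: schema depends on your QPU setup
    }

    weights = np.array([0.3])
    x_sample = np.array([0.5])
    cdf_value = cdf_workflow(weights, x_sample, **kwargs)  # float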

QQuantLib.qml4var.myqlm_workflows.mse_workflow(weights, data_x, data_y, dask_client=None, **kwargs)

Computes the MSE (mean squared error) function.

Parameters:

Same parameters as the workflow_for_qdml function.

Returns:

mse_ – Computed MSE value for the input data.

Return type:

float
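
A hedged sketch of how mse_workflow might serve as the objective of a classical optimizer; the use of scipy.optimize and the toy dataset are illustrative assumptions, and weights and kwargs are those of the cdf_workflow sketch above:

    import numpy as np
    from scipy.optimize import minimize
    from QQuantLib.qml4var.myqlm_workflows import mse_workflow

    # Toy dataset: features and empirical-CDF style targets in [0, 1].
    data_x = np.linspace(-1.0, 1.0, 10).reshape(-1, 1)
    data_y = np.linspace(0.0, 1.0, 10).reshape(-1, 1)

    # kwargs and weights as defined in the cdf_workflow sketch above.
    objective = lambda w: mse_workflow(w, data_x, data_y, **kwargs)
    result = minimize(objective, x0=weights, method="COBYLA")
    print(result.x, result.fun)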

QQuantLib.qml4var.myqlm_workflows.pdf_workflow(weights, x_sample, **kwargs)

Given a PQC that computes the CDF (provided in the kwargs), this function builds the stack required for computing the corresponding PDF for the given weights and x_sample, and executes it using the stack_execution function, thus computing the corresponding PDF value for the given weights and x_sample.

Parameters:
  • weights (np array) – array with the weights of the PQC

  • x_sample (np array) – input sample to provide to the PQC

  • kwargs (keyword arguments) – See the cdf_workflow documentation

Returns:

results – Value of the PDF, computed using the input PQC, for the given weights and x_sample.

Return type:

float
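
Since the PDF is the derivative of the CDF with respect to the features, a quick sanity check (a sketch, reusing weights, x_sample and kwargs from the cdf_workflow example above) is to compare pdf_workflow against a finite difference of cdf_workflow:

    from QQuantLib.qml4var.myqlm_workflows import cdf_workflow, pdf_workflow

    # weights, x_sample and kwargs as defined in the cdf_workflow sketch above.
    pdf_value = pdf_workflow(weights, x_sample, **kwargs)

    # The PDF should approximate the finite-difference derivative of the CDF.
    eps = 1.0e-3
    finite_diff = (
        cdf_workflow(weights, x_sample + eps, **kwargs)
        - cdf_workflow(weights, x_sample - eps, **kwargs)
    ) / (2.0 * eps)
    print(pdf_value, finite_diff)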

QQuantLib.qml4var.myqlm_workflows.qdml_loss_workflow(weights, data_x, data_y, dask_client=None, **kwargs)

Workflow for computing the qdml loss function.

Parameters:

Same parameters as the workflow_for_qdml function.

Returns:

loss_ – Computed loss function value for the input data.

Return type:

float
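
A sketch of a loss evaluation on a toy dataset; the minval/maxval/points entries configure the feature-domain discretization described under workflow_for_qdml, and weights and kwargs are those of the cdf_workflow sketch above:

    import numpy as np
    from QQuantLib.qml4var.myqlm_workflows import qdml_loss_workflow

    # kwargs and weights as defined in the cdf_workflow sketch above.
    loss_kwargs = dict(kwargs)
    loss_kwargs.update({"minval": [-1.0], "maxval": [1.0], "points": 100})

    data_x = np.linspace(-1.0, 1.0, 10).reshape(-1, 1)
    data_y = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
    loss = qdml_loss_workflow(weights, data_x, data_y, **loss_kwargs)  # float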

QQuantLib.qml4var.myqlm_workflows.qdml_loss_workflow_old(weights, data_x, data_y, dask_client=None, **kwargs)

Workflow for computing the qdml loss function.

Parameters:

Same parameters as the workflow_for_qdml function.

Returns:

loss_ – Computed loss function value for the input data.

Return type:

float

QQuantLib.qml4var.myqlm_workflows.stack_execution(weights, x_sample, stack, **kwargs)

Given a stack, the weights, an input sample, and a PQC and Observable (provided in the keyword arguments), this function builds the corresponding QLM Batch of jobs and executes it.

Parameters:
  • weights (np array) – array with the weights of the PQC

  • x_sample (np array) – input sample to provide to the PQC

  • kwargs (keyword arguments)

  • pqc (kwargs, QLM Program) – qlm program with the implementation of the PQC

  • observable (kwargs, QLM Observable) – qlm observable with the Observable definition of the PQC

  • weights_names (kwargs, list) – list with the names of the parameters of the PQC corresponding to the weights

  • features_names (kwargs, list) – list with the names of the parameters of the PQC corresponding to the input features

  • nbshots (kwargs, int) – number of shots

Returns:

results – QLM BatchResult with the results of the execution of the stack

Return type:

QLM BatchResult
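
A sketch of a direct call to stack_execution, assuming the stack is simply myQLM's default simulator and reusing the toy PQC and Observable from the cdf_workflow example above:

    import numpy as np
    from qat.qpus import get_default_qpu
    from QQuantLib.qml4var.myqlm_workflows import stack_execution

    stack = get_default_qpu()   # any myQLM stack/QPU exposing submit()

    # prog and observable as defined in the cdf_workflow sketch above.
    batch_result = stack_execution(
        np.array([0.3]), np.array([0.5]), stack,
        pqc=prog,
        observable=observable,
        weights_names=["weight_0"],
        features_names=["feature_0"],
        nbshots=0,
    )
    for result in batch_result:
        print(result.value)      # expectation value of the observable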

QQuantLib.qml4var.myqlm_workflows.workflow_execution(weights, data_x, workflow, dask_client=None)

Given input weights, a complete dataset of features, and a properly configured workflow function (like cdf_workflow or pdf_workflow), this function executes the workflow for all the samples of the dataset. When a Dask client is provided, the computation is distributed over the Dask cluster using map.

Parameters:
  • weights (np array) – array with the weights of the PQC

  • data_x (np array) – array with the dataset of input features

  • workflow (function) – properly configured workflow function (like cdf_workflow or pdf_workflow) to execute for each sample

  • dask_client (dask client) – Dask client for speeding up the computations

Returns:

y_data – Results of the workflow (or Dask futures) for all the samples of the input dataset; see the note below.

Return type:

list

Note

The return of the function depends on dask_client. If dask_client is None, a list with the results of the workflow for the whole input dataset is returned. If a dask_client is passed, a list of futures is returned and a gather operation should be executed to retrieve the data.
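
A sketch of mapping a configured workflow over a dataset; binding the keyword arguments with functools.partial is an assumption about how the workflow argument is prepared (weights and kwargs come from the cdf_workflow example above):

    from functools import partial
    import numpy as np
    from QQuantLib.qml4var.myqlm_workflows import cdf_workflow, workflow_execution

    # kwargs and weights as defined in the cdf_workflow sketch above.
    configured_cdf = partial(cdf_workflow, **kwargs)
    data_x = np.linspace(-1.0, 1.0, 10).reshape(-1, 1)

    # Without a Dask client: a plain list of CDF values.
    y_data = workflow_execution(weights, data_x, configured_cdf)

    # With a Dask client: a list of futures that must be gathered.
    # from dask.distributed import Client
    # client = Client()
    # futures = workflow_execution(weights, data_x, configured_cdf, dask_client=client)
    # y_data = client.gather(futures)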

QQuantLib.qml4var.myqlm_workflows.workflow_execution_submit(weights, data_x, workflow, dask_client=None)

Given input weights, a complete dataset of features, and a properly configured workflow function (like cdf_workflow or pdf_workflow), this function executes the workflow for all the samples of the dataset.

Parameters:
  • weights (np array) – array with the weights of the PQC

  • data_x (np array) – array with the dataset of input features

  • workflow (function) – properly configured workflow function (like cdf_workflow or pdf_workflow) to execute for each sample

  • dask_client (dask client) – Dask client for speeding up the computations

Returns:

y_data – Depends on dask_client. If dask_client is None, a list with the results of the workflow for the whole input dataset is returned. If a dask_client is passed, a list of futures is returned and a gather operation should be executed to retrieve the data.

QQuantLib.qml4var.myqlm_workflows.workflow_for_cdf(weights, data_x, dask_client=None, **kwargs)

Workflow for computing the CDF for the input data_x.

Parameters:
  • weights (numpy array) – Array with weights for PQC

  • data_x (numpy array) – Array with dataset of the features

  • dask_client (Dask client) – Dask client for speeding up training. Not mandatory.

  • kwargs (keyword arguments.) – See cdf_workflow function documentation.

Returns:

output_dict – Dictionary with the computed arrays. Keys:
  • data_y : input data_y
  • y_predict_cdf : CDF prediction for data_x

Return type:

dict
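
A sketch of CDF predictions for a whole dataset, reusing weights and the kwargs dictionary from the cdf_workflow example above:

    import numpy as np
    from QQuantLib.qml4var.myqlm_workflows import workflow_for_cdf

    # kwargs and weights as defined in the cdf_workflow sketch above.
    data_x = np.linspace(-1.0, 1.0, 10).reshape(-1, 1)
    output_dict = workflow_for_cdf(weights, data_x, dask_client=None, **kwargs)
    y_predict_cdf = output_dict["y_predict_cdf"]   # CDF prediction for each sample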

QQuantLib.qml4var.myqlm_workflows.workflow_for_pdf(weights, data_x, dask_client=None, **kwargs)

Workflow for computing the PDF for the input data_x.

Parameters:
  • weights (numpy array) – Array with weights for PQC

  • data_x (numpy array) – Array with dataset of the features

  • dask_client (Dask client) – Dask client for speeding up training. Not mandatory.

  • kwargs (keyword arguments.) – See pdf_workflow function documentation.

Returns:

output_dict – Dictionary with the computed arrays. Keys:
  • y_predict_pdf : PDF prediction for data_x

Return type:

dict
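
The analogous sketch for the PDF predictions, under the same assumptions as the workflow_for_cdf sketch above:

    from QQuantLib.qml4var.myqlm_workflows import workflow_for_pdf

    # weights, data_x and kwargs as in the workflow_for_cdf sketch above.
    output_dict = workflow_for_pdf(weights, data_x, dask_client=None, **kwargs)
    y_predict_pdf = output_dict["y_predict_pdf"]   # PDF prediction for each sample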

QQuantLib.qml4var.myqlm_workflows.workflow_for_qdml(weights, data_x, data_y, dask_client=None, **kwargs)

Workflow for processing the data and obtaining the arrays required for computing the desired loss.

Parameters:
  • weights (numpy array) – Array with weights for PQC.

  • data_x (numpy array) – Array with dataset of the features. Shape: (-1, number of features)

  • data_y (numpy array) – Array with the targets (labels) dataset. Shape: (-1, 1)

  • dask_client (Dask client) – Dask client for speeding up training. Not mandatory.

  • kwargs (keyword arguments) – See the cdf_workflow function documentation. In addition, the following ones can be provided:

  • minval (kwargs, list) – Minimum values for the domain of all the features.

  • maxval (kwargs, list) – Maximum values for the domain of all the features.

  • points (kwargs, int) – Number of points for the discretization of a feature domain

Returns:

output_dict – Dictionary with the computed arrays. Keys:
  • data_y : input data_y
  • y_predict_cdf : CDF prediction for data_x
  • y_predict_pdf : PDF prediction for data_x
  • x_integral : domain discretization for computing the integral
  • y_predict_pdf_domain : PDF of the x_integral

Return type:

dict
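
A sketch that gathers all the arrays needed for a loss evaluation, reusing weights and kwargs from the cdf_workflow example above; the minval/maxval/points values are illustrative:

    import numpy as np
    from QQuantLib.qml4var.myqlm_workflows import workflow_for_qdml

    # kwargs and weights as defined in the cdf_workflow sketch above.
    data_x = np.linspace(-1.0, 1.0, 10).reshape(-1, 1)
    data_y = np.linspace(0.0, 1.0, 10).reshape(-1, 1)

    output_dict = workflow_for_qdml(
        weights, data_x, data_y,
        minval=[-1.0], maxval=[1.0], points=100,
        **kwargs,
    )
    y_predict_cdf = output_dict["y_predict_cdf"]
    y_predict_pdf = output_dict["y_predict_pdf"]
    x_integral = output_dict["x_integral"]                      # domain discretization
    y_predict_pdf_domain = output_dict["y_predict_pdf_domain"]  # PDF over x_integral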