We present a novel hybrid algorithm for training deep neural networks that combines the state-of-the-art gradient descent (GD) method with a mixed integer linear programming (MILP) solver, outperforming GD and its variants in accuracy as well as resource and data efficiency for both regression and classification tasks.
Our algorithm, called GDSolver, works as follows: given a DNN as input, GDSolver invokes GD to partially train the network until it gets stuck in a local minimum, at which point it invokes an MILP solver to exhaustively search a region of the loss landscape around the weight assignments of the final-layer parameters, with the goal of tunnelling through and escaping the local minimum.
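A minimal sketch of this alternating loop is given below, assuming a tiny linear-regression model so that the "final layer" is simply the weight vector. The helper `local_search_final_layer` is a hypothetical stand-in for the MILP call (here a coarse grid search over a box around the current weights); it is not the authors' implementation.

```python
# Sketch of the GD + discrete-search loop described above (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Tiny regression problem: y = X w* + noise; the model is a single linear
# layer, so the "final layer parameters" here are just w.
X = rng.normal(size=(64, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=64)

def loss(w):
    r = X @ w - y
    return float(r @ r) / len(y)

def grad(w):
    return 2.0 * X.T @ (X @ w - y) / len(y)

def gd_until_stuck(w, lr=0.05, tol=1e-6, max_iters=10_000):
    """Run plain gradient descent until the loss stops improving."""
    prev = loss(w)
    for _ in range(max_iters):
        w = w - lr * grad(w)
        cur = loss(w)
        if prev - cur < tol:          # progress has stalled
            break
        prev = cur
    return w

def local_search_final_layer(w, radius=0.5, steps=5):
    """Hypothetical stand-in for the MILP step: exhaustively search a small
    box around the current weight assignment for a lower-loss point."""
    best_w, best_l = w.copy(), loss(w)
    offsets = np.linspace(-radius, radius, steps)
    for d0 in offsets:
        for d1 in offsets:
            for d2 in offsets:
                cand = w + np.array([d0, d1, d2])
                l = loss(cand)
                if l < best_l:
                    best_w, best_l = cand, l
    return best_w

w = rng.normal(size=3)
for _ in range(3):                    # alternate GD and the discrete search
    w = gd_until_stuck(w)
    w = local_search_final_layer(w)
print("final loss:", loss(w))
```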
We present the relevant results and proofs from the theory of continued fractions in detail (in more detail than in textbooks), filling the gap needed for a complete understanding of Shor's algorithm for prime factorization.
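As a concrete illustration of where continued fractions enter Shor's algorithm, the sketch below computes the continued fraction expansion of a measurement outcome c over Q = 2^n and lists its convergents, whose denominators are candidate periods r. The numbers in the example are made up for illustration and are not tied to any particular paper's notation.

```python
# Continued-fraction step of period finding: the convergents of c/Q are
# candidates s/r for the unknown period r.
from fractions import Fraction

def continued_fraction(x: Fraction):
    """Return the continued fraction coefficients [a0; a1, a2, ...] of x."""
    coeffs = []
    while True:
        a = x.numerator // x.denominator
        coeffs.append(a)
        frac = x - a
        if frac == 0:
            return coeffs
        x = 1 / frac

def convergents(coeffs):
    """Yield the successive convergents p_k / q_k of a continued fraction."""
    p_prev, p = 1, coeffs[0]
    q_prev, q = 0, 1
    yield Fraction(p, q)
    for a in coeffs[1:]:
        p, p_prev = a * p + p_prev, p
        q, q_prev = a * q + q_prev, q
        yield Fraction(p, q)

# Example: a period-finding measurement returns c = 85 with Q = 2^9 = 512.
# 85/512 is close to 1/6, so the convergent 1/6 recovers a candidate period 6.
c, Q = 85, 512
for conv in convergents(continued_fraction(Fraction(c, Q))):
    print(conv)   # denominators of these convergents are candidate periods r
```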
We propose a hybrid quantum-classical algorithm to solve quadratically constrained binary optimization models for loan collection optimization.
The objective is to find a set of optimal loan collection actions that maximizes the expected net profit presented to the bank as well as the financial welfare in the financial network of loanees, while keeping the loan loss provision at its minimum.
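A toy sketch of this kind of binary optimization model follows; the numbers and structure are purely illustrative and omit the paper's financial-network welfare and loan-loss-provision terms. It shows the common pattern of binary action variables per loanee with a constraint folded into the objective as a quadratic penalty, which is how such models are typically prepared for quantum or classical QUBO solvers.

```python
# Toy quadratic binary model for collection actions (illustrative only).
# x[i, a] = 1 means collection action a is applied to loanee i; the
# "exactly one action per loanee" constraint becomes a quadratic penalty.
import itertools
import numpy as np

n_loanees, n_actions = 3, 2
# Hypothetical expected net profit of applying action a to loanee i.
profit = np.array([[1.0, 0.4],
                   [0.7, 0.9],
                   [0.2, 0.6]])
penalty = 10.0   # weight of the one-action-per-loanee penalty

def objective(bits):
    x = np.array(bits).reshape(n_loanees, n_actions)
    value = float(np.sum(profit * x))
    # quadratic penalty: (sum_a x[i,a] - 1)^2 for every loanee i
    value -= penalty * float(np.sum((x.sum(axis=1) - 1.0) ** 2))
    return value

# Brute force over all binary assignments (fine at this toy size; a quantum
# or classical QUBO solver would take over for realistic instances).
best = max(itertools.product([0, 1], repeat=n_loanees * n_actions),
           key=objective)
print(np.array(best).reshape(n_loanees, n_actions))
```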