Quantum Generalized
Linear Models: A Proof
of Concept
Uchenna Chukwu and Colleen M. Farrelly, Quantopo LLC
Big Picture Summary
Overview and Implications
 Generalized linear models are the simplest instance of link-based statistical
models, which are based on the underlying geometry of an outcome’s
probability distribution (typically from the exponential family).
 Machine learning algorithms provide alternative ways to minimize a model’s sum
of squared error (the error between predicted and actual values on a test set).
 However, some deep results regarding the exponential family’s relation to affine
connections in differential geometry provide a possible alternative to link
functions:
1. Algorithms that either continuously deform the outcome distribution from known results
2. Algorithms that superpose all possible distributions and collapse to fit a dataset
 We leverage the fact that some quantum computing gates, such as the non-Gaussian
transformation gate, essentially perform (1) natively and in a computationally
efficient way!
 This project provides a proof-of-concept for leveraging specific hardware gates to
solve the affine connection problem, with benchmarking at state-of-the-art levels.
 Results can be extended to many other, more complicated statistical models, such
as generalized estimating equations, hierarchical regression models, and even
homotopy-continuation problems.
Generalized Linear Model
Background and Business Usage
An Introduction to Tweedie Models
Generalized Linear Models
 Generalized linear modeling (GLM) as extension of linear regression to
outcomes with probability distributions that are not Gaussian
 Includes binomial outcomes, Poisson outcomes, gamma outcomes, and many more
 Link functions to transform distribution of these outcomes to a normal distribution
to fit a linear model
 E(Y) = µ = g⁻¹(Xβ)
 Var(Y) = Var(µ) = Var(g⁻¹(Xβ))
 Where Y is a vector of outcome values, µ is the mean of Y, X is the matrix of predictor
values, g is a link function (such as the log function), and β is a vector of predictor
weights in the regression equation.
 Many statistical extensions:
 Generalized estimating equations (longitudinal data modeling)
 Generalized linear mixed models (longitudinal data with random effects)
 Generalized additive models (in which the predictor vectors can be transformed within
the model)
 Cox regression and Weibull-based regression (survival data modeling)
 Very high computational cost for many of these extensions
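As a concrete illustration of the mean equation above, a Poisson GLM with log link g computes predictions as µ = exp(Xβ). A minimal sketch in plain Python (the weights and data are hypothetical, not from the slides):

```python
import math

def glm_mean(x_row, beta, inv_link=math.exp):
    # E(Y) = g^{-1}(X @ beta); for a Poisson GLM, g = log, so g^{-1} = exp
    eta = sum(x * b for x, b in zip(x_row, beta))  # linear predictor Xβ
    return inv_link(eta)

# Hypothetical fitted weights: intercept 0.5, one predictor with weight 0.2
print(glm_mean([1.0, 3.0], [0.5, 0.2]))  # exp(0.5 + 0.6) = exp(1.1) ≈ 3.004
```

Swapping `inv_link` (e.g., to a logistic function) recovers other GLM family members without changing the linear-algebra core.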
Important Applications of GLMs
 Ubiquitous in part failure modeling, medical research, actuarial science, and many other
problems
 Example problems:
 Modeling likelihood of insurance claims and expected payout (worldwide, a $5 trillion industry)
 Understanding risk behavior in medical research (daily heroin usage, sexual partners within prior month…)
 Modeling expected failure rates and associated conditions for airplane parts or machine parts within a
manufacturing plant (~$4 trillion industry in the USA alone)
 Modeling expected natural disaster impacts and precipitating factors related to impact extent
 Many supervised learning algorithms are extensions of generalized linear models and have link
functions built into the algorithm to model different outcome distributions
 Boosted regression, Morse-Smale regression, dgLARS, Bayesian model averaging…
 Optimization algorithms used to find the minimum sum of squared error differ among machine
learning methods and from GLMs, which use a least squares algorithm
 Methods like deep learning and classical neural networks attempt to solve this problem in a general way
through a series of general mappings leading to a potentially novel link function
 Exploiting the geometric relationships between distributions through a superposition of states collapsed
to the “ideal” link would present an optimal solution to the problem
 Tweedie regression as a general framework that handles many distributions in the exponential
family and the problem of overdispersion of model/outcome variance
 Very nice geometric properties
 Connected to many common exponential family distributions
Details of Tweedie Regression
 Many common distributions of the exponential family converge to Tweedie
distributions and can be formulated through Tweedie distributions, formally
defined as:
 E(Y) = µ
 Var(Y) = φµ^ξ
 where φ is the dispersion parameter and ξ is the Tweedie parameter (or shape parameter)
 Tweedie distributions themselves enjoy a variety of useful properties:
 Reproductive properties that allow distributions to be added together to form new
distributions that are themselves Tweedie
 Varying Tweedie parameter and dispersion parameter to recover many exponential
family distributions used in GLMs:
 Tweedie parameter of 0 for normal distribution
 Tweedie parameter of 1 for Poisson distribution
 Tweedie parameter of 2 for gamma distribution
 Dispersion parameter for 0-inflated models and outliers, similar to negative binomial
regression models
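The power variance relationship above is easy to sketch; varying ξ with φ = 1 recovers the variance behavior of the named distributions:

```python
def tweedie_variance(mu, phi, xi):
    # Var(Y) = φ · µ^ξ, the Tweedie power variance function
    return phi * mu ** xi

mu = 4.0
print(tweedie_variance(mu, 1.0, 0))  # ξ=0: constant variance, 1.0 (normal)
print(tweedie_variance(mu, 1.0, 1))  # ξ=1: variance equals mean, 4.0 (Poisson)
print(tweedie_variance(mu, 1.0, 2))  # ξ=2: variance is mean squared, 16.0 (gamma)
```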
The Problem of Overdispersion in
Tweedie Models
 Well-known statistical problem involving dispersion parameters, which relate
to the variance of an outcome
 Many GLMs and their machine learning extensions struggle on problems of
overdispersion
 Simulations show this behavior, particularly as dispersion parameter increases
substantially (values of 4+)
 Empirical datasets with 0-inflation and long tails
 Recent paper exploring bagged KNN models
 Demonstrates problem in simulations
 Demonstrates with open-source datasets, such as UCI’s Forest Fire dataset
 Models that work well, such as the KNN ensemble with varying k parameters, tend
to take a long time to compute
Common Tweedie Models

Distribution family    | Dispersion (extra 0’s and tail fatness) | Power (variance proportional to mean^Power)
Normal                 | 1   | 0
Poisson                | 1   | 1
Compound Poisson       | 1   | >1 and <2
Gamma                  | 1   | 2
Inverse-Gaussian       | 1   | 3
Stable                 | 1   | >2 (Extreme >3)
Negative binomial      | >1  | 1
Underdispersed Poisson | <1  | 1
Unique Tweedie         | >=1 | >=0
Connection of GLMs to Differential
Geometry
Motivation for Implementation on Xanadu System
Differential Geometry and the
Exponential Family
 Possible to formulate exponential
family distributions and their
parameterizations to form a series of
curves on a 2-dimensional surface
 Each curve is defined by 2 points at
either end of the probability function,
0 and 1, connected by a line that
follows the shortest path under the
distribution’s parameterization,
called a geodesic
 Because the exponential family can be
generalized into Tweedie distributions
through continuous transformations,
the geodesic connecting 0 and 1 can
flow across distributions defining the
2-dimensional surface in a continuous
manner (much like homotopy
continuation methods).
 This is an affine connection, and the
morphing of the line as it passes
through the parameter space
transforms one distribution to another.
Consequences of Exponential Family Geometry
 Analytically-derived results/equations for one distribution morphed to fit another
distribution through continuous transformations!
 Limit theorems derived by continuous deformations of either moment generating
functions or characteristic functions
Xanadu Technology and Suitability to
GLMs
 Xanadu’s qumode formulation makes it ideal for implementing quantum GLMs
 Ability to perform linear algebra operations on physical data representations
 GLMs and their extensions all based on simple matrix operations
 Mean(Y) = g⁻¹(Xβ + ε)
 Matrix multiplication and addition for the linear model 𝑿𝛽 + 𝜀 coupled with a continuous
transformation of the model results to fit the outcome distribution
 Non-Gaussian transformation gate provides a perfect avenue to perform the affine
transformation related to the outcome distribution without a need to specify a
link function to approximate the geometry
 Should be able to approximate any continuous outcome’s distribution, creating potential
new “link functions” via this gate’s affine transformation of the wavefunctions
representing the data
 Removes the need for approximations by easy-to-compute link transformations
 In theory, should approximate any continuous distribution, including ones that aren’t
included in common statistical packages implementing GLMs and their
longitudinal/survival data extensions
 Thus, Xanadu’s system provides a general solution to the linear regression equation with
many potential extensions to more sophisticated regression models!
Methods and Results on Example
Cases
Simulated overdispersion dataset and UCI Forest Fire dataset
Methodology
 Simulation
 Similar to simulations used in the KNN ensemble paper
 1000 observations with a 70/30 test/train split
 Tweedie outcome related to 3 predictors (1 interaction term, 1 main effect) with added noise
 Tweedie parameter=1, dispersion parameter=8
 1 noise variable added
 Empirical dataset
 UCI Repository’s Forest Fire dataset
 Notoriously difficult to beat the mean model with machine learning algorithms
 12 predictors (2 spatial coordinates of location, month, day, FFMC index, DMC index, DC index, ISI index,
temperature, relative humidity, wind, and rain) and 517 observations
 t-SNE was used to reduce the dimensionality of the predictor set to 4 components so as to make it compatible
with Xanadu’s capabilities.
 70/30 test/train split
 Comparison methods
 Boosted regression
 Random forest (tree-based bagged ensemble)
 DGLARS (tangent-space-based least angle regression model)
 BART (Bayesian-based tree ensemble)
 HLASSO (homotopy-based LASSO model)
 Poisson regression (GLM without any modifications)
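The simulation itself follows the KNN ensemble paper; as a hypothetical stand-in (not the authors' code), an overdispersed count outcome of the kind targeted here (variance well above the mean, as with dispersion parameter 8) can be generated with a gamma-Poisson mixture using only the standard library:

```python
import math
import random

def poisson(lam, rng):
    # Knuth's algorithm for sampling a Poisson variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def overdispersed_counts(n, mean, dispersion, rng):
    # Gamma-Poisson mixture: the marginal variance-to-mean ratio equals
    # `dispersion` (> 1), mimicking an overdispersed Tweedie outcome.
    shape = mean / (dispersion - 1.0)
    scale = dispersion - 1.0
    return [poisson(rng.gammavariate(shape, scale), rng) for _ in range(n)]

rng = random.Random(0)
ys = overdispersed_counts(1000, mean=5.0, dispersion=8.0, rng=rng)
m = sum(ys) / len(ys)
v = sum((y - m) ** 2 for y in ys) / len(ys)
print(m, v / m)  # variance-to-mean ratio well above 1
```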
Data Preprocessing
 Dimensionality reduction through t-SNE to create a set of 4 predictors and 1
outcome, such that predictors are uncorrelated when entered into models.
 Easier for systems to calculate with fewer variables.
 Decorrelation helps most regression methods, including linear models and tree
models.
 Other dimensionality reduction methods are possible, including the introduction of
factors from factor analytic models or combinations of linear/nonlinear,
global/local dimensionality reduction algorithms.
 Scaling of outcome to a scale of -3 to 3, such that the Xanadu simulation can
effectively model and process the data in qumodes.
 Slight warping of the most extreme values, but these are generally less than 5
observations per dataset.
 Other types of scaling might be useful to explore.
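The exact scaling used is not specified beyond the target range; a minimal min-max sketch (an assumption, not the authors' code) mapping an outcome onto [-3, 3]:

```python
def scale_to_range(values, lo=-3.0, hi=3.0):
    # Min-max scaling of the outcome onto [lo, hi] so qumode
    # simulations can represent the data without clipping.
    # Assumes the values are not all identical.
    vmin, vmax = min(values), max(values)
    span = vmax - vmin
    return [lo + (v - vmin) / span * (hi - lo) for v in values]

scaled = scale_to_range([0.0, 2.0, 5.0, 10.0])
print(scaled)  # endpoints map to -3.0 and 3.0
```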
Qumodes Circuit Details
 GLMs can be embedded within Xanadu’s qumode quantum computer simulation
software (and qumode computer) with a singular value decomposition of the
β coefficient in the formulation:
 Mean(Y) = g⁻¹(Xβ)
 This translates to β = O₂ΣO₁, which can be modeled through a series of quantum
circuit gates:
 Multiplication of X by an orthogonal matrix:
 |O₁X⟩ ≅ U₁|X⟩, which corresponds to a linear interferometer gate (U₁) acting on X
 Multiplication of that result by a diagonal matrix:
 |ΣO₁X⟩ ∝ S(r)|O₁X⟩, which corresponds to a squeezing gate that acts on a single qumode
 Multiplication of the result by a second orthogonal matrix:
 |O₂ΣO₁X⟩ ≅ U₂|ΣO₁X⟩, which corresponds to a linear interferometer gate (U₂) acting on the result
 Multiplication of this result by a nonlinear function:
 |g⁻¹(O₂ΣO₁X)⟩ ≅ Φ|O₂ΣO₁X⟩, which corresponds to the non-Gaussian gate acting on the result
 This gives a final result of gates acting upon the dataset as:
 Φ · U₂ · S · U₁|X⟩ ∝ |g⁻¹(Xβ)⟩
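The SVD step underlying this circuit can be checked classically. A minimal NumPy sketch (not Strawberry Fields code; the matrix is a hypothetical weight matrix) verifying that any β factors into the orthogonal-diagonal-orthogonal pieces that the interferometer and squeezing gates implement:

```python
import numpy as np

rng = np.random.default_rng(42)
beta = rng.normal(size=(4, 4))  # hypothetical GLM weight matrix

# SVD: beta = O2 @ diag(s) @ O1, with O1 and O2 orthogonal --
# the pieces mapped to interferometer (U) and squeezing (S) gates.
O2, s, O1 = np.linalg.svd(beta)

reconstructed = O2 @ np.diag(s) @ O1
print(np.allclose(reconstructed, beta))   # exact factorization
print(np.allclose(O2.T @ O2, np.eye(4)))  # O2 is orthogonal
```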
Qumodes Parameter Settings
 The algorithm simulation was created through Strawberry Fields.
 The deep learning framework already existed.
 Hidden layers and bias terms were removed to collapse to a generalized linear model
framework.
 The loss function optimized was mean square error, which corresponds to the loss
functions specified in the comparison algorithms.
 Qumode cut-off dimension was set to 10.
 Optimization via least squares was not available, so gradient descent was used with a
learning rate of 0.1 over 80 iterations.
 This gave a qumodes implementation of a quantum generalized linear model with a
boosting feel to it.
 Because the quantum computing component is inherently probabilistic,
algorithms were run on the same training and test set multiple times to
average out quantum effects.
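The gradient descent settings above can be sketched classically. A toy fit with the slides' learning rate (0.1) and iteration count (80), minimizing mean squared error for a one-parameter linear model (the data are hypothetical; the real loss involves the quantum circuit parameters):

```python
def fit_weight(xs, ys, lr=0.1, iters=80):
    # Gradient descent on MSE for a one-parameter model y ≈ w·x,
    # mirroring the learning rate / iteration count in the slides.
    w = 0.0
    n = len(xs)
    for _ in range(iters):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 2.0, 4.0, 6.0]  # exactly y = 2x
print(fit_weight(xs, ys))   # converges toward 2.0
```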
Results: Simulation of Overdispersion
Algorithm          | Scaled Model MSE
Random Forest      | 0.80
BART               | 0.78
Boosted Regression | 0.78
DGLARS             | 0.81
HLASSO             | 0.81
GLM                | 0.81
QGLM               | 0.82
Mean               | 0.85
QGLMs yield slightly worse prediction on the simulated dataset. However, their
performance is not far off from state-of-the-art algorithms, and some random
error is expected from the quantum machinery.
Results: Forest Fire Dataset
Algorithm          | Scaled Model MSE
Random Forest      | 0.125
BART               | 0.125
Boosted Regression | 0.119
DGLARS             | 0.114
HLASSO             | 0.120
GLM                | 0.119
QGLM               | 0.106
Mean               | 0.115
QGLMs emerge as the best-performing algorithm on a difficult, real-
world dataset (Forest Fire dataset in the UCI repository). QGLMs
provide ~10% gain over the next best algorithm on this dataset. This
suggests that they work well on real data and difficult problems.
Conclusions
 This suggests that the qumodes formulation with its unique operators can
eliminate the need for link functions within linear models by exploiting the
geometry of the models and still give good prediction.
 Better than state-of-the-art prediction for a difficult Tweedie regression dataset
(UCI Forest Fire)
 Around state-of-the-art prediction for a simulated dataset
 This has the potential to bring statistical modeling into quantum computing,
by leveraging the underlying geometry and the connection between model
geometry and the geometry of quantum physics.
 Generalized estimating equations/generalized linear mixed models
 Structural equation models/hierarchical regression models
 Also a potential avenue through which to implement the homotopy continuation
method common in dynamic systems research and some machine learning models
(such as homotopy-based LASSO), which take a known problem’s solution and
continuously deform it to fit the problem of interest.
 Currently a computational challenge
 Limited to small datasets
References
 Amari, S. I. (1997). Information geometry. Contemporary Mathematics, 203, 81-96.
 Bulmer, M. G. (1974). On fitting the Poisson lognormal distribution to species-abundance data. Biometrics,
101-110.
 Buscemi, F. (2012). Comparison of quantum statistical models: equivalent conditions for sufficiency.
Communications in Mathematical Physics, 310(3), 625-647.
 Cortez, P., & Morais, A. D. J. R. (2007). A data mining approach to predict forest fires using meteorological
data.
 De Jong, P., & Heller, G. Z. (2008). Generalized linear models for insurance data (Vol. 10). Cambridge:
Cambridge University Press.
 Farrelly, C. M. (2017). KNN Ensembles for Tweedie Regression: The Power of Multiscale Neighborhoods.
arXiv preprint arXiv:1708.02122.
 Farrelly, C. M. (2017). Topology and Geometry in Machine Learning for Logistic Regression.
 Fehm, L., Beesdo, K., Jacobi, F., & Fiedler, A. (2008). Social anxiety disorder above and below the
diagnostic threshold: prevalence, comorbidity and impairment in the general population. Social psychiatry
and psychiatric epidemiology, 43(4), 257-265.
 Fergusson, D. M., Boden, J. M., & Horwood, L. J. (2006). Cannabis use and other illicit drug use: testing
the cannabis gateway hypothesis. Addiction, 101(4), 556-569.
 Frees, E. W., Lee, G., & Yang, L. (2016). Multivariate frequency-severity regression models in insurance.
Risks, 4(1), 4.
 Gardner, W., Mulvey, E. P., & Shaw, E. C. (1995). Regression analyses of counts and rates: Poisson,
overdispersed Poisson, and negative binomial models. Psychological bulletin, 118(3), 392.
 Herings, R., & Erkens, J. A. (2003). Increased suicide attempt rate among patients interrupting use of
atypical antipsychotics. Pharmacoepidemiology and drug safety, 12(5), 423-424.
 Jorgensen, B. (1997). The theory of dispersion models. CRC Press.
 Jørgensen, B., Goegebeur, Y., & Martínez, J. R. (2010). Dispersion models for extremes. Extremes, 13(4), 399-437.
 Killoran, N., Bromley, T. R., Arrazola, J. M., Schuld, M., Quesada, N., & Lloyd, S. (2018). Continuous-variable
quantum neural networks. arXiv preprint arXiv:1806.06871.
 Killoran, N., Izaac, J., Quesada, N., Bergholm, V., Amy, M., & Weedbrook, C. (2018). Strawberry Fields: A Software
Platform for Photonic Quantum Computing. arXiv preprint arXiv:1804.03159.
 Luitel, B. N. (2016). Prediction of North Atlantic tropical cyclone activity and rainfall (Doctoral dissertation, The
University of Iowa).
 Marriott, P. (1990). Applications of differential geometry to statistics (Doctoral dissertation, University of
Warwick).
 Mills, K. L., Teesson, M., Ross, J., & Peters, L. (2006). Trauma, PTSD, and substance use disorders: findings from
the Australian National Survey of Mental Health and Well-Being. American Journal of Psychiatry, 163(4), 652-658.
 Nielsen, F., & Garcia, V. (2009). Statistical exponential families: A digest with flash cards. arXiv preprint
arXiv:0911.4863.
 Osborne, M. R., Presnell, B., & Turlach, B. A. (2000). A new approach to variable selection in least squares
problems. IMA journal of numerical analysis, 20(3), 389-403.
 Pistone, G., & Rogantin, M. P. (1999). The exponential statistical manifold: mean parameters, orthogonality and
space transformations. Bernoulli, 5(4), 721-760.
 Sawalha, Z., & Sayed, T. (2006). Traffic accident modeling: some statistical issues. Canadian Journal of Civil
Engineering, 33(9), 1115-1124.
 Tweedie, M. C. K. (1984). An index which distinguishes between some important exponential families. In Statistics:
Applications and new directions: Proc. Indian Statistical Institute Golden Jubilee International Conference (Vol.
579, p. 604).
Natural Language Processing for Beginners.pptx
 
The Shape of Data--ODSC.pptx
The Shape of Data--ODSC.pptxThe Shape of Data--ODSC.pptx
The Shape of Data--ODSC.pptx
 
Generative AI, WiDS 2023.pptx
Generative AI, WiDS 2023.pptxGenerative AI, WiDS 2023.pptx
Generative AI, WiDS 2023.pptx
 
Emerging Technologies for Public Health in Remote Locations.pptx
Emerging Technologies for Public Health in Remote Locations.pptxEmerging Technologies for Public Health in Remote Locations.pptx
Emerging Technologies for Public Health in Remote Locations.pptx
 
Applications of Forman-Ricci Curvature.pptx
Applications of Forman-Ricci Curvature.pptxApplications of Forman-Ricci Curvature.pptx
Applications of Forman-Ricci Curvature.pptx
 
Geometry for Social Good.pptx
Geometry for Social Good.pptxGeometry for Social Good.pptx
Geometry for Social Good.pptx
 
Topology for Time Series.pptx
Topology for Time Series.pptxTopology for Time Series.pptx
Topology for Time Series.pptx
 
Time Series Applications AMLD.pptx
Time Series Applications AMLD.pptxTime Series Applications AMLD.pptx
Time Series Applications AMLD.pptx
 
An introduction to quantum machine learning.pptx
An introduction to quantum machine learning.pptxAn introduction to quantum machine learning.pptx
An introduction to quantum machine learning.pptx
 
An introduction to time series data with R.pptx
An introduction to time series data with R.pptxAn introduction to time series data with R.pptx
An introduction to time series data with R.pptx
 
NLP: Challenges and Opportunities in Underserved Areas
NLP: Challenges and Opportunities in Underserved AreasNLP: Challenges and Opportunities in Underserved Areas
NLP: Challenges and Opportunities in Underserved Areas
 
Geometry, Data, and One Path Into Data Science.pptx
Geometry, Data, and One Path Into Data Science.pptxGeometry, Data, and One Path Into Data Science.pptx
Geometry, Data, and One Path Into Data Science.pptx
 
Topological Data Analysis.pptx
Topological Data Analysis.pptxTopological Data Analysis.pptx
Topological Data Analysis.pptx
 
Transforming Text Data to Matrix Data via Embeddings.pptx
Transforming Text Data to Matrix Data via Embeddings.pptxTransforming Text Data to Matrix Data via Embeddings.pptx
Transforming Text Data to Matrix Data via Embeddings.pptx
 
Natural Language Processing in the Wild.pptx
Natural Language Processing in the Wild.pptxNatural Language Processing in the Wild.pptx
Natural Language Processing in the Wild.pptx
 
SAS Global 2021 Introduction to Natural Language Processing
SAS Global 2021 Introduction to Natural Language Processing SAS Global 2021 Introduction to Natural Language Processing
SAS Global 2021 Introduction to Natural Language Processing
 
2021 American Mathematical Society Data Science Talk
2021 American Mathematical Society Data Science Talk2021 American Mathematical Society Data Science Talk
2021 American Mathematical Society Data Science Talk
 

Dernier

Learn How Data Science Changes Our World
Learn How Data Science Changes Our WorldLearn How Data Science Changes Our World
Learn How Data Science Changes Our WorldEduminds Learning
 
wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...
wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...
wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...KarteekMane1
 
Unveiling the Role of Social Media Suspect Investigators in Preventing Online...
Unveiling the Role of Social Media Suspect Investigators in Preventing Online...Unveiling the Role of Social Media Suspect Investigators in Preventing Online...
Unveiling the Role of Social Media Suspect Investigators in Preventing Online...Milind Agarwal
 
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024Susanna-Assunta Sansone
 
The Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptx
The Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptxThe Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptx
The Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptxTasha Penwell
 
What To Do For World Nature Conservation Day by Slidesgo.pptx
What To Do For World Nature Conservation Day by Slidesgo.pptxWhat To Do For World Nature Conservation Day by Slidesgo.pptx
What To Do For World Nature Conservation Day by Slidesgo.pptxSimranPal17
 
Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...
Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...
Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...Boston Institute of Analytics
 
Conf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming PipelinesConf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming PipelinesTimothy Spann
 
SMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptxSMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptxHaritikaChhatwal1
 
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBoston Institute of Analytics
 
Principles and Practices of Data Visualization
Principles and Practices of Data VisualizationPrinciples and Practices of Data Visualization
Principles and Practices of Data VisualizationKianJazayeri1
 
Real-Time AI Streaming - AI Max Princeton
Real-Time AI  Streaming - AI Max PrincetonReal-Time AI  Streaming - AI Max Princeton
Real-Time AI Streaming - AI Max PrincetonTimothy Spann
 
Cyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded dataCyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded dataTecnoIncentive
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 217djon017
 
Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Seán Kennedy
 
Decoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectDecoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectBoston Institute of Analytics
 
Defining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryDefining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryJeremy Anderson
 
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdfEnglish-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdfblazblazml
 
convolutional neural network and its applications.pdf
convolutional neural network and its applications.pdfconvolutional neural network and its applications.pdf
convolutional neural network and its applications.pdfSubhamKumar3239
 

Dernier (20)

Learn How Data Science Changes Our World
Learn How Data Science Changes Our WorldLearn How Data Science Changes Our World
Learn How Data Science Changes Our World
 
wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...
wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...
wepik-insightful-infographics-a-data-visualization-overview-20240401133220kwr...
 
Unveiling the Role of Social Media Suspect Investigators in Preventing Online...
Unveiling the Role of Social Media Suspect Investigators in Preventing Online...Unveiling the Role of Social Media Suspect Investigators in Preventing Online...
Unveiling the Role of Social Media Suspect Investigators in Preventing Online...
 
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
 
The Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptx
The Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptxThe Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptx
The Power of Data-Driven Storytelling_ Unveiling the Layers of Insight.pptx
 
What To Do For World Nature Conservation Day by Slidesgo.pptx
What To Do For World Nature Conservation Day by Slidesgo.pptxWhat To Do For World Nature Conservation Day by Slidesgo.pptx
What To Do For World Nature Conservation Day by Slidesgo.pptx
 
Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...
Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...
Data Analysis Project : Targeting the Right Customers, Presentation on Bank M...
 
Conf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming PipelinesConf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
 
SMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptxSMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptx
 
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
 
Principles and Practices of Data Visualization
Principles and Practices of Data VisualizationPrinciples and Practices of Data Visualization
Principles and Practices of Data Visualization
 
Real-Time AI Streaming - AI Max Princeton
Real-Time AI  Streaming - AI Max PrincetonReal-Time AI  Streaming - AI Max Princeton
Real-Time AI Streaming - AI Max Princeton
 
Cyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded dataCyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded data
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2
 
Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...
 
Decoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectDecoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis Project
 
Defining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryDefining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data Story
 
Data Analysis Project: Stroke Prediction
Data Analysis Project: Stroke PredictionData Analysis Project: Stroke Prediction
Data Analysis Project: Stroke Prediction
 
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdfEnglish-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
 
convolutional neural network and its applications.pdf
convolutional neural network and its applications.pdfconvolutional neural network and its applications.pdf
convolutional neural network and its applications.pdf
 

Quantum generalized linear models

  • 1. Quantum Generalized Linear Models: A Proof of Concept
    Uchenna Chukwu and Colleen M. Farrelly, Quantopo LLC
  • 3. Overview and Implications
     Generalized linear models are the simplest instance of link-based statistical models, which are built on the geometry of an outcome’s underlying probability distribution (typically from the exponential family).
     Machine learning algorithms provide alternative ways to minimize a model’s sum of squared error (the error between predicted and actual values on a test set).
     However, some deep results relating the exponential family to affine connections in differential geometry suggest a possible alternative to link functions:
      1. Algorithms that continuously deform the outcome distribution from known results
      2. Algorithms that superpose all possible distributions and collapse to fit a dataset
     Some quantum computing gates, such as the non-Gaussian transformation gate, essentially perform (1) natively and in a computationally efficient way!
     This project provides a proof of concept for leveraging specific hardware gates to solve the affine connection problem, with benchmarking at state-of-the-art levels.
     Results can be extended to many other, more complicated statistical models, such as generalized estimating equations, hierarchical regression models, and even homotopy-continuation problems.
  • 4. Generalized Linear Model Background and Business Usage
    An Introduction to Tweedie Models
  • 5. Generalized Linear Models
     Generalized linear modeling (GLM) extends linear regression to outcomes with probability distributions that are not Gaussian
      Includes binomial outcomes, Poisson outcomes, gamma outcomes, and many more
     Link functions transform the distribution of these outcomes to a normal distribution so a linear model can be fit:
      E(Y) = µ = g⁻¹(Xβ)
      Var(Y) = Var(µ) = Var(g⁻¹(Xβ))
      where Y is a vector of outcome values, µ is the mean of Y, X is the matrix of predictor values, g is a link function (such as the log function), and β is a vector of predictor weights in the regression equation.
     Many statistical extensions:
      Generalized estimating equations (longitudinal data modeling)
      Generalized linear mixed models (longitudinal data with random effects)
      Generalized additive models (in which the predictor vectors can be transformed within the model)
      Cox regression and Weibull-based regression (survival data modeling)
      Very high computational cost for many of these extensions
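To make the link-function machinery above concrete, a Poisson GLM with log link can be fit via iteratively reweighted least squares. This is a minimal NumPy sketch on simulated data with illustrative coefficient values, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + 1 predictor
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))  # log link: E(Y) = g^{-1}(X beta) = exp(X beta)

# Iteratively reweighted least squares (IRLS) for the Poisson/log-link GLM
beta = np.zeros(2)
for _ in range(25):
    eta = X @ beta                  # linear predictor X beta
    mu = np.exp(eta)                # inverse link g^{-1}(eta)
    W = mu                          # Poisson working weights, since Var(Y) = mu
    z = eta + (y - mu) / mu         # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
```

With 500 observations the recovered `beta` lands close to `beta_true`, illustrating how the link function converts the non-Gaussian fitting problem into a sequence of weighted linear regressions.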
  • 6. Important Applications of GLMs
     Ubiquitous in part-failure modeling, medical research, actuarial science, and many other problems
     Example problems:
      Modeling the likelihood of insurance claims and expected payout (worldwide, a $5 trillion industry)
      Understanding risk behavior in medical research (daily heroin usage, sexual partners within the prior month…)
      Modeling expected failure rates and associated conditions for airplane parts or machine parts within a manufacturing plant (~$4 trillion industry in the USA alone)
      Modeling expected natural disaster impacts and precipitating factors related to impact extent
     Many supervised learning algorithms are extensions of generalized linear models and have link functions built into the algorithm to model different outcome distributions
      Boosted regression, Morse-Smale regression, dgLARS, Bayesian model averaging…
     Optimization algorithms for minimizing the sum of squared error differ among machine learning methods and from GLMs, which use a least-squares algorithm
      Methods like deep learning and classical neural networks attempt to solve this problem generally through a series of mappings leading to a potentially novel link function
      Exploiting the geometric relationships between distributions through a superposition of states collapsed to the “ideal” link would present an optimal solution to the problem
     Tweedie regression serves as a general framework that handles many distributions in the exponential family as well as the problem of overdispersion of model/outcome variance
      Very nice geometric properties
      Connected to many common exponential family distributions
  • 7. Details of Tweedie Regression
     Many common distributions of the exponential family converge to Tweedie distributions and can be formulated through them, formally defined as:
      E(Y) = µ
      Var(Y) = φµ^ξ
      where φ is the dispersion parameter and ξ is the Tweedie parameter (or shape parameter)
     Tweedie distributions themselves enjoy a variety of useful properties:
      Reproductive properties that allow distributions to be added together to form new distributions that are themselves Tweedie
      Varying the Tweedie and dispersion parameters recovers many exponential family distributions used in GLMs:
       Tweedie parameter of 0 for the normal distribution
       Tweedie parameter of 1 for the Poisson distribution
       Tweedie parameter of 2 for the gamma distribution
      Dispersion parameter for 0-inflated models and outliers, similar to negative binomial regression models
  • 8. The Problem of Overdispersion in Tweedie Models
     A well-known statistical problem involving dispersion parameters, which relate to the variance of an outcome
     Many GLMs and their machine learning extensions struggle with overdispersion
      Simulations show this behavior, particularly as the dispersion parameter increases substantially (values of 4+)
      Empirical datasets with 0-inflation and long tails
     A recent paper explores bagged KNN models
      Demonstrates the problem in simulations
      Demonstrates it with open-source datasets, such as UCI’s Forest Fire dataset
      Models that work well, such as the KNN ensemble with varying k parameters, tend to take a long time to compute
  • 9. Common Tweedie Models

    Family Distribution      Dispersion (extra 0’s and tail fatness)   Power (variance ∝ mean^Power)
    Normal                   1                                         0
    Poisson                  1                                         1
    Compound Poisson         1                                         >1 and <2
    Gamma                    1                                         2
    Inverse-Gaussian         1                                         3
    Stable                   1                                         >2 (Extreme >3)
    Negative Binomial        >1                                        1
    Underdispersed Poisson   <1                                        1
    Unique Tweedie           ≥1                                        ≥0
  • 10. Connection of GLMs to Differential Geometry
    Motivation for Implementation on the Xanadu System
  • 11. Differential Geometry and the Exponential Family
     It is possible to formulate exponential family distributions and their parameterizations as a series of curves on a 2-dimensional surface
     Each curve is defined by the 2 points at either end of the probability function, 0 and 1, connected by the shortest path under the distribution’s parameterization, called a geodesic
     Because the exponential family can be generalized into Tweedie distributions through continuous transformations, the geodesic connecting 0 and 1 can flow continuously across the distributions defining the 2-dimensional surface (much like homotopy continuation methods).
     This is an affine connection, and the morphing of the line as it passes through parameters transforms one distribution into another.
  • 12. Consequences of Exponential Family Geometry
     Analytically derived results/equations for one distribution can be morphed to fit another distribution through continuous transformations!
     Limit theorems can be derived by continuous deformations of either moment generating functions or characteristic functions
  • 13. Xanadu Technology and Suitability to GLMs
     Xanadu’s qumode formulation makes it ideal for implementing quantum GLMs
      Ability to perform linear algebra operations on physical data representations
      GLMs and their extensions are all based on simple matrix operations
       Mean(Y) = g⁻¹(Xβ) + ε
      Matrix multiplication and addition for the linear model Xβ + ε, coupled with a continuous transformation of the model results to fit the outcome distribution
     The non-Gaussian transformation gate provides a perfect avenue to perform the affine transformation related to the outcome distribution without the need to specify a link function to approximate the geometry
      Should be able to approximate any continuous outcome’s distribution, creating potential new “link functions” through affine transformation of the wavefunctions representing the data
      Removes the need for approximations by easy-to-compute link transformations
      In theory, should approximate any continuous distribution, including ones not included in common statistical packages implementing GLMs and their longitudinal/survival data extensions
     Thus, Xanadu’s system provides a general solution to the linear regression equation with many potential extensions to more sophisticated regression models!
  • 14. Methods and Results on Example Cases
    Simulated overdispersion dataset and UCI Forest Fire dataset
  • 15. Methodology
     Simulation
      Similar to the simulations used in the KNN ensemble paper
      1,000 observations with a 70/30 train/test split
      Tweedie outcome related to 3 predictors (1 interaction term, 1 main effect) with added noise
      Tweedie parameter = 1, dispersion parameter = 8
      1 noise variable added
     Empirical dataset
      UCI Repository’s Forest Fire dataset
      Notoriously difficult for machine learning algorithms to beat the mean model
      12 predictors (2 spatial coordinates of location, month, day, FFMC index, DMC index, DC index, ISI index, temperature, relative humidity, wind, and rain) and 517 observations
      t-SNE was used to reduce the dimensionality of the predictor set to 4 components to make it compatible with Xanadu’s capabilities
      70/30 train/test split
     Comparison methods
      Boosted regression
      Random forest (tree-based bagged ensemble)
      DGLARS (tangent-space-based least angle regression model)
      BART (Bayesian tree ensemble)
      HLASSO (homotopy-based LASSO model)
      Poisson regression (GLM without any modifications)
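The simulation design above (Tweedie parameter ξ = 1, dispersion φ = 8) can be sketched as follows: at ξ = 1 the Tweedie variance is φµ, which can be generated as a φ-scaled Poisson variable. The coefficient values here are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
x1, x2, x3, x_noise = rng.normal(size=(4, n))  # 3 predictors + 1 pure-noise variable

eta = 0.6 * x1 + 0.4 * x2 * x3                 # 1 main effect + 1 interaction term
mu = np.exp(eta)

phi = 8.0                                      # dispersion parameter
y = phi * rng.poisson(mu / phi)                # xi = 1 Tweedie: E(Y) = mu, Var(Y) = phi * mu

# 70/30 train/test split
idx = rng.permutation(n)
train_idx, test_idx = idx[:700], idx[700:]
```

The resulting outcome is heavily overdispersed (its marginal variance far exceeds its mean), which is exactly the regime where standard GLMs and their machine learning extensions struggle.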
  • 16. Data Preprocessing
     Dimensionality reduction through t-SNE to create a set of 4 predictors and 1 outcome, such that the predictors are uncorrelated when entered into models
      Easier for systems to calculate with fewer variables
      Decorrelation helps most regression methods, including linear models and tree models
      Other dimensionality reduction methods are possible, including the introduction of factors from factor analytic models or combinations of linear/nonlinear, global/local dimensionality reduction algorithms
     Scaling of the outcome to a range of -3 to 3, such that the Xanadu simulation can effectively model and process the data in qumodes
      Slight warping of the most extreme values, but these are generally fewer than 5 observations per dataset
      Other types of scaling might be useful to explore
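The preprocessing steps above can be sketched with scikit-learn, assuming its t-SNE implementation stands in for whatever was actually used (note that scikit-learn requires `method="exact"` for more than 3 components). The data here are random stand-ins for the 12 Forest Fire predictors:

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))            # stand-in for the 12 Forest Fire predictors
y = rng.gamma(shape=1.0, scale=5.0, size=120)  # stand-in long-tailed outcome

# t-SNE down to 4 components; the exact method is required for n_components > 3
emb = TSNE(n_components=4, method="exact", perplexity=20,
           random_state=0).fit_transform(X)

# Min-max rescale the outcome onto [-3, 3] for the qumode simulation
y_scaled = 6.0 * (y - y.min()) / (y.max() - y.min()) - 3.0
```

Min-max scaling preserves the ordering of observations while bounding the range, at the cost of compressing the extreme tail values, as noted above.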
  • 17. Qumodes Circuit Details
     GLMs can be embedded within Xanadu’s qumode quantum computer simulation software (and qumode computer) via a singular value decomposition of the β coefficient in the formulation:
      Mean(Y) = g⁻¹(Xβ)
     This translates to β = O₁ΣO₂, which can be modeled through a series of quantum circuit gates:
      Multiplication of X by an orthogonal matrix:
       |O₁X⟩ ≅ U₁|X⟩, which corresponds to a linear interferometer gate (U₁) acting on X
      Multiplication of that result by a diagonal matrix:
       |ΣO₁X⟩ ∝ S(r)|O₁X⟩, which corresponds to a squeezing gate acting on a single qumode
      Multiplication of that result by a second orthogonal matrix:
       |O₂ΣO₁X⟩ ≅ U₂|ΣO₁X⟩, which corresponds to a linear interferometer gate (U₂) acting on the result
      Application of a nonlinear function to this result:
       |g⁻¹(O₂ΣO₁X)⟩ ≅ Φ|O₂ΣO₁X⟩, which corresponds to the non-Gaussian gate acting on the result
     This gives a final result of gates acting upon the dataset as:
      Φ ∗ U₂ ∗ S ∗ U₁|X⟩ ∝ |g⁻¹(Xβ)⟩
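The SVD decomposition behind the circuit can be checked classically: NumPy factors β into orthogonal-diagonal-orthogonal pieces, mirroring the interferometer-squeezer-interferometer sequence. This is a numerical sketch of the algebra, not the photonic circuit itself:

```python
import numpy as np

rng = np.random.default_rng(7)
d = 4
beta = rng.normal(size=(d, d))   # regression weight matrix to embed
x = rng.normal(size=d)           # one (qumode-encoded) data vector

# SVD: beta = O2 @ diag(sigma) @ O1, two orthogonal factors and a diagonal one
O2, sigma, O1 = np.linalg.svd(beta)

# Apply in sequence: interferometer (O1), squeezing (diag(sigma)), interferometer (O2)
out = O2 @ (np.diag(sigma) @ (O1 @ x))

print(np.allclose(out, beta @ x))  # True: the staged gates reproduce x @ beta
```

Orthogonal matrices are exactly what linear interferometers implement, and a diagonal scaling is what squeezing gates implement, which is why the SVD gives a gate-by-gate recipe for the linear part of the model.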
  • 18. Qumodes Parameter Settings
     The algorithm simulation was created through Strawberry Fields
      The deep learning framework already existed
      Hidden layers and bias terms were removed to collapse it to a generalized linear model framework
     The loss function optimized was mean squared error, which corresponds to the loss functions specified in the comparison algorithms
     Qumode cut-off dimension was set to 10
     Optimization via least squares was not available, so gradient descent was used with a learning rate of 0.1 over 80 iterations
      This gave a qumodes implementation of a quantum generalized linear model with a boosting feel to it
     Because the quantum computing component is inherently probabilistic, algorithms were run on the same training and test set multiple times to average out quantum effects
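The classical analogue of the optimizer described above (gradient descent on mean squared error, learning rate 0.1, 80 iterations) behaves like this on a plain linear model; the data and true weights here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 300, 4
X = rng.normal(size=(n, p))
w_true = np.array([1.0, -0.5, 0.3, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(p)
lr, iters = 0.1, 80                       # the settings reported for the qumode run
for _ in range(iters):
    grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of the MSE loss
    w -= lr * grad

mse = np.mean((X @ w - y) ** 2)
```

With these settings the weights converge essentially to the least-squares solution, so gradient descent is a workable substitute when a direct least-squares solver is unavailable on the hardware.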
  • 19. Results: Simulation of Overdispersion

    Algorithm            Scaled Model MSE
    Random Forest        0.80
    BART                 0.78
    Boosted Regression   0.78
    DGLARS               0.81
    HLASSO               0.81
    GLM                  0.81
    QGLM                 0.82
    Mean                 0.85

    QGLMs yield slightly worse prediction on the simulated dataset. However, their performance is not far off from state-of-the-art algorithms, and some random error is expected from the quantum machinery.
  • 20. Results: Forest Fire Dataset

    Algorithm            Scaled Model MSE
    Random Forest        0.125
    BART                 0.125
    Boosted Regression   0.119
    DGLARS               0.114
    HLASSO               0.120
    GLM                  0.119
    QGLM                 0.106
    Mean                 0.115

    QGLMs emerge as the best-performing algorithm on a difficult, real-world dataset (the Forest Fire dataset in the UCI repository). QGLMs provide ~10% gain over the next best algorithm on this dataset. This suggests that they work well on real data and difficult problems.
  • 21. Conclusions
     These results suggest that the qumodes formulation, with its unique operators, can eliminate the need for link functions within linear models by exploiting the geometry of the models and still give good prediction
      Better than state-of-the-art prediction for a difficult Tweedie regression dataset (UCI Forest Fire)
      Around state-of-the-art prediction for a simulated dataset
     This has the potential to bring statistical modeling into quantum computing by leveraging the connection between model geometry and the geometry of quantum physics
      Generalized estimating equations/generalized linear mixed models
      Structural equation models/hierarchical regression models
     Also a potential avenue through which to implement the homotopy continuation method common in dynamic systems research and some machine learning models (such as homotopy-based LASSO), which take a known problem’s solution and continuously deform it to fit the problem of interest
      Currently a computational challenge
      Limited to small datasets
  • 22. References
     Amari, S. I. (1997). Information geometry. Contemporary Mathematics, 203, 81-96.
     Bulmer, M. G. (1974). On fitting the Poisson lognormal distribution to species-abundance data. Biometrics, 101-110.
     Buscemi, F. (2012). Comparison of quantum statistical models: equivalent conditions for sufficiency. Communications in Mathematical Physics, 310(3), 625-647.
     Cortez, P., & Morais, A. D. J. R. (2007). A data mining approach to predict forest fires using meteorological data.
     De Jong, P., & Heller, G. Z. (2008). Generalized linear models for insurance data (Vol. 10). Cambridge: Cambridge University Press.
     Farrelly, C. M. (2017). KNN Ensembles for Tweedie Regression: The Power of Multiscale Neighborhoods. arXiv preprint arXiv:1708.02122.
     Farrelly, C. M. (2017). Topology and Geometry in Machine Learning for Logistic Regression.
     Fehm, L., Beesdo, K., Jacobi, F., & Fiedler, A. (2008). Social anxiety disorder above and below the diagnostic threshold: prevalence, comorbidity and impairment in the general population. Social Psychiatry and Psychiatric Epidemiology, 43(4), 257-265.
     Fergusson, D. M., Boden, J. M., & Horwood, L. J. (2006). Cannabis use and other illicit drug use: testing the cannabis gateway hypothesis. Addiction, 101(4), 556-569.
     Frees, E. W., Lee, G., & Yang, L. (2016). Multivariate frequency-severity regression models in insurance. Risks, 4(1), 4.
     Gardner, W., Mulvey, E. P., & Shaw, E. C. (1995). Regression analyses of counts and rates: Poisson, overdispersed Poisson, and negative binomial models. Psychological Bulletin, 118(3), 392.
     Herings, R., & Erkens, J. A. (2003). Increased suicide attempt rate among patients interrupting use of atypical antipsychotics. Pharmacoepidemiology and Drug Safety, 12(5), 423-424.
  • 23. References
     Jorgensen, B. (1997). The theory of dispersion models. CRC Press.
     Jørgensen, B., Goegebeur, Y., & Martínez, J. R. (2010). Dispersion models for extremes. Extremes, 13(4), 399-437.
     Killoran, N., Bromley, T. R., Arrazola, J. M., Schuld, M., Quesada, N., & Lloyd, S. (2018). Continuous-variable quantum neural networks. arXiv preprint arXiv:1806.06871.
     Killoran, N., Izaac, J., Quesada, N., Bergholm, V., Amy, M., & Weedbrook, C. (2018). Strawberry Fields: A Software Platform for Photonic Quantum Computing. arXiv preprint arXiv:1804.03159.
     Luitel, B. N. (2016). Prediction of North Atlantic tropical cyclone activity and rainfall (Doctoral dissertation, The University of Iowa).
     Marriott, P. (1990). Applications of differential geometry to statistics (Doctoral dissertation, University of Warwick).
     Mills, K. L., Teesson, M., Ross, J., & Peters, L. (2006). Trauma, PTSD, and substance use disorders: findings from the Australian National Survey of Mental Health and Well-Being. American Journal of Psychiatry, 163(4), 652-658.
     Nielsen, F., & Garcia, V. (2009). Statistical exponential families: A digest with flash cards. arXiv preprint arXiv:0911.4863.
     Osborne, M. R., Presnell, B., & Turlach, B. A. (2000). A new approach to variable selection in least squares problems. IMA Journal of Numerical Analysis, 20(3), 389-403.
     Pistone, G., & Rogantin, M. P. (1999). The exponential statistical manifold: mean parameters, orthogonality and space transformations. Bernoulli, 5(4), 721-760.
     Sawalha, Z., & Sayed, T. (2006). Traffic accident modeling: some statistical issues. Canadian Journal of Civil Engineering, 33(9), 1115-1124.
     Tweedie, M. C. K. (1984). An index which distinguishes between some important exponential families. In Statistics: Applications and New Directions: Proc. Indian Statistical Institute Golden Jubilee International Conference (Vol. 579, p. 604).

Editor's notes

  1. Jorgensen, B. (1997). The theory of dispersion models. CRC Press.
     Kendal, W. S., & Jørgensen, B. (2011). Tweedie convergence: A mathematical basis for Taylor's power law, 1/f noise, and multifractality. Physical Review E, 84(6), 066120.
  2. Marriott, P. (1990). Applications of differential geometry to statistics (Doctoral dissertation, University of Warwick).
     Nielsen, F., & Garcia, V. (2009). Statistical exponential families: A digest with flash cards. arXiv preprint arXiv:0911.4863.
     Amari, S. I. (1997). Information geometry. Contemporary Mathematics, 203, 81-96.
     Pistone, G., & Rogantin, M. P. (1999). The exponential statistical manifold: mean parameters, orthogonality and space transformations. Bernoulli, 5(4), 721-760.
  3. Jørgensen, B., & Martínez, J. R. (2013, March). Multivariate exponential dispersion models. In Multivariate Statistics: Theory and Applications. Proceedings of the IX Tartu Conference on Multivariate Statistics & XX International Workshop on Matrices and Statistics (pp. 73-98).
     Jorgensen, B. (1997). The theory of dispersion models. CRC Press.
     Sawalha, Z., & Sayed, T. (2006). Traffic accident modeling: some statistical issues. Canadian Journal of Civil Engineering, 33(9), 1115-1124.
     Jørgensen, B., Goegebeur, Y., & Martínez, J. R. (2010). Dispersion models for extremes. Extremes, 13(4), 399-437.
     Marriott, P. (1990). Applications of differential geometry to statistics (Doctoral dissertation, University of Warwick).