Numerical Methods and Statistics: Algorithms and Statistical Inference

Name of the Student
Name of the University
Author’s Note.

This report examines four numerical methods, describing how each method works, how its performance is evaluated, and how it relates to the field of statistical inference.
Root Finding Algorithms.
Root-finding problems take the form (Dey, 2015): find $x$ such that $f(x) = 0$; such an $x$ is a root of the function $f$. When evaluating root-finding methods, the running time and the number of iterations required to reach the root are the key considerations (Dey, 2015). The rate of convergence depends on the initial value supplied and may be linear, quadratic, or higher (Dey, 2015).
First, the bisection method (the most primitive method) applies the Intermediate Value Theorem: if a function $f$ is continuous on $[a, b]$ and $f(a)$ and $f(b)$ have opposite signs, so that $f(a)\,f(b) < 0$, then there is a root between $a$ and $b$. The error converges linearly, but at a slow rate.
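To make the bisection iteration concrete, here is a minimal Python sketch (an illustration, not code from the cited sources); the test function and bracket are arbitrary examples.

```python
import math

def bisection(f, a, b, tol=1e-10, max_iter=200):
    """Bisect [a, b] until the bracket is shorter than tol.
    Requires f(a) and f(b) to have opposite signs (Intermediate Value Theorem)."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = (a + b) / 2.0
        fm = f(m)
        if fm == 0.0 or (b - a) / 2.0 < tol:
            return m
        # Keep the half-interval whose endpoints still bracket the root.
        if fa * fm < 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return (a + b) / 2.0

# Example: the root of cos(x) - x on [0, 1].
print(bisection(lambda x: math.cos(x) - x, 0.0, 1.0))  # ~0.739085
```

Each iteration halves the bracket, which is exactly the linear (slow) convergence described above.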
The Newton-Raphson method is the most widely employed root-finding formula. It is derived from the Taylor series expansion
$$f(x_{n+1}) = \sum_{k=0}^{\infty} \frac{f^{(k)}(x_n)}{k!}\,(x_{n+1} - x_n)^k,$$
which is approximated by truncation, i.e. $f(x_{n+1}) \approx f(x_n) + f^{(1)}(x_n)(x_{n+1} - x_n) = 0$ at the $x$-axis (Chapra & Canale, 2012; Dey, 2015). Solving for $x_{n+1}$ gives
$$x_{n+1} = x_n - \frac{f(x_n)}{f^{(1)}(x_n)}.$$
The errors converge quadratically (Dey, 2015), which implies that its rate of convergence is faster than that of the bisection method, hence its effectiveness. The choice of initial guess determines whether the iteration converges.
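A minimal Python sketch of this update rule (the test problem and starting point are illustrative assumptions):

```python
import math

def newton_raphson(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n) / f'(x_n) until the step is below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("no convergence; try a different initial guess")

# Same test problem as above: cos(x) - x = 0.
print(newton_raphson(lambda x: math.cos(x) - x,
                     lambda x: -math.sin(x) - 1.0,
                     x0=0.5))  # ~0.739085, reached in far fewer iterations
```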
Optimization.
Furthermore, optimization by the genetic algorithm (GA) is based on the process of natural selection and solves both constrained and unconstrained optimization problems (Genetic Algorithm - MATLAB & Simulink, n.d.). Discontinuous, nondifferentiable, stochastic, or nonlinear problems can also be optimized by the genetic algorithm (Genetic Algorithm - MATLAB & Simulink, n.d.). Assessing its performance relies on the following (a toy sketch follows the list):
- The number of objective and constraint function evaluations.
- The number of iterations (speed).
- The accuracy of solutions (and whether it is commensurate with the computational effort).
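The MATLAB toolbox cited above provides a full implementation; as a language-neutral illustration, here is a toy real-coded GA in Python. All function names and tuning parameters are assumptions of this sketch, not the MATLAB API.

```python
import numpy as np

def genetic_minimize(obj, lo, hi, pop_size=50, n_gen=100,
                     mutation_sd=0.1, elite_frac=0.2, seed=0):
    """Toy real-coded genetic algorithm: selection of the fittest,
    crossover by averaging two parents, and Gaussian mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
    n_elite = max(2, int(elite_frac * pop_size))
    for _ in range(n_gen):
        fitness = np.apply_along_axis(obj, 1, pop)
        elite = pop[np.argsort(fitness)[:n_elite]]            # natural selection
        pa = elite[rng.integers(n_elite, size=pop_size)]
        pb = elite[rng.integers(n_elite, size=pop_size)]
        children = (pa + pb) / 2.0                            # crossover
        children += rng.normal(0.0, mutation_sd, children.shape)  # mutation
        pop = np.clip(children, lo, hi)
        pop[0] = elite[0]                                     # elitism: keep the best
    fitness = np.apply_along_axis(obj, 1, pop)
    return pop[np.argmin(fitness)]

# Example: minimize the Rosenbrock function over [-2, 2]^2.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
print(genetic_minimize(rosen, np.full(2, -2.0), np.full(2, 2.0)))  # near (1, 1)
```

Counting the calls to `obj` and the generations run gives exactly the evaluation and iteration counts listed above.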
Numerical Integration.
Turning to numerical integration (quadrature: the rectangle rule and the trapezoidal rule), these algorithms are closely related to the numerical solution of differential equations (Press et al., 2007). Their direct application is to one-dimensional integrals, i.e., approximating $\int_a^b f(x)\,dx$ to a given degree of accuracy. The midpoint (rectangle) rule,
$$\int_a^b f(x)\,dx \approx (b - a)\, f\!\left(\frac{a+b}{2}\right),$$
is the simplest method; it uses an interpolating polynomial of degree zero, a horizontal line through the midpoint $\left(\frac{a+b}{2}, f\left(\frac{a+b}{2}\right)\right)$ (Chapra & Canale, 2012; Press et al., 2007). The trapezoidal rule instead uses the affine interpolating polynomial that passes through the points $(a, f(a))$ and $(b, f(b))$:
$$\int_a^b f(x)\,dx \approx (b - a)\,\frac{f(a) + f(b)}{2}.$$
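Both rules translate directly into code; a short sketch (the test integrand is an illustrative assumption):

```python
def midpoint_rule(f, a, b):
    """(b - a) * f((a + b) / 2): degree-zero interpolation at the midpoint."""
    return (b - a) * f((a + b) / 2.0)

def trapezoidal_rule(f, a, b):
    """(b - a) * (f(a) + f(b)) / 2: affine interpolation through the endpoints."""
    return (b - a) * (f(a) + f(b)) / 2.0

# Example: integrate x^2 on [0, 1]; the exact value is 1/3.
f = lambda x: x * x
print(midpoint_rule(f, 0.0, 1.0))     # 0.25 (underestimates a convex f)
print(trapezoidal_rule(f, 0.0, 1.0))  # 0.50 (overestimates a convex f)
```

In practice both rules are applied over many subintervals (composite rules) to reach a desired accuracy.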
Monte Carlo Integration.
Finally, for very high-dimensional problems, Monte Carlo (MC) methods become more efficient. This computer-based analysis uses statistical sampling techniques to solve models and estimate their parameters (Firestone et al., 1997). The error of MC
integration is usually independent of the dimension $d$ and scales as $1/\sqrt{N}$, where $N$ is the number of samples. Inverse transform sampling, one of the basic MC methods, generates random numbers from any probability distribution by applying the inverse of the cumulative distribution function to uniform draws.
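For example, a minimal sketch for the exponential distribution, whose CDF inverts in closed form (the distribution choice is an illustrative assumption):

```python
import math
import random

def sample_exponential(rate, n, seed=0):
    """Inverse transform sampling for Exp(rate): the CDF is
    F(x) = 1 - exp(-rate * x), so F^{-1}(u) = -ln(1 - u) / rate."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

draws = sample_exponential(rate=2.0, n=100_000)
print(sum(draws) / len(draws))  # ~0.5, i.e. the mean 1 / rate
```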
Acceptance-rejection sampling draws candidate points over an interval $x \in [a, b]$ covering the domain of the PDF $p(x)$ and accepts each candidate with probability proportional to $p(x)$. This can be applied in inferential statistics when building confidence intervals and when generating samples from populations. Markov Chain Monte Carlo (MCMC) improves on acceptance-rejection sampling. Convergence diagnostics are an important attribute of the Metropolis-Hastings algorithm (Casella et al., 2004; Haugh, n.d.). Gibbs sampling is a development of Metropolis-Hastings sampling.
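A minimal sketch of acceptance-rejection over a bounded interval (the target density and its bound are illustrative assumptions):

```python
import random

def rejection_sample(pdf, a, b, pdf_max, n, seed=0):
    """Acceptance-rejection on [a, b]: propose x uniformly, then accept it
    with probability pdf(x) / pdf_max, where pdf_max bounds pdf on [a, b]."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.uniform(a, b)
        if rng.uniform(0.0, pdf_max) < pdf(x):
            out.append(x)
    return out

# Example: sample p(x) = 1.5 * x^2 on [-1, 1], whose maximum is 1.5.
draws = rejection_sample(lambda x: 1.5 * x * x, -1.0, 1.0, 1.5, 10_000)
print(sum(d * d for d in draws) / len(draws))  # ~0.6 = E[X^2] for this density
```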
Relation to Statistical Inference.
In relation to statistical inference, Newton's method and bisection are applied in computing parameter estimates $\hat{\theta}$, especially for nonlinear models. OLS estimators can also be obtained by the Newton-Raphson method (Vajda, 1947).
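As an illustration of how Newton's method computes such a $\hat{\theta}$, here is a sketch for the Cauchy location parameter, whose MLE has no closed form; the model and the median starting value are assumptions of the sketch, not taken from the cited sources.

```python
import numpy as np

def cauchy_mle_newton(x, theta0, tol=1e-10, max_iter=100):
    """Newton-Raphson on the score equation l'(theta) = 0 for the
    location parameter of a Cauchy sample."""
    theta = theta0
    for _ in range(max_iter):
        u = x - theta
        score = np.sum(2.0 * u / (1.0 + u**2))                # l'(theta)
        hess = np.sum(2.0 * (u**2 - 1.0) / (1.0 + u**2)**2)   # l''(theta)
        step = score / hess
        theta -= step
        if abs(step) < tol:
            break
    return theta

rng = np.random.default_rng(0)
sample = rng.standard_cauchy(200) + 3.0   # true location is 3
# The median is a sensible starting point; a poor one can send Newton astray,
# echoing the dependence on initial guesses noted earlier.
print(cauchy_mle_newton(sample, theta0=np.median(sample)))
```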
Looking at optimization, linear and logistic regression models employ the GA to obtain parameter estimates, owing to its randomness (Chatterjee et al., 1996; Tang et al., 1996). Also, GAs aided by bootstrap methods are useful for building robust criteria for estimators (Chatterjee et al., 1996). Procedures of a discrete nature, such as CART, clustering, variable selection, and order of entry, can be carried out more efficiently by genetic algorithms.
In statistical inference, quadrature methods for integrating over a finite interval $[a, b]$ are usually simple to use and show enormous improvements for smooth functions. Techniques such as Bayesian quadrature perform integration from a statistical perspective. Monte Carlo integration is a fully experimental field and hence already requires statistics. In Bayesian modelling, sampling from the posterior distribution uses
Metropolis-Hastings and Gibbs sampling. Bayesian credible intervals are also a development of MC integration (Casella et al., 2004; Haugh, n.d.).
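A minimal random-walk Metropolis-Hastings sketch, with a credible interval read off the chain; the target posterior and tuning constants are illustrative assumptions.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, proposal_sd=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, sd^2), accept with
    probability min(1, post(x') / post(x)); the proposal is symmetric."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_samples):
        x_new = x + rng.gauss(0.0, proposal_sd)
        lp_new = log_post(x_new)
        if rng.random() < math.exp(min(0.0, lp_new - lp)):  # accept/reject step
            x, lp = x_new, lp_new
        chain.append(x)
    return chain

# Example: a standard normal "posterior", given as a log density.
chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=50_000)
burned = sorted(chain[5_000:])   # drop burn-in, then sort for quantiles
mean = sum(burned) / len(burned)
ci = (burned[int(0.025 * len(burned))], burned[int(0.975 * len(burned))])
print(mean, ci)  # ~0 and roughly (-1.96, 1.96), a 95% credible interval
```

Inspecting the trace of `chain` is the simplest of the convergence diagnostics mentioned above.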
Conclusion.
When evaluating algorithm performance, the results carry real implications; most importantly, computational resources must be used efficiently. These numerical methods have many applications in the field of inferential statistics.
References.
Casella, G., Robert, C. P., & Wells, M. T. (2004). Generalized Accept-Reject sampling schemes.
Institute of Mathematical Statistics. https://doi.org/10.1214/lnms/1196285403
Chapra, S. C., & Canale, R. P. (2012). Mathematical modelling and engineering problem solving. In Numerical Methods for Engineers. https://doi.org/10.13140/RG.2.1.3966.4405
Chatterjee, S., Laudato, M., & Lynch, L. A. (1996). Genetic algorithms and their statistical
applications: an introduction. Computational Statistics and Data Analysis, 22(6), 633–651.
https://doi.org/10.1016/0167-9473(96)00011-4
Dey, A. (2015). Mathematical model formulation and comparison study of various methods of root-finding problems. IOSR Journal of Mathematics, 11(2) Ver. III, 64–71.
https://doi.org/10.9790/5728-11236471
Firestone, M., Fenner-Crisp, P., Barry, T., Bennett, D., Chang, S., Callahan, M., Burke, A., Barnes, D., Wood, W. P., & Knott, S. M. (1997). Guiding principles for Monte Carlo analysis (Technical Panel report). Risk Assessment Forum, U.S. Environmental Protection Agency.
Genetic Algorithm - MATLAB & Simulink. (n.d.). Retrieved April 6, 2020, from
https://www.mathworks.com/discovery/genetic-algorithm.html
Haugh, M. (n.d.). IEOR E4703: Monte-Carlo Simulation.
Press, W. H., Teukolsky, S. A., Vetterling, W. T., & Flannery, B. P. (2007). Numerical Recipes: The Art of Scientific Computing (3rd ed.). Cambridge University Press. http://apps.nrbook.com/empanel/index.html?pg=155
Tang, K. S., Man, K. F., Kwong, S., & He, Q. (1996). Genetic algorithms and their applications.
IEEE Signal Processing Magazine, 13(6), 22–37. https://doi.org/10.1109/79.543973
Vajda, S. (1947). Statistical methods. Nature, 160(4073), 724. https://doi.org/10.1038/160724a0