Statistics

New nonlinear least squares solvers in R with {gslnls}

Introduction Solving a nonlinear least squares problem consists of minimizing a least squares objective function made up of residuals \(g_1(\boldsymbol{\theta}), \ldots, g_n(\boldsymbol{\theta})\) that are nonlinear functions of the parameters of interest \(\boldsymbol{\theta} = (\theta_1,\ldots, \theta_p)'\): \[ \min_{\boldsymbol{\theta} \in \mathbb{R}^p}\ \frac{1}{2} \sum_{i=1}^{n} g_i(\boldsymbol{\theta})^2 \]
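As a minimal sketch of this setup (not taken from the post; the exponential-decay model and data below are invented for illustration, fitted here with base R's `nls()`):

```r
## Invented exponential-decay model: y = A * exp(-lambda * x) + noise
set.seed(1)
x <- seq(0, 5, length.out = 50)
y <- 2.5 * exp(-1.3 * x) + rnorm(50, sd = 0.05)

## Residuals g_i(theta) = y_i - f(x_i, theta); nls() minimizes sum(g_i^2)
fit <- nls(y ~ A * exp(-lambda * x), start = list(A = 1, lambda = 1))
coef(fit)          # estimated parameters theta-hat
sum(resid(fit)^2)  # minimized residual sum of squares
```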

GSL nonlinear least squares fitting in R

Introduction The new gslnls-package provides R bindings to nonlinear least-squares optimization with the GNU Scientific Library (GSL) using the trust region methods implemented by the gsl_multifit_nlinear module. The gsl_multifit_nlinear module was added in GSL version 2.2.
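A hedged sketch of a {gslnls} fit, reusing the invented exponential model from above; this assumes the package's `gsl_nls()` function with its `nls()`-like formula interface and an `algorithm` argument selecting the GSL trust region method:

```r
## install.packages("gslnls")  # assumes the package is installed from CRAN
library(gslnls)

set.seed(1)
df <- data.frame(x = seq(0, 5, length.out = 50))
df$y <- 2.5 * exp(-1.3 * df$x) + rnorm(50, sd = 0.05)

## algorithm = "lm" selects GSL's Levenberg-Marquardt trust region method
fit <- gsl_nls(y ~ A * exp(-lambda * x),
               data = df,
               start = c(A = 1, lambda = 1),
               algorithm = "lm")
summary(fit)
```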

Asymptotic confidence intervals for NLS regression in R

Introduction As a model setup, we consider noisy observations \(y_1,\ldots, y_n \in \mathbb{R}\) obtained from a standard nonlinear regression model of the form: \[ y_i \ = \ f(\boldsymbol{x}_i, \boldsymbol{\theta}) + \epsilon_i, \quad i = 1,\ldots, n \] where \(f: \mathbb{R}^k \times \mathbb{R}^p \to \mathbb{R}\) is a known nonlinear function of the independent variables \(\boldsymbol{x}_1,\ldots,\boldsymbol{x}_n \in \mathbb{R}^k\) and the unknown parameter vector \(\boldsymbol{\theta} \in \mathbb{R}^p\) that we aim to estimate.
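A minimal sketch of asymptotic (Wald-type) confidence intervals for an `nls()` fit, again using the invented model from the first example; the intervals are \(\hat{\theta}_j \pm t_{1-\alpha/2, n-p} \cdot \widehat{se}(\hat{\theta}_j)\) with standard errors from the estimated asymptotic covariance matrix:

```r
set.seed(1)
x <- seq(0, 5, length.out = 50)
y <- 2.5 * exp(-1.3 * x) + rnorm(50, sd = 0.05)
fit <- nls(y ~ A * exp(-lambda * x), start = list(A = 1, lambda = 1))

est <- coef(fit)
se  <- sqrt(diag(vcov(fit)))          # asymptotic standard errors
tq  <- qt(0.975, df = df.residual(fit))
cbind(lower = est - tq * se, upper = est + tq * se)  # 95% Wald intervals
```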

Step function regression in Stan

Introduction The aim of this post is to provide a working approach to perform piecewise constant or step function regression in Stan. To set up the regression problem, consider noisy observations \(y_1, \ldots, y_n \in \mathbb{R}\) sampled from a standard signal plus i.i.d. noise model.
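A minimal R sketch of this data-generating setup (the breakpoints, levels, and noise scale are invented for illustration, not taken from the post):

```r
## Piecewise constant (step function) signal plus i.i.d. Gaussian noise
set.seed(1)
n  <- 100
x  <- seq(0, 1, length.out = n)
mu <- ifelse(x < 0.3, 0, ifelse(x < 0.7, 2, -1))  # invented step signal
y  <- mu + rnorm(n, sd = 0.5)                     # i.i.d. noise

plot(x, y)
lines(x, mu, col = "red", lwd = 2)  # true underlying step function
```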

Demystifying Stein's Paradox

Stein’s paradox Stein’s example, perhaps better known as Stein’s paradox, is a classic result in statistics demonstrating how shrinkage can reduce the mean squared error (\(L_2\)-risk) of a multivariate estimator relative to classical (unbiased) estimators, such as the maximum likelihood estimator.
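A hedged simulation sketch of this effect for the standard setting \(y \sim N(\boldsymbol{\theta}, I_p)\) with \(p \geq 3\), comparing the MLE \(\hat{\theta} = y\) against the James-Stein estimator \(\hat{\theta}_{JS} = (1 - (p-2)/\|y\|^2)\, y\) (the true mean vector below is invented):

```r
set.seed(1)
p     <- 10
theta <- rnorm(p)   # invented true mean vector
nsim  <- 1e4
mse_mle <- mse_js <- numeric(nsim)
for (s in seq_len(nsim)) {
  y  <- theta + rnorm(p)                  # one draw y ~ N(theta, I_p)
  js <- (1 - (p - 2) / sum(y^2)) * y      # James-Stein shrinkage toward 0
  mse_mle[s] <- sum((y  - theta)^2)
  mse_js[s]  <- sum((js - theta)^2)
}
mean(mse_mle)  # approximately p, the risk of the MLE
mean(mse_js)   # strictly smaller total squared-error risk
```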