Bayesian News Feeds

Informative g-Priors for Logistic Regression

Timothy E. Hanson, Adam J. Branscum, Wesley O. Johnson
Categories: Bayesian Analysis

Cluster Analysis, Model Selection, and Prior Distributions on Models

George Casella, Elias Moreno and F. J. Girón
Categories: Bayesian Analysis

Toward Rational Social Decisions: A Review and Some Results

Joseph B. Kadane and Steven N. MacEachern
Categories: Bayesian Analysis

Bayesian Analysis, Volume 9, Number 2 (2014)

Bayesian Analysis (Latest Issue) - Tue, 2014-05-27 08:46

Contents:

Zhihua Zhang, Dakan Wang, Guang Dai, Michael I. Jordan. Matrix-Variate Dirichlet Process Priors with Applications. 259--286.

Nammam Ali Azadi, Paul Fearnhead, Gareth Ridall, Joleen H. Blok. Bayesian Sequential Experimental Design for Binary Response Data with Application to Electromyographic Experiments. 287--306.

Juhee Lee, Steven N. MacEachern, Yiling Lu, Gordon B. Mills. Local-Mass Preserving Prior Distributions for Nonparametric Bayesian Models. 307--330.

Ruitao Liu, Arijit Chakrabarti, Tapas Samanta, Jayanta K. Ghosh, Malay Ghosh. On Divergence Measures Leading to Jeffreys and Other Reference Priors. 331--370.

Xin-Yuan Song, Jing-Heng Cai, Xiang-Nan Feng, Xue-Jun Jiang. Bayesian Analysis of the Functional-Coefficient Autoregressive Heteroscedastic Model. 371--396.

Yu Ryan Yue, Daniel Simpson, Finn Lindgren, Håvard Rue. Bayesian Adaptive Smoothing Splines Using Stochastic Differential Equations. 397--424.

Jaakko Riihimäki, Aki Vehtari. Laplace Approximation for Logistic Gaussian Process Density Estimation and Regression. 425--448.

Fei Liu, Sounak Chakraborty, Fan Li, Yan Liu, Aurelie C. Lozano. Bayesian Regularization via Graph Laplacian. 449--474.

Catia Scricciolo. Adaptive Bayesian Density Estimation in $L^{p}$-metrics with Pitman-Yor or Normalized Inverse-Gaussian Process Kernel Mixtures. 475--520.

Categories: Bayesian Analysis

Matrix-Variate Dirichlet Process Priors with Applications

Zhihua Zhang, Dakan Wang, Guang Dai, Michael I. Jordan.

Source: Bayesian Analysis, Volume 9, Number 2, 259--286.

Abstract:
In this paper we propose a matrix-variate Dirichlet process (MATDP) for modeling the joint prior of a set of random matrices. Our approach is able to share statistical strength among regression coefficient matrices due to the clustering property of the Dirichlet process. Moreover, since the base probability measure is defined as a matrix-variate distribution, the dependence among the elements of each random matrix is described via the matrix-variate distribution. We apply MATDP to multivariate supervised learning problems. In particular, we devise a nonparametric discriminative model and a nonparametric latent factor model. The interest is in considering correlations both across response variables (or covariates) and across response vectors. We derive Markov chain Monte Carlo algorithms for posterior inference and prediction, and illustrate the application of the models to multivariate regression, multi-class classification and multi-label prediction problems.
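The pooling mechanism the abstract relies on, a Dirichlet process clustering a set of random matrices, can be sketched with the Chinese restaurant process. This is an illustrative sketch, not the authors' MATDP sampler: the base measure is simplified to i.i.d. standard-normal entries, and the function name is hypothetical.

```python
import numpy as np

def crp_matrix_draws(n, alpha, shape, rng):
    """Draw n random matrices that share values through a Chinese restaurant
    process partition, the clustering device that lets a Dirichlet process
    prior pool statistical strength across coefficient matrices."""
    assignments = []          # cluster label for each of the n matrices
    atoms = []                # one matrix per cluster, drawn from the base measure
    for i in range(n):
        counts = np.bincount(assignments) if assignments else np.array([])
        # probability of joining an existing cluster vs. opening a new one
        probs = np.append(counts, alpha).astype(float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(atoms):   # new cluster: fresh matrix from the base measure
            atoms.append(rng.standard_normal(shape))
        assignments.append(k)
    return assignments, atoms

rng = np.random.default_rng(0)
z, atoms = crp_matrix_draws(n=20, alpha=1.0, shape=(3, 2), rng=rng)
# typically far fewer distinct matrices than draws: the 20 "coefficient
# matrices" are exact copies of a small number of shared atoms
```

Matrices assigned to the same cluster are literally identical here; in the paper the clustering instead ties together regression problems that share a coefficient matrix drawn from a matrix-variate base distribution.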

Categories: Bayesian Analysis

Bayesian Sequential Experimental Design for Binary Response Data with Application to Electromyographic Experiments

Nammam Ali Azadi, Paul Fearnhead, Gareth Ridall, Joleen H. Blok.

Source: Bayesian Analysis, Volume 9, Number 2, 287--306.

Abstract:
We develop a sequential Monte Carlo approach for Bayesian analysis of the experimental design for binary response data. Our work is motivated by surface electromyographic (SEMG) experiments, which can be used to provide information about the functionality of subjects’ motor units. These experiments involve a series of stimuli being applied to a motor unit, with whether or not the motor unit fires for each stimulus being recorded. The aim is to learn how the probability of firing depends on the applied stimulus (the so-called stimulus-response curve). One such excitability parameter is the stimulus level at which the motor unit has a 50% chance of firing. Within such an experiment we are able to choose the next stimulus level based on the past observations. We show how sequential Monte Carlo can be used to analyse such data in an online manner. We then use the current estimate of the posterior distribution to choose the next stimulus level. The aim is to select a stimulus level that minimises the expected loss in estimating a quantity, or quantities, of interest. We apply this loss function to the estimation of target quantiles of the stimulus-response curve. Through simulation we show that this approach is more efficient than existing sequential design methods in terms of estimating the quantile(s) of interest. If applied in practice, it could reduce the length of SEMG experiments by a factor of three.
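The online updating and adaptive stimulus choice can be illustrated with a simple grid posterior rather than the paper's sequential Monte Carlo. Everything below is a simplifying assumption: a logistic stimulus-response curve with known slope, a flat prior on the 50% threshold, and a myopic posterior-mean design rule in place of the authors' loss-based criterion.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
true_m, width = 5.0, 1.0                 # true 50%-firing stimulus, known slope
grid = np.linspace(0.0, 10.0, 201)       # grid posterior over the threshold m
log_post = np.zeros_like(grid)           # flat prior on the grid

stimulus = 2.0                           # arbitrary first stimulus
for _ in range(40):
    # simulate a binary firing response at the current stimulus level
    fired = rng.random() < sigmoid((stimulus - true_m) / width)
    like = sigmoid((stimulus - grid) / width)     # P(fire | m) for each grid m
    log_post += np.log(like if fired else 1.0 - like)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    # myopic design rule: place the next stimulus at the posterior mean of m,
    # near where a binary response is most informative about the 50% point
    stimulus = float(np.sum(grid * post))

estimate = float(np.sum(grid * post))    # posterior-mean estimate of the threshold
```

Because each new stimulus is steered toward the current threshold estimate, the design concentrates trials where they carry the most information, which is the intuition behind the paper's reported factor-of-three reduction in experiment length.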

Categories: Bayesian Analysis

Local-Mass Preserving Prior Distributions for Nonparametric Bayesian Models

Juhee Lee, Steven N. MacEachern, Yiling Lu, Gordon B. Mills.

Source: Bayesian Analysis, Volume 9, Number 2, 307--330.

Abstract:
We address the problem of prior specification for models involving the two-parameter Poisson-Dirichlet process. These models are sometimes partially subjectively specified and are always partially (or fully) specified by a rule. We develop prior distributions based on local mass preservation. The robustness of posterior inference to an arbitrary choice of overdispersion under the proposed and current priors is investigated. Two examples are provided to demonstrate the properties of the proposed priors. We focus on the three major types of inference: clustering of the parameters of interest, estimation and prediction. The new priors are found to provide more stable inference about clustering than traditional priors while showing few drawbacks. Furthermore, it is shown that more stable clustering results in more stable inference for estimation and prediction. We recommend the local-mass preserving priors as a replacement for the traditional priors.
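The prior on clusterings induced by the two-parameter Poisson-Dirichlet process can be sketched through its sequential prediction rule. This is an illustration of the process itself, not of the local-mass preserving construction; the function name and parameter values are my own.

```python
import numpy as np

def py_num_clusters(n, discount, strength, rng):
    """Number of clusters among n draws under the two-parameter
    Poisson-Dirichlet (Pitman-Yor) prediction rule."""
    counts = []                                   # current cluster sizes
    for i in range(n):
        total = strength + i
        # existing cluster j gets (n_j - discount) / total;
        # a new cluster gets (strength + discount * K) / total
        probs = [(c - discount) / total for c in counts]
        probs.append((strength + discount * len(counts)) / total)
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)
        else:
            counts[k] += 1
    return len(counts)

rng = np.random.default_rng(2)
k_dp = np.mean([py_num_clusters(100, 0.0, 1.0, rng) for _ in range(100)])
k_py = np.mean([py_num_clusters(100, 0.5, 1.0, rng) for _ in range(100)])
# a positive discount puts much more prior mass on large cluster counts
```

The gap between the two averages shows how strongly the discount parameter shapes prior beliefs about clustering, which is why the paper's stability of clustering inference under the choice of these hyperparameters matters.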

Categories: Bayesian Analysis

On Divergence Measures Leading to Jeffreys and Other Reference Priors

Ruitao Liu, Arijit Chakrabarti, Tapas Samanta, Jayanta K. Ghosh, Malay Ghosh.

Source: Bayesian Analysis, Volume 9, Number 2, 331--370.

Abstract:
The paper presents new measures of divergence between prior and posterior which are maximized by the Jeffreys prior. We provide two methods for proving this, one of which provides an easy-to-verify sufficient condition. We use such divergences to measure the information in a prior and also to obtain new objective priors outside the class of Bernardo’s reference priors.
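The Jeffreys prior that these divergences single out has a closed form in the simplest case. As a minimal worked example (standard textbook material, not taken from the paper), for a Bernoulli(p) likelihood the Fisher information is I(p) = 1/(p(1-p)), so the Jeffreys prior is proportional to sqrt(I(p)), i.e. the Beta(1/2, 1/2) distribution:

```python
import math
import numpy as np

# Jeffreys prior for a Bernoulli(p) likelihood: pi(p) ∝ sqrt(I(p)) with
# Fisher information I(p) = 1/(p(1-p)); the normalising constant of the
# resulting Beta(1/2, 1/2) density is pi.
p = np.linspace(0.005, 0.995, 199)
fisher = 1.0 / (p * (1.0 - p))
jeffreys = np.sqrt(fisher) / math.pi

# Riemann check: nearly all of the unit mass lies on the interior grid
mass = float(np.sum(jeffreys) * (p[1] - p[0]))
```

The density is smallest at p = 1/2 and blows up at the endpoints, reflecting that extreme values of p are where the data are most informative per observation.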

Categories: Bayesian Analysis

Bayesian Analysis of the Functional-Coefficient Autoregressive Heteroscedastic Model

Xin-Yuan Song, Jing-Heng Cai, Xiang-Nan Feng, Xue-Jun Jiang.

Source: Bayesian Analysis, Volume 9, Number 2, 371--396.

Abstract:
In this paper, we propose a new model called the functional-coefficient autoregressive heteroscedastic (FARCH) model for nonlinear time series. The FARCH model extends the existing functional-coefficient autoregressive models and double-threshold autoregressive heteroscedastic models by providing a flexible framework for the detection of nonlinear features for both the conditional mean and conditional variance. We propose a Bayesian approach, along with the Bayesian P-splines technique and Markov chain Monte Carlo algorithm, to estimate the functional coefficients and unknown parameters of the model. We also conduct model comparison via the Bayes factor. The performance of the proposed methodology is evaluated via a simulation study. A real data set derived from the daily S&P 500 Composite Index is used to illustrate the methodology.
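The kind of nonlinear time series the FARCH model targets can be simulated directly. The functional forms below are arbitrary choices for illustration, not the authors' specification: a smooth functional AR coefficient in the conditional mean and a smooth function driving the conditional standard deviation.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
y = np.zeros(T)

def f(u):                          # smooth functional AR coefficient (assumed form)
    return 0.5 * np.tanh(u)

def sig(u):                        # smooth conditional standard deviation (assumed)
    return 0.2 + 0.3 * np.exp(-u ** 2)

# functional-coefficient AR(1) with heteroscedastic errors:
# y_t = f(y_{t-1}) * y_{t-1} + sig(y_{t-1}) * eps_t
for t in range(1, T):
    y[t] = f(y[t - 1]) * y[t - 1] + sig(y[t - 1]) * rng.standard_normal()
```

In the paper the unknown functions playing the roles of f and sig are estimated with Bayesian P-splines and MCMC rather than assumed known, and model comparison is carried out via the Bayes factor.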

Categories: Bayesian Analysis

Bayesian Adaptive Smoothing Splines Using Stochastic Differential Equations

Yu Ryan Yue, Daniel Simpson, Finn Lindgren, Håvard Rue.

Source: Bayesian Analysis, Volume 9, Number 2, 397--424.

Abstract:
The smoothing spline is one of the most popular curve-fitting methods, partly because of empirical evidence supporting its effectiveness and partly because of its elegant mathematical formulation. However, there are two obstacles that restrict the use of the smoothing spline in practical statistical work. Firstly, it becomes computationally prohibitive for large data sets because the number of basis functions roughly equals the sample size. Secondly, its global smoothing parameter can only provide a constant amount of smoothing, which often results in poor performance when estimating inhomogeneous functions. In this work, we introduce a class of adaptive smoothing spline models that is derived by solving certain stochastic differential equations with finite element methods. The solution extends the smoothing parameter to a continuous data-driven function, which is able to capture the change of the smoothness of the underlying process. The new model is Markovian, which makes Bayesian computation fast. A simulation study and a real-data example are presented to demonstrate the effectiveness of our method.
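The non-adaptive starting point can be sketched in a few lines: discretising the SDE prior "f'' = white noise" on a grid yields a second-difference penalty, and the posterior mean solves a banded (hence Markov) linear system. This is the global-smoothing baseline the paper improves on, with a dense solve used here for brevity and an arbitrary smoothing parameter.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
x = np.linspace(0.0, 1.0, n)
truth = np.sin(2 * np.pi * x)
y = truth + 0.3 * rng.standard_normal(n)

# Second-difference operator: the discrete analogue of d^2/dx^2, so the
# penalty lam * ||D f||^2 is a discretised Wiener-process (SDE) prior on f.
D = np.diff(np.eye(n), n=2, axis=0)     # (n-2) x n matrix
lam = 50.0                               # single global smoothing parameter
fhat = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
```

The system matrix is band-limited, which is what makes the Markov structure computationally cheap; the paper's contribution is to replace the constant lam with a data-driven function of x, so the amount of smoothing adapts locally.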

Categories: Bayesian Analysis

Laplace Approximation for Logistic Gaussian Process Density Estimation and Regression

Jaakko Riihimäki, Aki Vehtari.

Source: Bayesian Analysis, Volume 9, Number 2, 425--448.

Abstract:
Logistic Gaussian process (LGP) priors provide a flexible alternative for modelling unknown densities. The smoothness properties of the density estimates can be controlled through the prior covariance structure of the LGP, but the challenge is the analytically intractable inference. In this paper, we present approximate Bayesian inference for LGP density estimation in a grid using Laplace’s method to integrate over the non-Gaussian posterior distribution of latent function values and to determine the covariance function parameters with type-II maximum a posteriori (MAP) estimation. We demonstrate that Laplace’s method with MAP is sufficiently fast for practical interactive visualisation of 1D and 2D densities. Our experiments with simulated and real 1D data sets show that the estimation accuracy is close to a Markov chain Monte Carlo approximation and state-of-the-art hierarchical infinite Gaussian mixture models. We also construct a reduced-rank approximation to speed up the computations for dense 2D grids, and demonstrate density regression with the proposed Laplace approach.
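The core device, Laplace's method, is easy to demonstrate in one dimension. The example below is generic (a Gamma kernel with known normalising constant, not the LGP posterior): Newton's method locates the mode, and a Gaussian matched at the mode approximates the intractable integral.

```python
import math

# Laplace's method in 1D: approximate the integral of exp(logp) by a
# Gaussian centred at the mode with curvature -logp''(mode).
a, b = 10.0, 2.0
logp = lambda t: a * math.log(t) - b * t        # unnormalised Gamma(a+1, b) kernel
grad = lambda t: a / t - b
hess = lambda t: -a / t ** 2

t = 1.0
for _ in range(50):                              # Newton iterations to the mode a/b
    t = t - grad(t) / hess(t)

# integral ≈ exp(logp(mode)) * sqrt(2*pi / -logp''(mode))
laplace = math.exp(logp(t)) * math.sqrt(2.0 * math.pi / -hess(t))
exact = math.gamma(a + 1.0) / b ** (a + 1.0)     # true normalising constant
```

Here the Gaussian approximation is accurate to about one percent. The paper applies the multivariate version of the same idea to the latent function values on a grid, which is what makes interactive-speed density estimation feasible.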

Categories: Bayesian Analysis

Bayesian Regularization via Graph Laplacian

Fei Liu, Sounak Chakraborty, Fan Li, Yan Liu, Aurelie C. Lozano.

Source: Bayesian Analysis, Volume 9, Number 2, 449--474.

Abstract:
Regularization plays a critical role in modern statistical research, especially in high-dimensional variable selection problems. Existing Bayesian methods usually assume independence between variables a priori. In this article, we propose a novel Bayesian approach, which explicitly models the dependence structure through a graph Laplacian matrix. We also generalize the graph Laplacian to allow both positively and negatively correlated variables. A prior distribution for the graph Laplacian is then proposed, which allows conjugacy and thereby greatly simplifies the computation. We show that the proposed Bayesian model leads to a proper posterior distribution. Connections are made between our method and existing regularization methods, such as the Elastic Net, Lasso, Octagonal Shrinkage and Clustering Algorithm for Regression (OSCAR) and Ridge regression. An efficient Markov chain Monte Carlo method based on parameter augmentation is developed for posterior computation. Finally, we demonstrate the method through several simulation studies and an application to a real data set involving key performance indicators of electronics companies.
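The effect of a graph Laplacian penalty is easy to see in the Gaussian special case, where the MAP estimate is available in closed form. The toy graph, penalty weights, and data below are all illustrative assumptions, not the paper's full Bayesian model with a prior on the Laplacian itself.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 50, 4

# a tiny graph: variables 0 and 1 are connected, so the quadratic penalty
# beta' L beta pulls their coefficients toward each other
edges = [(0, 1)]
L = np.zeros((p, p))
for i, j in edges:
    L[i, i] += 1.0
    L[j, j] += 1.0
    L[i, j] -= 1.0
    L[j, i] -= 1.0

X = rng.standard_normal((n, p))
beta_true = np.array([1.0, 1.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam_graph, lam_ridge = 5.0, 0.1   # Laplacian penalty plus a small ridge term
# MAP under a Gaussian prior with precision lam_graph*L + lam_ridge*I:
beta_hat = np.linalg.solve(X.T @ X + lam_graph * L + lam_ridge * np.eye(p), X.T @ y)
```

This is the ridge-like corner of the methods the paper connects to (Elastic Net, OSCAR); the full approach places a conjugate prior on L and explores the posterior by MCMC instead of fixing the penalty.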

Categories: Bayesian Analysis

Adaptive Bayesian Density Estimation in $L^{p}$-metrics with Pitman-Yor or Normalized Inverse-Gaussian Process Kernel Mixtures

Catia Scricciolo.

Source: Bayesian Analysis, Volume 9, Number 2, 475--520.

Abstract:
We consider Bayesian nonparametric density estimation using a Pitman-Yor or a normalized inverse-Gaussian process convolution kernel mixture as the prior distribution for a density. The procedure is studied from a frequentist perspective. Using the stick-breaking representation of the Pitman-Yor process and the finite-dimensional distributions of the normalized inverse-Gaussian process, we prove that, when the data are independent replicates from a density with analytic or Sobolev smoothness, the posterior distribution concentrates on shrinking $L^{p}$-norm balls around the sampling density at a minimax-optimal rate, up to a logarithmic factor. The resulting hierarchical Bayesian procedure, with a fixed prior, is adaptive to the unknown smoothness of the sampling density.
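The stick-breaking representation the proof relies on can be sampled directly. The sketch below draws one random mixture density from a truncated Pitman-Yor process prior; the truncation level, base measure, and kernel bandwidth are illustrative choices of mine, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(6)
d, theta, K = 0.25, 1.0, 100          # discount, strength, truncation level

# Stick-breaking for the Pitman-Yor process:
# V_k ~ Beta(1 - d, theta + k*d),  w_k = V_k * prod_{l<k} (1 - V_l)
V = rng.beta(1.0 - d, theta + d * (np.arange(K) + 1))
w = V * np.concatenate(([1.0], np.cumprod(1.0 - V[:-1])))
w[-1] = 1.0 - w[:-1].sum()            # close off the truncated stick

atoms = rng.normal(0.0, 2.0, K)       # component locations from the base measure
z = rng.choice(K, size=500, p=w)      # 500 samples from one random
x = rng.normal(atoms[z], 0.5)         # Gaussian-kernel mixture density
```

Convolving the discrete random measure with a Gaussian kernel, as in the last two lines, is exactly the kernel-mixture construction whose posterior contraction in $L^{p}$-norm the paper analyses.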

Categories: Bayesian Analysis
