Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Generalized additive models for gigadata: modeling the U.K. black smoke network daily data

Wood, Simon N., Li, Zheyuan, Shaddick, Gavin ORCID: https://orcid.org/0000-0002-4117-4264 and Augustin, Nicole H. 2017. Generalized additive models for gigadata: modeling the U.K. black smoke network daily data. Journal of the American Statistical Association 112 (519) , pp. 1199-1210. 10.1080/01621459.2016.1195744

Full text not available from this repository.

Abstract

We develop scalable methods for fitting penalized regression spline based generalized additive models with of the order of 10^4 coefficients to up to 10^8 data. Computational feasibility rests on: (i) a new iteration scheme for estimation of model coefficients and smoothing parameters, avoiding poorly scaling matrix operations; (ii) parallelization of the iteration’s pivoted block Cholesky and basic matrix operations; (iii) the marginal discretization of model covariates to reduce memory footprint, with efficient scalable methods for computing required crossproducts directly from the discrete representation. Marginal discretization enables much finer discretization than joint discretization would permit. We were motivated by the need to model four decades worth of daily particulate data from the U.K. Black Smoke and Sulphur Dioxide Monitoring Network. Although reduced in size recently, over 2000 stations have at some time been part of the network, resulting in some 10 million measurements. Modeling at a daily scale is desirable for accurate trend estimation and mapping, and to provide daily exposure estimates for epidemiological cohort studies. Because of the dataset size, previous work has focused on modeling time or space averaged pollution levels, but this is unsatisfactory from a health perspective, since it is often acute exposure locally and on the time scale of days that is of most importance in driving adverse health outcomes. If computed by conventional means our black smoke model would require a half terabyte of storage just for the model matrix, whereas we are able to compute with it on a desktop workstation. The best previously available reduced memory footprint method would have required three orders of magnitude more computing time than our new method. Supplementary materials for this article are available online.
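The memory saving from covariate discretization, point (iii) above, can be illustrated with a small sketch. This is not the authors' implementation (that is available in the R package mgcv's bam function with discrete=TRUE); it is a hypothetical NumPy toy showing the core idea: if a model-matrix block satisfies X[i, j] = B[k[i], j], where B holds basis evaluations at m << n discrete covariate values and k maps each observation to its bin, then crossproducts such as X'y and X'X can be accumulated from B and the bin counts without ever forming the n-row matrix X. The "basis" here is a plain polynomial basis purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 100_000, 200, 10          # observations, discrete bins, basis dim

x = rng.uniform(size=n)             # a model covariate
edges = np.linspace(0.0, 1.0, m + 1)
k = np.clip(np.digitize(x, edges) - 1, 0, m - 1)   # bin index per observation
mids = 0.5 * (edges[:-1] + edges[1:])              # m representative values

# toy basis evaluated only at the m discrete values: B is (m, p), not (n, p)
B = np.vander(mids, p, increasing=True)

y = rng.normal(size=n)

# X'y without forming X: accumulate y into bins, then one small (p x m) product
ybar = np.bincount(k, weights=y, minlength=m)      # (m,)
Xty_fast = B.T @ ybar                              # (p,)

# dense check: the full model-matrix block is just B[k], shape (n, p)
assert np.allclose(Xty_fast, B[k].T @ y)

# X'X likewise needs only the per-bin counts (a diagonal weight on B)
w = np.bincount(k, minlength=m).astype(float)
XtX_fast = B.T @ (w[:, None] * B)                  # (p, p)
assert np.allclose(XtX_fast, B[k].T @ B[k])
```

The storage for this block drops from n x p to m x p plus the integer index k, which is the source of the roughly three-orders-of-magnitude memory reduction the abstract describes for the full model matrix.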

Item Type: Article
Date Type: Publication
Status: Published
Schools: ?? VCO ??
Publisher: Taylor and Francis Group
ISSN: 0162-1459
Last Modified: 30 Jul 2024 14:31
URI: https://orca.cardiff.ac.uk/id/eprint/170806
