These are the relevant UQ publications from the last few years. For a full list, please see the academic members' individual sites.

2016

  1. Oakley, J. E., & Youngman, B. D. (2016). Calibration of stochastic computer simulators using likelihood emulation. Technometrics (In Press).
    We calibrate a stochastic computer simulation model of ‘moderate’ computational expense. The simulator is an imperfect representation of reality, and we recognise this discrepancy to ensure a reliable calibration. The calibration model combines a Gaussian process emulator of the likelihood surface with importance sampling. Changing the discrepancy specification changes only the importance weights, which lets us investigate sensitivity to different discrepancy specifications at little computational cost. We present a case study of a natural history model that has been used to characterise UK bowel cancer incidence. Data sets and computer code are provided as supplementary material.
    @article{Oakley:2016,
      author = {Oakley, Jeremy E. and Youngman, Benjamin D.},
      journal = {Technometrics (in press)},
      title = {Calibration of stochastic computer simulators using likelihood emulation},
      year = {2016},
      website = {http://www.tandfonline.com/doi/abs/10.1080/00401706.2015.1125391}
    }
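    A minimal illustrative sketch of the calibration idea in the entry above: emulate the log-likelihood surface with a Gaussian process fitted to a small number of simulator runs, then reweight prior draws by the emulated likelihood via importance sampling. This is not the paper's supplementary code; the one-parameter toy simulator, the design, the prior and the use of scikit-learn are assumptions for illustration, and the discrepancy term is omitted.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(1)

    def noisy_log_likelihood(theta):
        # Stand-in for an expensive stochastic simulator: a crude Monte Carlo
        # estimate of a log-likelihood that peaks near theta = 2.
        sims = rng.normal(loc=theta, scale=1.0, size=200)
        return -0.5 * np.mean((sims - 2.0) ** 2)

    # Small design of (expensive) simulator runs and the estimated log-likelihoods.
    design = np.linspace(-3.0, 6.0, 15).reshape(-1, 1)
    loglik = np.array([noisy_log_likelihood(t) for t in design.ravel()])

    # Gaussian process emulator of the log-likelihood surface.
    gp = GaussianProcessRegressor(kernel=ConstantKernel(1.0) * RBF(1.0),
                                  alpha=1e-2, normalize_y=True)
    gp.fit(design, loglik)

    # Importance sampling: draw from the prior, weight by the emulated likelihood.
    prior_draws = rng.normal(0.0, 3.0, size=(5000, 1))
    log_w = gp.predict(prior_draws)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    print("approximate posterior mean:", float(np.sum(w * prior_draws.ravel())))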
    

2015

  1. Strong, M., Oakley, J. E., Brennan, A., & Breeze, P. (2015). Estimating the expected value of sample information using the probabilistic sensitivity analysis sample: a fast nonparametric regression-based method. Medical Decision Making, 35(5), 570–83.
    @article{StrongOakleyBrennanBreeze:2015,
      title = {Estimating the expected value of sample information using the probabilistic sensitivity analysis sample: a fast nonparametric regression-based method},
      author = {Strong, M. and Oakley, J. E. and Brennan, A. and Breeze, P.},
      journal = {Medical Decision Making},
      volume = {35},
      number = {5},
      pages = {570--83},
      year = {2015},
      website = {http://mdm.sagepub.com/cgi/reprint/0272989X15575286v1.pdf?ijkey=6t1Xa911zXrOdCv&keytype=finite}
    }
    
  2. Andrianakis, I., Vernon, I. R., McCreesh, N., McKinley, T. J., Oakley, J. E., Nsubuga, R. N., … White, R. G. (2015). Bayesian History Matching of Complex Infectious Disease Models Using Emulation: A Tutorial and a Case Study on HIV in Uganda. PLoS Comput Biol, 11(1), e1003968. doi:10.1371/journal.pcbi.1003968
    @article{Andrianakis:2015,
      author = {Andrianakis, Ioannis and Vernon, Ian R. and McCreesh, Nicky and McKinley, Trevelyan J. and Oakley, Jeremy E. and Nsubuga, Rebecca N. and Goldstein, Michael and White, Richard G.},
      journal = {PLoS Comput Biol},
      publisher = {Public Library of Science},
      title = {Bayesian History Matching of Complex Infectious Disease Models Using Emulation: A Tutorial and a Case Study on HIV in Uganda},
      year = {2015},
      month = jan,
      volume = {11},
      website = {http://dx.doi.org/10.1371/journal.pcbi.1003968},
      pages = {e1003968},
      number = {1},
      doi = {10.1371/journal.pcbi.1003968}
    }
    
  3. Holden, P. B., Edwards, N. R., Garthwaite, P. H., & Wilkinson, R. D. (2015). Emulation and interpretation of high-dimensional climate model outputs. Journal of Applied Statistics, 42, 2038–2055.
    @article{Holden2015,
      author = {Holden, P. B. and Edwards, N. R. and Garthwaite, P. H. and Wilkinson, R. D.},
      title = {Emulation and interpretation of high-dimensional climate model outputs},
      journal = {Journal of Applied Statistics},
      volume = {42},
      website = {http://www.tandfonline.com/doi/abs/10.1080/02664763.2015.1016412?journalCode=cjas20#.VplL7NZhPlI},
      pages = {2038--2055},
      year = {2015}
    }
    
  4. Bounceur, N., Crucifix, M., & Wilkinson, R. D. (2015). Global sensitivity analysis of the climate–vegetation system to astronomical forcing: an emulator-based approach. Earth System Dynamics, 6, 205–224.
    @article{Bounceur2015,
      author = {Bounceur, N. and Crucifix, M. and Wilkinson, R. D.},
      title = {Global sensitivity analysis of the climate--vegetation system to astronomical forcing: an emulator-based approach},
      journal = {Earth System Dynamics},
      website = {http://www.earth-syst-dynam.net/6/205/2015/esd-6-205-2015.pdf},
      year = {2015},
      volume = {6},
      pages = {205-224}
    }
    
  5. Holden, P., Edwards, N., & Wilkinson, R. D. (2015). ABC for climate: dealing with expensive simulators. In The Handbook of ABC.
    This paper is due to appear as a chapter of the forthcoming Handbook of Approximate Bayesian Computation (ABC), edited by S. Sisson, Y. Fan and M. Beaumont. We describe the challenge of calibrating climate simulators, and discuss the differences in emphasis in climate science compared with many of the more traditional ABC application areas. The primary difficulty is how to do inference with a computationally expensive simulator that we can only afford to run a small number of times, and we describe how Gaussian process emulators are used as surrogate models in this case. We introduce the idea of history matching, a non-probabilistic calibration method that divides the parameter space into implausible and not-implausible regions. History matching can be shown to be a special case of ABC, but with a greater emphasis on defining realistic simulator discrepancy bounds, and using these to define tolerances and metrics. We describe a design approach for choosing parameter values at which to run the simulator, and illustrate the approach on a toy climate model, showing that with careful design we can find the not-implausible region with a very small number of model evaluations. Finally, we describe how calibrated GENIE-1 (an Earth system model of intermediate complexity) predictions have been used, and why it is important to accurately characterise parametric uncertainty.
    @incollection{HoldenABC2015,
      author = {Holden, P. and Edwards, N. and Wilkinson, R. D.},
      title = {ABC for climate: dealing with expensive simulators},
      website = {http://arxiv.org/abs/1511.03475},
      booktitle = {The Handbook of ABC},
      year = {2015}
    }
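    A minimal illustrative sketch of the history-matching implausibility calculation described in the chapter above: a candidate parameter value is ruled out when the emulator prediction lies too far from the observation, relative to the combined emulator, discrepancy and observation variances. This is not the authors' code; the toy simulator, the variance values and the conventional three-sigma cut-off are illustrative assumptions.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def simulator(theta):
        # Stand-in for an expensive simulator with a single scalar output.
        return np.sin(theta) + 0.1 * theta

    # Observation, observation-error variance and model-discrepancy variance.
    z, var_obs, var_disc = 0.8, 0.01, 0.02

    # Small design of simulator runs, then a Gaussian process emulator.
    design = np.linspace(0.0, 10.0, 12).reshape(-1, 1)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), alpha=1e-8)
    gp.fit(design, simulator(design.ravel()))

    # Implausibility over a dense grid of candidate parameter values.
    grid = np.linspace(0.0, 10.0, 500).reshape(-1, 1)
    mean, sd = gp.predict(grid, return_std=True)
    implausibility = np.abs(z - mean) / np.sqrt(sd**2 + var_disc + var_obs)

    not_implausible = grid.ravel()[implausibility < 3.0]  # three-sigma cut-off
    print("fraction of parameter space not ruled out:",
          len(not_implausible) / len(grid))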
    

2014

  1. Strong, M., & Oakley, J. E. (2014). When is a model good enough? Deriving the expected value of model improvement via specifying internal model discrepancies. SIAM/ASA Journal on Uncertainty Quantification, 2(1), 106–125.
    @article{StrongOakley:2014,
      title = { When is a model good enough? Deriving the expected value of model improvement via specifying internal model discrepancies},
      author = {Strong, M. and Oakley, J. E.},
      journal = {SIAM/ASA Journal on Uncertainty Quantification},
      volume = {2},
      number = {1},
      pages = {106--125},
      year = {2014},
      website = {http://epubs.siam.org/doi/pdf/10.1137/120889563}
    }
    
  2. Strong, M., Oakley, J. E., & Brennan, A. (2014). Estimating multi-parameter partial Expected Value of Perfect Information from a probabilistic sensitivity analysis sample: a non-parametric regression approach. Medical Decision Making, 34(3), 311–26.
    @article{StrongOakleyBrennan:2014,
      title = {Estimating multi-parameter partial Expected Value of Perfect Information from a probabilistic sensitivity analysis sample: a non-parametric regression approach},
      author = {Strong, M. and Oakley, J. E. and Brennan, A.},
      journal = {Medical Decision Making},
      volume = {34},
      number = {3},
      pages = {311--26},
      year = {2014},
      website = {http://www.shef.ac.uk/polopoly_fs/1.305038!/file/multiparameterEVPPI_MDM_Accepted_clean_version.pdf}
    }
    
  3. Wilkinson, R. D. (2014). Accelerating ABC methods using Gaussian processes. In Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics (AISTATS), JMLR Workshop and Conference Proceedings, 33, 1015–1023.
    @article{Wilkinson2014,
      title = {Accelerating {ABC} methods using {Gaussian} processes},
      author = {Wilkinson, R. D.},
      journal = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics (AISTATS), JMLR Workshop and Conference Proceedings},
      volume = {33},
      pages = {1015-1023},
      website = {http://jmlr.org/proceedings/papers/v33/wilkinson14.pdf},
      year = {2014}
    }
    

2013

  1. Strong, M., & Oakley, J. E. (2013). An efficient method for computing single parameter partial expected value of perfect information. Medical Decision Making, 33(6), 755–66.
    @article{StrongOakley:2013,
      title = {An efficient method for computing single parameter partial expected value of perfect information},
      author = {Strong, M. and Oakley, J. E.},
      journal = {Medical Decision Making},
      volume = {33},
      number = {6},
      pages = {755--66},
      year = {2013},
      website = {http://mdm.sagepub.com/content/33/6/755.full}
    }
    
  2. Fricker, T., Oakley, J. E., & Urban, N. M. (2013). Multivariate Gaussian process emulators with nonseparable covariance structures. Technometrics, 55(1), 47–56.
    @article{Fricker:2013,
      title = {Multivariate Gaussian process emulators with nonseparable covariance structures},
      author = {Fricker, T. and Oakley, J. E. and Urban, N. M.},
      journal = {Technometrics},
      volume = {55},
      number = {1},
      pages = {47--56},
      year = {2013}
    }
    
  3. Wilkinson, R. D. (2013). Approximate Bayesian computation (ABC) gives exact results under the assumption of model error. Statistical Applications in Genetics and Molecular Biology, 12, 129–142.
    @article{Wilkinson2013,
      author = {Wilkinson, R. D.},
      title = {Approximate {Bayesian} computation ({ABC}) gives exact results under the assumption of model error},
      journal = {Statistical Applications in Genetics and Molecular Biology},
      volume = {12},
      website = {http://arxiv.org/abs/0811.3355},
      pages = {129-142},
      year = {2013}
    }
    

2012

  1. Becker, W., Oakley, J. E., Surace, C., Gili, P., Rowson, J., & Worden, K. (2012). Bayesian sensitivity analysis of a nonlinear finite element model. Mechanical Systems and Signal Processing, 32, 18–31. doi:10.1016/j.ymssp.2012.03.009
    @article{Becker2012,
      title = {Bayesian sensitivity analysis of a nonlinear finite element model},
      journal = {Mechanical Systems and Signal Processing},
      volume = {32},
      pages = {18--31},
      year = {2012},
      note = {Uncertainties in Structural Dynamics},
      issn = {0888-3270},
      doi = {10.1016/j.ymssp.2012.03.009},
      website = {http://www.sciencedirect.com/science/article/pii/S0888327012000866},
      author = {Becker, W. and Oakley, J.E. and Surace, C. and Gili, P. and Rowson, J. and Worden, K.}
    }
    
  2. Strong, M., Oakley, J. E., & Chilcott, J. (2012). Managing structural uncertainty in health economic decision models: a discrepancy approach. Journal of the Royal Statistical Society: Series C (Applied Statistics), 61(1), 25–45. doi:10.1111/j.1467-9876.2011.01014.x
    @article{StrongOakleyChilcott:2012,
      author = {Strong, Mark and Oakley, Jeremy E. and Chilcott, Jim},
      title = {Managing structural uncertainty in health economic decision models: a discrepancy approach},
      journal = {Journal of the Royal Statistical Society: Series C (Applied Statistics)},
      volume = {61},
      number = {1},
      publisher = {Blackwell Publishing Ltd},
      issn = {1467-9876},
      website = {http://dx.doi.org/10.1111/j.1467-9876.2011.01014.x},
      doi = {10.1111/j.1467-9876.2011.01014.x},
      pages = {25--45},
      keywords = {Computer model, Elicitation, Health economics, Model uncertainty, Sensitivity analysis, Uncertainty analysis},
      year = {2012}
    }
    

2011

  1. Wilkinson, R. D., Vrettas, M., Cornford, D., & Oakley, J. E. (2011). Quantifying simulator discrepancy in discrete-time dynamical simulators. Journal of Agricultural, Biological and Environmental Statistics, 16(4), 554–570.
    @article{Wilkinson:2011,
      author = {Wilkinson, R. D. and Vrettas, M. and Cornford, D. and Oakley, J. E.},
      title = {Quantifying simulator discrepancy in discrete-time dynamical simulators},
      journal = {Journal of Agricultural, Biological and Environmental Statistics},
      volume = {16},
      number = {4},
      website = {http://eprints.nottingham.ac.uk/1524/},
      pages = {554-570},
      year = {2011}
    }
    
  2. Fricker, T. E., Oakley, J. E., Sims, N. D., & Worden, K. (2011). Probabilistic uncertainty analysis of an FRF of a structure using a Gaussian process emulator. Mechanical Systems and Signal Processing, 25(8), 2962–2975. doi:10.1016/j.ymssp.2011.06.013
    @article{Fricker2011,
      title = {Probabilistic uncertainty analysis of an {FRF} of a structure using a Gaussian process emulator},
      journal = {Mechanical Systems and Signal Processing},
      volume = {25},
      number = {8},
      pages = {2962--2975},
      year = {2011},
      issn = {0888-3270},
      doi = {10.1016/j.ymssp.2011.06.013},
      website = {http://www.sciencedirect.com/science/article/pii/S0888327011002354},
      author = {Fricker, Thomas E. and Oakley, Jeremy E. and Sims, Neil D. and Worden, Keith},
      keywords = {Bayesian }
    }
    
  3. Oakley, J. E. (2011). Modelling with Deterministic Computer Models. In M. Christie, A. Cliffe, P. Dawid, & S. Senn (Eds.), Simplicity, Complexity and Modelling (pp. 51–67). John Wiley & Sons, Ltd. doi:10.1002/9781119951445.ch4
    @inbook{Oakley:2011,
      title = {Modelling with Deterministic Computer Models},
      author = {Oakley, Jeremy E.},
      editor = {Christie, M. and Cliffe, A. and Dawid, P. and Senn, S.},
      publisher = {John Wiley & Sons, Ltd},
      isbn = {9781119951445},
      website = {http://dx.doi.org/10.1002/9781119951445.ch4},
      doi = {10.1002/9781119951445.ch4},
      pages = {51--67},
      keywords = {deterministic computer models, Gaussian process regression, Gaussian processes emulators, sensitivity analysis, uncertainty analysis},
      booktitle = {Simplicity, Complexity and Modelling},
      year = {2011}
    }
    
  4. Becker, W., Rowson, J., Oakley, J. E., Yoxall, A., Manson, G., & Worden, K. (2011). Bayesian sensitivity analysis of a model of the aortic valve. Journal of Biomechanics, 44(8), 1499–1506. doi:10.1016/j.jbiomech.2011.03.008
    @article{Becker2011,
      title = {Bayesian sensitivity analysis of a model of the aortic valve},
      journal = {Journal of Biomechanics},
      volume = {44},
      number = {8},
      pages = {1499--1506},
      year = {2011},
      issn = {0021-9290},
      doi = {10.1016/j.jbiomech.2011.03.008},
      website = {http://www.sciencedirect.com/science/article/pii/S0021929011002570},
      author = {Becker, W. and Rowson, J. and Oakley, J.E. and Yoxall, A. and Manson, G. and Worden, K.},
      keywords = {Finite element analysis }
    }
    

2010 and earlier

  1. Oakley, J. E., & Clough, H. E. (2010). Sensitivity analysis in microbial risk assessment: vero-cytotoxigenic E. coli O157 in farm-pasteurised milk. In A. O’Hagan & M. West (Eds.), Handbook of Applied Bayesian Analysis. Oxford University Press.
    @inbook{Oakley:2010,
      title = {Sensitivity analysis in microbial risk assessment: vero-cytotoxigenic {E}.coli {O}157 in farm-pasteurised milk},
      author = {Oakley, Jeremy E. and Clough, H. E.},
      editor = {O'Hagan, A. and West, M.},
      publisher = {Oxford University Press},
      booktitle = {Handbook of Applied Bayesian Analysis},
      year = {2010}
    }
    
  2. Conti, S., Gosling, J. P., Oakley, J. E., & O’Hagan, A. (2009). Gaussian process emulation of dynamic computer codes. Biometrika, 96(3), 663–676. doi:10.1093/biomet/asp028
    Computer codes are used in scientific research to study and predict the behaviour of complex systems. Their run times often make uncertainty and sensitivity analyses impractical because of the thousands of runs that are conventionally required, so efficient techniques have been developed based on a statistical representation of the code. The approach is less straightforward for dynamic codes, which represent time-evolving systems. We develop a novel iterative system to build a statistical model of dynamic computer codes, which is demonstrated on a rainfall-runoff simulator.
    @article{Conti2009,
      author = {Conti, S. and Gosling, J. P. and Oakley, J. E. and O'Hagan, A.},
      title = {Gaussian process emulation of dynamic computer codes},
      volume = {96},
      number = {3},
      pages = {663-676},
      year = {2009},
      doi = {10.1093/biomet/asp028},
      url = {http://biomet.oxfordjournals.org/content/96/3/663.abstract},
      website = {http://biomet.oxfordjournals.org/content/96/3/663.full.pdf+html},
      journal = {Biometrika}
    }
    
  3. Oakley, J. E. (2009). Decision-Theoretic Sensitivity Analysis for Complex Computer Models. Technometrics, 51(2), 121–129.
    When using a computer model to inform a decision, it is important to investigate any uncertainty in the model and determine how that uncertainty may impact on the decision. In probabilistic sensitivity analysis, model users can investigate how various uncertain model inputs contribute to the uncertainty in the model output. However, much of the literature focuses only on output uncertainty as measured by variance; the decision problem itself often is ignored, even though uncertainty as measured by variance may not equate to uncertainty about the optimum decision. Consequently, traditional variance-based measures of input parameter importance may not correctly describe the importance of each input. We review a decision theoretic framework for conducting sensitivity analysis that addresses this problem. Because computation of these decision-theoretic measures can be impractical for complex computer models, we provide efficient computational tools using Gaussian processes. We give an illustration in the field of medical decision making, and compare the Gaussian process approach with conventional Monte Carlo sampling.
    @article{Oakley:2009,
      issn = {00401706},
      author = {Oakley, Jeremy E.},
      journal = {Technometrics},
      number = {2},
      pages = {121-129},
      publisher = {Taylor & Francis, Ltd.},
      title = {Decision-Theoretic Sensitivity Analysis for Complex Computer Models},
      volume = {51},
      year = {2009}
    }
    
  4. Oakley, J. (2004). Estimating percentiles of uncertain computer code outputs. Journal of the Royal Statistical Society: Series C (Applied Statistics), 53(1), 83–93. doi:10.1046/j.0035-9254.2003.05044.x
    @article{Oakley:2004,
      author = {Oakley, Jeremy},
      title = {Estimating percentiles of uncertain computer code outputs},
      journal = {Journal of the Royal Statistical Society: Series C (Applied Statistics)},
      volume = {53},
      number = {1},
      publisher = {Blackwell Publishing},
      issn = {1467-9876},
      website = {http://dx.doi.org/10.1046/j.0035-9254.2003.05044.x},
      doi = {10.1046/j.0035-9254.2003.05044.x},
      pages = {83--93},
      keywords = {Deterministic computer code, Gaussian process, Uncertainty distribution},
      year = {2004}
    }
    
  5. Oakley, J. E., & O’Hagan, A. (2004). Probabilistic sensitivity analysis of complex models: a Bayesian approach. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 66(3), 751–769. doi:10.1111/j.1467-9868.2004.05304.x
    In many areas of science and technology, mathematical models are built to simulate complex real world phenomena. Such models are typically implemented in large computer programs and are also very complex, such that the way that the model responds to changes in its inputs is not transparent. Sensitivity analysis is concerned with understanding how changes in the model inputs influence the outputs. This may be motivated simply by a wish to understand the implications of a complex model but often arises because there is uncertainty about the true values of the inputs that should be used for a particular application. A broad range of measures have been advocated in the literature to quantify and describe the sensitivity of a model’s output to variation in its inputs. In practice the most commonly used measures are those that are based on formulating uncertainty in the model inputs by a joint probability distribution and then analysing the induced uncertainty in outputs, an approach which is known as probabilistic sensitivity analysis. We present a Bayesian framework which unifies the various tools of probabilistic sensitivity analysis. The Bayesian approach is computationally highly efficient. It allows effective sensitivity analysis to be achieved by using far smaller numbers of model runs than standard Monte Carlo methods. Furthermore, all measures of interest may be computed from a single set of runs.
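    (A minimal illustrative sketch of this emulator-based approach to sensitivity analysis follows the final entry in this list.)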
    @article{Oakley:2004b,
      author = {Oakley, Jeremy E. and O'Hagan, Anthony},
      title = {Probabilistic sensitivity analysis of complex models: a Bayesian approach},
      journal = {Journal of the Royal Statistical Society: Series B (Statistical Methodology)},
      volume = {66},
      number = {3},
      publisher = {Blackwell Publishing},
      issn = {1467-9868},
      website = {http://dx.doi.org/10.1111/j.1467-9868.2004.05304.x},
      doi = {10.1111/j.1467-9868.2004.05304.x},
      pages = {751--769},
      keywords = {Bayesian inference, Computer model, Gaussian process, Sensitivity analysis, Uncertainty analysis},
      year = {2004}
    }
    
  6. Oakley, J., & O’Hagan, A. (2002). Bayesian inference for the uncertainty distribution of computer model outputs. Biometrika, 89(4), 769–784. doi:10.1093/biomet/89.4.769
    We consider a problem of inference for the output of a computationally expensive computer model. We suppose that the model is to be used in a context where the values of one or more inputs are uncertain, so that the input configuration is a random variable. We require to make inference about the induced distribution of the output. This distribution is called the uncertainty distribution, and the general problem is known to users of computer models as uncertainty analysis. To be specific, we develop Bayesian inference for the distribution and density functions of the model output. Modelling the output, as a function of its inputs, as a Gaussian process, we derive expressions for the posterior mean and variance of the distribution and density functions, based on data comprising observed outputs at a sample of input configurations. We show that direct computation of these expressions may encounter numerical difficulties. We develop an alternative approach based on simulating approximate realisations from the posterior distribution of the output function. Two examples are given to illustrate our methods.
    @article{Oakley01122002,
      author = {Oakley, Jeremy and O'Hagan, Anthony},
      title = {Bayesian inference for the uncertainty distribution of computer model outputs},
      volume = {89},
      number = {4},
      pages = {769-784},
      year = {2002},
      doi = {10.1093/biomet/89.4.769},
      website = {http://biomet.oxfordjournals.org/content/89/4/769.full.pdf+html},
      journal = {Biometrika}
    }
    
  7. Oakley, J. (2002). Eliciting Gaussian process priors for complex computer codes. Journal of the Royal Statistical Society: Series D (The Statistician), 51(1), 81–97. doi:10.1111/1467-9884.00300
    We consider the problem of eliciting expert knowledge about the output of a deterministic computer code, where the output is a function of a vector of input variables. A Gaussian process prior is assumed for the unknown function, and expert judgments about the output at various inputs are used to find suitable hyperparameters of the Gaussian process prior distribution. An example is presented involving the movement of radionuclides in the food chain.
    @article{RSSD:RSSD300,
      author = {Oakley, Jeremy},
      title = {Eliciting Gaussian process priors for complex computer codes},
      journal = {Journal of the Royal Statistical Society: Series D (The Statistician)},
      volume = {51},
      number = {1},
      publisher = {Blackwell Science, Ltd},
      issn = {1467-9884},
      website = {http://dx.doi.org/10.1111/1467-9884.00300},
      doi = {10.1111/1467-9884.00300},
      pages = {81--97},
      keywords = {Computer experiment, Deterministic computer model, Expert elicitation, Gaussian process},
      year = {2002}
    }
    
  8. O’Hagan, A., Kennedy, M. C., & Oakley, J. E. (1999). Uncertainty analysis and other inference tools for complex computer codes (with discussion). In J. M. Bernardo et al. (Eds.), Bayesian Statistics 6 (pp. 503–524). Oxford University Press.
    @inbook{OHagan:1999,
      title = {Uncertainty analysis and other inference tools for complex computer codes (with discussion)},
      author = {O'Hagan, A. and Kennedy, M. C. and Oakley, J. E.},
      editor = {Bernardo, J. M. and others},
      publisher = {Oxford University Press},
      booktitle = {Bayesian Statistics 6},
      year = {1999},
      pages = {503--524}
    }
    
  9. Oakley, J. E. (1999). Bayesian Uncertainty Analysis for Complex Computer Codes (PhD thesis). Department of Probability and Statistics, University of Sheffield.
    @phdthesis{Oakley:1999,
      author = {Oakley, J.E.},
      year = {1999},
      title = {Bayesian Uncertainty Analysis for Complex Computer Codes},
      school = {Department of Probability and Statistics, University of Sheffield},
      website = {http://www.jeremy-oakley.staff.shef.ac.uk/jeothesis.pdf}
    }
    
  10. Wilkinson, R. D. (2010). Bayesian calibration of expensive multivariate computer experiments. In L. T. Biegler, G. Biros, O. Ghattas, M. Heinkenschloss, D. Keyes, B. K. Mallick, … Y. Marzouk (Eds.), Large-scale inverse problems and quantification of uncertainty. John Wiley and Sons.
    @incollection{wilkinson2010bayesian,
      title = {Bayesian calibration of expensive multivariate computer experiments},
      author = {Wilkinson, Richard D},
      booktitle = {Large-scale inverse problems and quantification of uncertainty},
      year = {2010},
      website = {http://onlinelibrary.wiley.com/book/10.1002/9780470685853},
      publisher = {John Wiley and Sons},
      editor = {Biegler, L. T. and Biros, G. and Ghattas, O. and Heinkenschloss, M. and Keyes, D. and Mallick, B. K. and Tenorio, L. and van Bloemen Waanders, B. and Willcox, K. and Marzouk, Y.}
    }
    
  11. Holden, P. B., Edwards, N. R., Oliver, K. I. C., Lenton, T. M., & Wilkinson, R. D. (2010). A probabilistic calibration of climate sensitivity and terrestrial carbon change in GENIE-1. Climate Dynamics, 35, 785–806.
    @article{Holden2010,
      author = {Holden, P.B. and Edwards, N.R. and Oliver, K.I.C. and Lenton, T.M. and Wilkinson, R.D.},
      title = {A probabilistic calibration of climate sensitivity and terrestrial carbon change in {GENIE-1}},
      journal = {Climate Dynamics},
      volume = {35},
      website = {http://link.springer.com/article/10.1007%2Fs00382-009-0630-8#/page-1},
      pages = {785-806},
      year = {2010}
    }
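    The sketch below illustrates the emulator-based approach to probabilistic sensitivity analysis that runs through several of the entries above (see in particular Oakley and O’Hagan, 2004): fit a Gaussian process to a small number of model runs, then estimate variance-based main-effect indices cheaply on the emulator rather than on the model itself. It is not code from any of the papers; the two-input toy model, the uniform input distributions and the sample sizes are illustrative assumptions.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def model(x):
        # Stand-in for an expensive deterministic computer model with two inputs.
        return np.sin(x[:, 0]) + 0.3 * x[:, 1] ** 2

    # Small design of model runs (the only expensive step), then the emulator.
    X = rng.uniform(-np.pi, np.pi, size=(40, 2))
    gp = GaussianProcessRegressor(kernel=RBF([1.0, 1.0]), alpha=1e-8,
                                  normalize_y=True).fit(X, model(X))

    # Everything below uses only the cheap emulator.
    samples = rng.uniform(-np.pi, np.pi, size=(4000, 2))
    total_var = np.var(gp.predict(samples))

    for i in range(2):
        # Estimate Var(E[Y | X_i]): fix X_i on a grid, average the emulator over
        # the other input, then take the variance of those conditional means.
        cond_means = []
        for v in np.linspace(-np.pi, np.pi, 50):
            pts = samples.copy()
            pts[:, i] = v
            cond_means.append(gp.predict(pts).mean())
        print(f"main-effect sensitivity index for input {i + 1}:",
              round(np.var(cond_means) / total_var, 2))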