Table of contents for Introduction to applied Bayesian statistics and estimation for social scientists / Scott M. Lynch.


Bibliographic record and links to related information available from the Library of Congress catalog


1    Introduction ................................................. 1
     1.1   Outline ................................................ 3
     1.2   A note on programming .................................. 5
     1.3   Symbols used throughout the book ....................... 6
2    Probability Theory and Classical Statistics .................. 9
     2.1   Rules of probability ................................... 9
     2.2   Probability distributions in general ................... 12
           2.2.1   Important quantities in distributions .......... 17
           2.2.2   Multivariate distributions ..................... 19
           2.2.3   Marginal and conditional distributions ......... 23
     2.3   Some important distributions in social science ......... 25
           2.3.1   The binomial distribution ...................... 25
           2.3.2   The multinomial distribution ................... 27
           2.3.3   The Poisson distribution ....................... 28
           2.3.4   The normal distribution ........................ 29
           2.3.5   The multivariate normal distribution ........... 30
           2.3.6   t and multivariate t distributions ............. 33
     2.4   Classical statistics in social science ................. 33
     2.5   Maximum likelihood estimation .......................... 35
           2.5.1   Constructing a likelihood function ............. 36
           2.5.2   Maximizing a likelihood function ............... 38
           2.5.3   Obtaining standard errors ...................... 39
           2.5.4   A normal likelihood example .................... 41
     2.6   Conclusions ............................................ 44
     2.7   Exercises .............................................. 44
           2.7.1   Probability exercises .......................... 44
           2.7.2   Classical inference exercises .................. 45
3    Basics of Bayesian Statistics ................................ 47
     3.1   Bayes' Theorem for point probabilities ................. 47
     3.2   Bayes' Theorem applied to probability distributions .... 50
           3.2.1   Proportionality ................................ 51
     3.3   Bayes' Theorem with distributions: A voting example .... 53
           3.3.1   Specification of a prior: The beta
                   distribution ................................... 54
           3.3.2   An alternative model for the polling data:
                   A gamma prior/Poisson likelihood approach ...... 60
     3.4   A normal prior-normal likelihood example with σ²
           known .................................................. 62
           3.4.1   Extending the normal distribution example ...... 65
     3.5   Some useful prior distributions ........................ 68
           3.5.1   The Dirichlet distribution ..................... 69
           3.5.2   The inverse gamma distribution ................. 69
           3.5.3   Wishart and inverse Wishart distributions ...... 70
     3.6   Criticism against Bayesian statistics .................. 70
     3.7   Conclusions ............................................ 73
     3.8   Exercises .............................................. 74
4    Modern Model Estimation, Part 1: Gibbs Sampling .............. 77
     4.1   What Bayesians want and why ............................ 77
     4.2   The logic of sampling from posterior densities ......... 78
     4.3   Two basic sampling methods ............................. 80
           4.3.1   The inversion method of sampling ............... 81
           4.3.2   The rejection method of sampling ............... 84
     4.4   Introduction to MCMC sampling .......................... 88
           4.4.1   Generic Gibbs sampling ......................... 88
           4.4.2   Gibbs sampling example using the inversion
                   method ......................................... 89
           4.4.3   Example repeated using rejection sampling ...... 93
           4.4.4   Gibbs sampling from a real bivariate density ... 96
           4.4.5   Reversing the process: Sampling the
                   parameters given the data ...................... 100
     4.5   Conclusions ............................................ 103
     4.6   Exercises .............................................. 105
5    Modern Model Estimation, Part 2: Metropolis-Hastings
     Sampling ..................................................... 107
     5.1   A generic MH algorithm ................................. 108
           5.1.1   Relationship between Gibbs and MH sampling ..... 113
     5.2   Example: MH sampling when conditional densities are
           difficult to derive .................................... 115
     5.3   Example: MH sampling for a conditional density with
           an unknown form ........................................ 118
     5.4   Extending the bivariate normal example: The full
           multiparameter model ................................... 121
           5.4.1   The conditionals for μ1 and μ2 ................. 122
           5.4.2   The conditionals for σ1², σ2², and ρ ........... 123
           5.4.3   The complete MH algorithm ...................... 124
           5.4.4   A matrix approach to the bivariate normal
                   distribution problem ........................... 126
     5.5   Conclusions ............................................ 128
     5.6   Exercises .............................................. 129
6    Evaluating Markov Chain Monte Carlo Algorithms and
     Model Fit .................................................... 131
     6.1   Why evaluate MCMC algorithm performance? ............... 132
     6.2   Some common problems and solutions ..................... 132
     6.3   Recognizing poor performance ........................... 135
           6.3.1   Trace plots .................................... 136
           6.3.2   Acceptance rates of MH algorithms .............. 141
           6.3.3   Autocorrelation of parameters .................. 146
           6.3.4   R̂ and other calculations ....................... 147
     6.4   Evaluating model fit ................................... 153
           6.4.1   Residual analysis .............................. 154
           6.4.2   Posterior predictive distributions ............. 155
     6.5   Formal comparison and combining models ................. 159
           6.5.1   Bayes factors .................................. 159
           6.5.2   Bayesian model averaging ....................... 161
     6.6   Conclusions ............................................ 163
     6.7   Exercises .............................................. 163
7    The Linear Regression Model .................................. 165
     7.1   Development of the linear regression model ............. 165
     7.2   Sampling from the posterior distribution for the
           model parameters ....................................... 168
           7.2.1   Sampling with an MH algorithm .................. 168
           7.2.2   Sampling the model parameters using Gibbs
                   sampling ....................................... 169
     7.3   Example: Are people in the South "nicer" than
           others? ................................................ 174
           7.3.1   Results and comparison of the algorithms ....... 175
           7.3.2   Model evaluation ............................... 178
     7.4   Incorporating missing data ............................. 182
           7.4.1   Types of missingness ........................... 182
           7.4.2   A generic Bayesian approach when data are
                   MAR: The "niceness" example revisited .......... 186
     7.5   Conclusions ............................................ 191
     7.6   Exercises .............................................. 192
8    Generalized Linear Models .................................... 193
     8.1   The dichotomous probit model ........................... 195
           8.1.1   Model development and parameter
                   interpretation ................................. 195
           8.1.2   Sampling from the posterior distribution for
                   the model parameters ........................... 198
           8.1.3   Simulating from truncated normal
                   distributions .................................. 200
           8.1.4   Dichotomous probit model example:
                   Black-white differences in mortality ........... 206
     8.2   The ordinal probit model ............................... 217
           8.2.1   Model development and parameter
                   interpretation ................................. 218
           8.2.2   Sampling from the posterior distribution for
                   the parameters ................................. 220
           8.2.3   Ordinal probit model example: Black-white
                   differences in health .......................... 223
     8.3   Conclusions ............................................ 228
     8.4   Exercises .............................................. 229
9    Introduction to Hierarchical Models .......................... 231
     9.1   Hierarchical models in general ......................... 232
           9.1.1   The voting example redux ....................... 233
     9.2   Hierarchical linear regression models .................. 240
           9.2.1   Random effects: The random intercept model ..... 241
           9.2.2   Random effects: The random coefficient model ... 251
           9.2.3   Growth models .................................. 256
     9.3   A note on fixed versus random effects models and
           other terminology ...................................... 264
     9.4   Conclusions ............................................ 268
     9.5   Exercises .............................................. 269
10   Introduction to Multivariate Regression Models ............... 271
     10.1  Multivariate linear regression ......................... 271
           10.1.1  Model development .............................. 271
           10.1.2  Implementing the algorithm ..................... 275
     10.2  Multivariate probit models ............................. 277
           10.2.1  Model development .............................. 278
           10.2.2  Step 2: Simulating draws from truncated
                   multivariate normal distributions .............. 283
           10.2.3  Step 3: Simulation of thresholds in the
                   multivariate probit model ...................... 289
           10.2.4  Step 5: Simulating the error covariance
                   matrix ......................................... 295
           10.2.5  Implementing the algorithm ..................... 297
     10.3  A multivariate probit model for generating
           distributions .......................................... 303
           10.3.1  Model specification and simulation ............. 307
           10.3.2  Life table generation and other posterior
                   inferences ..................................... 310
     10.4  Conclusions ............................................ 315
     10.5  Exercises .............................................. 317
11   Conclusion ................................................... 319
A    Background Mathematics ....................................... 323
     A.1   Summary of calculus .................................... 323
           A.1.1   Limits ......................................... 323
           A.1.2   Differential calculus .......................... 324
           A.1.3   Integral calculus .............................. 326
           A.1.4   Finding a general rule for a derivative ........ 329
     A.2   Summary of matrix algebra .............................. 330
           A.2.1   Matrix notation ................................ 330
           A.2.2   Matrix operations .............................. 331
     A.3   Exercises .............................................. 335
           A.3.1   Calculus exercises ............................. 335
           A.3.2   Matrix algebra exercises ....................... 335
B    The Central Limit Theorem, Confidence Intervals, and
     Hypothesis Tests ............................................. 337
     B.1   A simulation study ..................................... 337
     B.2   Classical inference .................................... 338
           B.2.1   Hypothesis testing ............................. 339
           B.2.2   Confidence intervals ........................... 342
           B.2.3   Some final notes ............................... 344



Library of Congress subject headings for this publication: Social sciences -- Statistical methods; Bayesian statistical decision theory