Simon Jackman
Bayesian Analysis for the Social Sciences (eBook, PDF)
60,99 € (incl. VAT)
Available immediately as a download
- Format: PDF
- Devices: PC
- With copy protection (DRM)
- File size: 12.23 MB
Other customers were also interested in
- Bayesian Inference in the Social Sciences (eBook, PDF), 113,99 €
- Peter Congdon: Applied Bayesian Modelling (eBook, PDF), 68,99 €
- Peter M. Lee: Bayesian Statistics (eBook, PDF), 42,99 €
- Timo Koski: Bayesian Networks (eBook, PDF), 80,99 €
- Bayesian Networks (eBook, PDF), 92,99 €
- Bruce L. Brown: Multivariate Analysis for the Biobehavioral and Social Sciences (eBook, PDF), 102,99 €
- Ioannis Ntzoufras: Bayesian Modeling Using WinBUGS (eBook, PDF), 142,99 €
Bayesian methods are increasingly being used in the social sciences, as the problems encountered lend themselves naturally to the subjective qualities of Bayesian methodology. This book provides an accessible introduction to Bayesian methods, tailored specifically for social science students. It contains numerous real examples from political science, psychology, sociology, and economics, exercises in all chapters, and detailed descriptions of all the key concepts, without assuming any background in statistics beyond a first course. It features examples of how to implement the methods using WinBUGS (the most widely used Bayesian analysis software in the world) and R (an open-source statistical software environment). The book is supported by a website featuring WinBUGS and R code, and data sets.
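To give a flavor of the R-plus-BUGS/JAGS workflow described above, here is a minimal, hypothetical sketch (not taken from the book or its companion website): estimating a single proportion with a flat Beta(1,1) prior, assuming JAGS and the rjags package are installed and using illustrative data.

    library(rjags)  # interface from R to JAGS; assumes both are installed

    # Illustrative model: binomial likelihood with a flat Beta(1,1) prior on theta
    model_string <- "
    model {
      y ~ dbin(theta, n)      # observed successes out of n trials
      theta ~ dbeta(1, 1)     # uniform prior on the proportion
    }"

    data_list <- list(y = 27, n = 50)  # hypothetical data: 27 successes in 50 trials

    jm <- jags.model(textConnection(model_string), data = data_list,
                     n.chains = 2, n.adapt = 1000)
    update(jm, 1000)                                    # burn-in
    post <- coda.samples(jm, variable.names = "theta", n.iter = 5000)
    summary(post)   # posterior mean and quantiles for theta

Essentially the same model code could be run in WinBUGS; JAGS is used in this sketch only because it is easily scripted from R via rjags.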
For legal reasons, this download can only be delivered to customers with a billing address in Germany.
Product details
- Publisher: John Wiley & Sons
- Publication date: 27 October 2009
- Language: English
- ISBN-13: 9780470686638
- Item no.: 37298586
Simon Jackman is a political scientist by training, with extensive experience in applying Bayesian methods to problems in the social and political sciences and in teaching Bayesian methods to social science students.
List of Figures.
List of Tables.
Preface.
Acknowledgments.
Introduction.
Part I: Introducing Bayesian Analysis.
1. The foundations of Bayesian inference.
1.1 What is probability?
1.2 Subjective probability in Bayesian statistics.
1.3 Bayes theorem, discrete case.
1.4 Bayes theorem, continuous parameter.
1.5 Parameters as random variables, beliefs as distributions.
1.6 Communicating the results of a Bayesian analysis.
1.7 Asymptotic properties of posterior distributions.
1.8 Bayesian hypothesis testing.
1.9 From subjective beliefs to parameters and models.
1.10 Historical note.
2. Getting started: Bayesian analysis for simple models.
2.1 Learning about probabilities, rates and proportions.
2.2 Associations between binary variables.
2.3 Learning from counts.
2.4 Learning about a normal mean and variance.
2.5 Regression models.
2.6 Further reading.
Part II: Simulation Based Bayesian Analysis.
3. Monte Carlo methods.
3.1 Simulation consistency.
3.2 Inference for functions of parameters.
3.3 Marginalization via Monte Carlo integration.
3.4 Sampling algorithms.
3.5 Further reading.
4. Markov chains.
4.1 Notation and definitions.
4.2 Properties of Markov chains.
4.3 Convergence of Markov chains.
4.4 Limit theorems for Markov chains.
4.5 Further reading.
5. Markov chain Monte Carlo.
5.1 Metropolis-Hastings algorithm.
5.2 Gibbs sampling.
6. Implementing Markov chain Monte Carlo.
6.1 Software for Markov chain Monte Carlo.
6.2 Assessing convergence and run-length.
6.3 Working with BUGS/JAGS from R.
6.4 Tricks of the trade.
6.5 Other examples.
6.6 Further reading.
Part III: Advanced Applications in the Social Sciences.
7. Hierarchical Statistical Models.
7.1 Data and parameters that vary by groups: the case for hierarchical modeling.
7.2 ANOVA as a hierarchical model.
7.3 Hierarchical models for longitudinal data.
7.4 Hierarchical models for non-normal data.
7.5 Multi-level models.
8. Bayesian analysis of choice making.
8.1 Regression models for binary responses.
8.2 Ordered outcomes.
8.3 Multinomial outcomes.
8.4 Multinomial probit.
9. Bayesian approaches to measurement.
9.1 Bayesian inference for latent states.
9.2 Factor analysis.
9.3 Item-response models.
9.4 Dynamic measurement models.
Part IV: Appendices.
Appendix A: Working with vectors and matrices.
Appendix B: Probability review.
B.1 Foundations of probability.
B.2 Probability densities and mass functions.
B.3 Convergence of sequences of random variables.
Appendix C: Proofs of selected propositions.
C.1 Products of normal densities.
C.2 Conjugate analysis of normal data.
C.3 Asymptotic normality of the posterior density.
References.
Topic index.
Author index.