Carl Friedrich Gauss
Theory of the Combination of Observations Least Subject to Errors, Part One, Part Two, Supplement
Translator: G. W. Stewart
- Paperback
In the 1820s Gauss published two memoirs on least squares, which contain his final, definitive treatment of the area along with a wealth of material on probability, statistics, numerical analysis, and geodesy. These memoirs, originally published in Latin with German Notices, have been inaccessible to the English-speaking community. Here for the first time they are collected in an English translation.
Product details
- Publisher: Society for Industrial and Applied Mathematics (SIAM)
- Pages: 253
- Publication date: January 1, 1987
- Language: English
- Dimensions: 228mm x 152mm x 14mm
- Weight: 499g
- ISBN-13: 9780898713473
- ISBN-10: 0898713471
- Item no.: 27029734
- Manufacturer information:
- Libri GmbH
- Europaallee 1
- 36244 Bad Hersfeld
- gpsr@libri.de
1. Translator's Introduction
2. Pars Prior/Part One: Random and regular errors in observations
3. Regular errors excluded
4. Their treatment
5. General properties of random errors
6. The distribution of the error
7. The constant part or mean value of the error
8. The mean square error as a measure of uncertainty
9. Mean error, weight, and precision
10. Effect of removing the constant part
11. Interpercentile ranges and probable error
12. Properties of the uniform, triangular, and normal distributions
13. Inequalities relating the mean error and interpercentile ranges
14. The fourth moments of the uniform, triangular, and normal distributions
15. The distribution of a function of several errors
16. The mean value of a function of several errors
17. Some special cases
18. Convergence of the estimate of the mean error
19. The mean error of the estimate itself
20. The mean error of the estimate for the mean value
21. Combining errors with different weights
22. Overdetermined systems of equations
23. The problem of obtaining the unknowns as combinations of observations
24. The principle of least squares
25. The mean error of a function of quantities with errors
26. The regression model
27. The best combination for estimating the first unknown
28. The weight of the estimate
29. Estimates of the remaining unknowns and their weights
30. Justification of the principle of least squares
31. The case of a single unknown
32. The arithmetic mean
Pars Posterior/Part Two: Existence of the least squares estimates
33. Relation between combinations for different unknowns
34. A formula for the residual sum of squares
35. Another formula for the residual sum of squares
36. Four formulas for the residual sum of squares as a function of the unknowns
37. Errors in the least squares estimates as functions of the errors in the observations
38. Mean errors and correlations
39. Linear functions of the unknowns
40. Least squares with a linear constraint
41. Review of Gaussian elimination
42. Abbreviated computation of the weights of the unknowns
43. Computational details
44. Abbreviated computation of the weight of a linear function of the unknowns
45. Updating the unknowns and their weights when a new observation is added to the system
46. Updating the unknowns and their weights when the weight of an observation changes
47. A bad formula for estimating the errors in the observations from the residual sum of squares
48. The correct formula
49. The mean error of the residual sum of squares
50. Inequalities for the mean error of the residual sum of squares
51. The case of the normal distribution
Supplementum/Supplement: Problems having constraints on the observations
52. Reduction to an ordinary least squares problem
53. Functions of the observations
54. Their mean errors
55. Estimating a function of observations that are subject to constraints
56. Characterization of permissible estimates
57. The function that gives the most reliable estimate
58. The value of the most reliable estimate
59. Four formulas for the weight of the value of the estimate
60. The case of more than one function
61. The most reliable adjustments of the observations and their use in estimation
62. Least squares characterization of the most reliable adjustment
63. Difficulties in determining weights
64. A better method
65. Computational details
66. Existence of the estimates
67. Estimating the mean error in the observations
68. Estimating the mean error in the observations, continued
69. The mean error in the estimate
70. Incomplete adjustment of observations
71. Relation between complete and incomplete adjustments
72. A block iterative method for adjusting observations
73. The inverse of a symmetric system is symmetric
74. Fundamentals of geodesy
75. De Krayenhoff's triangulation
76. A triangulation from Hannover
77. Determining weights in the Hannover triangulation
Anzeigen/Notices: Part One. Part Two. Supplement
Afterword: Gauss's Schooldays
78. Legendre and the Priority Controversy
79. Beginnings: Mayer, Boscovich, and Laplace
80. Gauss and Laplace
81. The Theoria Motus
82. Laplace and the Central Limit Theorem
83. The Theoria Combinationis Observationum
84. The Precision of Observations
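The principle of least squares (item 24 above) is the book's central idea: for an overdetermined system, choose the unknowns that minimize the residual sum of squares. A minimal sketch in modern notation, solving via the normal equations that correspond to Gauss's formulation; the data here are invented purely for illustration:

```python
import numpy as np

# Overdetermined system: four "observations" of two unknowns
# (intercept and slope of a fitted line).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([6.0, 5.0, 7.0, 10.0])

# Normal equations (A^T A) x = A^T b give the least squares estimate.
x = np.linalg.solve(A.T @ A, A.T @ b)

# Residual sum of squares at the minimizer.
rss = float(np.sum((A @ x - b) ** 2))
```

For these data the estimate is x = (3.5, 1.4) with residual sum of squares 4.2. In practice one would call `np.linalg.lstsq(A, b)` rather than form the normal equations explicitly, but the explicit form mirrors the elimination-based computations the memoirs describe.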