Numerical Methods
Algebra and Analysis 3 course, specialty: Mathematics

Cluster A
1. Matrices. Matrix theory. Powers of a matrix. Vector norms. Matrix norms and absolute value (modulus). Theorem 1.
2. Matrices. Matrix theory. Consistent matrix norm. Condition number. Eigenvalues of a matrix. Matrix limits. Lemmas 1, 2.
3. Matrix series. Necessary condition for convergence of a matrix series. Theorems 1, 2. Sufficient condition. Theorem 3. Matrix power series. Theorems 4, 5. Corollary. Remark. Example.
4. Adjugate matrix. Cayley-Hamilton theorem.
5. Numerical methods for systems of linear algebraic equations (NMLAES). Direct Gauss method. Theorems 1, 2.
6. NMLAES. Square-root method.
7. NMLAES. Cholesky factorization. Remarks 1, 2.
8. Iterative methods for LAES. Jacobi method. Theorem 1.
9. Iterative methods for LAES. General schemes. Sufficient condition for convergence of the iterative process. Theorem 2. Error estimate.
10. Necessary and sufficient conditions for convergence of the iterative process for LAES. Theorem 1. Corollary 3. Remark 1.
11. General iterative methods for LAES. Principles of construction of iterative methods.
12. Properties of symmetric matrices. Theorems 1-6. Errors: error functional, residual functional.
13. Gauss-Seidel method. Sufficient condition for convergence of the Gauss-Seidel iterative process. Theorems 7-9.
14. Minimal residual method and its convergence.
15. Method of steepest descent and its convergence. Theorem 1.
16. Error functional and its properties.
17. Quadratic functional and its minimum.

Cluster B
18. General iterative methods for solving algebraic and transcendental equations. Contraction-mapping principle and its applications.
19. Isolation of the roots of algebraic and transcendental equations (root finding). Theorems 1, 1', 2. Examples 1, 2. Graphical method and its exactness.
20. Isolation of the roots of algebraic and transcendental equations (root finding). Theorems 1, 1', 2. Examples 1, 2. Bisection method and its exactness.
21. General iterative methods for solving algebraic and transcendental equations. Idea of the methods.
22. Contraction-mapping principle. Theorem 3 (contraction-mapping principle). Order of iteration.
23. Secant (chord) method and its exactness.
24. Newton (tangent) method and its exactness.
25. Mixed methods and their exactness.
26. Newton-Kantorovich method for systems.
27. False-position method, Steffensen's method and their exactness.
28. Wall's method and its exactness.
29. Modifications of the Newton-Kantorovich method.
30. Newton-Kantorovich method for operator equations P(x) = 0.
31. Properties of polynomials. Lagrange interpolation. Errors of polynomial interpolation.
32. Newton interpolation. Errors of polynomial interpolation.
33. Divided differences and their properties.
34. Finite differences and their properties.

Cluster C
35. Gauss interpolation. Errors of polynomial interpolation.
36. Stirling interpolation.
37. Bessel interpolation.
38. Derivatives and finite differences. Higher-order numerical derivatives.
39. Multipoint first-order numerical derivatives.
40. Integrals and finite sums. Newton-Cotes integration rules.
41. Richardson extrapolation.
42. Trapezium rule and its exactness.
43. Simpson's rule and its exactness.
44. Gaussian quadrature rules. Exactness.
45. Random variables. Discrete random variables. Probability density and distribution functions (PDF).
46. Expectation (mean), variance and transforms.
47. Modelling of random variables; von Neumann modelling.
48. Chebyshev inequality.
49. Essential (importance) random sampling methods.
50. Estimation of an integral by Monte Carlo methods. Algorithm and exactness.
51. Example of estimating an integral by Monte Carlo methods and the variance estimate.

Professor Kanat Shakenov
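As an illustration of the bisection method (topic 20), here is a minimal sketch; the test function and tolerance are illustrative choices, not taken from the course materials:

```python
def bisect(f, a, b, tol=1e-10):
    """Bisection: f must change sign on [a, b]; the bracket halves each step,
    so the error after k steps is at most (b - a) / 2**k."""
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m          # root lies in the left half
        else:
            a, fa = m, f(m)  # root lies in the right half
    return (a + b) / 2

# Root of x^2 - 2 on [1, 2] approximates sqrt(2).
root = bisect(lambda x: x * x - 2, 1.0, 2.0)
```

The guaranteed halving of the bracket is what the "exactness" item refers to: the error bound is known in advance from the initial interval length.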
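For the Newton (tangent) method (topic 24), a minimal sketch under the usual assumptions (a simple root and an available derivative); the stopping rule and example equation are illustrative:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton (tangent) iteration: x_{k+1} = x_k - f(x_k) / f'(x_k).
    Converges quadratically near a simple root."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Cube root of 2 as the root of x^3 - 2.
root = newton(lambda x: x ** 3 - 2, lambda x: 3 * x ** 2, 1.0)
```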
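Lagrange interpolation (topic 31) can be sketched directly from the basis-polynomial formula; the nodes and test point below are illustrative:

```python
def lagrange(xs, ys, x):
    """Evaluate the Lagrange interpolation polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        li = 1.0  # i-th basis polynomial l_i(x)
        for j, xj in enumerate(xs):
            if j != i:
                li *= (x - xj) / (xi - xj)
        total += yi * li
    return total

# Three nodes determine a quadratic, so y = x^2 is reproduced exactly.
val = lagrange([0.0, 1.0, 2.0], [0.0, 1.0, 4.0], 1.5)
```

Since the interpolant through n + 1 nodes is exact for polynomials of degree at most n, the value at 1.5 equals 1.5**2 = 2.25, which connects to the error term discussed in topics 31-32.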
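The trapezium and Simpson rules (topics 42-43) can be compared in a short sketch; the integrand and the number of subintervals are illustrative:

```python
import math

def trapezium(f, a, b, n):
    """Composite trapezium rule on n subintervals; error is O(h^2)."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

def simpson(f, a, b, n):
    """Composite Simpson rule; n must be even; error is O(h^4)."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

# The integral of sin on [0, pi] equals 2 exactly.
t = trapezium(math.sin, 0.0, math.pi, 100)
s = simpson(math.sin, 0.0, math.pi, 100)
```

With the same 100 subintervals, Simpson's O(h^4) error is several orders of magnitude smaller than the trapezium rule's O(h^2) error, which is the "exactness" comparison these topics make.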
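For Monte Carlo integration with a variance estimate (topics 50-51), a minimal sketch; the sample size, seed, and integrand are illustrative choices:

```python
import math
import random

def monte_carlo(f, a, b, n, seed=0):
    """Estimate the integral of f on [a, b] from n uniform samples.
    Returns the estimate and its standard error (b - a) * sqrt(var / n)."""
    rng = random.Random(seed)
    vals = [f(a + (b - a) * rng.random()) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)  # sample variance
    return (b - a) * mean, (b - a) * math.sqrt(var / n)

# Integral of sin on [0, pi] is 2; the error shrinks like 1/sqrt(n).
est, err = monte_carlo(math.sin, 0.0, math.pi, 100_000)
```

The returned standard error is the sample-based variance estimate that topic 51 asks for; the O(1/sqrt(n)) rate is the "exactness" of topic 50.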