
Practical Linear Algebra for Developers
Description
Book Introduction
- Learn linear algebra intuitively using Python without complex proofs and formulas.
- Practice problems + answers + explanations, free sample book provided
How can we efficiently learn linear algebra, the foundation of virtually all modern analysis and algorithms? Should we memorize formulas or wade through abstract proofs, as in traditional courses? Those approaches are time-consuming and tedious.
Unlike conventional texts built on complex proofs, this book guides you to an intuitive grasp of linear algebra concepts through practical Python code.
You will also gain hands-on experience by implementing linear algebra applications used in the field through a wealth of practice problems.
Learn the concepts and real-world applications of linear algebra with this book and apply them immediately to your work.
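To give a feel for the code-first approach the book advertises, here is a minimal NumPy sketch of the kind of exercise it describes (a hypothetical illustration, not code taken from the book): creating vectors and trying the basic operations covered in Chapter 1.

```python
import numpy as np

# Hypothetical example in the spirit of Chapter 1 (not from the book):
# create two vectors and explore the basic operations on them.
v = np.array([1, 2])
w = np.array([3, -1])

print(v + w)              # vector addition -> [4  1]
print(3 * v)              # scalar-vector multiplication -> [3  6]
print(np.dot(v, w))       # inner (dot) product -> 1
print(np.linalg.norm(v))  # vector magnitude -> sqrt(5)
```

Each operation is a single NumPy call, which is what lets the book trade formal derivations for direct experimentation.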
Table of Contents
Chapter 1 Vectors, Part 1: Vectors and Basic Vector Operations
1.1 Creating and Visualizing Vectors with NumPy
_1.1.1 Geometric interpretation of vectors
1.2 Vector operations
_1.2.1 Addition of two vectors
_1.2.2 Geometric interpretation of addition and subtraction of vectors
_1.2.3 Scalar-Vector Multiplication
_1.2.4 Scalar-Vector Addition
_1.2.5 Transpose
_1.2.6 Vector Broadcasting in Python
1.3 Vector magnitude and unit vector
1.4 Vector inner product
_1.4.1 Distributive law of inner product
_1.4.2 Geometric interpretation of the inner product
1.5 Other vector multiplications
_1.5.1 Hadamard product
_1.5.2 Outer product
_1.5.3 Cross product and triple product
1.6 Orthogonal vector decomposition
1.7 In conclusion
Practice problems
Chapter 2 Vectors, Part 2: Extended Concepts of Vectors
2.1 Vector sets
2.2 Linear weighted combination
2.3 Linear independence
_2.3.1 Linear independence in mathematics
_2.3.2 Independence and the Zero Vector
2.4 Subspaces and span
2.5 Basis
_2.5.1 Definition of a basis
2.6 In conclusion
Practice problems
Chapter 3 Vector Applications: Vectors in Data Analysis
3.1 Correlation and Cosine Similarity
3.2 Time series filtering and feature detection
3.3 k-means clustering
Practice problems
Chapter 4 Matrices, Part 1: Matrices and Basic Matrix Operations
4.1 Creating and Visualizing Matrices in NumPy
_4.1.1 Matrix Visualization, Indexing, and Slicing
_4.1.2 Special matrices
4.2 Matrix Mathematics: Addition, Scalar Multiplication, Hadamard Product
_4.2.1 Addition and Subtraction
_4.2.2 Matrix 'shifting'
_4.2.3 Scalar multiplication and Hadamard product
4.3 Standard matrix multiplication
_4.3.1 Rules for matrix multiplication validity
_4.3.2 Matrix Multiplication
_4.3.3 Matrix-Vector Multiplication
4.4 Matrix Operations: Transpose
_4.4.1 Inner and outer product notation
4.5 Matrix Operations: LIVE EVIL (Operation Order)
4.6 Symmetric matrices
_4.6.1 Creating a Symmetric Matrix from a Nonsymmetric Matrix
4.7 In conclusion
Practice problems
Chapter 5 Matrices, Part 2: Extended Concepts of Matrices
5.1 Matrix Norm
_5.1.1 Matrix trace and Frobenius norm
5.2 Matrix spaces (column, row, null)
_5.2.1 Column space
_5.2.2 Row space
_5.2.3 Null space
5.3 Rank
_5.3.1 Ranks of special matrices
_5.3.2 Rank of added and multiplied matrices
_5.3.3 Rank of a shifted matrix
_5.3.4 Theory and practice
5.4 Rank applications
_5.4.1 Is the vector in the column space?
_5.4.2 Linear independence of vector sets
5.5 Determinant
_5.5.1 Calculating the Determinant
_5.5.2 Linear Dependency and Determinants
_5.5.3 Characteristic polynomial
5.6 In conclusion
Practice problems
Chapter 6 Applications of Matrices: Matrices in Data Analysis
6.1 Multivariate data covariance matrix
6.2 Geometric transformations via matrix-vector multiplication
6.3 Image Feature Detection
6.4 In conclusion
Practice problems
Chapter 7 Inverse Matrices: The Universal Key to Matrix Equations
7.1 Inverse matrix
7.2 Types of inverse matrices and conditions of invertibility
7.3 Calculating the inverse matrix
_7.3.1 Inverse of a 2×2 matrix
_7.3.2 Inverse of a diagonal matrix
_7.3.3 Inverse of any full-rank square matrix
_7.3.4 One-sided inverses
7.4 Uniqueness of the Inverse Matrix
7.5 Moore-Penrose pseudoinverse
7.6 Numerical stability of the inverse matrix
7.7 Geometric interpretation of the inverse matrix
7.8 In conclusion
Practice problems
Chapter 8 Orthogonal Matrices and QR Decompositions: Core Decomposition Methods in Linear Algebra 1
8.1 Orthogonal matrices
8.2 Gram-Schmidt process
8.3 QR decomposition
_8.3.1 Size of Q and R
_8.3.2 QR decomposition and inverse
8.4 In conclusion
Practice problems
Chapter 9 Row Reduction and LU Factorization: Core Factorization Methods in Linear Algebra 2
9.1 Systems of linear equations
_9.1.1 Converting a system of linear equations into a matrix
_9.1.2 Handling matrix equations
9.2 Row reduction
_9.2.1 Gaussian elimination
_9.2.2 Gauss-Jordan Elimination
_9.2.3 Calculating the inverse matrix using Gauss-Jordan elimination
9.3 LU decomposition
_9.3.1 Row swapping via permutation matrices
9.4 In conclusion
Practice problems
Chapter 10 General Linear Models and Least Squares: A Guide to Understanding the Universe
10.1 General linear model
_10.1.1 Terminology
_10.1.2 Building a General Linear Model
10.2 GLM solution
_10.2.1 Is the solution correct?
_10.2.2 Geometrical Perspectives on Least Squares Method
_10.2.3 How does the least squares method work?
10.3 A Simple Example of GLM
10.4 Least Squares Method via QR Decomposition
10.5 In conclusion
Practice problems
Chapter 11 Least Squares Applications: Least Squares Using Real Data
11.1 Weather-dependent bicycle rental volume forecast
_11.1.1 Regression Analysis Table Using statsmodels
_11.1.2 Multicollinearity
_11.1.3 Regularization
11.2 Polynomial regression
11.3 Finding Model Parameters with Grid Search
11.4 In conclusion
Practice problems
Chapter 12 Eigenvalue Decomposition: The Pearl of Linear Algebra
12.1 Interpretation of Eigenvalues and Eigenvectors
_12.1.1 Geometric Interpretation of Eigenvalues and Eigenvectors
_12.1.2 Statistics (Principal Component Analysis)
_12.1.3 Noise Reduction
_12.1.4 Dimensionality Reduction (Data Compression)
12.2 Finding Eigenvalues
12.3 Finding Eigenvectors
_12.3.1 Sign and magnitude uncertainty of eigenvectors
12.4 Diagonalization of a square matrix
12.5 Special Features of Symmetric Matrices
_12.5.1 Orthogonal Eigenvectors
_12.5.2 Real Eigenvalues
12.6 Eigenvalue decomposition of a singular matrix
12.7 Quadratic form, definiteness, and eigenvalues
_12.7.1 The quadratic form of a matrix
_12.7.2 Definiteness
_12.7.3 AᵀA is positive (semi)definite
12.8 Generalized Eigenvalue Decomposition
12.9 In conclusion
Practice problems
Chapter 13 Singular Value Decomposition: The Next Step in Eigenvalue Decomposition
13.1 SVD Overview
_13.1.1 Singular values and matrix rank
13.2 SVD in Python
13.3 SVD of a matrix and rank-1 'layers'
13.4 SVD from EIG
_13.4.1 SVD of AᵀA
_13.4.2 Converting singular values to explained variance
_13.4.3 Condition number of matrix
13.5 SVD and the Moore-Penrose pseudoinverse
13.6 In conclusion
Practice problems
Chapter 14 Eigenvalue Decomposition and SVD Applications: A Gift from Linear Algebra
14.1 Principal Component Analysis (PCA) Using Eigenvalue Decomposition and SVD
_14.1.1 Mathematics of PCA
_14.1.2 PCA execution steps
_14.1.3 PCA via SVD
14.2 Linear Discriminant Analysis
14.3 Low-rank approximation via SVD
_14.3.1 Noise removal using SVD
14.4 In conclusion
Practice problems
APPENDIX A Python Tutorial
A.1 Why use Python?
A.2 IDE (Integrated Development Environment)
A.3 Using Python Locally and Online
A.4 Variables
A.5 Functions
A.6 Visualization
A.7 Converting formulas to code
A.8 Output Formats and F-Strings
A.9 Control Flow
A.10 Execution Time Measurement
A.11 Additional Learning
A.12 In conclusion

Publisher's Review
Learn the fundamentals of linear algebra with Python, not pen and paper!
As data science and machine learning take the lead in the IT field, linear algebra, the foundation of these technologies, is drawing renewed attention.
For developers working in the field, however, existing linear algebra textbooks can feel overly complex and dry.
This book lets you learn the core of linear algebra quickly and intuitively by implementing code in Python rather than working through complex proofs.
It covers developer-oriented linear algebra theory, from the basic concepts of vectors and matrices to LU decomposition, QR decomposition, eigenvalue and singular value decomposition, and principal component analysis.
You can also learn and implement various applications of linear algebra used in real workplaces through the practice problems.
Finally, for readers still unfamiliar with Python, an appendix covers the basics of the language. By learning basic Python and practicing the code in the book, you can improve both your linear algebra knowledge and your Python skills.
Kill two birds with one stone with this book: linear algebra and Python!
Key Contents
- Concepts and applications of vectors and matrices
- Vector and matrix operations (various multiplications and transformations)
- Matrix independence, rank, and inverse matrices
- Key decompositions used in applied linear algebra (LU and QR decomposition)
- Eigenvalue decomposition and singular value decomposition
- Introduction to application areas, including principal component analysis
- Verifying formulas and simplifying calculations with Python
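The last item in the list above, verifying formulas with Python instead of proving them by hand, can be sketched as follows (a hypothetical example in that spirit, not code from the book): numerically checking the identity, covered in Chapter 13, that the eigenvalues of AᵀA are the squared singular values of A.

```python
import numpy as np

# Hypothetical sketch (not from the book): verify numerically that the
# eigenvalues of A.T @ A equal the squared singular values of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # any random tall matrix will do

singular_values = np.linalg.svd(A, compute_uv=False)          # descending order
eigenvalues = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]      # sort descending

print(np.allclose(singular_values**2, eigenvalues))  # prints True
```

A few lines of NumPy replace a symbolic derivation: generate a random matrix, compute both sides, and confirm they match to floating-point tolerance.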
Product Details
- Publication date: September 25, 2023
- Pages, weight, size: 356 pages | 761 g | 183 × 235 × 30 mm
- ISBN-13: 9791169211451
- ISBN-10: 1169211453