Linear Algebra for Machine Learning and Data Science
Instructor: Luis Serrano
Intermediate Level • 7 hours to complete • Flexible Schedule
What You'll Learn
- Represent data as vectors and matrices and identify their properties using concepts of singularity, rank, and linear independence
- Apply common vector and matrix algebra operations like the dot product, inverse, and determinant
- Express certain types of matrix operations as linear transformations, and apply concepts of eigenvalues and eigenvectors to machine learning problems
Skills You'll Gain
Image Analysis
Dimensionality Reduction
Linear Algebra
Data Science
NumPy
Artificial Intelligence
Jupyter
Applied Mathematics
Machine Learning Methods
Python Programming
Data Manipulation
Shareable Certificate
Earn a shareable certificate to add to your LinkedIn profile
Outcomes
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills with hands-on projects
- Earn a shareable career certificate
There are 4 modules in this course
Matrices are commonly used in machine learning and data science to represent data and its transformations. In this week, you will learn how matrices naturally arise from systems of equations and how certain matrix properties can be thought of in terms of operations on systems of equations.
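As a quick illustration of this idea (a minimal NumPy sketch, not part of the course materials; the matrix and system shown are made up for the example):

```python
import numpy as np

# Coefficient matrix and right-hand side for the system
#   2a + 1b = 5
#   1a + 3b = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# A non-zero determinant means the matrix is non-singular,
# so the system has exactly one solution.
print(np.linalg.det(A))        # 5.0 (non-singular)
print(np.linalg.solve(A, b))   # unique solution [1. 3.]
```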
In this week, you will learn how to solve a system of linear equations using the elimination method and row echelon form. You will also learn about an important property of a matrix: its rank. The rank of a matrix is useful in computer vision for compressing images.
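For instance, here is a small NumPy sketch of rank (an illustrative example only; the matrix is made up for this listing):

```python
import numpy as np

# A rank-deficient matrix: the third row is the sum of the first two,
# so it carries no new information.
M = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [1.0, 3.0, 7.0]])

# Rank is 2, not 3: the rows are linearly dependent.
print(np.linalg.matrix_rank(M))

# Elimination would reveal the same thing: one row of the
# row echelon form becomes all zeros.
```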
An individual instance (observation) of data is typically represented as a vector in machine learning. In this week, you will learn about properties and operations of vectors. You will also learn about linear transformations, the matrix inverse, and one of the most important operations on matrices: matrix multiplication. You will see how matrix multiplication naturally arises from the composition of linear transformations. Finally, you will learn how to apply some of the properties of matrices and vectors that you have learned so far to neural networks.
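A minimal NumPy sketch of "matrix multiplication as composition" (the rotation and scaling matrices below are chosen just for illustration):

```python
import numpy as np

# Two linear transformations of the plane: a 90-degree rotation and a scaling.
rotate = np.array([[0.0, -1.0],
                   [1.0,  0.0]])
scale = np.array([[2.0, 0.0],
                  [0.0, 3.0]])

v = np.array([1.0, 1.0])

# Applying the transformations one after the other...
step_by_step = scale @ (rotate @ v)

# ...gives the same result as applying the single product matrix:
# matrix multiplication is composition of linear transformations.
composed = (scale @ rotate) @ v
print(step_by_step, composed)                # [-2. 3.] [-2. 3.]

# The inverse matrix undoes the transformation.
print(np.linalg.inv(rotate) @ (rotate @ v))  # back to [1. 1.]
```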
In this final week, you will take a deeper look at determinants. You will learn how the determinant can be geometrically interpreted as an area and how to calculate the determinant of a product of matrices and of a matrix inverse. The course concludes with eigenvalues and eigenvectors. Eigenvectors are used for dimensionality reduction in machine learning. You will see how eigenvectors naturally follow from the concept of eigenbases.
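A short NumPy sketch of these determinant and eigenvector facts (illustrative matrices only, not course code):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])

# The determinant of a product is the product of determinants,
# and the determinant of an inverse is the reciprocal.
print(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))  # 5.0  5.0
print(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A))      # 0.2  0.2

# Eigenvalues and eigenvectors: A @ v equals eigenvalue * v
# for each eigenvector v (the columns of `vectors`).
values, vectors = np.linalg.eig(A)
v = vectors[:, 0]
print(A @ v, values[0] * v)   # the two printed vectors match
```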