With most math courses, you can't really know what the course is about until you're in it. This, hopefully, will express some of what linear algebra is about before you decide to take it. Each context below assumes you've taken the corresponding course, be it Algebra I, Geometry, or Trigonometry.
Context: Algebra I
The first confusion most people have over the title of "Linear Algebra" is that it sounds like a step backwards. Students taking it have already had Algebra I and II in high school and have been working with quadratics, cubics, quartics, and more. Why Algebra again? Why linear?
The answer is that it is a step back, in a way, but one that gives a deeper understanding. The objective is to take one thing you do in Algebra I, solving systems of linear equations, recognize that it is really easy, and ask whether it could be simplified, or at least streamlined. The surprising part is that it can be generalized, and some really far-reaching results come out of this study. (You only get to see those if you take a more theory-based course, however.)
So, how have you experienced systems of equations? Most Algebra I courses focus on problems like
\[ 2x + 5y = 17 \\ 3x - 7y = 12 \]
where you are then asked to find the intersection point of the two lines, the \((x,y)\) ordered pair that solves both equations simultaneously. Usually, you're first taught to do this by substitution (solving one equation for a variable, plugging it into the other, and back-substituting to get the rest), then by elimination (multiplying each equation by clever constants so that adding them cancels a variable, then back-substituting to get the rest), and maybe by graphing once or twice to get an approximate answer. Courses tend to hand out 3-equation, 3-variable problems sparingly, and never go past that. They might discuss how you can tell whether a solution exists, without actually finding it, using some strange "determinant" creature, and if your teacher feels really adventurous, they'll teach you Cramer's rule or how to invert 2×2 matrices.
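To make that concrete, here is elimination run on the system above: multiply the first equation by 3 and the second by 2 so the \(x\)-coefficients match, then subtract to cancel \(x\):
\[ \begin{aligned} 6x + 15y &= 51 \\ 6x - 14y &= 24 \end{aligned}
\quad\Longrightarrow\quad 29y = 27, \quad y = \tfrac{27}{29} \]
and back-substituting into \(2x + 5y = 17\) gives \(x = \tfrac{179}{29}\). The steps are entirely mechanical, which is precisely the point: nothing about them cares what the coefficients are.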
Linear algebra tries to answer the big questions that got brushed under the rug above. Things like:
- If we have \(n\) variables, do we always need \(n\) equations? What happens when we have more or fewer? (A small example follows this list.)
- The geometry of the situation gets confusing when we have more than two or three variables — can we do better?
- What's the minimum information needed to determine if the system has a solution?
- Can we consolidate the solving process in any way? Can we generalize that consolidation to more variables and equations?
- (In a theory-based course:) Can we apply solving systems to any other situations? What properties are required for it to work this way?
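To give a taste of the first question, with two variables: one equation is too little information, two (independent) equations are just right, and a careless third spoils everything:
\[ x + y = 2 \ \text{(infinitely many solutions)}
\qquad
\begin{aligned} x + y &= 2 \\ x - y &= 0 \end{aligned} \ \text{(exactly one: } x = y = 1\text{)}
\qquad
\begin{aligned} x + y &= 2 \\ x - y &= 0 \\ x + 2y &= 10 \end{aligned} \ \text{(none)} \]
Linear algebra builds the vocabulary (rank, consistency, and so on) that pins down exactly when each case occurs.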
To streamline the study of these questions, the first tool introduced is the array. The idea is that when you look at enough problems like the one mentioned above, you get tired of writing the variables, which really just serve to "tag" the constants in the equations — which one is the \(x\)-coefficient, and so forth. So, instead of writing the problem as it was above, in linear algebra, we write each variable once:
\[ \newcommand\matrix[1]{\left[\begin{matrix} #1 \end{matrix}\right]}
\matrix{2 & 5 \\ 3 & -7}\matrix{x \\ y} = \matrix{17 \\ 12} \]
and then we define all the arithmetic for these new objects, called matrices (singular: matrix), to make that way of writing work. The reason the notation sticks is that as the big questions start getting answers, we find that more often than not, that first matrix of variable coefficients on the left is the most important one, and it is the only thing you need to study. At that point, you're studying 2×2 matrices rather than the original systems of equations.
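Concretely, the product of a matrix and a column is defined row by row, precisely so that the compact equation unpacks back into the original system:
\[ \matrix{2 & 5 \\ 3 & -7}\matrix{x \\ y} = \matrix{2x + 5y \\ 3x - 7y} = \matrix{17 \\ 12} \]
Every rule of matrix arithmetic is chosen with this kind of bookkeeping in mind.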
These are just the starting points for linear algebra, however. As those big questions start getting answers, more questions arise, and new tools and directions of study are pursued.
Context: Geometry
A fair portion of geometry studies transformations, such as scalings, rotations, and reflections. Each one applies to the whole space (every point is replaced by a unique point) and plays nicely with the shapes you've been looking at. It turns out most of these are "linear transformations" in the sense of linear algebra, and the numerical work becomes easier to organize and understand in those terms.
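A couple of familiar examples, written as matrices acting on points \((x, y)\):
\[ \text{scale by } 2:\ \matrix{2 & 0 \\ 0 & 2}, \qquad
\text{rotate by } \theta:\ \matrix{\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta}, \qquad
\text{reflect across the } x\text{-axis}:\ \matrix{1 & 0 \\ 0 & -1} \]
Applying the transformation to a point is just the matrix-times-column product from the Algebra I context.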
In particular, within geometry, consider the concepts of congruency (two shapes that are "the same" but positioned differently in space) and isometry (a class of transformations that preserve distance, i.e., any two points before the transform have the same distance between them as after the transform). One way I like to relate these concepts in geometry is: Two shapes are congruent if and only if there is an isometry that maps one to the other. Geometers like to classify isometries as translations, rotations, reflections, and any compositions of those, such as glide reflections.
Linear algebra picks up the numerical slack of that study, and formalizes the classification even more. First, it's easy to prove that isometries fixing the origin (so that the origin maps to itself) are linear transformations, and then you start to study linear transformations that preserve norms, linear algebra's notion of length and hence of distance. Then you get to see exactly which matrices correspond to isometries (they are called orthogonal matrices, and a multiple of one, representing a composition of a 45° rotation and a reflection, shows up in the Trigonometry context below) and discover lots of cool properties they have. It becomes easy to separate the rotations from the reflections, as well as to notice that every rotation is a composition of two reflections, though every composition of rotations gives a rotation. It's also easy to show that there are no origin-preserving isometries other than rotations, reflections, and their compositions.
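To give the flavor of the classification: a matrix \(Q\) is orthogonal exactly when its transpose undoes it, \(Q^\top Q = I\), and the rotation matrix above passes the test:
\[ \matrix{\cos\theta & \sin\theta \\ -\sin\theta & \cos\theta}\matrix{\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta} = \matrix{\cos^2\theta + \sin^2\theta & 0 \\ 0 & \sin^2\theta + \cos^2\theta} = \matrix{1 & 0 \\ 0 & 1} \]
The determinant then does the sorting: \(+1\) for rotations, \(-1\) for reflections.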
Long story short, the rich study of the very geometric concept of isometry becomes very clean in the linear algebra framework, and beauty is found. And just as with the motivations from the Algebra I context, this is only the beginning for linear algebra.
Context: Trigonometry
When a skilled linear algebraist looks at the sine sum and difference formulas, they don't see this:
\[ \begin{aligned}\sin(u+v) &= \sin u\cos v + \cos u\sin v \\
\sin(u-v) &= \sin u\cos v - \cos u\sin v\end{aligned} \]
Instead, they gloss over the trig bits, seeing something like
\[ \begin{aligned}s &= x + y \\
t &= x - y\end{aligned}
\quad\text{or in matrix form:}\quad
\matrix{s \\ t} = \matrix{1 & 1 \\ 1 & -1}\matrix{x \\ y} \]
The important thing is that, from either of those views, it's clear this is a solvable system of equations, which means we can invert it, expressing the "variables" \(x\), \(y\) in terms of the "answers" \(s\), \(t\). (Put back into trig terms, we can solve to find what each of \(\sin u\cos v\) and \(\cos u\sin v\) is in terms of \(\sin(u+v)\) and \(\sin(u-v)\).) Using the Algebra I trick of elimination, we can add the two equations together and solve for \(x\) to get
\[ s + t = 2x \\ x = \tfrac12(s+t) \]
and we can either back-substitute to find \(y\), or just re-eliminate by subtracting the two equations to get
\[ s - t = 2y \\ y = \tfrac12(s-t) \]
In full inverted glory, we summarize our results as
\[ \begin{aligned} s &= x + y \\ t &= x - y \end{aligned}
\Longleftrightarrow
\begin{aligned}x &= \tfrac12(s+t) \\ y &= \tfrac12(s-t)\end{aligned}
\\ \text{or in matrix form:}\\
\matrix{1 & 1 \\ 1 & -1}^{-1} = \frac12\matrix{1 & 1 \\ 1 & -1}\]
In matrix form, you may notice something quite neat: it seems almost as though our matrix is its own inverse. There is a section of linear algebra that studies situations like that, specifically orthogonal matrices. (More on that in the Geometry context above.)
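Checking that near-coincidence takes a single multiplication:
\[ \matrix{1 & 1 \\ 1 & -1}\matrix{1 & 1 \\ 1 & -1} = \matrix{2 & 0 \\ 0 & 2} = 2I \]
so the matrix squares to twice the identity, and the rescaled matrix \(\tfrac{1}{\sqrt2}\matrix{1 & 1 \\ 1 & -1}\) really is its own inverse (and is orthogonal). But if you don't abstract away the sines and cosines, all you see is this: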
\[ \begin{aligned}\sin(u+v) &= \sin u\cos v + \cos u\sin v \\
\sin(u-v) &= \sin u\cos v - \cos u\sin v\end{aligned}
\\ \Longleftrightarrow \\
\begin{aligned}\sin u\cos v &= \tfrac12\big(\sin(u+v) + \sin(u-v)\big) \\
\cos u\sin v &= \tfrac12\big(\sin(u+v) - \sin(u-v)\big) \end{aligned} \]
and it can be difficult to look past the gunk and see that something interesting is happening.