post

2025-07-15 10:34:44 -04:00
parent 34e578cee2
commit 604dfd609c
2 changed files with 58 additions and 0 deletions


@@ -1,5 +1,8 @@
[!]
[=post-]
<p>
<i><a href="[^baseurl]/posts/differential-week-9.html">Next post in this series</a></i>
</p>
<p>
Hello again! Our second midterm is getting close (you can read the review bit on it <a href="[^baseurl]/posts/differential-exam-2.html">here</a>), but we aren't talking about
that now: the <i>sujet du jour</i> is week 8, and we're getting deep


@@ -0,0 +1,55 @@
[!]
[=post-]
<p>
Hello, everyone! Welcome back to Deadly Boring Math for yet another week. The semester is rapidly drawing to a close; we've just passed week 9, and final exams are just a few weeks away.
It's gonna be fun...
</p>
<h2>Convolution Integrals</h2>
<p>
Convolution integrals are integrals of the form `int_0^t f(t - tau)g(tau) d tau`. They are a common result of Variation of Parameters, and the <i>convolution theorem</i> gives us a really useful
identity: `L(int_0^t f(t - tau) g(tau) d tau) = L(f(t)) L(g(t))`. This is obviously very powerful. For instance, when solving
`y'' + y = g(t), y(0) = 0, y'(0) = 0`, we end up with `y = int_0^t sin(t - tau) g(tau) d tau`: evaluating this directly would be unpleasant, but we can just
take Laplace transforms of both sides, getting `Y(s) = frac 1 { s^2 + 1 } G(s)`.
</p>
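<p>
If you want to sanity-check the convolution theorem yourself, here's a quick SymPy sketch (Python; `f(t) = sin(t)` and `g(t) = t` are just functions I picked because both routes are easy to compute by hand):
</p>
<pre>
import sympy as sp

t, tau, s = sp.symbols('t tau s', positive=True)

# Direct route: evaluate the convolution integral int_0^t sin(t - tau) * tau dtau
conv = sp.integrate(sp.sin(t - tau) * tau, (tau, 0, t))
print(sp.simplify(conv))  # t - sin(t)

# Transform route: multiply the Laplace transforms, then invert
F = sp.laplace_transform(sp.sin(t), t, s, noconds=True)  # 1/(s**2 + 1)
G = sp.laplace_transform(t, t, s, noconds=True)          # s**(-2)
back = sp.inverse_laplace_transform(F * G, s, t)
print(sp.simplify(back))  # t - sin(t) again (SymPy may tack on a Heaviside factor)
</pre>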
<p>
These are often used to solve equations in the form `ay'' + by' + cy = g(t)`. In these cases, we take the Laplace transform to get
`(as^2 + bs + c)Y(s) - (as + b) y(0) - a y'(0) = G(s)`, and let `H(s) = frac 1 {as^2 + bs + c}`: this allows us to rewrite as
`Y(s) = H(s) ((as + b) y(0) + a y'(0)) + H(s) G(s)`. Taking the inverse Laplace transform of this is not necessarily simple: the first term
is just `H(s)` times a polynomial in `s` (the initial conditions are constants), so we can partial fraction decompose there, but the second term `H(s) G(s)` is going to be much harder.
Fortunately, we can rewrite it as a convolution integral! Letting `h(t) = L^{-1}(H(s))`, we get `y = L^{-1}(H(s) ((as + b) y(0) + a y'(0))) + int_0^t h(t - tau) g(tau) d tau`. Finding `h(t - tau)`
is easy, and we already have `g(t)`.
</p>
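<p>
To make that concrete, here's a small SymPy sketch with coefficients I made up (`a = 1`, `b = 3`, `c = 2`, zero initial conditions):
</p>
<pre>
import sympy as sp

t, s = sp.symbols('t s', positive=True)

# y'' + 3y' + 2y = g(t), y(0) = y'(0) = 0
H = 1 / (s**2 + 3*s + 2)

# h(t) = L^{-1}(H(s)); partial fractions give 1/(s + 1) - 1/(s + 2)
h = sp.inverse_laplace_transform(H, s, t)
print(h)  # exp(-t) - exp(-2*t) (possibly times a Heaviside factor)

# The solution is then y(t) = int_0^t h(t - tau) g(tau) dtau,
# for whatever forcing g(t) the problem throws at us.
</pre>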
<p>
Note that the first term is the complementary solution, or <i>free response</i>, and the second term is the <i>forced response</i>.
</p>
<p>
There are quite a few situations where the forced response is much more significant than the free response. We analyze these situations
with a <i>transfer function</i>: the ratio of the Laplace transform of the forced response to the Laplace transform of the input, which is conveniently always equal to `H(s)`. `H(s)` also has the property that
its inverse Laplace transform `h(t)` is the <i>impulse response</i>: the solution in the situation where `g(t) = delta(t)` and the initial conditions are zero.
</p>
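<p>
Sticking with the `y'' + y = g(t)` example from above, a quick sketch of that property:
</p>
<pre>
import sympy as sp

t, s = sp.symbols('t s', positive=True)

# For y'' + y = g(t) with zero initial conditions, H(s) = 1/(s^2 + 1).
# A unit impulse has G(s) = L(delta(t)) = 1, so Y(s) = H(s) and y(t) = h(t):
H = 1 / (s**2 + 1)
h = sp.inverse_laplace_transform(H, s, t)
print(h)  # sin(t) -- the impulse response of this system
</pre>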
<h2>Briefly: Stability and Autonomous Systems</h2>
<p>
We've already covered autonomous <i>equations</i>: a system is just two (or more) related autonomous equations. The critical idea is that they do not depend explicitly on time: `t` never appears
on the right-hand side, so the differential equations themselves are timeless; time only enters once we solve.
</p>
<p>
An autonomous system in the form `vec x' = f(vec x)` has critical points wherever `vec x' = 0`. These critical points are classified in exactly the same ways we already know: they
are stable, unstable, or asymptotically stable. A more rigorous definition than "stays close" or "goes away" is this: a critical point `vec delta` is stable if, for every `epsilon > 0`,
every solution that starts close enough to `vec delta` satisfies `|| vec x(t) - vec delta || < epsilon` for all `t > 0`. If <i>not</i>, it's unstable.
If, in addition, `|| vec x(t) - vec delta ||` goes to 0 as `t -> oo`, it's asymptotically stable.
</p>
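<p>
For a linear autonomous system `vec x' = A vec x`, the classification comes straight from the eigenvalues of `A`; a minimal SymPy sketch with a matrix I made up:
</p>
<pre>
import sympy as sp

# Linear autonomous system vec x' = A vec x; the origin is a critical point.
A = sp.Matrix([[0, 1],
               [-2, -3]])

# Eigenvalue test: all eigenvalues with negative real part means
# asymptotically stable; any with positive real part means unstable.
print(A.eigenvals())  # {-2: 1, -1: 1} -> asymptotically stable
</pre>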
<h2>Almost Linearity</h2>
<p>
Some nonlinear autonomous systems behave <i>sorta</i> linear near their critical points. Specifically, this is true for an autonomous system in the form `vec x' = A vec x + g(vec x)`
where `g` is small relative to `vec x` close to the critical point (take it to be the origin); that is, `lim_{vec x -> vec 0} frac { || g(vec x) || } { || vec x || } = 0`. This is usually pretty easy to determine.
</p>
<p>
These <i>almost linear</i> systems can, unsurprisingly, be linearized! It's really as simple as dropping the `g(vec x)` term from `vec x' = A vec x + g(vec x)` to get
the linear `vec x' = A vec x`. This works because `g(vec x)` is small compared to `vec x` near the critical point, so it can be treated as negligible there.
</p>
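<p>
Here's the whole process in a SymPy sketch, on an example I'm pulling in just for illustration (a damped pendulum):
</p>
<pre>
import sympy as sp

x, y = sp.symbols('x y')

# Nonlinear autonomous system:  x' = y,  y' = -sin(x) - y
F = sp.Matrix([y, -sp.sin(x) - y])

# Near (0, 0), sin(x) = x - x**3/6 + ..., so g(vec x) = (0, x - sin(x))
# vanishes faster than || vec x ||: the system is almost linear there.
# The linearized A is the Jacobian evaluated at the critical point:
A = F.jacobian([x, y]).subs({x: 0, y: 0})
print(A)              # Matrix([[0, 1], [-1, -1]])
print(A.eigenvals())  # -1/2 +/- sqrt(3)/2 * I -> asymptotically stable spiral
</pre>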
[/]
[=title "Differential Equations Week 9"]
[=author "Tyler Clarke"]
[=date "2025-7-15"]
[=subject "Calculus"]
[#post.html]