
Approximate solutions to differential equations through polynomial interpolation.


mardlamock


Hello, I was just wondering if there is such a thing as getting an approximate solution to a differential equation as a formula (say, a polynomial) by interpolating an already existing numerical solution (from Runge-Kutta). Is there? I'm doing something of the sort but would like to know if anyone else has done it before. Thanks!
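
A rough sketch of what I mean, in Python (numpy and scipy assumed; the test equation y' = y·cos(x), the interval, and the polynomial degree are just placeholder choices):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Numerically solve y' = y*cos(x), y(0) = 1 with a Runge-Kutta method (RK45).
sol = solve_ivp(lambda x, y: y * np.cos(x), (0.0, 10.0), [1.0],
                method="RK45", dense_output=True, rtol=1e-8)

# Interpolate the numerical solution at Chebyshev-spaced nodes with a polynomial.
nodes = 5.0 * (1.0 - np.cos(np.linspace(0.0, np.pi, 9)))   # 9 nodes on [0, 10]
coeffs = np.polyfit(nodes, sol.sol(nodes)[0], deg=len(nodes) - 1)
poly = np.poly1d(coeffs)

# The polynomial is now a closed-form approximation to the solution;
# compare it against the exact answer y = exp(sin(x)) on a fine grid.
xs = np.linspace(0.0, 10.0, 200)
print(max(abs(poly(xs) - np.exp(np.sin(xs)))))
```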


Oh my gosh. Calculus. I studied calculus about 25 years ago. I remember absolutely none of it. Sorry, just brings back memories from another life. Carry on.


Hello, I was just wondering if there is such a thing as getting an approximate solution to a differential equation as a formula (say, a polynomial) by interpolating an already existing numerical solution (from Runge-Kutta). Is there? I'm doing something of the sort but would like to know if anyone else has done it before. Thanks!

I used to do that myself when I was a kid! :D

Much easier to come up with the first differential and solve for the precise answer, though.

Best,

-Slashy


I used to do that myself when I was a kid! :D

Much easier to come up with the first differential and solve for the precise answer, though.

Best,

-Slashy

Lol... Solving a differential equation in general is no way in hell easier than making a polynomial approximation. Usually it isn't even possible.


Hello, I was just wondering if there is such a thing as getting an approximate solution to a differential equation as a formula (say, a polynomial) by interpolating an already existing numerical solution (from Runge-Kutta). Is there? I'm doing something of the sort but would like to know if anyone else has done it before. Thanks!

Absolutely. What you are looking for is called the collocation method. Very frequently, the numerical solution will use a collocation method as a single step in an implicit RK scheme. But if you are working on a compact domain and you expect a polynomial of sufficient degree to be a good approximation to the true solution, you can just collocate the entire domain and do it in one go.

Unfortunately, the general collocation method requires solving a non-linear optimization problem. However, in the special case where you are solving y'(x) = f(x), the collocation can be solved analytically. But then you are really just doing numerical integration, and your analytical solution is the quadrature rule for your collocation points.

For solutions approximated with polynomials, Gauss-Legendre quadrature points are a good choice for collocation points.
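
Roughly, a Python sketch of that general (non-linear) collocation with Gauss-Legendre points; the test problem y' = -y², y(0) = 1 on [0, 1] and the degree are arbitrary choices:

```python
import numpy as np
from scipy.optimize import root

# Collocation sketch for y' = f(x, y) with f(x, y) = -y**2, y(0) = 1 on [0, 1].
# The exact solution is y = 1/(1 + x); it is only used to check the result.
f = lambda x, y: -y**2
y0, deg = 1.0, 6

# Gauss-Legendre points on [-1, 1], mapped onto the interval [0, 1].
gl_pts, _ = np.polynomial.legendre.leggauss(deg)
xc = 0.5 * (gl_pts + 1.0)

def residual(c):
    # Candidate polynomial p(x) = y0 + c[0]*x + c[1]*x**2 + ... (so p(0) = y0).
    p = np.polynomial.Polynomial(np.concatenate(([y0], c)))
    # Require p'(x_i) = f(x_i, p(x_i)) at every collocation point.
    return p.deriv()(xc) - f(xc, p(xc))

coeffs = root(residual, np.zeros(deg)).x     # non-linear system in the coefficients
p = np.polynomial.Polynomial(np.concatenate(([y0], coeffs)))

xs = np.linspace(0.0, 1.0, 101)
print(max(abs(p(xs) - 1.0 / (1.0 + xs))))    # error vs. the exact solution
```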


Lol... Solving a differential equation in general is no way in hell easier than making a polynomial approximation. Usually it isn't even possible.

Maybe you're thinking of integrals? Derivatives are pretty easy...

Best,

-Slashy


Well... if you could bear with me, under what circumstances would solving a differential equation be difficult or impossible? This hasn't been my experience.

Best,

-Slashy

The vast, vast majority. Differential equations for which an analytical solution even exists are in the minority, and ones where it is easy to find are rarer still.

A typical example:

f'' + sin(f) = 0.

Write down pretty much any random differential equation and chances are there will be no analytical solution.
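
To make that concrete, the pendulum equation above is exactly the kind of thing you end up treating numerically, and the OP's polynomial idea still works on it; a rough Python sketch (initial conditions, interval, and fit degree are arbitrary):

```python
import numpy as np
from scipy.integrate import solve_ivp

# f'' + sin(f) = 0 rewritten as a first-order system: (f, f')' = (f', -sin(f)).
rhs = lambda t, y: [y[1], -np.sin(y[0])]
sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], method="RK45",
                dense_output=True, rtol=1e-8)

# There is no elementary formula, but a polynomial fitted to the RK output
# still gives a usable approximate formula on this interval.
nodes = 5.0 * (1.0 - np.cos(np.linspace(0.0, np.pi, 13)))
poly = np.poly1d(np.polyfit(nodes, sol.sol(nodes)[0], deg=len(nodes) - 1))
ts = np.linspace(0.0, 10.0, 200)
print(max(abs(poly(ts) - sol.sol(ts)[0])))   # fit error vs. the RK solution
```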


Well... if you could bear with me, under what circumstances would solving a differential equation be difficult or impossible? This hasn't been my experience.

Best,

-Slashy

There are many differential equations that do not have a nice ("elementary") solution. Actually, almost all of them are this ugly. There is also a field called Differential Galois Theory which grew out of deciding whether a given diff. eq. has a nice solution. An example would be f'' + x·f = 0, whose nontrivial solutions cannot be expressed using +, -, ·, /, exp, and integral signs (note that from these we already get ^, sin, cos, log, etc.).
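
For what it's worth, the nontrivial solutions of that example are Airy functions with a reflected argument, i.e. "special functions" that exist precisely because no elementary formula does; a quick numerical sanity check in Python (grid and step size arbitrary):

```python
import numpy as np
from scipy.special import airy

# f'' + x*f = 0 is solved by Airy functions of a reflected argument,
# e.g. f(x) = Ai(-x); check the ODE with a central finite difference.
Ai = lambda t: airy(t)[0]            # airy() returns (Ai, Ai', Bi, Bi')
f = lambda x: Ai(-x)                 # candidate solution f(x) = Ai(-x)

x = np.linspace(-2.0, 2.0, 9)
h = 1e-4
f_xx = (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2
print(max(abs(f_xx + x * f(x))))     # residual of f'' + x*f, close to zero
```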

Even integration can be hard. There is an algorithm by Risch that can find the integral of an elementary function in terms of elementary functions if it exists (and can also decide if no such thing exists). For example, there are no elementary functions whose derivatives are e^(-x²) or x^x, by Liouville's theorem.
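
Concretely, that first one is the reason the error function exists as a named special function; a short Python check that numerical quadrature and erf agree:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

# The antiderivative of exp(-x**2) is not elementary; it is written with the
# error function, a "special function" defined by exactly this integral:
# integral from 0 to x of exp(-t**2) dt = (sqrt(pi)/2) * erf(x).
for x in (0.5, 1.0, 2.0):
    numeric, _ = quad(lambda t: np.exp(-t**2), 0.0, x)   # numerical quadrature
    print(numeric, np.sqrt(np.pi) / 2.0 * erf(x))        # the two should agree
```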


I must just be missing what the OP was asking or what you folks are talking about.

I was under the impression that he was talking about estimating the slope of a curve at a point through interpolation. Is this not the case?

Best,

-Slashy


I must just be missing what the OP was asking or what you folks are talking about.

I was under the impression that he was talking about estimating the slope of a curve at a point through interpolation. Is this not the case?

Best,

-Slashy

No, the OP is talking about differential equations, not just differentials.


Ah, okay then. I'm talking about something completely different. Please disregard :)

Best,

-Slashy

Even just talking about differentiating a function, a solution is certainly not generally possible (e.g. the sum from n=0 to infinity of (a^n)·cos((b^n)·pi·x), with 0<a<1, b an odd integer, and a·b > 1 + 3·pi/2, is not differentiable with respect to x at any point).
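
You can see the trouble numerically (Python sketch; a = 0.5 and b = 13 satisfy the condition above, and the truncation at 20 terms is an arbitrary choice):

```python
import numpy as np

# Truncated Weierstrass-type sum W(x) = sum over n of a^n * cos(b^n * pi * x),
# with 0 < a < 1, b an odd integer and a*b > 1 + 3*pi/2.
a, b, N = 0.5, 13.0, 20

def W(x):
    n = np.arange(N)
    return np.sum(a**n * np.cos(b**n * np.pi * x))

# Difference quotients at a fixed point for shrinking steps: instead of
# settling down to a slope, they blow up as the step shrinks.
x0 = 0.3
for h in 10.0 ** -np.arange(1, 7):
    print(h, (W(x0 + h) - W(x0)) / h)
```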


Even just talking about differentiating a function, a solution is certainly not generally possible (e.g. the sum from n=0 to infinity of (a^n)·cos((b^n)·pi·x), with 0<a<1, b an odd integer, and a·b > 1 + 3·pi/2, is not differentiable with respect to x at any point).

But here we are already talking about a badly behaved function to begin with (and if it were somewhat reasonable, we could still find its derivative by differentiating each summand). On the other hand, e^(-x²) is almost as nice as it gets for a non-polynomial function.


we could still find its derivative by differentiating each summand

Nope, there is no solution to differentiating it.

If you pick any interval c < x < d and keep shrinking d - c, the graph looks just as rough as before, even in the limit (d - c) -> 0, so the difference quotient never settles down to a slope.


Nope, there is no solution to differentiating it.

If you pick any interval c < x < d and keep shrinking d - c, the graph looks just as rough as before, even in the limit (d - c) -> 0, so the difference quotient never settles down to a slope.

Read my entire sentence instead of quoting me out of context please.


I can't see how that is in any way out of context.

By completely ignoring and not quoting the part where I said "if it were somewhat reasonable". That's a subjunctive; I am not claiming it is true, I am speaking hypothetically there, about nicer functions.


By completely ignoring and not quoting the part where I said "if it were somewhat reasonable". That's a subjunctive; I am not claiming it is true, I am speaking hypothetically there, about nicer functions.

I don't understand.

Are you saying, in response to me claiming that functions are not always differentiable in general and giving an example, that your point was pretty much "Yeah, but if you gave a different example that was differentiable, then it would be differentiable"?

If that is your point then, yes, not all functions can be differentiated. I don't see why you would bring that up at all.


Are you saying, in response to me claiming that functions are not always differentiable in general and giving an example, that your point was pretty much "Yeah, but if you gave a different example that was differentiable, then it would be differentiable"?

I was not talking about differentiability there. I was saying that in not-too-ugly cases, the derivative would be given by the sum of the derivatives of the summands. In other words, the usual rules still apply then, which is not at all obvious for infinite sums.

I was just adding more information, and noting that here the problem is that the coefficients are ugly, while e^(-x²) surely is a nice function.

If you look carefully you will see that nowhere did I contradict your statements about that function.


My mistake then, sorry; I misinterpreted you.

No offense taken :-) Sorry if I overreacted. I am just a bit wary of misquotes in this forum after certain (totally unrelated to this) cases by others...

