
Monday, November 17, 2014

Function Whose Second Derivative is the Square of its First Derivative

I was working on different forms of the Laplacian of a general product of scalar functions (which I might share in a later post), and I came to a point where I wanted to ask myself: when is

$$\nabla f\cdot\nabla f=\Delta f\,?$$

That is similar to asking when

$$\frac{\partial^2 f}{\partial x^2} = \left(\frac{\partial f}{\partial x}\right)^2.$$

In fact, the latter is a special case of the former, and if the partials with respect to $x,y,z$ all satisfy it, then we have a term-wise equality between the dot product of the gradient and the Laplacian. But when does this happen? Well, let's start with a function of one variable. So we have the ODE

$$f''(x)-\left(f'(x)\right)^2=0.$$

First, substitute $g(x) = f'(x)$:

$$g'-g^2=0.$$

Clearly we can write the following:

$$\begin{align*} g&=g\\ g'&=g^2\\ g''=2gg'&=2g^3\\ g^{(3)}=6g^2g'&=6g^4\\ g^{(4)}=24g^3g'&=24g^5\\ &\vdots \end{align*}$$

And in general it appears that

$$g^{(n)}=n!\,g^{n+1}.$$

We can prove that with induction. The base case is done, and the induction step is fairly straightforward:

$$g^{(n+1)} = \left(g^{(n)}\right)' = \left(n!\,g^{n+1}\right)' = \left(n+1\right)n!\,g^ng' = \left(n+1\right)!\,g^{n+2}.$$

Let's try to write the Taylor series for $g$ now, letting $g_0=g(x_0)$:

$$g(x) = \sum_{n=0}^\infty\frac{g^{(n)}(x_0)}{n!}\left(x-x_0\right)^n = \sum_{n=0}^\infty\left(g_0\right)^{n+1}\left(x-x_0\right)^n=g_0\sum_{n=0}^\infty\left(g_0\left(x-x_0\right)\right)^n.$$

This last part looks an awful lot like the geometric series $(1-x)^{-1}$, except we shift $x$ by $x_0$ and scale the value by $g_0$. So really we have

$$g(x) = \frac{g_0}{1-g_0\left(x-x_0\right)}=\frac{-a}{ax+b}$$

where $a=-g_0$ and $b=1+g_0x_0$. Now, since we said that $g=f'$, we just integrate once to find $f$:

$$f(x) = \int\frac{-a}{ax+b}\,\mathrm dx.$$

A simple $u$-substitution with $u=ax+b$ yields

$$f(x) = c - \ln(ax+b).$$

Double checking reveals that this does, in fact, solve our ODE. We could also separate variables and integrate to arrive at this same answer.

Also, a buddy of mine shared the intuition almost immediately that $\ln(x)$ is a basic function that gets us close to the solution, and putting a negative out front fixes the one sign issue that shows up. So we can just guess that our function looks like

$$f(x) = -\ln(g(x))$$

for some function $g(x)$. Putting this into our original equation yields

$$\frac{-g''(x)g(x)+g'(x)^2}{g(x)^2}=\frac{g'(x)^2}{g(x)^2}.$$

So either $g(x) = 0$ or $g''(x)=0$. The former means we'd be dividing by zero (plus it's a boring answer), but the latter just implies that our function is linear, so $g(x)=ax+b$. Lastly, we note that we only restricted the derivative, so we can slap a constant out front:

$$f(x) = c - \ln(ax+b).$$

The same thing we got before!

But how can we come up with a special case where, for $x,y,z$, we have $\nabla f\cdot\nabla f=\Delta f$? Well, a really simple solution would be

$$f(x,y,z) = e - \ln(ax+by+cz+d)$$

with $a,b,c,d,e$ constants. The $by+cz+d$ group would be a "constant" when taking the partial derivative with respect to $x$, and the rest of the partials would follow similarly!
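Both claims are easy to double check symbolically. Here's a minimal sketch using SymPy (assuming it's available; the symbol names just mirror the constants above) that confirms $f(x)=c-\ln(ax+b)$ satisfies $f''=(f')^2$ and $f(x,y,z)=e-\ln(ax+by+cz+d)$ satisfies $\nabla f\cdot\nabla f=\Delta f$:

```python
# Quick symbolic double check of the two solutions above (requires SymPy).
import sympy as sp

x, y, z = sp.symbols('x y z')
a, b, c, d, e = sp.symbols('a b c d e')

# One variable: f(x) = c - ln(a*x + b) should satisfy f'' - (f')^2 = 0.
f = c - sp.log(a*x + b)
print(sp.simplify(sp.diff(f, x, 2) - sp.diff(f, x)**2))  # -> 0

# Three variables: F = e - ln(a*x + b*y + c*z + d) should satisfy grad.grad = Laplacian.
F = e - sp.log(a*x + b*y + c*z + d)
grad_dot_grad = sum(sp.diff(F, v)**2 for v in (x, y, z))
laplacian = sum(sp.diff(F, v, 2) for v in (x, y, z))
print(sp.simplify(grad_dot_grad - laplacian))  # -> 0
```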

Now, I don't know if this is the only solution to our original equation, since we worked up from the single-variable case. But it's a simple solution.
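One more thought: the guess $f=-\ln(g)$ with $g''=0$ suggests trying $f=-\ln(g)$ with $\Delta g=0$ (that is, $g$ harmonic) in several variables, since plugging in gives $\Delta f-\nabla f\cdot\nabla f=-\Delta g/g$, which vanishes whenever $g$ is harmonic. So the linear-argument solution probably isn't the only one; for example $g(x,y)=xy$ appears to work, and a quick SymPy check (same caveats as above) agrees:

```python
# Another sketch: f = -ln(g) with g harmonic, e.g. g = x*y.
import sympy as sp

x, y = sp.symbols('x y')
f = -sp.log(x*y)
grad_dot_grad = sum(sp.diff(f, v)**2 for v in (x, y))   # grad f . grad f
laplacian = sum(sp.diff(f, v, 2) for v in (x, y))        # Laplacian of f
print(sp.simplify(grad_dot_grad - laplacian))  # -> 0
```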
