The asymptotic expansion of an integral of an exponential function

What is the asymptotic expansion of $f(x) := \int_0^1 e^{-x(1-u^2)}\,du$?

The integrand is sharply peaked at $u=1$ and declines steeply away from that point. I tried to transform $u$ into something suitable for the method of steepest descent, but have not found an appropriate transformation. Integration by parts has not yielded a satisfactory result, perhaps because I have not found the right components to integrate.
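
For a rough sense of the decay, here is a minimal numerical sketch (assuming NumPy and SciPy are available; the sample values of $x$ are arbitrary) that evaluates the integral by quadrature. The values track $1/(2x)$ closely for large $x$, consistent with the expansions in the answers below.

    import numpy as np
    from scipy.integrate import quad

    def f(x):
        # f(x) = \int_0^1 exp(-x*(1 - u^2)) du, evaluated by adaptive quadrature
        val, _ = quad(lambda u: np.exp(-x * (1.0 - u**2)), 0.0, 1.0)
        return val

    for x in (10.0, 100.0, 1000.0):
        print(x, f(x), 1.0 / (2.0 * x))  # compare with the leading-order term 1/(2x)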







asked Aug 22 at 4:21 (edited Aug 22 at 8:34) – Hans




















2 Answers

















Accepted answer:










The minimum of $1-u^2$ on $[0,1]$ occurs at $u=1$, and near there we have $1-u^2 \approx 2(1-u)$, which is linear in the quantity $1-u$. This suggests the change of variables $1-u^2 = v$, which makes the exponent exactly linear in $v$, giving

$$
\int_0^1 e^{-x(1-u^2)}\,du = \frac{1}{2} \int_0^1 e^{-xv} (1-v)^{-1/2}\,dv.
$$

Then, following Watson's lemma, we can get the asymptotic expansion by expanding the subdominant factor $(1-v)^{-1/2}$ around $v=0$ and integrating term by term from $v=0$ to $v=\infty$:

$$
\begin{align}
\int_0^1 e^{-x(1-u^2)}\,du &= \frac{1}{2} \int_0^1 e^{-xv} (1-v)^{-1/2}\,dv \\
&\approx \frac{1}{2} \sum_{k=0}^{\infty} \binom{-1/2}{k} (-1)^k \int_0^\infty e^{-xv} v^k\,dv \\
&= \frac{1}{2} \sum_{k=0}^{\infty} \binom{-1/2}{k} \frac{(-1)^k\, k!}{x^{k+1}}
\end{align}
$$

as $x \to \infty$, which matches the series given in the other answer.
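
As a quick sanity check, here is a minimal sketch comparing the truncated series against direct quadrature of the original integral (assuming SciPy; the helper names, the five-term truncation, and the choice $x = 50$ are arbitrary):

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import binom, factorial

    def f_numeric(x):
        # direct quadrature of \int_0^1 exp(-x*(1 - u^2)) du
        val, _ = quad(lambda u: np.exp(-x * (1.0 - u**2)), 0.0, 1.0)
        return val

    def f_series(x, n_terms=5):
        # truncated series: (1/2) * sum_k C(-1/2, k) * (-1)^k * k! / x^(k+1)
        k = np.arange(n_terms)
        return 0.5 * np.sum(binom(-0.5, k) * (-1.0) ** k * factorial(k) / x ** (k + 1))

    x = 50.0
    print(f_numeric(x), f_series(x))

For $x = 50$ the two values should agree to many digits, since the error of the truncated asymptotic series is of the order of the first omitted term.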




Let's prove the special case of Watson's lemma used in this answer. We'll assume that $g(v)$ is analytic at $v=0$ and that $\int_0^a \lvert g(v) \rvert\,dv$ exists (both are certainly true for $g(v) = (1-v)^{-1/2}$ with $a=1$), and show that

$$
I(x) = \int_0^a e^{-xv} g(v)\,dv \approx \sum_{k=0}^{\infty} \frac{g^{(k)}(0)}{x^{k+1}}
$$

as $x \to \infty$. In other words, we will show that the asymptotic expansion of the integral can be obtained by expanding $g(v)$ in a Taylor series around $v=0$ and integrating term by term.

Since $g(v)$ is analytic at $v=0$, there is a $\delta \in (0,a]$ such that $g^{(k)}(v)$ is analytic on $[0,\delta]$ for all $k$. Further, Taylor's theorem tells us that for any positive integer $N$ and any $v \in [0,\delta]$ there is a $v^* \in [0,v]$ for which

$$
g(v) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} v^k + \frac{g^{(N+1)}(v^*)}{(N+1)!} v^{N+1}. \tag{1}
$$

We will split the integral $I(x)$ at this $\delta$ and estimate each piece separately. To this end, we define

$$
I(x) = \int_0^\delta e^{-xv} g(v)\,dv + \int_\delta^a e^{-xv} g(v)\,dv = I_1(x) + I_2(x).
$$

We only need a rough estimate on $I_2(x)$:

$$
\lvert I_2(x) \rvert \leq \int_\delta^a e^{-xv} \lvert g(v) \rvert \,dv \leq e^{-\delta x} \int_0^a \lvert g(v) \rvert\,dv,
$$

where we have assumed the last integral on the right is finite. Thus

$$
I(x) = I_1(x) + O(e^{-\delta x})
$$

as $x \to \infty$.

Now, from $(1)$ we have

$$
I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} \int_0^\delta e^{-xv} v^k \,dv + \frac{1}{(N+1)!} \int_0^\delta e^{-xv} g^{(N+1)}(v^*) v^{N+1}\,dv.
$$

The last integral can be bounded by

$$
\left\lvert \int_0^\delta e^{-xv} g^{(N+1)}(v^*) v^{N+1}\,dv \right\rvert \leq \left( \sup_{0 < v < \delta} \left\lvert g^{(N+1)}(v) \right\rvert \right) \int_0^\infty e^{-xv} v^{N+1}\,dv = \frac{\text{const.}}{x^{N+2}}.
$$

Thus

$$
I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} \int_0^\delta e^{-xv} v^k \,dv + O\!\left(x^{-N-2}\right)
$$

as $x \to \infty$. Finally we reattach the tails to the integrals in the sum,

$$
\begin{align}
\int_0^\delta e^{-xv} v^k \,dv &= \int_0^\infty e^{-xv} v^k\,dv - \int_\delta^\infty e^{-xv} v^k\,dv \\
&= \frac{k!}{x^{k+1}} + O\!\left(e^{-\delta x}\right)
\end{align}
$$

(the tail is $O(e^{-\delta x})$ because $\int_\delta^\infty e^{-xv} v^k\,dv = e^{-\delta x} \int_0^\infty e^{-xw} (\delta+w)^k\,dw$, and the last integral is bounded for $x \geq 1$), and substitute these into the sum to get

$$
I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{x^{k+1}} + O\!\left(x^{-N-2}\right)
$$

and hence

$$
I(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{x^{k+1}} + O\!\left(x^{-N-2}\right)
$$

as $x \to \infty$. Since $N$ was arbitrary, this is precisely the statement that $I(x)$ has the asymptotic expansion

$$
I(x) \approx \sum_{k=0}^{\infty} \frac{g^{(k)}(0)}{x^{k+1}}
$$

as $x \to \infty$.
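
To connect the lemma back to this particular integral: for $g(v) = (1-v)^{-1/2}$ the Taylor coefficients are $g^{(k)}(0)/k! = \binom{2k}{k}/4^k$, so

$$
\frac{1}{2}\sum_{k=0}^{\infty} \frac{g^{(k)}(0)}{x^{k+1}}
= \frac{1}{2}\sum_{k=0}^{\infty} \frac{(2k)!}{4^k\, k!\, x^{k+1}}
= \frac{1}{2x} + \frac{1}{4x^2} + \frac{3}{8x^3} + \cdots,
$$

which reproduces the series obtained above from the binomial expansion.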






edited Aug 23 at 1:56, answered Aug 22 at 5:03 – Antonio Vargas






















• Thank you, Antonio Vargas. I tried the $v$ transformation earlier but did not proceed with the binomial expansion. The singularity at $v=1$, however, prevents the direct application of Watson's lemma. I have modified your answer with an integration by parts to remove the singularity there, but the result does not seem to match the other answer. I have not figured out why.
  – Hans
  Aug 22 at 8:46

• @Hans Watson's lemma has no issue with the singularity at $v=1$. To apply it to an integral of the form $\int_0^1 e^{-xv} g(v)\,dv$, Watson's lemma only requires (for example) $g(v)$ to be analytic at $v=0$ and $\int_0^1 |g(v)|\,dv$ to exist. See the proof of Watson's lemma on the Wikipedia page (which I wrote). I have reverted the edits you made to my answer. It does match the other answer (I double-checked in Mathematica), so I'm not sure where you're going wrong.
  – Antonio Vargas
  Aug 23 at 0:17

• Hmm, it's been a while since I wrote that proof, but rereading it now, it looks like it doesn't quite apply to the case in question. It can be modified to work in this case, however.
  – Antonio Vargas
  Aug 23 at 0:36

• @Hans I've added a proof of Watson's lemma (only slightly modified from the one I linked) which applies directly to this specific question.
  – Antonio Vargas
  Aug 23 at 1:16

• It looks good. Maybe you can incorporate your current more general proof into the Wikipedia article. Also, do you see why the integration by parts approach I took did not give the right answer?
  – Hans
  Aug 23 at 1:33

















Second answer:













The integral can be expressed in closed form in terms of the Dawson function $F(y) = e^{-y^2}\int_0^y e^{t^2}\,dt$:

$$
f(x) = \frac{F(\sqrt{x})}{\sqrt{x}} \sim \frac{1}{2x} + \frac{1}{4x^2} + \frac{3}{8x^3} + \cdots
$$

Mathematica was used to perform the asymptotic expansion for $x \to \infty$.
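
As a numerical cross-check (assuming SciPy, whose scipy.special.dawsn implements the Dawson function; the value $x = 40$ is an arbitrary choice), the closed form, the direct integral, and the three-term expansion can be compared:

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import dawsn

    x = 40.0

    # direct quadrature of f(x) = \int_0^1 exp(-x*(1 - u^2)) du
    direct, _ = quad(lambda u: np.exp(-x * (1.0 - u**2)), 0.0, 1.0)

    # closed form via the Dawson function: f(x) = F(sqrt(x)) / sqrt(x)
    closed = dawsn(np.sqrt(x)) / np.sqrt(x)

    # three-term asymptotic expansion
    series = 1.0 / (2.0 * x) + 1.0 / (4.0 * x**2) + 3.0 / (8.0 * x**3)

    print(direct, closed, series)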






answered Aug 22 at 4:32 – skbmoore



















