Example of a non-negative discrete distribution where the mean (or another moment) does not exist?

I was doing some work in scipy, and a conversation came up with a member of the core scipy group about whether a non-negative discrete random variable can have an undefined moment. I think he is correct, but I don't have a proof handy. Can anyone show or prove this claim (or, if it is not true, disprove it)?



I don't have an example handy when the discrete random variable has support on $\mathbb{Z}$, but it seems that some discretized version of the Cauchy distribution should serve as an example with an undefined moment (a quick numerical sketch of a discretized Cauchy is below). The condition of non-negativity (perhaps including $0$) is what seems to make the problem challenging (at least for me).
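
Here is a minimal numerical sketch of that discretized-Cauchy idea on $\mathbb{Z}$ (illustrative only, not a proof): rounding Cauchy draws to integers keeps the heavy tails, so running sample means jump around instead of converging.

```python
import numpy as np
from scipy import stats

# Discretized Cauchy on the integers: rounding a Cauchy draw keeps the heavy tails,
# so the mean is undefined and running sample means never stabilize.
rng = np.random.default_rng(0)
c = stats.cauchy.rvs(size=10**6, random_state=rng)
x = np.round(c)
print(x[:10**3].mean(), x[:10**5].mean(), x.mean())   # erratic; the sign can even flip
```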










      mathematical-statistics expected-value






asked 51 mins ago by Lucas Roberts




















          3 Answers























Here's a famous example: Let $X$ take value $2^k$ with probability $2^{-k}$, for each integer $k\ge1$. Then $X$ takes values in (a subset of) the positive integers; the total mass is $\sum_{k=1}^\infty 2^{-k}=1$, but its expectation is
$$E(X) = \sum_{k=1}^\infty 2^k\,P(X=2^k) = \sum_{k=1}^\infty 1 = \infty.$$
This random variable $X$ arises in the St. Petersburg paradox.
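
As a quick numerical sanity check (a minimal Python sketch, not from the answer itself): each term $2^k\,P(X=2^k)$ equals $1$, so the expectation truncated at $K$ is exactly $K$; equivalently, $X = 2^G$ for a geometric $G$ with $p=1/2$, and simulated sample means keep drifting upward instead of converging.

```python
import numpy as np
from scipy import stats

# Truncated expectation: every term 2**k * 2**-k equals 1, so the partial sum is K.
for K in (10, 100, 1000):
    print(K, sum((2.0 ** k) * (2.0 ** -k) for k in range(1, K + 1)))

# Simulation: scipy's geom(0.5) puts mass 2**-k on k = 1, 2, ..., so X = 2**G is the
# St. Petersburg variable; its sample mean grows with the sample size.
rng = np.random.default_rng(1)
g = stats.geom(0.5).rvs(size=10**6, random_state=rng)
print((2.0 ** g).mean())
```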






answered 16 mins ago by grand_chat

Let the CDF $F$ equal $1-1/n$ at the integers $n=1,2,\ldots,$ be piecewise constant everywhere else, and satisfy all the criteria needed to be a CDF. The expectation is

$$\int_0^\infty \big(1-F(x)\big)\,\mathrm{d}x = 1/2 + 1/3 + 1/4 + \cdots,$$

which diverges.

If you're uncomfortable with this notation, note that for $n=1,2,3,\ldots,$

$$\Pr_F(n) = \frac{1}{n} - \frac{1}{n+1}.$$

This defines a probability distribution, since each term is positive and $$\sum_{n=1}^\infty \Pr_F(n) = \sum_{n=1}^\infty \left(\frac{1}{n} - \frac{1}{n+1}\right) = \lim_{n\to\infty} \left(1 - \frac{1}{n+1}\right) = 1.$$

The expectation is

$$\sum_{n=1}^\infty n\,\Pr_F(n) = \sum_{n=1}^\infty n\left(\frac{1}{n} - \frac{1}{n+1}\right) = \sum_{n=1}^\infty \frac{1}{n+1} = 1/2 + 1/3 + 1/4 + \cdots,$$

which diverges.
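
A quick numerical check of this construction (a minimal Python sketch, not from the answer itself): with $\Pr_F(n) = 1/(n(n+1))$ the probabilities sum to $1$, while the truncated mean is a tail of the harmonic series and grows like $\log N$.

```python
import numpy as np

# pmf from the construction above: p(n) = 1/n - 1/(n+1) = 1/(n*(n+1)), n = 1, 2, ...
for N in (10**2, 10**4, 10**6):
    n = np.arange(1, N + 1, dtype=float)
    p = 1.0 / (n * (n + 1.0))
    # total mass approaches 1, while the truncated mean ~ log(N) keeps growing
    print(N, p.sum(), (n * p).sum())
```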






answered 45 mins ago by whuber♦

1. The zeta distribution is a fairly well-known discrete distribution on the positive integers that doesn't have a finite mean (for $1<\theta\leq 2$); a quick scipy check is sketched after this list.

   $P(X=x\mid\theta)=\frac{1}{\zeta(\theta)}\,x^{-\theta}\,,\quad x=1,2,\ldots,\ \theta>1$

   where the normalizing constant involves $\zeta(\cdot)$, the Riemann zeta function.

   (edit: The case $\theta=2$ is very similar to whuber's answer.)

   Another distribution with similar tail behaviour is the Yule–Simon distribution.

2. Another example would be the beta-negative binomial distribution with $0<\alpha\leq 1$:

   $P(X=x\mid\alpha,\beta,r)=\frac{\Gamma(r+x)}{x!\;\Gamma(r)}\,\frac{\mathrm{B}(\alpha+r,\beta+x)}{\mathrm{B}(\alpha,\beta)}\,,\quad x=0,1,2,\ldots,\quad \alpha,\beta,r>0$
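
As mentioned in the first item, the zeta case is easy to check directly in scipy, where the distribution is called `zipf` and the shape parameter `a` plays the role of $\theta$ (a minimal sketch, not from the answer itself):

```python
from scipy import stats

# scipy's zipf(a) has pmf x**-a / zeta(a) on x = 1, 2, ..., i.e. the zeta distribution
# with theta = a. Its mean is zeta(a-1)/zeta(a) for a > 2 and infinite for 1 < a <= 2.
for a in (1.5, 2.0, 3.0):
    print(a, stats.zipf(a).mean())   # inf, inf, then roughly 1.37 for a = 3
```

scipy also provides `stats.yulesimon`, and recent releases include a beta-negative binomial distribution as well, so the other examples can be checked the same way if your scipy version has them.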







answered 33 mins ago, edited 8 mins ago by Glen_b♦