Why should an image be blurred using a Gaussian Kernel before downsampling?
























I recently read that before downsampling an image, it should be blurred using a Gaussian kernel, so that the downsampled image is better than one obtained by just picking a single pixel out of each N×N block or averaging over the block. After searching this site as well as Google, I didn't find an exact answer.



But there were questions on how to select $\sigma$ for the blurring. Reading the answers on those posts, I learned that downsampling has to be done in accordance with the sampling theorem, and that downsampling without blurring causes aliasing effects.



  1. Can someone please explain why an image has to be blurred before downsampling? What is the exact relation to the sampling theorem? What happens when an image is downsampled without blurring, i.e., what are these aliasing effects, and how can I notice them in the downsampled image?

  2. Why is Gaussian blurring better than averaging over a block?

If you can give some example images, I would be very grateful. Also, I would appreciate all kinds of answers: partial, intuitive, rigorous, anything.



Thank you!


























































      image-processing gaussian downsampling smoothing






      asked 5 hours ago









      Nagabhushan S N





















          2 Answers



























































It does not have to be; the need really depends on your application. However, blurring is a safe bet for most needs, and it is almost mandatory when you want to control the information lost by the downsampling.



Blurring is often another word for low-pass filtering. When an image contains high-frequency content (fast variations), downsampling can produce visually weird or annoying aliasing or moiré artifacts. There is an example on the Wikipedia page on aliasing: the original



                  brick wall patterns



and the downsampled version, shown at the same size:



                  brick wall patterns moiré



The ripples on the bottom right are low-frequency artifacts generated by careless brute-force downsampling. Blurring attenuates image sharpness and dims the borders between bricks, but it also reduces the apparent aliasing.
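To make the aliasing concrete, here is a minimal 1-D NumPy sketch (the frequency of 0.4 cycles/sample and the factor-of-2 decimation are illustrative choices, not taken from the brick-wall example): a high-frequency cosine, decimated without any blurring, becomes indistinguishable from a low-frequency one.

```python
import numpy as np

# A high-frequency cosine at 0.4 cycles/sample...
n = np.arange(256)
x = np.cos(2 * np.pi * 0.4 * n)

# ...brute-force downsampled by 2: keep every 2nd sample, no pre-blur.
naive = x[::2]

# The decimated signal is cos(2*pi*0.8*m), which equals cos(2*pi*0.2*m):
# the 0.4 component has folded down ("aliased") to 0.2 cycles/sample.
expected_alias = np.cos(2 * np.pi * 0.2 * np.arange(128))
print(np.allclose(naive, expected_alias))   # True: high frequency now looks low
```

This folding of high frequencies onto low ones is exactly what produces the spurious low-frequency ripples in the downsampled brick image.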



The choice of an appropriate blurring filter has a long history in image processing. Gaussian shapes have long been considered somehow optimal, for various reasons, for "theoretical" continuous images. In addition, the Gaussian is both decreasing and symmetric in the space and frequency domains. In the spatial domain, this means that faraway pixels have less influence. In the frequency domain, frequencies are attenuated monotonically from low to high.
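As a small numerical check of that frequency-domain claim (the $\sigma$ and grid size below are arbitrary choices for illustration), one can verify that the transform of a sampled Gaussian is again Gaussian-shaped:

```python
import numpy as np

# Sample a Gaussian on a grid wide enough that its tails are negligible.
sigma = 2.0
n = 1024
x = np.arange(n) - n // 2
g = np.exp(-x**2 / (2 * sigma**2))

# Centered transform magnitude, and the frequency axis in cycles/sample.
G = np.abs(np.fft.fft(np.fft.ifftshift(g)))
f = np.fft.fftfreq(n)

# Continuous-domain prediction: |G(f)| proportional to exp(-2*pi^2*sigma^2*f^2),
# i.e. another Gaussian, decreasing monotonically from low to high frequency.
pred = G[0] * np.exp(-2 * np.pi**2 * sigma**2 * f**2)
print(np.max(np.abs(G - pred)))   # small: truncation/aliasing error only
```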



Since most images are discretized, reality is somewhat different. Because Gaussian convolution used to be computationally expensive, early approximating filters were designed with short support, with coefficients borrowed for instance from Pascal's triangle. Later, fast recursive implementations were designed (Deriche, Shen, etc.).
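For instance, a normalized row of Pascal's triangle already approximates a sampled Gaussian rather well (the 5-tap length and the matching $\sigma = 1$ below are illustrative choices):

```python
import numpy as np

def binomial_kernel(length):
    """Build a row of Pascal's triangle by repeated convolution with [1, 1]."""
    k = np.array([1.0])
    for _ in range(length - 1):
        k = np.convolve(k, [1.0, 1.0])
    return k / k.sum()               # normalize to unit DC gain

b5 = binomial_kernel(5)              # [1, 4, 6, 4, 1] / 16
print(b5)

# Compare with a sampled Gaussian of matching standard deviation
# (sigma = 1 for the 5-tap binomial: four [1,1]/2 steps, variance 4 * 1/4).
x = np.arange(5) - 2
g = np.exp(-x**2 / 2.0)
g /= g.sum()
print(np.max(np.abs(b5 - g)))        # small: the two shapes nearly coincide
```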



I guess question 1 is answered. For question 2: simple averaging gives equal weight to all pixels in the window. Hence, faraway pixels are given the same importance as closer ones, which is not optimal in regions where images exhibit weak stationarity, such as trends, edges and textures.
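The difference also shows up in the frequency domain. A sketch of the comparison (kernel sizes and $\sigma$ are illustrative): the box filter's response has sidelobes that let high frequencies leak back through, while this truncated Gaussian's response decays monotonically.

```python
import numpy as np

def magnitude_response(kernel, n_freqs=512):
    """Evaluate |DTFT| of a FIR kernel on [0, pi]."""
    w = np.linspace(0, np.pi, n_freqs)
    k = np.asarray(kernel, dtype=float)
    return np.abs(np.exp(-1j * np.outer(w, np.arange(len(k)))) @ k)

box = np.ones(5) / 5.0                       # simple 5-tap averaging
x = np.arange(5) - 2
gauss = np.exp(-x**2 / 2.0)                  # 5-tap Gaussian, sigma = 1
gauss /= gauss.sum()

Hb = magnitude_response(box)
Hg = magnitude_response(gauss)

# The Gaussian response decreases monotonically with frequency...
print(np.all(np.diff(Hg) <= 1e-12))
# ...while the box response rises again after its first null (sidelobes).
print(np.any(np.diff(Hb) > 1e-6))
```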







                  answered 2 hours ago









                  Laurent Duval

































According to (digital) sampling theory, signals should be properly bandlimited before they are (down)sampled.



A digital low-pass filter limits the bandwidth of the signal and makes it suitable for downsampling without aliasing.



A Gaussian filter is very suitable for this, as it has a number of nice features: the Gaussian function is mathematically tractable, it provides sufficient frequency attenuation, it has a small spatial footprint, and it introduces few noticeable artifacts. Therefore it is the programmer's choice in image processing.
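As a hedged illustration of the whole blur-then-downsample pipeline in plain NumPy (the kernel radius, $\sigma$, and factor-of-2 decimation are arbitrary choices, not a prescription):

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """Sampled, normalized 1-D Gaussian on [-radius, radius]."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def downsample2(img, sigma=1.0):
    """Gaussian-blur an image (separably, rows then columns), then keep every 2nd pixel."""
    k = gaussian_kernel1d(sigma, radius=3)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, blurred)
    return blurred[::2, ::2]

img = np.random.default_rng(0).random((64, 64))   # stand-in test image
small = downsample2(img)
print(small.shape)   # (32, 32)
```

The separable form (two 1-D passes instead of one 2-D convolution) is the standard trick that keeps the Gaussian cheap; production code would typically call a library routine instead.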

















                          answered 4 hours ago









                          Fat32




























                               













































































