Why green phosphor instead of amber?

According to this answer to "Why were early personal computer monitors not green?", green monitors had a severe disadvantage: you had to choose between 'too dim' and 'rapid burn-in', whereas amber could display decent brightness without burn-in.



I used Commodore PETs with green monitors in the early 80s and DEC terminals with amber monitors in the late 80s, in school and college computer labs respectively, both under 50 Hz fluorescent lights. I don't remember the former having a problem with either dimness or burn-in, but obviously my eyesight was better then than it is now, and I'm not sure how much to trust that memory. So suppose green monitors did indeed have that problem.



If amber was that much better, why did anyone use green?



According to Wikipedia's list of phosphor types, green is P1 and amber is P3, which might mean amber was simply not invented until later. However, according to that table, black-and-white TV is P4, which would suggest amber was invented long before monochrome monitors became common.



Was there any other reason to prefer green over amber?

asked 2 hours ago by rwallace; edited 55 mins ago by Raffzahn
1 Answer


According to this answer to a question about the early use of green monitors, they had a severe disadvantage: you had to choose between 'too dim' and 'rapid burn-in', whereas amber could display decent brightness without burn-in.




I'd be rather careful about accepting the conclusions made there. I can show you several green screens that have been used for more than a decade without burning in. There is no real difference between the two coatings. Of course, if one turns the brightness up to excessive levels, any screen will burn in.




          If amber was that much better, why did anyone use green?




Maybe because green is simply better? The human eye has its highest colour sensitivity near the red side of green (*1,2), so a (somewhat) yellowish green is about the best colour. At the same time, the sensitivity for brightness (*3) peaks toward the blue side of green. Conversely, orange/yellowish colours are just as good for fine colour separation, but worse for distinguishing differences in brightness. So for a monochrome screen, green outclasses amber.



Bottom line: the highest overall sensitivity, meaning the ability both to separate levels of brightness and to detect colour, lies right around green.
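The effect can be illustrated with a few values from the CIE 1931 photopic luminosity function V(λ), which describes how bright equal radiated power appears at each wavelength. The figures below are rounded tabulated values and the wavelength attributions are approximate, so treat this as a sketch rather than precise colorimetry:

```python
# Rounded CIE 1931 photopic luminosity values V(lambda), illustrative only.
V = {
    510: 0.503,   # blue-green
    530: 0.862,   # green, roughly where P31 emits
    555: 1.000,   # peak sensitivity: yellowish green
    590: 0.757,   # amber/orange
    610: 0.503,   # red-orange
}

# At equal radiated power, a green trace near 530 nm appears brighter
# than an amber one near 590 nm by about this factor:
ratio = V[530] / V[590]
print(f"green/amber perceived-brightness ratio ~ {ratio:.2f}")  # ~1.14
```

In other words, for the same beam energy a green phosphor simply looks brighter than an amber one, which matches the answer's point about green's advantage on a monochrome screen.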




          Was there any other reason to prefer green over amber?




Does there need to be any other reason?




The P numbers you cite are not really related to any date of 'discovery'. They were an attempt by the RMA, together with the US Army, to standardize the phosphor coatings used for CRTs in the mid-1940s. The first few, declared at once, were originally ordered by their persistence, that is, how long the image would remain without being refreshed.



For example, P1 and P2 are quite close in colour (P1's range covers P2's), but P1 continues to emit considerably longer than P2, so P1 is the classic green for early radar (and for oscilloscopes displaying slow periodic signals), while P2 was used only for oscilloscopes.



Further, P1 is most definitely not the phosphor used for green computer CRTs. It has a decay time (*4) of about 100 ms; any refresh frequency beyond roughly 15 Hz would result in an extremely blurred screen. Its low writing speed is another no-go.



For most green screens, P31 was used. It is not only about three times brighter than P1 (at the same energy *5), but also has a decay time of only about 30 ms, which is well suited to displaying 'fast' content at 50-60 Hz refresh rates without smearing.
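To see why the decay times matter, here is a back-of-the-envelope sketch. Modelling the decay as exponential, reaching the 0.1% "off" threshold at the quoted decay time (real phosphor decay is not purely exponential, so this is only an order-of-magnitude illustration), we can ask how much of a pixel's light is still glowing when the next 60 Hz refresh arrives:

```python
def residual(frame_ms: float, decay_ms: float) -> float:
    """Fraction of brightness still glowing after one frame period,
    assuming exponential decay that reaches 0.1% at decay_ms."""
    return 0.001 ** (frame_ms / decay_ms)

frame_60hz = 1000 / 60  # ~16.7 ms between refreshes at 60 Hz

p1 = residual(frame_60hz, 100)   # P1, ~100 ms decay time
p31 = residual(frame_60hz, 30)   # P31, ~30 ms decay time
print(f"P1:  {p1:.0%} of the previous frame still visible")   # ~32%: smear
print(f"P31: {p31:.0%} of the previous frame still visible")  # ~2%: crisp
```

With roughly a third of the previous frame still glowing, P1 would blur any moving or scrolling text, while P31's residual is small enough to look crisp yet long enough that 50-60 Hz refresh keeps flicker at bay.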




*1 - Colour sensitivity has already been discussed here.



*2 - I guess that's a result of our having evolved in a green-coloured environment ... well, at least back when we still climbed trees :)



*3 - The majority of the information our eyes deliver isn't about colour, but about brightness (black & white).



*4 - Decay time is the time an excited spot needs to go from 100% of its (specific) brightness down to 0.1%, which is considered off. As a rough guideline, it is inversely related to absolute brightness: the brighter a given coating is when excited (at a given energy, say 10 kV), the faster it decays.



*5 - Less energy for a given/intended brightness also means less radiation (X-rays) toward the user.







answered 1 hour ago by Raffzahn; edited 33 mins ago