History of Ctrl-S and Ctrl-Q for flow control

Which OS was the first to use Ctrl-S and Ctrl-Q on the console for pause and continue?



I first discovered Ctrl-S in IBM PC DOS 1.1.

asked Aug 13 at 4:22 by Old Geezer, edited Aug 15 at 18:30 by chicks

  • I'm not sure whether it was the OS or the VT-102 terminal which responded to <ctrl>S and <ctrl>Q. Our LSI-11 system could boot into RT-11 or native FORTH (where FORTH was OS, language, development environment, and applications). Both seemed to respond the same, so my guess is it was the VT-102 responding directly, which probably then immediately told the serial port hardware to "hold its horses" as necessary.
    – RichF
    Aug 13 at 5:56






  • For XON/XOFF flow control characters also see here. It's a teleprinter convention and predates the computer era.
    – jvb
    Aug 13 at 7:03






  • @jvb - I always was under the impression Mr. Miyagi invented the XON/XOFF control :P
    – BruceWayne
    Aug 14 at 16:58














4 Answers

Which OS was the first to use Ctrl-S and Ctrl-Q on the console for pause and continue?




TL;DR:



It was developed independently of anything one might call an OS (*1). It is nowadays called Software Flow Control and has been around since the early days of computers using ASCII I/O, as the Teletype Model 33 used the device control codes DC1/DC3 (CTRL-Q/CTRL-S) to make a sender resume or stop transmission.




Long Read:



Today it's defined in the ITU-T V.43 standard as 'Flow control by use of DC3/DC1 characters', but it has been around since before the definition of ASCII. The oldest origins I know of are the FIELDATA alphabet definitions starting in 1956. While itself based on the 5-bit International Alphabet No. 2 (ITA2), FIELDATA used 7-bit words (*2). Here, for the first time, two characters were officially defined to control transmission:



RTR - Ready to Receive (Code 49) 
NRR - Not Ready to Receive (Code 50)


These characters were to be used by receiving equipment to signal non-ready conditions and make the sender stop (with a delay of up to two characters). This functionality was incorporated by Teletype in what they delivered for military use.



FIELDATA also defined many more control characters, and while most made it into ASCII (*3), these didn't. Instead, a set of four general-purpose device control codes was added. When Teletype defined the Model 32/33, they decided to use two (*4) of these four codes for their start/stop feature. The result was:



DC1 - XON (Code 17)
DC3 - XOFF (Code 19)


The Model 33 not only became widely used with early computers, especially minis, it also provided the template for early terminals, AKA glass TTYs. Applications, and later device drivers, learned to obey these command codes to avoid overrunning slow equipment.



As usual, users quickly discovered that DC1/DC3 could be issued by pressing CTRL-Q/CTRL-S, giving handy control over fast-scrolling text output. Neat, isn't it? The rest, I guess, is history.
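
A quick way to see the relation: the Ctrl key simply clears the upper bits of the letter's ASCII code, so CTRL-Q yields DC1 (17) and CTRL-S yields DC3 (19). A minimal Python sketch of that arithmetic (the helper name ctrl is purely illustrative, not taken from any particular system):

    # Ctrl-<letter> is the letter's ASCII code with the upper bits cleared.
    def ctrl(letter: str) -> int:
        return ord(letter.upper()) & 0x1F

    XON = ctrl('Q')      # DC1 = 0x11 = 17 ("resume")
    XOFF = ctrl('S')     # DC3 = 0x13 = 19 ("stop")

    assert (XON, XOFF) == (17, 19)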




More History



The Teletype Model 28, originally designed for military use, already had an optional way to use a received . (or #) to stop the (printing) motor, thus sneaking in a control function. This was part of the so-called 'Stunt Box', where a character or a character sequence could be detected and a switch could be flipped. This not only meant closing an electric contact, but literally pushing a rod that could flip a switch or whatever else was needed.



The Beginning



This brings us to the introduction of the very first control characters with ITA2 in 1924:



CR - Carriage Return
LF - Line Feed
BELL - Ring a Bell
WRU - Who aRe You (*5)


Before that, teletype text didn't have any line structure, and even with ITA2 the page was still not 'invented'.



Addendum:



A point has been made by Tofro that the concept of control characters is older, as Morse code already uses prosigns ('procedural signs') for special messages, including flow control. For one, these are later additions that grew out of custom; more importantly, they are not separate control characters but reuses of existing combinations, which may or may not be understood in the context the receiver works in.



For example, 'Wait' uses the same code as the ampersand &, which in itself is a digraph of A and S (both letters sent consecutively without a letter space; *6). Similarly, the signal used for 'Continue' is the letter 'K', officially called 'Over'. There is no hint in the encoding what meaning these letters are supposed to have.



In a more general view, it's a problem of in-band signalling: protocol information needs to be transmitted together with the payload. To handle this at the code layer, separate and unique codes for protocol handling are needed. Otherwise, an additional encoding layer (escape sequences) has to be added atop the character encoding but below the protocol.



Bottom line: ITA2 was the first to go that route, and FIELDATA the first to incorporate flow control - with Teletype bringing that into applied ASCII.




*1 - Well, then again, assuming that an OS is after all nothing more than a set of rules followed by its components, then even handling such rules may qualify. No-one says an OS needs to have memory protection or a GUI at all ;)



*2 - Or Start-Stop Seven-Unit Code as they called it.



*3 - Like blocking, (not) acknowledgement, field separators or the 'Special Character'.



*4 - The other two were used by the sender to control the paper tape punch: DC2 as RION, DC4 as RIOFF.



*5 - Looks like leetspeak, or sounding out letters as words, isn't really a new invention after all, is it?



*6 - Digraphs are a constant source of problems, as it's often hard to decide whether a sequence is meant to be a digraph or two single letters. Usually it takes knowledge of the language to be safe here. It is even more complicated for native English speakers, as the rest of the world uses a huge number of digraphs for their languages' special characters. What's more, the simple joining of two letters in Morse may lead to various valid interpretations, so again, human knowledge is the only way to pick the most plausible interpretation.






answered Aug 13 at 8:01 by Raffzahn, edited Aug 13 at 16:52 by Toby Speight (accepted answer)

  • Ah! XON/XOFF. Now I remember.
    – Old Geezer
    Aug 13 at 13:02






  • Note even standard Morse code has "push back" (wait) and "continue" sequences. So the concept seems to be even older.
    – tofro
    Aug 13 at 15:17






  • Morse code is, IMHO, the closest thing to digital anything analog can be. <CTRL>S and <CTRL>Q are not distinguishable control codes as well (in Escape sequences, which is a similar use case). But we digress.
    – tofro
    Aug 13 at 15:48






  • For one, Morse does use 5 different signals; that aside, and more relevant, DC1/DC3 are distinguishable codes, as they are from the control plane and do not overlap with content encoding. Q is encoded as 81, while DC1 (CTRL-Q) is 17. That they are producible with the same key, just with a modifier, is a keyboard implementation issue and not code related. Compare this to the Return key, which is as well CTRL-M.
    – Raffzahn
    Aug 13 at 15:53






  • Fieldata! Brings back Univac memories - thanks. I'd forgotten about it. But I think Univac's FIELDATA used 36-bit words (divided by 6).
    – Drew
    Aug 13 at 16:20

Which OS was first is hard to say. The codes go back to the 1960s with the Teletype Model 33. I have a hunch the original usage was not part of an operating system but at a lower level. In later times, certainly by the 1980s, there were plenty of devices that implemented software flow control at the hardware level.



Microcomputer operating systems that supported Ctrl-S/Ctrl-Q included CP/M 2.2. I found plenty of versions of the CP/M 2.2 manual with multiple copyright dates from 1976 through 1984 that included clear references to this feature, but it is not clear at what time it was actually added. I suspect it was very early on (closer to 1976), but I'm not sure.






answered Aug 13 at 5:14 by manassehkatz

It had nothing to do with an OS as such. Ctrl-S and Ctrl-Q are simply XON and XOFF in ASCII.



    These codes are used in serial communications to pause and resume sending.



Hardware handshake on RS-232 and similar communications standards used the RTS (Request To Send) and CTS (Clear To Send) handshake lines.

Many comms links were, and still are, three-wire - send, receive and earth. On these, you use software handshake: the receiving end sends XOFF to tell the sender that its input buffer is nearly full, and then XON to tell the sender to resume once the buffer has emptied a bit.



These codes go back to the dawn of RS-232 in 1960, a standard that was deliberately OS-agnostic.
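
The same mechanism is still exposed by POSIX terminal drivers as the IXON/IXOFF termios flags. A minimal Python sketch, assuming a Unix system and a file descriptor for an already-opened serial device (the device path in the usage note is only an example):

    # Enable XON/XOFF software flow control on a POSIX tty (Unix-only sketch).
    import termios

    def enable_software_flow_control(fd: int) -> None:
        attrs = termios.tcgetattr(fd)              # [iflag, oflag, cflag, lflag, ispeed, ospeed, cc]
        attrs[0] |= termios.IXON | termios.IXOFF   # obey incoming XOFF/XON, and emit them as our input buffer fills
        termios.tcsetattr(fd, termios.TCSANOW, attrs)

    # Usage (hypothetical device path):
    #   port = open("/dev/ttyS0", "rb", buffering=0)
    #   enable_software_flow_control(port.fileno())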






answered Aug 13 at 7:56 by Chenmunka♦

      As a user of teletypes and serial communications back in the 1970s - this is my understanding as well. There were hardware signals for "flow control".



I was thinking of how you set bits on the UART, which then converted them to voltages on the communications cable.



For example, if Device A was sending to Device B, Device B needed a way to slow down the sending of data from Device A, so that Device A did not overrun the receive buffer of Device B. There were hardware signals that Device B could set to tell Device A to stop sending.



      It was considered a huge improvement when they could embed software flow control like Ctrl-Q and Ctrl-S in the data and you no longer needed to set hardware flow control signals on the UART.
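
To make that in-band scheme concrete, here is a minimal sketch (my own illustration, not taken from any particular driver) of a receiver that sends XOFF once its buffer crosses a high-water mark and XON again once it drains below a low-water mark:

    # Receiving side of in-band (XON/XOFF) flow control - illustrative only.
    XON, XOFF = 0x11, 0x13                   # DC1 and DC3

    class Receiver:
        def __init__(self, send_byte, high=3072, low=1024):
            self.send_byte = send_byte       # callback that transmits one byte back to the sender
            self.buffer = bytearray()
            self.high, self.low = high, low
            self.paused = False

        def on_byte(self, byte: int) -> None:
            self.buffer.append(byte)
            if not self.paused and len(self.buffer) >= self.high:
                self.send_byte(XOFF)         # buffer nearly full: ask the sender to stop
                self.paused = True

        def consume(self, n: int) -> bytes:
            data, self.buffer = bytes(self.buffer[:n]), self.buffer[n:]
            if self.paused and len(self.buffer) <= self.low:
                self.send_byte(XON)          # buffer drained: ask the sender to resume
                self.paused = False
            return data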






  • Actually hardware flow control was considered an improvement over software flow control. Allowing B to stop A sending by asserting a voltage on another wire released XON, XOFF and all the other control codes from their flow control meanings. This accelerated the transmission of binary, as opposed to pure ASCII, data in messages. You just needed 5-wire, not 3-wire, cables - more expensive. Alternatively, what was your application? Could you say more as to why software flow control was better for you then?
        – Chenmunka♦
        Aug 15 at 13:41











      • Basically BOTH were "better". Hardware flow control because it is (a) out-of-band so no software or device "understanding" of codes is needed and (b) the data transfers are binary safe. Software flow control because it can translate perfectly into analog modems, radio signals or any other transmission method. The issue with 5-wire cables isn't just the expense of the cable - it is simply not practical when using the PSTN. Later (microprocessor controlled) modems supported hardware flow control using additional data mixed in with the user data as part of advanced protocols.
        – manassehkatz
        Aug 15 at 14:07













                      It was considered a huge improvement when they could embed software flow control like Ctrl-Q and Ctrl-S in the data and you no longer needed to set hardware flow control signals on the UART.







                      share|improve this answer












                      share|improve this answer



                      share|improve this answer










                      answered Aug 15 at 13:31









                      user14983

                      111




                      111







                      • 2




                        Actually hardware flow control was considered an improvement of software flow control. Allowing B to stop A sending by asserting voltage on another wire released XON, XOFF and all the other control codes from their flow control meanings. This accelerated the transmission of binary, as opposed to pure ASCII, data in messages. You just needed 5-wire, not 3-wire cables - more expensive. Alternatively, what was your application? Could you give more as to why software flow was better for you then?
                        – Chenmunka♦
                        Aug 15 at 13:41











                      • Basically BOTH were "better". Hardware flow control because it is (a) out-of-band so no software or device "understanding" of codes is needed and (b) the data transfers are binary safe. Software flow control because it can translate perfectly into analog modems, radio signals or any other transmission method. The issue with 5-wire cables isn't just the expense of the cable - it is simply not practical when using the PSTN. Later (microprocessor controlled) modems supported hardware flow control using additional data mixed in with the user data as part of advanced protocols.
                        – manassehkatz
                        Aug 15 at 14:07












                      • 2




                        Actually hardware flow control was considered an improvement of software flow control. Allowing B to stop A sending by asserting voltage on another wire released XON, XOFF and all the other control codes from their flow control meanings. This accelerated the transmission of binary, as opposed to pure ASCII, data in messages. You just needed 5-wire, not 3-wire cables - more expensive. Alternatively, what was your application? Could you give more as to why software flow was better for you then?
                        – Chenmunka♦
                        Aug 15 at 13:41











                      • Basically BOTH were "better". Hardware flow control because it is (a) out-of-band so no software or device "understanding" of codes is needed and (b) the data transfers are binary safe. Software flow control because it can translate perfectly into analog modems, radio signals or any other transmission method. The issue with 5-wire cables isn't just the expense of the cable - it is simply not practical when using the PSTN. Later (microprocessor controlled) modems supported hardware flow control using additional data mixed in with the user data as part of advanced protocols.
                        – manassehkatz
                        Aug 15 at 14:07







                      2




                      2




                      Actually hardware flow control was considered an improvement of software flow control. Allowing B to stop A sending by asserting voltage on another wire released XON, XOFF and all the other control codes from their flow control meanings. This accelerated the transmission of binary, as opposed to pure ASCII, data in messages. You just needed 5-wire, not 3-wire cables - more expensive. Alternatively, what was your application? Could you give more as to why software flow was better for you then?
                      – Chenmunka♦
                      Aug 15 at 13:41





                      Actually hardware flow control was considered an improvement of software flow control. Allowing B to stop A sending by asserting voltage on another wire released XON, XOFF and all the other control codes from their flow control meanings. This accelerated the transmission of binary, as opposed to pure ASCII, data in messages. You just needed 5-wire, not 3-wire cables - more expensive. Alternatively, what was your application? Could you give more as to why software flow was better for you then?
                      – Chenmunka♦
                      Aug 15 at 13:41













                      Basically BOTH were "better". Hardware flow control because it is (a) out-of-band so no software or device "understanding" of codes is needed and (b) the data transfers are binary safe. Software flow control because it can translate perfectly into analog modems, radio signals or any other transmission method. The issue with 5-wire cables isn't just the expense of the cable - it is simply not practical when using the PSTN. Later (microprocessor controlled) modems supported hardware flow control using additional data mixed in with the user data as part of advanced protocols.
                      – manassehkatz
                      Aug 15 at 14:07




                      Basically BOTH were "better". Hardware flow control because it is (a) out-of-band so no software or device "understanding" of codes is needed and (b) the data transfers are binary safe. Software flow control because it can translate perfectly into analog modems, radio signals or any other transmission method. The issue with 5-wire cables isn't just the expense of the cable - it is simply not practical when using the PSTN. Later (microprocessor controlled) modems supported hardware flow control using additional data mixed in with the user data as part of advanced protocols.
                      – manassehkatz
                      Aug 15 at 14:07

















                       

                      draft saved


                      draft discarded















































                       


                      draft saved


                      draft discarded














                      StackExchange.ready(
                      function ()
                      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fretrocomputing.stackexchange.com%2fquestions%2f7263%2fhistory-of-ctrl-s-and-ctrl-q-for-flow-control%23new-answer', 'question_page');

                      );

                      Post as a guest













































































                      Comments

                      Popular posts from this blog

                      What does second last employer means? [closed]

                      List of Gilmore Girls characters

                      One-line joke