Is there a reason why MS-DOS didn't use more English words for commands?
When using diskpart, I can list all the drives by typing LIST DISK, or select a specific drive by typing select disk 1.
Is there a reason why MS-DOS didn't use more English words to do tasks? For example, to list all of the files in a directory with no additional information, I can do this:
dir /b
Why couldn't the developers have made it so that I can type view insertdirectorynamehere -bare instead?
It's a little longer to type, but it's a little clearer what exactly the command is doing. Put it into a batch file that can take command line parameters and you can save yourself some typing.
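For instance, a small wrapper batch file could map a wordier command onto the terse one. This is just a sketch - VIEW.BAT and its -bare switch are made-up names, not real DOS commands:
@echo off
rem VIEW.BAT - hypothetical wrapper around DIR, written in plain DOS batch.
rem Usage: VIEW <directory> [-bare]
rem With -bare it behaves like "dir /b <directory>", otherwise like "dir <directory>".
if "%2"=="-bare" goto bare
dir %1
goto end
:bare
dir /b %1
:end
So VIEW C:\DOS -bare would then do the same thing as dir /b C:\DOS.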
history ms-dos
asked 21 hours ago by BasementJoe (new contributor)
The title's a little off; DOS does use English words. There's dir(ectory), echo, copy... The question's pretty good, though; I can guess, but I don't actually know the answer.
– wizzwizz4♦, 21 hours ago
How about "ShowMeAllTheFiles,filesizes,fileAttributesinThisDirectory" then? (Yes, it's a little long, but it says exactly what it does) ;) - But you could insert it into a batch file called DIR.BAT to save yourself some typing ;)
– tofro, 20 hours ago
@tofro - I get what you're saying, it's a lot of typing compared to dir /b ;). I was just wondering, because in more sophisticated programming languages it's a good idea to write clean code: descriptive variable and method names, small methods. A newcomer, or someone starting to maintain a system that uses batch files and cmd, might come across dir /b and not understand it, but that's why we have the manual.
– BasementJoe, 20 hours ago
@BasementJoe "because in more sophisticated programming languages" :)) Keep in mind a CLI is not a programming language, but a way for a human to steer what a computer is supposed to do. The ability to repeat sequences is a nice add-on but not the primary function. Or would you prefer to key in 'GO LEFT' (and so on) instead of WASD in a shooter?
– Raffzahn, 13 hours ago
@Raffzahn - Good point!
– BasementJoe, 13 hours ago
6 Answers
Accepted answer (score 18), by Jules
MS-DOS inherited many of its commands from CP/M. CP/M was designed with influences from classic minicomputer operating systems, especially those produced by DEC. Many of these systems dated back to the mid to late 1960s and were designed to run in very little space, e.g. DOS-11 ran on a PDP-11 with 8 KB of RAM. They were also mostly designed primarily for programmers, not end users.
This meant that the systems MS-DOS drew on had terse command structures, often using as few characters as possible, and users were expected to understand and appreciate the precision of technical terms.
You can see the influence of this on DIR, as in your example. It shows the directory of a disk (CP/M and early DOS versions only supported a single directory per disk - nested subdirectories were a feature added in DOS 2.0). "VIEW" could have been ambiguous - it may have shown the contents of a file instead.
TYPE is another good example, but for a different reason - it printed the contents of a file (early versions of CP/M were intended for use on a teletype, not a visual display). It's only because of changing technology in the interim that the choice of command became obscure.
+1 - Essentially the same thing I said, though you provided a little more detail.
– Jeff Zeitlin, 21 hours ago
It is interesting to ask whether CP/M inherited some of its shell commands from somewhere else, probably from OS/8 (DIR and PIP, for example), and whether MS-DOS inherited something from RT-11 or similar PDP-11 shells.
– lvd, 20 hours ago
Apple DOS uses CATALOG D1. We had programmable keyboards to cope with that nuisance.
– Janka, 19 hours ago
Re: "the directory of a disk", "only supported a single directory per disk": *jaw drop* I never understood until right now why directories are called that. Before nested directories, the directory was literally a directory of the files on the disk. Only after we had nested directories, and started thinking of files as being "in" a directory (= a folder), was the metaphor abandoned.
â ruakh
16 hours ago
Answer (score 11), by Will Hartung
Because typing is awful.
Having to type "select folder" or "change directory" or anything else like that gets very tedious, very quickly.
The commands are meant to express the user's wishes to the computer; they're not necessarily designed for communication between human beings.
In truth "change directory" is no more intuitive than "cd" or "wd" or "F1" or double clicking on that folder thing in the window, especially when the user has no concept of what a "directory" is in the first place. It all needs to be trained up.
Also, there's the usage of the programs. Consider the CP/M pip command, the "Peripheral Interchange Program". Regardless of what the command is called, it is a particularly arcane multitool to try to figure out and use.
Computers are not the commands, they're concepts. The concepts are (demonstrably), even after all this time, quite difficult for folks to grasp. No matter how much lipstick you paint on these things, they're still computers underneath.
Today, we have the momentum of 40+ years of history, so there's motivation to keep things "similar", so as not to have to relearn vocabularies. Take a look at the UCSD P-System. It used "English" for everything (almost - it's menu driven), but to someone used to how things work today? It's not an intuitive system at all. Its assumptions and preconceptions don't match what we're used to today. However, to someone who had never used a computer, I'm sure it may have been easier to adopt. But in the end, an advanced user isn't looking at commands at all: "FC" becomes F)ile -> C)opy.
If you've ever watched someone learn a computer system, they, in general, do not know "why" they do something, only "what to do". This was starkly demonstrated to me when I saw someone learning a hotel management system. The instructor described the interface, the menus, the organization, all this taxonomy of structure built into the system. Meanwhile, at the end, the student basically said "So, to do XX I press F6, right?" That's all they wanted to know. None of the rest took.
In the end, we want to express ourselves succinctly to the computer. Repeatedly typing long treatises to the machine is tedious, slow, and error prone. For the same reason we don't like listening to voice response menus at customer support, once we know the system, we don't like typing long strings of commands if we can possibly avoid it. And since any system needs to be trained upon anyway, in the end many of the commands don't matter, as long as they're remotely mnemonic. (But even that's not true, how is typing 'ant' to compile a program intuitive or mnemonic?)
I know I could say "Alexa, turn down the volume". But it's far more effective to just rotate the knob.
– Brian H, 19 hours ago
Anyone who used NOS/VE for more than a minute learned to appreciate the abbreviated command names. Typing "chawc" instead of "change_working_catalog" to change directories was a time saver.
– Rob Crawford, 16 hours ago
IMO, PowerShell is the first major CLI to finally hit the perfect balance: commands consistently have long, official forms that are easy to read and that you can use in scripts, e.g., Select-String and Get-ChildItem and ForEach-Object, and then they come with short forms for when you know the system and are typing interactively (e.g., sls, gci, %).
– Soren Bjornstad, 14 hours ago
PIP came from the series of operating systems for the PDP-11.
– No'am Newman, 13 hours ago
@Soren - The George 3 operating system (ICL 1900 series mainframes, in the 1960s) consistently had a long form and a two-letter abbreviation for each command. For example, LISTFILE or LF to print or display a file. icl1900.co.uk/g3/commands.html
– dave, 13 hours ago
Answer (score 4), by Jeff Zeitlin
Most of MS-DOS's commands were inherited from CP/M and UNIX, both of which used cryptic commands to conserve space (resources were at a premium in those days - 16 KB of RAM and 10 MB of disk space were considered a lot - and expensive to acquire). Since MS-DOS was also operating under resource constraints, it made sense at the time to use the same sort of compact commands; plus, it increased familiarity and made the transition from those older systems to MS-DOS easier. To this day, CMD.EXE on Windows maintains that compatibility/familiarity, even though resources aren't at that kind of premium. PowerShell, however, breaks that paradigm and provides longer, more descriptive commands.
Answer (score 4)
Is there any reason why MS-DOS didn't use more English words to do tasks,
The most obvious one is that MS-DOS was, in the beginning, a rather plain CP/M clone, which itself was created with DEC systems in mind. Other, later (2.0) additions were taken from Unix. From a developer's point of view it was more important to get the system running than to think about (maybe) more appealing command names.
This also worked well for early users, as the great majority of them were coming from CP/M (and some from Unix) and didn't have to relearn everything, only the newer/different features. That's way less work than learning a whole new system.
In the long run this is also quite important for international sales. While commands like DIR, CD or DEL are based on English words, in themselves they are just a few letters to memorize. No English required. In contrast, a command like 'SHOWALLFILESONDISK' is nonsensical bloat - and equally hard to read and learn even for someone who speaks English.
Why couldn't the developers have made it so that I can type view insertdirectorynamehere -bare? It's a little longer to type, but it's a little more clear what exactly the code is doing.
Wouldn't that only be helpful in the early learning stage? If at all?
After all, the command line is to be used in direct dialogue. Here, short, easily memorized commands are king. Remember how often you screw up typing even these short commands? Every additional letter multiplies the chances of getting an error instead of a result.
A good CLI is always a compromise between shortness, support for memorization, and regularity for cross-referencing (see the story below).
Insert it into a batch file that can take command line parameters and you can save yourself some typing.
Which would then of course get a short name again, to make it handy, right? Except it would be a different short name on different machines, resulting in a real tower-of-Babel effect where no one could talk to any computer but their own - because by then, the long names used during the learning phase would have vanished from memory, and the guessing about names and spelling starts over.
Conclusion: such a CLI wouldn't be forward-looking.
A little mainframe anecdote to add:
In the late 1970s Siemens tried to push their mainframes (/370-ish) into the mini range, targeting office applications. They invested almost more money into creating good educational/user manuals than into downsizing the machines (*1). The basic idea was that DEC, DR and others made their users successfully use some cryptic command line, so why not use the even more complex ... err ... powerful BS2000 (*2) command line :) The project flopped partly because of the hardware, but equally because the targeted users weren't doing personal computing. They used company applications and had never touched a command line.
Forcing the micro/mini-based concept of a command line driving separate, rather minimal applications instead of integrated software not only failed to deliver the promised cost cutting, but also produced unsatisfied users. As a last effort a really nifty menu system was created. While it eased usage for first-time users, it became a ball and chain for more experienced ones. Several revisions of the system happened in short order until it got buried along with the whole project.
So far, the usual story. Except (*3) that the people responsible stayed with the company, close to or within OS development.
Not much later, complaints about the ever-growing complexity of the command line (there were functions with dozens of arguments), together with the increasing cost of implementing new commands and/or parameters for new functions, resulted in a project to redo the command system/interpreter. A really nifty scheme for structuring the command interpreter as well as the command and parameter modules was developed. It worked quite well and proved once more that a complete redo of something that has grown over the years can be a great idea. Productivity of OS development related to command execution more than doubled with the new system.
Except that management (guess who, *4) also came up with the idea of a far more user-friendly command language. Instead of stupid pseudo-language commands (and abbreviated ones at that), a much more readable, writable and memorizable command language should be used - and of course with far fewer parameters, to make it easy. As it happened, a German university had a project on how to create such a language in a systematic way, with tools to define and handle it, so that was brought in as well.
As a result, even simple things like getting the attributes of a file turned into a typing session:
Old:
/fs <filename>,all
New:
/show-file-attributes file-name=<filename> (*5)
There were many good reasons to do this, not least the interpreter structure. Now a command and all its parameters were defined in an unambiguous way (using a DSL) that could not only be used to check for valid commands, but also move those checks out of each function into a centralized command interpreter doing all needed syntax checking before the function gets invoked, thus relieving the function program of many, many statements. The improvement on the OS development side was unprecedented. And they couldn't understand why the reaction on the user side was considerably less enthusiastic. Not at all.
To be fair, they were aware that this is a lot to type, so the system also included a method to shorten commands to the smallest number of letters needed to identify them correctly. Also, parameters that always had to be included (like the file name in the above example) could at least be given without their identifier. In the above case this would shorten the command to:
/s-f-a <filename>
Great - except that this is an especially simple example, and even here it doesn't work that simply, as there is also a set-file-attribute command, so the abbreviation has to be sh-f-a instead.
And here lies the real issue for humans. To decide which abbreviation is possible and which is not, one needs to know all existing commands. There was no structure (as there is with MS-DOS or other hand-crafted CLIs) from which a human could deduce it; it had to be learned, and relearned whenever new commands conflicting with the memorized abbreviations came up.
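As a rough sketch of why that is hard - written as modern cmd.exe batch with a made-up command list, and simplified to plain prefix matching rather than BS2000's per-component abbreviation - this is essentially the check the interpreter (and the user, in their head) has to make:
@echo off
rem resolve.bat - illustration only, not BS2000's real resolver.
rem Checks whether the given abbreviation is a prefix of exactly one known command.
setlocal enabledelayedexpansion
set "COMMANDS=show-file-attributes set-file-attribute"
set "ABBREV=%~1"
rem Batch has no built-in strlen, so count characters by chopping one at a time.
set "TMP=%ABBREV%"
set /a LEN=0
:lenloop
if defined TMP (
    set "TMP=!TMP:~1!"
    set /a LEN+=1
    goto lenloop
)
rem Count every command the abbreviation is a prefix of.
set /a HITS=0
set "MATCH="
for %%C in (%COMMANDS%) do (
    set "NAME=%%C"
    if "!NAME:~0,%LEN%!"=="%ABBREV%" (
        set /a HITS+=1
        set "MATCH=%%C"
    )
)
if %HITS%==1 (echo %ABBREV% resolves to %MATCH%) else (echo %ABBREV% is ambiguous or unknown - %HITS% matches)
With these two commands, 's' is ambiguous while 'sh' and 'se' are unique - and whether an abbreviation stays unique depends on every command that might be added later, which is exactly the problem described above.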
This even proved to be a true bug breeder for batch jobs. While most 'official' jobs were written with the long forms, the usual quickly-typed scripts more often than not used whatever abbreviation their programmer preferred (*6). We have all seen such scripts not just used once, but surviving for years in production environments, haven't we? With this command syntax system, each of them could fail with any new OS release without any prior warning - not only hard to detect and correct, but a sword of Damocles dangling over all jobs.
To make it worse, the now much longer commands made abbreviations almost mandatory. The command line is limited to 72 characters per line and allows only a certain number of continuation lines, so with all these lengthy parameter names, that maximum was easy to reach. So it wasn't just quick-and-dirty procedures sitting below the sword.
Creating a file (entry) with certain attributes for devices to use, access rights and the like was originally possible with a single FILE command, making it not just compact (and hard to read), but also atomic on the file system. Except for rather simple cases, the new syntax required several different commands to first create the file and then assign the attributes and so on - and several of them, as complex cases couldn't fit on a single command line. All while hoping that every other process watching the file system could cope with intermediate states that could not have happened before (*7).
Heck, it wasn't all that bad - or at least it wouldn't seem so at first sight:
The systematic command definition allowed the addition of a help screen system that did not just show general messages, but gave quite detailed information about what was wrong with the command typed - all without any involvement of the function itself. These screens even allowed menu-driven, point-and-click-like completion with nicely formatted fields for all usable parameters and so on. Except that, to do so, the terminal was switched from line mode into form mode - something a CLI user hates. Not just because a different handling metaphor (think of a CLI window suddenly popping up a modal box to be handled with the mouse) destroys the flow, but also because the screen was cleared afterwards, so there was no easy way to copy and reuse any earlier line. Some users really developed a neurosis about not getting the help screen (*8).
Oh, and to close the circle, there is a final irony: people introduced to it today (well, after 1990) complain a lot about the totally unimaginative, bureaucratic and unhandy nature of this CLI - and blame it on being typical mainframe complexity.
Long story short: OS development has been there, and it turned out to be less than desirable.
*1 - The latter resulted in what was the most unreliable and least sold of all their mainframes, the infamous CoCo - here meaning Compact Computer, despite still being the size of several table-height refrigerators.
*2 - BS2000 is still around, now owned by Fujitsu. In my opinion possibly the best OS design ever. OK, granted, I should limit that to 'best internal structure and design ever'.
*3 - Which again is usual in large companies.
*4 - I don't have proof for that claim beyond being told so back then. So while it sounds plausible, it might not be true.
*5 - Looks much like what the OP asked for - even more readable, with hyphens separating real words, doesn't it?
*6 - It's much like the problem I mentioned earlier in this answer about private naming conventions for batch files meant to shorten the typing.
*7 - I had one case like that in a real customer application, where one (redone) job was creating a file in a transfer directory. The completely independent (and older) transfer process did not include handling for a file with attributes set only partly the way it was expecting them. As this software was third-party and a (mostly) closed system, it took me quite a while to figure out a sequence that made both of us happy.
*8 - Later on it was made optional and regular error messages were displayed - but then again, instead of a simple line like "File name missing", a structured analysis was presented over up to a dozen lines - again killing most of your screen.
Thanks for telling us about this precursor of PowerShell ;)
– grahamj42, 14 hours ago
Answer (score 2)
The reason for short commands has been re-stated several ways here. But there's one critical detail missing from all of the answers: diskpart was introduced after Windows NT was developed.
Windows NT is based on many of the core concepts of Digital's VMS. Dave Cutler - the chief system architect of VMS - quit Digital (DEC) in 1988, and Microsoft hired him and part of his team to develop a new enterprise-grade OS.
In VMS, most complex commands could either be executed as interactive commands with their own shell, or by specifying the command with enough options from the terminal. MS-DOS had previously borrowed DEC's / as an option specifier, so the convention was already familiar to PC users when it was used in Windows NT's shell.
All of VMS's commands could be shortened down to the shortest unique identifier (with occasionally puerile side-effects, especially for larval VMS admins like I was). While you could type out the commands in full (helpful in scripts you were going to reuse) it was usual to shorten commands down to the bone.
Answer (score 1)
It seems that an important fact is missing in the answers here:
Almost all commands in MS-DOS (and most other operating systems) are in fact filenames.
The length of filenames was restricted to 8 characters on MS-DOS (and on CP/M, from which many commands were taken).
Up to MS-DOS 6.22 it still used only 8-character command names (see the list of all MS-DOS commands).
The filename origin also explains why the commands are single words: a space was for a long time a no-no in filenames, and even when filenames longer than 8 characters started to be used, most systems had problems with spaces in a filename.
Historical note: some systems had even shorter filenames, which restricted the length of commands even more (RT-11 on the PDP-11 is one example, with 6+3 characters per filename).
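As an illustration of commands really just being files on disk, here is a rough sketch of the lookup COMMAND.COM performs for an external command - re-created as a batch file with a hypothetical search path, not DOS's actual code:
@echo off
rem whichcmd.bat - sketch of external-command lookup (hypothetical directories).
rem DOS tries NAME.COM, then NAME.EXE, then NAME.BAT in the current directory
rem and in each PATH directory; this sketch echoes every candidate it finds,
rem whereas the real lookup stops at the first hit.
set "NAME=%~1"
set "SEARCHPATH=. C:\DOS C:\UTILS"
for %%D in (%SEARCHPATH%) do (
    for %%E in (COM EXE BAT) do (
        if exist "%%D\%NAME%.%%E" echo candidate: %%D\%NAME%.%%E
    )
)
For example, whichcmd format would report C:\DOS\FORMAT.COM if that file exists in one of the listed directories.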
6 Answers
6
active
oldest
votes
6 Answers
6
active
oldest
votes
active
oldest
votes
active
oldest
votes
up vote
18
down vote
accepted
MS DOS inherited many of its commands from CP/M. CP/M was designed with influences from classic minicomputer operating systems, especially those produced by DEC. Many of these systems dated back to the mid to late 1960s and were designed to run in very little space, e.g. DOS-11 ran on a PDP 11 with 8Â KB of RAM. They were also mostly designed primarily for programmers, not end users.
This meant that the influences drawn had terse command structures, often using as few characters as possible, and users were expected to understand and appreciate the precision of using technical terms.
You can see the influence of this on DIR
, as in your example. It shows the directory of a disk (CP/M and early DOS versions only supported a single directory per disk - nested subdirectories were a feature added in DOS 2.0). "VIEW
" could have been ambiguous - it may have shown the contents of a file instead.
TYPE
is another good example, but for a different reason - it printed the contents of a file (early versions of CP/M were intended for use on a teletype, not a visual display). It's only because of changing technology in the interim that the choice of command became obscure.
+1 - Essentially the same thing I said, though you provided a little more detail.
â Jeff Zeitlin
21 hours ago
1
It is interesting enough whether CP/M inherites some of its shell commands from somewhere else, probably from OS/8 (DIR, PIP for example). And whether MSDOS inherites something from RT-11 or similar PDP-11 shells.
â lvd
20 hours ago
Apple DOS uses CATALOG D1. We had programmable keyboards to cope with that nuisance.
â Janka
19 hours ago
1
Re: "the directory of a disk", "only supported a single directory per disk": *jaw drop* I never understood until right now why directories are called that. Before nested directories, the directory was literally a directory of the files on the disk. Only after we had nested directories, and started thinking of files as being "in" a directory (= a folder), was the metaphor abandoned.
â ruakh
16 hours ago
add a comment |Â
up vote
18
down vote
accepted
MS DOS inherited many of its commands from CP/M. CP/M was designed with influences from classic minicomputer operating systems, especially those produced by DEC. Many of these systems dated back to the mid to late 1960s and were designed to run in very little space, e.g. DOS-11 ran on a PDP 11 with 8Â KB of RAM. They were also mostly designed primarily for programmers, not end users.
This meant that the influences drawn had terse command structures, often using as few characters as possible, and users were expected to understand and appreciate the precision of using technical terms.
You can see the influence of this on DIR
, as in your example. It shows the directory of a disk (CP/M and early DOS versions only supported a single directory per disk - nested subdirectories were a feature added in DOS 2.0). "VIEW
" could have been ambiguous - it may have shown the contents of a file instead.
TYPE
is another good example, but for a different reason - it printed the contents of a file (early versions of CP/M were intended for use on a teletype, not a visual display). It's only because of changing technology in the interim that the choice of command became obscure.
+1 - Essentially the same thing I said, though you provided a little more detail.
â Jeff Zeitlin
21 hours ago
1
It is interesting enough whether CP/M inherites some of its shell commands from somewhere else, probably from OS/8 (DIR, PIP for example). And whether MSDOS inherites something from RT-11 or similar PDP-11 shells.
â lvd
20 hours ago
Apple DOS uses CATALOG D1. We had programmable keyboards to cope with that nuisance.
â Janka
19 hours ago
1
Re: "the directory of a disk", "only supported a single directory per disk": *jaw drop* I never understood until right now why directories are called that. Before nested directories, the directory was literally a directory of the files on the disk. Only after we had nested directories, and started thinking of files as being "in" a directory (= a folder), was the metaphor abandoned.
â ruakh
16 hours ago
add a comment |Â
up vote
18
down vote
accepted
up vote
18
down vote
accepted
MS DOS inherited many of its commands from CP/M. CP/M was designed with influences from classic minicomputer operating systems, especially those produced by DEC. Many of these systems dated back to the mid to late 1960s and were designed to run in very little space, e.g. DOS-11 ran on a PDP 11 with 8Â KB of RAM. They were also mostly designed primarily for programmers, not end users.
This meant that the influences drawn had terse command structures, often using as few characters as possible, and users were expected to understand and appreciate the precision of using technical terms.
You can see the influence of this on DIR
, as in your example. It shows the directory of a disk (CP/M and early DOS versions only supported a single directory per disk - nested subdirectories were a feature added in DOS 2.0). "VIEW
" could have been ambiguous - it may have shown the contents of a file instead.
TYPE
is another good example, but for a different reason - it printed the contents of a file (early versions of CP/M were intended for use on a teletype, not a visual display). It's only because of changing technology in the interim that the choice of command became obscure.
MS DOS inherited many of its commands from CP/M. CP/M was designed with influences from classic minicomputer operating systems, especially those produced by DEC. Many of these systems dated back to the mid to late 1960s and were designed to run in very little space, e.g. DOS-11 ran on a PDP 11 with 8Â KB of RAM. They were also mostly designed primarily for programmers, not end users.
This meant that the influences drawn had terse command structures, often using as few characters as possible, and users were expected to understand and appreciate the precision of using technical terms.
You can see the influence of this on DIR
, as in your example. It shows the directory of a disk (CP/M and early DOS versions only supported a single directory per disk - nested subdirectories were a feature added in DOS 2.0). "VIEW
" could have been ambiguous - it may have shown the contents of a file instead.
TYPE
is another good example, but for a different reason - it printed the contents of a file (early versions of CP/M were intended for use on a teletype, not a visual display). It's only because of changing technology in the interim that the choice of command became obscure.
edited 3 mins ago
wizzwizz4â¦
7,87363599
7,87363599
answered 21 hours ago
Jules
7,39612038
7,39612038
+1 - Essentially the same thing I said, though you provided a little more detail.
â Jeff Zeitlin
21 hours ago
1
It is interesting enough whether CP/M inherites some of its shell commands from somewhere else, probably from OS/8 (DIR, PIP for example). And whether MSDOS inherites something from RT-11 or similar PDP-11 shells.
â lvd
20 hours ago
Apple DOS uses CATALOG D1. We had programmable keyboards to cope with that nuisance.
â Janka
19 hours ago
1
Re: "the directory of a disk", "only supported a single directory per disk": *jaw drop* I never understood until right now why directories are called that. Before nested directories, the directory was literally a directory of the files on the disk. Only after we had nested directories, and started thinking of files as being "in" a directory (= a folder), was the metaphor abandoned.
â ruakh
16 hours ago
add a comment |Â
+1 - Essentially the same thing I said, though you provided a little more detail.
â Jeff Zeitlin
21 hours ago
1
It is interesting enough whether CP/M inherites some of its shell commands from somewhere else, probably from OS/8 (DIR, PIP for example). And whether MSDOS inherites something from RT-11 or similar PDP-11 shells.
â lvd
20 hours ago
Apple DOS uses CATALOG D1. We had programmable keyboards to cope with that nuisance.
â Janka
19 hours ago
1
Re: "the directory of a disk", "only supported a single directory per disk": *jaw drop* I never understood until right now why directories are called that. Before nested directories, the directory was literally a directory of the files on the disk. Only after we had nested directories, and started thinking of files as being "in" a directory (= a folder), was the metaphor abandoned.
â ruakh
16 hours ago
+1 - Essentially the same thing I said, though you provided a little more detail.
â Jeff Zeitlin
21 hours ago
+1 - Essentially the same thing I said, though you provided a little more detail.
â Jeff Zeitlin
21 hours ago
1
1
It is interesting enough whether CP/M inherites some of its shell commands from somewhere else, probably from OS/8 (DIR, PIP for example). And whether MSDOS inherites something from RT-11 or similar PDP-11 shells.
â lvd
20 hours ago
It is interesting enough whether CP/M inherites some of its shell commands from somewhere else, probably from OS/8 (DIR, PIP for example). And whether MSDOS inherites something from RT-11 or similar PDP-11 shells.
â lvd
20 hours ago
Apple DOS uses CATALOG D1. We had programmable keyboards to cope with that nuisance.
â Janka
19 hours ago
Apple DOS uses CATALOG D1. We had programmable keyboards to cope with that nuisance.
â Janka
19 hours ago
1
1
Re: "the directory of a disk", "only supported a single directory per disk": *jaw drop* I never understood until right now why directories are called that. Before nested directories, the directory was literally a directory of the files on the disk. Only after we had nested directories, and started thinking of files as being "in" a directory (= a folder), was the metaphor abandoned.
â ruakh
16 hours ago
Re: "the directory of a disk", "only supported a single directory per disk": *jaw drop* I never understood until right now why directories are called that. Before nested directories, the directory was literally a directory of the files on the disk. Only after we had nested directories, and started thinking of files as being "in" a directory (= a folder), was the metaphor abandoned.
â ruakh
16 hours ago
add a comment |Â
up vote
11
down vote
Because typing is awful.
Having to type "select folder" or "change directory" or anything else like that gets very tedious, very quickly.
The commands are meant to be used to express the users wishes to the computer, they're not designed to be necessarily used to communicate between human beings.
In truth "change directory" is no more intuitive than "cd" or "wd" or "F1" or double clicking on that folder thing in the window, especially when the user has no concept of what a "directory" is in the first place. It all needs to be trained up.
Also, there's the usage of the programs. Consider the CP/M pip
command. "Peripheral Interface Program" (or something close). Regardless of what the command is called, this is a particularly arcane multitool to try and figure out and use.
Computers are not the commands, they're concepts. The concepts are (demonstrably), even after all this time, quite difficult for folks to grasp. No matter how much lipstick you paint on these things, they're still computers underneath.
Today, we have the momentum of 40+ years of history, so there's motivation today to keep things "similar". So as not to have to relearn vocabularies. Take a look at the UCSD P-System. It used "English" for everything (almost, it's menu driven), but to someone used to how things work today? It's not an intuitive system at all. It's assumptions and preconceptions don't match what we're used to today. However, to someone who never used a computer, I'm sure it may have been easier to adopt. But in the end, an advanced user isn't looking at commands at all. "FC" becomes F)ile -> C)opy.
If you've ever watched someone learn a computer system, they, in general, do not know "why" they do something, only "What to do". This was starkly demonstrated to me when I saw someone learning a hotel management system. The instructor described the interface, the menus, the organization, all this taxonomy of structure built in the system. Meanwhile, at the end, the student basically said "So, to do XX I pre F6, right?" That's all they wanted to know. All of the rest, none of that took.
In the end, we want to express ourselves succinctly to the computer. Repeatedly typing long treatises to the machine is tedious, slow, and error prone. For the same reason we don't like listening to voice response menus at customer support, once we know the system, we don't like typing long strings of commands if we can possibly avoid it. And since any system needs to be trained upon anyway, in the end many of the commands don't matter, as long as they're remotely mnemonic. (But even that's not true, how is typing 'ant' to compile a program intuitive or mnemonic?)
6
I know I could say "Alexa, turn down the volume". But it's far more effective to just rotate the knob.
â Brian H
19 hours ago
1
Anyone who used NOS/VE for more than a minute learned to appreciate the abbreviated command names. Typing "chawc" instead of "change_working_catalog" to change directories was a time saver.
â Rob Crawford
16 hours ago
1
IMO, PowerShell is the first major CLI to finally hit the perfect balance â commands consistently have long, official forms that are easy to read and you can use in scripts, e.g.,Select-String
andGet-ChildItem
andForEach-Object
, and then they come with short forms for when you know the system and are typing interactively (e.g.,sls
,gci
,%
).
â Soren Bjornstad
14 hours ago
PIP came from the series of operating systems for the PDP-11.
â No'am Newman
13 hours ago
@Soren - The George 3 operating system (ICL 1900 series mainframes, in the 1960s) consistently had a long form and a two-letter abbreviation for each command. For example, LISTFILE or LF to print or display a file. icl1900.co.uk/g3/commands.html
â dave
13 hours ago
add a comment |Â
up vote
11
down vote
Because typing is awful.
Having to type "select folder" or "change directory" or anything else like that gets very tedious, very quickly.
The commands are meant to be used to express the users wishes to the computer, they're not designed to be necessarily used to communicate between human beings.
In truth "change directory" is no more intuitive than "cd" or "wd" or "F1" or double clicking on that folder thing in the window, especially when the user has no concept of what a "directory" is in the first place. It all needs to be trained up.
Also, there's the usage of the programs. Consider the CP/M pip
command. "Peripheral Interface Program" (or something close). Regardless of what the command is called, this is a particularly arcane multitool to try and figure out and use.
Computers are not the commands, they're concepts. The concepts are (demonstrably), even after all this time, quite difficult for folks to grasp. No matter how much lipstick you paint on these things, they're still computers underneath.
Today, we have the momentum of 40+ years of history, so there's motivation today to keep things "similar". So as not to have to relearn vocabularies. Take a look at the UCSD P-System. It used "English" for everything (almost, it's menu driven), but to someone used to how things work today? It's not an intuitive system at all. It's assumptions and preconceptions don't match what we're used to today. However, to someone who never used a computer, I'm sure it may have been easier to adopt. But in the end, an advanced user isn't looking at commands at all. "FC" becomes F)ile -> C)opy.
If you've ever watched someone learn a computer system, they, in general, do not know "why" they do something, only "What to do". This was starkly demonstrated to me when I saw someone learning a hotel management system. The instructor described the interface, the menus, the organization, all this taxonomy of structure built in the system. Meanwhile, at the end, the student basically said "So, to do XX I pre F6, right?" That's all they wanted to know. All of the rest, none of that took.
In the end, we want to express ourselves succinctly to the computer. Repeatedly typing long treatises to the machine is tedious, slow, and error prone. For the same reason we don't like listening to voice response menus at customer support, once we know the system, we don't like typing long strings of commands if we can possibly avoid it. And since any system needs to be trained upon anyway, in the end many of the commands don't matter, as long as they're remotely mnemonic. (But even that's not true, how is typing 'ant' to compile a program intuitive or mnemonic?)
6
I know I could say "Alexa, turn down the volume". But it's far more effective to just rotate the knob.
â Brian H
19 hours ago
1
Anyone who used NOS/VE for more than a minute learned to appreciate the abbreviated command names. Typing "chawc" instead of "change_working_catalog" to change directories was a time saver.
â Rob Crawford
16 hours ago
1
IMO, PowerShell is the first major CLI to finally hit the perfect balance â commands consistently have long, official forms that are easy to read and you can use in scripts, e.g.,Select-String
andGet-ChildItem
andForEach-Object
, and then they come with short forms for when you know the system and are typing interactively (e.g.,sls
,gci
,%
).
â Soren Bjornstad
14 hours ago
PIP came from the series of operating systems for the PDP-11.
â No'am Newman
13 hours ago
@Soren - The George 3 operating system (ICL 1900 series mainframes, in the 1960s) consistently had a long form and a two-letter abbreviation for each command. For example, LISTFILE or LF to print or display a file. icl1900.co.uk/g3/commands.html
â dave
13 hours ago
add a comment |Â
up vote
11
down vote
up vote
11
down vote
Because typing is awful.
Having to type "select folder" or "change directory" or anything else like that gets very tedious, very quickly.
The commands are meant to be used to express the users wishes to the computer, they're not designed to be necessarily used to communicate between human beings.
In truth "change directory" is no more intuitive than "cd" or "wd" or "F1" or double clicking on that folder thing in the window, especially when the user has no concept of what a "directory" is in the first place. It all needs to be trained up.
Also, there's the usage of the programs. Consider the CP/M pip
command. "Peripheral Interface Program" (or something close). Regardless of what the command is called, this is a particularly arcane multitool to try and figure out and use.
Computers are not the commands, they're concepts. The concepts are (demonstrably), even after all this time, quite difficult for folks to grasp. No matter how much lipstick you paint on these things, they're still computers underneath.
Today, we have the momentum of 40+ years of history, so there's motivation today to keep things "similar". So as not to have to relearn vocabularies. Take a look at the UCSD P-System. It used "English" for everything (almost, it's menu driven), but to someone used to how things work today? It's not an intuitive system at all. It's assumptions and preconceptions don't match what we're used to today. However, to someone who never used a computer, I'm sure it may have been easier to adopt. But in the end, an advanced user isn't looking at commands at all. "FC" becomes F)ile -> C)opy.
If you've ever watched someone learn a computer system, they, in general, do not know "why" they do something, only "What to do". This was starkly demonstrated to me when I saw someone learning a hotel management system. The instructor described the interface, the menus, the organization, all this taxonomy of structure built in the system. Meanwhile, at the end, the student basically said "So, to do XX I pre F6, right?" That's all they wanted to know. All of the rest, none of that took.
In the end, we want to express ourselves succinctly to the computer. Repeatedly typing long treatises to the machine is tedious, slow, and error prone. For the same reason we don't like listening to voice response menus at customer support, once we know the system, we don't like typing long strings of commands if we can possibly avoid it. And since any system needs to be trained upon anyway, in the end many of the commands don't matter, as long as they're remotely mnemonic. (But even that's not true, how is typing 'ant' to compile a program intuitive or mnemonic?)
Because typing is awful.
Having to type "select folder" or "change directory" or anything else like that gets very tedious, very quickly.
The commands are meant to be used to express the users wishes to the computer, they're not designed to be necessarily used to communicate between human beings.
In truth "change directory" is no more intuitive than "cd" or "wd" or "F1" or double clicking on that folder thing in the window, especially when the user has no concept of what a "directory" is in the first place. It all needs to be trained up.
Also, there's the usage of the programs. Consider the CP/M pip
command. "Peripheral Interface Program" (or something close). Regardless of what the command is called, this is a particularly arcane multitool to try and figure out and use.
Computers are not the commands, they're concepts. The concepts are (demonstrably), even after all this time, quite difficult for folks to grasp. No matter how much lipstick you paint on these things, they're still computers underneath.
Today, we have the momentum of 40+ years of history, so there's motivation today to keep things "similar". So as not to have to relearn vocabularies. Take a look at the UCSD P-System. It used "English" for everything (almost, it's menu driven), but to someone used to how things work today? It's not an intuitive system at all. It's assumptions and preconceptions don't match what we're used to today. However, to someone who never used a computer, I'm sure it may have been easier to adopt. But in the end, an advanced user isn't looking at commands at all. "FC" becomes F)ile -> C)opy.
If you've ever watched someone learn a computer system, they, in general, do not know "why" they do something, only "What to do". This was starkly demonstrated to me when I saw someone learning a hotel management system. The instructor described the interface, the menus, the organization, all this taxonomy of structure built in the system. Meanwhile, at the end, the student basically said "So, to do XX I pre F6, right?" That's all they wanted to know. All of the rest, none of that took.
In the end, we want to express ourselves succinctly to the computer. Repeatedly typing long treatises to the machine is tedious, slow, and error prone. For the same reason we don't like listening to voice response menus at customer support, once we know the system, we don't like typing long strings of commands if we can possibly avoid it. And since any system needs to be trained upon anyway, in the end many of the commands don't matter, as long as they're remotely mnemonic. (But even that's not true, how is typing 'ant' to compile a program intuitive or mnemonic?)
answered 20 hours ago
Will Hartung
2,622616
2,622616
6
I know I could say "Alexa, turn down the volume". But it's far more effective to just rotate the knob.
â Brian H
19 hours ago
1
Anyone who used NOS/VE for more than a minute learned to appreciate the abbreviated command names. Typing "chawc" instead of "change_working_catalog" to change directories was a time saver.
â Rob Crawford
16 hours ago
1
IMO, PowerShell is the first major CLI to finally hit the perfect balance â commands consistently have long, official forms that are easy to read and you can use in scripts, e.g.,Select-String
andGet-ChildItem
andForEach-Object
, and then they come with short forms for when you know the system and are typing interactively (e.g.,sls
,gci
,%
).
â Soren Bjornstad
14 hours ago
PIP came from the series of operating systems for the PDP-11.
â No'am Newman
13 hours ago
@Soren - The George 3 operating system (ICL 1900 series mainframes, in the 1960s) consistently had a long form and a two-letter abbreviation for each command. For example, LISTFILE or LF to print or display a file. icl1900.co.uk/g3/commands.html
â dave
13 hours ago
add a comment |Â
6
I know I could say "Alexa, turn down the volume". But it's far more effective to just rotate the knob.
â Brian H
19 hours ago
1
Anyone who used NOS/VE for more than a minute learned to appreciate the abbreviated command names. Typing "chawc" instead of "change_working_catalog" to change directories was a time saver.
â Rob Crawford
16 hours ago
1
IMO, PowerShell is the first major CLI to finally hit the perfect balance â commands consistently have long, official forms that are easy to read and you can use in scripts, e.g.,Select-String
andGet-ChildItem
andForEach-Object
, and then they come with short forms for when you know the system and are typing interactively (e.g.,sls
,gci
,%
).
â Soren Bjornstad
14 hours ago
PIP came from the series of operating systems for the PDP-11.
â No'am Newman
13 hours ago
@Soren - The George 3 operating system (ICL 1900 series mainframes, in the 1960s) consistently had a long form and a two-letter abbreviation for each command. For example, LISTFILE or LF to print or display a file. icl1900.co.uk/g3/commands.html
â dave
13 hours ago
6
6
I know I could say "Alexa, turn down the volume". But it's far more effective to just rotate the knob.
â Brian H
19 hours ago
I know I could say "Alexa, turn down the volume". But it's far more effective to just rotate the knob.
â Brian H
19 hours ago
1
1
Anyone who used NOS/VE for more than a minute learned to appreciate the abbreviated command names. Typing "chawc" instead of "change_working_catalog" to change directories was a time saver.
â Rob Crawford
16 hours ago
Anyone who used NOS/VE for more than a minute learned to appreciate the abbreviated command names. Typing "chawc" instead of "change_working_catalog" to change directories was a time saver.
â Rob Crawford
16 hours ago
1
1
IMO, PowerShell is the first major CLI to finally hit the perfect balance â commands consistently have long, official forms that are easy to read and you can use in scripts, e.g.,
Select-String
and Get-ChildItem
and ForEach-Object
, and then they come with short forms for when you know the system and are typing interactively (e.g., sls
, gci
, %
).â Soren Bjornstad
14 hours ago
IMO, PowerShell is the first major CLI to finally hit the perfect balance â commands consistently have long, official forms that are easy to read and you can use in scripts, e.g.,
Select-String
and Get-ChildItem
and ForEach-Object
, and then they come with short forms for when you know the system and are typing interactively (e.g., sls
, gci
, %
).â Soren Bjornstad
14 hours ago
PIP came from the series of operating systems for the PDP-11.
â No'am Newman
13 hours ago
PIP came from the series of operating systems for the PDP-11.
â No'am Newman
13 hours ago
@Soren - The George 3 operating system (ICL 1900 series mainframes, in the 1960s) consistently had a long form and a two-letter abbreviation for each command. For example, LISTFILE or LF to print or display a file. icl1900.co.uk/g3/commands.html
â dave
13 hours ago
@Soren - The George 3 operating system (ICL 1900 series mainframes, in the 1960s) consistently had a long form and a two-letter abbreviation for each command. For example, LISTFILE or LF to print or display a file. icl1900.co.uk/g3/commands.html
â dave
13 hours ago
add a comment |Â
up vote
4
down vote
Most of MS-DOS's commands were inherited from CP/M and UNIX, both of which used cryptic commands to conserve space (resources were at a premium in those days - 16K of RAM and 10MB of disk space was considered A LOT! - and expensive to acquire). Since MS-DOS was also operating under resource constraints, it made sense at the time to use the same sort of compact commands, plus, it increased familiarity and made the transition from those older systems to MS-DOS easier. To this day, CMD.EXE on Windows maintains that compatibility/familiarity, even though resources aren't at that kind of premium. PowerShell, however, breaks that paradigm, and provides longer, more descriptive commands.
answered 21 hours ago
Jeff Zeitlin
up vote
4
down vote
Is there any reason why MS-DOS didn't use more English words to do tasks,
The most obvious one is that MS-DOS was in the beginning a rather plain CP/M clone, which itself was created with DEC systems in mind. Other, later (2.0) additions were taken from Unix. From a developer point of view it was more important to get the system running than to think about (maybe) more appealing command names.
This worked for early users as well, as the great majority of them were coming from CP/M (and some from Unix) and didn't have to relearn everything, only the newer/different features. That's way less work than learning a whole new system.
In the long run this is also quite important for international sales. While commands like DIR, CD or DEL are based on English words, in themselves they are just a few letters to memorize. No English required. In contrast, a command like 'SHOWALLFILESONDISK' is nonsensical bloat - and equally hard to read and learn for anyone speaking English as well.
Why couldn't the developers have made it so that I can type view insertdirectorynamehere -bare? It's a little longer to type, but it's a little more clear what exactly the code is doing.
Wouldn't that only be helpful in the early learning stage, if at all?
After all, the command line is meant to be used in direct dialogue. Here short, easily memorable commands are king. Remember how often you mistype even these short commands? Every additional letter multiplies the chance of getting an error instead of a result.
A good CLI is always a compromise between brevity, memorability and enough regularity to cross-reference commands (see the story below).
Insert it into a batch file that can take command line parameters and you can save yourself some typing.
Which would then of course get a short name again, to make it practical, right? Except it would be a different short name on different machines, resulting in a real Babylon effect of no one being able to talk to any computer but their own - because by then the long names used during the learning phase have vanished from memory, and the guessing about names and spelling starts over.
Conclusion: such a CLI wouldn't be forward-looking.
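To make the batch-file point above concrete, such a private shorthand might look like this minimal sketch (the name LB.BAT and its comments are made up for illustration, not taken from any real setup):
@echo off
rem LB.BAT - one user's private shorthand for a "bare" directory listing
rem usage: LB [directory]
dir /b %1
Handy for whoever wrote it, but opaque on the next machine over - which is exactly the Babylon effect described above.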
A little mainframe anecdote to add:
In the late 1970s Siemens tried to push their mainframes (/370ish) into the mini range, targeting office applications. They invested almost more money into creating good educational/user manuals than into downsizing the machines (*1). The basic idea was that DEC, DR and others made their users successfully use some cryptic command line, so why not use the even more complex ... err ... powerful BS2000 (*2) command line :) The project flopped partly due to the hardware, but equally due to the fact that the targeted users weren't doing personal computing. They used company applications and never touched a command line.
Forcing the micro/mini-based concept of a command line driving separate, rather minimal applications instead of integrated software not only failed to bring the promised cost cutting, but also produced unsatisfied users. As a last effort a really nifty menu system was created. While it eased usage for first-time users, it became a millstone for more experienced ones. Several revisions of the system followed in short order until it got buried with the whole project.
So far, the usual story. Except (*3) that the people responsible stayed with the company, close to or within OS development.
Not much later, complaints about the ever-growing complexity of the command line (there were functions with dozens of arguments), together with the increasing cost of implementing new commands and/or parameters needed for new functions, resulted in a project to redo the command system/interpreter. A really nifty system to structure the command interpreter as well as the command and parameter modules was developed. It worked quite well and proved again that a complete redo of something that has grown over years can be a great idea. Productivity of OS development related to command handling more than doubled with the new system.
Except, management (guess who *4) also came up with the idea of a far more user-friendly command language. Instead of stupid pseudo-language commands (and those even abbreviated), a much more clearly readable, writable and memorizable command language should be used. And of course with far fewer parameters, to make it easy. Handily, there was also a project at a German university about how to create such a language in a systematic way, with tools to define and handle it, so that was brought in as well.
As a result, even simple things like getting the attributes of a file turned into a typing session:
Old:
/fs <filename>,all
New:
/show-file-attributes file-name=<filename> (*5)
There were many good reasons to do this, not least the interpreter structure. Now a command and all its parameters were defined in an unambiguous way (using a DSL) that could not only be used to check for valid commands, but also move these checks out of each function into a centralized command interpreter doing all needed syntax checks before the function gets invoked, thus relieving the function program of many, many statements. The improvement on the OS development side was unprecedented. And they couldn't understand why the reaction on the user side was considerably less enthusiastic. Not at all.
To be fair, they were aware that this is a lot to type, so the system also included a method to shorten commands to the minimum number of letters needed to identify them unambiguously. Also, parameters that always had to be included (like the file name in the above example) could be given without their identifier. In the above case this would shorten the command to:
/s-f-a <filename>
Great, except this is an especially simple example, and even here it doesn't work quite that simply, as there is also a set-file-attribute command, so the abbreviation must rather be sh-f-a.
And here lies the real issue for humans. To decide which abbreviation is possible and which is not, one needs to know all existing commands. There was no structure (as with MS-DOS or other hand-crafted CLIs) that a human could deduce; it had to be learned, and relearned whenever new commands came up that conflicted with the memorized abbreviations.
This proved a true bug breeder for batch jobs. While most 'official' jobs were written with the long forms, the usual quickly-typed scripts more often than not used whatever abbreviation their programmer preferred (*6). We have all seen such scripts not just used once, but surviving for years in production environments, haven't we? With this command syntax system, each of them could fail with any new OS release without any prior warning. Not only hard to detect and correct, but a sword of Damocles dangling over all jobs.
To make it worse, the now much longer commands made abbreviations almost mandatory. The command line is limited to 72 characters per line and allows only a certain number of continuation lines. With all these lengthy parameter names, this maximum was easy to reach. So it wasn't just quick-and-dirty procedures that sat under the sword.
Creating a file (entry) with certain attributes about devices to use, access rights and the like was originally possible within a single FILE command, making it not just compact (and hard to read), but also atomic on the file system. Except for rather simple cases, the new syntax required several different commands to first create the file and then assign attributes and the like - and several of each, as complex cases couldn't fit on a single command line. All while hoping that all other processes watching the file system would be able to cope with intermediate states that could not have happened before (*7).
Heck, it wasn't all that bad - or at least it wouldn't seem so at first sight:
The systematic command definition allowed the addition of a help screen system that did not just show general messages, but gave quite detailed information about what was wrong with the command typed - all without any involvement of the function itself. These screens even allowed menu-driven, point-and-click-like completion with nicely formatted fields for all usable parameters and so on. Except that, to do so, the terminal was switched from line mode into form mode. Something a CLI user hates. Not just because a different handling metaphor (think of a CLI window suddenly popping up a modal box to be handled with the mouse) destroys the flow, but also because the screen was cleared afterwards, so there was no easy way to copy and reuse any former line. Some users really developed a neurosis about not getting the help screen (*8).
Oh, and to close the circle, there is a final staircase wit to it: people introduced today (well, after 1990) complain a lot about the totally unimaginative, bureaucratic and unhandy nature of this CLI - and blame it on being typical mainframe complexity.
Long story short: OS development has been there, and it turned out to be less than desirable.
*1 - The latter resulted in what was the most unreliable and least sold of all of their mainframes, the infamous CoCo - here meaning Compact Computer, despite the fact that it was still the size of several table-height refrigerators.
*2 - BS2000 is still around, now owned by Fujitsu. In my opinion eventually the best OS design ever. OK, granted, I should limit this to 'best internal structure and design ever'.
*3 - Which again is usual in large companies.
*4 - I don't have proof for that claim beyond being told so back then. So while it sounds plausible, it might not be true.
*5 - Looks much like what the OP asked for - even more readable with hyphens separating real words, doesn't it?
*6 - It's much like the problem I mentioned above about private naming conventions for batch files meant to shorten the typing.
*7 - I had one case like that in a real customer application, where one (redone) job was creating a file in a transfer directory. The completely independent (and older) transfer process did not include handling for a file with attributes set only partly the way it was expecting them. As this software was third-party and a (mostly) closed system, it took me quite a while to figure out a sequence to make both of us happy.
*8 - Later on it was made optional and regular error messages were displayed - but then again, instead of a simple line like 'File name missing', a structured analysis was presented over up to a dozen lines - again killing most of your screen.
edited 33 mins ago
Martin Argerami
answered 18 hours ago
Raffzahn
1
Thanks for telling us about this precursor of PowerShell ;)
â grahamj42
14 hours ago
up vote
2
down vote
The reason for short commands has been re-stated several ways here. But there's one critical detail missing from all of the answers: diskpart was introduced after Windows NT was developed.
Windows NT is based on many of the core concepts of Digital's VMS. Dave Cutler - the chief system architect of VMS - quit Digital (DEC) in 1988, and Microsoft hired him and part of his team to develop a new enterprise-grade OS.
In VMS, most complex commands could either be executed as interactive commands with their own shell, or by specifying the command with enough options from the terminal. MS-DOS had previously borrowed DEC's / as an option specifier, so the convention was already familiar to PC users when it was used in Windows NT's shell.
All of VMS's commands could be shortened down to the shortest unique identifier (with occasionally puerile side-effects, especially for larval VMS admins like I was). While you could type out the commands in full (helpful in scripts you were going to reuse) it was usual to shorten commands down to the bone.
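For illustration - written from memory, so treat the exact spellings as an assumption rather than a reference - here is a DCL command in its full and shortest unique forms, next to the DOS switch style it inspired:
$ SHOW DEFAULT      (full DCL form: print the current directory)
$ SH DEF            (the same command, trimmed to a unique abbreviation)
C:\> dir /w         (MS-DOS reusing DEC's / as the option character)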
answered 13 hours ago
scruss
up vote
1
down vote
It seems that an important fact is missing in the answers here:
Almost all commands in MS-DOS (and most other operating systems) are in fact filenames.
The length of filenames was restricted to 8 characters on MS-DOS (and on CP/M, from which many commands were taken).
Up to MS-DOS 6.22 the commands still used only 8 characters (see the List of all MS-DOS commands).
The filename also explains why the commands are single words: a space was for a long time a no-no in filenames, and even when filenames longer than 8 characters started to be used, most systems had problems with a space in a filename.
Historical note: some systems had even shorter filenames, which restricted the length of commands even more. (RT-11 on the PDP-11 is one example: 6+3 characters per filename.)
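A quick way to see this on a DOS machine (C:\DOS is only the common default install directory, assumed here): external commands are simply 8.3-named .COM and .EXE files on the path, while DIR, CD and the like are internal commands built into COMMAND.COM.
C:\> dir c:\dos\*.com /w       (external commands - each one is just an 8.3 file)
C:\> xcopy a:\*.* c:\backup    (runs the external XCOPY.EXE found on the path)
C:\> dir /b                    (no DIR.COM anywhere - DIR lives inside COMMAND.COM)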
answered 5 hours ago
UncleBod