This is NOT any attempt to be an all-inclusive computer industry glossary. In fact, to be honest, it might be more humorous than useful, but if you aren't careful, you might learn something...
Atari: Maker of early computers such as the Atari 400 and 800, which brought computers home for low prices, and later the Atari ST, which again earned attention for power at a low price, but also at a low quality...
Altair: One of the very earliest computers, although your typical kid would in no way recognize it as a computer, for it basically consisted of a large number of lights and switches. Very good at balancing the checkbook -- just set all switches to zero, for it pretty well drained your checking account when it came out in the mid 1970s. For reference, an Altair with 4k of RAM was a BIG system...
Amiga: A computer originally developed by an independent company, which was purchased by Commodore before it hit the market place in the mid-to-late 1980s. A very graphical machine, developed a very loyal following even before it hit the market. The temptation would be to compare the fierce loyalty of Amiga owners to that of Macintosh owners, but while the loyalty is comparable, most Amiga fans were poor college students, whereas the Macintosh had a sizable following in business. Hint: Poor college students are POOR! Not a good market for most products! A very notable trait of the early Amigas: They did their graphics using NTSC (U.S. television) timing standards, which meant with VERY little effort, they could be used to create very good animations and graphic displays which could be recorded OR BROADCAST using standard video equipment. One of the only business niches the Amiga seemed to have penetrated was the TV news and weather broadcast business as a graphics generation system. The company (Commodore) faded a few years back, but the Amiga name is making some kind of comeback, exactly what, I'm not sure. I could look it up, but based on past track records, I must say don't believe ANYTHING until it ships and is evaluated. Microsoft may have learned much of what they know of vaporware and unkept promises from Amiga.
Asynchronous Communications, or ASYNC: Typically, RS-232 serial communications. The characters can be sent or received at any time, without regard to fitting into a rigid timing frame. This is done by means of a start bit and a stop bit -- transitions which indicate the data is coming. While it is convenient for many applications and the protocols are simple, the start and stop bits add approximately 20% overhead to the data stream.
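The framing above can be sketched in a few lines of Python (a toy illustration, nothing like a real UART API): each 8-bit character picks up a start bit and a stop bit, which is exactly where the roughly 20% overhead comes from.

```python
# Toy sketch of async serial framing: each 8-bit character is wrapped
# in a start bit and a stop bit, so 10 bits go on the wire for every
# 8 bits of data.

def frame_byte(value):
    """Return the line-level bit sequence for one async character:
    start bit (0), 8 data bits LSB-first, stop bit (1)."""
    data_bits = [(value >> i) & 1 for i in range(8)]  # LSB first
    return [0] + data_bits + [1]

bits = frame_byte(ord("A"))               # 'A' = 0x41
overhead = (len(bits) - 8) / len(bits)
print(bits)       # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
print(overhead)   # 0.2 -- the ~20% mentioned above
```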
ATM: Probably the most over-used three
letters in the computer industry. Most commonly: Automatic Teller
Machine: Those wondrous devices which dispense money for new
computer stuff on demand. Unfortunately, using computer technology,
they tend to do TOO accurate a job of tracking the money you extract from them.
Asynchronous Transfer Mode: One of those Computer Industry three-letter combinations that when you find out what it means, doesn't help you in the slightest. ATM is the Up and Coming thing in high-speed network backbones and will soon be attaching all computers in the world. And, it has been this for at least five years (keep in mind that a computer "generation" is about two years...so ATM has been right around the corner for something like 50 "computer years"). Yeah, right. ATM probably deserves the award for the product with the most hype that actually happened, and the whole world still said "yeah, so what?". Everything else keeps getting faster, better, and cheaper faster than ATM.
Adobe Type Manager: Now somewhat forgotten, ATM is a technology Adobe came out with to display PostScript fonts on your computer's screen and to allow them to be printed to non-PostScript printers.
For quite some time, I was trying to figure out what Adobe Type Manager or Automatic Teller Machines had to do with networking...
Artificial Intelligence: Years ago, it
was hoped that computers would someday be taught to think, and this would
be Artificial Intelligence or "AI". In the 1980s, as computers steadfastly
refused to think for themselves and instead insisted upon doing as instructed,
AI was given a new definition: Being able to do things that couldn't be
done by computers a few years ago, and thus, AI became a success!
People in the computer industry rarely admit defeat, they just redefine
success. Now, once again, people are starting to talk about "thinking
computers". Personally, the more we learn about how the brain works,
the more I believe the digital computer won't be able to do it, any more
than a hand saw can be used as a delivery truck. Wrong tool for the job.
Byte: A collection of 8 bits running around together in some sort of order. Often, one byte can represent one character of text. Since a byte is 8 bits, and each bit has two possible states, there are 2^8 = 256 distinct single byte values, for example, 0 through 255 (or -128 to 127, or 7 through 262, or 256 different colors, or anything else you wish to define the bit combinations to mean). Note: Some older (1970s vintage) literature defines the byte as the basic unit of information of a particular computer design. So, by this definition, a PDP-11 has a 16 bit byte, a PDP-8 has a 12 bit byte, an IBM PC has either an 8 or 16 bit byte (depending on what you count), a Commodore 64 has an 8 bit byte, etc. In the late 70s, some kind of consensus was reached that a byte is 8 bits, and the basic unit of information of a processor became defined as the "Word length". In the 1980s, an attempt was made to define a "Word" as two bytes, or 16 bits, but if someone says "Word" and you really care, you might want to ask how big of a word they mean.
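The "anything you wish the bit combinations to mean" point is easy to demonstrate in Python (a quick illustration only): the same 8 bits yield 256 distinct values no matter which convention you read them with.

```python
# The same 8 bits read three different ways -- it's purely a matter of
# convention whether they mean 0..255, -128..127, or even 7..262.

raw = 0b11111111                             # all eight bits set

unsigned = raw                               # the 0..255 convention
signed = raw - 256 if raw > 127 else raw     # two's complement, -128..127
offset_7 = raw + 7                           # the joking 7..262 convention

print(unsigned, signed, offset_7)            # 255 -1 262

# However you label them, 8 bits give exactly 2**8 distinct values:
assert len({n & 0xFF for n in range(256)}) == 256
```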
Bug: What we have when a computer doesn't work
as we hope. According to computer lore, the term came from a problem
discovered in the Mark II electro-mechanical computer at Harvard
in the 1940s. A program malfunction was traced to a moth which had
been smashed in the relay contacts, preventing their proper operation.
This means the first debugging tool was a pair of tweezers. (An early
small computer debugging tool was named "DDT"). This story may
be true, but it isn't likely to be the source of the word "bug" in systems
failures, as I have seen people cite references to "bugs in the system"
long before this story allegedly took place.
Consultant: An otherwise unemployable jerk, brought in by management to tell them the exact same things their underlings have been saying for months, but since they paid this guy somewhere between $100 and $500/hr to tell them, they are more likely to listen to his faulty advice than to listen to the faulty advice of their employees.
Cracker: One who gets pleasure out of breaking into or otherwise damaging other people's computer systems. The truly ignorant (including the media) praise the skills of these people. Sorry, but the computer skills of a typical cracker are about on par with the artistic skills of the typical graffiti vandal. Most work off scripts guiding them step by step through the process of breaking into the target machine, or use the ignorance of the maintainers of the system to achieve their goal (one does not praise the cleverness of a burglar who noticed the back door was unlocked, and the key to the front door was under the door mat!) These people deserve no praise; if a cracker wishes to prove they are skilled with the computer, there are many things they can do that are creative and productive. Clever and creative people don't have to turn to destruction and vandalism. Sometimes, the term "Hacker" is used synonymously; this is wrong. Hackers are the creative and constructive true masters of the computer. They have better things to do than to crack systems...unless it is at the owner's request as a security check. See Script Kiddie
Caffeine: One of the Geek's four basic food groups. The others include fat, sugar, and cholesterol.
CP/M: "Control Program/Monitor",
the very popular disk operating system developed by Digital Research.
CP/M-80 was an 8 bit OS capable of running on 8080 processors with as little
as 16K of RAM (the more famous CP/M v2.2 required 32K). Typically,
the OS took up less than 15k of RAM,
which sounds impressive until you remember the processor maxed out at 64k.
Later versions of this well-designed and well-implemented OS included MP/M
(a multi-user version!), CP/M-86 (16 bit version for 8086), CP/M 68k (for
the Motorola 68000 series), Concurrent CP/M (Multi-tasking), Concurrent
DOS (multi-tasking with MS-DOS emulation and windowing).
Disk: A storage system for computers. There are two types of disks (not discs...) in common use on computers: Floppy and Hard. Floppy disks are the limited storage devices used to initially boot up your computer, and used by many as a convenient way to lose data and spread viruses. The name "Floppy" comes from the flexible mylar used as a backing for the magnetic material, whereas hard disks use rigid aluminum platters. Many people mistakenly consider the 3.5" floppies "Hard disks" because of their "hard" shell surrounding the media, compared to the old 5.25" and 8" floppies, which were very clearly floppy. This is incorrect, but understandable if you have never shredded a 3.5" disk to see the guts.
Disc: Apparently, CD-ROMs are 'Discs', while hard drives and floppies are 'Disks'. Supposedly there is some elaborate reasoning behind this distinction; I think it is silly. In my book, a disk is a round, flat thing, and a 'Disc' is a round, flat thing attempting to look sophisticated.
Dynamic RAM: The most popular kind of memory
in computers today. There are two basic divisions of RAM: Static
and Dynamic. Static RAM holds its data in a logic gate called a "Flip-Flop",
which holds the value as long as power is applied. In other words,
the data is held accurately as long as power remains applied and it isn't
told to change the data -- imagine a good light switch. You forcibly
flip the switch from one position to the other, and it stays there.
Dynamic RAM, on the other hand, stores data in tiny capacitors, which lose
their charge over time (typically, 1-2ms -- that's not much time
for a person, but for a computer, it is a relatively long time period).
Imagine here a cheap light switch -- you flip it up to on, and it starts
to sag back down to off. So, you have to keep going back, and if
the switch is on, you have to push it back up towards on, and do your work
in between runs to the switch. Dynamic RAM sounds very inefficient,
and it DOES take a performance penalty on the system, but it has MUCH greater
chip densities than Static RAM, meaning lower costs and greater storage,
so Dynamic RAM is the standard. Now some people might say "Oh, but
DRAM isn't the standard any more!" Well, before the current age of
micro-specializing terms that used to be more broad, Dynamic RAM just was
any kind of RAM which required periodic refreshes. Now, they have
subdivided DRAM into SDRAM, EDO DRAM, RDRAM, etc. Yes, they are
substantially different in compatibility, but they use much the same basic
storage technology (the primary difference is in how the data is retrieved
from the storage and delivered to the processor).
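The "cheap light switch" analogy above can be turned into a toy model (pure illustration -- nothing like real silicon, and all the numbers here are made up): each cell's charge sags every tick, and a refresh pass must re-write any cell still holding a 1 before it decays past the read threshold.

```python
# Toy DRAM model: charged cells leak every tick; a periodic refresh
# reads each cell and re-writes it at full strength -- the work the
# memory controller does behind the processor's back.

CHARGE_FULL, THRESHOLD, LEAK = 100, 50, 10   # arbitrary illustrative units

cells = [CHARGE_FULL, 0, CHARGE_FULL]        # stored bits: 1, 0, 1

def tick(cells):
    """One time step: every charged cell leaks a little."""
    return [max(c - LEAK, 0) for c in cells]

def refresh(cells):
    """Re-write each cell: anything still above threshold reads as 1."""
    return [CHARGE_FULL if c > THRESHOLD else 0 for c in cells]

for _ in range(4):          # four ticks of leakage...
    cells = tick(cells)
cells = refresh(cells)      # ...then a refresh, just in time
print(cells)                # [100, 0, 100] -- the data survives
```

Wait too long between refreshes (here, six or more ticks) and the charged cells sag below the threshold -- the data is simply gone, which is why the refresh is not optional.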
EEPROM: Electrically Erasable Programmable Read Only Memory. VERY similar to an EPROM, except the chip can be erased by electricity, rather than with ultraviolet light. Unlike an EPROM, which must be erased all at once, a true EEPROM can typically be erased and reprogrammed one byte at a time.
Endless Loop: See Loop, Endless.
Geek: Social undesirable who, if not for the computer industry, would often be considered unemployable, but because of the computer industry, they often take home more money than people who work for a living.
The "standard" Internet Search Engine.
Originated the concept of the "just search" screen, with minimal
graphics, and minimal "other stuff".
Gone from being "just another search engine" to being a verb ("to
Google" for something, as in "Google for it, dammit!"), and the closest
thing to the index of all human knowledge.
Also provides many other services, such as mail, mapping, advertising,
and much more.
Hacker: A person who has mastered the art
and science of making computers and software do much more than the original
designers intended. A true hacker is to be respected, even if socially
a bit undesirable. Not to be confused with "crackers".
A true hacker can find plenty of constructive projects to work on; breaking
things is more a mark of children (of any age).
Internet: The most effective productivity
destroyer in business today. Designed originally as a way for geeks
to communicate with each other, it was later found that geeks were barely
able to communicate, so it was opened up for the general public.
Loop, Endless: See Endless Loop.
Microsoft: Originator of the mass market
BASIC interpreter. At one point, someone had counted MS as being
responsible for over 200 different dialects of the BASIC language on a
host of very different computers (and that was BEFORE the IBM PC came out!).
Legend has it that Bill Gates's first BASIC Interpreter (for the Altair)
was shipped sans copyright notice, and thus, was freely distributed at
the computer clubs in the mid 1970s, probably spreading the MS name far
further than it ever would have if Bill had remembered to copyright his
$150 program. In 1981, they purchased the code for what became MS-DOS
v1, basically a 16 bit re-write and strip down of CP/M-80.
The company also deserves credit for a FORTRAN compiler which gave erroneous
results on simple math problems, Windows v1, v2, 386, v3.0, and the more
well known 3.1, 95, 98, NT, and 2000. Noted for blatantly unfair monopolistic
trade practices, and software which sells better than it works.
Nerd: Something akin to the term 'Geek',
the exact distinction has been lost (at least by me, and this is my list!).
Ages and ages ago, a group of friends, some of them Electrical Engineering
majors and others Computer Science majors, decided that EEs were one, and CS
were the other. I don't recall which was which. I have also
seen it said a nerd is someone who is immersed in technology, a geek is
a nerd that enjoys it. Whatever.
Open Software: 1) Software published
with complete source code. 2) A religion practiced by advocates of
definition 1. Advocates of Open Software say because the source code
to the programs is in the user's hands, bugs can be fixed by talented users,
resulting in better quality software. Critics of the Open Software
movement point out that there is no financial motivation for the programmers
to improve or perfect their programs. Both sides may have some points:
the Open Software people are pretty clearly ahead in quality of
product, while the traditional software publishers are certainly making more
money, and probably making a product that is more "user friendly".
PC: Originally, a "Personal Computer", a computer used by one person, typically small enough to fit on or under a desk. IBM adopted the name for their first commercially successful small computer, the IBM PC. It has since come to mean a computer crippled by the 1981 design IBM came up with, as in "Do you prefer PCs or Macs?"
PCI: The "modern" high-performance microcomputer expansion bus. Characteristics: Resources are allocated by slot (unlike the ISA bus, where the slots were indistinguishable from each other). 32 Bit bus. Some processor independence (Intel, PowerPC, Alpha, etc). 132MBps theoretical data transfer rate.
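The 132MBps figure above isn't magic; it falls straight out of the bus width and the clock rate (a back-of-the-envelope sketch, using the round 33MHz number usually quoted):

```python
# Where PCI's 132MBps theoretical peak comes from: a 32-bit bus moving
# one transfer per clock at the standard 33MHz PCI clock.

bus_width_bytes = 32 // 8        # 32-bit bus = 4 bytes per transfer
clock_hz = 33_000_000            # standard PCI clock rate
peak = bus_width_bytes * clock_hz
print(peak // 1_000_000)         # 132 (MB per second, theoretical peak)
```

Real-world throughput is, of course, lower -- the "theoretical" in "theoretical data transfer rate" is doing a lot of work.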
PROM: Programmable, Read Only Memory.
A ROM which can be (typically) programmed at the factory (and sometimes
the field). Generically, it is any ROM whose data could be set sometime
after the chip is fabricated. It often means, however, "fusible link"
PROMs, where the data is stored in microscopic "fuses" which can be blown
(with a high voltage/current pulse) or left intact to represent the
desired data. Almost never seen anymore, for their programming time
was long (heat), the costs were high, and EPROMs got cheap.
RTFM: Geek-speak for Read The Flaming Manual. Exasperated cry of many a weary support person. Other words are often substituted for Flaming...
Recursion: See Recursion.
ROM: Read Only Memory. Memory which holds programs and data which can not be changed, and maintains its data without power. Generically, these cover PROMs, EPROMs, EEPROMs, etc., but often, it means specifically mask-programmed ROMs. These ROMs are very cheap, but they require huge quantities of identical chips, for the program is actually encoded in the masks which are used to fabricate the ROMs. Mask-programmed ROMs have virtually fallen out of existence, which is unfortunate, as they are one of the only truly permanent storage media, but only standard-setting manufacturers (IBM, Apple) could normally justify the setup costs and permanence of design of mask-programmed ROMs.
Static RAM: RAM which holds its data with nothing more than the application of power (see Dynamic RAM). Typically used only where the additional performance or lower power consumption is more important than the higher cost and lower density.
Script Kiddie: A cracker,
typically rather young of either body or mind (hence, the 'kiddie' portion),
who delights in breaking into people's computers by following someone else's
script. They typically have no understanding of what they are actually
accomplishing, except that by following the instructions, they get someplace
they know they aren't supposed to be, and thus, this is cool.
Truth in Advertising: A concept which
has no place on this page, as it does not apply to the computer industry.
USB: Universal Serial Bus. Probably a cool idea, basically, a high-speed, multi-device serial interface. Actually, closer perhaps to a simplistic network system for peripherals. Crams a lot of devices into a single interrupt, which is something we REALLY need on the PC architecture. Claims to be "plug-and-play", and comes closer than other things describing themselves that way. Currently being used for keyboards, mice, scanners, printers, digital cameras, etc. One port on the computer can be "split" into multiple ports using hubs. USB ports actually also provide a 5v power supply, so many devices can be powered directly from your computer (or the hub), so the device may connect to the outside world with only one cable.
ULOS: Unix Like Operating System. This is my own creation. This is a catch-all for all the operating systems which in one way or another emulate, look like, act like, or arguably are Unix, but can't say that in the fear of being sued by Unix System Labs or whoever "owns" Unix this week. Examples: Linux, OpenBSD, FreeBSD, NetBSD, Cromix, Dynix, Coherent. Many people use *nix as a shorthand, but you see, that leaves out a number of significant choices.
WYSIWYG: "What You See Is What You Get". Indicates that the image you see on your computer screen represents what you will see on paper. I first became aware of this phrase (not the "acronym") in 1982, in an ad for the then cutting-edge word processor, WordStar. Now, WordStar is considered an example of the antithesis of WYSIWYG. Some examples of systems nobody would argue are WYSIWYG: PostScript (the printer language), HTML (the Web language), TeX, and many main-frame page-layout applications.
XMODEM: An early file transfer protocol,
developed by Ward Christensen, an early God of the small computer world.
Basically, 128 bytes of data and a checksum or CRC were sent between the
sender and the receiver, and the receiver sent an acknowledgement back
to the sender, saying "Got it, send more". Fairly efficient in the
days of 110bps and 300bps modems, found to be rather inefficient by the
time modems had reached 2400bps and almost useless today. Still,
it was an elegant and simple protocol for its day, and most of the later
protocols basically took the idea of XMODEM and fixed the problems, rather
than starting from scratch, by improving the error detection, enlarging
the packet size and in some cases, allowing the acknowledgement for a packet
to be accepted long after later packets have been sent.
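The 128-bytes-plus-checksum packet described above can be sketched in Python (a toy framing routine only; a real implementation also handles the ACK/NAK handshake, timeouts, and the later CRC variant):

```python
# Sketch of the classic checksum-flavor XMODEM packet: a start-of-header
# byte, a block number and its complement, 128 data bytes (padded), and
# a one-byte arithmetic checksum.

SOH = 0x01   # start-of-header marker for 128-byte blocks

def build_packet(block_num, data):
    """Frame up to 128 bytes of data as one XMODEM checksum packet."""
    payload = data.ljust(128, b"\x1a")     # pad with Ctrl-Z, per convention
    checksum = sum(payload) & 0xFF         # low 8 bits of the byte sum
    header = bytes([SOH, block_num & 0xFF, (255 - block_num) & 0xFF])
    return header + payload + bytes([checksum])

pkt = build_packet(1, b"Got it, send more")
print(len(pkt))          # 132 bytes: 3 header + 128 data + 1 checksum
```

The block number and its complement let the receiver catch a garbled header cheaply: if the two bytes don't sum to 255, the packet is rejected before the data is even examined.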
Y2K: Short for "Year 2000" (and keep in mind abbreviations like that are why we got into trouble in the first place!) A hoax the computer industry put upon the rest of the world. We got rich, and you were stuffing money in our pockets. We are all happy, right? Mostly, people got caught being worried about how other people were doing their jobs, rather than doing their own jobs.
ZMODEM: An extension of YMODEM
and XMODEM most significantly, using a "Sliding window"
protocol where several packets could be sent to the receiver before an
acknowledgement is required. For example, packets 1, 2, 3, and
4 could be on their way to the receiver before the first acknowledgement
is received, and if packet 2 turned out to be corrupted, it could be resent.
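The sliding-window example above can be walked through in a few lines of Python (a toy illustration only -- no real ZMODEM framing, error detection, or timing):

```python
# Toy walk-through of the sliding-window idea: packets 1-4 all go out
# before any acknowledgement, the receiver reports packet 2 corrupted,
# and only packet 2 is resent.

to_send = [1, 2, 3, 4]
corrupted = {2}                    # pretend packet 2 arrives damaged

wire = list(to_send)               # all four leave before the first ACK
received_ok = [p for p in wire if p not in corrupted]
resend = [p for p in wire if p in corrupted]
wire += resend                     # only the bad packet goes out again

print(received_ok)                 # [1, 3, 4]
print(wire)                        # [1, 2, 3, 4, 2]
```

The payoff is that the line never sits idle waiting for an acknowledgement -- a huge win once modems got fast enough that XMODEM's stop-and-wait turnaround dominated the transfer time.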
$Id: glossary.html,v 1.7 2007/06/27 19:14:42 nick Exp $