The concept of devices that operate following a pre-defined set of instructions traces back to
Greek mythology, notably
Hephaestus and his mechanical servants
[3]. The
Antikythera mechanism was an early mechanical calculator that used gears of various sizes and configurations to determine its operation. The earliest known programmable
machines (machines whose behavior can be controlled and predicted with a set of instructions) were the
automata built by Al-Jazari in 1206.
[4] One of Al-Jazari's
robots was a boat with four automatic musicians that floated on a lake to entertain guests at royal drinking parties. Programming this
mechanism's behavior meant placing
pegs and
cams into a wooden drum at specific locations. These would then bump into little
levers that operated a
percussion instrument. The result was a small mechanical drummer that could play various rhythms and drum patterns.
[5][6] Another sophisticated programmable machine by Al-Jazari was the
castle clock, notable for its concept of
variables, which the operator could adjust as necessary (e.g., to account for the changing lengths of day and night). The
Jacquard Loom, which Joseph Marie Jacquard developed in 1801, used a series of
pasteboard cards with holes punched in them. The pattern of holes encoded the weave the loom had to follow, so the loom could produce entirely different fabrics simply by swapping in different sets of cards.
Charles Babbage adopted the use of
punched cards around 1830 to control his
Analytical Engine. The synthesis of numerical calculation, predetermined operation and output, and a way to organize and input instructions in a form relatively easy for humans to conceive and produce led to the modern development of computer programming, which accelerated through the
Industrial Revolution.
In the late 1880s,
Herman Hollerith invented the recording of data on a medium that could then be read by a machine. The earlier machine-readable media described above had carried control instructions, not data. "After some initial trials with paper tape, he settled on
punched cards..."
[7] To process these punched cards, first known as "Hollerith cards," he invented the
tabulator and the
keypunch machines. These three inventions were the foundation of the modern information processing industry. In 1896, he founded the
Tabulating Machine Company (which later became the core of
IBM). The addition of a
control panel to his 1906 Type I Tabulator allowed it to do different jobs without having to be physically rebuilt. By the late 1940s, there were a variety of plug-board programmable machines, called
unit record equipment, that performed data-processing tasks such as card reading. Early computer programmers used plug-boards for the variety of complex calculations requested of the newly invented machines.
Data and instructions could be stored on external
punch cards, which were kept in order and arranged in program decks.
The invention of the
von Neumann architecture allowed computer programs to be stored in
computer memory. Early programs had to be painstakingly crafted using the instructions of the particular machine, often in
binary notation. Every model of computer was likely to need different instructions to do the same task. Later,
assembly languages were developed that let the programmer specify each instruction in a text format, entering abbreviations for each operation code instead of a number and specifying addresses in symbolic form (e.g., ADD X, TOTAL). In 1954,
Fortran was invented; it was the first high-level programming language to have a functional implementation.
[8][9] It allowed programmers to specify calculations by entering a formula directly (e.g.,
Y = X*2 + 5*X + 9). The program text, or
source, is converted into machine instructions using a special program called a
compiler.
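To make the idea concrete, here is a minimal sketch of a complete program built around that formula, written in fixed-form Fortran (a later FORTRAN 77 style rather than the original 1954 dialect, so that it still compiles today); the sample input value and the program name are illustrative assumptions, not from the original:

      PROGRAM FORMLA
C     Evaluate the formula from the text for a sample value of X.
C     Under Fortran's implicit typing rules, X and Y are REALs.
      X = 3.0
      Y = X*2 + 5*X + 9
      PRINT *, 'Y =', Y
      END

The point of the example is what is absent: no registers, no operation codes, no machine addresses. A compiler translates this source into the equivalent sequence of machine instructions for whatever computer it targets.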
Many other languages were developed, including some for commercial programming, such as
COBOL. Programs were mostly still entered using punched cards or
paper tape. (See
computer programming in the punch card era). By the late 1960s,
data storage devices and
computer terminals became inexpensive enough that programs could be created by typing directly into the computers.
Text editors were developed that allowed changes and corrections to be made much more easily than with punch cards.
As time has progressed, computers have made giant leaps in processing power. This has brought about newer programming languages that are more
abstracted from the underlying hardware. Although these
high-level languages usually incur greater
overhead, the increase in speed of modern computers has made the use of these languages much more practical than in the past. These increasingly abstracted languages typically are easier to learn and allow the programmer to develop applications much more efficiently and with less code. However, high-level languages are still impractical for a few programs, such as those where low-level hardware control is necessary or where processing speed is at a premium.
Throughout the second half of the twentieth century, programming was an attractive career in most developed countries. Some forms of programming have been increasingly subject to
offshore outsourcing (importing software and services from other countries, usually at a lower wage), making programming career decisions in developed countries more complicated, while increasing economic opportunities in less developed areas. It is unclear how far this trend will continue and how deeply it will impact programmer wages and opportunities.