Evolution/History of the Public Switched Telephone Network
Christopher J Wilson
American Public University System

Since the beginning of civilized mankind, there has always been a need for humans to communicate over long distances. At first this required couriers to carry letters from one destination to another. It then evolved into trained birds, and then into Morse code. Over time each of these forms of communication became outdated, because messages took too long to move from one point to another. So the telephone was created, allowing people to talk to each other over vast distances. The PSTN was designed to allow instantaneous two-way voice communication between any two people. The first telephones had no network behind them; they were all private, point-to-point lines. If someone wished to speak with another person, they would pick up their end and whistle into the receiver until someone on the other end answered; that was the very first introduction of the telephone ring. This was of course replaced with the bell sound, which is still being used to this day, although it has changed quite a bit since its creation. After the bell was created, people wanted a way to silence the device when it was not in use, so the switch hook was created, which allowed you to turn the device off while no calls were coming in. With all these new advancements added, it was only a matter of time before someone combined all the telephone wires into one central hub.

But as this new device took off, it required that each telephone be connected directly to every other telephone its owner wished to reach. All these lines eventually came together at a central hub, where an operator would receive a call and be asked to transfer it to another operator's line. This was the creation of the switchboard. Over the first few years of these new hubs, transferring calls this way required many employees to cover the vast number of calls coming in, and connecting everyone through verbal commands quickly became outdated. That is when a man by the name of Almon Strowger created the very first automatic telephone switching system in 1891. Strowger had found that calls in his town were not being handled appropriately or in a timely manner: some people of higher status were always connected faster than those of lower status.

That is when he set out to build a revolutionary system that would automatically connect people to the person they dialed: unbiased, efficient, and needing little to no human interaction. The network that grew from this work is still in use today, known as the PSTN. The new system was designed around electromechanical switches built from pawls and electromagnets. By the late 1880s, with the help of his nephew, Almon Strowger had his first working model. In this model, a moving wiper with contacts on its end swept across a bank of many other contacts, touching only the one it was required to connect to.
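To make the stepping action concrete, here is a minimal, hypothetical Python sketch of a single-stage selector; the class and method names are invented for illustration, and a real Strowger switch used a two-motion (vertical and rotary) mechanism across multiple stages.

```python
class StrowgerSelector:
    """Toy model of a single-stage Strowger stepping switch.

    Each dial pulse energizes an electromagnet that ratchets the
    wiper, via a pawl, one position along a bank of contacts.
    """

    def __init__(self, bank_size=10):
        self.bank_size = bank_size
        self.position = 0  # wiper resting off the bank

    def pulse(self):
        """Advance the wiper one contact per dial pulse."""
        if self.position >= self.bank_size:
            raise RuntimeError("wiper stepped past the contact bank")
        self.position += 1

    def dial(self, digit):
        """Dialing a digit sends that many pulses (0 sends ten)."""
        pulses = 10 if digit == 0 else digit
        for _ in range(pulses):
            self.pulse()
        return self.position  # the contact (line) now connected

    def release(self):
        """Hanging up returns the wiper to its home position."""
        self.position = 0


selector = StrowgerSelector()
print(selector.dial(7))  # wiper comes to rest on contact 7
selector.release()
```

Dialing a digit simply sends that many pulses down the line, and the wiper comes to rest on the matching contact, which is why no operator is needed.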

Although Mr. Strowger was the creator of the switched telephone network, he was not the creator of the underlying device. That device had been created earlier by a group known as Connolly and McTighe; Strowger was simply the first person to put it into application on a near-global scale. After showing that his new device could handle any number of calls, Mr. Strowger set out to make a name for himself, and with the help of two gentlemen named Joseph B. Harris and Moses A. Meyer, his company was formed under the name Strowger Automatic Telephone Exchange in October of 1891. Within twenty years the company went through another merger and was renamed Automatic Electric. That company is still up and running to this day; having gone through many changes and owners, it is now known as AG Communication Systems.

As demand for this new system increased over the years, the number of switches and lines required to handle it naturally grew, and it is still growing to this day. The first concern for this new technology was the local populace: how could everyone, locally and nationwide, make calls with very little delay? Answering that question required the rapid expansion of the new network across the U.S.

The design of this system was, and still is, always changing, converted to adapt to the ever-changing needs of its consumers. The design is hierarchical: local switches called exchanges sit at the bottom, regional switches called access tandems sit above them, and switching tandems sit at the top. Each level handles different types of calls and consumers. The bottom level handles anything local and connects directly to other bottom-level switches. The level above handles any call that must be carried long distance; it does this by handing the call to the inter-exchange carriers, or IXCs. Most of this traffic is handled by big-name companies; AT&T, Verizon, and Sprint are just some of the carriers involved. The uppermost level handles any call that has to pass between carriers: if a consumer of Sprint wishes to speak with a consumer of Verizon, this is the level that handles that call.

[Figure: the three levels of the PSTN switching hierarchy described above.]
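As an illustration of the hierarchy just described, here is a minimal, hypothetical Python sketch of how a call might be classified by the lowest level able to complete it; the Line class, its fields, and the sample values are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Line:
    exchange: str   # end office (local exchange) serving this subscriber
    carrier: str    # long-distance carrier, e.g. "Sprint"

def handling_level(caller: Line, callee: Line) -> str:
    """Pick the lowest level of the PSTN hierarchy that can complete the call."""
    if caller.exchange == callee.exchange:
        return "end office (local exchange)"
    if caller.carrier == callee.carrier:
        return "access tandem (long distance, single carrier)"
    return "switching tandem (call crosses between carriers)"

a = Line(exchange="exch-1", carrier="Sprint")
b = Line(exchange="exch-1", carrier="Sprint")
c = Line(exchange="exch-9", carrier="Verizon")
print(handling_level(a, b))  # end office (local exchange)
print(handling_level(a, c))  # switching tandem (call crosses between carriers)
```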

With consumers wishing to talk to anyone in any state across the U.S., a vast set of what are known today as area codes was required. This allowed people to dial a ten-digit number and reach the desired party. Before this, numbers were only seven digits, and after a short period of time the pool of available telephone numbers was quickly exhausted. Phone companies had to devise a plan that would allow them to reuse any seven-digit number as many times as they wished while keeping each subscriber's full number unique.

To help route traffic over the PSTN, a new system was created called the North American Numbering Plan. Under it, a three-digit code was assigned to each state, with some states receiving more than one based on the number of consumers in the area; in the state of California there are at least five different area codes, if not more. New exchanges were created to organize the different NPAs (numbering plan areas) and to make routing easier to follow. Breaking down a telephone number may look hard at first but is quite easy to learn. The first three digits are the area code, which routes the call to the correct region. The next three digits determine which exchange switch the call will be directed into. The final four digits simply identify the end user's line within that exchange. When the Bell System was finally broken up, the nation was divided into LATAs, or Local Access and Transport Areas; these areas determine whether a long-distance call can be handled by the local exchange carrier or must be taken over by an inter-exchange carrier. The LEC uses its end-office network to determine where a call will be handled. Of course, this type of system could never stay around forever; it would eventually be put to rest and allow a better system to take its place. That is where SS7 comes into play. It evolved from Common Channel Interoffice Signaling No. 6, or CCIS6, which began in the late 1960s and moved call signaling off the voice circuits of the switch. CCIS6 helped solve two different problems that were still present in the traditional public switched telephone network. The first is that CCIS is "out-of-band," which reduced fraud: anyone who tried to use certain tones to manipulate the system for fraudulent purposes had a much harder time doing so.
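As a quick illustration of the breakdown just described, here is a minimal Python sketch that splits a ten-digit number into its three fields; the sample number is made up.

```python
def parse_nanp(number: str):
    """Split a ten-digit NANP number into its routing fields."""
    digits = "".join(ch for ch in number if ch.isdigit())
    if len(digits) != 10:
        raise ValueError("expected a ten-digit NANP number")
    return {
        "area_code": digits[0:3],   # NPA: routes the call to the correct region
        "exchange":  digits[3:6],   # selects the end-office switch
        "line":      digits[6:10],  # the end user's line within that exchange
    }

print(parse_nanp("(916) 555-0123"))
# {'area_code': '916', 'exchange': '555', 'line': '0123'}
```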

The second reason was the problem of call attempts versus call completions, which would tie up first the switchboard operators and later the automatic switches. When a call was made, the circuit it used was tied up for the full duration of the attempt. If a person repeatedly tried to make a connection, that circuit was occupied the entire time, making it more difficult for the system to route other calls since one or more circuits were already in use; lines would back up until the few open circuits became available. By 1977, CCS had been deployed in many toll networks, and by the 1980s SS7 had been defined by the International Telegraph and Telephone Consultative Committee, or CCITT. AT&T was the first company to have 80% of its intertoll trunks served by CCIS. It allowed calls to be set up faster and torn down faster, and signals could be sent in both directions at the same time while the call was still being set up. By 1987, installation and trials had begun at many companies across the U.S. SS7 improved CCIS6 in almost every aspect possible, including the bit rates CCIS6 used, and the multiple layers of SS7 allowed any one layer to be changed at will without much trouble for the rest of the system. The greater signaling capacity allowed calls to be set up much faster, trunk time and usage became far more efficient, and it even lowered the fraud rate for the entire system, making it much safer for consumers to use.
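To make the attempts-versus-completions problem concrete, here is a small, hypothetical Python sketch comparing the voice-trunk time consumed under in-band signaling, where every attempt seizes a trunk during setup, against out-of-band signaling, where setup travels on a separate channel; the setup and call durations are invented figures.

```python
SETUP_SECONDS = 15   # assumed signaling/setup time per attempt
TALK_SECONDS = 180   # assumed length of a completed call

def trunk_seconds(attempts, completions, out_of_band):
    """Voice-trunk seconds consumed by a batch of call attempts."""
    seconds = completions * TALK_SECONDS
    if not out_of_band:
        # In-band: every attempt, failed or not, seizes a voice
        # trunk for the whole setup phase.
        seconds += attempts * SETUP_SECONDS
    return seconds

# 100 attempts, only 60 complete (busy lines, no answers).
print(trunk_seconds(100, 60, out_of_band=False))  # 12300
print(trunk_seconds(100, 60, out_of_band=True))   # 10800
```

In this toy example, out-of-band signaling frees up the trunk time that failed attempts would otherwise have wasted, which is exactly why circuits stopped backing up.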

With the help of SS7, the end offices were not only connected to each other but also connected to the SS7 network, allowing long-distance calls to be transferred much faster, since it could make an end office in Georgia appear to be connected directly to an end office in Maine. This system stayed in use for a long time, but of course people were always looking for ways to expand and improve upon it. After the rise of SS7 and its seamless switching and management of calls, switching intelligence could move away from the switch itself and out into the wider network. This laid the path for the emergence of the AIN, or Advanced Intelligent Network, and its purpose was very simple: separate service control from the switch, allowing the switch's owner to offer better services and run call-handling logic from many more places.

There are two primary methods of connecting switching nodes. The first approach is a mesh topology, in which every node is connected to every other node. The more scalable approach is a hierarchical tree, in which nodes are aggregated as the hierarchy rises from the subscriber access points to the top of the tree. Both methods are used in the PSTN, with the choice driven by cost and usage.
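The cost trade-off is easy to quantify: a full mesh of n nodes needs n(n-1)/2 links, while a tree needs only n-1. The following minimal Python sketch shows how quickly the gap grows.

```python
def mesh_links(n: int) -> int:
    """A full mesh connects every pair of nodes: n(n-1)/2 links."""
    return n * (n - 1) // 2

def tree_links(n: int) -> int:
    """A tree connects n nodes with exactly n - 1 links."""
    return n - 1

for n in (10, 100, 1000):
    print(n, mesh_links(n), tree_links(n))
# 10 45 9
# 100 4950 99
# 1000 499500 999
```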

