A Very Brief History of Computer Science
Written by Jeffrey Shallit for CS 134 at the University of Waterloo in the summer of 1995.
This little web page was hastily stitched together in a few days. Perhaps eventually I will get around to doing a really good job. Suggestions are always welcome.
A translation of this web page into French has been prepared by Anne Dicky at the University of Bordeaux.

Before 1900
People have been using mechanical devices to aid calculation for thousands of years. For example, the abacus probably existed in Babylonia (present-day Iraq) about 3000 B.C.E. The ancient Greeks developed some very sophisticated analog computers. In 1901, an ancient Greek shipwreck was discovered off the island of Antikythera. Inside was a salt-encrusted device (now called the Antikythera mechanism) that consisted of rusted metal gears and pointers. When this c. 80 B.C.E. device was reconstructed, it proved to be a mechanism for predicting the motions of the stars and planets.
John Napier (1550-1617), the Scottish inventor of logarithms, invented Napier's rods (sometimes called "Napier's bones") c. 1610 to simplify the task of multiplication.
In 1641 the French mathematician and philosopher Blaise Pascal (1623-1662) built a mechanical adding machine. Similar work was done by Gottfried Wilhelm Leibniz (1646-1716). Leibniz also advocated use of the binary system for doing calculations.
Recently it was discovered that Wilhelm Schickard (1592-1635), a graduate of the University of Tübingen (Germany), constructed such a device in 1623-4, before both Pascal and Leibniz. A brief description of the device is contained in two letters to Johannes Kepler. Unfortunately, at least one copy of the machine burned up in a fire, and Schickard himself died of bubonic plague in 1635, during the Thirty Years' War.
Joseph-Marie Jacquard (1752-1834) invented a loom that could weave complicated patterns described by holes in punched cards. Charles Babbage (1791-1871) worked on two mechanical devices: the Difference Engine and the far more ambitious Analytical Engine (a precursor of the modern digital computer), but neither worked satisfactorily. (Babbage was a bit of an eccentric -- one biographer calls him an "irascible genius" -- and was probably the model for Daniel Doyce in Charles Dickens' novel, Little Dorrit. A little-known fact about Babbage is that he invented the science of dendrochronology -- tree-ring dating -- but never pursued his invention. In his later years, Babbage devoted much of his time to the persecution of street musicians (organ-grinders).) The Difference Engine can be viewed nowadays in the Science Museum in London, England.
One of Babbage's friends, Ada Augusta Byron, Countess of Lovelace (1815-1852), sometimes is called the "first programmer" because of a report she wrote on Babbage's machine. (The programming language Ada was named for her.)
William Stanley Jevons (1835-1882), a British economist and logician, built a machine in 1869 to solve logic problems. It was "the first such machine with sufficient power to solve a complicated problem faster than the problem could be solved without the machine's aid." (Gardner) It is now in the Oxford Museum of the History of Science.
Herman Hollerith (1860-1929) invented the modern punched card for use in a machine he designed to help tabulate the 1890 census.

1900 - 1939: The Rise of Mathematics
Work on calculating machines continued. Some special-purpose calculating machines were built. For example, in 1919, E. O. Carissan (1880-1925), a lieutenant in the French infantry, designed and had built a marvelous mechanical device for factoring integers and testing them for primality. The Spaniard Leonardo Torres y Quevedo (1852-1936) built some electromechanical calculating devices, including one that played simple chess endgames.
In 1928, the German mathematician David Hilbert (1862-1943) addressed the International Congress of Mathematicians. He posed three questions: (1) Is mathematics complete; i.e. can every mathematical statement be either proved or disproved? (2) Is mathematics consistent, that is, is it true that statements such as "0 = 1" cannot be proved by valid methods? (3) Is mathematics decidable, that is, is there a mechanical method that can be applied to any mathematical assertion and (at least in principle) will eventually tell whether that assertion is true or not? This last question was called the Entscheidungsproblem.
In 1931, Kurt Gödel (1906-1978) answered two of Hilbert's questions. He showed that every sufficiently powerful formal system is either inconsistent or incomplete. Also, if an axiom system is consistent, this consistency cannot be proved within itself. The third question remained open, with 'provable' substituted for 'true'.
In 1936, Alan Turing (1912-1954) provided a solution to Hilbert's Entscheidungsproblem by constructing a formal model of a computer -- the Turing machine -- and showing that there were problems such a machine could not solve. One such problem is the so-called "halting problem": given a Pascal program, does it halt on all inputs?

1940's: Wartime brings the birth of the electronic digital computer
The calculations required for ballistics during World War II spurred the development of the general-purpose electronic digital computer. At Harvard, Howard H. Aiken (1900-1973) built the Mark I electromechanical computer in 1944, with the assistance of IBM.
Military code-breaking also led to computational projects. Alan Turing was involved in the breaking of the code behind the German machine, the Enigma, at Bletchley Park in England. The British built a computing device, the Colossus, to assist with code-breaking.
At Iowa State University in 1939, John Vincent Atanasoff (1904-1995) and Clifford Berry designed and built an electronic computer for solving systems of linear equations, but it never worked properly.
Atanasoff discussed his invention with John William Mauchly (1907-1980), who later, with J. Presper Eckert, Jr. (1919-1995), designed and built the ENIAC, a general-purpose electronic computer originally intended for artillery calculations. Exactly what ideas Mauchly got from Atanasoff is not completely clear, and whether Atanasoff or Mauchly and Eckert deserve credit as the originators of the electronic digital computer was the subject of legal battles and ongoing historical debate. The ENIAC was built at the Moore School at the University of Pennsylvania, and was finished in 1946.
In 1944, Mauchly, Eckert, and John von Neumann (1903-1957) were already at work designing a stored-program electronic computer, the EDVAC. Von Neumann's report, "First Draft of a Report on the EDVAC", was very influential and contains many of the ideas still used in most modern digital computers, including a mergesort routine. Eckert and Mauchly went on to build UNIVAC.
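The mergesort idea described in the EDVAC report is still taught today. Here is a minimal modern sketch in Python (my own illustration, not code from the report): recursively split the list in half, sort each half, and merge the sorted halves.

```python
def merge_sort(items):
    """Sort a list by recursively splitting it and merging sorted halves."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```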
Meanwhile, in Germany, Konrad Zuse (1910-1995) built the first operational, general-purpose, program-controlled calculator, the Z3, in 1941.
In 1945, Vannevar Bush published a surprisingly prescient article, "As We May Think", in the Atlantic Monthly about the ways information processing would affect the society of the future.
Maurice Wilkes (b. 1913), working in Cambridge, England, built the EDSAC, a computer based on the EDVAC. F. C. Williams (b. 1911) and others at Manchester University built the Manchester Mark I, one version of which was working as early as June 1948. This machine is sometimes called the first stored-program digital computer.
The invention of the transistor in 1947 by John Bardeen (1908-1991), Walter Brattain (1902-1987), and William Shockley (1910-1989) transformed the computer and made possible the microprocessor revolution. For this discovery they won the 1956 Nobel Prize in physics. (Shockley later became notorious for his racist views.)
Jay Forrester (b. 1918) invented magnetic core memory c. 1949.

1950's
Grace Murray Hopper (1906-1992) invented the notion of a compiler, at Remington Rand, in 1951. Earlier, in 1947, Hopper found the first computer "bug" -- a real one -- a moth that had gotten into the Harvard Mark II. (Actually, the use of "bug" to mean defect goes back to at least 1889.)
John Backus and others developed the first FORTRAN compiler in April 1957. LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. Alan Perlis, John Backus, Peter Naur and others developed Algol.
In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1959.
Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer. (Dijkstra is famous for his caustic, opinionated memos, including his scathing opinions of some programming languages.)
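Dijkstra's shortest-path algorithm is now a textbook staple. A compact sketch in Python using a priority queue (a later refinement; Dijkstra's 1956 version predates heaps), with an assumed adjacency-list representation mapping each node to a list of (neighbor, weight) pairs:

```python
import heapq

def dijkstra(graph, source):
    """Return shortest-path distances from source to every reachable node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # → {'a': 0, 'b': 1, 'c': 3}
```

Note that the direct edge a-c costs 4, but the algorithm correctly finds the cheaper route through b (1 + 2 = 3).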
In a famous paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" using a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking".
In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray. Overt homosexuality was taboo in 1950's England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide.

1960's
In the 1960's, computer science came into its own as a discipline. In fact, the term was coined by George Forsythe, a numerical analyst. The first computer science department was formed at Purdue University in 1962. The first person to receive a Ph.D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965.
Operating systems saw major advances. Fred Brooks at IBM designed System/360, a line of different computers with the same architecture and instruction set, from small machine to top-of-the-line. Edsger Dijkstra at Eindhoven designed the THE multiprogramming system.
At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.
Many new programming languages were invented, such as BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)).
The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.
Proving correctness of programs using formal methods also began to be more important in this decade. The work of Tony Hoare played an important role. Hoare also invented Quicksort.
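Hoare's Quicksort is easy to sketch in Python. This simplified version builds new lists rather than partitioning in place as Hoare's original did, but it shows the same idea: pick a pivot, split around it, and recurse.

```python
def quicksort(items):
    """Hoare's divide-and-conquer sort, in simple (non in-place) form."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # → [1, 1, 2, 3, 4, 5, 6, 9]
```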
Douglas C. Engelbart invented the computer mouse c. 1968, at SRI.
Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971.
A rigorous mathematical basis for the analysis of algorithms began with the work of Donald Knuth (b. 1938), author of the three-volume treatise The Art of Computer Programming.

1970's
The theory of databases saw major advances with the work of Edgar F. Codd on relational databases. Codd won the Turing award in 1981.
Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (b. 1941). Brian Kernighan and Ritchie together developed C, an influential programming language.
Other new programming languages, such as Pascal (invented by Niklaus Wirth) and Ada (developed by a team led by Jean Ichbiah), arose.
The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time.
The 1970's also saw the rise of the supercomputer. Seymour Cray (b. 1925) designed the CRAY-1, which was first shipped in March 1976. It could perform 160 million operations per second. The Cray X-MP came out in 1982. Cray Research was later taken over by Silicon Graphics.
There were also major advances in algorithms and computational complexity. In 1971, Steve Cook published his seminal paper on NP-completeness, and shortly thereafter, Richard Karp showed that many natural combinatorial problems were NP-complete. Whit Diffie and Martin Hellman published a paper that introduced the theory of public-key cryptography, and a public-key cryptosystem known as RSA was invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.
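The RSA idea can be demonstrated with the classic tiny-prime example (utterly insecure, illustrative only): form a modulus from two primes, publish an encrypting exponent, and keep the decrypting exponent secret.

```python
# Toy RSA with tiny primes -- illustrative only, never use primes this small.
p, q = 61, 53
n = p * q                 # 3233: the public modulus
phi = (p - 1) * (q - 1)   # 3120: Euler's totient of n
e = 17                    # public exponent, chosen coprime to phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e (Python 3.8+)

msg = 65
cipher = pow(msg, e, n)   # encrypt: msg^e mod n
plain = pow(cipher, d, n) # decrypt: cipher^d mod n recovers msg
print(cipher, plain)      # → 2790 65
```

The security rests on the fact that recovering d from the public pair (n, e) appears to require factoring n, which is believed to be hard for large n.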
In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.

1980's
This decade also saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer.
The first computer viruses are developed c. 1981. The term was coined by Leonard Adleman, now at the University of Southern California.
In 1981, the first truly successful portable computer was marketed, the Osborne I. In 1984, Apple first marketed the Macintosh computer.
In 1987, the US National Science Foundation started NSFnet, precursor to part of today's Internet.

1990's and Beyond
Parallel computers continue to be developed.
Biological computing, with the recent work of Len Adleman on doing computations via DNA, has great promise. The Human Genome Project is attempting to sequence all the DNA in a single human being.
Quantum computing gets a boost with the discovery by Peter Shor that integer factorization can be performed efficiently on a (theoretical) quantum computer.
The "Information Superhighway" links more and more computers worldwide.
Computers get smaller and smaller; nanotechnology is born.

Other Web Resources for History of Computer Science
* History of Computing web page at Virginia Tech
* Computers from the Past to the Present (A lecture by Michelle Hoyle at the University of Regina)
* A Brief History of Computer Technology
* Famous Women of Computing
* The Machine That Changed the World
* Historic Computer Images
* Charles Babbage Institute, Center for the History of Information Processing
* Turing Award Winners, 1966-1998
* The Retrocomputing Museum (old programs and programming languages)
* The ENIAC Virtual Museum at the University of Pennsylvania (under construction)
* COMMPUTERSEUM -- The Commercial Computing Museum (Waterloo, Ontario)
* The Computer Museum (Boston, Massachusetts)
* The Virtual Museum of Computing
* Museum of Obsolete Computers
* Annals of the History of Computing
* Grace Murray Hopper Celebration of Women in Computing
* Index for History of Computers
* History of the Electronic Computer
* Histoire de l'Informatique
* Theory of Computing Hall of Fame
* Frank Delaney's History of the Microcomputer

shallit@graceland.uwaterloo.ca

Similar Documents

Premium Essay

History of Computer Science

...History of Computer Science Name: Kamyll Dawn Cocon Course, Yr. & Sec.: BSMT 1-D REACTION PAPER The topic of the video which is about the history of computer was kind of interesting since it high lightened our mind about where the computer had really came from. Not only have that, it also made us understand how the computers of today became very wonderful and powerful. Before, computers only existed in the imagination of humans and were believed that creating such monstrous device was impossible. It was a huge leap in the field of computer during the 19th century when Charles Babbage developed the 1st modern computer called the difference machine. The most advantageous feature of this machine is that it reflected Babbage’s attitude of being a perfectionist. Although his work was not finished, the detailed text that was written by Ada was significant in modifying his versions and for documentary purposes of his work. The rapid increase of the American population, on the other hand, triggered the development of a machine that will help the census tabulate such population very fast. Hermin Horrith’s occupation was very helpful to the development of this machine since he used business to gain money to revise his machine which had later evolved into the international business machine. Although war causes devastation to the environment as well as the people involved, it also had contributed to the development of computers, which is the birth of ENIAC, the first large-scale...

Words: 339 - Pages: 2

Premium Essay

Computer Science History

...Computer science is the scientific and practical approach to computation and its applications. It is the systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to information, whether such information is encoded as bits in a computer memory or transcribed in genes and protein structures in a biological cell.[1] A computer scientist specializes in the theory of computation and the design of computational systems.[2] Its subfields can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory (which explores the fundamental properties of Computational and intractable problems), are highly abstract, while fields such as computer graphics emphasize real-world visual applications. Still other fields focus on the challenges in implementing computation. For example, programming language theory considers various approaches to the description of computation, whilst the study of computer programming itself investigates various aspects of the use of programming language and complex systems. Human-computer interaction considers the challenges in making computers and computations useful, usable, and universally accessible to humans. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for...

Words: 284 - Pages: 2

Premium Essay

Foundation and History of Computer Science

...INTRODUCTION Computers and computation have been around for a long time and were developed over many years with immense contributions from inventors, engineers, physicists, philosophers, mathematicians, technicians, scholars, and visionaries. Right from ancient times, when man used his fingers to count and keep record and straight to the era of ancient civilisations, computation had been paramount in almost everyday life. The Babylonians used base 60 to calculate and tell hours of the day a format which is still used up to this day, the ancient Egyptians needed math for practical problems: measuring time, flooding of the Nile, cooking and baking, book keeping and accounting, taxes. They also made the first math textbook, which contained the first trigonometry including sine, cosine, tangent, and cotangent (sin = o/h, cos = a/h. tan = o/a, cot = a/o), which has been one of the basis of mathematical calculations till date. The Greeks brought the Pythagorean theory; the Romans brought the Roman Numerals and even the Islam brought the “al jabr” which is known today as “Algebra”. The Chinese brought the remainder theorem and the Indians developed the decimal system, zero and negative numbers, and did early trigonometric work on the sine and cosine. The first computers were calculating machines and over time evolved into the digital computers, as we know them today. It has taken over 180 years for the computer to develop from an idea in Charles Babbage head into an actual computer developed...

Words: 2611 - Pages: 11

Premium Essay

Role of Information Technology

...The History of Information Technology March 2010 Draft version to appear in the Annual Review of Information Science and Technology, Vol. 45, 2011 Thomas Haigh thaigh@computer.org University of Wisconsin, Milwaukee Thomas Haigh The History of Information Technology – ARIST Draft 2 In many scholarly fields the new entrant must work carefully to discover a gap in the existing literature. When writing a doctoral dissertation on the novels of Nabokov or the plays of Sophocles, clearing intellectual space for new construction can be as difficult as finding space to erect a new building in central London. A search ensues for an untapped archive, an unrecognized nuance, or a theoretical framework able to demolish a sufficiently large body of existing work. The history of information technology is not such a field. From the viewpoint of historians it is more like Chicago in the mid-nineteenth century (Cronon, 1991). Building space is plentiful. Natural resources are plentiful. Capital, infrastructure, and manpower are not. Boosters argue for its “natural advantages” and promise that one day a mighty settlement will rise there. Speculative development is proceeding rapidly and unevenly. But right now the settlers seem a little eccentric and the humble structures they have erected lack the scale and elegance of those in better developed regions. Development is uneven and streets fail to connect. The native inhabitants have their ideas about how things should be done, which sometimes...

Words: 27274 - Pages: 110

Premium Essay

The Development of Computers

...incredible to consider that in 1969 men landed on the moon using a computer with a 32-kilobyte memory, that was only programmable by the use of punch cards. In 1973, Astronaut Alan Shepherd participated in the first computer "hack" while orbiting the moon in his landing vehicle, as two programmers back on Earth attempted to "hack" into the duplicate computer, to find a way for Shepherd to convince his computer that a catastrophe requiring a mission abort was not happening; the successful hack took 45 minutes to accomplish, and Shepherd went on to hit his golf ball on the moon. Today, the average computer sitting on the desk of a suburban home office has more computing power than the entire U.S. space program that put humans on another world (Rheingold, 2000, p. 4). Computer science has affected the human condition in many radical ways. Throughout its history, its developers have striven to make calculation and computation easier, as well as to offer new means by which the other sciences can be advanced. Modern massively-paralleled super computers help scientists with previously unfeasible problems such as fluid dynamics, complex function convergence, finite element analysis and real time weather dynamics. Likewise, our daily lives are increasingly affected by computers, not only alleviating us from menial tasks but making it possible for us to accomplish more (Rheingold, 2000, p. 7). The personal computer (PC) has revolutionized business and personal activities and even...

Words: 1809 - Pages: 8

Premium Essay

Computer Science Careers

...Information Science Careers Brian Maltass University of South Florida Introduction They say that we are living in information age, this however is a clear scandal that neither definition nor theory, of information both precise and broad enough to qualify such assertion sensible. Niels Ole Fiennemann in the year 2001 came up with Media’s general history. He said that no Society could exist where exchange and production of information is of little significance. Therefore one cannot compare information societies and industrial societies in any steady way. Information societies could be industrial societies and industrial societies could as well mean information societies. The following is a media matrix he suggested. literature cultures: writing(number systems and primary alphabets),secondly print cultures: print + speech + written texts, Second order alphabetical cultures: written texts + speech + analogue electrical media + digital media and speech based oral cultures .This paper seeks to visit the origin of the Information Technology and the developments it has undergone to become what it is today. In history of development of information technology, the paper looks into challenges that were encountered during the advancement stages. Discussion In today’s era, the crucial influence with regard to the concept of information is borrowed from information theory that was developed by Shannon alongside others. In...

Words: 2995 - Pages: 12

Free Essay

Cool

...Broward Virtual School / Florida Virtual School -High School 2010-2011 Course Offerings 2010-2011 Online High School Courses Broward Virtual School Broward County students have the opportunity to take courses for middle and high school credit taught online by Broward County teachers. Florida Legislators have made virtual education a component of parent/student choice. Broward Virtual School (BVS) has franchised the award-winning program for online learning from the Florida Virtual School, sponsored by the State of Florida. All courses are based on the Sunshine State Standards and the curriculum is directly linked to the benchmarks established by the Florida Department of School. Students may learn wherever they are, whenever they choose, maintaining a specified course pace. Students will use the Internet to participate in a learning experience quite different from the traditional school classroom. BVS serves full-time students as well as students who take courses at traditional high and middle schools. Broward County Schools will offer courses not otherwise available to students at their schools, such as select Advanced Placement classes. Any student eligible to enroll in a Broward County middle or high school may select the online environment. Successful online students are self-disciplined, motivated to learn, possess time management skills, and 21st century technology skills. Course Offerings Students may register for any BVS course offering (contingent...

Words: 1074 - Pages: 5

Free Essay

Subjects Information

...Academic assistance is the defined as an activity for teaching available for students in all subjects including science, mathematics, management, business studies, business and law and information technology. In the academic assistance, all subject related helps is being provided to the students to meet specific subject related queries. It is defined as a tutoring practice, which provides support to the students in solving particular subject related queries. Apart from this, this makes the learning process easy for the students through providing ready to learn or tailor made notes and helps in solving specific subject problems. Basically, academic assistance is the new method of tutoring by a large number of institutions to facilitate the students in their studies. Academic assistance encompasses all types of subjects from English to Management. In the academic content development, a number of subjects such as business studies, marketing, accounting and financial management, operations management, qualitative techniques, history, science, statistics, dissertation and its proposal development, human resources and organizational behaviour are covered. In pertinent to the given subjects, academic assistance is a kind of help provided to the students in developing particular topic related subjects content. Academic assistance is not only limited to provide a notes specific to subjects, but also it covers a full helps in completing the project steps such as authentic data collection...

Words: 5329 - Pages: 22

Premium Essay

Pert

...Computer science From Wikipedia, the free encyclopedia Jump to: navigation, search Computer science or computing science (abbreviated CS) is the study of the theoretical foundations of information and computation and of practical techniques for their implementation and application in computer systems.[1][2] Computer scientists invent algorithmic processes that create, describe, and transform information and formulate suitable abstractions to model complex systems. Computer science has many sub-fields; some, such as computational complexity theory, study the fundamental properties of computational problems, while others, such as computer graphics, emphasize the computation of specific results. Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describe computations, while computer programming applies specific programming languages to solve specific computational problems, and human-computer interaction focuses on the challenges in making computers and computations useful, usable, and universally accessible to humans. The general public sometimes confuses computer science with careers that deal with computers (such as information technology), or think that it relates to their own experience of computers, which typically involves activities such as gaming, web-browsing, and word-processing. However, the focus of computer science is more on understanding the properties of the programs used to implement...

Words: 5655 - Pages: 23

Free Essay

Computer Science

...____________________________________ Computer Science, An Overview ------------------------------------------------- Chapter 00 Introduction Read the introduction to the text. The answers to the following questions will appear in order as you read. What is computer science? ------------------------------------------------- The Role of Algorithms What is an algorithm? What are some examples of algorithms? What is a program? What is programming? What is software? What is hardware? Where did the study of algorithms come from? Once you discover an algorithm, do others need to understand it? When does the solution of a problem lie beyond the capabilities of machines? ------------------------------------------------- The History of Computing What are some of the ancestors of the Computer? Eventually, people began using gears for computing. Who are some of the people involved? Which of the men above produced something that was programmable? What were (and are) some uses of holes punched in cards? What kinds of devices replaced gears? What were some early computers? According to the text, what were the first commercially viable computers, and who built them? What happened in 1981? Who wrote the underlying software for the PC? What important development in computers happened as the Twentieth century was closing? What were two big developments for the Internet? (hint, look for the next two bolded phrases) As computers get smaller and...

Words: 406 - Pages: 2

Premium Essay

Information System

...as: the analysis and design of systems, computer networking, information security, database management, and decision support systems. Information Management deals with the practical and theoretical problems of collecting and analyzing information in a business function area including business productivity tools, applications programming and implementation, electronic commerce, digital media production, data mining, and decision support. Communications and Networking deals with the telecommunication technologies. Information Systems bridges business and computer science using the theoretical foundations of information and computation to study various business models and related algorithmic processes within a computer science discipline.[4][5][6][7][8][9][10][11][12][13][14] Computer information system(s) (CIS) is a field studying computers and algorithmic processes, including their principles, their software and hardware designs, their applications, and their impact on society[15][16][17] while IS emphasizes functionality over design.[18] Any specific information system aims to support operations, management and decision making.[19] In a broad sense, the term is used to refer not only to the information and communication technology (ICT) that an organization uses, but also to the way in which people interact with this technology in support of business processes.[20] Some authors make a clear distinction between information systems, computer systems, and business processes. Information...

Introduction to Science

...Science Impacting Modern Life Colorado Technical University November 25, 2013 Abstract As I have grown to understand life in this world, I realize that computers are a big part of the way we live our lives every day, both in the workforce and in our personal use. Most homes now have a computer or some type of tablet, smartphone, or other handheld electronic product, and over the past decades I have seen that computers and other devices have become better equipped to store information faster than ever. Over the past decade, computers have improved in design in more ways than in the ten years before. I have learned and discussed how the electron spin of atoms hints at a new approach in the computer revolution for the near future. In this paper I will discuss new developments in materials science that have allowed many changes in the life of computers, and the essential properties of materials that have influenced the computer industry. Science Impacting Modern Life The three essential properties of every material are: 1) the kind of atoms of which it is made, 2) the way those atoms are arranged, and 3) the way the atoms are bonded to each other. Trefil and Hazen state that, "based on our understanding of atoms and their chemical bonding, we now realize that the properties of every material depend on three essential features." As we learn this, we will realize that all materials have difficult...

Job Market

...Beyond supply and demand: Assessing the Ph.D. job market by Elka Jones (Occupational Outlook Quarterly, Winter 2002-03) Greg O'Malley got a taste of the job market for Ph.D. graduates when he supervised several of them after earning his bachelor's degree. "It was incredible to me that they had gone through so many years of rigorous training," says O'Malley of his subordinates at his postbaccalaureate publishing job, "only to be working under someone who'd barely finished his undergrad work." Still, the experience failed to deter him from pursuing a graduate degree of his own: O'Malley currently is enrolled in his second year of the history Ph.D. program at Johns Hopkins University. For O'Malley and thousands of others, the desire for a doctorate outweighs concern about the job market that awaits after graduation. Most Ph.D. candidates are willing to dedicate themselves to intensive research and study because they enjoy the subject matter. Statistics also show other, more tangible payoffs for Ph.D. recipients when they enter the labor force. Unemployment rates are consistently lower and earnings are significantly higher for people with a Ph.D. degree than they are for people with lower levels of educational attainment. As chart 1 shows, doctoral degree holders in 2001 had an unemployment rate of slightly more than 1 percent and median annual earnings of $66,000—considerably (Elka Jones is an economist in the Office of ...

Micro Lesson Plan

...Invention: Computer Technology
Subject: Technology
Grade(s): 6-8
Duration: One class period

Lesson Plan Sections: Objectives, Materials, Procedures, Adaptations, Discussion Questions, Evaluation, Extensions, Suggested Readings, Links, Vocabulary, Academic Standards

Objectives
Students will understand the following:
1. Inventions can change the way we live.
2. Many inventions start out with design flaws and are refined later by subsequent inventors and designers.
3. The computer, invented in 1834 by Charles Babbage and still being refined, is an example of such an invention.

Materials
For this lesson, you will need:
- If possible, an encyclopedia dated 1980 or earlier, with an entry for "computer"
- A computer with Internet access

Procedures
1. Ask students if they know who invented the computer. If they don't know, inform them that, in 1834, Charles Babbage, an English mathematician, tried to build a complicated machine called the "analytical engine." It was mechanical, rather than electronic, and Babbage never completed it, but computers today are based on many of the principles he used in his design. Your students may be interested to know that, as recently as forty years ago, computers were so large that they filled whole rooms. They were so...

Gendered Subjects

...is it that women make up such a small percentage compared to men in this field? Is it because they are not capable? Or that they do not want to be associated with the "geek" culture of science and technology? Women make up more than half of the college population in America, and today women are beginning to enter majors such as math and science in impressive numbers, but with computer science and technology it is a whole different story. On top of the small number of women entering the field, it is extremely common for a woman to enter the field only to leave it after a few short years. The number of women in computer science and technology who enter and remain in these fields is very low because of underlying cultural stereotypes and the higher demand on women to be in the home juggling multiple responsibilities. Men comprise a whopping eighty to eighty-five percent of people who enroll in computer science related fields at the university level. Carnegie Mellon University in particular has a ten-to-one ratio of men to women in the field. In 2004-2005, the number of women pursuing Bachelor's degrees in Computer Science was a mere twenty percent (Fry, 2001). Ellen Spertus, one of the few women in the technology field, remembers as a girl attending a computer camp with a boy-to-girl ratio of six to one (Stross, 2008). After a brief look at some of the figures dealing with women in this industry, it is not hard to see that the women...
