Quantum Computing: Overview

BUS-5013: Robert Servais ID: 0323483

Overview, Historical Context and Development of Technology

Since the dawn of the Information Age in the 20th century, academics, governments, big business and tech-savvy consumers have all been granted access to endless amounts of data through an online repository with the simplicity of a mouse click. The amount of information available to the world is a triumph attributable mainly to breakthroughs such as Tim Berners-Lee's introduction of the World Wide Web in the early 1990s, and to constant developments in computer science and computational power made possible through advances in microprocessing that have closely tracked the forecasts of Moore's Law (Greenemeier, 2009). Moore's Law, which is not exactly a law, is an estimate of the progression possible in an integrated circuit. It was postulated by one of the founders of Intel, Dr. Gordon Moore, who stated that the number of transistors placed on a circuit will double roughly every 18 months (Colwell, 2013); the capacity to follow this trend will end at a critical point when transistors are packed so close together that the effects of quantum mechanics begin to appear (Tally, 2012).

Classical computers have undoubtedly reshaped our world and have provided progress unforeseen even by some of history's greatest academics. Early computers were mammoth-sized electromechanical devices used by the military to process information that would take humans hours by manual calculation alone (Watson, 2012). Before digital computers, "computers" were actual people who literally computed information for whatever their task or job may have been. Though these positions were often filled by clever mathematicians, their capacity to handle problems of increasing complexity was limited, as the number of variables in any given problem could easily grow as new information was added to a scenario (Leavitt, 2006). One of the first machines launched to tackle these computational problems was the Mark I, developed in response to a growing need to perform military calculations during the Second World War (Bellis). It was a triumphant feat, cutting calculation times from hours to mere seconds.

This paradigm shift in computing power was made possible in part by the brilliance of the English mathematician Alan Turing, who envisaged a universal machine that could take inputs and generate outputs if instructed with the appropriate algorithm. In 1936 Turing addressed this idea in a paper titled On Computable Numbers, with an Application to the Entscheidungsproblem, in which he showed that such "automatic machines" could indeed exist given the right set of instructions, and which led him to devise a hypothetical mechanism now called the Turing Machine (Turing, 1936). These automatic machines were the foundation for early computers like the Mark I, and the perils of war drove the innovation needed to develop more sophisticated ones. These a-machines, as Turing called them, grew more advanced in order to attack problems in cryptanalysis, and Turing earned part of his intellectual stardom through his work at Bletchley Park during the Second World War deciphering the German Enigma code (Watson, 2012). His accomplishments led him to ponder the applications of greater computing power and what it would mean for the future of mankind.
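To make Turing's notion of an "automatic machine" concrete, here is a minimal sketch of a Turing machine simulator in Python. The transition table, blank symbol and the example program (a unary increment) are illustrative assumptions for this paper, not anything drawn from Turing's 1936 construction.

```python
# Minimal Turing machine simulator (illustrative sketch).
# A machine is a transition table: (state, symbol) -> (new_symbol, move, new_state).

def run_turing_machine(transitions, tape, state="start", halt_state="halt", max_steps=1000):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = tape.get(head, "_")             # "_" is the blank symbol
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    # Read the tape back out in order.
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example program: append one '1' to a unary number (increment).
increment = {
    ("start", "1"): ("1", "R", "start"),   # skip over existing 1s
    ("start", "_"): ("1", "R", "halt"),    # write a 1 at the end and halt
}

print(run_turing_machine(increment, "111"))  # -> "1111"
```

The point of the sketch is only that a fixed table of (state, symbol) rules, a movable head and an unbounded tape are enough to carry out a computation, which is exactly the universality Turing argued for.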
He believed that one day computers would possess such extraordinary computational power that artificial intelligence (AI) was not only possible but only a matter of time (Leavitt, 2006). He devised a method for determining what constituted artificial intelligence, an evaluation now called the Turing Test. The test places a human examiner at a computer terminal in one room and lets him exchange typed messages with both a computer and a human correspondent in a separate room. When the examiner cannot tell which correspondent is the human and which is the machine, the computer has passed the Turing Test (Leavitt, 2006). Dreaming of artificially intelligent machines in the 1950s set the bar for today's computer programmers and software engineers; it remains the holy grail of the computing world.

Classical computing has catapulted technological potential over the last sixty years, enabling breakthroughs in virtually every aspect of life, yet it still stands in the shadow of some of the world's greatest problems. As our ability to generate, store and process information increases exponentially and more and more people gain access to it, we have encountered an unforeseen problem. The proverbial fruits of our technological advantage have come sweet, but they have left the bitter taste of information overload in our mouths. Classical computers have given us the ability to generate information beyond our comprehension; we simply do not know how to use it to its fullest potential. So much data now exists that new research streams revolve around it. One such avenue is Big Data, which uses huge datasets to study emergent behaviour or emergent trends in various disciplines (West, 2013). As an example, suppose you set out to find correlational data between cell phone use and incidents of cancer in a particular country. Traditionally, the researcher would adopt the scientific method and pose a hypothesis about whether a correlation exists. The research would be deemed valid by choosing a large enough sample at random and testing it against a control group. If, loosely speaking, people who used cell phones developed cancer more often than people who did not, the researcher would be on the right track toward a statistically significant result. This has been the conventional approach in research studies, but it is costly, it takes time, and a multitude of other variables such as genetics, lifestyle and environmental factors can be overlooked. Big Data, on the other hand, does not care about the intermediary factors. For example, a Big Data study in Ireland used the entire population of the country to investigate cell phone users and their geographical dispersion, health trends and the frequency of cancer among them, and found no correlation between cell phone use and cancer cases (Tim Hardie's lecture). Using this many variables with this many people is not realistic for conventional studies, and it signals the dawn of new computational expectations. This example of how Big Data can be more powerful and more efficient than traditional research also foreshadows how classical computing will hit its limits as raw information, and the variations of each input, overwhelm contemporary computation.
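To illustrate the kind of calculation a whole-population study performs, the sketch below computes a Pearson correlation across a synthetic population with numpy. The population size, usage distribution and incidence rate are invented for illustration and have nothing to do with the Irish study cited above.

```python
# Illustrative sketch only: synthetic data, not the Irish study cited above.
import numpy as np

rng = np.random.default_rng(0)
population = 1_000_000                                          # "whole population", not a sample

phone_hours = rng.gamma(shape=2.0, scale=1.5, size=population)  # daily phone use (made up)
cancer = rng.binomial(1, 0.005, size=population)                # incidence, generated independently of use

# Pearson correlation across the entire population in one pass.
r = np.corrcoef(phone_hours, cancer)[0, 1]
print(f"correlation over {population:,} people: {r:+.4f}")      # ~0, by construction
```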
While the beginning of the end may be in view for Moore's Law, new computing power and information processing machines will need to be developed to manage a world where petabytes of data are generated every day. As physicists and mathematicians paved the way for computer science, theoretical research allowed academics to explore the subatomic world, eventually giving birth to quantum mechanics. Quantum theory was the culmination of work by some of the world's great minds at the beginning of the 20th century, with Max Planck often called the father of the theory. Through minds such as Niels Bohr, Einstein, Max Planck, Heisenberg, Paul Dirac, Maxwell and countless others, the idea that particles such as electrons and photons behave like the discrete spherical units we had envisioned was abolished, as the wave-particle duality became increasingly apparent (Bartusiak, 2000). Though much of this topic is beyond the scope of this paper, it must be noted that the wave-particle duality of the subatomic world is the infrastructure on which quantum theory was built from the ground up. The word quantum is Latin for "how much"; a quantum is a discrete packet of the energy carried in the "waves" of radiation, light, and other forms of energy (Bone & Castro). The wave-like character of light had been apparent as far back as 1803, when Thomas Young performed his famous double-slit experiment. He realized that the interference pattern (light and dark bands) generated on a detector was the result of two light waves passing through two adjacent slits and either adding together or cancelling each other out (Bartusiak, 2000). This is analogous to throwing two stones into a pond and watching the additive and subtractive behaviour of the ripples when they meet. This new area of research captivated many and troubled even more, including Albert Einstein. The quantum world flipped theoretical physics on its head and did not sit well with Einstein's general theory of relativity, which remains a hallmark of theoretical physics to this day. Quantum mechanics remained an elusive area of physics and did not enter the realm of computer science until 1982, when Richard Feynman proposed a computer that would be able to harness the bizarre properties of the quantum world. Like Turing, Feynman envisaged a computer that could take inputs and, through clever algorithmic instruction, generate the appropriate output for any problem within the limits of computation. The difference between the two machines was that the Turing Machine could not accommodate the peculiar properties of the quantum world, whereas Feynman's universal quantum simulator could (Feynman, 1982). This proposition later sparked the interest of David Deutsch, a British physicist who proved that such a universal quantum computer could indeed exploit the effects of the quantum world in a single finite machine (Bone & Castro). Deutsch explained that "there exists or can be built a universal quantum computer that can be programmed to perform any computational task that can be performed by any physical object" (Bone & Castro). From this historical context, the momentum behind this great computing power has steadily grown in modern times.
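The additive and subtractive behaviour Young observed can be written down directly: for two equal waves from slits a distance d apart, the relative intensity on the screen varies as cos^2(pi * d * sin(theta) / wavelength). The sketch below evaluates that idealized fringe pattern; the wavelength and slit spacing are assumed values chosen only for illustration.

```python
# Idealized two-slit interference: intensity from adding two equal waves.
import numpy as np

wavelength = 500e-9          # 500 nm light (assumed)
slit_separation = 20e-6      # 20 micrometre slit spacing (assumed)

theta = np.linspace(-0.05, 0.05, 9)                       # viewing angles in radians
phase = np.pi * slit_separation * np.sin(theta) / wavelength
intensity = np.cos(phase) ** 2                            # bright (1) and dark (0) bands

for t, i in zip(theta, intensity):
    print(f"angle {t:+.3f} rad -> relative intensity {i:.2f}")
```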
The drive to develop a quantum computer has come from passionate physicists and computer scientists wishing to push the boundaries of their intellect, and just as much from the urge to solve some of the world's most complex information problems. Some of the most promising applications of quantum computation have recently been made possible with the launch of the Canadian company D-Wave. The company was founded in 1999 with the goal of building a quantum computer that could provide a practical means by which quantum computation could become a reality at an appropriate scale (www.dwavesys.com). Their first computer became a reality in 2011, when they announced the D-Wave One, a 128-qubit machine, followed in 2013 by its successor, the D-Wave Two, with 512 qubits (www.dwavesys.com). The excitement around this news drew immediate interest from information-rich organizations, and the commercialization of the system was a success, with global powerhouses such as Google, NASA, Lockheed Martin and a few private buyers acquiring one despite its ten-million-dollar price tag. Each of these institutions has demonstrated that there is no boundary to innovation when large quantities of information are at your disposal. The right information coupled with the right technology will carry the human race into the next information age, an age of understanding. While quantum computers are not for the standard user, nor will they be any better than a classical computer for everyday problems, companies like D-Wave are targeting large institutions that need a technology capable of handling copious amounts of information, each piece with the potential for complex probabilistic outcomes. Problems of this kind include optimization and machine learning, financial analysis, pattern recognition, protein folding and radiotherapy planning, cryptanalysis and, of course, exploring the depths of the quantum world itself (www.dwavesys.com). It will be exciting to see what this technology will reveal and how the commercialization of such a powerful tool will unfold.

Technical Analysis

Some might wonder why Google or NASA would want a quantum computer in the first place, or what makes a quantum bit more powerful than a classical computer bit. The answer lies in the properties governed by quantum behaviour: at scales so small, atoms and subatomic particles exhibit unconventional properties, so much so that they have famously been described as showing "spooky action at a distance" (Bartusiak, 2000). Max Planck once said, "When you change the way you look at things, the things you look at change" (www.dwavesys.com). What did he mean by this? To delve into the underpinnings of quantum computation, it must first be shown how classical information, the conventional computer jargon we hear concerning megabytes, bits and binary code, differs from quantum information. A classical computer, like the one you will find in your pocket or in the school library, processes information encoded by algorithms and computer code. Programmers realized long ago that code written in languages like C++ or Java is tedious for both the programmer and the machine to sift through; underneath, it is bits of binary code that do the work. Binary code, as you have probably encountered, is a stream of zeros and ones configured something like the following: 01101 000100100 1110 1001 (Blumel, 2010). What this stream of digits represents are numbers, or pieces of information called bits (McMahon, 2007). A bit can be either the 0 or the 1 of the binary code and basically represents an on or off position in computer language (McMahon, 2007). This on/off designation is an arbitrary labelling: it could mean a number of things, like yes/no, white/black or up/down; it doesn't really matter (McMahon, 2007). What matters is that one side of the pair does one thing and the other side does the opposite. This two-state system becomes clearer when the appropriate mathematics is applied to illustrate what the states mean. Binary code works in base 2, hence the word binary (McMahon, 2007). The expression 2^n tells us how many values can be represented using only the 0's and 1's of the code. If we wish to express the numbers 0, 1, 2, 3, 4, 5, 6, 7, we need eight distinct patterns of the binary language to do so. Substituting into 2^n, where n is the number of bits, we find that n = 3, since 2^3 = 8 (Gildert, 2007). Thus we need 3 bits to represent the numbers 0-7; in binary code they are written 000 001 010 011 100 101 110 111. Using only zeros and ones we are able to represent large numbers with ease. With this little bit of background we can already see one of the limitations of classical information: a binary digit has to be in either the 0 or the 1 state, it cannot be both. This is where quantum mechanics shows its advantage.
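A minimal sketch of this 2^n counting, assuming nothing beyond standard Python:

```python
# With n bits there are 2**n distinct patterns; 3 bits cover the numbers 0-7.
n = 3
for value in range(2 ** n):                 # 2**3 = 8 values
    bits = format(value, f"0{n}b")          # e.g. 5 -> "101"
    print(bits, "=", value)
```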
Quantum computing is best described through careful examination of some of the key properties of quantum theory, which are also some of the most challenging concepts to understand. Albert Einstein's famous quote, "God does not play dice", is a proclamation born of the frustration he felt trying to accept that, in quantum mechanics, uncertainty is certain (Bartusiak, 2000). Feynman likewise said, "If you think you understand quantum mechanics, then you don't understand quantum mechanics." Both of these revolutionary thinkers understood the complexity of the subject, and while Feynman went on to make great progress in the area, Einstein died still trying to reconcile his general theory of relativity with quantum mechanics. So where does the complexity come from? Quantum computing gains its advantage from the peculiar properties of matter at the nanoscale. At the subatomic level, properties such as superposition, entanglement and coherence take effect and the wonders of the quantum world emerge (Gildert, 2007). The qubit, or quantum bit, is analogous to the bit of classical information; the difference is that a qubit can be realized by a number of physical systems, including a nucleus, a single photon or an electron, each of which can represent the 0's and 1's of binary code (McMahon, 2007). The key difference is that a qubit can be 0 and 1 at the same time, at least until it is measured (Altepeter, 2010). This is where the expression superposition comes from: the qubit is in two positions simultaneously, and this is where the unique processing capability of quantum machines originates (Altepeter, 2010). Some researchers apply clever physics and use phosphorus (atomic number 15) atoms to create a qubit. Using the atom's valence electron, the electron furthest from the nucleus, they manipulate its orientation with a magnetic field. Electrons possess a property called spin, which makes them behave like tiny magnets when placed in a magnetic field, with a corresponding north and south pole. This spin allows the electron to take two different orientations, which are treated as the 0 and 1 positions of the classical bit (McMahon, 2007). Positioned within the magnetic field, the electron naturally takes an orientation called its ground state, in our case the 0 of the 0/1 designation (Altepeter, 2010). When an appropriate magnetic pulse is applied, the electron re-orients to the opposite position, switching from the 0 ground state to the 1 position, thus reproducing the binary signal of a classical computer (Altepeter, 2010). Superposition is achieved when just the right amount of the field is applied so that the electron sits in a state between the two orientations (Galchen, 2011). It is this special state that allows a quantum computer to be in two configurations at once and thus achieve exponential gains as more qubits are added. As mentioned before, three classical bits are needed to represent the numbers 0-7, from the expression 2^3 = 8. How many qubits are needed? Three qubits can inhabit a superposition of all eight of these permutations simultaneously, whereas a classical system must work through them one after another (McMahon, 2007). This is the advantage.
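In textbook notation, an n-qubit register is described by a vector of 2^n complex amplitudes, and superposition simply means that more than one amplitude is non-zero. The numpy sketch below contrasts a classical 3-bit register with an equal superposition over all eight basis states; it is generic state-vector arithmetic, not a model of any particular hardware such as D-Wave's.

```python
# Three qubits as a 2**3 = 8 dimensional state vector (amplitudes).
import numpy as np

n = 3
dim = 2 ** n

# Classical register: exactly one basis state is occupied, e.g. |101> = 5.
classical = np.zeros(dim)
classical[5] = 1.0

# Equal superposition: amplitude 1/sqrt(8) on every basis state at once.
superposition = np.full(dim, 1 / np.sqrt(dim))

# Measurement probabilities are |amplitude|**2 and must sum to 1.
probs = np.abs(superposition) ** 2
for idx, p in enumerate(probs):
    print(f"|{idx:03b}>  probability {p:.3f}")   # each of the 8 states: 0.125
print("total probability:", probs.sum())
```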
Quantum computers are therefore able to process vast amounts of data, exploring many permutations concurrently. As an example, Shor's algorithm allows extremely fast factoring of large numbers: a classical computer has been estimated to need ten million billion billion years to factor a 1000-digit number, whereas a quantum computer would take around 20 minutes (http://www.doc.ic.ac.uk/~nd/surprise_97/journal/vol4/spb3/#18).
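Shor's speedup comes from finding the period r of f(x) = a^x mod N on quantum hardware; once r is known, ordinary number theory turns it into factors. The toy sketch below does the period finding by classical brute force purely to show that final step, so it says nothing about the quantum algorithm's running time.

```python
# Classical toy illustrating the number theory behind Shor's algorithm:
# factor N by finding the period r of f(x) = a**x mod N, then using gcd.
from math import gcd

def find_period(a, N):
    """Brute-force the smallest r > 0 with a**r mod N == 1 (the step a quantum computer accelerates)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_toy(N, a=2):
    if gcd(a, N) != 1:                 # lucky case: a already shares a factor with N
        return gcd(a, N), N // gcd(a, N)
    r = find_period(a, N)
    if r % 2 == 1:
        raise ValueError("odd period; retry with a different a")
    y = pow(a, r // 2, N)
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    if p in (1, N) or q in (1, N):
        raise ValueError("trivial factors; retry with a different a")
    return p, q

print(shor_toy(15, a=2))   # period of 2**x mod 15 is 4 -> factors (3, 5)
```

Running shor_toy(15) recovers the factors 3 and 5 from the period 4 of 2^x mod 15; the quantum machine's contribution is doing the period finding exponentially faster than the loop above.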

User Analysis

When the Canadian company D-Wave publicly declared that it had successfully built a commercially available quantum computer, a jolt of excitement shot across the world of academia, but it came with a notable caveat: pure skepticism. When the researchers at D-Wave set out to build the world's first commercial quantum computer, they rigorously examined potential buyers and investigated the best way to market the technology to the right people. The elusive machine is not something a large organization like Google purchases without already knowing what kind of questions it wants answered; the technology was built to fill a need, not create one. The skeptics of D-Wave were concerned with two main issues: is this actually a quantum computer, and is it any better than what we already have? D-Wave understood that virtually every industry with data at its disposal would benefit from the technology. The ability to process large quantities of data with countless variables, each with hundreds of permutations, can be applied to pharmaceuticals and medical treatment, financial analysis, discrete optimization (which covers a broad area including logistics), cryptography, pattern or trend recognition, artificial intelligence and countless other areas. The users of this technology in its infancy will likely be large tech companies and information processing organizations. The Diffusion of Innovations theory explains that "individuals are seen as possessing different degrees of willingness to adopt innovations and thus it is generally observed that the portion of the population adopting an innovation is approximately normally distributed over time" (Rogers, 2003). This will prove true for adopters of quantum computing as well: it seems to take a few innovators and early adopters to discover the value of a resource before more skeptical, later adopters follow suit. The complexity and price tag associated with a quantum computer will limit the first commercial suppliers to perhaps only a few buyers for this differentiated product, but with the benefit of charging a premium. Early adopters of this technology will require a specialized skill set and a deep understanding of what types of problems a quantum machine is suited for.

Business Analysis

Early developers of quantum computers will have to pay close attention to market reactions and competitive forces in the early years of the technology. Using Porter's Five Forces model, developers can evaluate each force to gain insight into how the market will perceive and react to commercially available quantum machines. Currently D-Wave is the only company offering this technology commercially, and it therefore serves to illustrate Porter's Five Forces.

Threat of New Entrants: The threat of new competition focuses on the risk of new entrants, who must face the barriers associated with developing a technology similar to or better than the existing technology (Hill & Jones, 2008). Those barriers may arise from economies of scale, absolute cost advantages, switching costs and brand loyalty among consumers. Quantum computers have only recently entered the market and currently have a single commercial producer. This gives the incumbent absolute cost advantages relative to new entrants, because it has already secured patents, production methods, resources and relationships with early adopters of the technology (Hill & Jones, 2008). Switching costs and brand loyalty will be relatively important protections for companies like D-Wave. For example, when Google purchased its first quantum computer, the affiliated technical support, adoption of software and loyalty to D-Wave formed an important relationship, making switching costs relatively high and switching unattractive. For these reasons, companies like D-Wave face a relatively low threat from new entrants.

Bargaining Power of Buyers: This technology gives consumers relatively little buying power. The buyers of a quantum computer purchase small quantities of the product, likely one machine per company, which gives them little bargaining power and no leverage (Hill & Jones, 2008). Switching costs, as mentioned, are also high, which provides developers like D-Wave another advantage. There are currently no competing substitutes or alternative suppliers, which leaves the developer with no reason to make the product any cheaper. Finally, there is little threat of buyers producing the technology independently, owing to the technical barriers involved.

Bargaining Power of Suppliers: The developer has extremely high bargaining power in that there are few substitutes and no external suppliers of the manufacturing materials (the components and resources needed to assemble a quantum computer are internally controlled). In addition, the developer maintains market control through the switching costs embedded in the customer relationships already developed. By keeping technology, intellectual property and resources internally bound, the developer retains flexibility and control over price and levels of output.

Threat of Substitute Products: Due to the high level of product differentiation and the technical requirements held by the developer, there is virtually no substitute for the product. The quantum computer produced by D-Wave Systems is not a substitute for classical computers as a whole; rather, it is a specialized technological resource for specific kinds of information processing. Potential problems for the developer will come from the technical skills each buyer needs in order to adopt the new technology. The Diffusion of Innovations theory posed by Rogers (2003) states that "the rate of adoption of innovations is impacted by five factors: relative advantage, compatibility, trialability, observability, and complexity". Buyers may already see classical information processing as compatible with the level of information processing required within their industry, and switching to a quantum processing platform may be too complex or may not provide enough value given the switching costs associated with adopting the new technology.

Intensity of Competitive Rivalry: Rivalry between developers can be advantageous for buyers, as it drives companies to innovate and bring down prices in order to compete (Hill & Jones, 2008). Currently there are no commercial competitors in the quantum computing industry, though a caution for D-Wave is that new rivals will be able to study weaknesses and flaws in its marketing, manufacturing, pricing and resource allocation in order to build and sell quantum computers of their own. These possible areas of improvement depend on the current demand conditions and cost conditions of the industry (Hill & Jones, 2008). As the demand for greater computing power rises, the number of developers will increase, yet rivalry may remain low because there will be more buyers to sell the product to (Hill & Jones, 2008). Additionally, cost conditions for D-Wave may initially be high due to internal manufacturing and development, but as the law of diffusion permeates the market, more quantum computer companies will emerge and manufacturing or design may be outsourced to areas where it is cheaper to operate.
From the perspective of Porter's Five Forces, it is evident that the current developers have near-monopolistic control over the technology and face little threat from competition or substitute technologies. Companies like D-Wave have the advantage of building relationships and improving existing production and marketing strategies, whereas new developers will have to establish these operations after entering the market. The bargaining power held by the developer will only continue if it learns to market the technology effectively. Quantum computers are highly differentiated processing machines that demand technical expertise, so it will be up to developers to educate buyers on the practical applications of the technology and present its usefulness to information processing organizations. The usefulness of quantum computing will become apparent as early adopters start to realize the capabilities of the technology, and the value will diffuse through other markets. Investing in this technology will prove to be an educated decision and will provide new insights into the power held in information.

Benefits, Shortcomings and Impacts

The benefits of quantum computing will impact each level of society but will begin at the level of the organization. Large institutions with the capital and technical capability will be the first to adopt; Google and NASA, for instance, have both possessed a quantum computer for the past three years. At the institutional level, NASA will be able to capitalize on the machine's ability to solve discrete optimization problems for space missions (Chirgwin, 2012). NASA mentioned that "the machine could help it map out the most efficient way for a Rover mission to explore a new planet, marshaling data from dozens of outer-space observation points" (Chirgwin, 2012). Scenarios with hundreds of variables, each with its own permutations, are the kind of optimization problems handled more efficiently by a quantum computer: it digests vast amounts of data and arrives at one definite answer by exploiting the quantum computer's ability to evaluate multiple options simultaneously using the principle of superposition. The benefit of optimization will allow organizations to better manage operational costs for developing projects and to select the most cost-effective strategies for new ones. Google, on the other hand, has used the same capability to find ways to deliver, organize and create new informational tools that make life at the individual level more efficient for users; the launch of products like Google Drive or predictive text gave the company a powerful way to use, find and share information more efficiently. The question that arises, however, is how much of this innovation Google could have achieved using classical information processing alone. The possible shortcomings for institutional adopters come back to the two questions posed earlier in the report: are these computers actually operating at the quantum scale, and are they any better than classical computational systems? Organizations will be forced to benchmark and perform comparative analyses between the new quantum systems and conventional computing platforms. Adopting the technology will also create additional costs through the purchase of new hardware, training or hiring people with the technical ability, and building the facilities to support its integration. These are important costs to consider, and they must be weighed against the cost of doing nothing.
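The discrete optimization problems described here map naturally onto the quadratic unconstrained binary optimization (QUBO) form that annealing machines such as D-Wave's are designed around: minimize x^T Q x over binary vectors x. The sketch below defines a tiny, made-up QUBO and solves it by classical brute force purely to show the problem shape; a real annealer searches the 2^n space physically rather than enumerating it.

```python
# Tiny QUBO (the problem form a quantum annealer such as D-Wave's targets):
# minimize  x^T Q x  over binary vectors x.  Solved here by brute force.
import itertools
import numpy as np

# Made-up 4-variable QUBO matrix (illustrative only).
Q = np.array([
    [-1.0,  2.0,  0.0,  0.0],
    [ 0.0, -1.0,  2.0,  0.0],
    [ 0.0,  0.0, -1.0,  2.0],
    [ 0.0,  0.0,  0.0, -1.0],
])

best_x, best_energy = None, float("inf")
for bits in itertools.product([0, 1], repeat=4):     # 2**4 candidate assignments
    x = np.array(bits)
    energy = x @ Q @ x
    if energy < best_energy:
        best_x, best_energy = x, energy

print("best assignment:", best_x, "energy:", best_energy)
```

The brute-force loop doubles in cost with every added variable, which is precisely why hundreds of interacting variables push problems of this shape toward specialized hardware.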
At the societal level we can hope to see this transfer of information enrich society with a refined wealth of knowledge. It will be up to organizations to generate information with this technology so that it can be applied in a useful manner. The advent of technological breakthroughs has always benefited the world, as we have seen many times throughout history. Quantum computing parallels the hesitation and excitement that society felt with the introduction of the personal computer. This hesitation, or lack of vision, is a major shortcoming for society because the technology is not well understood, nor is it clear how it will fit into our everyday lives; it walks a fine line between being extraordinarily useful and being extraordinarily complex. Regardless, quantum computing will likely do far less harm than good once its resourcefulness is recognized. Individuals will not realize its potential without acceptance among the masses; it is too difficult to appreciate the merits of a quantum computer at such a personal level, so the impact must work its way through intermediary users before reaching the individual. That intermediate level holds value because it allows for manipulation and creativity. As an example, once the smartphone was developed and popularized by Apple, it was individuals who created the "apps" that gave the technology greater value than originally planned. This created revenue streams unforeseen by the tech giant and allowed reciprocity between individuals, society and organizations. It will be this type of unexpected, creative offshoot that will enable quantum computers to flourish.

Conclusion

The future of this information technology lies in big business and the spread or sale of information. It will be widely adopted by the financial sector, government (including military, campaigning, defense and trade), universities, online marketers and retailers like Amazon, health care providers, pharmaceutical companies, insurance companies, the automotive industry, sports teams, engineering firms and many others who can find ways to manage, operate, develop and manufacture ideas, services or commodities more efficiently. Individuals, for the time being, will not be able to adopt the technology because there is simply no demand, nor does the technology exist to produce such computers at a personal scale. Personal use is also unnecessary because individuals do not require such complex information processors; classical computers will serve these purposes better and more efficiently when dealing with smaller quantities of information, whereas quantum computing is for sophisticated calculations involving large, variable datasets. Adopters of the technology will not completely replace their classical systems either, simply because there are some things conventional computation does better at a fraction of the price. Moore's Law has delivered consistent progress in conventional computing power, and our ability to develop more sophisticated software will be one of the major impediments to overcome in keeping up with sophisticated hardware. Quantum machines should stay on the market for at least the next century, given that classical systems have taken that long to reach some of their greatest potential. The longevity of the technology will depend on how quickly others learn to use and produce it, allowing for economies of scale, differentiation, competition and innovation. Some of the most useful applications will come through the competitive forces and rivalry among suppliers and developers, which will increase the buying power of consumers. It would be wise to invest early in this technology to benefit from the information, and more importantly from the capital saved through more efficient and effective operation of the business. By investing in this information technology you gain a considerable advantage over the laggards.

References:

Altepeter, J. (2010). A tale of two qubits: How quantum computers work. Ars Technica. Retrieved from http://arstechnica.com/science/2010/01/a-tale-of-two-qubits-how-quantum-computers-work/

Bartusiak, M. (2000). Einstein's Unfinished Symphony: Listening to the Sounds of Spacetime. New York, NY: W.W. Berkley Publishing.

Bellis, M. (n.d.). The history of quantum computers. Retrieved from http://inventors.about.com/library/blcoindex.htm

Blumel, R. (2010). Foundations of Quantum Mechanics: From Photons to Quantum Computers. Sudbury, MA: Jones and Bartlett Publishers.

Bone, S., & Castro, M. (n.d.). A brief history of quantum computing. Retrieved from http://www.doc.ic.ac.uk/~nd/surprise_97/journal/vol4/spb3/

Chirgwin, R. (2012). What are quantum computers good for? The Register. Retrieved from http://www.theregister.co.uk/2012/11/17

Colwell, R. (2013). End of Moore's Law: It's not just about physics. Scientific American. Retrieved from http://www.scientificamerican.com/article/end-of-moores-law-its-not-just-about-physics/

D-Wave: The Quantum Computer Company. (2014). Retrieved from www.dwavesys.com

Feynman, R. (1982). Simulating physics with computers. International Journal of Theoretical Physics, 21(6/7).

Galchen, R. (2011). Dream machine: The mind-expanding world of quantum computing. The New Yorker. Retrieved from http://archives.newyorker.com/?i=2011-05-02#folio=034

Gildert, S. (2007). Quantum Computing for Beginners: Building Qubits. Condensed Matter Physics Research, University of Birmingham.

Greenemeier, L. (2009). Remembering the day the World Wide Web was born. Scientific American. Retrieved from http://www.scientificamerican.com/article/day-the-web-was-born/

Hill, C. W., & Jones, G. R. (2008). Strategic Management: An Integrated Approach (10th ed.). Mason, OH: Cengage Learning.

Leavitt, D. (2006). The Man Who Knew Too Much: Alan Turing and the Invention of the Computer. New York, NY: W.W. Norton & Company Inc.

McMahon, D. (2007). Quantum Computing Explained. Wiley-IEEE Press. Retrieved from http://proquestcombo.safaribooksonline.com.ezproxy.torontopubliclibrary.ca/9780470096994

Rogers, E. (2003). Diffusion of Innovations (5th ed.). New York, NY: Simon & Schuster Inc.

Tally, S. (2012). One and done: Single-atom transistor is end of Moore's Law; may be beginning of quantum computing. Purdue University. Retrieved from http://www.purdue.edu/newsroom/research/2012/120219KlimeckAtom.html

Turing, A. M. (1936). On computable numbers, with an application to the Entscheidungsproblem. Princeton University: The Graduate College, 230.

Watson, I. (2012). How Alan Turing invented the computer age. Scientific American. Retrieved from http://blogs.scientificamerican.com/guest-blog/2012/04/26/how-alan-turing-invented-the-computer-age/

West, G. (2013). Big data needs a big theory to go with it. Scientific American. Retrieved from http://www.scientificamerican.com/article/big-data-needs-big-theory/
