
Data Processing

Submitted By jones126
Computer Information Systems Paper

Charles E. Jones Jr.
CIS/319 Version 8
April 29, 2012
Ike Shia

Abstract
This paper focuses on the accuracy of data input, the convenience and quality of output, the different types of storage devices, and the role each component plays in the speed of a computer system. Throughout, it discusses the best input and output methods for various hardware and software situations.

Accuracy of Data
Accuracy of data input is important. Explain what method of data input would be best for each of the following situations and why: For printed questionnaires, the best input method is a keyboard, because it allows one to produce the text that makes up the questionnaire. The best method for a telephone survey is a voice recognition and recording system, because a voice input device can be programmed to distinguish and record answers spoken into the receiver, allowing them all to be computed. Bank checks are best handled by a scanning method; specifically, a magnetic ink character recognition (MICR) input device is the right system to use. It reads the numbers printed in magnetic ink at the bottom of each check and automatically posts the adjustments to the correct accounts, making the information simple to interpret. For retail tags, the best data input method is also a bar code scanner, specifically an optical scanning input device. With a bar code scanner it is unnecessary to key numbers in manually. Finally, for long documents the best method is a simple scanner linked to a computer, because the pages can easily be kept in order and sequence.

Convenience and Quality
Convenience and quality of output are important. Explain what method of output would be best for each of the following situations and why:
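Part of the accuracy benefit of bar code scanning over manual keying comes from a check digit built into the code itself: a mistyped or misread digit makes the whole number fail validation. As an illustrative sketch (not tied to any particular scanner), the standard UPC-A check-digit rule can be expressed as:

```python
def upc_check_digit(digits11):
    """Compute the UPC-A check digit for the first 11 digits of a code."""
    odd = sum(digits11[0::2])   # digits in odd positions (1st, 3rd, ...)
    even = sum(digits11[1::2])  # digits in even positions (2nd, 4th, ...)
    total = 3 * odd + even
    return (10 - total % 10) % 10

def is_valid_upc(code12):
    """Verify a full 12-digit UPC-A string against its check digit."""
    digits = [int(c) for c in code12]
    return upc_check_digit(digits[:11]) == digits[11]

print(is_valid_upc("036000291452"))  # a commonly cited example UPC -> True
```

A keyboard operator transposing two digits would produce a code that fails this check, which is exactly the kind of error an optical scanner avoids in the first place.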
The best output method for a handheld computer is a screen, because most people can read one easily and its display can change quickly. For a color photograph, the best output method is an ink-jet printer, because of the high print quality a color photograph demands. For a resume, the best method is a laser or ink-jet printer: like a business letter, a resume has a specific format and requires good print quality. For a memorandum, a statistical report, and a company annual report, a printer is the best choice in every case, because hard copies are needed to pass around and give a firm visual record of the information being distributed. Since not everyone has access to video or the Internet, printed output is the most reliable method.
Storage Devices
Different types of storage devices are optimal for different situations. Explain what situations are appropriate for the following devices and why:
Hard disks are needed for a computer to boot up; they store all the information required to run applications and all the files the user creates. An extra hard drive is an economical way to back up an entire main drive. Floppy disks are no longer appropriate even for temporary storage, because floppy disk drives are no longer standard equipment on computers; they are useless to more up-to-date systems. RAM is necessary for a computer to complete operations faster than if it had to keep accessing the hard drive for information; with the right configuration, RAM also works well alongside a server. A CD-ROM is an optical disc that falls into much the same category as the floppy disk: although it is still used in newer systems, it is not as efficient. It can hold a lot of information but is easily damaged and rendered unreadable. Tape is another older method of storing backup data; like the floppy disk, tape drives are no longer effective for backing up hard drives in modern computers. Flash drives are the most convenient way to transfer files from one computer to another: they hold much more information and, unlike a CD-ROM, are less likely to become damaged.
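The capacity gap between these media is large enough to drive the choices above. The figures below are rough, typical numbers for illustration only (actual capacities vary widely by product and format, and are an assumption of this sketch, not taken from the paper):

```python
# Rough, illustrative capacities in megabytes for the media discussed above.
# These are typical ballpark figures, not specs for any particular product.
typical_capacity_mb = {
    "floppy disk": 1.44,       # standard 3.5-inch floppy
    "CD-ROM": 700,             # common 80-minute disc
    "flash drive": 16_000,     # a 16 GB stick, common around 2012
    "tape cartridge": 40_000,  # tens of GB; varies widely by format
    "hard disk": 500_000,      # a 500 GB desktop drive
}

# List the media from smallest to largest capacity.
for device, mb in sorted(typical_capacity_mb.items(), key=lambda kv: kv[1]):
    print(f"{device}: {mb:,.2f} MB")
```

Even with generous assumptions, a floppy disk holds less than a quarter of one percent of a CD-ROM, which is why it disappeared first.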

Determining the Speed of a Computer
Explain the role of each of the following in determining the speed of a computer:
Next to the CPU, RAM is one of the most speed-critical elements of a computer. In most cases, even with a fast CPU, a system without enough RAM will be slow; on such a system, the hard drive activity light will blink constantly as data is swapped in and out. The more RAM a system has, the less it must access the hard drive to swap data, and the longer that hard drive will last. When the CPU requests a file, the hard drive delivers that file into RAM, where it can be manipulated at much greater speed. Clock speed is a measure of how fast a computer completes basic computations and operations. For example, a computer with a clock speed of 800 MHz runs 800,000,000 cycles per second, while a 2.4 GHz computer runs 2,400,000,000 cycles per second. The clock speed of the CPU, however, is not by itself a reliable test of overall computer speed; many other factors come into play. The amount of RAM a computer has, the clock speed of that RAM, the clock speed of the front-side bus, and the cache size all play significant roles in determining performance. CD-ROM drives are rated with a speed factor, where 1x corresponds to a data transfer rate of 150 kilobytes per second in the most common data format. For example, an 8x CD-ROM drive has a data transfer rate of 1.2 megabytes per second. Generally, a speed factor of around 52x is considered excellent. Floppy drives are slow even at their best.
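The two unit conversions above are simple multiplications, sketched here for clarity (the constants mirror the examples in the text; 1 MB is treated as 1,000 KB, as the 8x-to-1.2 MB/s example implies):

```python
BASE_CDROM_RATE_KBPS = 150  # the 1x CD-ROM data transfer rate, in KB/s

def cdrom_transfer_rate_mbps(speed_factor):
    """Transfer rate in megabytes per second for a given speed factor."""
    return speed_factor * BASE_CDROM_RATE_KBPS / 1000

def cycles_per_second(clock_speed_ghz):
    """Convert a clock speed in GHz to raw cycles per second."""
    return int(clock_speed_ghz * 1_000_000_000)

print(cdrom_transfer_rate_mbps(8))    # 8x drive, as in the text
print(cdrom_transfer_rate_mbps(52))   # 52x drive, considered excellent
print(cycles_per_second(2.4))         # the 2.4 GHz example
```

As the text notes, though, these raw rates are only one input to perceived performance; RAM, bus speed, and cache size matter at least as much.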

