
Biotechnology is one of the most innovative branches of science. It has sparked new revolutions in this era through its contributions to industry, medical science, food technology and genetics.

"Biotechnology is basically defined as the use of living organisms, their parts and their biochemical processes for the creation of beneficial products."

Bio-technology has its roots in the distant past, yet it supports large, highly profitable, modern industrial outlets of great value to society, e.g. the fermentation, bio-pharmaceutical and food industries. The main reasons for this must be associated with the rapid advances in molecular biology, in particular recombinant DNA technology, which is now giving bio-scientists a remarkable understanding of, and control over, biological processes.

Some Technologies used in Biotechnology:

1. Bioprocessing technology
* The use of bacteria, yeast, mammalian cells and/or enzymes to manufacture products
* Large-scale fermentation and cell cultures, carried out in huge bioreactors, manufacture useful products
* Products: insulin, vaccines, vitamins, antibiotics, amino acids, etc.

2. Monoclonal antibodies (MCAb)
* Definition: producing antibodies for medicine by cloning a single cell
* MCAb are used for home pregnancy tests
* Used to detect cancer (they bind to tumor cells)
* Used to detect diseases in plants and animals and environmental pollutants

3. Cell culture technology
* Growing cells in containers or large bioreactors
* Plant cell cultures are used to grow genetically engineered plants that contain useful traits, such as resistance to insect pests

4. Tissue engineering technology
* A combination of cell biology and materials science
* Creates semi-synthetic tissues in the laboratory
* Uses natural collagen and synthetic polymers to produce artificial skin
* The goal is to be able to create complex organs as replacements for diseased or injured organs
5. Genetic engineering technology
* Makes use of recombinant DNA technology
* "The recombining of genetic material from two different sources"
* It is the next step, after selective breeding, in changing the genetic makeup of organisms
6. Bioinformatics technology
* Use and organization of information about biology
* Interface of computer science, mathematics and molecular biology
* Objective is to use database management to map and compare genomes, determine protein structure, design drugs, identify genes, etc.
7. DNA chip technology
* A combination of the semiconductor industry and molecular biology
* Consists of tagged DNA on a microchip that can be read using lasers, computers and microscopes
* Allows tens of thousands of genes to be analyzed on a single microchip; used to detect mutations and diagnose genetic diseases
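The genome-comparison and mutation-detection tasks mentioned in the bioinformatics and DNA chip entries above can be sketched in a few lines of code. This is a deliberately simplified illustration (real tools first align sequences of different lengths); the sequences used here are invented for the example.

```python
# Toy sketch of the kind of sequence comparison bioinformatics automates:
# scan two equal-length DNA reads and report every point mutation.
# The sequences below are made up purely for illustration.

def point_mutations(reference, sample):
    """Return (position, ref_base, sample_base) for every mismatch."""
    if len(reference) != len(sample):
        raise ValueError("sequences must be the same length")
    return [(i, r, s)
            for i, (r, s) in enumerate(zip(reference, sample))
            if r != s]

ref    = "ATGGCATTAGC"
sample = "ATGGCGTTAGC"
print(point_mutations(ref, sample))  # [(5, 'A', 'G')]
```

Real genome comparison adds alignment (to handle insertions and deletions) and statistics on top of this basic idea of base-by-base comparison.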


Today, pioneers of biotechnology are discovering new solutions for better feed, food and consumer products. They are building on the knowledge we gained through the scientific innovations of earlier pioneers such as the Egyptians, Christopher Columbus, Louis Pasteur, Gregor Mendel, James Watson and Francis Crick, and Herbert Boyer. See how past discoveries have enhanced quality of life.

2500-2000 BC - Science Along the Nile

Expanding on their understanding of scientific processes, ancient Egyptians innovated with their use of advanced fermentation and breeding practices. Did you know?
* The ancient Egyptians made wine using fermentation techniques based on an understanding of the microbiological processes that occur in the absence of oxygen.
* Egyptians also applied fermentation technologies to make dough rise during bread making. Due in part to this application, there were more than 50 varieties of bread in Egypt more than 4,000 years ago.
* In wetter parts of the Nile Valley, Egyptians also bred geese and cattle to meet their society's nutritional and dietary needs.

1492 - Columbus and Potatoes

Beginning with his first visit to the Americas in 1492, Christopher Columbus and other explorers introduced corn, native to the Americas, to the rest of the world, and European growers adapted the plant to their unique growing conditions. Spanish navigators also returned with potatoes, which are native to the Andes in South America. Two centuries after their European introduction, potatoes were a staple in Ireland, Germany and other European countries.

1864- Pasteurization

In 1864, French chemist Louis Pasteur developed the process named after him and known today as pasteurization, which uses heat to destroy harmful microorganisms in products. The products are then sealed airtight for safety. Pasteur's scientific breakthrough enhanced quality of life, allowing products such as milk to be transported without spoiling.
1865- Mendel and Modern Genetics

In the mid-1800s, Austrian monk, botanist and plant scientist Gregor Mendel carefully studied the principle of heredity. Experimenting with garden peas, Mendel successfully cross-bred traits, such as pea color, plant height and pod size. Mendel showed that differences, such as a plant's height or color, could be attributed to the passing of traits and genes — the basic building blocks of life.

Mendel's innovations went largely unnoticed during his lifetime. After he was elected abbot of his monastery, his focus shifted from science to administrative duties, and his research and findings were only rediscovered by European scientists in 1900, decades after his death.


Early 20th Century - Hybridization

In the early 20th century, agricultural expert Henry Wallace applied the principles of hybridization to develop new, higher-yielding seeds. Wallace went on to apply his scientific innovation to a business model as one of the early leaders of Pioneer Hi-Bred International, Inc., today a DuPont business. A precursor to more advanced cross-breeding and eventually biotechnology, hybridization is the process of crossing plant varieties to produce crops with more favorable traits — or combining genes from two or more varieties of a plant species to produce improved seed. For example, a breeder might eliminate a plant's thorns by cross-breeding with a thornless variety. The often imprecise process of traditional plant breeding takes years to control for desired traits.

1953-Discovery of DNA structure

People didn't know where genes lived until DNA, or deoxyribonucleic acid, was "discovered" or understood in the early 1950s. British scientist Rosalind Franklin's DNA research formed the foundation for James Watson and Francis Crick's 1953 discovery of the structure of DNA, the ladder-like double helix. Watson and Crick perfected the DNA structural model that Franklin explored earlier.

Understanding DNA was essential to the exploration of biotechnology. Cells are the basic unit of living matter in all organisms, and DNA carries the information determining what traits a cell will have. With biotechnology, scientists could express favorable traits by transferring DNA from one organism to another. From the beginning, scientists saw the potential for new drugs designed to help the body do what it couldn't do on its own, or crops able to protect themselves from disease. For example, through built-in protection developed with biotechnology, researchers have created corn plants resistant to rootworm, a beetle-like pest whose early larval stages feed on the plant's roots. Corn rootworm costs farmers around $1 billion every year.

1973-Biotechnology Arrives

In 1973, researchers Stanley Cohen and Herbert Boyer were the first to apply recombinant DNA techniques: working to help people living with diabetes, they lifted genetic material from one organism's DNA and copied it into another's. It's the story of insulin.

The Story of Insulin & Biotechnology. The human body produces insulin to regulate blood sugar levels. Diabetes occurs when the body does not produce insulin or cannot produce enough insulin. People with diabetes often need injections of insulin, which doctors first provided patients through supplies taken from pigs and cows.

However, scientists did not know the long-term effects of having animal insulin in the human body. In 1978, Boyer was able to take pieces of human DNA and isolate the gene for insulin using biotechnology. He then inserted it into bacteria, which reproduced and churned out large quantities of insulin for diabetics. This scientific advancement vastly improved quality of life for many people living with diabetes and removed concerns about the safety of animal-derived insulin.

1980-Today-New Crop Varieties

Biotechnology continues to develop. In the 1980s, testing of biotechnology-derived foods began, and after its FDA approval in 1994, the FlavrSavr tomato gave consumers a more flavorful tomato that stays fresh longer. Soon after that, new soybean and corn crop varieties that protect themselves were introduced. Three years after the FlavrSavr tomato's introduction, 18 biotechnology-derived crops were approved by the U.S. government, and research and development continues to improve agricultural productivity and enhance foods' nutritional value.


A bio-technologist can utilize techniques derived from chemistry, microbiology, bio-chemistry, chemical engineering and computer science. Bio-technologists must also aim to achieve a close working cooperation with experts from other related fields such as medicine, nutrition, the pharmaceutical and chemical industries, environmental protection and waste process technology.

Bio-technology is a demanding industry that requires a skilled workforce and a supportive public to ensure continued growth. The main types of companies involved with bio-technology are:

CATEGORY | BIO-TECHNOLOGY INVOLVEMENT
Diagnostics | Clinical testing and diagnosis, food, environment, agriculture.
Agriculture/Forestry/Horticulture | Novel crops or animal varieties, pesticides.
Food | Wide range of food products, fertilizers, beverages, ingredients.
Environment | Waste treatment, bio-remediation, and energy production.
Chemical Intermediates | Reagents including enzymes, DNA/RNA and specialty chemicals.
Equipment | Hardware, bio-reactors, software and consumables supporting bio-technology.

Table: - Types of companies involved with bio-technology

Inherent in the development of fermentation processes is the growing close relationship between the bio-chemist, the microbiologist and the chemical engineer. Thus, bio-technology is not a sudden discovery but rather a coming of age of a technology that was initiated several decades ago.

The growth in awareness of modern bio-technology parallels the serious worldwide changes in the economic climate arising from the escalation of oil prices since 1973. There is a growing realization that fuels and other non-renewable resources will one day be in limited supply, creating demand for cheaper and more secure energy sources and chemical feedstocks, which bio-technology could perhaps help supply.

New applications are likely to be seen earliest in the area of health care and medicine, followed by agriculture and food technology. Exciting new medical treatments and drugs based on bio-technology are appearing with ever-increasing regularity.

Bio-technology will be increasingly required to meet the global population’s current and future needs for food products that are safe and nutritious while also ensuring a continuous improvement in the efficiency of food production. Bio-technology methods can now improve the nutrition, taste and appearance of plants and various food products, enhance resistance to specific viruses and insect pests, and produce safer herbicides.

ADVANTAGES OF BIO-TECHNOLOGY

(1) Bioinformatics: Makes the rapid organization and analysis of biological data possible via interdisciplinary approaches that address biological problems using computational techniques. The field may also be referred to as computational biology, and can be defined as "conceptualizing biology in terms of molecules, and then applying informatics techniques to understand and organize the information associated with these molecules, on a large scale." Bioinformatics plays a key role in areas such as functional genomics, structural genomics and proteomics, and forms a key component of the biotechnology and pharmaceutical sector.

(2) Blue biotechnology: Marine and aquatic applications of biotechnology, used to improve cleanup of toxic spills, improve yields of fisheries, etc.

(3) Green biotechnology: Agricultural uses of biotechnology, such as the selection and domestication of plants via micropropagation, designing transgenic plants to grow under specific environmental conditions or in the presence (or absence) of certain agricultural chemicals, and the development of more environmentally friendly solutions than traditional industrial agriculture (e.g., engineering a plant to express a pesticide itself, as in Bt corn, thereby eliminating the need for external application of pesticides). Among the benefits are crops with better taste, texture, appearance, aroma, nutrition, yield, robustness in adverse environmental conditions, and resistance to herbicides, fungi and pests.

(4) Red biotechnology: Application of biotechnology to medicine, including the designing of organisms to produce antibiotics and the engineering of genetic cures through genomic manipulation. Other areas include:

(a) Drug production: Genetically altered mammalian cells, such as Chinese Hamster Ovary (CHO) cells, are used to manufacture certain pharmaceuticals. Another promising new biotechnology application is the development of plant-made pharmaceuticals.
A genetically engineered bacterium produces vast quantities of synthetic human insulin at relatively low cost. Biotechnology has also made it possible to cheaply produce human growth hormone, clotting factors for hemophiliacs, fertility drugs, erythropoietin, and other drugs.

(b) Pharmacogenomics: The study of how genetic inheritance affects an individual's response to drugs. The aim is to design tailor-made medicines adapted to each person's genetic makeup, based on the proteins, enzymes and RNA molecules associated with specific genes and diseases, in order to optimize drug dosage, maximize therapeutic effects and decrease damage to nearby healthy cells. Pharmacogenomics should also significantly expedite the drug discovery process.

(c) Gene therapy: Treating or even curing genetic and acquired diseases like cancer and AIDS by using normal genes to supplement or replace defective genes, or to bolster a normal function such as immunity.

(d) Genetic testing: DNA "probes" can be designed that will bind to any mutated sequences in a human genome, flagging the mutation. DNA sequences from a diseased patient can also be compared with those of healthy individuals to determine the genetic cause of a malady (e.g., carrier screening, confirmational diagnosis of symptomatic individuals, forensic/identity testing, newborn screening, prenatal diagnostic screening, presymptomatic testing for estimating the risk of developing disorders).

(e) Improved vaccines: Vaccines can be developed that elicit the immune response without the attendant risks of infection, and that are relatively inexpensive, stable, easy to store, and capable of being engineered to carry several strains of pathogen simultaneously.

(f) Biopharmaceuticals: By using computer-generated images of complex molecules such as proteins, the underlying mechanisms and pathways of a malady can be better understood and targeted.

(g) New medical therapies: Biotechnology has led to treatments for hepatitis B, hepatitis C, cancers, arthritis, haemophilia, bone fractures, multiple sclerosis, and cardiovascular disorders.

(h) Diagnostics:
The biotechnology industry has also been instrumental in developing molecular diagnostic devices that can be used to define the target patient population for a given biopharmaceutical. Herceptin, for example, was the first drug approved for use with a matching diagnostic test and is used to treat breast cancer in women whose cancer cells express the protein HER2.

(5) White biotechnology: Also known as industrial biotechnology. Exemplified by the designing of an organism to produce a useful chemical, the use of enzymes as industrial catalysts to either produce valuable chemicals or destroy hazardous/polluting chemicals, and the development of biotechnological processes that consume fewer resources than traditional processes used to produce industrial goods.

(6) Bioeconomics: Investment in applied biotechnologies to increase economic output.

DISADVANTAGES OF BIO-TECHNOLOGY

(1) Ethical and moral issues: Ethical and moral questions surround cloning and its effects on society.

(2) Loss of privacy: Medical and genetic information is more likely to be stored and shared.

(3) Discrimination: Private insurers, employers and governmental entities are more likely to discriminate against people who have genetic or medical anomalies, especially if such information is available in databases.

(4) Cloning: Reproductive cloning could create "Frankensteins" or result in eugenic practices. Therapeutic cloning is also regarded as unethical by some groups, primarily religious organizations.

(5) Transformation of wild species: Exposure of wild species to genetically modified crops or domestic livestock could cause "super species" to evolve with resistance to pesticides, herbicides or fungicides.

(6) Loss of biodiversity: Development of genetically modified crops or domestic livestock could reduce genetic variety among both domesticated and wild species.

(7) Harmful chemicals: Although biotechnology will generate many new and valuable chemicals, some chemicals with unknown or damaging environmental impacts are likely to be developed.

APPLICATIONS OF BIO-TECHNOLOGY

Bio-technology has applications in four major industrial areas: health care (medical); crop production and agriculture; non-food (industrial) uses of crops and other products (e.g. bio-degradable plastics, vegetable oil, bio-fuels); and environmental uses. For example, one application of bio-technology is the directed use of organisms for the manufacture of organic products (examples include beer and milk products). Another example is the mining industry's use of naturally present bacteria in bio-leaching. Bio-technology is also used to recycle, treat waste and clean up sites contaminated by industrial activities (bio-remediation), and can also be used to produce biological weapons.
a) Pharmacogenomics

Fig: - DNA Microarray chip – Some can do as many as a million blood tests at once
Pharmacogenomics is the study of how the genetic inheritance of an individual affects his/her body’s response to drugs. It is a coined word derived from the words “pharmacology” and “genomics”. It is hence the study of the relationship between pharmaceuticals and genetics. The vision of pharmacogenomics is to be able to design and produce drugs that are adapted to each person’s genetic makeup.
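The idea of adapting a drug to a patient's genetic makeup can be made concrete with a small sketch. The phenotype categories below are real concepts in pharmacogenomics (poor vs. ultrarapid metabolizers), but the dose multipliers are invented for illustration and are not clinical guidance.

```python
# Illustrative-only sketch of pharmacogenomic dose tailoring: map a patient's
# metabolizer phenotype (inferred from their genotype) to a dose adjustment.
# The multiplier values are made up for this example, NOT clinical guidance.

DOSE_FACTOR = {
    "poor": 0.5,          # drug cleared slowly: lower dose to avoid toxicity
    "intermediate": 0.75,
    "normal": 1.0,
    "ultrarapid": 1.5,    # drug cleared quickly: standard dose may be ineffective
}

def adjusted_dose(standard_dose_mg, phenotype):
    """Scale the standard dose by the phenotype's adjustment factor."""
    try:
        return standard_dose_mg * DOSE_FACTOR[phenotype]
    except KeyError:
        raise ValueError(f"unknown metabolizer phenotype: {phenotype!r}")

print(adjusted_dose(100, "poor"))  # 50.0
```

In practice, the genotype-to-phenotype and phenotype-to-dose mappings come from published dosing guidelines, not a hand-written table like this one.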
Pharmacogenomics results in the following benefits:

* Development of tailor-made medicines. Using pharmacogenomics, pharmaceutical companies can create drugs based on the proteins, enzymes and RNA molecules that are associated with specific genes and diseases. These tailor-made drugs promise not only to maximize therapeutic effects but also to decrease damage to nearby healthy cells.
* More accurate methods of determining appropriate drug dosages. Knowing a patient's genetics will enable doctors to determine how well his/her body can process and metabolize a medicine. This will maximize the value of the medicine and decrease the likelihood of overdose.

* Improvements in the drug discovery and approval process. The discovery of potential therapies will be made easier using genome targets. Genes have been associated with numerous diseases and disorders. With modern bio-technology, these genes can be used as targets for the development of effective new therapies, which could significantly shorten the drug discovery process.

* Better vaccines. Safer vaccines can be designed and produced by organisms transformed by means of genetic engineering. These vaccines will elicit the immune response without the attendant risks of infection. They will be inexpensive, stable, easy to store, and capable of being engineered to carry several strains of pathogen at once.

b) Pharmaceutical products

Most traditional pharmaceutical drugs are relatively simple molecules that have been found, primarily through trial and error, to treat the symptoms of a disease or illness. Bio-pharmaceuticals are large biological molecules known as proteins, and these usually target the underlying mechanisms and pathways of a malady (though not always: using insulin to treat type 1 diabetes mellitus merely addresses the symptoms of the disease, not its underlying cause, which is autoimmunity). The bio-pharmaceutical industry is relatively young. Its products can reach targets in humans that may not be accessible with traditional medicines. A patient is typically dosed with a small molecule via a tablet, while a large molecule is typically injected.

Fig: - Image of insulin hexamers highlighting the threefold symmetry, the zinc ions holding it together, and the histidine residues involved in zinc binding.
Small molecules are manufactured by chemical synthesis, but larger molecules are created by living cells: for example, bacteria, yeast, or animal or plant cells.

Modern bio-technology is often associated with the use of genetically altered microorganisms such as E. coli or yeast for the production of substances like synthetic insulin or antibiotics. It can also refer to transgenic animals or transgenic plants, such as Bt corn. Genetically altered mammalian cells, such as Chinese Hamster Ovary (CHO) cells, are also used to manufacture certain pharmaceuticals. Another promising new bio-technology application is the development of plant-made pharmaceuticals.

Bio-technology is also commonly associated with landmark breakthroughs in new medical therapies to treat hepatitis B, hepatitis C, cancers, arthritis, hemophilia, bone fractures, multiple sclerosis, and cardiovascular disorders. The bio-technology industry has also been instrumental in developing molecular diagnostic devices that can be used to define the target patient population for a given bio-pharmaceutical. Herceptin, for example, was the first drug approved for use with a matching diagnostic test and is used to treat breast cancer in women whose cancer cells express the protein HER2.

Modern bio-technology can be used to manufacture existing medicines relatively easily and cheaply. The first genetically engineered products were medicines designed to treat human diseases. To cite one example, in 1978 Genentech produced synthetic human insulin by inserting the insulin gene, carried on a plasmid vector, into the bacterium Escherichia coli. Insulin, widely used for the treatment of diabetes, was previously extracted from the pancreases of abattoir animals (cattle and/or pigs).

The resulting genetically engineered bacterium enabled the production of vast quantities of synthetic human insulin at relatively low cost. According to a 2003 study undertaken by the International Diabetes Federation (IDF) on the access to and availability of insulin in its member countries, synthetic 'human' insulin is considerably more expensive in most countries where both synthetic 'human' and animal insulin are commercially available: e.g. within European countries the average price of synthetic 'human' insulin was twice as high as the price of pork insulin. Yet in its position statement, the IDF writes that "there is no overwhelming evidence to prefer one species of insulin over another" and that modern, highly purified animal insulin remains a perfectly acceptable alternative.

Modern bio-technology has evolved, making it possible to produce more easily and relatively cheaply human growth hormone, clotting factors for hemophiliacs, fertility drugs, erythropoietin and other drugs. Most drugs today are based on about 500 molecular targets. Genomic knowledge of the genes involved in diseases, disease pathways, and drug-response sites are expected to lead to the discovery of thousands more new targets.

c) Genetic testing

Genetic testing involves the direct examination of the DNA molecule itself. A scientist scans a patient’s DNA sample for mutated sequences.

There are two major types of gene tests. In the first type, a researcher may design short pieces of DNA ("probes") whose sequences are complementary to the mutated sequences. These probes seek their complement among the base pairs of an individual's genome. If the mutated sequence is present in the patient's genome, the probe will bind to it and flag the mutation. In the second type, a researcher compares the sequence of DNA bases in a patient's gene with the normal version of the gene found in healthy individuals.
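The first type of test described above rests on base-pairing: a probe "binds" where its complementary sequence occurs in the genome. A minimal sketch of that logic, with sequences invented for illustration:

```python
# Toy model of a DNA probe test: the probe is the complement of a known
# mutated sequence, so it "binds" (its complementary target is found) only
# if that mutation is present. Sequences here are invented for illustration.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Complement each base and reverse, as in double-stranded pairing."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def probe_binds(genome, probe):
    """True if the probe's complementary target occurs in the genome."""
    return reverse_complement(probe) in genome

mutated_site = "AGGCT"                     # known disease-associated sequence
probe = reverse_complement(mutated_site)   # probe is its complement
print(probe_binds("TTAGGCTCCA", probe))    # True: mutation present, probe binds
print(probe_binds("TTAGACTCCA", probe))    # False: mutation absent
```

Real probes are much longer (tens of bases) so that a match is statistically unambiguous, and binding is a chemical hybridization event rather than a substring search, but the underlying complementarity principle is the same.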
Genetic testing is now used for:

* Carrier screening, or the identification of unaffected individuals who carry one copy of a gene for a disease that requires two copies for the disease to manifest;
* Confirmational diagnosis of symptomatic individuals;
* Determining sex;
* Forensic/identity testing;
* Newborn screening;
* Prenatal diagnostic screening;
* Presymptomatic testing for estimating the risk of developing adult-onset cancers;
* Presymptomatic testing for predicting adult-onset disorders.

Some genetic tests are already available, although most of them are used in developed countries. The tests currently available can detect mutations associated with rare genetic disorders like cystic fibrosis, sickle cell anemia, and Huntington's disease. Recently, tests have been developed to detect mutations for a handful of more complex conditions such as breast, ovarian, and colon cancers. However, gene tests may not detect every mutation associated with a particular condition because many are as yet undiscovered, and the ones they do detect may present different risks to different people and populations.

d) Gene therapy

Gene therapy may be used for treating, or even curing, genetic and acquired diseases like cancer and AIDS by using normal genes to supplement or replace defective genes, or to bolster a normal function such as immunity. It can be used to target somatic (body) cells or gametes (egg and sperm cells). In somatic gene therapy, the genome of the recipient is changed, but the change is not passed along to the next generation. In contrast, in germline gene therapy, the egg and sperm cells of the parents are changed for the purpose of passing on the changes to their offspring.

Fig: - Gene therapy

There are basically two ways of implementing a gene therapy treatment:

1. Ex vivo, which means "outside the body" – Cells from the patient's blood or bone marrow are removed and grown in the laboratory. They are then exposed to a virus carrying the desired gene. The virus enters the cells, and the desired gene becomes part of the DNA of the cells. The cells are allowed to grow in the laboratory before being returned to the patient by injection into a vein.
2. In vivo, which means "inside the body" – No cells are removed from the patient's body. Instead, vectors are used to deliver the desired gene to cells in the patient's body.

As of June 2001, more than 500 clinical gene-therapy trials involving about 3,500 patients had been identified worldwide. Around 78% of these were in the United States, with Europe having 18%. These trials focus on various types of cancer, although other multigenic diseases are being studied as well. Recently, two children born with severe combined immunodeficiency disorder ("SCID") were reported to have been cured after being given genetically engineered cells.
Gene therapy faces many obstacles before it can become a practical approach for treating disease. At least four of these obstacles are as follows:

* Gene delivery tools: Genes are inserted into the body using carriers called vectors. The most common vectors are viruses, which have evolved a way of encapsulating and delivering their genes to human cells in a pathogenic manner. Scientists manipulate the genome of the virus by removing the disease-causing genes and inserting the therapeutic genes. However, while viruses are effective, they can introduce problems like toxicity, immune and inflammatory responses, and gene control and targeting issues. In addition, in order for gene therapy to provide permanent therapeutic effects, the introduced gene needs to be integrated into the host cell's genome. Some viral vectors do this in a random fashion, which can introduce other problems such as disruption of an endogenous host gene.
* High costs: Since gene therapy is relatively new and still experimental, it is an expensive treatment to undertake. This explains why current studies are focused on illnesses commonly found in developed countries, where more people can afford to pay for treatment. It may take decades before developing countries can take advantage of this technology.
* Limited knowledge of the functions of genes: Scientists currently know the functions of only a few genes. Hence, gene therapy can address only some of the genes that cause a particular disease. Worse, it is not known whether some genes have more than one function, which creates uncertainty as to whether replacing such genes is indeed desirable.
* Multi-gene disorders and effect of environment: Most genetic disorders involve more than one gene, and most diseases involve the interaction of several genes and the environment. For example, many people with cancer not only inherit the disease gene for the disorder, but may also have failed to inherit specific tumor suppressor genes. Diet, exercise, smoking and other environmental factors may also have contributed to their disease.


Genetic modification involves the insertion or deletion of genes. In the process of cisgenesis, genes are artificially transferred between organisms that could be conventionally bred. In the process of transgenesis, genes from a different species are inserted, which is a form of horizontal gene transfer. In nature this can occur when exogenous DNA penetrates the cell membrane for any reason. To do this artificially may require transferring genes as part of an attenuated virus genome or physically inserting the extra DNA into the nucleus of the intended host using a micro syringe, or as a coating on gold nanoparticles fired from a gene gun. However, other methods exploit natural forms of gene transfer, such as the ability of Agrobacterium to transfer genetic material to plants, and the ability of lentiviruses to transfer genes to animal cells.
The main advantages are listed as follows:

* Crop yield: Using the techniques of modern bio-technology, one or two genes (Smartstax from Monsanto, in collaboration with Dow AgroSciences, will use 8, starting in 2010) may be transferred to a highly developed crop variety to impart a new character that would increase its yield. However, while increased crop yield is the most obvious application of modern bio-technology in agriculture, it is also the most difficult one. Current genetic engineering techniques work best for traits controlled by a single gene. Many of the genetic characteristics associated with yield (e.g., enhanced growth) are controlled by a large number of genes, each of which has a minimal effect on the overall yield. There is, therefore, much scientific work to be done in this area.

* Reduced vulnerability of crops to environmental stresses: Crops containing genes that enable them to withstand biotic and abiotic stresses may be developed. For example, drought and excessively salty soil are two important limiting factors in crop productivity. Bio-technologists are studying plants that can cope with these extreme conditions in the hope of finding the genes that enable them to do so and eventually transferring these genes to more desirable crops. One of the latest developments is the identification of a plant gene, At-DBF2, from Arabidopsis thaliana, a tiny weed often used for plant research because it is very easy to grow and its genetic code is well mapped out. When this gene was inserted into tomato and tobacco cells (see RNA interference), the cells were able to withstand environmental stresses such as salt, drought, cold and heat far better than ordinary cells. If these preliminary results prove successful in larger trials, At-DBF2 genes could help in engineering crops that better withstand harsh environments. Researchers have also created transgenic rice plants that are resistant to rice yellow mottle virus (RYMV). In Africa, this virus destroys the majority of the rice crop and makes the surviving plants more susceptible to fungal infections.

* Increased nutritional qualities: Proteins in foods may be modified to increase their nutritional qualities. Proteins in legumes and cereals may be transformed to provide the amino acids needed by human beings for a balanced diet. A good example is the work of Professors Ingo Potrykus and Peter Beyer in creating Golden Rice.

* Improved taste, texture or appearance of food: Modern bio-technology can be used to slow down the process of spoilage so that fruit can ripen longer on the plant and still be transported to the consumer with a reasonable shelf life. This alters the taste, texture and appearance of the fruit. More importantly, it could expand the market for farmers in developing countries due to the reduction in spoilage. However, there is sometimes a lack of understanding by researchers in developed countries about the actual needs of prospective beneficiaries in developing countries. For example, engineering soybeans to resist spoilage makes them less suitable for producing tempeh, a significant source of protein that depends on fermentation; the modified soybeans give a lumpy texture that is less palatable and less convenient when cooking. The first genetically modified food product was a tomato transformed to delay its ripening. Researchers in Indonesia, Malaysia, Thailand, the Philippines and Vietnam are currently working on delayed-ripening papaya in collaboration with the University of Nottingham and Zeneca.

* Bio-technology in cheese production: Enzymes produced by micro-organisms provide an alternative to animal rennet – a cheese coagulant – and an alternative supply for cheese makers. This also eliminates possible public concerns over animal-derived material, although there are currently no plans to develop synthetic milk, which makes this argument less compelling. Enzymes offer an animal-friendly alternative to animal rennet; while providing comparable quality, they are theoretically also less expensive. Similar enzyme applications exist in baking: about 85 million tons of wheat flour is used every year to bake bread. By adding an enzyme called maltogenic amylase to the flour, bread stays fresh longer. Assuming that 10–15% of bread is thrown away as stale, if bread could be made to stay fresh another 5–7 days then perhaps 2 million tons of flour per year would be saved. Other enzymes can make bread expand into a lighter loaf, or alter the loaf in a range of other ways.

* Reduced dependence on fertilizers, pesticides and other agrochemicals: Most current commercial applications of modern bio-technology in agriculture aim at reducing the dependence of farmers on agrochemicals. For example, Bacillus thuringiensis (Bt) is a soil bacterium that produces a protein with insecticidal qualities. Traditionally, a fermentation process has been used to produce an insecticidal spray from these bacteria. In this form, the Bt toxin occurs as an inactive protoxin, which requires digestion by an insect to become effective. There are several Bt toxins, and each one is specific to certain target insects. Crop plants have now been engineered to contain and express the genes for Bt toxin, which they produce in its active form. When a susceptible insect ingests a transgenic crop cultivar expressing the Bt protein, it stops feeding and soon thereafter dies as a result of the Bt toxin binding to its gut wall. Bt corn is now commercially available in a number of countries to control corn borer (a lepidopteran insect), which is otherwise controlled by spraying (a more difficult process). Crops have also been genetically engineered to acquire tolerance to broad-spectrum herbicides. The lack of herbicides with broad-spectrum activity and no crop injury was a consistent limitation in crop weed management: multiple applications of numerous herbicides were routinely used to control a wide range of weed species detrimental to agronomic crops; weed management tended to rely on preemergence applications – herbicides sprayed in anticipation of expected weed infestations rather than in response to weeds actually present – and mechanical cultivation and hand weeding were often necessary to control weeds not controlled by herbicide applications. The introduction of herbicide-tolerant crops has the potential to reduce the number of herbicide active ingredients used for weed management, reduce the number of herbicide applications made during a season, and increase yield through improved weed management and less crop injury. Transgenic crops that express tolerance to glyphosate, glufosinate and bromoxynil have been developed; these herbicides can now be sprayed on transgenic crops without inflicting damage on the crops while killing nearby weeds. From 1996 to 2001, herbicide tolerance was the most dominant trait introduced into commercially available transgenic crops, followed by insect resistance. In 2001, herbicide tolerance deployed in soybean, corn and cotton accounted for 77% of the 626,000 square kilometres planted to transgenic crops; Bt crops accounted for 15%; and "stacked genes" for herbicide tolerance and insect resistance, used in both cotton and corn, accounted for 8%.

* Production of novel substances in crop plants: Bio-technology is being applied to novel uses other than food. For example, oilseed can be modified to produce fatty acids for detergents, substitute fuels and petrochemicals. Potatoes, tomatoes, rice, tobacco, lettuce, safflowers and other plants have been genetically engineered to produce insulin and certain vaccines. If future clinical trials prove successful, the advantages of edible vaccines would be enormous, especially for developing countries. Homegrown vaccines would avoid the logistical and economic problems posed by having to transport traditional preparations over long distances and keep them cold while in transit. And since they are edible, they will not need syringes, which are not only an additional expense in traditional vaccine preparations but also a source of infection if contaminated. In the case of insulin grown in transgenic plants, it is well established that the gastrointestinal system breaks the protein down, so it could not currently be administered as an edible protein. However, it might be produced at significantly lower cost than insulin made in costly bio-reactors. For example, Calgary, Canada-based SemBioSys Genetics, Inc. reports that its safflower-produced insulin will reduce unit costs by 25% or more and estimates a reduction of over $100 million in the capital costs of building a commercial-scale insulin manufacturing facility, compared with traditional bio-manufacturing facilities.
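The quantitative claims in the list above lend themselves to a quick back-of-envelope check. The sketch below (Python) uses only the figures quoted in the text; note that the "2 million tons saved" estimate is the essay's own figure, not a derived value:

```python
# Bread staling: 85 million tons of wheat flour baked per year,
# of which an assumed 10-15% is thrown away as stale bread.
flour_total_mt = 85.0                 # million tons per year (from the text)
stale_low, stale_high = 0.10, 0.15    # assumed stale fraction (from the text)
wasted_low = flour_total_mt * stale_low
wasted_high = flour_total_mt * stale_high
print(f"Flour lost to staling: {wasted_low:.1f}-{wasted_high:.1f} million tons/year")
# The text's estimate of ~2 million tons saved therefore corresponds to
# preventing roughly a fifth of that waste, not all of it.

# 2001 transgenic crop area: 626,000 km2, split 77% herbicide-tolerant,
# 15% Bt (insect-resistant) and 8% stacked traits.
total_km2 = 626_000
shares = {
    "herbicide tolerance": 0.77,
    "Bt insect resistance": 0.15,
    "stacked genes": 0.08,
}
for trait, share in shares.items():
    print(f"{trait}: {share * total_km2:,.0f} km2")
```

The three shares sum to 100%, so the per-trait areas partition the quoted total exactly.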
The presence and functioning of microbial communities affect our everyday lives in many ways, but none more so than through their role in soil, waste and water management. Historically, the need to supply populations with safe drinking water and acceptable sewage disposal has mainly been the concern of sanitary engineers. (Engineering-driven solutions to these basic sanitary problems of communities were evolving long before there was any appreciation of the intrinsic roles of microorganisms.) More recently, the skills and knowledge of the microbiologist are increasingly being employed to develop new systems. Microbial ecology is the science that studies the interrelationships between microorganisms and their living (biotic) and non-living (abiotic) environments. The increasing scientific and public awareness of microbial ecology since the 1960s derives mainly from the recognition of the central role of microorganisms in maintaining good environmental quality. It is the microbes, in their multifarious forms, that largely direct the orderly flow of materials and energy (bio-geochemical cycles) through the world’s ecosystems by way of their immense and varied metabolic abilities to transform inorganic and organic materials.

Bio-degradation can be defined as the decomposition of substances by microbial activities either by single organisms or, most often, by microbial consortia. Microorganisms found in soil and water will attempt to utilise any organic substances encountered as sources of energy and carbon by enzymatically breaking them down into simple molecules that can be absorbed and used. Under suitable environmental conditions, all natural organic compounds should be degraded.

Fig.: Natural microbial bio-degradation of organic molecules.
And, for this reason, large-scale deposits of naturally formed organic compounds are rarely observed. When such organic deposits do occur, e.g. coal and oil, it has been under conditions that are hostile to bio-degradation. Environmental bio-technology includes the application of biological systems and processes in waste treatment and management. Many successful bio-technological processes have now been developed for water, gas, soil and solid waste treatments. Modern developments in environmental bio-technology now focus on process optimisation and will no longer accept processes which are inefficient or which merely transform one problem into another, e.g. the formation of carcinogenic nitrosamine compounds by the reaction of some microorganisms with organic amines and nitrite. Environmental safety should not be threatened by the treatment processes themselves. Organic chemicals that cannot easily be degraded by microorganisms, or are indeed totally resistant to attack, are termed ‘recalcitrant’, e.g. lignin. Xenobiotics are man-made synthetic compounds not formed by natural bio-synthetic processes and, in many cases, can be recalcitrant. A xenobiotic compound is, therefore, a foreign substance in our ecosystem and may often have toxic effects. All environmental bio-technological processes make use of the metabolic (degradative and anabolic) activities of microorganisms, demonstrating, again, the indispensable nature of microbes in our ecosystem.

The term ‘bio-degradable’ is often loosely identified with the term ‘environmentally friendly’, and numerous advertising campaigns and product packaging have put across the message that, because a product is bio-degradable, its impact on the environment will be dramatically reduced. This is not always the case, and demonstrations of product bio-degradability have often only been achieved under highly conducive microbial conditions that are not easily met in the natural environment. Furthermore, bio-degradability itself is a complex multi-factorial event whose mechanisms are not completely understood.


Many microorganisms can infect humans, animals and plants and cause disease. Successful establishment of disease results from interactions between the host and the causal organism. Many factors are involved, only a few of which are well understood. Most microorganisms used by industry are harmless and many are indeed used directly for the production of human or animal foods.

Their safety is well documented from long associations lasting up to hundreds of years. Only a small number of potentially dangerous microorganisms have been used by industry in the manufacture of vaccines or diagnostic reagents, e.g. Bordetella pertussis (whooping cough), Mycobacterium tuberculosis (tuberculosis) and the virus of foot-and-mouth disease. Stringent containment practices have been the norm when these microorganisms are used. In recent years there have been many scientific advances permitting alterations to the genetic make-up of microorganisms. Recombinant DNA techniques have been the most successful but have also been the cause of much concern to the public. However, this natural anxiety has been ameliorated by several compelling lines of evidence:

1. Risk-assessment studies have failed to demonstrate that host cells can acquire novel hazardous properties from DNA donor cells.
2. More rigorous evaluation of existing information concerning basic immunology, pathogenicity and infectious disease processes has led to relaxation of containment specifications previously set down.
3. Considerable experimentation has shown no observable hazard.

While there is a vast amount of evidence that the application of genetic engineering is safe and that the bio-technological developments with plants and animals are being applied responsibly and safely, there are still some bodies of opinion that seek draconian bio-safety protocols based on conjectured potential consequences of genetic engineering.

Never has a new technology been more thoroughly scientifically scrutinized than in these new areas of bio-technology. Many of the opponents use inflammatory and totally unscientific reasoning in their attempts to derail this potentially valuable technology. Scientific research on safety aspects of this technology will continue to be an important and continuing issue. The European Federation of Bio-technology Working Party on Bio-safety has now been established to provide recommendations on safety aspects of bio-technology with respect to the environment, the public, personnel and product, to include:

* Identifying and monitoring hazards associated with various applications in bio-technology;
* Assessing and quantifying risks;
* Providing an international platform for issues related to safety in bio-technology;
* Producing statements and recommendations (based on science and technology);
* Identifying areas of insufficient knowledge and inadequate technology with respect to safety in bio-technology, and proposing research and development in such areas;
* Assisting in the implementation of the recommendations and guidelines in bio-technology.

There is a growing worldwide concern that pathogenic microorganisms may be used in acts of urban terrorism. All major nations have, at some time, run major research programmes on biological warfare. While the use of bio-weapons is prohibited under the Biological and Toxic Weapon Convention, there can be no doubt that some rogue nations have not subscribed to this ban and, while bio-warfare is unlikely, the availability of such potential pathogenic bio-agents could lead to acts of bio-terrorism. Potential biological agents have been assigned to three categories (Centre for Disease Control, Atlanta, USA):

Category A agents include the most serious – smallpox, anthrax, plague, botulism, tularaemia and viral haemorrhagic fevers such as Ebola;

Category B agents have a similar potential for large-scale dissemination but generally cause less serious illnesses – typhus, brucellosis and food poisoning agents such as Salmonella and E. coli O157;

Category C agents include novel infectious diseases which could emerge as future threats.
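The three-tier classification above can be summarised as a small lookup table. The sketch below is purely illustrative (the agent lists are abbreviated to the examples named in the text, and category C has no fixed agent list by definition):

```python
# CDC bio-agent categories as described in the text (examples only,
# not an exhaustive or authoritative list).
CDC_CATEGORIES = {
    "A": {
        "description": "most serious threats",
        "examples": ["smallpox", "anthrax", "plague", "botulism",
                     "tularaemia", "viral haemorrhagic fevers (e.g. Ebola)"],
    },
    "B": {
        "description": "large-scale dissemination potential, less serious illness",
        "examples": ["typhus", "brucellosis", "Salmonella", "E. coli O157"],
    },
    "C": {
        "description": "novel infectious diseases that could emerge as future threats",
        "examples": [],  # defined by emergence, not by a fixed agent list
    },
}

def category_of(agent):
    """Return the category letter whose example list names this agent, else None."""
    for letter, info in CDC_CATEGORIES.items():
        if any(agent.lower() in ex.lower() for ex in info["examples"]):
            return letter
    return None

print(category_of("anthrax"))  # -> A
```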

Furthermore, there is also potential for bio-agents to target farm animals and crops, which could cause devastating economic effects. Aflatoxin (a fungal-derived poison, or mycotoxin) is a serious human carcinogen and has been identified as a potential biological weapon for food and water contamination. The production of most of these microorganisms is relatively straightforward when suitable fermentation equipment is available together with the appropriate containment facilities. Final delivery of the microorganism can be problematic for large-scale bio-warfare; the most appropriate means of dispersal would be as an aerosol or by contaminating food or water supplies at a local level. The quite serious potential of this new aspect of bio-terrorism has created considerable concern among public health authorities. Questions arise as to how quickly these bio-weapons can be identified and what form of rapid treatment can be administered. The creation of massive stockpiles of appropriate vaccines and antibiotics must be set in motion, as has already taken place in the USA, which has also allocated substantial funds to research on bio-weaponry.


A central feature of new bio-technological advances derives from an increasing understanding of the mechanisms of life and how these will eventually transform human lives as well as give a deeper appreciation of agriculture, aquaculture, forestry and the biological environment. The ability to select and manipulate genetic material within and outwith species has permitted unprecedented opportunities to alter life forms for the benefit of society. The successful sequencing of the human and other genomes marks the beginning of a new scientific period of discovery. However, rather than genomic sequences being an end in themselves, they are but the beginning of scientific study to put the information into context with regard to its biological significance to the organism. Currently, sequence data have been used to identify species, to derive evolutionary linkages and to study the basis of organism diversity. Many molecular biologists have postulated that a genetic or DNA sequence analysis of an individual could be predictive of future disease occurrence, e.g. cardiovascular disease, cancer and Alzheimer’s disease. This has generated much interest, especially among insurance companies. However, to rely on sequence analysis alone would be insufficient, since this would not take into consideration all the multifarious adaptive systems of the living organism as well as the environmental input throughout an individual’s lifespan. Nevertheless, new microarray technology, in which thousands of single-nucleotide polymorphisms can be analyzed, together with advances in proteomics, may well give a meaningful patient read-out on potential susceptibility and early diagnosis of an impending problem, allowing much earlier medical or lifestyle intervention.

Recombinant DNA technology applied to mammalian cell cultures has produced many recombinant proteins, e.g. insulin and recombinant vaccines, which are now bringing considerable medical benefit to a wide range of human diseases. Undoubtedly, there will be continued research and application in this area. The present applications of genetic engineering technology to the life sciences, through apparently revolutionary techniques, are as nothing compared to what will evolve in the future. The further implementation of genomics and proteomics will allow a much deeper understanding of the biology of molecules, cells and whole organisms. Doctors and patients will have much to gain from the outcome of these studies. Much will be learned about human individuality and how these findings could influence individual health and disease susceptibility.

Plant-based genetic engineering did not really start until the early 1980s, with the development of the Ti plasmid of Agrobacterium tumefaciens, which has allowed the introduction of simple genetic constructs into most of the important crop plants. These processes are now relatively routine, and the changes made within the plants have been so slight that highly sophisticated bio-chemical assays are required to distinguish genetically modified varieties from their predecessors. Notwithstanding the large and growing body of evidence that the application of plant genetic engineering is safe and that development has always been applied responsibly and safely, there has been a small but highly organized and vociferous opposition to the application of the technology.

The application of any new technology is often fraught with public misconception and mistrust of scientific opinion. No technology can ever be free of risk and, in our present affluent developed world, perfection is now the expectation. Irresponsible media ‘experts’ (often with no scientific experience) lead the public to expect zero impact and risk from the new technological innovations of plant genetic engineering, and if any slight deficiency – real or imagined – is detected, they will propose the complete condemnation of the practice. The demand for moratoria or outright banning of GM food products, particularly in Europe, has its origins in inflammatory and unscientific phrases such as ‘biological pollution’ and ‘Frankenstein food’ and erroneous comparisons made with BSE, foot and mouth disease and nuclear power plants. The scientific community involved in GM food studies has shown a level of caution that has been lacking in most other new technologies. The basic work is routine and well established. There are few, if any, real risks associated with genetic engineering of crop plants that could in any way compete with the hazards that society presently accepts in order to uphold current ways of life, e.g. transport, smoking, alcohol, and many others. The new aspects of bio-technology, such as transgenic plants and animals, recombinant proteins and vaccines, will bring huge benefits to mankind but not without generating concern in some sections of the population. Bio-technologists, in general, stand accused of not communicating with the lay public, largely because the scientists have mostly been unable or unwilling to take the time to explain in simple, understandable language the basic principles of the science involved. They must also be more circumspect in the claims made for the future outcome of their studies. There are still vast areas of biological knowledge that must be deciphered before most speculative projects can be achieved. 
The time-scale for accomplishment must also be more realistic. Although there is now a vast reservoir of relevant biological and engineering knowledge and expertise waiting to be put into productive bio-technological use, the eventual rate of application will be determined not primarily by science and technology but, rather, by many other equally important factors such as industrial investment policies, the establishment of market needs, the economics of the marketing skills needed to introduce new products into commercial use and, above all, how the public perceives this new range of innovative technologies.

Bio-technology will play a major role in the continued search for solutions to the many problems that will affect the society of tomorrow, namely health, food supply and a safe biological environment. Continued scientific research will be paramount to achieving these ends, proceeding with what Louis Pasteur recognised as the inexorable nature of scientific studies.

Similar Documents

Premium Essay

Operating System

...CMOS A complementary metal oxide semiconductor (CMOS) is a type of integrated circuit technology. The term is often used to refer to a battery-powered chip found in many personal computers that holds some basic information, including the date and time and system configuration settings, needed by the basic input/output system (BIOS) to start the computer. This name is somewhat misleading, however, as most modern computers no longer use CMOS chips for this function, but instead depend on other forms of non-volatile memory. CMOS chips are still found in many other electronic devices, including digital cameras. In a computer, the CMOS controls a variety of functions, including the Power On Self Test (POST). When the computer’s power supply fires up, CMOS runs a series of checks to make sure the system is functioning properly. One of these checks includes counting up random access memory (RAM). This delays boot time, so some people disable this feature in the CMOS settings, opting for a quick boot. If installing new RAM it is better to enable the feature until the RAM has been checked. Ad Once POST has completed, CMOS runs through its other settings. Hard disks and formats are detected, along with Redundant Array of Independent Disk (RAID) configurations, boot preferences, the presence of peripherals, and overclocking tweaks. Many settings can be manually changed within the CMOS configuration screen to improve performance; however, changes should be made by experienced......

Words: 747 - Pages: 3

Free Essay

History of Bios and Cmos

...William Rivas 02-09-2014 MR. Jones NT1110 A History of BIOS and CMOS The relationship between the BIOS and CMOS is important to the proper functionality of any computer. The BIOS is an integrated circuit which tells the CPU or Processor how to act. BIOS is neither hardware or software and is called firmware. Firmware is essentially software on a “chip” or integrated circuit, “chip” being the slang term. The BIOS is the “network administrator of each individual computer”, in other words, it is the reason all the physical parts i.e. motherboard, keyboard , cd drive, monitor, etcetera are able to communicate with each other. The CMOS chip or Complimentary Metal Oxide Semiconductor chip is a different integrated circuit in which the BIOS is dependent upon for storage of computer configuration settings. CMOS memory is attached to the motherboard upon assembly at the factory and uses DC power, from a battery to store BIOS settings. It is not the same as RAM (Random Access Memory) which is used by the Operating System to access instructions from different software added by the end user to perform whatever function desired. This type of memory is lost when power is shut down on the computer. The history of the CMOS appears to begin somewhere around 1963 in a conference paper by C.T. Sah and Frank Wanlass. In 1965 RCA and Somerville Manufacturing pioneered the production of CMOS technology. IN 1968 they created what would prove to be the forerunner of engine control......

Words: 408 - Pages: 2

Free Essay


...BIOS (basic input/output system) is the program a personal computer's microprocessor uses to get the computer system started after you turn it on. It also manages data flow between the computer's operating system and attached devices such as the hard disk , video adapter , keyboard , mouse , and printer . BIOS is an integral part of your computer and comes with it when you bring it home. (In contrast, the operating system can either be pre-installed by the manufacturer or vendor or installed by the user.) BIOS is a program that is made accessible to the microprocessor on an erasable programmable read-only memory (EPROM) chip. When you turn on your computer, the microprocessor passes control to the BIOS program, which is always located at the same place on EPROM. When BIOS boots up (starts up) your computer, it first determines whether all of the attachments are in place and operational and then it loads the operating system (or key parts of it) into your computer's random access memory (RAM) from your hard disk or diskette drive. With BIOS, your operating system and its applications are freed from having to understand exact details (such as hardware addresses) about the attached input/output devices. When device details change, only the BIOS program needs to be changed. Sometimes this change can be made during your system setup. In any case, neither your operating system or any applications you use need to be changed. Although BIOS is theoretically always the intermediary...

Words: 821 - Pages: 4

Free Essay


...User Guide EVGA nForce 780i SLI Motherboard 780i 3-Way SLI Motherboard EVGA ii nForce 780i SLI Motherboard Table of Contents Before You Begin… ..................................................................................................... ix Parts NOT in the Kit .................................................................................................ix Intentions of the Kit ...................................................................................................x EVGA nForce 780i Motherboard..................................................................................1 Motherboard Specifications...................................................................................... 1 Unpacking and Parts Descriptions...............................................................................4 Unpacking ................................................................................................................ 4 Equipment ................................................................................................................ 4 EVGA nForce 780i SLI Motherboard ....................................................................... 5 Hardware Installation ....................................................................................................9 Safety Instructions.................................................................................................... 9 Preparing the Motherboard .........

Words: 12933 - Pages: 52

Free Essay


...correct bootloader certification is needed and database key authentication is also required before the booting process. As a result, rootkit or other malware program have a hard time hijacking the boot process and concealing itself from the operating system. This paper will focus on the analysis of UEFI's secure boot feature and its implications and challenges for digital investigators conducting computer forensic investigation. Keywords: UEFI secure boot, boot firmware, malware, rootkit. Introduction To meet the demands of and preference for faster and more powerful computer most users want, hard disk drive manufacturers produce disks in excess of 2 TB. Moreover, most personal computers using the legacy Basic Input / Output System (BIOS) with master boot record (MBR) only allows a maximum disk size of approximately 2.2 TB and a maximum of four primary partitions to be run...

Words: 1677 - Pages: 7

Free Essay


...1.why is all data stored in a computer in binary form? Computers are only able to read and store data in binary form, 1 or 0, on or off, yes or no, voltage or none. Binary is the simplest way to manage information. 2. What are the four primary functions of hardware? input, output, PROCESS, and Storage 3. What are the two main input devices and two main output devices? the mouse, keyboard, printer, and monitor. 4. What three things do electronic hardware devices need in order to function? power, ground return, and load. 5. How many bits are in a byte? There are eight bits in a byte. 6. What is the purpose of an expansion slot on a motherboard? Expansion slots on a motherboard are designed to accept peripheral cards that add functionality to a computer system, such as a video or a sound card. 7. Which component on the motherboard is used primarily for processing? The CPU 8. Name the two main CPU manufacturers. Intel and AMD 9. Order the following ports according to speed, placing the fastest port first: FireWire, eSATA, USB. eSATA is faster than FireWire and firewire is faster than USB. 10. What are two other names for the system bus? The PCI bus, and the PCI express bus 11. What type of output does an S/PDIF port provide? Digital output 12. Why is an SSD hard drive more reliable under rugged conditions than an IDE hard drive? SSD hard drives have no moving parts that can disintegrate under rugged conditions. Because SDD drives......

Words: 1535 - Pages: 7

Free Essay

Information Technology

...BASIC INPUT OUTPUT SYSTEM [BIOS] Seminar Presented by Milind Chile - 2591 Dipti Borkar - 2778 Freddy Gandhi - 2787 Raghav Shreyas Murthi - 2804 Introduction The BIOS, short for BASIC INPUT OUTPUT SYSTEM is a set of built-in software routines that give a PC its personality. Although, less than 32 kilobytes of code, the BIOS controls many of the most important functions of the PC: how it interprets keystrokes (Ctrl + Alt + Delete), how it puts characters on the screen, and how and at what speed it communicates through its ports. The BIOS also determines the compatibility of the computer and its flexibility in use. Although all BIOSs have the same function; all are not the same. The BIOS governs the inner complexities arising out of the odd mixing of hardware and software. It acts as a link between the material hardware of the PC and its circuits, and the transcendent realm of software ideas and instructions. More than a link, the BIOS is both hardware and software. Like software, the BIOS is a set of instructions to the computer’s microprocessor. Like hardware, however, these instructions are not evanescent; rather they are coded into the hard, worldly silicon of PROM, EPROM chips. Due to the twilight state of programs like the BIOS, existing in the netherworld between hardware and software, such PROM-based programs are often termed firmware. The personality comes from the firmware code. This code determines how the computer will carry out the basic functions needed to make...

Words: 6641 - Pages: 27

Premium Essay


...How To Fix a Computer That Turns On But Displays Nothing: What To Do When Your Computer Starts but the Screen Is Black. By Tim Fisher, PC Support Expert. The most common way that a computer "won't turn on" is when the PC actually does power on but doesn't display anything on the monitor. You see lights on the computer case, probably hear fans running inside, and may even hear sounds, but nothing at all shows up on your screen. There are several possible reasons why your monitor isn't displaying information, so it's very important that you step through an ordered process like the one outlined here. ...

Words: 2186 - Pages: 9

Free Essay

Nt1110 Unit 7 Research Paper 1 Uefi Bios.Docx

...Unit 7 Research Paper 1: UEFI BIOS 1. UEFI BIOS is a specification that defines a software interface between the operating system and the platform's firmware. In the mid-1990s Intel was creating a new 64-bit processor architecture that wasn't backwards-compatible with the old x86: the Itanium (IA-64). Because the IA-64 only supports 64-bit instructions, the PC BIOS couldn't be used, so Intel developed the EFI specification. Later this specification was managed (and still is) by the UEFI Forum, an association of several companies such as AMD, Microsoft, Intel, Apple and so on. History - UEFI is an extension of the original Extensible Firmware Interface developed by Intel. Intel developed this new hardware and software interface system when it launched the ill-fated Itanium (IA-64) server processor lineup. Because of the Itanium's advanced architecture and the limitations of the existing BIOS systems, Intel wanted a new method for handing off the hardware to the operating system that would allow for greater flexibility. Because the Itanium wasn't a huge success, the EFI standards also languished for many years. In 2005, the Unified EFI Forum was established by a number of major corporations to expand upon the original specifications developed by Intel and produce a new standard for the hardware and software interface. This includes companies such as AMD, Apple, Dell, HP, IBM, Intel, Lenovo and Microsoft...

Words: 321 - Pages: 2
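The BIOS-versus-UEFI handoff described above is observable from a running system. On Linux, for instance, the kernel exposes the directory /sys/firmware/efi only when it booted with EFI runtime services, so its absence implies a legacy BIOS boot. A minimal sketch (illustrative only, and Linux-specific):

```python
# A minimal sketch of UEFI detection on Linux: the kernel creates
# /sys/firmware/efi only when it booted via UEFI with EFI runtime
# services available; absence of the path implies a legacy BIOS boot.
import os

def booted_via_uefi(efi_path="/sys/firmware/efi"):
    """Return True if the EFI firmware directory is present."""
    return os.path.isdir(efi_path)
```

This is a heuristic check, not an exhaustive one; other operating systems expose the firmware type through different interfaces.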

Free Essay

Operation Manual

...Chapter 3: This chapter helps you power up your system and install the drivers and utilities that came with the support CD. 3.1 Installing an operating system. This motherboard supports Windows® 2000/XP and Vista Premium operating systems (OS). Always install the latest OS version. Because motherboard settings and hardware options vary, use the setup procedures presented in this chapter for general reference only. Refer to your OS documentation for more information. 3.2 Support CD information. The support CD that came with the motherboard contains useful software and several utility drivers that enhance the motherboard features. The contents of the support CD are subject to change at any time without notice. Visit the ASUS website for updates. 3.2.1 Running the support CD. To begin using the support CD, simply insert the CD into your CD-ROM drive. The CD automatically displays the Drivers menu if Autorun is enabled on your computer. Click an item to install it. If Autorun is NOT enabled on your computer, browse the contents of the support CD and double-click ASSETUP.EXE to run the CD. 3.2.2 Drivers menu. The Drivers menu shows the available device drivers if the system detects installed devices. Install the necessary drivers to activate the devices. Chapter 3: Starting up. Intel Chipset Inf Update Program: This item installs the Intel® Chipset INF Update Program. This driver enables Plug-n-Play INF support for the Intel®...

Words: 5285 - Pages: 22

Free Essay

Laptop Level-1

...An ISO 9001:2008 Certified Company. LAPTOP Level-1 SERVICE TRAINING - COURSE SYLLABUS
Day 1: INTRODUCTION 1. What is a laptop? 2. History of the laptop 3. How does a laptop work? 4. Advantages of a laptop over a desktop 1. CARD LEVEL SERVICE [Hardware] 2. CHIP LEVEL SERVICE [Motherboard] 5. Difference between laptop and desktop. LAPTOP MANUFACTURING COMPANIES - Acer / Apple / Compaq / Dell - etc. 6. How to buy a laptop 7. Operating system review, laptop uses & the laptop booting process 8. Guide to purchasing a second-hand laptop
Day 2: TOOLS AND TESTING EQUIPMENT 1. LAPTOP HARDWARE TOOLS:  Screw Driver Kit  Laptop Casing Opener  Nose Pliers  Cutter  Electric Screw Driver  Tweezers  Anti-Static Wrist Band  PCB Cleaning Brush, etc. 2. LAPTOP CHIP LEVEL TOOLS:  Micro Soldering Iron  Tip Soldering Iron  Hot Air Blower  Magnifying Lens with Lamp  Liquid Flux  Dry Flux, etc. 3. TESTING EQUIPMENT:  Analog Multimeter  Digital Multimeter  Battery Booster  Universal AC Adapter  Debug Card  SATA to USB Converter  IDE to USB Converter 4. OTHER HARDWARE SPARES:  External Monitor  External DVD Drive  USB Keyboard / Mouse, etc.
Days 3-8: 5. ACCESSORIES:  Laptop Bag  LCD Screen Card  LCD Cleaner, etc. 6. ADVANCED TOOLS:  BGA Rework Machine  Reballing Kits  Infrared IC Heater  PCB Scanner  Oscilloscope [CRO]  SMD IC Extractor  PTH Desoldering Machine  RCL Meter. SMT ELECTRONICS INTRODUCTION; RESISTOR; CAPACITOR; INDUCTOR & TRANSFORMER; LAPTOP PARTS VIEW & KEYBOARD &......

Words: 2886 - Pages: 12

Free Essay

Information Technology Gt

...Semiconductor. It is a technology used for constructing integrated circuits. The technology is used in microprocessors, microcontrollers, static RAM, and other digital circuits. Frank Wanlass patented CMOS in 1963. CMOS's typical design implements logic functions using various MOSFETs, also known as Metal Oxide Semiconductor Field Effect Transistors. Early CMOS chips used to store BIOS memory relied on the on-board battery to maintain power to the CMOS at all times. This prevented the memory settings stored on board from being erased after you turned your computer off or after a loss of power. In modern systems, the CMOS does not use the on-board battery to maintain and save BIOS settings; instead the battery only powers the system clock on board the PC. On-board CMOS memory has remained relatively unchanged since it was first patented. CMOS memory ranges from 128 bytes up to, at largest so far, 512 bytes. The reason the size has not needed to change is that CMOS was, and is, only designed to hold the absolute basic boot settings needed for any given system. CMOS does indeed still utilize RAM for startup functions on a PC today, which has not changed since it was developed. Again, as mentioned above, the CMOS no longer utilizes the battery located on the motherboard. CMOS storage has evolved into using EEPROM, or Electrically Erasable Programmable Read-Only Memory. This technology allows the circuits on a...

Words: 368 - Pages: 2

Free Essay

Comp-Tia a+ Testing 801 and 802

...length is 100 meters. Uses baseband signaling.
100BaseT - Generic term for an Ethernet cabling system designed to run at 100 megabits per second on twisted-pair cabling. Uses baseband signaling.
1000BaseT - Gigabit Ethernet on UTP.
110 block - The most common connection used with structured cabling, connecting horizontal cable runs with patch panels.
16-bit (PC Card) - Type of PC Card that can have up to 2 distinct functions or devices, such as a modem/network card combo.
3.5-inch floppy drive - Size of all modern floppy disk drives; the format was introduced in 1986.
2.1 speaker system - Speaker setup consisting of 2 stereo speakers combined with a subwoofer.
34-pin ribbon cable - Type of cable used by floppy disk drives.
3-D graphics - Video technology that attempts to create images with the same depth and texture as objects seen in the real world.
40-pin ribbon cable - PATA cable used to attach EIDE devices (such as hard drives) and ATAPI devices (such as optical drives) to a system.
5.1 speaker system - Speaker setup consisting of 4 satellite speakers plus a center speaker and a subwoofer.
64-bit processing - A type of processing that can run a compatible 64-bit operating system, such as Windows 7, and 64-bit applications. 64-bit PCs have a 64-bit-wide address bus, enabling them to use more than 4 GB of RAM.
8.3 naming system - File naming convention that specified a maximum of 8 characters for a file name, followed by a 3-character file extension. Has been replaced by LFN (Long Filename)......

Words: 3658 - Pages: 15
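The 8.3 naming entry in the glossary above can be illustrated with a simplified sketch. Note this only truncates and uppercases; real DOS/Windows short-name generation also strips further illegal characters and appends ~1-style suffixes to avoid collisions, which this illustration omits.

```python
# A simplified sketch of the 8.3 naming rule: at most 8 characters
# for the base name and 3 for the extension, uppercased, spaces
# removed. Real short-name generation adds ~1-style suffixes; this
# illustration only truncates.

def to_8_3(filename):
    """Truncate a long filename to the classic 8.3 form."""
    base, dot, ext = filename.rpartition(".")
    if not dot:                 # no extension at all
        base, ext = filename, ""
    short = base.replace(" ", "").upper()[:8]
    return short + ("." + ext.upper()[:3] if ext else "")
```

For example, "my long document.html" becomes "MYLONGDO.HTM", and "readme" becomes "README".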

Premium Essay


...available memory. The CMOS is used every time your computer starts up and when it is powered off. The CMOS is powered by a CMOS battery that keeps the CMOS memory running even when the computer is off. The CMOS cannot stop running, or important hardware settings needed to start up your personal computer could be lost. The BIOS uses the information stored in the CMOS when starting up your system, so a faulty CMOS battery can prevent your system from starting up. CMOS memory has not changed much over the years: it is 64 or 128 bytes of RAM, with a maximum of 512 bytes, and it only holds the basic BIOS boot settings used by the system. CMOS memory has not had any capacity changes since it was first developed, but it has been developed to run faster and produce less noise. Also, lowering cost while increasing "the functionality of IC's has resulted in it being used for analog only, analog/digital, and mixed signal designs" (Baker, R. Jacob, pg. 8). CMOS memory still utilizes RAM, requiring a battery on the motherboard, but EEPROM has come into computer technology and is becoming more popular than battery-backed CMOS RAM. EEPROM (Electrically Erasable Programmable Read-Only Memory) is a type of non-volatile memory that computers and other electronic devices use. EEPROM stores small amounts of data that must be retained when power is off; it can also...

Words: 449 - Pages: 2
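The battery-backed-RAM versus EEPROM distinction above comes down to write-through persistence: settings must survive a power cycle. A toy model (the file name, JSON format, and `SettingsStore` class are illustrative assumptions, not how firmware actually stores its settings):

```python
# A toy model of non-volatile settings storage: a small block of boot
# settings that survives "power off" by being written through to disk,
# loosely mirroring how EEPROM retains BIOS settings without a battery.
import json
import os
import tempfile

class SettingsStore:
    """Persist a handful of boot settings across 'power cycles'."""

    def __init__(self, path):
        self.path = path
        self.settings = {}
        if os.path.exists(path):            # reload after "power off"
            with open(path) as fh:
                self.settings = json.load(fh)

    def set(self, key, value):
        self.settings[key] = value
        with open(self.path, "w") as fh:    # write-through, like EEPROM
            json.dump(self.settings, fh)

path = os.path.join(tempfile.gettempdir(), "cmos_demo.json")
store = SettingsStore(path)
store.set("boot_order", ["ssd", "dvd", "usb"])

# A fresh instance (a "reboot") sees the saved settings.
rebooted = SettingsStore(path)
```

Battery-backed CMOS RAM achieves the same survival by never losing power; EEPROM achieves it by writing to cells that hold their state unpowered.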
