Topic #3: Random Variables, Processes & Noise

T1. B. P. Lathi, Modern Digital and Analog Communication Systems, 3rd Edition, Oxford University Press, 1998 (or 4th Edition, 2010): Chapters 8, 9 & 12

T2. Simon Haykin & Michael Moher, Communication Systems, John Wiley, 4th Edition (or 5th Edition, 2010): Chapter 5

R1. Bernard Sklar & Pabitra Kumar Ray, Digital Communications: Fundamentals and Applications, 2nd Edition, Pearson Education, 2009: Section 5.5

August 11 – 18, 2014


What is Noise?
Desired signal: the signal that is needed at the destination.

Undesired signal: a signal that gets added to the desired signal as it passes through the medium, amplifiers, mixers, filters and other parts of the communication channel between the source and the destination.

Noise: the undesired signal that adds to the desired signal and reaches the destination.

Effect of noise: since the noise adds to the signal, it stays with it. Neither amplification nor filtering can alleviate the effect of noise on the desired signal.

The only way to limit the effect of noise is to ensure that as little noise as possible, relative to the desired signal, is present at the destination.

Interference: intentional or unintentional undesired signals that interfere with the communication process.

Noise Sources

Internally generated:
- Thermal noise: random motion of electrons, due to temperature, in the resistive components of the system.
- Shot noise: due to diffusion of carriers in semiconductors, etc.

Externally generated:
- Atmospheric noise: due to lightning and thunderstorms; roughly 2 MHz – 10 MHz.
- Extraterrestrial noise: due to solar and galactic sources; roughly 20 MHz – 1.5 GHz.
- Man-made noise: spark plugs, engine noise; roughly 1 MHz – 500 MHz.

Most of the discussion in our class will be on thermal noise.


Thermal Noise
Thermal noise is an inevitable reality with which the received signal power has to compete.

Thermal noise is AWGN: Additive White Gaussian Noise.
- Additive: it adds to the signal.
- White: its power spectral density is flat.
- Gaussian: the underlying probability density function is the Gaussian (Normal) density.

We talk about the probability density function (and the corresponding cumulative distribution function) because noise is random, and hence has to be dealt with using the properties of random variables.
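As a side illustration (not from the slides), the short sketch below generates zero-mean Gaussian noise of a chosen power and adds it to a sinusoidal desired signal; the sampling rate, frequencies and noise power are arbitrary values picked for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1.0e6                                   # sampling rate, Hz (arbitrary)
t = np.arange(10_000) / fs
signal = np.cos(2 * np.pi * 10e3 * t)        # desired signal, average power 0.5 W

noise_power = 0.05                           # chosen noise power, W
noise = rng.normal(0.0, np.sqrt(noise_power), size=t.size)   # zero-mean Gaussian samples

received = signal + noise                    # "additive": the noise simply adds to the signal

snr_db = 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))
print(f"measured noise power ~ {np.mean(noise**2):.3f} W, SNR ~ {snr_db:.1f} dB")
```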


Statistical Averages of a Random Variable
For a continuous RV X with pdf p(x), the mean (first moment) is
  E[X] = ∫ x p(x) dx

Moments of a random variable: the nth moment is E[X^n] = ∫ x^n p(x) dx.

Mean of a function y = g(x) of a random variable:
  E[g(X)] = ∫ g(x) p(x) dx

Mean square of a random variable: use g(x) = x^2, giving E[X^2] = ∫ x^2 p(x) dx.

(All integrals are over the full range of x.)
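As a quick numerical illustration (my own example, assuming X uniform on [0, 1], which is not specified on the slide), the sketch below evaluates these integrals by a simple Riemann sum: the mean should come out near 1/2, the mean square near 1/3, and the variance near 1/12.

```python
import numpy as np

# X uniform on [0, 1]: pdf p(x) = 1 on [0, 1], 0 elsewhere
x = np.linspace(0.0, 1.0, 100_001)
p = np.ones_like(x)                    # pdf samples
dx = x[1] - x[0]

mean = np.sum(x * p) * dx              # E[X]   = ∫ x p(x) dx     -> ~0.5
mean_square = np.sum(x**2 * p) * dx    # E[X^2] = ∫ x^2 p(x) dx   -> ~1/3
variance = mean_square - mean**2       # Var[X] = E[X^2] - E[X]^2 -> ~1/12

print(mean, mean_square, variance)
```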

Sum of Random Variables
z = x + y

Central Limit Theorem: under certain conditions, the sum of a large number of independent random variables tends to a Gaussian random variable, independent of the pdfs of the individual random variables involved.

If x and y are independent, then the pdf of z is the convolution of the individual pdfs:
  p_z(z) = ∫ p_x(u) p_y(z - u) du

Example: by adding 2 independent RVs, each with the density function shown in the figure, the density function of the resulting RV is the convolution of the two densities.
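A small numerical sketch of both ideas (my own example, assuming the two RVs are uniform on [0, 1), which is not stated on the slide): the sum of two such RVs has the triangular density given by the convolution, and the sum of many of them is already close to Gaussian.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sum of 2 independent uniform RVs on [0, 1): density is triangular on [0, 2), peak 1 at z = 1
z2 = rng.random(200_000) + rng.random(200_000)
hist, edges = np.histogram(z2, bins=20, range=(0.0, 2.0), density=True)
print("estimated density near z = 1:", hist[9:11])     # close to the peak value 1

# Central Limit Theorem: the sum of 30 such RVs is nearly Gaussian
n = 30
z30 = rng.random((200_000, n)).sum(axis=1)
print("mean ~ n/2 =", z30.mean(), " variance ~ n/12 =", z30.var())
```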


Random Process
A random variable that is a function of time is called a random process.

Examples: a binary waveform generator (say, observed over 10 pulse durations); A cos(wct + Φ), with Φ being a random variable.


Random Process
A random variable that is a function of time is called a random process.

The collection of all possible waveforms is called the ensemble.

A given waveform in the ensemble is called a sample function.

X1, X2, ... are the random variables generated by the amplitudes of the sample functions at time instants t1, t2, ..., respectively.


Random Process
The n random variables X1, X2, ..., Xn are, in general, dependent.

The nth-order joint PDF is expressed as p(x1, x2, ..., xn; t1, t2, ..., tn).

If a higher-order joint PDF is available, the lower-order PDFs can be obtained from it (by integrating out the unwanted variables).

The mean of the random process can be obtained from the first-order PDF as
  m_X(t) = E[X(t)] = ∫ x p(x; t) dx


Auto-Correlation of a Random Process
The autocorrelation function of a random process X(t) is defined as
  R_X(t1, t2) = E[X(t1) X(t2)]


Stationary Random Process
A random process is stationary (in the wide sense) if its mean is constant with time and its autocorrelation depends only on the time difference t2 - t1 = τ, so that R_X(t1, t2) = R_X(τ).


Ergodic Random Process
Ensemble statistics: averages taken across the ensemble at fixed time instants (e.g., the ensemble mean and the autocorrelation).

Time statistics: averages taken along a single sample function over time (the time-average mean and time-average autocorrelation).

For an ergodic process, the time statistics are equal to the corresponding ensemble statistics.
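A small sketch (my own illustration, using the random-phase sinusoid example from the earlier slide, with amplitude and frequency chosen arbitrarily): the ensemble average over many phases and the time average along one sample function give essentially the same mean and mean-square value.

```python
import numpy as np

rng = np.random.default_rng(2)

A, fc = 1.0, 50.0                          # amplitude and frequency (arbitrary example values)
t = np.arange(0.0, 1.0, 1e-4)              # time grid spanning many periods
phi = rng.uniform(0, 2 * np.pi, 2000)      # ensemble of uniformly distributed phases

# Each row is one sample function A cos(2*pi*fc*t + phi_i)
ensemble = A * np.cos(2 * np.pi * fc * t[None, :] + phi[:, None])

ens_mean = ensemble[:, 0].mean()           # ensemble average at the fixed instant t[0]
time_mean = ensemble[0, :].mean()          # time average along the first sample function

ens_msq = np.mean(ensemble[:, 0] ** 2)     # ensemble mean-square value
time_msq = np.mean(ensemble[0, :] ** 2)    # time-average mean-square value

print(ens_mean, time_mean)                 # both ~ 0
print(ens_msq, time_msq, A**2 / 2)         # both ~ A^2/2 = 0.5
```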

Power Spectral Density of a Random Process
For a wide-sense stationary process, the power spectral density S_X(f) is the Fourier transform of the autocorrelation function R_X(τ) (the Wiener–Khinchin relation):
  S_X(f) = ∫ R_X(τ) e^(-j2πfτ) dτ


Transmission of a Random Process through a Linear System
For a wide-sense stationary input X(t) applied to a linear time-invariant system with frequency response H(f), the output Y(t) has mean m_Y = m_X H(0) and power spectral density S_Y(f) = |H(f)|^2 S_X(f).

If either or both of the processes are zero-mean, the corresponding covariance functions equal the correlation functions.
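A numerical sketch of S_Y(f) = |H(f)|^2 S_X(f) (my own example; the filter coefficients and sampling rate are arbitrary): white Gaussian noise is passed through a first-order IIR low-pass filter, and the estimated PSD ratio is compared with |H(f)|^2. The estimates are noisy, so the two columns agree only approximately.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)

fs = 1000.0
x = rng.normal(0.0, 1.0, 200_000)              # white noise input (flat PSD)

b, a = [0.1], [1.0, -0.9]                      # simple first-order IIR low-pass filter
y = signal.lfilter(b, a, x)                    # output process

f, Pxx = signal.welch(x, fs=fs, nperseg=4096)  # estimated input PSD
_, Pyy = signal.welch(y, fs=fs, nperseg=4096)  # estimated output PSD
_, h = signal.freqz(b, a, worN=f, fs=fs)       # H(f) on the same frequency grid

ratio = Pyy / Pxx                              # should track |H(f)|^2
print("low f :", ratio[5], "vs |H|^2 =", abs(h[5]) ** 2)
print("high f:", ratio[-5], "vs |H|^2 =", abs(h[-5]) ** 2)
```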


Homework

Solve and understand the following worked examples: 9.2 – 9.5 from Lathi (4th Edition).


System Noise Characterization


Thermal Noise Power
Thermal noise is AWGN in nature and its power is

  N = k T0 W (or B)  Watts

where
  T0 = temperature in kelvin
  k = Boltzmann's constant = 1.38 x 10^-23 J/K (or W/K-Hz) = -228.6 dBW/K-Hz
  W (or B) = bandwidth in Hz

Noise power spectral density:
  N0 = N / W = k T0  Watts/Hz
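A small calculation sketch using N = k T0 W (the 1 MHz bandwidth is an arbitrary choice for illustration): at the 290 K reference temperature this gives the familiar figure of about -114 dBm per MHz.

```python
import math

k = 1.38e-23          # Boltzmann's constant, J/K
T0 = 290.0            # reference temperature, K
W = 1.0e6             # bandwidth, Hz (illustrative)

N = k * T0 * W                        # noise power, W
N0 = k * T0                           # noise power spectral density, W/Hz

N_dBm = 10 * math.log10(N / 1e-3)     # ~ -114 dBm
N0_dBW_per_Hz = 10 * math.log10(N0)   # ~ -204 dBW/Hz (k alone is -228.6 dBW/K-Hz)

print(f"N  = {N:.3e} W  ({N_dBm:.1f} dBm)")
print(f"N0 = {N0:.3e} W/Hz ({N0_dBW_per_Hz:.1f} dBW/Hz)")
```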

Noise Figure
All passive and active devices generate noise.

Amplifiers in the system are made of active and passive devices, and hence contribute to the overall noise in the system.

Noise figure of an amplifier (with the source at the 290 K reference temperature):
  F = SNRin / SNRout

For a lossy network, the loss is given by
  L = Pin / Pout
and its noise figure is F = L.

Noise Temperature
The effective noise temperature of a network or receiver is related to its noise figure by
  TR = (F - 1) T0 = (F - 1) 290 K
where
  TR = effective noise temperature of the network or receiver
  T0 = reference temperature of the noise source, chosen to be 290 K

Composite Noise Figure
Let the noise at the input of Network 1 be N1.

Noise at the output of Network 1:
  (Nout)1 = G1 N1 + (F1 - 1) G1 k 290 W

Noise at the output of Network 2:
  (Nout)2 = G2 (Nout)1 + (F2 - 1) G2 k 290 W
          = G2 { G1 N1 + (F1 - 1) G1 k 290 W } + (F2 - 1) G2 k 290 W
          = G1 G2 N1 + G1 G2 (F1 - 1) k 290 W + (F2 - 1) G2 k 290 W

Assume the overall gain of the network is G = G1 G2 and the overall noise figure is Fcomp. The total noise power at the output of the cascaded network is then

  (Nout)2 = G1 G2 N1 + (Fcomp - 1) G1 G2 k 290 W

Comparing the two expressions for (Nout)2:
  (Fcomp - 1) G1 G2 k 290 W = G1 G2 (F1 - 1) k 290 W + (F2 - 1) G2 k 290 W

  Fcomp = F1 + (F2 - 1) / G1
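A quick numeric check of the two-stage result (a sketch with arbitrarily chosen gains and noise figures): the stage-by-stage noise and the single equivalent network with Fcomp give the same output noise power.

```python
k, T0 = 1.38e-23, 290.0
W = 1.0e6                        # bandwidth, Hz (arbitrary)
N1 = k * T0 * W                  # assume the input noise is at the reference temperature

G1, F1 = 100.0, 2.0              # stage 1: 20 dB gain, 3 dB noise figure (example values)
G2, F2 = 1000.0, 10.0            # stage 2: 30 dB gain, 10 dB noise figure (example values)

# Stage-by-stage noise build-up, as in the derivation above
Nout1 = G1 * N1 + (F1 - 1) * G1 * k * T0 * W
Nout2 = G2 * Nout1 + (F2 - 1) * G2 * k * T0 * W

# Equivalent single network with overall gain G1*G2 and composite noise figure Fcomp
Fcomp = F1 + (F2 - 1) / G1
Nout_equiv = G1 * G2 * N1 + (Fcomp - 1) * G1 * G2 * k * T0 * W

print(Fcomp)                                     # 2.09
print(abs(Nout2 - Nout_equiv) < 1e-9 * Nout2)    # True: the two expressions agree
```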

Composite Noise Figure: Feed Line & Amplifier
For an N-stage network,
  Fcomp = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1 G2) + ... + (FN - 1)/(G1 G2 ... G(N-1))

For a feed line of loss L (gain 1/L) followed by a receiver of noise figure F, the composite noise temperature is
  Tcomp = (L - 1) 290 + (F - 1) 290 / (1/L) = (L - 1) 290 + L (F - 1) 290

  Tcomp = (LF - 1) 290 K

Check: (LF - 1) 290 = (LF - 1 + L - L) 290 = (L - 1 + L(F - 1)) 290 K

  Tcomp = TL + L TR


System Effective Temperature
The system effective temperature is
  TS = TA + TL + TR / (1/L) = TA + (L - 1) 290 + L (F - 1) 290

  TS = TA + (LF - 1) 290 K

TA is the antenna noise temperature, which accounts for the noise picked up by the antenna from:
- Natural sources: lightning, celestial radio sources, atmospheric sources, and thermal radiation from the ground and other structures.
- Man-made sources: radiation from automobile ignition and electrical machinery, and radio transmissions from other users that fall into the bandwidth.


Example Problem on Lossy Line


Example on NF & Noise Temperature

  TR = (F - 1) 290 K = 2610 K
  TS = TA + TL + L TR = 150 + 2610 = 2760 K

The output noise power is
  Nout = G k TS W = 10^8 x 1.38 x 10^-23 x 2760 x 6 x 10^6 ≈ 22.8 μW

and the overall noise figure of the system, measured as the degradation in SNR from input to output, is 29.1 - 16.4 = 12.7 dB (equivalently, 10 log10(TS / TA) = 10 log10(2760/150) ≈ 12.7 dB).
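A sketch re-doing the arithmetic of this example in code (the gain of 10^8, bandwidth of 6 MHz and TA = 150 K are read from the slide; the 10 dB receiver noise figure and the lossless line are inferred from TR = 2610 K and TL = 0).

```python
import math

k = 1.38e-23                        # Boltzmann's constant, J/K
TA = 150.0                          # antenna noise temperature, K (from the slide)
F = 10.0                            # receiver noise figure (ratio), inferred from TR = 2610 K
L = 1.0                             # line assumed lossless here
G, W = 1.0e8, 6.0e6                 # receiver gain and bandwidth (from the slide)

TR = (F - 1) * 290.0                # 2610 K
TL = (L - 1) * 290.0                # 0 K
TS = TA + TL + L * TR               # 2760 K

Nout = G * k * TS * W               # ~2.29e-5 W, i.e. ~22.8 microwatts
degradation_dB = 10 * math.log10(TS / TA)   # ~12.7 dB SNR degradation

print(TR, TS, Nout, degradation_dB)
```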

Improving SNR - Benefit of Using a Pre-Amplifier

Fig. 5.19a (receiver alone, noise figure F2 = 10 dB):
  TR2 = (F2 - 1) 290 K = 2610 K
  TS = TA + TR2 = 150 + 2610 = 2760 K
  SNRout = 16.4 dB

Fig. 5.19b (pre-amplifier with F1 = 3 dB, i.e. F1 = 2, and gain G1 = 20, placed ahead of the same receiver):
  TR1 = (F1 - 1) 290 K = 290 K
  Tcomp = TR1 + TR2 / G1 = 290 + 2610/20 = 420.5 K
  Fcomp = F1 + (F2 - 1)/G1 = 2 + 9/20 = 2.45 ≈ 2.5 (about 4 dB)
  TS = TA + Tcomp = 150 + 420.5 K = 570.5 K
  SNRout = 23.3 dB
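A sketch checking the benefit (values as on the slide; the pre-amplifier gain of 20, i.e. 13 dB, is taken from the Tcomp arithmetic): the SNR improvement equals the ratio of the two system temperatures.

```python
import math

TA = 150.0                          # antenna temperature, K

# Fig. 5.19a: receiver alone, F2 = 10 (10 dB)
TR2 = (10.0 - 1.0) * 290.0          # 2610 K
TS_a = TA + TR2                     # 2760 K

# Fig. 5.19b: pre-amplifier F1 = 2 (3 dB), G1 = 20 (13 dB), then the receiver
TR1 = (2.0 - 1.0) * 290.0           # 290 K
Tcomp = TR1 + TR2 / 20.0            # 420.5 K
TS_b = TA + Tcomp                   # 570.5 K

improvement_dB = 10 * math.log10(TS_a / TS_b)   # ~6.8 dB
print(TS_a, TS_b, improvement_dB)
print(16.4 + improvement_dB)                    # ~23.3 dB output SNR with the pre-amplifier
```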


Problem
A receiving system consists of a 75-foot lossy cable (3 dB per 100 ft of loss), a pre-amplifier (G = 20 dB, F = 3 dB) and a receiver (F = 13 dB). Find the composite noise figure Fcomp for:
(a) the cable connected directly to the receiver (no pre-amplifier);
(b) the pre-amplifier inserted between the cable and the receiver;
(c) the pre-amplifier placed ahead of the cable.

Cable loss = 3 dB x 0.75 = 2.25 dB, so the cable has L = F = 1.68 and gain 1/L.
Receiver: F = 13 dB = 20. Pre-amplifier: G = 20 dB = 100, F = 3 dB = 2.

(a) Fcomp = F1 + (F2 - 1)/G1
          = 1.68 + (20 - 1) x 1.68 = 33.6 = 15.26 dB

(b) Fcomp = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1 G2)
          = 1.68 + (2 - 1) x 1.68 + (20 - 1) x 1.68/100 = 3.68 = 5.65 dB

(c) Fcomp = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1 G2)
          = 2 + (1.68 - 1)/100 + (20 - 1) x 1.68/100 = 2.32 = 3.66 dB
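A sketch re-computing the three arrangements with a small generic Friis helper (my own function, not from the text); the tiny differences from the slide's numbers come from rounding 1.679 to 1.68 and 19.95 to 20.

```python
import math

def friis(stages):
    """Composite noise figure of a cascade; stages are (gain, noise_figure) as ratios."""
    f_total, running_gain = 1.0, 1.0
    for gain, f in stages:
        f_total += (f - 1.0) / running_gain
        running_gain *= gain
    return f_total

L = 10 ** (2.25 / 10)                         # 75 ft at 3 dB/100 ft -> 2.25 dB -> 1.679
cable = (1.0 / L, L)                          # lossy line: gain 1/L, noise figure L
preamp = (10 ** (20 / 10), 10 ** (3 / 10))    # 20 dB gain, 3 dB noise figure
rx = (1.0, 10 ** (13 / 10))                   # receiver: F = 13 dB (gain irrelevant, last stage)

for label, chain in [("(a) cable + rx          ", [cable, rx]),
                     ("(b) cable + preamp + rx ", [cable, preamp, rx]),
                     ("(c) preamp + cable + rx ", [preamp, cable, rx])]:
    F = friis(chain)
    print(label, round(F, 2), "->", round(10 * math.log10(F), 2), "dB")
```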

