
T1. B. P. Lathi, Modern Digital and Analog Communication Systems, 3rd Edition, Oxford University Press, 1998 (or 4th Edition, 2010): Chapters 8, 9 & 12

T2. Simon Haykin & Michael Moher, Communication Systems, John Wiley, 4th Edition (or 5th Edition, 2010): Chapter 5

R1. Bernard Sklar & Pabitra Kumar Ray, Digital Communications: Fundamentals and Applications, Pearson Education, 2009, 2/e: Section 5.5

August 11 - 18, 2014


What is Noise ?

Desired Signal: the signal that is needed at the destination.

Undesired Signal: any signal that gets added to the desired signal as it passes through the medium, amplifiers, mixers, filters and other parts of the communication channel between the source and the destination.

Noise: the undesired signal that adds to the desired signal and reaches the destination.

Effect of Noise: since the noise adds to the signal, it travels with it. Neither amplification nor filtering can undo the effect of noise on the desired signal once it has been added.

The only way to keep away from the effects of noise is to ensure that less noise, relative to the desired signal, is present at the destination.

Interference: intentional or unintentional undesired signals that interfere with the communication process.


ELECTRICAL

ELECTRONICS

COMMUNICATION

INSTRUMENTATION

Noise Sources

Noise sources are either internally generated or externally generated.

Internally generated:
Thermal noise: random motion of electrons, due to temperature, in the resistive components of the system.
Shot noise: due to diffusion of carriers in semiconductors, etc.

Externally generated:
Atmospheric: due to lightning and thunderstorms; roughly 2 MHz - 10 MHz.
Extra-terrestrial: due to solar and galactic sources; roughly 20 MHz - 1.5 GHz.
Man-made noise: spark plugs, engine noise; roughly 1 MHz - 500 MHz.

Most of the discussion in our class will be on thermal noise.


Thermal Noise

Thermal noise is an inevitable reality with which the received signal power has to compete. Its amplitude follows a Gaussian (Normal) probability density function.

Thermal noise is AWGN: Additive White Gaussian Noise.

Additive: it adds to the signal.
White: its power spectral density is flat.
Gaussian: the underlying probability density function is Gaussian.

We talk about the probability density function (and the corresponding cumulative distribution function) because noise is random, and hence has to be dealt with using the properties of random variables.
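A short numerical sketch of the "additive, Gaussian" behavior (the noise RMS value below is an arbitrary choice): samples of zero-mean Gaussian noise have a sample mean near zero and a mean square equal to the variance.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                            # assumed noise RMS value (illustrative)
n = rng.normal(0.0, sigma, 1_000_000)  # AWGN samples

mean = n.mean()                        # close to 0 for zero-mean noise
power = np.mean(n**2)                  # mean square, close to sigma**2
```

Because the noise is zero-mean, its average power equals its variance, which is why the mean-square value is the quantity of interest in the slides that follow.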


Statistical Averages of Random Variable

Moments of a random variable. For a continuous RV X with density f(x), the mean is

E[X] = ∫ x f(x) dx

Mean of a function y = g(X) of a random variable:

E[g(X)] = ∫ g(x) f(x) dx

Mean square of a random variable: use g(x) = x^2, giving

E[X^2] = ∫ x^2 f(x) dx
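These averages can be checked by Monte Carlo simulation; the uniform distribution below is just an illustrative choice (for X ~ Uniform(0, 1): E[X] = 1/2, E[X^2] = 1/3).

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 1_000_000)   # samples of X ~ Uniform(0, 1)

mean_x = x.mean()                      # estimates E[X]   -> 1/2
mean_x2 = np.mean(x**2)                # estimates E[X^2] = E[g(X)], g(x) = x^2 -> 1/3
variance = mean_x2 - mean_x**2         # E[X^2] - (E[X])^2 -> 1/12
```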


Sum of Random Variables: z = x + y

Central Limit Theorem: under certain conditions, the sum of a large number of independent random variables tends to a Gaussian random variable, independent of the pdfs of the random variables involved.

If x and y are independent, then the pdf of z is the convolution of their pdfs:

p_z(z) = ∫ p_x(τ) p_y(z − τ) dτ

Example: by adding 2 RVs with identical density functions (as in the figure, not reproduced here), the density function of the resulting RV is the convolution of the two densities; for two rectangular densities, for instance, the result is triangular.
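A sketch of the convolution result for two uniform (rectangular-density) RVs: the sum has a triangular pdf on [0, 2] peaking at 1, which a histogram of simulated sums confirms.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 1_000_000)
y = rng.uniform(0, 1, 1_000_000)
z = x + y                      # pdf of z is the convolution: triangular on [0, 2]

hist, edges = np.histogram(z, bins=40, range=(0, 2), density=True)
peak = hist[19:21].mean()      # density near z = 1: triangular pdf height ~1 there
```

Repeating the experiment with more summands (z = x1 + x2 + ... + xN) pushes the histogram toward the Gaussian shape, as the Central Limit Theorem predicts.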


Random Process

A random variable that is a function of time is called a random process. Examples: the output of a binary waveform generator observed over, say, 10 pulse durations; A cos(ωc t + Φ), with Φ being a random variable.


Random Process


The collection of all possible waveforms is called the ensemble.

A given waveform in the ensemble is called a sample function. X1, X2, … are the random variables generated by the amplitudes of the sample functions at time instants t1, t2, … respectively.


Random Process

The n random variables X1, X2, …, Xn are, in general, dependent. The nth-order joint PDF is expressed as

p_X(x1, x2, …, xn; t1, t2, …, tn)

If a higher-order joint PDF is available, a lower-order PDF can be obtained from it by integrating out the unwanted variables, e.g. p_X(x1; t1) = ∫ p_X(x1, x2; t1, t2) dx2.

The mean of the random process can be obtained from the first-order PDF as

m_X(t) = E[X(t)] = ∫ x p_X(x; t) dx


Auto Correlation of a Random Process

The autocorrelation function of a random process X(t) is defined as

R_X(t1, t2) = E[X(t1) X(t2)] = ∫∫ x1 x2 p_X(x1, x2; t1, t2) dx1 dx2


Stationary Random Process

A random process is stationary (in the strict sense) if its statistics are invariant to a shift of the time origin, i.e. all joint PDFs are unchanged by time translation. A process is wide-sense stationary (WSS) if its mean is constant and its autocorrelation depends only on the time difference τ = t2 − t1, so that R_X(t1, t2) = R_X(τ).


Ergodic Random Process

Ensemble statistics are averages computed across the ensemble at a fixed time instant; time statistics are averages computed along a single sample function.

For an ergodic process, the time statistics equal the corresponding ensemble statistics, so the statistics of the process can be obtained from a single sample function.
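A sketch of this idea for the random-phase sinusoid A cos(2π fc t + Φ), with Φ uniform on [0, 2π) (amplitude, frequency and observation instant below are arbitrary choices): the time-averaged mean square of one sample function and the ensemble-averaged mean square at a fixed instant both come out to A²/2.

```python
import numpy as np

rng = np.random.default_rng(3)
A, fc = 1.0, 5.0                       # illustrative amplitude and frequency
t = np.linspace(0, 100, 100_001)       # long observation window (500 periods)

# Time statistics: one sample function x(t) = A cos(2*pi*fc*t + phi)
phi = rng.uniform(0, 2 * np.pi)
x = A * np.cos(2 * np.pi * fc * t + phi)
time_mean_sq = np.mean(x**2)           # time average of x^2 -> A^2 / 2

# Ensemble statistics: many sample functions observed at one instant t0 = 0.37
phis = rng.uniform(0, 2 * np.pi, 1_000_000)
ens = A * np.cos(2 * np.pi * fc * 0.37 + phis)
ens_mean_sq = np.mean(ens**2)          # ensemble average of x^2 -> A^2 / 2
```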


Power Spectral Density of Random Process

For a wide-sense stationary process, the power spectral density S_X(f) is the Fourier transform of the autocorrelation function (Wiener–Khinchine relation):

S_X(f) = ∫ R_X(τ) e^(−j2πfτ) dτ,  R_X(τ) = ∫ S_X(f) e^(j2πfτ) df

The total power of the process is R_X(0) = ∫ S_X(f) df.
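White noise, by definition, has a flat power spectral density. A minimal numerical sketch (sample rate and segment length are arbitrary choices) estimates the PSD of white Gaussian noise by averaging periodograms over many segments and checks its flatness.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1000.0                      # assumed sample rate, Hz (illustrative)
n = rng.normal(0, 1, 2_000_000)  # discrete-time white Gaussian noise, power 1

# Averaged periodogram: 2000 segments of 1000 samples each
segs = n.reshape(-1, 1000)
psd = np.mean(np.abs(np.fft.rfft(segs, axis=1))**2, axis=0) / (1000 * fs)

level = psd[1:-1].mean()                     # ~ sigma^2 / fs = 1e-3 per Hz
flatness = psd[1:-1].max() / psd[1:-1].min() # close to 1 for a flat spectrum
```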


Transmission of a Random Process through a Linear System

If a WSS process X(t) with power spectral density S_X(f) is applied to a linear time-invariant system with transfer function H(f), the output Y(t) is also WSS, with

mean: m_Y = H(0) m_X
PSD: S_Y(f) = |H(f)|^2 S_X(f)

If either or both of the processes are zero-mean, the corresponding mean terms drop out of these relations.


Home Work

Solve & understand the following worked examples: 9.2 – 9.5 from Lathi (4th Edition)


System Noise Characterization


Thermal Noise Power

Thermal noise is AWGN in nature and its power is

N = k T0 W (or B) Watts

where
T0 = temperature in kelvin
k = Boltzmann's constant = 1.38 × 10^-23 J/K (or W/K-Hz) = −228.6 dBW/K-Hz
W (or B) = bandwidth in Hz

Noise power spectral density:

N0 = N / W = k T0 Watts/Hz
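A quick computation of these quantities (the 1 MHz bandwidth is an arbitrary example): at T0 = 290 K, the noise PSD is about −174 dBm/Hz, so a 1 MHz band collects about −114 dBm of thermal noise.

```python
import math

k = 1.38e-23          # Boltzmann constant, J/K
T0 = 290.0            # reference temperature, K
B = 1e6               # bandwidth, Hz (illustrative)

N = k * T0 * B                              # noise power, W (~4.0e-15 W)
N_dBm = 10 * math.log10(N / 1e-3)           # same power in dBm (~ -114 dBm)
N0_dBm_Hz = 10 * math.log10(k * T0 / 1e-3)  # PSD, ~ -174 dBm/Hz
```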

Noise Figure

All passive and active devices generate noise.

Amplifiers in the system are made of active and passive devices, and hence contribute to the overall noise in the system.

Noise figure of an amplifier: the ratio of the input SNR to the output SNR,

F = (S/N)in / (S/N)out

For a lossy network, the loss is given by

L = Pin / Pout

and the noise figure equals the loss: F = L (for a matched lossy network at the reference temperature).

Noise Temperature

TR = effective noise temperature of the network or receiver; TR = (F − 1) × 290 K.

T0 = reference temperature of the noise source, chosen to be 290 K.

Composite Noise figure

Let the noise at the input of Network 1 be N1.

Noise at the output of Network 1:
(Nout)1 = G1 N1 + (F1 − 1) G1 k 290 W

Noise at the output of Network 2:
(Nout)2 = G2 (Nout)1 + (F2 − 1) G2 k 290 W
= G2 { G1 N1 + (F1 − 1) G1 k 290 W } + (F2 − 1) G2 k 290 W
= G1 G2 N1 + G1 G2 (F1 − 1) k 290 W + (F2 − 1) G2 k 290 W

Assume the overall gain of the network is G = G1 G2 and the overall noise figure is Fcomp. The total noise power at the output of the cascaded network is then

(Nout)2 = G1 G2 N1 + (Fcomp − 1) G1 G2 k 290 W

Comparing the two expressions:
(Fcomp − 1) G1 G2 k 290 W = G1 G2 (F1 − 1) k 290 W + (F2 − 1) G2 k 290 W

Fcomp = F1 + (F2 − 1) / G1
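This cascade rule can be wrapped in a small helper (a sketch, not from the text); the two-stage values F1 = 2, G1 = 20, F2 = 10 are the ones used in the pre-amplifier example later in these slides.

```python
def cascade_noise_figure(stages):
    # stages: list of (F, G) pairs in linear units, input side first.
    # Each later stage's excess noise (F - 1) is divided by the gain ahead of it.
    f_comp = stages[0][0]
    g_run = stages[0][1]
    for f, g in stages[1:]:
        f_comp += (f - 1.0) / g_run
        g_run *= g
    return f_comp

# Two-stage check of Fcomp = F1 + (F2 - 1)/G1:
f = cascade_noise_figure([(2.0, 20.0), (10.0, 100.0)])  # 2 + 9/20 = 2.45
```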

Composite Noise figure : Feed line & Amplifier

For a feed line with loss L (noise figure F1 = L, gain G1 = 1/L) followed by an amplifier with noise figure F:

Tcomp = (L − 1) 290 + (F − 1) 290 / (1/L) = (L − 1) 290 + L (F − 1) 290
= (L − 1 + L(F − 1)) 290
Tcomp = (LF − 1) 290 K

In terms of the individual noise temperatures, with TL = (L − 1) 290 and TR = (F − 1) 290:

Tcomp = TL + L TR

For an N-stage network, the composite noise figure generalizes (Friis formula) to

Fcomp = F1 + (F2 − 1)/G1 + (F3 − 1)/(G1 G2) + … + (FN − 1)/(G1 G2 … G(N−1))


System Effective Temperature

TA is the antenna noise temperature, due to:
Natural sources: lightning, celestial radio sources, atmospheric sources, thermal radiation from the ground and other structures.
Man-made noise: radiation from automobile ignition and electrical machinery, and radio transmissions from other users that fall into the bandwidth.

The system effective temperature is

TS = TA + TL + TR/(1/L) = TA + (L − 1) 290 + L(F − 1) 290
TS = TA + (LF − 1) 290 K

where F is the receiver noise figure and L the feed-line loss.

Example Problem on Lossy Line


Example on NF & Noise Temp

Given: antenna temperature TA = 150 K, receiver noise figure F = 10 dB (F = 10), overall gain G = 10^8, bandwidth W = 6 MHz, lossless feed line. Find the system noise temperature, the output noise power, and the overall noise figure of the system.

TR = (F − 1) × 290 K = 2610 K

TS = TA + TL + L TR = 150 + 2610 = 2760 K

Nout = G k TS W = 10^8 × 1.38 × 10^-23 × 2760 × 6 × 10^6 ≈ 22.8 μW

The output SNR works out to 16.4 dB, an overall degradation of 29.1 − 16.4 = 12.7 dB from the input SNR.
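The arithmetic can be verified directly (a sketch using the example's values; the feed line is taken as lossless, so TL = 0 and L = 1):

```python
k = 1.38e-23             # Boltzmann constant, J/K
F = 10.0                 # receiver noise figure, linear (10 dB)
T_R = (F - 1) * 290      # effective receiver noise temperature -> 2610 K
T_A = 150.0              # antenna temperature, K
T_S = T_A + T_R          # system effective temperature -> 2760 K
G = 1e8                  # overall gain
W = 6e6                  # bandwidth, Hz

N_out = G * k * T_S * W  # output noise power, ~22.85e-6 W (22.8 microwatts)
```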


Improving SNR - Benefit of using Pre Amplifier

Fig. 5.19a (receiver alone, F2 = 10 dB, i.e. F2 = 10):

TR2 = (F2 − 1) × 290 K = 2610 K
SNRout = 16.4 dB

Fig. 5.19b (pre-amplifier with F1 = 2, G1 = 20 placed ahead of the same receiver):

TR1 = (F1 − 1) × 290 K = 290 K
Fcomp = F1 + (F2 − 1)/G1 = 2 + 9/20 ≈ 2.5 (4 dB)
Tcomp = TR1 + TR2/G1 = 290 + 2610/20 = 420.5 K
TS = TA + Tcomp = 150 + 420.5 = 570.5 K
SNRout = 23.3 dB

The pre-amplifier lowers the system noise temperature from 2760 K to 570.5 K, improving the output SNR by about 6.9 dB.

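The benefit can be summarized in a few lines (a sketch using the slide's numbers: TA = 150 K, TR2 = 2610 K, pre-amp F1 = 2, G1 = 20):

```python
import math

def snr_improvement_db(T_A, T_R2, F1, G1):
    # System temperature without and with a pre-amplifier ahead of the receiver;
    # the SNR gain in dB is the ratio of the two system temperatures.
    T_no_pre = T_A + T_R2
    T_R1 = (F1 - 1) * 290
    T_with_pre = T_A + T_R1 + T_R2 / G1
    return 10 * math.log10(T_no_pre / T_with_pre)

gain_db = snr_improvement_db(T_A=150, T_R2=2610, F1=2.0, G1=20.0)  # ~6.8 dB
```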

Problem

Given: a 75-foot lossy cable with 3 dB/100 ft loss, a receiver with F = 13 dB, and a pre-amp with G = 20 dB, F = 3 dB. Find Fcomp for:
(a) cable followed by receiver;
(b) cable, then pre-amp, then receiver;
(c) pre-amp, then cable, then receiver.

Cable loss = 3 × 0.75 = 2.25 dB, so L = 1.68 (and the cable gain is 1/L); receiver F = 13 dB ≈ 20; pre-amp G = 20 dB = 100, F = 3 dB ≈ 2.

(a) Fcomp = F1 + (F2 − 1)/G1 = 1.68 + (20 − 1) × 1.68 = 33.6 = 15.26 dB

(b) Fcomp = F1 + (F2 − 1)/G1 + (F3 − 1)/(G1 G2) = 1.68 + (2 − 1) × 1.68 + (20 − 1) × 1.68/100 = 3.68 = 5.65 dB

(c) Fcomp = 2 + (1.68 − 1)/100 + (20 − 1) × 1.68/100 = 2.32 = 3.66 dB

Placing the pre-amplifier ahead of the lossy cable gives the lowest composite noise figure.

