Chapter 4: Probability

LEARNING OBJECTIVES

The main objective of Chapter 4 is to help you understand the basic principles of probability, specifically enabling you to:

1. Comprehend the different ways of assigning probability.
2. Understand and apply marginal, union, joint, and conditional probabilities.
3. Select the appropriate law of probability to use in solving problems.
4. Solve problems using the laws of probability, including the law of addition, the law of multiplication, and the law of conditional probability.
5. Revise probabilities using Bayes' rule.

CHAPTER TEACHING STRATEGY

Students can be motivated to study probability by realizing that the field of probability has some stand-alone application in their lives in such applied areas as human resource analysis, actuarial science, and gaming. In addition, students should understand that much of the rest of this course is based on probabilities, even though they will not be directly applying many of these formulas in other chapters.

This chapter can be frustrating for the learner because probability problems can be approached with several different techniques. Whereas in many chapters of this text students will approach problems with one standard technique, in Chapter 4 different students will often use different approaches to the same problem. The text attempts to emphasize this point and underscores it by presenting several different ways to solve probability problems. The probability rules and laws presented in the chapter can virtually always be used in solving probability problems. However, it is sometimes easier to construct a probability matrix or a tree diagram, or to use the sample space, to solve the problem. If students are aware that what they have at hand is an array of tools or techniques, they will be less overwhelmed in approaching a probability problem. An attempt has been made to differentiate the several types of probabilities so that students can sort out the various types of problems.

In teaching students how to construct a probability matrix, emphasize that it is usually best to place only one variable along each of the two dimensions of the matrix. (That is, place MasterCard with yes/no on one axis and Visa with yes/no on the other, instead of trying to place MasterCard and Visa along the same axis.)

This particular chapter is very amenable to the use of visual aids. Students enjoy rolling dice, tossing coins, and drawing cards as a part of the class experience.

Of all the chapters in the book, it is most imperative that students work a lot of problems in this chapter. Probability problems are so varied and individualized that a significant portion of the learning comes in the doing. Experience is an important factor in working probability problems.

Section 4.8 on Bayes' theorem can be skipped in a one-semester course without losing any continuity. This section is a prerequisite to the Chapter 19 presentation of "revising probabilities in light of sample information" (Section 19.4).

CHAPTER OUTLINE

4.1  Introduction to Probability

4.2  Methods of Assigning Probabilities
     Classical Method of Assigning Probabilities
     Relative Frequency of Occurrence
     Subjective Probability

4.3  Structure of Probability
     Experiment
     Event
     Elementary Events
     Sample Space
     Unions and Intersections
     Mutually Exclusive Events
     Independent Events
     Collectively Exhaustive Events
     Complementary Events
     Counting the Possibilities
     The mn Counting Rule
     Sampling from a Population with Replacement
     Combinations: Sampling from a Population Without Replacement

4.4  Marginal, Union, Joint, and Conditional Probabilities

4.5  Addition Laws
     Probability Matrices
     Complement of a Union
     Special Law of Addition

4.6  Multiplication Laws
     General Law of Multiplication
     Special Law of Multiplication

4.7  Conditional Probability
     Independent Events

4.8  Revision of Probabilities: Bayes' Rule

KEY TERMS

A Priori
Bayes' Rule
Classical Method of Assigning Probabilities
Collectively Exhaustive Events
Combinations
Complement of a Union
Complement
Conditional Probability
Elementary Events
Event
Experiment
Independent Events
Intersection
Joint Probability
Marginal Probability
mn Counting Rule
Mutually Exclusive Events
Probability Matrix
Relative Frequency of Occurrence
Sample Space
Set Notation
Subjective Probability
Union
Union Probability

SOLUTIONS TO PROBLEMS IN CHAPTER 4

4.1  Enumeration of the six parts: D1, D2, D3, A4, A5, A6

     D = Defective part
     A = Acceptable part

     Sample Space:
     D1 D2,  D1 D3,  D1 A4,  D1 A5,  D1 A6,
     D2 D3,  D2 A4,  D2 A5,  D2 A6,
     D3 A4,  D3 A5,  D3 A6,
     A4 A5,  A4 A6,  A5 A6

     There are 15 members of the sample space.

     The probability of selecting exactly one defect out of two is 9/15 = .60

4.2  X = {1, 3, 5, 7, 8, 9}, Y = {2, 4, 7, 9}, and Z = {1, 2, 3, 4, 7}

     a) X ∪ Z = {1, 2, 3, 4, 5, 7, 8, 9}
     b) X ∩ Y = {7, 9}
     c) X ∩ Z = {1, 3, 7}
     d) X ∪ Y ∪ Z = {1, 2, 3, 4, 5, 7, 8, 9}
     e) X ∩ Y ∩ Z = {7}
     f) (X ∪ Y) ∩ Z = {1, 2, 3, 4, 5, 7, 8, 9} ∩ {1, 2, 3, 4, 7} = {1, 2, 3, 4, 7}
     g) (Y ∩ Z) ∪ (X ∩ Y) = {2, 4, 7} ∪ {7, 9} = {2, 4, 7, 9}
     h) X or Y = X ∪ Y = {1, 2, 3, 4, 5, 7, 8, 9}
     i) Y and Z = Y ∩ Z = {2, 4, 7}
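     These results can be checked directly with Python's built-in set type. This is only an illustrative sketch; the names X, Y, and Z simply mirror the problem statement.

         # Quick check of the set operations in Problem 4.2.
         X = {1, 3, 5, 7, 8, 9}
         Y = {2, 4, 7, 9}
         Z = {1, 2, 3, 4, 7}

         print(sorted(X | Z))              # a) union: [1, 2, 3, 4, 5, 7, 8, 9]
         print(sorted(X & Y))              # b) intersection: [7, 9]
         print(sorted(X | Y | Z))          # d) union of all three
         print(sorted(X & Y & Z))          # e) intersection of all three: [7]
         print(sorted((X | Y) & Z))        # f) mixed expression: [1, 2, 3, 4, 7]
         print(sorted((Y & Z) | (X & Y)))  # g): [2, 4, 7, 9]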

4.3  If A = {2, 6, 12, 24} and the population is the positive even numbers through 30,
     A' = {4, 8, 10, 14, 16, 18, 20, 22, 26, 28, 30}

4.4  6(4)(3)(3) = 216

4.5  Enumeration of the six parts: D1, D2, A1, A2, A3, A4

     D = Defective part
     A = Acceptable part

     Sample Space:
     D1 D2 A1,  D1 D2 A2,  D1 D2 A3,  D1 D2 A4,
     D1 A1 A2,  D1 A1 A3,  D1 A1 A4,  D1 A2 A3,
     D1 A2 A4,  D1 A3 A4,  D2 A1 A2,  D2 A1 A3,
     D2 A1 A4,  D2 A2 A3,  D2 A2 A4,  D2 A3 A4,
     A1 A2 A3,  A1 A2 A4,  A1 A3 A4,  A2 A3 A4

     Combinations are used to count the sample space because sampling is done without replacement:

     6C3 = 6!/(3!3!) = 20

     There are 20 members of the sample space and 12 of them have exactly one defective part.

     The probability that exactly one of the three is defective is 12/20 = 3/5 = .60
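     Because the sampling is without replacement, the sample space can also be generated and counted programmatically. The following is a minimal sketch, assuming the same two defective and four acceptable parts labeled as above.

         from itertools import combinations
         from math import comb

         parts = ["D1", "D2", "A1", "A2", "A3", "A4"]
         sample_space = list(combinations(parts, 3))   # all ways to draw 3 of the 6 parts

         print(len(sample_space), comb(6, 3))          # 20 20

         # Count outcomes containing exactly one defective part.
         exactly_one = [s for s in sample_space if sum(p.startswith("D") for p in s) == 1]
         print(len(exactly_one), len(exactly_one) / len(sample_space))   # 12 0.6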

4.6  10^7 = 10,000,000 different numbers

4.7  20C6 = 20!/(6!14!) = 38,760

     It is assumed here that 6 different (without replacement) employees are to be selected.

4.8  P(A) = .10, P(B) = .12, P(C) = .21, P(A ∩ C) = .05, P(B ∩ C) = .03

     a) P(A ∪ C) = P(A) + P(C) - P(A ∩ C) = .10 + .21 - .05 = .26
     b) P(B ∪ C) = P(B) + P(C) - P(B ∩ C) = .12 + .21 - .03 = .30
     c) If A and B are mutually exclusive, P(A ∪ B) = P(A) + P(B) = .10 + .12 = .22

4.9  The table:

               D     E     F
     A         5     8    12     25
     B        10     6     4     20
     C         8     2     5     15
              23    16    21     60

     a) P(A ∪ D) = P(A) + P(D) - P(A ∩ D) = 25/60 + 23/60 - 5/60 = 43/60 = .7167
     b) P(E ∪ B) = P(E) + P(B) - P(E ∩ B) = 16/60 + 20/60 - 6/60 = 30/60 = .5000
     c) P(D ∪ E) = P(D) + P(E) = 23/60 + 16/60 = 39/60 = .6500
     d) P(C ∪ F) = P(C) + P(F) - P(C ∩ F) = 15/60 + 21/60 - 5/60 = 31/60 = .5167
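     The marginal, joint, and union probabilities all come straight from the raw-count table, so a short sketch can confirm parts a) and d). The dictionary layout below is illustrative only and not part of the original solution.

         # Raw counts from the 4.9 table: rows A, B, C and columns D, E, F.
         counts = {
             "A": {"D": 5, "E": 8, "F": 12},
             "B": {"D": 10, "E": 6, "F": 4},
             "C": {"D": 8, "E": 2, "F": 5},
         }
         total = sum(sum(row.values()) for row in counts.values())   # 60

         def p_row(r):                      # marginal probability of a row category
             return sum(counts[r].values()) / total

         def p_col(c):                      # marginal probability of a column category
             return sum(counts[r][c] for r in counts) / total

         def p_union(r, c):                 # general law of addition
             return p_row(r) + p_col(c) - counts[r][c] / total

         print(round(p_union("A", "D"), 4))   # 0.7167
         print(round(p_union("C", "F"), 4))   # 0.5167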

4.10  The table:

                E       F
      A       .10     .03     .13
      B       .04     .12     .16
      C       .27     .06     .33
      D       .31     .07     .38
              .72     .28    1.00

      a) P(A ∪ F) = P(A) + P(F) - P(A ∩ F) = .13 + .28 - .03 = .38
      b) P(E ∪ B) = P(E) + P(B) - P(E ∩ B) = .72 + .16 - .04 = .84
      c) P(B ∪ C) = P(B) + P(C) = .16 + .33 = .49
      d) P(E ∪ F) = P(E) + P(F) = .72 + .28 = 1.000

4.11  Let A = event of having flown in an airplane at least once
      Let T = event of having ridden in a train at least once

      P(A) = .47     P(T) = .28

      P(ridden either a train or an airplane) = P(A ∪ T) = P(A) + P(T) - P(A ∩ T)
                                              = .47 + .28 - P(A ∩ T)

      This problem cannot be solved without knowing the probability of the intersection. We need to know the probability of the intersection of A and T (the proportion who have ridden both) or determine whether these two events are mutually exclusive.

4.12  P(L) = .75     P(M) = .78     P(M ∩ L) = .61

      a) P(M ∪ L) = P(M) + P(L) - P(M ∩ L) = .78 + .75 - .61 = .92
      b) P(M ∪ L but not both) = P(M ∪ L) - P(M ∩ L) = .92 - .61 = .31
      c) P(NM ∩ NL) = 1 - P(M ∪ L) = 1 - .92 = .08

      Note: the neither/nor event is solved for here by taking the complement of the union.

4.13  Let C = have cable TV
      Let T = have 2 or more TV sets

      P(C) = .67, P(T) = .74, P(C ∩ T) = .55

      a) P(C ∪ T) = P(C) + P(T) - P(C ∩ T) = .67 + .74 - .55 = .86
      b) P(C ∪ T but not both) = P(C ∪ T) - P(C ∩ T) = .86 - .55 = .31
      c) P(NC ∩ NT) = 1 - P(C ∪ T) = 1 - .86 = .14
      d) The special law of addition does not apply because P(C ∩ T) is not .0000. Possession of cable TV and possession of 2 or more TV sets are not mutually exclusive.

4.14  Let T = review transcript
      Let F = consider faculty references

      P(T) = .54     P(F) = .44     P(T ∩ F) = .35

      a) P(F ∪ T) = P(F) + P(T) - P(F ∩ T) = .44 + .54 - .35 = .63
      b) P(F ∪ T) - P(F ∩ T) = .63 - .35 = .28
      c) 1 - P(F ∪ T) = 1 - .63 = .37

      d) The matrix:

                              Faculty References
                                Y        N
         Transcript   Y       .35      .19      .54
                      N       .09      .37      .46
                               .44      .56     1.00

4.15  The table:

                C     D     E     F
      A         5    11    16     8     40
      B         2     3     5     7     17
                7    14    21    15     57

      a) P(A ∩ E) = 16/57 = .2807
      b) P(D ∩ B) = 3/57 = .0526
      c) P(D ∩ E) = .0000
      d) P(A ∩ B) = .0000

4.16  The table:

                D       E       F
      A       .12     .13     .08     .33
      B       .18     .09     .04     .31
      C       .06     .24     .06     .36
              .36     .46     .18    1.00

      a) P(E ∩ B) = .09
      b) P(C ∩ F) = .06
      c) P(E ∩ D) = .00

4.17  Let D = Defective part

      a) (without replacement)
         P(D1 ∩ D2) = P(D1) ⋅ P(D2|D1) = (6/50)(5/49) = 30/2450 = .0122

      b) (with replacement)
         P(D1 ∩ D2) = P(D1) ⋅ P(D2) = (6/50)(6/50) = 36/2500 = .0144

4.18  Let U = Urban
      Let I = care for ill relatives

      P(U) = .78     P(I) = .15     P(I|U) = .11

      a) P(U ∩ I) = P(U) ⋅ P(I|U) = (.78)(.11) = .0858

      b) P(U ∩ NI) = P(U) ⋅ P(NI|U)
         but P(I|U) = .11, so P(NI|U) = 1 - .11 = .89
         and P(U ∩ NI) = P(U) ⋅ P(NI|U) = (.78)(.89) = .6942

      c) The matrix:

                             U
                       Yes       No
         I    Yes                           .15
              No                            .85
                       .78       .22

         The answer to a) is found in the YES-YES cell. To compute this cell, take 11% (.11) of the total (.78) of people in urban areas: (.11)(.78) = .0858, which belongs in the YES-YES cell.

         The answer to b) is found in the Yes for U and No for I cell. It can be determined by taking the marginal, .78, less the answer for a), .0858.

      d) P(NU ∩ I) is found in the No for U column and the Yes for I row (1st row and 2nd column). Take the marginal, .15, minus the YES-YES cell, .0858, to get .0642.
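      The cell-filling logic described above — joint = marginal × conditional, with the remaining cells obtained by subtraction — can be mirrored in a few lines. A minimal sketch, assuming the same inputs P(U) = .78, P(I) = .15, and P(I|U) = .11; the variable names are illustrative.

          # Fill the 2x2 probability matrix for Problem 4.18 from P(U) and P(I|U).
          p_u = 0.78            # urban
          p_i = 0.15            # care for ill relatives
          p_i_given_u = 0.11

          p_u_and_i = p_u * p_i_given_u        # a) YES-YES cell          = 0.0858
          p_u_and_not_i = p_u - p_u_and_i      # b) urban, not caregiver  = 0.6942
          p_not_u_and_i = p_i - p_u_and_i      # d) not urban, caregiver  = 0.0642

          print(round(p_u_and_i, 4), round(p_u_and_not_i, 4), round(p_not_u_and_i, 4))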

4.19  Let S = stockholder
      Let C = college

      P(S) = .43     P(C) = .37     P(C|S) = .75

      a) P(NS) = 1 - .43 = .57
      b) P(S ∩ C) = P(S) ⋅ P(C|S) = (.43)(.75) = .3225
      c) P(S ∪ C) = P(S) + P(C) - P(S ∩ C) = .43 + .37 - .3225 = .4775
      d) P(NS ∩ NC) = 1 - P(S ∪ C) = 1 - .4775 = .5225
      e) P(NS ∪ NC) = P(NS) + P(NC) - P(NS ∩ NC) = .57 + .63 - .5225 = .6775
      f) P(C ∩ NS) = P(C) - P(C ∩ S) = .37 - .3225 = .0475

      The matrix:

                          C
                    Yes        No
      S    Yes    .3225      .1075      .43
           No     .0475      .5225      .57
                  .37        .63       1.00

4.20  Let F = fax machine
      Let P = personal computer

      Given: P(F) = .10     P(P) = .52     P(P|F) = .91

      a) P(F ∩ P) = P(F) ⋅ P(P|F) = (.10)(.91) = .091
      b) P(F ∪ P) = P(F) + P(P) - P(F ∩ P) = .10 + .52 - .091 = .529
      c) P(F ∩ NP) = P(F) ⋅ P(NP|F)
         Since P(P|F) = .91, P(NP|F) = 1 - P(P|F) = 1 - .91 = .09
         P(F ∩ NP) = (.10)(.09) = .009
      d) P(NF ∩ NP) = 1 - P(F ∪ P) = 1 - .529 = .471
      e) P(NF ∩ P) = P(P) - P(F ∩ P) = .52 - .091 = .429

      The matrix:

                   P        NP
      F          .091     .009      .10
      NF         .429     .471      .90
                 .520     .480     1.00

4.21  Let S = safety
      Let A = age

      P(S) = .30     P(A) = .39     P(A|S) = .87

      a) P(S ∩ NA) = P(S) ⋅ P(NA|S)
         but P(NA|S) = 1 - P(A|S) = 1 - .87 = .13
         P(S ∩ NA) = (.30)(.13) = .039
      b) P(NS ∩ NA) = 1 - P(S ∪ A) = 1 - [P(S) + P(A) - P(S ∩ A)]
         but P(S ∩ A) = P(S) ⋅ P(A|S) = (.30)(.87) = .261
         P(NS ∩ NA) = 1 - (.30 + .39 - .261) = .571
      c) P(NS ∩ A) = P(NS) - P(NS ∩ NA)
         but P(NS) = 1 - P(S) = 1 - .30 = .70
         P(NS ∩ A) = .70 - .571 = .129

      The matrix:

                          A
                    Yes        No
      S    Yes     .261       .039      .30
           No      .129       .571      .70
                   .39        .61      1.00

4.22  Let C = ceiling fans
      Let O = outdoor grill

      P(C) = .60     P(O) = .29     P(C ∩ O) = .13

      a) P(C ∪ O) = P(C) + P(O) - P(C ∩ O) = .60 + .29 - .13 = .76
      b) P(NC ∩ NO) = 1 - P(C ∪ O) = 1 - .76 = .24
      c) P(NC ∩ O) = P(O) - P(C ∩ O) = .29 - .13 = .16
      d) P(C ∩ NO) = P(C) - P(C ∩ O) = .60 - .13 = .47

      The matrix:

                          O
                    Yes        No
      C    Yes     .13        .47      .60
           No      .16        .24      .40
                   .29        .71     1.00

4.23  The table:

                E     F     G
      A        15    12     8      35
      B        11    17    19      47
      C        21    32    27      80
      D        18    13    12      43
               65    74    66     205

      a) P(G|A) = 8/35 = .2286
      b) P(B|F) = 17/74 = .2297
      c) P(C|E) = 21/65 = .3231
      d) P(E|G) = .0000

4.24  The table:

                C       D
      A       .36     .44     .80
      B       .11     .09     .20
              .47     .53    1.00

      a) P(C|A) = .36/.80 = .4500
      b) P(B|D) = .09/.53 = .1698
      c) P(A|B) = .0000

4.25  The table:

                            Calculator
                          Yes      No
      Computer   Yes       46       3      49
                 No        11      15      26
                           57      18      75

      Select a category from each variable and test whether P(V1|V2) = P(V1).
      For example, does P(Yes Computer|Yes Calculator) = P(Yes Computer)?

      46/57 =? 49/75
      .8070 ≠ .6533

      Since this is one example in this matrix where the conditional does not equal the marginal, the variable computer is not independent of the variable calculator.
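      The same independence test — comparing each conditional with the corresponding marginal — can be run over the whole table. A small sketch, assuming the counts above; the loop structure is illustrative only.

          # Independence check for Problem 4.25: Computer (rows) vs. Calculator (columns).
          counts = {("Yes", "Yes"): 46, ("Yes", "No"): 3,
                    ("No", "Yes"): 11, ("No", "No"): 15}
          total = sum(counts.values())                                   # 75

          for comp in ("Yes", "No"):
              p_comp = sum(v for (r, _), v in counts.items() if r == comp) / total
              for calc in ("Yes", "No"):
                  col_total = sum(v for (_, c), v in counts.items() if c == calc)
                  p_comp_given_calc = counts[(comp, calc)] / col_total
                  # Independence would require P(computer | calculator) = P(computer) in every cell.
                  print(f"{comp:>3} {calc:>3}  conditional {p_comp_given_calc:.4f}  marginal {p_comp:.4f}")
          # The first line shows .8070 vs .6533, so the two variables are not independent.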

4.26  Let C = construction
      Let S = South Atlantic

      83,384 total failures
      10,867 failures in construction
      8,010 failures in South Atlantic
      1,258 failures in construction and South Atlantic

      a) P(S) = 8,010/83,384 = .09606

      b) P(C ∪ S) = P(C) + P(S) - P(C ∩ S)
                  = 10,867/83,384 + 8,010/83,384 - 1,258/83,384 = 17,619/83,384 = .2113

      c) P(C|S) = P(C ∩ S)/P(S) = (1,258/83,384)/(8,010/83,384) = .15705

      d) P(S|C) = P(C ∩ S)/P(C) = (1,258/83,384)/(10,867/83,384) = .11576

      e) P(NS|NC) = P(NS ∩ NC)/P(NC) = [1 - P(C ∪ S)]/P(NC)
         but NC = 83,384 - 10,867 = 72,517 and P(NC) = 72,517/83,384 = .869675
         Therefore, P(NS|NC) = (1 - .2113)/(.869675) = .9069

      f) P(NS|C) = P(NS ∩ C)/P(C) = [P(C) - P(C ∩ S)]/P(C)
         but P(C) = 10,867/83,384 = .1303 and P(C ∩ S) = 1,258/83,384 = .0151
         Therefore, P(NS|C) = (.1303 - .0151)/.1303 = .8842

4.27  Let E = Economy
      Let Q = Qualified

      P(E) = .46     P(Q) = .37     P(E ∩ Q) = .15

      a) P(E|Q) = P(E ∩ Q)/P(Q) = .15/.37 = .4054
      b) P(Q|E) = P(E ∩ Q)/P(E) = .15/.46 = .3261
      c) P(Q|NE) = P(Q ∩ NE)/P(NE)
         but P(Q ∩ NE) = P(Q) - P(Q ∩ E) = .37 - .15 = .22
         and P(NE) = 1 - P(E) = 1 - .46 = .54
         P(Q|NE) = .22/.54 = .4074
      d) P(NE ∩ NQ) = 1 - P(E ∪ Q) = 1 - [P(E) + P(Q) - P(E ∩ Q)]
                    = 1 - [.46 + .37 - .15] = 1 - .68 = .32

      The matrix:

                          Q
                    Yes        No
      E    Yes     .15        .31      .46
           No      .22        .32      .54
                   .37        .63     1.00

4.28  Let EM = email while on phone
      Let TD = "to-do" lists during meetings

      P(EM) = .54     P(TD|EM) = .20

      a) P(EM ∩ TD) = P(EM) ⋅ P(TD|EM) = (.54)(.20) = .1080
      b) P(not TD|EM) = 1 - P(TD|EM) = 1 - .20 = .80
      c) P(not TD ∩ EM) = P(EM) - P(EM ∩ TD) = .54 - .1080 = .4320
         This could also have been worked as:
         P(not TD ∩ EM) = P(EM) ⋅ P(not TD|EM) = (.54)(.80) = .4320

      The matrix:

                          TD
                    Yes        No
      EM   Yes    .1080      .4320      .54
           No                           .46
                                       1.00

4.29  Let H = hardware
      Let S = software

      P(H) = .37     P(S) = .54     P(S|H) = .97

      a) P(NS|H) = 1 - P(S|H) = 1 - .97 = .03
      b) P(S|NH) = P(S ∩ NH)/P(NH)
         but P(H ∩ S) = P(H) ⋅ P(S|H) = (.37)(.97) = .3589
         so P(NH ∩ S) = P(S) - P(H ∩ S) = .54 - .3589 = .1811
         and P(NH) = 1 - P(H) = 1 - .37 = .63
         P(S|NH) = .1811/.63 = .2875
      c) P(NH|S) = P(NH ∩ S)/P(S) = .1811/.54 = .3354
      d) P(NH|NS) = P(NH ∩ NS)/P(NS)
         but P(NH ∩ NS) = P(NH) - P(NH ∩ S) = .63 - .1811 = .4489
         and P(NS) = 1 - P(S) = 1 - .54 = .46
         P(NH|NS) = .4489/.46 = .9759

      The matrix:

                          S
                    Yes        No
      H    Yes    .3589      .0111      .37
           No     .1811      .4489      .63
                  .54        .46       1.00

4.30  Let R = agreed or strongly agreed that lack of role models was a barrier
      Let S = agreed or strongly agreed that gender-based stereotypes was a barrier

      P(R) = .43     P(S) = .46     P(R|S) = .77     P(not S) = .54

      a) P(not R|S) = 1 - P(R|S) = 1 - .77 = .23
      b) P(not S|R) = P(not S ∩ R)/P(R)
         but P(S ∩ R) = P(S) ⋅ P(R|S) = (.46)(.77) = .3542
         so P(not S ∩ R) = P(R) - P(S ∩ R) = .43 - .3542 = .0758
         Therefore, P(not S|R) = .0758/.43 = .1763
      c) P(not R|not S) = P(not R ∩ not S)/P(not S)
         but P(not R ∩ not S) = P(not S) - P(not S ∩ R) = .54 - .0758 = .4642
         P(not R|not S) = .4642/.54 = .8596

      The matrix:

                          S
                    Yes        No
      R    Yes    .3542      .0758      .43
           No     .1058      .4642      .57
                  .46        .54       1.00

4.31  Let A = product produced on Machine A
          B = product produced on Machine B
          C = product produced on Machine C
          D = defective product

      P(A) = .10     P(B) = .40     P(C) = .50
      P(D|A) = .05   P(D|B) = .12   P(D|C) = .08

      Event    Prior    Conditional    Joint           Revised
               P(Ei)    P(D|Ei)        P(D ∩ Ei)       P(Ei|D)
      A        .10      .05            .005            .005/.093 = .0538
      B        .40      .12            .048            .048/.093 = .5161
      C        .50      .08            .040            .040/.093 = .4301
                                       P(D) = .093

      Revised: P(A|D) = .005/.093 = .0538
               P(B|D) = .048/.093 = .5161
               P(C|D) = .040/.093 = .4301
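      The Bayesian revision table can be reproduced in a few lines of code. A minimal sketch, assuming the same priors and conditionals as in the problem; the dictionary names are illustrative.

          # Bayes' rule revision for Problem 4.31: which machine produced a defective part?
          priors = {"A": 0.10, "B": 0.40, "C": 0.50}          # P(machine)
          p_def = {"A": 0.05, "B": 0.12, "C": 0.08}           # P(D | machine)

          joints = {m: priors[m] * p_def[m] for m in priors}  # P(D and machine)
          p_d = sum(joints.values())                          # P(D) = .093

          revised = {m: joints[m] / p_d for m in priors}      # P(machine | D)
          for m in priors:
              print(m, round(joints[m], 3), round(revised[m], 4))
          # A 0.005 0.0538   B 0.048 0.5161   C 0.04 0.4301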

4.32  Let A = Alex fills the order
          B = Alicia fills the order
          C = Juan fills the order
          I = order filled incorrectly
          K = order filled correctly

      P(A) = .30     P(B) = .45     P(C) = .25
      P(I|A) = .20   P(I|B) = .12   P(I|C) = .05
      P(K|A) = .80   P(K|B) = .88   P(K|C) = .95

      a) P(B) = .45

      b) P(K|C) = 1 - P(I|C) = 1 - .05 = .95

      c) Event    Prior    Conditional    Joint            Revised
                  P(Ei)    P(I|Ei)        P(I ∩ Ei)        P(Ei|I)
         A        .30      .20            .0600            .0600/.1265 = .4743
         B        .45      .12            .0540            .0540/.1265 = .4269
         C        .25      .05            .0125            .0125/.1265 = .0988
                                          P(I) = .1265

         Revised: P(A|I) = .0600/.1265 = .4743
                  P(B|I) = .0540/.1265 = .4269
                  P(C|I) = .0125/.1265 = .0988

      d) Event    Prior    Conditional    Joint            Revised
                  P(Ei)    P(K|Ei)        P(K ∩ Ei)        P(Ei|K)
         A        .30      .80            .2400            .2400/.8735 = .2748
         B        .45      .88            .3960            .3960/.8735 = .4533
         C        .25      .95            .2375            .2375/.8735 = .2719
                                          P(K) = .8735

4.33  Let T = lawn treated by Tri-state
          G = lawn treated by Green Chem
          V = very healthy lawn
          N = not very healthy lawn

      P(T) = .72     P(G) = .28     P(V|T) = .30     P(V|G) = .20

      Event    Prior    Conditional    Joint            Revised
               P(Ei)    P(V|Ei)        P(V ∩ Ei)        P(Ei|V)
      T        .72      .30            .216             .216/.272 = .7941
      G        .28      .20            .056             .056/.272 = .2059
                                       P(V) = .272

      Revised: P(T|V) = .216/.272 = .7941
               P(G|V) = .056/.272 = .2059

4.34  Let S = small
      Let L = large

      The prior probabilities are: P(S) = .70     P(L) = .30
      P(T|S) = .18     P(T|L) = .82     (T = offers training)

      Event    Prior    Conditional    Joint            Revised
               P(Ei)    P(T|Ei)        P(T ∩ Ei)        P(Ei|T)
      S        .70      .18            .1260            .1260/.3720 = .3387
      L        .30      .82            .2460            .2460/.3720 = .6613
                                       P(T) = .3720

      Revised: P(S|T) = .1260/.3720 = .3387
               P(L|T) = .2460/.3720 = .6613

      37.2% offer training, since P(T) = .3720.

4.35  The table:

                         Variable 1
                          D      E
      Variable 2   A     10     20     30
                   B     15      5     20
                   C     30     15     45
                         55     40     95

      a) P(E) = 40/95 = .42105
      b) P(B ∪ D) = P(B) + P(D) - P(B ∩ D) = 20/95 + 55/95 - 15/95 = 60/95 = .63158
      c) P(A ∩ E) = 20/95 = .21053
      d) P(B|E) = 5/40 = .1250
      e) P(A ∪ B) = P(A) + P(B) = 30/95 + 20/95 = 50/95 = .52632
      f) P(B ∩ C) = .0000 (mutually exclusive)
      g) P(D|C) = 30/45 = .66667
      h) P(A|B) = P(A ∩ B)/P(B) = .0000/(20/95) = .0000 (A and B are mutually exclusive)
      i) Does P(A) = P(A|D)?  Does 30/95 = 10/55?
         Since .31579 ≠ .18182, Variables 1 and 2 are not independent.

4.36  The table:

                D     E     F     G
      A         3     9     7    12     31
      B         8     4     6     4     22
      C        10     5     3     7     25
               21    18    16    23     78

      a) P(F ∩ A) = 7/78 = .08974
      b) P(A|B) = P(A ∩ B)/P(B) = .0000/(22/78) = .0000 (A and B are mutually exclusive)
      c) P(B) = 22/78 = .28205
      d) P(E ∩ F) = .0000 (mutually exclusive)
      e) P(D|B) = 8/22 = .36364
      f) P(B|D) = 8/21 = .38095
      g) P(D ∪ C) = 21/78 + 25/78 - 10/78 = 36/78 = .4615
      h) P(F) = 16/78 = .20513

4.37  The table:

                                  Age (years)
                    <35    35-44    45-54    55-64    >65
      Male          .11     .20      .19      .12     .16      .78
      Female        .07     .08      .04      .02     .01      .22
                    .18     .28      .23      .14     .17     1.00

      a) P(35-44) = .28
      b) P(Woman ∩ 45-54) = .04
      c) P(Man ∪ 35-44) = P(Man) + P(35-44) - P(Man ∩ 35-44) = .78 + .28 - .20 = .86
      d) P(<35 ∪ 55-64) = P(<35) + P(55-64) = .18 + .14 = .32
      e) P(Woman|45-54) = P(Woman ∩ 45-54)/P(45-54) = .04/.23 = .1739
      f) P(not Woman ∩ not 55-64) = .11 + .20 + .19 + .16 = .66

4.38  Let T = thoroughness
      Let K = knowledge

      P(T) = .78     P(K) = .40     P(T ∩ K) = .27

      a) P(T ∪ K) = P(T) + P(K) - P(T ∩ K) = .78 + .40 - .27 = .91
      b) P(NT ∩ NK) = 1 - P(T ∪ K) = 1 - .91 = .09
      c) P(K|T) = P(K ∩ T)/P(T) = .27/.78 = .3462
      d) P(NT ∩ K) = P(NT) - P(NT ∩ NK)
         but P(NT) = 1 - P(T) = .22
         P(NT ∩ K) = .22 - .09 = .13

      The matrix:

                          K
                    Yes        No
      T    Yes     .27        .51      .78
           No      .13        .09      .22
                   .40        .60     1.00

4.39  Let R = retirement
      Let L = life insurance

      P(R) = .42     P(L) = .61     P(R ∩ L) = .33

      a) P(R|L) = P(R ∩ L)/P(L) = .33/.61 = .5410
      b) P(L|R) = P(R ∩ L)/P(R) = .33/.42 = .7857
      c) P(L ∪ R) = P(L) + P(R) - P(L ∩ R) = .61 + .42 - .33 = .70
      d) P(R ∩ NL) = P(R) - P(R ∩ L) = .42 - .33 = .09
      e) P(NL|R) = P(NL ∩ R)/P(R) = .09/.42 = .2143

      The matrix:

                          L
                    Yes        No
      R    Yes     .33        .09      .42
           No      .28        .30      .58
                   .61        .39     1.00

4.40  P(T) = .16     P(W) = .21     P(NE) = .20
      P(T|W) = .20   P(T|NE) = .17

      a) P(W ∩ T) = P(W) ⋅ P(T|W) = (.21)(.20) = .042
      b) P(NE ∩ T) = P(NE) ⋅ P(T|NE) = (.20)(.17) = .034
      c) P(W|T) = P(W ∩ T)/P(T) = .042/.16 = .2625
      d) P(NE|NT) = P(NE ∩ NT)/P(NT) = [P(NE) ⋅ P(NT|NE)]/P(NT)
         but P(NT|NE) = 1 - P(T|NE) = 1 - .17 = .83
         and P(NT) = 1 - P(T) = 1 - .16 = .84
         Therefore, P(NE|NT) = [(.20)(.83)]/(.84) = .1976
      e) P(not W ∩ not NE|T) = P(not W ∩ not NE ∩ T)/P(T)
         but P(not W ∩ not NE ∩ T) = .16 - P(W ∩ T) - P(NE ∩ T) = .16 - .042 - .034 = .084
         P(not W ∩ not NE ∩ T)/P(T) = .084/.16 = .525

4.41  Let M = MasterCard
      Let A = American Express
      Let V = Visa

      P(M) = .30     P(A) = .20     P(V) = .25
      P(M ∩ A) = .08     P(V ∩ M) = .12     P(A ∩ V) = .06

      a) P(V ∪ A) = P(V) + P(A) - P(V ∩ A) = .25 + .20 - .06 = .39
      b) P(V|M) = P(V ∩ M)/P(M) = .12/.30 = .40
      c) P(M|V) = P(V ∩ M)/P(V) = .12/.25 = .48
      d) Does P(V) = P(V|M)?  Since .25 ≠ .40, possession of Visa is not independent of possession of MasterCard.
      e) American Express is not mutually exclusive of Visa because P(A ∩ V) ≠ .0000

4.42  Let S = believe Social Security will be secure
          N = don't believe Social Security will be secure
          <45 = under 45 years old
          >45 = 45 or more years old

      P(N) = .51, therefore P(S) = 1 - .51 = .49
      P(<45) = .57     P(>45) = .43
      P(S|>45) = .70, therefore P(N|>45) = 1 - P(S|>45) = 1 - .70 = .30

      a) P(>45) = 1 - P(<45) = 1 - .57 = .43
      b) P(<45 ∩ S) = P(S) - P(>45 ∩ S)
         but P(>45 ∩ S) = P(>45) ⋅ P(S|>45) = (.43)(.70) = .301
         P(<45 ∩ S) = P(S) - P(>45 ∩ S) = .49 - .301 = .189
      c) P(>45|S) = P(>45 ∩ S)/P(S) = .301/.49 = .6143
      d) P(<45 ∪ N) = P(<45) + P(N) - P(<45 ∩ N)
         but P(<45 ∩ N) = P(<45) - P(<45 ∩ S) = .57 - .189 = .381
         so P(<45 ∪ N) = .57 + .51 - .381 = .699

      Probability matrix solution for Problem 4.42:

                   S        N
      <45        .189     .381      .57
      >45        .301     .129      .43
                 .490     .510     1.00

4.43  Let M = expect to save more
          R = expect to reduce debt
          NM = don't expect to save more
          NR = don't expect to reduce debt

      P(M) = .43     P(R) = .45     P(R|M) = .81
      P(NR|M) = 1 - P(R|M) = 1 - .81 = .19
      P(NM) = 1 - P(M) = 1 - .43 = .57
      P(NR) = 1 - P(R) = 1 - .45 = .55

      a) P(M ∩ R) = P(M) ⋅ P(R|M) = (.43)(.81) = .3483
      b) P(M ∪ R) = P(M) + P(R) - P(M ∩ R) = .43 + .45 - .3483 = .5317
      c) P(neither save nor reduce debt) = 1 - P(M ∪ R) = 1 - .5317 = .4683
      d) P(M ∩ NR) = P(M) ⋅ P(NR|M) = (.43)(.19) = .0817

      Probability matrix for Problem 4.43:

                          Reduce
                     Yes        No
      Save  Yes    .3483      .0817      .43
            No     .1017      .4683      .57
                   .45        .55       1.00

4.44  Let R = read
      Let B = checked in with the boss

      P(R) = .40     P(B) = .34     P(B|R) = .78

      a) P(B ∩ R) = P(R) ⋅ P(B|R) = (.40)(.78) = .312
      b) P(NR ∩ NB) = 1 - P(R ∪ B)
         but P(R ∪ B) = P(R) + P(B) - P(R ∩ B) = .40 + .34 - .312 = .428
         P(NR ∩ NB) = 1 - .428 = .572
      c) P(R|B) = P(R ∩ B)/P(B) = .312/.34 = .9176
      d) P(NB|R) = 1 - P(B|R) = 1 - .78 = .22
      e) P(NB|NR) = P(NB ∩ NR)/P(NR)
         but P(NR) = 1 - P(R) = 1 - .40 = .60
         P(NB|NR) = .572/.60 = .9533

      f) Probability matrix for Problem 4.44:

                    B        NB
      R           .312     .088      .40
      NR          .028     .572      .60
                  .34      .66      1.00

4.45  Let Q = keep quiet when they see co-worker misconduct
      Let C = call in sick when they are well

      P(Q) = .35     P(NQ) = 1 - .35 = .65     P(C|Q) = .75     P(Q|C) = .40

      a) P(C ∩ Q) = P(Q) ⋅ P(C|Q) = (.35)(.75) = .2625
      b) P(Q ∪ C) = P(Q) + P(C) - P(C ∩ Q)
         but P(C) must be solved for: P(C ∩ Q) = P(C) ⋅ P(Q|C)
         .2625 = P(C)(.40), therefore P(C) = .2625/.40 = .65625
         and P(Q ∪ C) = .35 + .65625 - .2625 = .74375
      c) P(NQ|C) = P(NQ ∩ C)/P(C)
         but P(NQ ∩ C) = P(C) - P(C ∩ Q) = .65625 - .2625 = .39375
         Therefore, P(NQ|C) = .39375/.65625 = .60
      d) P(NQ ∩ NC) = 1 - P(Q ∪ C) = 1 - .74375 = .25625
      e) P(Q ∩ NC) = P(Q) - P(Q ∩ C) = .35 - .2625 = .0875

      Probability matrix for Problem 4.45:

                           C
                     Y           N
      Q    Y       .2625       .0875       .35
           N       .39375      .25625      .65
                   .65625      .34375     1.00

4.46  Let D = denial
          I = inappropriate
          C = customer
          P = payment dispute
          S = specialty
          G = delays getting care
          R = prescription drugs

      P(D) = .17   P(I) = .14   P(C) = .14   P(P) = .11   P(S) = .10   P(G) = .08   P(R) = .07

      a) P(P ∪ S) = P(P) + P(S) = .11 + .10 = .21
      b) P(R ∩ C) = .0000 (mutually exclusive)
      c) P(I|S) = P(I ∩ S)/P(S) = .0000/.10 = .0000
      d) P(NG ∩ NP) = 1 - P(G ∪ P) = 1 - [P(G) + P(P)] = 1 - [.08 + .11] = 1 - .19 = .81

4.47  Let R = retention
      Let P = process improvement

      P(R) = .56     P(P ∩ R) = .36     P(R|P) = .90

      a) P(R ∩ NP) = P(R) - P(P ∩ R) = .56 - .36 = .20
      b) P(P|R) = P(P ∩ R)/P(R) = .36/.56 = .6429
      c) P(P) = ?
         Solve P(R|P) = P(R ∩ P)/P(P) for P(P):
         P(P) = P(R ∩ P)/P(R|P) = .36/.90 = .40
      d) P(R ∪ P) = P(R) + P(P) - P(R ∩ P) = .56 + .40 - .36 = .60
      e) P(NR ∩ NP) = 1 - P(R ∪ P) = 1 - .60 = .40
      f) P(R|NP) = P(R ∩ NP)/P(NP)
         but P(NP) = 1 - P(P) = 1 - .40 = .60
         P(R|NP) = .20/.60 = .3333

      The matrix:

                          P
                     Y         N
      R    Y       .36       .20      .56
           N       .04       .40      .44
                   .40       .60     1.00

      Note: In constructing the matrix, we are given P(R) = .56, P(P ∩ R) = .36, and P(R|P) = .90. That is, only one marginal probability is given. From P(R), we can get P(NR) by taking 1 - .56 = .44. However, only these two marginal values can be computed directly. To solve for P(P) using what is given, since we know that 90% of P lies in the intersection and that the intersection is .36, we can set up an equation and solve for P(P): .90 P(P) = .36, so P(P) = .40.
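      The algebra in the note — recovering the unknown marginal P(P) from the given intersection and conditional, then filling the rest of the matrix by subtraction — is a one-line computation. A small sketch with the same numbers; the variable names are illustrative only.

          # Problem 4.47: recover P(P) from P(R and P) and P(R | P), then fill the matrix.
          p_r = 0.56
          p_r_and_p = 0.36
          p_r_given_p = 0.90

          p_p = p_r_and_p / p_r_given_p                      # since P(R|P) = P(R and P)/P(P) -> 0.40
          p_r_and_not_p = p_r - p_r_and_p                    # 0.20
          p_not_r_and_p = p_p - p_r_and_p                    # 0.04
          p_not_r_and_not_p = 1 - (p_r + p_p - p_r_and_p)    # complement of the union -> 0.40

          print(round(p_p, 2), round(p_r_and_not_p, 2),
                round(p_not_r_and_p, 2), round(p_not_r_and_not_p, 2))
          # 0.4 0.2 0.04 0.4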

4.48  Let M = mail
      Let S = sales

      P(M) = .38     P(M ∩ S) = .0000     P(NM ∩ NS) = .41

      a) P(M ∩ NS) = P(M) - P(M ∩ S) = .38 - .00 = .38
      b) Because P(M ∩ S) = .0000, P(M ∪ S) = P(M) + P(S)
         Therefore, P(S) = P(M ∪ S) - P(M)
         but P(M ∪ S) = 1 - P(NM ∩ NS) = 1 - .41 = .59
         Thus, P(S) = P(M ∪ S) - P(M) = .59 - .38 = .21
      c) P(S|M) = P(S ∩ M)/P(M) = .0000/.38 = .0000
      d) P(NM|NS) = P(NM ∩ NS)/P(NS) = .41/.79 = .5190
         where P(NS) = 1 - P(S) = 1 - .21 = .79

      Probability matrix for Problem 4.48:

                          S
                     Y         N
      M    Y       .0000     .38      .38
           N       .21       .41      .62
                   .21       .79     1.00

4.49  Let F = flexible work
      Let V = gives time off for volunteerism

      P(F) = .41     P(V|F) = .60     P(V|NF) = .10
      From this, P(NF) = 1 - .41 = .59

      a) P(F ∪ V) = P(F) + P(V) - P(F ∩ V)
         P(F) = .41 and P(F ∩ V) = P(F) ⋅ P(V|F) = (.41)(.60) = .246
         Find P(V) by using P(V) = P(F ∩ V) + P(NF ∩ V)
         but P(NF ∩ V) = P(NF) ⋅ P(V|NF) = (.59)(.10) = .059
         so P(V) = P(F ∩ V) + P(NF ∩ V) = .246 + .059 = .305
         and P(F ∪ V) = P(F) + P(V) - P(F ∩ V) = .41 + .305 - .246 = .469
      b) P(F ∩ NV) = P(F) - P(F ∩ V) = .41 - .246 = .164
      c) P(F|NV) = P(F ∩ NV)/P(NV)
         P(F ∩ NV) = .164 and P(NV) = 1 - P(V) = 1 - .305 = .695
         P(F|NV) = P(F ∩ NV)/P(NV) = .164/.695 = .2360
      d) P(NF|V) = P(NF ∩ V)/P(V) = .059/.305 = .1934
      e) P(NF ∪ NV) = P(NF) + P(NV) - P(NF ∩ NV)
         P(NF) = .59 and P(NV) = .695
         Solve for P(NF ∩ NV) = P(NV) - P(F ∩ NV) = .695 - .164 = .531
         P(NF ∪ NV) = P(NF) + P(NV) - P(NF ∩ NV) = .59 + .695 - .531 = .754

      Probability matrix for Problem 4.49:

                          V
                     Y         N
      F    Y       .246      .164      .41
           N       .059      .531      .59
                   .305      .695     1.00
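      Part a) uses the law of total probability to assemble P(V) from the two conditional branches. A short sketch, assuming the same inputs as above; the variable names are illustrative only.

          # Problem 4.49: total probability of V from the F and not-F branches.
          p_f = 0.41
          p_v_given_f = 0.60
          p_v_given_not_f = 0.10

          p_not_f = 1 - p_f                                   # 0.59
          p_f_and_v = p_f * p_v_given_f                       # 0.246
          p_not_f_and_v = p_not_f * p_v_given_not_f           # 0.059
          p_v = p_f_and_v + p_not_f_and_v                     # 0.305 (law of total probability)

          p_f_or_v = p_f + p_v - p_f_and_v                    # a) 0.469
          p_f_given_not_v = (p_f - p_f_and_v) / (1 - p_v)     # c) about 0.236

          print(round(p_v, 3), round(p_f_or_v, 3), round(p_f_given_not_v, 4))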

4.50  Let S = Sarabia
          T = Tran
          J = Jackson
          B = blood test

      P(S) = .41     P(T) = .32     P(J) = .27
      P(B|S) = .05   P(B|T) = .08   P(B|J) = .06

      Event    Prior    Conditional    Joint             Revised
               P(Ei)    P(B|Ei)        P(B ∩ Ei)         P(Ei|B)
      S        .41      .05            .0205             .3291
      T        .32      .08            .0256             .4109
      J        .27      .06            .0162             .2600
                                       P(B) = .0623

4.51  Let R = regulations
      Let T = tax burden

      P(R) = .30     P(T) = .35     P(T|R) = .71

      a) P(R ∩ T) = P(R) ⋅ P(T|R) = (.30)(.71) = .2130
      b) P(R ∪ T) = P(R) + P(T) - P(R ∩ T) = .30 + .35 - .2130 = .4370
      c) P(R ∪ T) - P(R ∩ T) = .4370 - .2130 = .2240
      d) P(R|T) = P(R ∩ T)/P(T) = .2130/.35 = .6086
      e) P(NR|T) = 1 - P(R|T) = 1 - .6086 = .3914
      f) P(NR|NT) = P(NR ∩ NT)/P(NT) = [1 - P(R ∪ T)]/P(NT) = (1 - .4370)/.65 = .8662

      Probability matrix for Problem 4.51:

                          T
                     Y         N
      R    Y       .213      .087      .30
           N       .137      .563      .70
                   .35       .65      1.00

4.52  Event    Prior    Conditional     Joint               Revised
               P(Ei)    P(RB|Ei)        P(RB ∩ Ei)          P(Ei|RB)
      0-24     .353     .11             .03883              .03883/.25066 = .15491
      25-34    .142     .24             .03408              .03408/.25066 = .13596
      35-44    .160     .27             .04320              .04320/.25066 = .17235
      >45      .345     .39             .13455              .13455/.25066 = .53678
                                        P(RB) = .25066

4.53  Let GH = Good health
      Let HM = Happy marriage
      Let FG = Faith in God

      P(GH) = .29     P(HM) = .21     P(FG) = .40

      a) P(HM ∪ FG) = P(HM) + P(FG) - P(HM ∩ FG)
         but P(HM ∩ FG) = .0000
         P(HM ∪ FG) = P(HM) + P(FG) = .21 + .40 = .61
      b) P(HM ∪ FG ∪ GH) = P(HM) + P(FG) + P(GH) = .29 + .21 + .40 = .9000
      c) P(FG ∩ GH) = .0000
         The categories are mutually exclusive. The respondent could not select more than one answer.
      d) P(neither FG nor GH nor HM) = 1 - P(HM ∪ FG ∪ GH) = 1 - .9000 = .1000
