GAME THEORY - MARKOV ANALYSIS
MAN 203 QUANTITATIVE METHODS (EMIL KUZEY)

GAME THEORY

Game theory is the study of how optimal strategies are formulated in conflict. A two-person game allows only two people or two groups to be involved in the game. Zero sum means that the sum of the losses for one player must equal the sum of the gains for the other player. The overall sum of the losses and gains for both players, in other words, must be zero.

Depending on the actual payoffs in the game and the size of the game, a number of solution techniques can be used. In a pure strategy game, strategies for the players can be obtained without making any calculations. When there is not a pure strategy (also called a saddle point) for both players, it is necessary to use other techniques, such as the mixed strategy approach, and a computer solution for games larger than 2x2.

Minimax Criterion: A criterion that minimizes one's maximum losses. This is another way of solving a pure strategy game.

Mixed Strategy Game: A game in which the optimal strategy for both players involves playing more than one strategy over time. Each strategy is played a given percentage of the time.

Pure Strategy: A game in which both players will always use just one strategy.

Saddle Point Game: A game that has a pure strategy.

Two-Person Game: A game that has only two players.

Value of the Game: The expected winnings of the game if the game is played a large number of times.

Zero-Sum Game: A game in which the losses for one player equal the gains for the other player.

GAME THEORY QUESTIONS

1. Determine the strategies for X and Y given the following game. What is the value of the game?

         Y1   Y2
    X1    2   -4
    X2    6   10

2. What is the value of the following game, and what are the strategies?

         B1   B2
    A1   19   20
    A2    5   -4

3. Determine each player's strategies and the value of the game given the following table:

         Y1   Y2
    [the payoff entries are illegible in the source]
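A 2x2 zero-sum game like those in Questions 1 and 2 can be checked programmatically. The sketch below first tests for a saddle point (maximin = minimax); if none exists, it applies the standard 2x2 mixed-strategy formulas. The function name and return format are illustrative, not from the notes.

```python
def solve_2x2(a):
    """Solve a 2x2 zero-sum game; a = [[a11, a12], [a21, a22]] holds X's payoffs."""
    row_minima = [min(row) for row in a]                 # X's worst case per row
    col_maxima = [max(a[0][j], a[1][j]) for j in range(2)]  # Y's worst case per column
    lower = max(row_minima)   # X's maximin
    upper = min(col_maxima)   # Y's minimax
    if lower == upper:        # saddle point -> pure strategy game
        return {"type": "pure", "value": lower}
    # No saddle point: use the standard 2x2 mixed-strategy formulas.
    a11, a12 = a[0]
    a21, a22 = a[1]
    d = a11 + a22 - a12 - a21          # nonzero whenever there is no saddle point
    p = (a22 - a21) / d                # P(X plays row 1)
    q = (a22 - a12) / d                # P(Y plays column 1)
    value = (a11 * a22 - a12 * a21) / d
    return {"type": "mixed", "p": p, "q": q, "value": value}

# Question 1: X2/Y1 is a saddle point, so the value of the game is 6.
print(solve_2x2([[2, -4], [6, 10]]))   # -> {'type': 'pure', 'value': 6}
```

Running the same function on Question 2's table gives a pure strategy game with value 19 (A1 against B1).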
4. What is the value of the following game?

         S1    S2
    R1   21   116
    R2   89     3

5. Solve the following game: [the payoff table is illegible in the source]

MARKOV ANALYSIS

Markov analysis is a technique dealing with the probabilities of future occurrences based on currently known probabilities. Some of its numerous applications are:
• Business (e.g., market share analysis)
• Bad debt prediction
• University enrollment predictions
• Machine breakdown prediction

There are four assumptions of Markov analysis:
1. There is a limited or finite number of possible states.
2. The probability of changing states remains the same over time.
3. Any future state can be predicted from the previous state and the matrix of transition probabilities.
4. The size and makeup of the system do not change during the analysis (the size and states of the system remain the same).

Matrix of Transition Probabilities
• Shows the likelihood that the system will change from one time period to the next.
• This is the Markov process.
• It enables the prediction of future states or conditions.

States and State Probabilities

States are used to identify all possible conditions of a process or system. A system can exist in only one state at a time. Examples include:
• The working and broken states of a machine
• Three shops in town, with a customer able to patronize only one at a time
• Courses in a student's schedule, with the student able to occupy only one class at a time

To analyze a system:
1. Identify all states.
2. Determine the probability that the system is in each state.

This information is placed into a vector of state probabilities:

    π(i) = vector of state probabilities for period i = (π1, π2, ..., πn)

where n = number of states, and π1, π2, ..., πn = P(being in state 1, state 2, ..., state n).

For example, if dealing with only one machine, it may be known that it is currently functioning correctly. The vector of states can then be shown.
    π(1) = (1, 0)

where
    π(1) = vector of states for the machine in period 1
    π1 = 1 = P(being in state 1) = P(machine working)
    π2 = 0 = P(being in state 2) = P(machine broken)

Most of the time, however, problems deal with more than one item.

Three Grocery Store example:
• 100,000 customers shop monthly at the three grocery stores.
• State 1 = store 1 = 40,000/100,000 = 40%
• State 2 = store 2 = 30,000/100,000 = 30%
• State 3 = store 3 = 30,000/100,000 = 30%

The vector of state probabilities is:

    π(1) = (0.4, 0.3, 0.3)

where
    π(1) = vector of state probabilities in period 1
    π1 = 0.4 = P(a person being in store 1)
    π2 = 0.3 = P(a person being in store 2)
    π3 = 0.3 = P(a person being in store 3)

Matrix of Transition Probabilities

To calculate periodic changes, it is much more convenient to use a matrix of transition probabilities: a matrix of the conditional probabilities of being in a future state given a current state. Let

    Pij = conditional probability of being in state j in the future, given the current state i
        = P(state j at time 1 | state i at time 0)

For example, P12 is the probability of being in state 2 in the future given that the system was in state 1 in the prior period. Let P = the matrix of transition probabilities. Important: each row of P must sum to 1.

Predicting Future Market Shares

A purpose of Markov analysis is to predict the future. Given the
1. vector of state probabilities, and
2. matrix of transition probabilities,
it is easy to find the state probabilities in the future. This type of analysis allows the computation of the probability that a person will be at one of the grocery stores in the future. Since this probability equals market share, it is possible to determine the future market shares of the grocery stores.
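The one-period update π(1) = π(0)P can be sketched directly for the grocery store example. The starting vector (0.4, 0.3, 0.3) comes from the notes; the transition matrix below is an assumed, illustrative one, since the notes do not give numeric retention/switching probabilities.

```python
def next_state(pi, P):
    """One Markov period: return pi @ P as a plain Python list."""
    n = len(pi)
    # Each row of the transition matrix must sum to 1.
    assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi0 = [0.4, 0.3, 0.3]        # current market shares (from the notes)
P = [[0.8, 0.1, 0.1],        # assumed: P(store 1 customer stays / switches)
     [0.1, 0.7, 0.2],        # assumed: store 2 row
     [0.2, 0.2, 0.6]]        # assumed: store 3 row

pi1 = next_state(pi0, P)
print(pi1)                   # next-period market shares: [0.41, 0.31, 0.28]
```

With these assumed probabilities, store 1's share grows from 40% to 41% in one period; note that the new vector still sums to 1, as any vector of state probabilities must.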
When the current period is 0, the state probabilities for the next period (1) can be found using:

    π(1) = π(0)P

Therefore, the state probabilities n periods in the future can be obtained from the current state probabilities and the matrix of transition probabilities:

    π(n) = π(0)P^n

Equilibrium Conditions

Equilibrium state probabilities are the long-run average probabilities for being in each state.
• Equilibrium conditions exist if the state probabilities do not change after a large number of periods.
• At equilibrium, the state probabilities for the next period equal the state probabilities for the current period: π = πP.
• One way to compute the equilibrium share of the market is to run the Markov analysis for a large number of periods and see whether the future amounts approach stable values.
• Repeating the Markov analysis for 15 periods in the machine example, by the 15th period the shares of time the machine spends working and broken settle at around 66% and 34%, respectively.

Absorbing States
• Any state that does not have a probability of moving to another state is called an absorbing state.
• If an entity is in an absorbing state now, the probability of being in an absorbing state in the future is 100%.
• An example of such a process is accounts receivable: bills are either paid, delinquent, or written off as bad debt. Once a debt is paid or written off, it stays in that state.
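The "repeat for many periods" approach to equilibrium can be sketched as below. The transition matrix is an assumed one chosen so the long-run working share comes out near the 66% figure in the notes; the actual matrix of the machine example is not given in this text.

```python
def equilibrium(P, pi, periods=50):
    """Iterate pi <- pi @ P until the state probabilities stabilize."""
    n = len(pi)
    for _ in range(periods):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.8, 0.2],   # assumed: working -> working, working -> broken
     [0.4, 0.6]]   # assumed: broken  -> working, broken  -> broken

# Start from "machine working now", pi(0) = (1, 0).
pi = equilibrium(P, [1.0, 0.0])
print(pi)          # approaches (2/3, 1/3): about 66.7% working, 33.3% broken
```

The same stationary vector satisfies π = πP regardless of the starting state, which is exactly the equilibrium condition described above. An absorbing state would appear in P as a row like [0, 1]: once entered, the system never leaves it.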