1. Suppose that in a market there are only two drug dealers, A and B, who sell a homogeneous product. Each dealer has a constant marginal cost of 10. Consider the possibility that the two dealers will coordinate their activities in order to increase their profits. The cartel agreement would have each dealer restrict its output so that the market price will be high. However, each dealer might consider raising its output beyond the cartel quota (cheating on the agreement). Thus, for each dealer the two possible strategies are to comply and to cheat. Their possible payoffs are as follows:
- If both cheat, they each will earn profits of $10.
- If they both comply, they will each earn profits of $20.
- If one cheats and the other does not, the one who cheats will earn profits of $25; the one who doesn't cheat will earn profits of $5.
a. Does either dealer have a dominant strategy? Is there a Nash Equilibrium of this game? If so, what is it?
b. Suppose cartel agreements are contractible and contracts are enforced by a local mob boss. Would the dealers want to sign a contract? If so, how would the outcome change from part a?
c. Suppose that there is no mob boss. Instead, dealers A and B can regularly coordinate their activities on a daily basis. Dealer A has announced that he will "retire" in a year. Will the repeated interaction that they have until that time cause them to comply? Why or why not?
d. Suppose that instead of competing in the game described above, the two dealers were to compete in Bertrand fashion in a single period. That is, the firms would simultaneously set their own price at any level, and all of the demand for the good would go to the lower-priced dealer. Does either dealer have a dominant strategy in this case? What is the Nash Equilibrium of the game?
2. Two teenagers, James and Dean, take their cars to opposite ends of Main Street, Middle-of-Nowhere, USA, at midnight and start to drive toward each other. The one who swerves to avoid a collision is the "chicken" and the one who keeps going straight is the winner. If both maintain a straight course, there is a collision in which both cars are damaged and both players injured. Suppose the payoffs are shown in the following table:
                         Dean
                  Swerve       Straight
James  Swerve      0, 0        -1, 1
       Straight    1, -1       -2, -2
a. What is the Nash Equilibrium (or equilibria) of this game?
b. Suppose that James had access to some handcuffs that he could use to visibly lock his hands to the rearview mirror, completely inhibiting his ability to turn the steering wheel. Would James want to do this?
3. Consider the following game tree, a simplified model of the Cuban missile crisis. The Soviets have installed missiles in Cuba, and now the U.S. has the first move. It chooses between doing nothing and issuing a threat. Suppose the payoffs are as follows. If the United States does nothing, this is a major military and political achievement for the Soviets, so we score the payoffs as -2 for the United States and 2 for the Soviets. If the United States issues its threat, the Soviets get to move, and they can either withdraw or defy. Withdrawal is a humiliation (a substantial minus) for the Soviets and an affirmation of U.S. military superiority (a small plus), so we score it 1 for the United States and -4 for the Soviets. If the Soviets defy the U.S. threat, there will be a nuclear war. This is terrible for both, but particularly for the U.S., which as a democracy cares more for its citizens, so we score this -10 for the U.S. and -8 for the Soviets.
US
|-- Do nothing  ->  (-2, 2)
|-- Threat  ->  USSR
                |-- Withdraw  ->  (1, -4)
                |-- Defy      ->  (-10, -8)

Payoffs are listed as (U.S., USSR).
a. What is the Subgame Perfect Nash Equilibrium of this game?
b. Is there an equilibrium to this game that is not subgame perfect (that is, one which involves a non-credible threat)?
4. Three antagonists, Larry, Mo, and Curly, are engaged in a three-way gunfight. There are two rounds. In the first round, each player is given one shot: first Larry, then Mo, and then Curly. After the first round, any survivors are given a second shot, again beginning with Larry, then Mo, and then Curly. For each person, the best outcome is to be the sole survivor. Next best is to be one of two survivors. In third place is the outcome in which no one gets killed. Dead last is to be killed yourself.
Larry is a poor shot, with only a 30 percent chance of hitting a person at whom he aims. Mo is a much better shot, achieving 80 percent accuracy. Curly is a perfect shot - he never misses.
a. What is Larry's optimal strategy in the first round? (+1 point)
b. Who has the greatest chance of survival in this problem? Who has the lowest chance of survival? (+1 point)
Expert Solution
Question 1
Part a
This game is a version of the prisoner's dilemma, and the dominant strategy for both dealers is to cheat. If the other dealer cheats, the best response is to cheat as well (a payoff of $10 vs. $5 from complying). Likewise, if the other dealer complies, the best response is still to cheat ($25 vs. $20 from complying).
Therefore, cheat is a strictly dominant strategy for both players, and there is a unique Nash equilibrium: (Dealer A cheats, Dealer B cheats), with payoffs of (10, 10).
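The best-response reasoning above can be verified mechanically. Here is a minimal Python sketch; the payoff encoding and function names are illustrative, not part of the original solution:

```python
strategies = ["comply", "cheat"]
# payoff[(a, b)] = (Dealer A's profit, Dealer B's profit)
payoff = {
    ("comply", "comply"): (20, 20),
    ("comply", "cheat"):  (5, 25),
    ("cheat",  "comply"): (25, 5),
    ("cheat",  "cheat"):  (10, 10),
}

def best_response(player, rival_move):
    """The strategy maximizing `player`'s profit against the rival's move."""
    if player == 0:  # Dealer A chooses the row
        return max(strategies, key=lambda s: payoff[(s, rival_move)][0])
    return max(strategies, key=lambda s: payoff[(rival_move, s)][1])

# "cheat" is the best response to every rival move, i.e. strictly dominant.
assert all(best_response(0, m) == "cheat" for m in strategies)
assert all(best_response(1, m) == "cheat" for m in strategies)

# A Nash equilibrium is a pair of mutual best responses.
nash = [(a, b) for a in strategies for b in strategies
        if best_response(0, b) == a and best_response(1, a) == b]
print(nash)  # [('cheat', 'cheat')]
```

The brute-force check confirms that (cheat, cheat) is the only cell of the table from which neither dealer wants to deviate.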
Part b
Yes, they will want to sign it. The contract will allow them to get the outcome (Dealer A complies, Dealer B complies), which yields a payoff of (20, 20) vs the (10, 10) they are currently getting. This is because, thanks to the contract, they can commit to playing "comply".
Part c
No, it will not. On Dealer A's last working day (just before he retires), he has no incentive to comply, because the repeated game ends there. The only incentive to comply in any period is the promise that the other dealer will keep complying in this and future periods. Since there are no periods after A's last day, Dealer A is better off cheating, no matter what Dealer B does. Dealer B, knowing this, will also cheat on the last day.
Now, let's see what happens one day before the last working day. They both know that they will cheat in the next period. Therefore, once again, they have no incentive to comply in the current period, so they will both cheat. This logic is then repeated to the day before that, and the day before that one, etc. It's clear thus that both players will cheat in all periods until Dealer A retires.
Part d
In this case, neither dealer has a dominant strategy. Given that the other dealer chooses a price P > 10, each dealer's best response is to choose a price P - ε, where ε is a very small number. For example, if the other dealer sells at $15, the best response is to sell at $14.99: the highest possible price that still captures the whole market. On the other hand, if the other dealer chooses P = 10, then the best response is to choose P = 10 as well. Setting a price below 10 produces losses, because the marginal cost is 10 per unit.
As a result of this best response structure, the only possible Nash equilibrium is that both players choose P = 10.
Question 2
Part a
There are two Nash equilibria in this game: (Swerve, Straight) and (Straight, Swerve). For example, if James expects Dean to go Straight then he should Swerve (getting a payoff of -1 instead of -2 if he goes Straight). Now, if Dean expects James to Swerve, then he should go Straight (getting a payoff of 1 instead of a payoff of 0 if he Swerves). Thus (Swerve, Straight) is a Nash equilibrium. The same argument can be made to show that (Straight, Swerve) is an equilibrium as well.
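The two equilibria can be confirmed by checking every cell of the table for a profitable unilateral deviation; a short sketch (names illustrative):

```python
moves = ["Swerve", "Straight"]
# payoff[(james, dean)] = (James's payoff, Dean's payoff)
payoff = {
    ("Swerve",   "Swerve"):   (0, 0),
    ("Swerve",   "Straight"): (-1, 1),
    ("Straight", "Swerve"):   (1, -1),
    ("Straight", "Straight"): (-2, -2),
}

def is_nash(j, d):
    """Neither player can gain by unilaterally deviating from (j, d)."""
    james_ok = all(payoff[(j, d)][0] >= payoff[(alt, d)][0] for alt in moves)
    dean_ok = all(payoff[(j, d)][1] >= payoff[(j, alt)][1] for alt in moves)
    return james_ok and dean_ok

equilibria = [(j, d) for j in moves for d in moves if is_nash(j, d)]
print(equilibria)  # [('Swerve', 'Straight'), ('Straight', 'Swerve')]
```

Exactly the two asymmetric cells pass the check, matching the argument above.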
Part b
Yes, he would want to do it. In this case, James commits to playing Straight. Dean knows that James will play Straight. Therefore, his best response is to Swerve. Thus James gets a payoff of 1 if he credibly commits to playing Straight (he does so by using the handcuffs).
Question 3
Part a
The only SPE (Subgame Perfect Equilibrium) is:
US plays "Threat"
USSR plays "If US plays threat, then withdraw"
This is found by backward induction from the shown tree. Let's start at the node where the US has already issued the threat. In this case, if the USSR withdraws, they get a payoff of -4. If they defy, they get a payoff of -8. Therefore, their best response is to withdraw.
Now, the US knows this. If the US issues the threat, the USSR will withdraw, and the US will get a payoff of 1; if the US does nothing, it gets a payoff of -2. Therefore, the US's optimal strategy is to play Threat.
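The backward-induction computation can be sketched in code; the tree encoding and names below are assumptions for illustration:

```python
# A node is either a terminal payoff tuple or (mover, {action: subtree}).
# Payoffs are (U.S., USSR).
tree = ("US", {
    "Do nothing": (-2, 2),
    "Threat": ("USSR", {
        "Withdraw": (1, -4),
        "Defy": (-10, -8),
    }),
})

PLAYER_INDEX = {"US": 0, "USSR": 1}

def backward_induction(node):
    """Return (payoffs, equilibrium moves) for the subgame rooted at node."""
    if isinstance(node[0], str):          # decision node: (mover, {action: subtree})
        mover, actions = node
        results = {a: backward_induction(sub) for a, sub in actions.items()}
        i = PLAYER_INDEX[mover]
        best = max(results, key=lambda a: results[a][0][i])
        payoffs, moves = results[best]
        return payoffs, {mover: best, **moves}
    return node, {}                       # terminal node: just the payoffs

payoffs, profile = backward_induction(tree)
print(profile)  # {'US': 'Threat', 'USSR': 'Withdraw'}
print(payoffs)  # (1, -4)
```

Solving the USSR's node first and then the US's root node reproduces the SPE found above.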
Part b
Yes, there is. The Nash equilibrium that is not a SPE is:
US plays "Do Nothing"
USSR plays "If US plays threat, then defy"
This equilibrium involves a non-credible threat from the USSR: that it will defy if the US threatens. The threat is not credible because, as we've seen, the USSR's best response to a US threat is to withdraw.
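One way to see both equilibria at once is to check the game's strategic form, where a USSR strategy is a contingent plan for the node after a threat. A sketch, with names assumed for illustration:

```python
us_moves = ["Do nothing", "Threat"]
ussr_plans = ["Withdraw", "Defy"]  # USSR's contingent plan if the US threatens

def outcome(us, plan):
    """Payoffs (U.S., USSR) when both strategies are carried out."""
    if us == "Do nothing":
        return (-2, 2)             # the USSR's plan is never executed
    return (1, -4) if plan == "Withdraw" else (-10, -8)

def is_nash(us, plan):
    us_ok = all(outcome(us, plan)[0] >= outcome(alt, plan)[0] for alt in us_moves)
    ussr_ok = all(outcome(us, plan)[1] >= outcome(us, alt)[1] for alt in ussr_plans)
    return us_ok and ussr_ok

equilibria = [(u, p) for u in us_moves for p in ussr_plans if is_nash(u, p)]
print(equilibria)  # [('Do nothing', 'Defy'), ('Threat', 'Withdraw')]
```

(Do nothing, Defy) survives as a Nash equilibrium only because the Defy plan is off the equilibrium path and therefore costless to announce; it is exactly the non-credible threat ruled out by subgame perfection.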
Question 4
Part a
The optimal strategy for each player is to always shoot at the best of the other two shooters. In this way, each player minimizes the probability of getting killed. Therefore, Larry must always shoot Curly if he's alive, or Moe if he's not. Moe should always shoot Curly if he's alive, and Larry if he's not. Finally, Curly should shoot Moe if he's alive and Larry if he's not.
Larry's optimal strategy in the first round is thus to shoot Curly.
Part b
Let's go through each possible case.
We know that Larry starts by shooting at Curly. There's a 0.3 probability he hits. If he does, it's Moe's turn, and he will shoot at Larry and hit with probability 0.8. If he does, Moe lives and Larry and Curly die. This outcome has a probability 0.3*0.8 = 0.24
Let's now say that Larry killed Curly (prob 0.3) and, in Moe's turn, he missed Larry (prob 0.2). It's now Larry's turn again. If he kills Moe (prob 0.3), Larry lives and Moe and Curly die. This event has a probability 0.3*0.2*0.3 = 0.018. On the other hand, if Larry misses Moe, (prob 0.7) then Moe shoots at Larry. If he hits (prob 0.8) then Moe lives and Larry and Curly die. This outcome has probability 0.3*0.2*0.7*0.8 = 0.0336. Finally, if Moe missed this last shot (prob 0.2), then Larry and Moe live and Curly dies. This event has probability 0.3*0.2*0.7*0.2 = 0.0084.
Let's now say that Larry missed Curly in the first round (prob 0.7). Moe will now shoot at Curly and hit him with prob 0.8. Let's assume he does. It's now Larry's turn again. If he hits Moe (prob 0.3), then Larry lives, and Moe and Curly die. This event has a probability of 0.7*0.8*0.3 = 0.168. If he misses Moe (prob 0.7), then Moe shoots at Larry. If he hits (prob 0.8), then Moe lives, and Larry and Curly die. The probability of this is 0.7*0.8*0.7*0.8 = 0.3136. Finally, if Moe misses Larry in this last shot (prob 0.2), then Larry and Moe live, and Curly dies. The probability is 0.7*0.8*0.7*0.2 = 0.0784.
We've already investigated all the outcomes in which Curly dies in the first round. Let's now assume that he doesn't, so both Larry and Moe missed their shots (prob 0.7*0.2 = 0.14). Curly will now kill Moe with certainty. It's now Larry's turn. If he kills Curly (prob 0.3), then Larry lives, and Moe and Curly die. The probability is 0.7*0.2*0.3 = 0.042. If he misses (prob 0.7), then Larry will die when Curly shoots. In this event, Curly lives, and Larry and Moe die. The probability of this event is 0.7*0.2*0.7 = 0.098.
We've now exhausted all the possible events.
In order to find Larry's probability of survival, we simply sum the probabilities of all the events in which Larry lives. This is 0.018 + 0.0084 + 0.168 + 0.0784 + 0.042 = 0.3148.
Let's now do the same for Moe. His probability of survival is
0.24 + 0.0336 + 0.0084 + 0.3136 + 0.0784 = 0.674
Finally, we do it for Curly. His probability of survival is just 0.098.
We conclude that Moe has the greatest probability of survival, while Curly has the smallest one.
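The branch-by-branch arithmetic above can be checked by exhaustive enumeration. A sketch assuming every survivor follows the targeting rule from part a; exact fractions avoid rounding error, and all names are illustrative:

```python
from fractions import Fraction

ACCURACY = {"Larry": Fraction(3, 10), "Moe": Fraction(8, 10), "Curly": Fraction(1)}
ORDER = ["Larry", "Moe", "Curly"]           # shooting order within each round
SKILL = {"Larry": 0, "Moe": 1, "Curly": 2}  # used to pick the best surviving rival

def target(shooter, alive):
    """The targeting rule from part a: aim at the best surviving rival."""
    rivals = [p for p in alive if p != shooter]
    return max(rivals, key=lambda p: SKILL[p]) if rivals else None

def play(shots, alive, prob, survival):
    """Recursively walk every hit/miss branch, accumulating survival odds."""
    if not shots:
        for p in alive:
            survival[p] += prob
        return
    shooter, rest = shots[0], shots[1:]
    if shooter not in alive or target(shooter, alive) is None:
        play(rest, alive, prob, survival)
        return
    victim, hit = target(shooter, alive), ACCURACY[shooter]
    play(rest, alive - {victim}, prob * hit, survival)   # the shot lands
    if hit < 1:
        play(rest, alive, prob * (1 - hit), survival)    # the shot misses

survival = {p: Fraction(0) for p in ORDER}
play(ORDER + ORDER, frozenset(ORDER), Fraction(1), survival)
print({p: float(q) for p, q in survival.items()})
# {'Larry': 0.3148, 'Moe': 0.674, 'Curly': 0.098}
```

The enumeration reproduces the survival probabilities computed above: Moe 0.674, Larry 0.3148, and Curly, the perfect shot, only 0.098.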