Hello!
This is a video game related question, but please, take it seriously.
Assume we have a lake consisting of 528 tiles, 4 of which are fixed tiles on which the player can catch a fish. Each use of the fishing rod is an independent Bernoulli trial: p = 0.35 on a "fish" tile and p = 0 on an empty tile. The player starts on tile 1 and uses his rod k times, then moves to tile 2 and uses his rod k times, and repeats this until he has fished on every tile in the lake. Then he simply starts again from tile 1, and so on. My problem is the following:
After how many uses of the fishing rod, on average, will the player catch his first fish, as a function of k?
I've been thinking about this for a few days, but still can't find a solution.
Also, I'm not so much interested in the result itself as in how exactly I should approach this kind of problem.
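To make the setup concrete, here is a minimal Monte Carlo sketch that estimates the expected number of casts for a given k. The positions of the four fish tiles are an assumption (the answer does depend on where they sit relative to tile 1), so they are passed in as a parameter:

```python
import random

def casts_until_catch(k, fish_tiles, n_tiles=528, p=0.35, rng=random):
    """Simulate one playthrough; return the number of casts until the first catch."""
    casts = 0
    tile = 1  # player starts on tile 1
    while True:
        for _ in range(k):
            casts += 1
            # A cast only ever succeeds on a fish tile, with probability p.
            if tile in fish_tiles and rng.random() < p:
                return casts
        tile = tile % n_tiles + 1  # move to the next tile, wrapping around

def average_casts(k, fish_tiles, trials=10_000, seed=0):
    """Average casts-until-first-catch over many simulated playthroughs."""
    rng = random.Random(seed)
    return sum(casts_until_catch(k, fish_tiles, rng=rng)
               for _ in range(trials)) / trials

# Example with assumed fish-tile positions; tweak k and the set to explore.
print(average_casts(k=5, fish_tiles={1, 100, 250, 400}))
```

A simulation like this won't replace a closed-form derivation, but it gives numbers to check any formula against; for instance, with a fish tile on tile 1 and k large, the average should approach the plain geometric mean 1/p ≈ 2.86.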