Hi, sorry, can anyone help me with this question? I've done a few parts of it, but I'm having trouble with the rest. =(

In an economy there are many independent, identically distributed projects: each project requires $10m of investment and, a year later, pays back either $11m with probability 95% or $0m (fails) with probability 5%. The payout of each project does not depend on the payouts of the other projects.

a) What are the mean and standard deviation of the rate of return if a bank invests in N such projects? What happens as N goes to infinity (N→∞)?
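In case numbers help, here is a quick Python sketch of how I read part (a): I'm assuming "rate of return" means payoff per dollar invested minus one, and that the bank splits its money equally across the N independent projects (both assumptions on my part).

```python
import math

# One project: invest $10m, receive $11m w.p. 0.95 or $0m w.p. 0.05.
p_success = 0.95
gross_if_success = 11 / 10  # payoff per dollar invested if the project succeeds

# Rate of return on a single project: +10% on success, -100% on failure.
mean_r = p_success * (gross_if_success - 1) + (1 - p_success) * (-1)
var_r = (p_success * (gross_if_success - 1) ** 2
         + (1 - p_success) * (-1) ** 2) - mean_r ** 2
std_r = math.sqrt(var_r)

print(f"single-project mean return: {mean_r:.4f}")  # 0.0450
print(f"single-project std dev:     {std_r:.4f}")

# Equal-weighted portfolio of N independent projects: the mean is unchanged,
# and the std dev shrinks like 1/sqrt(N), so it goes to 0 as N -> infinity.
for N in (1, 100, 10_000):
    print(f"N = {N:>6}: std = {std_r / math.sqrt(N):.5f}")
```

If that setup is right, the portfolio return becomes riskless at the mean as N→∞.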

b) Suppose the assets of a bank consist of many such projects (N→∞), and the secondary security this bank offers its customers is a one-year CD with fixed gross return 1 + r. What is the maximum rate of return r a competitive bank can offer? Ignore operational and other costs.
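For part (b), my understanding (please correct me if wrong) is that with N→∞ the bank's asset return is riskless at the expected value, so a competitive zero-profit bank can pass the whole expected return through to the CD:

```python
# With N -> infinity, diversification makes the bank's return certain at the
# mean, so a competitive (zero-profit) bank pays out the full expected return.
p_success = 0.95
gross_if_success = 11 / 10  # gross payoff per dollar on a successful project

expected_gross = p_success * gross_if_success  # this would be 1 + r on the CD
r_max = expected_gross - 1
print(f"maximum CD rate r = {r_max:.3%}")  # 4.500%
```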

f) An investor has $10m and can either invest in one project or buy the one-year CD with the maximum rate of return offered by the bank. He is risk averse and maximizes expected utility E[u(c)], with utility function u(c) = 10c − 0.5c², where c is his consumption at the end of year 1, which is uncertain at the beginning of the year. Suppose he consumes only at the end of year one and consumes whatever he has at that time. At the beginning of the year, will he invest in the CD or the project to maximize his expected utility?
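This is the part I'm least sure about, but here is how I would set up the comparison numerically. Note this plugs in r = 4.5% for the maximum CD rate, which is just my own answer to part (b), and measures c in $m so that u(c) = 10c − 0.5c² applies directly to the payoffs as stated:

```python
def u(c):
    """Quadratic utility from the question; c is consumption in $m."""
    return 10 * c - 0.5 * c ** 2

p_success = 0.95
r_cd = 0.045   # maximum CD rate (my answer to part (b) -- an assumption here)
w = 10.0       # initial wealth in $m

# CD: consumption is certain at w * (1 + r).
eu_cd = u(w * (1 + r_cd))

# One project: consume $11m w.p. 0.95, $0m otherwise.
eu_project = p_success * u(11.0) + (1 - p_success) * u(0.0)

print(f"E[u] from CD:      {eu_cd:.5f}")
print(f"E[u] from project: {eu_project:.5f}")
print("choice:", "CD" if eu_cd > eu_project else "project")
```

If I've set this up right, the certain CD consumption beats the risky project in expected utility, which matches the intuition for a risk-averse investor, but I'd appreciate a check on the algebra.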