Hello,
Suppose \$f(t)=a_0(t) + \sum_{i=1}^{n} a_i(t) X_i\$, where the \$X_i\$ (\$i=1,2,\dots,n\$) are independent standard normal variables and \$t\$ is time on the interval \$[t_1,t_2]\$. What are the mean, standard deviation, and CDF of the maximum of \$f(t)\$ over \$[t_1,t_2]\$? I can use Monte Carlo simulation, but it is not efficient. Are there numerical methods for this?
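For reference: at any fixed instant \$t\$, \$f(t)\$ is exactly normal with mean \$a_0(t)\$ and variance \$\sum_{i=1}^{n} a_i(t)^2\$; the hard part is the distribution of the maximum over the whole interval. Below is a minimal sketch of the Monte Carlo baseline mentioned above, so any faster numerical method can be checked against it. The coefficient functions `a(i, t)` here are hypothetical placeholders, since the post does not specify them:

```python
import numpy as np

def a(i, t):
    """Hypothetical coefficient functions a_i(t) -- placeholders only;
    substitute the actual a_i from the problem."""
    return t if i == 0 else np.cos(i * t) / i

def mc_max_stats(t1=0.0, t2=1.0, n=3, n_grid=200, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the mean, std, and empirical CDF of
    M = max_{t in [t1, t2]} f(t), with f(t) = a_0(t) + sum_i a_i(t) X_i
    and X_i i.i.d. standard normal."""
    rng = np.random.default_rng(seed)
    t = np.linspace(t1, t2, n_grid)                   # discretize the interval
    A = np.stack([a(i, t) for i in range(1, n + 1)])  # shape (n, n_grid)
    X = rng.standard_normal((n_samples, n))           # one row of X_i per sample path
    f = a(0, t) + X @ A                               # sample paths, shape (n_samples, n_grid)
    M = f.max(axis=1)                                 # max of each path over the grid
    return M.mean(), M.std(ddof=1), np.sort(M)        # sorted M defines the empirical CDF

mean, std, sorted_M = mc_max_stats()
# P(M <= x) is approximated by np.searchsorted(sorted_M, x) / len(sorted_M)
```

Note that the grid discretization slightly underestimates the true supremum, so `n_grid` should be refined until the estimates stabilize.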