Hello,

Convergence in probability implies convergence in distribution. From there, your problem becomes straightforward.

I can show you the proof, but it's pretty tedious, so instead I'll link you to the French Wikipedia, where there is a proof: Convergence de variables aléatoires - Wikipédia,

where you'll find:

Theorem — If $X_n$ converges to $X$ in probability, then $X_n$ converges to $X$ in distribution.

Then prove that if $X_n$ converges to $X$ in distribution, and $Y_n$ converges to $0$ in distribution, then $X_n + Y_n$ converges to $X$ in distribution. (Note that convergence in distribution to a constant is equivalent to convergence in probability to that constant, which is the form used below.)

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I'll try to get you started on this problem, using a method similar to the theorem's proof. You'll have to finish it yourself, because it's very tedious to write all of this down in LaTeX!

$X_n \to X$ in distribution means that $F_{X_n}(x) \to F_X(x)$, for any $x$ where $F_X$ is continuous. $F$ denotes the cumulative distribution function.

This means that $\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$.

This can be translated as: for any $\epsilon > 0$, there exists $N$ such that for all $n \geq N$, $|F_{X_n}(x) - F_X(x)| \leq \epsilon$.

$Y_n \to 0$ in probability means that $P(|Y_n| > \epsilon) \to 0$ for any $\epsilon > 0$.

This can be translated as: for any $\epsilon > 0$ and $\delta > 0$, there exists $N$ such that for all $n \geq N$, $P(|Y_n| > \epsilon) \leq \delta$.
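If it helps to see the definition in action, here is a tiny numerical illustration (my own example, not part of the problem): take $Y_n = U/n$ with $U$ uniform on $(0,1)$, so that $P(|Y_n| > \epsilon)$ shrinks to $0$ as $n$ grows.

```python
import random

# Hypothetical illustration: Y_n = U / n with U uniform on (0, 1).
# Then P(|Y_n| > eps) = P(U > n*eps), which tends to 0 as n grows,
# so Y_n -> 0 in probability.

def prob_exceeds(n, eps, trials=100_000, seed=0):
    """Monte Carlo estimate of P(|Y_n| > eps) for Y_n = U / n."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if rng.random() / n > eps)
    return hits / trials

# The estimates decrease toward 0 as n grows (for n = 50, the event
# U > 2.5 is impossible, so the probability is exactly 0).
estimates = [prob_exceeds(n, eps=0.05) for n in (1, 5, 50)]
print(estimates)
```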

Now you want to show that $F_{X_n + Y_n}(x) \to F_X(x)$, for any $x$ where $F_X$ is continuous.

Write that $P(X_n + Y_n \le x) = P(X_n + Y_n \le x,\ |Y_n| \le \epsilon) + P(X_n + Y_n \le x,\ |Y_n| > \epsilon)$, for any given $\epsilon > 0$.

Since $\{X_n + Y_n \le x,\ |Y_n| \le \epsilon\} \subseteq \{X_n \le x + \epsilon\}$ (the comma represents an intersection of events), we can say that $P(X_n + Y_n \le x) \le P(X_n \le x + \epsilon) + P(|Y_n| > \epsilon)$.
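In case that set inclusion isn't obvious, here is the one-line check written out (my phrasing, same notation as above):

```latex
% If X_n + Y_n <= x and |Y_n| <= eps, then
% X_n = (X_n + Y_n) - Y_n <= x - Y_n <= x + eps, hence:
\{X_n + Y_n \le x,\ |Y_n| \le \epsilon\} \subseteq \{X_n \le x + \epsilon\}
\quad\Longrightarrow\quad
P(X_n + Y_n \le x,\ |Y_n| \le \epsilon) \le P(X_n \le x + \epsilon)
```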

Similarly (this will need a little bit of thinking), $P(X_n \le x - \epsilon) \le P(X_n + Y_n \le x) + P(|Y_n| > \epsilon)$.

Hence we get: $F_{X_n}(x - \epsilon) - P(|Y_n| > \epsilon) \le F_{X_n + Y_n}(x) \le F_{X_n}(x + \epsilon) + P(|Y_n| > \epsilon)$.

Gaaah... At least this can get you started. I really struggle to think at a computer, and I don't have much time left. I don't even know if this is useful, since you could just use the theorem above...

The steps after that are roughly: get an inequality of the form $F_X(x - \epsilon) \le \liminf_{n \to \infty} F_{X_n + Y_n}(x)$ (by letting $n \to \infty$ and using the two convergences above).

And then make similar steps to get an inequality of the form $\limsup_{n \to \infty} F_{X_n + Y_n}(x) \le F_X(x + \epsilon)$.

And this would mean that $\lim_{n \to \infty} F_{X_n + Y_n}(x) = F_X(x)$ (let $\epsilon \to 0$ and use the continuity of $F_X$ at $x$), which is what you want.
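As a closing sanity check, a quick simulation (my own example, not part of the exercise): take $X_n$ standard normal and $Y_n = U/n \to 0$ in probability; the empirical CDF of $X_n + Y_n$ at $x = 0$ should then be close to $F_X(0) = 0.5$.

```python
import random

# Hypothetical sanity check of the lemma: X_n standard normal,
# Y_n = U / n with U uniform on (0, 1), so Y_n -> 0 in probability.
# The empirical CDF of X_n + Y_n at a fixed continuity point x
# should approach F_X(x).

def empirical_cdf_at(x, n, trials=200_000, seed=1):
    """Estimate P(X_n + Y_n <= x) by simulation."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        xn = rng.gauss(0.0, 1.0)   # X_n = X here, for simplicity
        yn = rng.random() / n      # Y_n -> 0 in probability
        if xn + yn <= x:
            count += 1
    return count / trials

# F_X(0) = 0.5 for the standard normal; with n large, Y_n is tiny
# and the estimate lands very close to 0.5.
approx = empirical_cdf_at(0.0, n=1000)
print(approx)
```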