When merging two binary heaps (implemented as arrays), the worst-case running time is O(n).

I'm trying to figure out why that is.

To do this, you would merge the two arrays into one, create an empty heap, and then insert each element of the combined array into the heap.
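The procedure described above can be sketched as follows, using Python's `heapq` as the heap (the function name `merge_heaps_by_insertion` is my own):

```python
import heapq

def merge_heaps_by_insertion(h1, h2):
    """Merge two binary min-heaps (stored as arrays) by pushing
    every element into a fresh heap, one at a time."""
    merged = []
    for x in h1 + h2:              # concatenating the arrays is O(n)
        heapq.heappush(merged, x)  # each push is O(log n)
    return merged
```

Since each of the n pushes costs O(log n), this particular procedure is O(n log n), which is what the question below is about.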

Now, inserting one element is O(log n), and you do that n times, once for each element. That gives O(n log n), not O(n). Why is this not right?
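For comparison with the O(n) bound stated at the top, here is a sketch (again assuming Python's `heapq`; the name `merge_heaps_linear` is my own) that merges by concatenating the arrays and then building the heap bottom-up with `heapify`, which runs in O(n) worst case:

```python
import heapq

def merge_heaps_linear(h1, h2):
    """Merge two binary min-heaps (stored as arrays) by
    concatenating them and heapifying the result bottom-up."""
    merged = h1 + h2       # O(n) concatenation
    heapq.heapify(merged)  # bottom-up build-heap: O(n) worst case
    return merged
```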