
Linear algebra
Let OA and OB be linearly independent vectors and let r > 0. Calculate the vector OM, where M is the point of intersection of the line through the points A and B with the line through the point O that is parallel to the vector OB + rOA.
I don't understand the condition that the line through the point O has to be parallel to the vector OB + rOA. Doesn't it just mean that OM has to lie between the two vectors OA and OB? Could someone draw me a diagram and briefly explain how to do this question?
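From what I understand so far, one way to set the problem up (a sketch, with t and s as parameter names I've chosen myself) would be to write OM in two ways and compare coefficients:

```latex
% M lies on the line through A and B:
\vec{OM} = \vec{OA} + t\,(\vec{OB} - \vec{OA}) = (1 - t)\,\vec{OA} + t\,\vec{OB}

% M lies on the line through O parallel to OB + r OA:
\vec{OM} = s\,(\vec{OB} + r\,\vec{OA})

% Since OA and OB are linearly independent, coefficients must match:
1 - t = s r, \qquad t = s
\quad\Longrightarrow\quad s = \frac{1}{1 + r}

% Hence:
\vec{OM} = \frac{r\,\vec{OA} + \vec{OB}}{1 + r}
```

Is this the intended reading of the parallel condition, or am I missing something?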