Can I do this with a matrix multiplication operator?

Let’s say I want to find a vector of possible revenues, where revenue equals price times quantity: R=P*Q

P and Q are vectors of length N; say for this example N=2, and I want a length-2 vector for R as well.

P=(a,b)' and Q=(c,d)' (both column vectors, dimension 2*1).

A simple vector multiplication where P is transposed (the inner product R=P'Q) yields a scalar:

R=(ac+bd)

A simple vector multiplication where Q is transposed (the outer product R=P*Q') yields a 2*2 matrix:

R=(ac,ad)

(bc,bd)
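For concreteness, here is a numerical sketch of those two standard products (using NumPy as one concrete system; the question itself is about general matrix algebra, and the numbers a=2, b=3, c=5, d=7 are just made up for illustration):

```python
import numpy as np

# Column vectors P = (a, b)' and Q = (c, d)', with illustrative numbers
a, b, c, d = 2.0, 3.0, 5.0, 7.0
P = np.array([[a], [b]])   # shape (2, 1)
Q = np.array([[c], [d]])   # shape (2, 1)

# Inner product P'Q: (1x2) @ (2x1) -> 1x1, the scalar ac + bd
inner = P.T @ Q
print(inner)   # [[31.]]  since 2*5 + 3*7 = 31

# Outer product P Q': (2x1) @ (1x2) -> 2x2 matrix
outer = P @ Q.T
print(outer)   # [[10. 14.]
               #  [15. 21.]]
```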

What I actually want is the (2*1) vector as a "product" of the P and Q vector:

R=(ac)

(bd)

Is there a "product" operator that will do this?
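Numerically, the target is an element-wise product of the two vectors. In NumPy, at least, the plain `*` operator does exactly this (though that is a library convention, not the matrix-algebra operator I'm asking about):

```python
import numpy as np

P = np.array([[2.0], [3.0]])   # (a, b)' with a=2, b=3
Q = np.array([[5.0], [7.0]])   # (c, d)' with c=5, d=7

# The target: the 2x1 vector (ac, bd)' -- element-wise, not a matrix product
R = P * Q                      # NumPy's * multiplies element by element
print(R)                       # [[10.]
                               #  [21.]]
```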

If I set up P and Q as N*N matrices (instead of N vectors) with values on the diagonals:

P*=(a 0)

(0 b)

Q*=(c 0)

(0 d)

Then the matrix product P*Q* yields:

R*=(ac 0)

(0 bd)

This gives me the elements I want, but I want them in a vector, not a matrix.
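The diagonal-matrix detour, spelled out numerically (NumPy sketch; `np.diag` here just builds the P* and Q* matrices I wrote down by hand):

```python
import numpy as np

a, b, c, d = 2.0, 3.0, 5.0, 7.0

# P* and Q*: the vectors' entries placed on the diagonals
P_star = np.diag([a, b])       # [[2, 0], [0, 3]]
Q_star = np.diag([c, d])       # [[5, 0], [0, 7]]

# The ordinary matrix product of two diagonal matrices is again diagonal
R_star = P_star @ Q_star
print(R_star)                  # [[10.  0.]
                               #  [ 0. 21.]] -- the right elements, but in a matrix
```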

I'm fresh out of ideas...

**Update:**

Yes, I can turn

R*=(ac 0)

(0 bd)

into

R=(ac)

(bd)

by multiplying R* by the column vector:

(1)

(1)

but this doesn't help put the column vectors P and Q into diagonal matrices P* and Q*. I don't think any operation can do that, can it?
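Here is the ones-vector step from the update as a numerical sketch. (For what it's worth, a numerical library like NumPy does offer a constructor, `np.diag`, that places a vector's entries on a diagonal; but that is a library function, not a matrix-algebra operation.)

```python
import numpy as np

P = np.array([2.0, 3.0])   # (a, b)'
Q = np.array([5.0, 7.0])   # (c, d)'

# Build P* and Q* from the vectors, then take the ordinary matrix product
R_star = np.diag(P) @ np.diag(Q)   # [[10, 0], [0, 21]]

# Multiplying by the all-ones column vector collapses R* back to a column
ones = np.ones((2, 1))
R = R_star @ ones
print(R)                           # [[10.]
                                   #  [21.]]
```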

I am just about ready to define my own matrix operation that does what I want, nice and neatly. It will be called:

Farmer's Vector Multiplication.

Look for it in a math forum near you!