Hi, I'm revising for my foundations of computer science exam, looking through past papers, and I've come across a question that's giving me trouble.

Matrices A, B and C have sizes 2 x a, b x 3 and 3 x c respectively. Find the conditions on a, b, c under which:

i. The matrix A . B . C exists and has size 2 x 1

To get the resulting matrix to be 2 x 1, c would have to equal 1, right? I'm just not sure what to set a and b to so that I get a matrix I can multiply into 3 x 1.
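Here's a quick NumPy sanity check I tried, picking a = b = 2 purely as an example (those values are just a guess, not part of the question):

```python
import numpy as np

# Inner dimensions must match at each step:
# (2 x a)(b x 3) needs a = b, and c = 1 gives a 2 x 1 result.
a = b = 2  # example values only
c = 1

A = np.ones((2, a))
B = np.ones((b, 3))
C = np.ones((3, c))

result = A @ B @ C
print(result.shape)  # (2, 1)
```

With a = b the product A . B is 2 x 3, and multiplying by the 3 x 1 matrix C gives 2 x 1, which seems to agree with my guess about c.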

I hope that makes sense, thanks for your help!