Help converting 2d matrix coordinates to 1d array

Hello kind folk, I am in the unfortunate position of being terrible at math but still trying to program computers. My apologies for my ignorance and slowness (liberal arts major, go figure ;)).

As the title says, I've been trying to figure out how to convert 2d matrix coordinates (where x = column and y = row) to a position in a 1d array (a long list of all the points in the matrix). The array is 0-indexed, as in most programming languages. The array is just a list of all the matrix cells, as if you took each row of the matrix and set it at the end of the previous row. For example, from this matrix:

Code:

._ _ _.

|_|_|_|

|_|_|x|

|_|_|_|

to this array:

Code:

._ _ _ _ _ _ _ _ _.

|_|_|_|_|_|x|_|_|_|

Unfortunately, it's not as simple as x * y. I'm actually "embedding" a smaller matrix into a larger parent matrix -- a matrix within a matrix (metamatrix?). So my trouble has been finding an equation that will convert the embedded matrix's coordinates to the larger matrix's coordinates, THEN converting those coords into the corresponding index/position in the list (of the parent matrix). So far, I've actually gotten this to work:

Code:

parent matrix array index = parentMatrixNumColumns * embedMatrixCell.y - (parentMatrixNumColumns - embedMatrixCell.x)

Thing is, that only works for 1-based arrays, i.e. ones whose indices start at 1 instead of 0. Does that make any sense? Any ideas?
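To show what I mean, here's a minimal sketch of the formula above (the function name is my own, and it assumes x and y have already been translated into the parent matrix's coordinate system):

```python
def to_parent_index_1based(x, y, parent_cols):
    """Convert 1-based (x = column, y = row) coordinates in the parent
    matrix to a 1-based position in the flattened array."""
    return parent_cols * y - (parent_cols - x)

# The 'x' cell in the 3x3 diagram above is at column 3, row 2 (1-based):
# 3 * 2 - (3 - 3) = 6, the 6th cell -- which matches the diagram if you
# count the flattened array starting from 1, but not if you start from 0.
print(to_parent_index_1based(3, 2, 3))
```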