What am I missing? Is the introductory info in this problem extraneous?

I just don't understand what is being asked of me. Here it is:

A computer screen is divided into little squares called pixels. The number of pixels on the screen is called the monitor resolution. A common monitor resolution is 640 x 480, which means that there are 640 pixels along the length of the screen and 480 pixels along the width of the screen. If a computer program divides the screen into equal size squares, what would be the length of the side in pixels of the largest possible square?

My first approach was to multiply the two resolution numbers and check whether the product was a perfect square, but that got me nowhere. I am sure the computation must be straightforward; I just don't even get what "square" they are referring to, or whether I am supposed to start with a 640 x 480 monitor at all. Help!
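In case it helps anyone spot where my thinking went off the rails, here is that dead-end attempt written out as a quick Python sketch (the variable names are just mine, not from the problem):

```python
import math

# My attempted approach: multiply the two resolution numbers
# and test whether the product is a perfect square.
width, height = 640, 480
product = width * height           # total number of pixels on the screen
root = math.isqrt(product)         # integer square root of the product
is_perfect_square = (root * root == product)
print(product, root, is_perfect_square)  # → 307200 554 False
```

So the product isn't a perfect square, which is exactly where I got stuck.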