I have the following problem:
The nationwide average salary of a computer programmer can be modeled by the equation y=31.8x(1.06)^n, where y is the salary in thousands of dollars and n is the number of years since 1990.
Using the model, predict the average programmer's salary in 2010.
y = 31.8x(1.06)^20. Rounded, I got the answer $102,000. Is this correct?
Where are you getting the problem from? If it is in print, you should be able to distinguish between the Latin letter 'x' and the multiplication symbol '×' (the latter generally has perfectly straight, uniform-width strokes). In your equation the 'x' is almost certainly a multiplication sign, so the model is y = 31.8(1.06)^n.
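Reading the 'x' as multiplication, you can check the arithmetic directly. A quick sketch (the helper name `salary` is just for illustration):

```python
def salary(n):
    """Predicted salary in thousands of dollars, n years after 1990,
    using the model y = 31.8 * (1.06)**n."""
    return 31.8 * 1.06 ** n

# 2010 is 20 years after 1990
y = salary(20)
print(y)                      # roughly 101.99 (thousands of dollars)
print(round(y))               # 102, i.e. about $102,000
```

So yes, rounding to the nearest thousand, $102,000 agrees with the model.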