Calculating error when interpolating to resample data
I have a data set of y vs. time (and it is nonlinear). It was sampled at uneven times and has missing data, etc. So to compare the series, I have interpolated (by linear interpolation) all of them to new time points using software (MATLAB). The data set is massive, so I have automated this process and checking the fit visually is impractical. Does anyone know of a way to calculate the "fit" of the interpolated data? Since the original data and the resampled data are at different time points, I cannot think of a way to do this. Or is there a better way of resampling the data and calculating the fit?
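For concreteness, the resampling step described above might look like the following sketch. It is written in Python with NumPy rather than MATLAB (np.interp is the analogue of MATLAB's interp1 with the 'linear' method); the time stamps, values, and grid size are made-up illustrative data, not the poster's actual set.

```python
import numpy as np

# Hypothetical unevenly sampled series with a missing value (NaN).
t = np.array([0.0, 0.8, 2.1, 2.9, 4.4, 5.0])
y = np.array([0.1, 0.9, np.nan, 2.7, 4.2, 5.1])

# Drop missing samples before interpolating.
ok = ~np.isnan(y)
t_clean, y_clean = t[ok], y[ok]

# Resample onto a common, evenly spaced time grid by linear interpolation
# (the NumPy analogue of MATLAB's interp1 with the 'linear' option).
t_new = np.linspace(t_clean[0], t_clean[-1], 11)
y_new = np.interp(t_new, t_clean, y_clean)

print(y_new.shape)  # (11,)
```

Every series resampled this way ends up on the same t_new grid, which is what makes the comparison possible, but it also discards the original sampling times, which is where the question about quantifying the "fit" comes from.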
Re: Calculating error when interpolating to resample data
When calculating the fit of a linear regression, you compare the data points to the expected points obtained from the linear equation; that is, you compare your points to theoretical points. When doing interpolation, any points you create are already theoretical points, so you have nothing to compare them to. For this reason I believe you cannot calculate the fit of an interpolation.
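The contrast drawn above can be made concrete with a small numerical sketch (again in Python/NumPy for illustration; the four data points are invented). A least-squares line generally misses the data, so its residuals measure fit, whereas a linear interpolant passes through every data point exactly, so its residuals at the original sample times are identically zero and carry no information:

```python
import numpy as np

# Made-up sample times and values.
t = np.array([0.0, 1.0, 2.5, 4.0])
y = np.array([1.0, 2.2, 2.9, 4.1])

# Least-squares line: residuals at the data points are generally nonzero,
# so they quantify goodness of fit.
slope, intercept = np.polyfit(t, y, 1)
reg_resid = y - (slope * t + intercept)

# Linear interpolation reproduces every data point exactly, so its
# residuals at the sample times are identically zero.
interp_resid = y - np.interp(t, t, y)

print(np.max(np.abs(reg_resid)) > 0)   # True: regression leaves residuals
print(np.allclose(interp_resid, 0))    # True: interpolation leaves none
```

This is exactly the sense in which the interpolated points are "already theoretical points": evaluated at the data you started from, the interpolant agrees with the data by construction.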