If you know that x(t) is differentiable, then I would differentiate the whole thing, take the Laplace Transform, and solve for Y/X. Is anything preventing you from doing that?
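Since the equation itself isn't quoted in this post, here is a toy illustration of that procedure, assuming (purely hypothetically) that differentiating collapses the integral into a first-order ODE like y'(t) + a*y(t) = x(t):

```python
import sympy as sp

s, a = sp.symbols('s a', positive=True)
X, Y = sp.symbols('X Y')

# Suppose differentiating removes the integral and leaves, hypothetically,
#   y'(t) + a*y(t) = x(t).
# With zero initial conditions the Laplace transform gives s*Y + a*Y = X:
eq = sp.Eq(s*Y + a*Y, X)
H = sp.solve(eq, Y)[0] / X      # transfer function Y/X
print(sp.simplify(H))           # 1/(a + s)
```

The same steps (differentiate, transform with zero initial conditions, solve the resulting algebraic equation for Y/X) would apply to whatever ODE the actual equation reduces to.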
Yeah, I see what you mean. This is a doozy of a problem. Is this a textbook problem? If so, there's probably some trick you're supposed to spot in order to solve it. I tried taking the Laplace transform directly and then fiddling with interchanging the order of integration. You can get some interesting equations that way, but nothing that leads toward the final ratio you want. Tricks with integration by parts end up doing the same thing as interchanging the order of integration. One thought that did occur to me: convolution. If you focus only on the LT of the integral term with the x(t) multiplying it, I wonder if you couldn't use the convolution theorem to help you out there. It'd be worth trying, because then you might get something like this:
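The equation that followed isn't reproduced here, but the convolution theorem itself is easy to sanity-check with sympy on a pair of placeholder signals (these exponentials are just examples, not the x(t)/y(t) from the problem):

```python
import sympy as sp

t, tau, s = sp.symbols('t tau s', positive=True)

# Two placeholder signals, not the signals from the problem
f = sp.exp(-t)
g = sp.exp(-2*t)

# Time-domain convolution: (f*g)(t) = integral_0^t f(tau) g(t - tau) dtau
conv = sp.integrate(f.subs(t, tau) * g.subs(t, t - tau), (tau, 0, t))

F = sp.laplace_transform(f, t, s, noconds=True)
G = sp.laplace_transform(g, t, s, noconds=True)
C = sp.laplace_transform(conv, t, s, noconds=True)

print(sp.simplify(C - F*G))   # 0, i.e. L{f*g} = F(s)*G(s)
```

The catch in this problem is that the term in question is x(t) *times* an integral, which is a product in the time domain, not a convolution, so the theorem doesn't apply to it directly.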
Perhaps you could work with pulling the x(t) inside the integral, and doing a change of variable.
The figure attached shows the system I am trying to model with the equation.
It is a feedback system to suppress x(t) at the node y(t). The system itself is quite simple, but the transfer function is giving me a headache. What I want to do is formulate H(s) and then determine the optimum A, if an optimum A exists.
cfy30
I am sure the equation I have is correct. z(t) is the "wanted" signal and x(t) is the interference that needs to be canceled. When the system starts to run, x(t) will be suppressed, leaving z(t) as the output y(t). Imagine z(t) is cos(2*pi*200*t) and x(t) is cos(2*pi*50*t). y(t) contains cos(2*pi*200*t) only and is free of cos(2*pi*50*t).
cfy30
Consider two input signals x1(t) and x2(t) with corresponding outputs y1(t) and y2(t). Now what is the output when the input is x1(t) + x2(t)? (Note that I am working here with a necessary condition for linearity, not the full condition; to show that linearity fails, it is sufficient to show that a necessary condition fails.)
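Since the full equation isn't visible in this thread, take as a hypothetical stand-in a system with the "x(t) multiplying the integral" structure mentioned earlier, say y(t) = x(t) * integral_0^t x(tau) dtau. A quick numeric check shows that additivity, and hence linearity, fails for such a term:

```python
import numpy as np

dt = 1e-3
t = np.arange(0.0, 1.0, dt)

def T(x):
    # hypothetical system with a multiplicative term:
    #   y(t) = x(t) * integral_0^t x(tau) dtau
    # (NOT the exact equation from the thread, just the same structure)
    return x * np.cumsum(x) * dt

x1 = np.sin(2*np.pi*3*t)
x2 = np.cos(2*np.pi*5*t)

gap = np.max(np.abs(T(x1 + x2) - (T(x1) + T(x2))))
print(gap)   # clearly nonzero: the cross terms x1*int(x2) + x2*int(x1) remain
```

The cross terms are exactly why such a system has no transfer function Y/X in the usual sense.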
CB