Here's the problem:
It is known that f and g are defined for every x.
It is known that
I am asked to prove or disprove the following statement:
Now I think you can just disprove it: when both f(x) and g(x) approach 0 as x -> 1, you get the indeterminate form 0/0, which isn't equal to 1.
(Just make up some random functions that would do that, like f(x) = x-1 and g(x) = 2x-2.)
You'd get (x-1)/(2x-2).
The teacher says otherwise, but he didn't explain himself too well; it was something about how "it is possible to algebraically manipulate the function to make the 0 disappear from the denominator."
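For what it's worth, with the example functions above (f(x) = x-1, g(x) = 2x-2, my own choice, not from the original problem), the cancellation the teacher describes works out like this:

```latex
\lim_{x \to 1} \frac{f(x)}{g(x)}
  = \lim_{x \to 1} \frac{x-1}{2x-2}
  = \lim_{x \to 1} \frac{x-1}{2(x-1)}
  = \lim_{x \to 1} \frac{1}{2}
  = \frac{1}{2}
```

So even though numerator and denominator both vanish at x = 1, the limit of the quotient still exists (here it's 1/2); the 0/0 form by itself doesn't settle what the limit is.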
That is possible for some functions, sure, but no specific functions are given here and we just need to disprove the statement, so would that disproof be correct?
Thanks a lot!