I'm trying to figure out how to prove that the following equation is nonlinear:
(d/dx)u + u(d/dy)u = 0.
I know that to show an operator L is linear, you need to show that for constants a, b and functions u, v:
L(au + bv) = aL(u) + bL(v).
The problem I am having is that I am tempted to show that it is a linear operator by setting L = (d/dx) + u(d/dy). This is because I know that I can prove
(d/dx)u + x(d/dy)u = 0 is linear by setting L = (d/dx) + x(d/dy). Obviously I can use independent variables in defining a linear operator, but I cannot use
dependent variables in defining one. Therefore, the crux of my problem is not knowing how to define the operator in order to show that the equation is nonlinear. Help on this would be appreciated. Thanks!
A nonlinear system is one that does not satisfy the superposition principle, or one whose output is not directly proportional to its input; a linear system satisfies both conditions. In other words, a nonlinear system is any problem where the variable(s) being solved for cannot be written as a linear combination of independent components. A nonhomogeneous system, which is linear apart from the presence of a function of the independent variables, is nonlinear according to the strict definition, but such systems are usually studied alongside linear ones, because they can be transformed into a linear system of several variables.
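Concretely, for the equation in the question you do not need to isolate a linear operator at all: take the whole left-hand side as a map N(u) = u_x + u u_y (using subscripts for partial derivatives) and check superposition directly. A sketch of that check:

```latex
% Write N(u) = u_x + u\,u_y, so the PDE reads N(u) = 0.
\begin{align*}
N(au + bv) &= (au + bv)_x + (au + bv)(au + bv)_y \\
           &= a u_x + b v_x + a^2 u u_y + ab\,(u v_y + v u_y) + b^2 v v_y, \\
% Linearity would require this to equal:
a\,N(u) + b\,N(v) &= a u_x + a\,u u_y + b v_x + b\,v v_y. \\
% The difference between the two is:
N(au+bv) - a\,N(u) - b\,N(v) &= a(a-1)\,u u_y + ab\,(u v_y + v u_y) + b(b-1)\,v v_y.
\end{align*}
```

The difference is not identically zero: even with a = b = 1 the cross terms u v_y + v u_y survive, so N(u + v) ≠ N(u) + N(v) in general, and the equation fails the superposition test. This is exactly why L = (d/dx) + u(d/dy) cannot work as a linear operator: the coefficient u depends on the unknown itself, which is what produces the quadratic terms above.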