substitution &eta; = &eta;0 &minus; 1/Y enables us at once to obtain the general solution; for instance, when $2\mathrm{B} = \frac{d}{dx}\log \left(\frac{\mathrm{A}}{\mathrm{C}}\right),$ a particular solution is &eta;0 = &radic;(&minus;A/C). This is a case of the remark, often useful in practice, that the linear equation $\phi(x)\frac{d^2y}{dx^2} + \tfrac{1}{2}\frac{d\phi}{dx}\frac{dy}{dx} + \mu y = 0,$ where &mu; is a constant, is reducible to a standard form by taking a new independent variable $z = \int dx\,[\phi(x)]^{-1/2}.$
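This reduction is easily checked numerically. In the sketch below (not part of the original; the choices &phi;(x) = x and &mu; = 3 are arbitrary), z = &int; x<sup>&minus;1/2</sup> dx = 2&radic;x, the standard form d&sup2;y/dz&sup2; + &mu;y = 0 has the solution y = cos(&radic;&mu; &middot; z) = cos(2&radic;(&mu;x)), and central differences confirm that this y satisfies the original equation:

```python
import math

mu = 3.0  # arbitrary positive constant

def y(x):
    # candidate solution y = cos(2*sqrt(mu*x)) of  x y'' + (1/2) y' + mu y = 0
    return math.cos(2.0 * math.sqrt(mu * x))

def residual(x, h=1e-5):
    # residual of the equation phi(x) y'' + (1/2) phi'(x) y' + mu y with phi(x) = x,
    # using central differences for y' and y''
    d1 = (y(x + h) - y(x - h)) / (2.0 * h)
    d2 = (y(x + h) - 2.0 * y(x) + y(x - h)) / (h * h)
    return x * d2 + 0.5 * d1 + mu * y(x)

for x in (0.5, 1.0, 2.0, 5.0):
    assert abs(residual(x)) < 1e-4
```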

We pass to other types of equations of which the solution can be obtained by rule. We may have cases in which there are two dependent variables, x and y, and one independent variable t, the differential coefficients dx/dt, dy/dt being given as functions of x, y and t. Of such equations a simple case is expressed by the pair $\frac{dx}{dt} = ax + by + c,\quad \frac{dy}{dt} = a'x + b'y + c',$ wherein the coefficients a, b, c, a′, b′, c′ are constants. To integrate these, form with the constant &lambda; the differential coefficient of z = x + &lambda;y, that is dz/dt = (a + &lambda;a′)x + (b + &lambda;b′)y + c + &lambda;c′, the quantity &lambda; being so chosen that b + &lambda;b′ = &lambda;(a + &lambda;a′), so that we have dz/dt = (a + &lambda;a′)z + c + &lambda;c′; this last equation is at once integrable in the form $z(a + \lambda a') + c + \lambda c' = \mathrm{A}e^{(a + \lambda a')t},$ where A is an arbitrary constant. In general, the condition b + &lambda;b′ = &lambda;(a + &lambda;a′) is satisfied by two different values of &lambda;, say &lambda;1, &lambda;2; the solutions corresponding to these give the values of x + &lambda;1y and x + &lambda;2y, from which x and y can be found as functions of t, involving two arbitrary constants. If, however, the two roots of the quadratic equation for &lambda; are equal, that is, if (a &minus; b′)² + 4a′b = 0, the method described gives only one equation, expressing x + &lambda;y in terms of t; by means of this equation y can be eliminated from dx/dt = ax + by + c, leading to an equation of the form $\frac{dx}{dt} = \mathrm{P}x + \mathrm{Q} + \mathrm{R}e^{(a + \lambda a')t},$ where P, Q, R are constants. The integration of this gives x, and thence y can be found.
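The process just described can be sketched in a few lines of code (the numeric coefficients and the constants of integration below are arbitrary illustrative choices, not taken from the text). The quadratic for &lambda; is solved, the two decoupled equations for z = x + &lambda;y are integrated, and the reconstructed x(t), y(t) are checked against the original pair by central differences:

```python
import math

# Hypothetical coefficients for dx/dt = a x + b y + c, dy/dt = a' x + b' y + c'
# (any values giving two distinct real lambda-roots would do).
a, b, c = 1.0, 2.0, 1.0
ap, bp, cp = 3.0, 0.0, -1.0

# b + lam b' = lam (a + lam a')  is the quadratic  a' lam^2 + (a - b') lam - b = 0
disc = (a - bp) ** 2 + 4.0 * ap * b
lam1 = (-(a - bp) + math.sqrt(disc)) / (2.0 * ap)
lam2 = (-(a - bp) - math.sqrt(disc)) / (2.0 * ap)

A1, A2 = 0.7, -1.3  # the two arbitrary constants of integration

def z(t, lam, A):
    # dz/dt = (a + lam a') z + (c + lam c')  integrates to
    # z = A e^{(a + lam a') t} - (c + lam c') / (a + lam a')
    k = a + lam * ap
    return A * math.exp(k * t) - (c + lam * cp) / k

def xy(t):
    # recover x, y from  x + lam1 y = z1  and  x + lam2 y = z2
    z1, z2 = z(t, lam1, A1), z(t, lam2, A2)
    y = (z1 - z2) / (lam1 - lam2)
    return z1 - lam1 * y, y

# check that the reconstructed x(t), y(t) satisfy the original pair
h = 1e-6
for t in (0.0, 0.3, 1.0):
    x0, y0 = xy(t)
    dxdt = (xy(t + h)[0] - xy(t - h)[0]) / (2.0 * h)
    dydt = (xy(t + h)[1] - xy(t - h)[1]) / (2.0 * h)
    assert abs(dxdt - (a * x0 + b * y0 + c)) < 1e-4
    assert abs(dydt - (ap * x0 + bp * y0 + cp)) < 1e-4
```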

A similar process is applicable when we have three or more dependent variables whose differential coefficients in regard to the single independent variable are given as linear functions of the dependent variables with constant coefficients.

Another method of solution of the equations $\frac{dx}{dt} = ax + by + c,\quad \frac{dy}{dt} = a'x + b'y + c'$ consists in differentiating the first equation, thereby obtaining $\frac{d^2x}{dt^2} = a\frac{dx}{dt} + b\frac{dy}{dt};$ from the two given equations, by elimination of y, we can express dy/dt as a linear function of x and dx/dt; we can thus form an equation of the shape d²x/dt² = P + Qx + Rdx/dt, where P, Q, R are constants; this can be integrated by methods previously explained, and the integral, involving two arbitrary constants, gives, by the equation dx/dt = ax + by + c, the corresponding value of y. Conversely it should be noticed that any single linear differential equation $\frac{d^2x}{dt^2} = u + vx + w\frac{dx}{dt},$ where u, v, w are functions of t, by writing y for dx/dt, is equivalent with the two equations dx/dt = y, dy/dt = u + vx + wy. In fact a similar reduction is possible for any system of differential equations with one independent variable.
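Carrying out the elimination explicitly (a routine step not written out in the original) gives the constants P, Q, R: substituting dy/dt = a′x + b′y + c′ in $\frac{d^2x}{dt^2} = a\frac{dx}{dt} + b\frac{dy}{dt},$ and then eliminating y by means of by = dx/dt &minus; ax &minus; c, we obtain $\frac{d^2x}{dt^2} = (bc' - b'c) + (a'b - ab')x + (a + b')\frac{dx}{dt},$ so that P = bc′ &minus; b′c, Q = a′b &minus; ab′, R = a + b′.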

Equations occur to be integrated of the form Xdx + Ydy + Zdz = 0, where X, Y, Z are functions of x, y, z. We consider only the case in which there exists an equation &phi;(x, y, z) = C whose differential $\frac{\partial \phi}{\partial x}dx + \frac{\partial \phi}{\partial y}dy + \frac{\partial \phi}{\partial z}dz = 0$ is equivalent with the given differential equation; that is, &mu; being a proper function of x, y, z, we assume that there exist equations $\frac{\partial \phi}{\partial x} = \mu \mathrm{X},\quad \frac{\partial \phi}{\partial y} = \mu \mathrm{Y},\quad \frac{\partial \phi}{\partial z} = \mu \mathrm{Z};$ these equations require $\frac{\partial}{\partial z}(\mu \mathrm{Y}) = \frac{\partial}{\partial y}(\mu \mathrm{Z}),$ &c., and hence $\mathrm{X}\left(\frac{\partial \mathrm{Z}}{\partial y} - \frac{\partial \mathrm{Y}}{\partial z}\right) + \mathrm{Y}\left(\frac{\partial \mathrm{X}}{\partial z} - \frac{\partial \mathrm{Z}}{\partial x}\right) + \mathrm{Z}\left(\frac{\partial \mathrm{Y}}{\partial x} - \frac{\partial \mathrm{X}}{\partial y}\right) = 0;$ conversely it can be proved that this is sufficient in order that &mu; may exist to render &mu;(Xdx + Ydy + Zdz) a perfect differential; in particular it may be satisfied in virtue of the three equations such as $\frac{\partial \mathrm{Z}}{\partial y} - \frac{\partial \mathrm{Y}}{\partial z} = 0,$ in which case we may take &mu; = 1. Assuming the condition in its general form, take in the given differential equation a plane section of the surface &phi; = C parallel to the plane z = 0, viz. put z constant, and consider the resulting differential equation in the two variables x, y, namely Xdx + Ydy = 0; let &psi;(x, y, z) = constant, be its integral, the constant z entering, as a rule, in &psi; because it enters in X and Y.
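The condition is easily tested numerically. In the sketch below (the two equations are illustrative examples, not from the text), y dx + x dy + (xy/z) dz = 0 satisfies the condition — it admits the multiplier &mu; = z, giving the integral xyz = C — while y dx &minus; x dy + dz = 0 does not:

```python
def pd(f, i, p, h=1e-6):
    # numeric partial derivative of f(x, y, z) in coordinate i at point p
    q, r = list(p), list(p)
    q[i] += h
    r[i] -= h
    return (f(*q) - f(*r)) / (2.0 * h)

def condition(X, Y, Z, p):
    # X(Zy - Yz) + Y(Xz - Zx) + Z(Yx - Xy) evaluated at the point p
    return (X(*p) * (pd(Z, 1, p) - pd(Y, 2, p))
          + Y(*p) * (pd(X, 2, p) - pd(Z, 0, p))
          + Z(*p) * (pd(Y, 0, p) - pd(X, 1, p)))

p = (0.8, 1.3, 2.1)
# integrable: y dx + x dy + (xy/z) dz = 0, multiplier mu = z gives d(xyz) = 0
assert abs(condition(lambda x, y, z: y,
                     lambda x, y, z: x,
                     lambda x, y, z: x * y / z, p)) < 1e-6
# not integrable: y dx - x dy + dz = 0, the condition evaluates to -2
assert abs(condition(lambda x, y, z: y,
                     lambda x, y, z: -x,
                     lambda x, y, z: 1.0, p) + 2.0) < 1e-6
```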
Now differentiate the relation &psi;(x, y, z) = &fnof;(z), where &fnof; is a function to be determined, so obtaining $\frac{\partial \psi}{\partial x}dx + \frac{\partial \psi}{\partial y}dy + \left(\frac{\partial \psi}{\partial z} - \frac{df}{dz}\right) dz = 0;$ there exists a function &sigma; of x, y, z such that $\frac{\partial \psi}{\partial x} = \sigma \mathrm{X},\quad \frac{\partial \psi}{\partial y} = \sigma \mathrm{Y},$ because &psi; = constant, is the integral of Xdx + Ydy = 0; we desire to prove that &fnof; can be chosen so that also, in virtue of &psi;(x, y, z) = &fnof;(z), we have $\frac{\partial \psi}{\partial z} - \frac{df}{dz} = \sigma \mathrm{Z},$ namely $\frac{df}{dz} =\frac{\partial \psi}{\partial z} - \sigma \mathrm{Z};$ if this can be proved the relation &psi;(x, y, z) &minus; &fnof;(z) = constant, will be the integral of the given differential equation. To prove this it is enough to show that, in virtue of &psi;(x, y, z) = &fnof;(z), the function &part;&psi;/&part;z &minus; &sigma;Z can be expressed in terms of z only. Now in consequence of the originally assumed relations, $\frac{\partial \phi}{\partial x} = \mu \mathrm{X},\quad \frac{\partial \phi}{\partial y} = \mu \mathrm{Y},\quad \frac{\partial \phi}{\partial z} = \mu \mathrm{Z},$ we have $\frac{\partial \psi}{\partial x}\left/\frac{\partial \phi}{\partial x}\right. = \frac{\sigma}{\mu} = \frac{\partial \psi}{\partial y}\left/\frac{\partial \phi}{\partial y}\right.,$ and hence $\frac{\partial \psi}{\partial x}\frac{\partial \phi}{\partial y} - \frac{\partial \psi}{\partial y}\frac{\partial \phi}{\partial x} = 0;$ this shows that, as functions of x and y, &psi; is a function of &phi; (see the note at the end of part i.
of this article, on Jacobian determinants), so that we may write &psi; = F(z, &phi;), from which $\frac{\sigma}{\mu} = \frac{\partial \mathrm{F}}{\partial \phi};$ then $\frac{\partial \psi}{\partial z} = \frac{\partial \mathrm{F}}{\partial z} + \frac{\partial \mathrm{F}}{\partial \phi}\frac{\partial \phi}{\partial z} = \frac{\partial \mathrm{F}}{\partial z} + \frac{\sigma}{\mu} \cdot \mu \mathrm{Z} = \frac{\partial \mathrm{F}}{\partial z} + \sigma\mathrm{Z}$ or $\frac{\partial \psi}{\partial z} - \sigma\mathrm{Z} = \frac{\partial \mathrm{F}}{\partial z};$ in virtue of &psi;(x, y, z) = &fnof;(z), and &psi; = F(z, &phi;), the function &phi; can be written in terms of z only, thus &part;F/&part;z can be written in terms of z only, and what we required to prove is proved.
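The procedure may be illustrated by a short worked example (added here; it is not in the original). Take the equation $y\,dx + x\,dy + \frac{xy}{z}\,dz = 0,$ for which the condition above is satisfied, with &mu; = z. Putting z constant, the equation $y\,dx + x\,dy = 0$ has the integral &psi; = xy = constant, with &sigma; = 1. The rule $\frac{df}{dz} = \frac{\partial \psi}{\partial z} - \sigma\mathrm{Z}$ gives $\frac{df}{dz} = -\frac{xy}{z} = -\frac{f}{z},$ in virtue of xy = &fnof;(z); hence &fnof; = C/z, and the integral of the given equation is xyz = C.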

Consider lastly a simple type of differential equation containing two independent variables, say x and y, and one dependent variable z, namely the equation $\mathrm{P}\frac{\partial z}{\partial x} + \mathrm{Q}\frac{\partial z}{\partial y} = \mathrm{R},$ where P, Q, R are functions of x, y, z. This is known as Lagrange’s linear partial differential equation of the first order. To integrate this, consider first the ordinary differential equations dx/dz = P/R, dy/dz = Q/R, and suppose that two functions u, v, of x, y, z can be determined, independent of one another, such that the equations u = a, v = b, where a, b are arbitrary constants, lead to these ordinary differential equations, namely such that $\mathrm{P}\frac{\partial u}{\partial x} + \mathrm{Q}\frac{\partial u}{\partial y} + \mathrm{R}\frac{\partial u}{\partial z} = 0$ and $\mathrm{P}\frac{\partial v}{\partial x} + \mathrm{Q}\frac{\partial v}{\partial y} + \mathrm{R}\frac{\partial v}{\partial z} = 0.$ Then if F(x, y, z) = 0 be a relation satisfying the original differential equation, this relation giving rise to $\frac{\partial \mathrm{F}}{\partial x} + \frac{\partial \mathrm{F}}{\partial z}\frac{\partial z}{\partial x} = 0$ and $\frac{\partial \mathrm{F}}{\partial y} + \frac{\partial \mathrm{F}}{\partial z}\frac{\partial z}{\partial y} = 0,$ we have $\mathrm{P}\frac{\partial \mathrm{F}}{\partial x} + \mathrm{Q}\frac{\partial \mathrm{F}}{\partial y} + \mathrm{R}\frac{\partial \mathrm{F}}{\partial z} = 0.$ It follows that the determinant of three rows and columns vanishes whose first row consists of the three quantities &part;F/&part;x, &part;F/&part;y, &part;F/&part;z, whose second row consists of the three quantities &part;u/&part;x, &part;u/&part;y, &part;u/&part;z, whose third row consists similarly of the partial derivatives of v.
The vanishing of this so-called Jacobian determinant is known to imply that F is expressible as a function of u and v, unless these are themselves functionally related, which is contrary to hypothesis (see the note below on Jacobian determinants). Conversely, any relation &phi;(u, v) = 0 can easily be proved, in virtue of the equations satisfied by u and v, to lead to $\mathrm{P}\frac{\partial z}{\partial x} + \mathrm{Q}\frac{\partial z}{\partial y} = \mathrm{R}.$ The solution of this partial equation is thus reduced to the solution of the two ordinary differential equations expressed by dx/P = dy/Q = dz/R. In regard to this problem one remark may be made which is often of use in practice: when one equation u = a has been found to satisfy the differential equations, we may utilize this to obtain the second equation v = b; for instance, we may, by means of u = a, eliminate z; when, from the resulting equation in x and y, a relation v = b has been found containing x, y and a, the substitution a = u will give a relation involving x, y, z.
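As an illustration (this example is not in the original): for the equation $x\frac{\partial z}{\partial x} + y\frac{\partial z}{\partial y} = z$ the subsidiary equations dx/x = dy/y = dz/z give u = y/x = a and v = z/x = b, so the general integral is z = xF(y/x) with F arbitrary. The sketch below checks this numerically for one hypothetical choice of F:

```python
import math

def F(s):
    # an arbitrary smooth function; any choice of F should work
    return math.sin(s) + s * s

def z(x, y):
    # general integral z = x F(y/x) of  x dz/dx + y dz/dy = z
    return x * F(y / x)

# verify P z_x + Q z_y = R, i.e. x z_x + y z_y = z, by central differences
h = 1e-6
for (x, y) in ((1.0, 2.0), (0.5, -1.2), (3.0, 0.4)):
    zx = (z(x + h, y) - z(x - h, y)) / (2.0 * h)
    zy = (z(x, y + h) - z(x, y - h)) / (2.0 * h)
    assert abs(x * zx + y * zy - z(x, y)) < 1e-5
```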

Note on Jacobian Determinants.—The fact assumed above that the vanishing of the Jacobian determinant whose elements are the partial derivatives of three functions F, u, v, of three variables x, y, z,