Independent variables and partial differentiation

Discussion in 'Physics & Math' started by Pete, Feb 8, 2012.

  1. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    I hope this makes sense... I'm not familiar with the terminology.

    There is a vector space (x,y), with a linear transformation between (x, y) and (x', y'):
    \(x' = g_1(x, y) \\ y' = g_2(x, y)\)

    There is a set of points (a,b) in (x,y) defined by the following function:
    \(a = f(b,c)\)
    Where c is an independent variable.
    - What is db/dc? Is it zero? Is it undefined?

    The points (a,b) are transformed to (a',b').
    - Can we say from the above whether b' and c are independent?
    - Is there a rigorous definition of what 'independent' means?
    - Does it depend on the particular equation?

    Say we use the above to derive an expression like this:
    \(a' = h(a', b', c)\)
    - Are b' and c independent variables in that expression?
    - Can I simply treat b' as a constant when finding \(\frac{\partial a'}{\partial c}\), or do I need to treat it as a function of c?
     
  3. James R Just this guy, you know? Staff Member

    Messages:
    39,397
    I'm not sure whether I can answer this question in the abstract. So let me see if I can construct a simple example...

    I'll use

    \(x' = x + y\\ y'=x-y\)

    I'll take

    \(a=e^{bc}\)

    You said c is an independent variable, by which I assume you mean c is independent of b. That is, we can vary b as much as we want to without affecting c, and vice versa.

    This suggests to me that db/dc=0.

    Using my example

    \(a=e^{bc} \\ x' = x + y\\ y'=x-y\)

    so

    \(a'=a + b = e^{bc} + b\\ b'=a-b = e^{bc} - b\)

    It seems clear to me from the example that b' depends on c.
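
    A quick symbolic check of this (just a sketch, using Python's sympy and the transformation a' = a + b, b' = a - b from above):

    [code]
    import sympy as sp

    b, c = sp.symbols('b c')
    a = sp.exp(b*c)        # a = e^{bc}
    b_prime = a - b        # b' = a - b = e^{bc} - b

    # The partial derivative of b' with respect to c is not zero,
    # so b' does depend on c.
    print(sp.diff(b_prime, c))   # b*exp(b*c)
    [/code]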

    Doesn't it mean that you can vary one variable without affecting the other?

    Do you mean \(a'=h(a,b',c)\)? Using my example we have

    \(a'= e^{bc} + b = b' + 2b = b' + 2\frac{\ln a}{c}\)

    No, because \(b'=e^{bc}-b\)

    Doesn't the notation \(\frac{\partial a'}{\partial c}\) mean that all variables other than c are to be held constant? That would include b', wouldn't it?
     
  5. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    Thanks James,
    But the same logic suggests that dc/db=0, i.e. db/dc is infinite or something.
    It seems to me that db/dc=0 would imply that b can't vary:
    \(\begin{align} \frac{db}{dc} &= 0 \\ \int \frac{db}{dc} dc &= \int 0 dc \\ b + C_1 &= C_2 \\ b &= C \end{align}\)
    ...but I'm not enough of a mathematician to be certain that this proof is valid.
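
    For what it's worth, sympy's ODE solver reaches the same conclusion when b is treated as a function of c alone (a minimal sketch):

    [code]
    import sympy as sp

    c = sp.symbols('c')
    b = sp.Function('b')

    # Solve db/dc = 0, treating b as a function of c alone.
    print(sp.dsolve(sp.Eq(b(c).diff(c), 0), b(c)))   # Eq(b(c), C1), i.e. b is constant
    [/code]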

    That's my understanding, but I was wondering if there was any formal definition.

    In the specific case this question is based on, the expression was \(a'=h(a',b',c)\), but the key point is that the expression involves b' and c, and doesn't involve b, so your example is good.

    This is where I get confused.
    In that function, it seems that b' is a dependent variable, depending on b and c. (Or are they three interdependent variables?)
    But in the function:
    \(a'= b' + 2\frac{\ln a}{c}\)
    Where b is not involved, it seems to me that b' is independent of c, because each can vary without affecting the other.
    For any given value of c, b' can have any value (because b can have any value).
    For any given value of b', c can have any value (because b can have any value).

    Does that make sense, or am I right off the track?

    I think so... but I'm getting confused by b.
    We can hold b constant or b' constant, but not both (because that would imply holding c constant).
    b is not in the equation, but b' was earlier derived/defined as a function of b and c.
    So, do we hold b' constant, because b' is in the equation and b isn't?
    Or do we hold b constant, for some other reason?
     
  7. Trippy ALEA IACTA EST Staff Member

    Messages:
    10,890
    Consider a horizontal line, which is really what you're describing.

    The value of Y (or in this case b) is both constant and independent of the value of X (or in this case c).

    Or to put it another way, if you set the variable c to some constant, you make it independent of b.

    I'm not sure it necessarily follows that the derivative is 0 for all independent variables, however.
     
  8. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    So, dy/dx=0 implies that y is a constant, right?
    So if y is not a constant, then this implies that dy/dx is not zero?
     
  9. Trippy ALEA IACTA EST Staff Member

    Messages:
    10,890
    I would have said so, yes.

    If you consider

    Y=0
    Y=6
    Y=\(\frac{\sqrt{2}}{\pi}\)
    and Y=999999999999999999999999999999999999999999999999999999999999999

    In all cases \(\frac{dy}{dx}=0\)
     
  10. James R Just this guy, you know? Staff Member

    Messages:
    39,397
    What if y = z, and z is independent of x?

    In general, if y=f(x,z), then

    \(\frac{dy}{dx} = \frac{\partial y}{\partial x} + \frac{\partial y}{\partial z}\frac{\partial z}{\partial x}\)

    So, if y=f(x,z)=z, then

    \(\frac{dy}{dx} = 0 + 1\times \frac{\partial z}{\partial x} = \frac{\partial z}{\partial x}\)

    so, we need to know how z varies with x to find the total derivative of y with respect to x.

    If z is independent of x, then it seems to me that we must have \(\frac{\partial z}{\partial x} = 0\), and it then follows also that \(\frac{dy}{dx}=0\).
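
    A small sympy sketch of the y = z case (with z left as an unspecified function of x) shows the same thing:

    [code]
    import sympy as sp

    x = sp.symbols('x')
    z = sp.Function('z')(x)   # z as an unspecified function of x

    y = z                     # the case y = f(x, z) = z
    print(sp.diff(y, x))      # Derivative(z(x), x), i.e. dy/dx = dz/dx

    # If z does not actually vary with x, that derivative is zero, and so is dy/dx.
    [/code]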
     
  11. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    I think that should be
    \(\frac{dy}{dx} = \frac{\partial y}{\partial x} + \frac{\partial y}{\partial z}\frac{dz}{dx}\)

    Right... but "how z varies with x" implies some relationship between z and x. It doesn't make sense if z and and are independent, because when you vary x, z can do anything.


    Maybe I'm just not getting the concept of independence.
    It seems to me that independence is not an absolute thing, but is relative to the equation.
    eg:
    y = x+z
    implies y depends on independent variables x and z.
    But rewriting it as:
    x = y - z
    implies that x depends on independent variables y and z.

    Is it better in this case to say that x, y, and z are interdependent?
     
    Last edited: Feb 9, 2012
  12. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Penrose states in his book "The Road to Reality" that for a pair of independent variables, say u and v, expressing that some quantity, say Z, is a function of u and v but independent of v is given by: dZ/dv = 0. He also says that for any value of u, Z is constant in v, and so Z depends only on u.

    He also makes the point that this only holds locally, it might not be true for all Z.
     
  13. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    ...so Z is really only a function of u?
    Are you sure it didn't say \(\partial Z/\partial v = 0\)?

    That looks very much like a description of \(\partial Z/\partial v = 0\).
     
  14. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Yes, sorry, it actually concerns the Cauchy-Riemann equations. Partial derivatives are taken with respect to one variable by definition.

    Edit: but the principle holds for ordinary differentiation: if Z is constant in v the "slope" dZ/dv is zero, and v is independent.
    For a function of two variables, Z(u,v), there is no problem with Z being constant in v but still a function of u and v. The v could be contour lines of constant height (latitude) above the equator of a sphere, say, so they have gradients = 0.
     
    Last edited: Feb 9, 2012
  15. temur man of no words Registered Senior Member

    Messages:
    1,330
    If you fix c, you'll get a curve a=f(b,c) in general. So c cannot be taken as the only independent variable. Rather, we should consider b as a function of a and c. Differentiate the equation a=f(b,c) with respect to c to get

    \( 0=\frac{\partial f}{\partial b} \frac{\partial b}{\partial c} + \frac{\partial f}{\partial c}, \)
    so
    \( \frac{\partial b}{\partial c} = - \frac{\partial f}{\partial c} \left(\frac{\partial f}{\partial b}\right)^{-1}, \) or in short \(b_c = - f_c/f_b\),
    provided \(\frac{\partial f}{\partial b}\) is nonzero, so that it can be inverted.
    In effect, it is the same as holding a fixed, and defining the implicit function b(c) by a=f(b,c).
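
    For instance, with James's earlier \(a = e^{bc}\) this gives (a minimal sympy sketch):

    [code]
    import sympy as sp

    b, c = sp.symbols('b c')
    f = sp.exp(b*c)                          # the earlier example a = f(b, c) = e^{bc}

    db_dc = -sp.diff(f, c) / sp.diff(f, b)   # b_c = -f_c / f_b
    print(sp.simplify(db_dc))                # -b/c

    # Check: holding a = e^{bc} fixed means bc is constant, so b = k/c
    # and db/dc = -k/c**2 = -b/c, in agreement with the formula.
    [/code]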

    - Not from the above, but you can choose to consider b' and c as independent, because you have one equation and 3 variables.
    - It means you can freely choose the values of the variables, and still have freedom to satisfy the given equation/relation.
    - Yes, but in a weak way. If you have 1 equation involving 3 variables, you can choose 2 of them to be independent, unless some singular situations arise (look up implicit function theorem).

    - You can choose so.
    - Yes.
     
  16. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    Thanks temur.
    What about the total derivative \(\frac{db}{dc}\)?
    Does it have a value?
    Is it even meaningful?

    Thanks, I'll exercise due care

    One more...
    We have:
    \(\begin{align} x &= f(y,z) \\ \frac{dx}{dy} &= 0 \end{align}\)

    Is this correct:
    \(\begin{align} \int \frac{dx}{dy} dy &= \int 0 dy \\ x + C_1 &= C_2 \\ x &= C \end{align}\)

    Or this:
    \(\begin{align} \int \frac{dx}{dy} dy &= \int 0 dy \\ x + g(z) &= C_2 \end{align}\)


    My understanding is that the use of the expression \(\frac{dx}{dy}\) implies that z is a function of y, so x is implicitly a function of only y, and that the first process is correct.

    But I'm not certain, and there's disagreement about it in [post=2901509]another thread[/post] (it's a debate thread, so only Tach and I can post there).
     
  17. temur man of no words Registered Senior Member

    Messages:
    1,330
    It is not meaningful. In order to define this, you need to specify how a depends on c. The "total" derivative \(\frac{db}{dc}\) is taken along a given curve. Without a curve (i.e., a dependence such as a=a(c)), there is no "total" derivative.
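
    To illustrate with the earlier \(a = e^{bc}\) (so \(b = \ln(a)/c\)), two different assumed curves a(c) give two different values of db/dc (a sympy sketch):

    [code]
    import sympy as sp

    c, A = sp.symbols('c A', positive=True)

    # Curve 1 (assumed): hold a fixed at a constant A, so b = ln(A)/c.
    b1 = sp.log(A)/c
    print(sp.diff(b1, c))                    # -log(A)/c**2, i.e. -b/c

    # Curve 2 (assumed): let a = e^c instead, so b = ln(e^c)/c = 1.
    b2 = sp.log(sp.exp(c))/c
    print(sp.simplify(sp.diff(b2, c)))       # 0, since b is constant along this curve
    [/code]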

    Again you need to know how z depends on y. Otherwise it has no meaning.
     
  18. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    Right, we can't even talk about \(\frac{dx}{dy}\) unless z depends on y...

    So, if:
    \(\begin{align} x &= f(y,z) \\ z &= g(y) \\ \frac{dx}{dy} &= 0 \end{align}\)

    Then is this correct?
    \(\begin{align} \int \frac{dx}{dy} dy &= \int 0 dy \\ x + C_1 &= C_2 \\ x &= C \end{align}\)

    Thanks again
     
    Last edited: Feb 10, 2012
  19. temur man of no words Registered Senior Member

    Messages:
    1,330
    These conditions imply only that the function f(y,z) is constant along the curve z=g(y). For example, take g(y)=0. Then any of f(y,z)=1, f(y,z)=zy, f(y,z)=sin(z) satisfy the equation \(\frac{dx}{dy} = f_y+f_zg_y=0\).
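
    A quick sympy check of those three examples, with g(y) = 0 throughout (a sketch):

    [code]
    import sympy as sp

    y, z = sp.symbols('y z')
    g = sp.Integer(0)                        # the curve z = g(y) = 0

    for f in (sp.Integer(1), z*y, sp.sin(z)):
        total = sp.diff(f, y) + sp.diff(f, z)*sp.diff(g, y)   # dx/dy = f_y + f_z*g_y
        print(f, '->', total.subs(z, g))                      # 0 in every case
    [/code]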
     
    Last edited: Feb 10, 2012
  20. Pete It's not rocket surgery Registered Senior Member

    Messages:
    10,167
    *click*

    Thanks!
     
