Degree of accuracy...

Discussion in 'Physics & Math' started by BrianHarwarespecialist, Aug 29, 2015.

  1. BrianHarwarespecialist We shall Ionize!i Registered Senior Member

    Messages:
    869
    Measurement theory

    "Measurement theory is the study of how numbers are assigned to objects and phenomena, and its concerns include the kinds of things that can be measured, how different measures relate to each other, and the problem of error in the measurement process"

    -Britannica

    When measuring a construction, envision an error margin of 0.001 percent, or 0.00001 percent, and so on, when comparing the equivalence of different shapes.

    For this example I will use the quadrature of a circle; although an exact construction is technically impossible, approximations can come very close.

    Current constructions are at least 99.9999 percent accurate, an accuracy so fine that the error is considerably smaller than the diameter of the tip of the pencil used to make the constructions.

    So anyone measuring these constructions with a ruler will not be able to discern any noticeable error. For all practical purposes of geometric construction, comparing the equivalent areas of the circle and the square can be considered pseudo-correct: accurate to only a few decimal places, but still of perfect pragmatic usefulness.

    The method is also assumed to be capable of making constructions accurate to any desired degree, but since the human eye will never be able to see the difference (the error), why waste the time?
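    As a rough illustration of the error margin described above (my own sketch, not a method from the thread): if a construction effectively realizes the classical rational approximation π ≈ 355/113, the square built from it misses the circle's area by less than one part in a million. The function name `quadrature_error` is hypothetical.

    ```python
    import math

    def quadrature_error(radius: float, pi_approx: float = 355 / 113) -> float:
        """Relative error between a circle's true area and the area of a
        square whose side is built from an approximation of sqrt(pi)."""
        circle_area = math.pi * radius ** 2
        side = radius * math.sqrt(pi_approx)  # square side from the approximate pi
        square_area = side ** 2
        return abs(square_area - circle_area) / circle_area

    print(quadrature_error(1.0))  # well below 1e-6, i.e. "99.9999 percent accurate"
    ```

    At that level the discrepancy on a desk-sized drawing is far smaller than a pencil tip, which matches the point being made in the post.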
     
    Last edited: Aug 29, 2015
  3. BrianHarwarespecialist We shall Ionize!i Registered Senior Member

    Messages:
    869
    So I guess the question is: how small can you shrink the error margin before the number of steps gets too tedious?

    Consider that the error margin is not fixed: it changes depending on the accuracy produced by the method used.

    In just a few simple steps, the error can already be brought below anything detectable by the human eye.

    From here on it is better defined in the complex plane, but such precision, at least to me, serves nothing more than specialized applications.
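    The trade-off between steps and error margin can be made concrete with a standard tool (my illustration, not the poster's method): the continued-fraction convergents of π. Each convergent costs more "construction" complexity but shrinks the error dramatically.

    ```python
    from fractions import Fraction
    import math

    # Leading terms of the continued fraction of pi: [3; 7, 15, 1, 292, ...]
    PI_CF = [3, 7, 15, 1, 292, 1, 1, 1, 2, 1]

    def convergents_of_pi(n: int) -> list[Fraction]:
        """First n continued-fraction convergents of pi (n <= len(PI_CF))."""
        result = []
        for k in range(1, n + 1):
            frac = Fraction(PI_CF[k - 1])
            for a in reversed(PI_CF[: k - 1]):
                frac = a + 1 / frac  # fold the continued fraction from the inside out
            result.append(frac)
        return result

    for c in convergents_of_pi(5):
        err = abs(float(c) - math.pi) / math.pi
        print(f"{c} -> relative error {err:.2e}")
    ```

    The convergents come out as 3, 22/7, 333/106, 355/113, 103993/33102; already the fourth one (355/113) is accurate to better than one part in ten million, which is roughly where "not detectable by the human eye" sits for any drawable figure.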
     
  5. Boris2 Valued Senior Member

    Messages:
    1,106
  7. danshawen Valued Senior Member

    Messages:
    3,951
    The uncertainty principle derives as much from wave-particle duality as it does from the dynamics of bound energy in a universe in which there is only time, energy, and entanglement. The latter is what makes it possible for energy to become bound.

    A correct formulation of physics contains only these terms and c. References to space at rest (not even possible!) or to solid geometrical constructs including pi (which appears in Einstein's field equation due to the influence of Minkowski) betray a throwback to the Euclidean/Pythagorean geometry of Ancient Greece that does not take relativity into account. So does Newton's calculus, because of a non-existent dependence on a dt term that is the same everywhere, and in which time dilation does not exist.

    The twin paradox works just as well for relativistic rotation along a circular path as it does for straight trajectories (whatever that means), and this is the proof that even the value of the constant Pi is relative to an observer's state of motion. Wherever Pi as a constant is referenced, it must be referenced with respect to something that either has zero time dilation (is not moving or in a gravitational field) or is the same as the curvature of 'space' in the region of interest.

    Simultaneity only exists in this universe in the case of entanglement and in viewing the same event from different perspectives in the same inertial frame. It was Minkowski's vanity to treat simultaneity in terms of energy propagation as a limiting case of the granularity of time. It most definitely is not, nor is whatever is in a mathematician's mind ever a test of physical reality. The infinite granularity of time is something that is conceptually easy to understand (a timeline of virtual particle events), yet impossible to directly observe, even in the mind's eye. This is why it must be assumed, but entangled photon experiments have already confirmed it to a high degree of experimental certainty. One part in 10^13, to be exact. This is the speed of light times 10,000.
     
    Last edited: Sep 11, 2015
  8. BrianHarwarespecialist We shall Ionize!i Registered Senior Member

    Messages:
    869
    Maybe 'at rest' could be defined as the same as undetectable. This will of course create an error or inexactness in related quantities.
     
    danshawen likes this.
  9. danshawen Valued Senior Member

    Messages:
    3,951
    Even "undetectable" is governed by the uncertainty principle. Thanks! This is new.
     

Share This Page