Is Entropy a "property" or a "primrose-path"?

    #489
    Bill Eshleman
    Participant

    Dear Gyula,
    You responded to the concept of “Entropy and the H-theorem”
    with this excellent argument.

    “In the connection between Entropy and the H-theorem, some basic assumptions are not correct:

    – First of all: the positions, x, and the velocities, v, are not intrinsic properties of particles.
    – x and v are never known exactly: definite initial conditions, i.e. definite assumptions about x(t) and v(t) at some exact time t, are impossible.
    – Infinite distances (infinitely large and infinitely small) do not belong to physical descriptions. The particles cannot come arbitrarily close to each other.
    – The intrinsic properties of particles are two kinds of conserved charges, q_i and g_i; they are the elementary electric and the elementary gravitational charges.
    – q_i and g_i generate two fundamental interactions; the interactions are non-conservative in the presence of charges. The interactions propagate with c. The interactions between particles can be attractive or repulsive.
    – Particles without charges never exist; particles without interactions never exist.
    – At small distances the electromagnetic interaction dominates. At large distances gravity dominates; however, electromagnetism is also always present.
    – Elastic scattering of particles can never occur. The particles are always correlated.
    – Particles and fields always appear together.
    – What is the equilibrium of matter composed of particles and fields? Is the equilibrium the state of neutrinos, or of neutron stars? Or something else?

    Almost every basic assumption at the foundation of entropy is incorrect.”

    But I’m thinking that you demand too much from the concept
    of Entropy by wanting it to have “first principles”, when
    the possibility exists that Entropy is itself a
    “first principle”, needing no further assumptions at all.

    It is hard for me to put into words, but suppose that the
    things that look to you like “invalid assumptions” are
    really just approximate implications of the concept of
    Entropy; conclusions which are themselves uncertain and
    seemingly disordered. Boltzmann’s “thermodynamic entropy”
    seems of little use on its own, but once it was “cleaned
    up” by Gibbs and statistical mechanics and finally by
    Claude Shannon, reality becomes a “mixture problem”, much
    like bags of Jelly-Beans in a Jelly-Bean candy store, where
    the licorice flavors (positrons and eltons) are almost
    anomalous and where measurements can never be exactly
    determined.
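
    To make the “mixture problem” concrete: the candy-store
    entropy I have in mind is just the Shannon sum
    H = -sum_i p_i * log2(p_i) over the flavor fractions. Here
    is a minimal Python sketch; the jelly-bean counts are made
    up purely for illustration:

        import math

        def shannon_entropy(counts):
            """Shannon entropy, in bits, of a mixture given raw counts."""
            total = sum(counts)
            probs = [c / total for c in counts if c > 0]   # skip empty flavors
            return -sum(p * math.log2(p) for p in probs)   # -sum p * log2(p)

        # Hypothetical bag: cherry, lime, lemon, licorice
        print(shannon_entropy([40, 30, 25, 5]))            # ~1.77 bits per bean drawn

    The rarer the licorice beans are, the more “surprising” each
    one is, yet the less they contribute to the entropy of the
    whole bag.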

    If anything is to be assumed to be “exact”, I hope it
    is that Entropy is the only conserved property of open
    systems among a multitude of breakable symmetries.

    That force, momentum, and energy (among other things) are
    NOT conserved, and that Entropy is the only conserved
    property of interactions.

    I even question the appropriateness of thinking that
    23,000-year-old data about orbits near spiral-galaxy cores
    predicts what the cores are now; that is, if our Milky
    Way’s core imploded into a parallel universe today, we
    would not know it until 23,000 years from today, and in
    the year 25,016 astronomers would be pondering this “dark”
    gravitational force with no apparent source; a source
    which may have existed long ago, but is now somewhere
    else; a symmetry which seems to violate conservation of
    information, but does not.

    Sincerely,
    Bill Eshleman

    #490
    Gyula Szász
    Moderator

    “If anything is to be assumed to be “exact”, I hope it is that Entropy is the only conserved property of open systems among a multitude of breakable symmetries.
    That force, momentum, and energy (among other things) are NOT conserved, and that Entropy is the only conserved property of interactions.”

    The only conserved properties of interactions are their sources and their constant propagation c.

    #505
    Bill Eshleman
    Participant

    Dear Gyula,

    You said:
    “The only conserved properties of interactions are their sources and their constant propagation c.”

    I must agree, but then I must conclude that it is really
    the entropy which is conserved and propagated at c…

    So in this fashion, Atoms and Entropy would be equivalent
    notions, and this is what I’m thinking Boltzmann was
    “getting at” in the first place. That is, that conserved
    particles and conserved waves are what might be called
    “entropy-packets” (-P times log(P)) and are symmetric in
    Nature.

    Sincerely,
    Bill Eshleman

    #513
    Gyula Szász
    Moderator

    “… it is really the entropy which is conserved and propagated at c…”

    Unfortunately not! The interactions propagate with c, and the sources of the interactions are conserved.

    #514
    Bill Eshleman
    Participant

    This Wikipedia entry is what I was exposed to at university;
    it was called Communication Theory at that time. The so-
    called “other” entropies still baffle me. I call it the
    “candy-store” entropy and have been utterly brainwashed and
    hung out to dry on its “truth”.

    https://en.wikipedia.org/wiki/Entropy_(information_theory)#Entropy_as_information_content

    Communications require a transmitter, a channel, and a
    receiver. This theory allows electrical engineers to
    describe the information content of alphabetic characters
    and words so that optimal (but noisy) channels can be
    determined.

    e, P, p, and E are the alphabet; the atoms and composite
    particles are the words; the radiations are the noise;
    and the channel is space(time).
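
    To show what I mean by “information content” and by an
    optimal but noisy channel, here is a minimal Python sketch;
    the letter frequencies and the bit-flip probability are
    illustrative numbers only, not anything measured:

        import math

        def surprisal_bits(p):
            """Information content, -log2(p), of a symbol occurring with probability p."""
            return -math.log2(p)

        def bsc_capacity(flip_prob):
            """Capacity, in bits per use, of a binary symmetric channel
            that flips each transmitted bit with probability flip_prob."""
            if flip_prob in (0.0, 1.0):
                return 1.0
            h = -(flip_prob * math.log2(flip_prob)
                  + (1 - flip_prob) * math.log2(1 - flip_prob))
            return 1.0 - h

        print(surprisal_bits(0.127))   # a common letter like 'e' (~12.7%): ~2.98 bits
        print(surprisal_bits(0.001))   # a rare letter (0.1%): ~9.97 bits
        print(bsc_capacity(0.11))      # a noisy channel: ~0.50 bits per use

    Rare symbols carry more bits apiece, and the noisier the
    channel, the fewer of those bits survive each use.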

    To me, the analogy is fascinating; so I think/speculate
    that most, if not all, models of particles and their interactions can be modeled not necessarily with Newton’s
    old-style Calculus, but with the more modern Shannon
    Information(communication) Theory concepts instead.

    But it is important to know that I view this as merely an
    hypothesis; the theory comes to me courtesy of Boltzmann, Gibbs, et. al., and finally Shannon.

    Entropy is an extensive property and its paths are real
    and reversible and conserved and symmetric(usually).

    Thank you, cough, cough.

    #515
    Bill Eshleman
    Participant

    And this is the man (Cédric Villani) who might be
    responsible for creating a new way to replace Calculus as
    a modeling language. He received the 2010 Fields Medal in
    part for his work on the Boltzmann Equation.

    #516
    Gyula Szász
    Moderator

    The distinction between my effort and what Boltzmann, Gibbs, et al., and finally Shannon have done is that I have a dynamical proposal for several concrete particles with concrete fundamental interactions.
    This is a completely different purpose from the modeling that Boltzmann and the others had.

    #517
    Bill Eshleman
    Participant

    A suspicious dichotomy, I’m thinking. A hidden symmetry.
    Possibly even a symmetry between Calculus and Information
    Theory. Or maybe even as “trivial” as “d” versus “delta”,
    and “integral” versus “summation”. To my way of thinking,
    numbers are only numbers, so maybe the symmetry is between
    the “concreteness of reality” and the “nothingness of
    numbers”…

    #518
    Bill Eshleman
    Participant

    I’m even thinking that Feynman diagrams could easily
    have been developed as “entropic” instead of “energetic”.
    He sums vectors that occur simultaneously, and dot-products
    vectors that occur one after the other. I like Feynman
    diagrams, but sadly, they only represent how shallow my
    understanding of QM is. 🙁
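
    In the standard bookkeeping those “vectors” are complex
    amplitudes: multiply the amplitudes of steps that occur one
    after the other along a path, and add the amplitudes of
    alternative paths. A toy Python sketch, with made-up
    numbers and nothing physical about them:

        # Multiply amplitudes along a path, add the amplitudes of alternative
        # paths, then take the squared magnitude to get a probability.
        path_a = (0.8 + 0.0j) * (0.6 + 0.3j)   # two steps in sequence: multiply
        path_b = (0.1 + 0.5j) * (0.4 - 0.2j)   # an alternative route: its own product
        amplitude = path_a + path_b            # indistinguishable alternatives: add
        print(abs(amplitude) ** 2)             # ~0.56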

    #519
    Bill Eshleman
    Participant

    And as for that minus sign that you modern physicists put
    into my “precious” Pythagorean Theorem: I don’t interpret
    what happens as the creation of an imaginary axis; I see it
    as the creation and/or revelation of another real
    3-dimensional world for so-called “weak fields”; a world
    where our copies see time as the cause of interactions,
    whereas in this world we see the opposite… that
    interactions are the cause of time. And as the fields get
    stronger and stronger, more and more other worlds get
    created and/or revealed.

    That the “weak-field” “other-world” could be an (E, p)
    world would be very elegant indeed.

    The irony might be that for the “strong-field” we may
    be detecting particles that belong in the other-worlds and
    not in this world at all.

    #534
    Bill Eshleman
    Participant

    Dear Gyula,

    Now, with my feet again firmly attached to the Earth, and
    not at all attacking your theory, nor even proposing to fool
    you into a chain of reasoning that could negate your
    assumptions…

    I propose that your theory be left intact, and suggest that
    it is only the form of your formalism, your purely Calculus
    treatment of motion, that could stand a more statistical
    flavor. When I listen to you say that the positions and
    velocities cannot be determined and that the Laws of Nature
    are non-deterministic but causal, I see the need for your
    concept of motion to be augmented by the concept of
    “optimal transport” of information (entropy). I believe
    that statistical laws are what Nature has; probabilities
    and their flow are a start, but large collections of
    objects need the concept of a “measure”, and the term
    “entropy” is that measure. Here is another video that I
    think you should be interested in:

    I suggest that your formal treatment is most accurately
    described as “analytic”, and that Information Theory is,
    as Cedric says, “synthetic” and so general that elegant and
    beautiful concepts are born out of this generality.
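
    To make “optimal transport” slightly more concrete, here is
    a minimal Python sketch of the simplest one-dimensional,
    discrete case; the two distributions are made up purely for
    illustration:

        def wasserstein_1d(p, q):
            """Wasserstein-1 (earth mover's) distance between two
            distributions given on the same grid of unit-spaced points."""
            cp = cq = 0.0
            total = 0.0
            for pi, qi in zip(p, q):
                cp += pi                 # running cumulative mass of p
                cq += qi                 # running cumulative mass of q
                total += abs(cp - cq)    # mass still to be carried past this point
            return total

        # Hypothetical distributions over four equally spaced bins
        print(wasserstein_1d([0.7, 0.2, 0.1, 0.0], [0.0, 0.1, 0.2, 0.7]))   # 2.2

    The result is the least total “mass times distance” needed
    to reshape the first distribution into the second, which is
    the sense in which the transport is optimal.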

    Sincerely,
    Bill Eshleman
