
DetlefK

(16,423 posts)
Mon Jan 23, 2017, 06:12 AM

Question about thermodynamics

Let's say I have the sequence "34251". It is disordered and therefore carries a certain amount of entropy. Now that sequence gets ordered by some process or algorithm, and the new sequence is "12345". It is more ordered than the original and therefore carries less entropy.

1. Where did the entropy go during the process? Where is it now that the process is finished?

2. If the system contains entropy, there must be some mathematical equivalent to "temperature". What is it?


Buckeye_Democrat

(14,853 posts)
1. I haven't studied thermodynamics and entropy for years, but...
Mon Jan 23, 2017, 07:55 AM

I don't think '12345' would represent less entropy. Information-wise, I think entropy more accurately represents the number of ways a system can be arranged. Those five digits can be arranged in 120 ways, and that's a better indication of the entropy than the actual ordering.
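
Here's a rough Python sketch of that counting argument (the Boltzmann constant is real, but treating each ordering of the digits as an equally likely microstate is just for illustration):

import math

digits = "34251"
W = math.factorial(len(digits))   # 120 distinct arrangements (microstates)

# Boltzmann's formula S = k * ln(W): the entropy reflects the count of
# possible arrangements, and "12345" is just one microstate out of the 120.
k_B = 1.380649e-23                # Boltzmann constant, J/K
S = k_B * math.log(W)
print(W, S)                       # 120, ~6.6e-23 J/K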

Temperature is proportional to the average kinetic energy of the particles in the system.

Entropy has the dimension of energy divided by temperature. So if the total energy of a closed system is constant, an increase in entropy indicates that some kinetic energy has been converted into other forms during the process, which increases the number of ways the total energy can be arranged.
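
That "energy divided by temperature" dimension is easiest to see in the Clausius form, dS = dQ/T. Here's a minimal numeric sketch, with made-up reservoir temperatures and heat flow:

# Heat Q flows from a hot reservoir to a cold one. The cold side gains
# more entropy than the hot side loses, so the total entropy rises.
Q = 100.0        # J of heat transferred (illustrative value)
T_hot = 400.0    # K
T_cold = 300.0   # K

dS_hot = -Q / T_hot      # entropy change of the hot reservoir, in J/K
dS_cold = Q / T_cold     # entropy change of the cold reservoir, in J/K

print(dS_hot + dS_cold)  # ~ +0.083 J/K: positive, as the second law requires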

I suppose another way to look at it is that not all collisions are elastic, where both the momentum and kinetic energy of a system are conserved. In inelastic collisions, some of the kinetic energy is transformed into other forms, such as heat or radiation.

Someone with a PhD in physics could probably explain it better, and more accurately, than me. Entropy has a reputation for being a tricky concept, so I'm not going to pretend that I totally grasp it.

There's an idea from Erik Verlinde called "Entropic Gravity" which, if true, would do away with the need for "dark matter" to explain the observed rotations of galaxies and galaxy clusters. It's far beyond my understanding!
https://en.wikipedia.org/wiki/Entropic_gravity

DetlefK

(16,423 posts)
2. My example was bad, but the question is still valid:
Mon Jan 23, 2017, 08:40 AM

Let's say I have a question with many possible answers. The system has a high entropy. (The entropy is maximal when all answers are equally probable.)

Then I work on that problem with some kind of process or algorithm. The probability of some answers increases while the probability of others decreases. Overall, the entropy goes down.

Once the system has reached the point where one answer has 100% probability and all other answers have 0% probability, the system has reached a state of minimum possible entropy.
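
In information-theoretic terms that's the Shannon entropy, H = -Σ p·log2(p): maximal for a uniform distribution and zero when one answer is certain. A minimal Python sketch, with invented distributions:

import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits; zero-probability answers contribute nothing
    return 0.0 - sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximal for 4 answers
print(shannon_entropy([0.70, 0.10, 0.10, 0.10]))  # ~1.36 bits: partial knowledge
print(shannon_entropy([1.00, 0.00, 0.00, 0.00]))  # 0.0 bits: one answer certain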



Where did that entropy go? Where did the algorithm put it?
(Entropy can only be destroyed under very special thermodynamic circumstances that don't apply here.)

Buckeye_Democrat

(14,853 posts)
3. I think the algorithm would be considered work done on it...
Mon Jan 23, 2017, 05:18 PM

and entropy can decrease due to outside work/energy. It only stays the same or increases in an isolated system without such outside influence.

If the work is instead considered part of the same system, the entropy decrease in one part is more than offset by the entropy increase from the work done upon it, so the overall entropy of the system still increases. If it's something like a computer running the algorithm, then the heat it radiates would be a source of greater overall entropy.
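
Here's a back-of-the-envelope Python sketch of that bookkeeping. The numbers are invented: 10 W of waste heat for one second, and the 120 orderings from the "34251" example above:

import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
T_room = 300.0        # K

# Entropy drop of the "problem": from 120 equally likely orderings
# of "34251" down to the single sorted one.
dS_problem = -k_B * math.log(120)   # ~ -6.6e-23 J/K

# Entropy dumped into the room by the computer running the sort:
Q = 10.0 * 1.0                      # 10 W of waste heat for 1 s, in J
dS_room = Q / T_room                # ~ +0.033 J/K

# The total still rises, by roughly 20 orders of magnitude more
# than the problem's entropy fell.
print(dS_problem + dS_room)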

Something like that.

Like I said, it's been a while. People better educated in physics could explain it better, and more convincingly, than me.

Here's a video about entropy that I just found, but it's still no match for the "language" of mathematics in these kinds of matters.

DetlefK

(16,423 posts)
5. You are mixing up two different kinds of entropy.
Tue Jan 24, 2017, 05:58 AM

There is one entropy describing the state of the question-answer system. The heat generated by the computer/brain belongs to a different one: the entropy of the matter the computer/brain is made of.

What very, very, very few physicists know is that entropy can be destroyed (or, to be more precise, can self-destruct). You wait until the system goes to a state of lower entropy via stochastic, random processes, and then you conserve the system in that state. In layman's terms: you wait until the problem solves itself. This can take billions of years or more, depending on the problem, because the probability for the entropy to self-destruct is really, really small.
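
Here's a toy Python simulation of that wait-until-it-solves-itself strategy, using the five-digit sequence from my original post (purely illustrative):

import random

target = list("12345")
state = list("34251")

steps = 0
while state != target:
    random.shuffle(state)   # random microscopic dynamics
    steps += 1

# Each shuffle hits the ordered state with probability 1/120 (1/5!),
# so this takes ~120 steps on average. For a system with ~10^23
# particles, the same wait dwarfs the age of the universe.
print(steps)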



The algorithm doing work on the system: that sounds good. But that would mean we need something akin to "temperature" in the system.

Buckeye_Democrat

(14,853 posts)
6. Sean Carroll might have mentioned that after the 34-minute mark...
Tue Jan 24, 2017, 06:09 AM

of the video in post #4.

It's the part where he talks about Boltzmann trying to explain how the early universe started in a state of such low entropy.

Sorry if I misunderstood your question. The answer? I don't really know. If it's ultimately based on probability, then some very unusual things can happen sometimes.

Buckeye_Democrat

(14,853 posts)
4. Here's a better video about entropy.
Mon Jan 23, 2017, 08:33 PM

It's mostly about the direction of time and its relationship to entropy, but there's some good stuff about entropy itself starting around the 14-minute mark.

NNadir

(33,514 posts)
7. Your question is fairly sophisticated and is another statement of the Maxwell's demon problem.
Sat Feb 11, 2017, 10:29 AM

Maxwell asked what would be the consequence of a putative demon on an atomic scale who opened a door to let molecules exceeding a certain speed through it and blocked all others. Theoretically this would violate the second law of thermodynamics, under which two systems in thermal contact eventually reach the same temperature.

The solution to this problem concerns the entropy associated with information processing. We see this entropy in everyday experience in the fact that computer chips get warm when they operate.

In this case the entropy of the universe increases even if the entropy of the local system decreases.
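
Here's a per-decision Python sketch of that accounting, along the lines of the Landauer/Bennett resolution of the demon (the k·ln 2 figures are the standard textbook bounds, not exact for a real gas):

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Best case: each fast/slow sorting decision lowers the gas entropy
# by about k * ln(2), one bit's worth.
dS_gas = -k_B * math.log(2)

# But the demon must record that bit, and erasing the record to make
# the next decision generates at least k * ln(2) of entropy (Landauer).
dS_memory = k_B * math.log(2)

print(dS_gas + dS_memory)   # >= 0.0: the demon never wins on net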

hunter

(38,311 posts)
8. Heh, what does the internet weigh? The electrons in motion? About the same as a strawberry.
Sat Feb 11, 2017, 01:10 PM


http://www.telegraph.co.uk/technology/internet/8865093/Internet-weighs-the-same-as-a-strawberry.html

https://people.eecs.berkeley.edu/~kubitron/

The information itself? Much less.

Computers are interesting. They don't work by organizing information; they work by disposing of it, by throwing away the information you don't want.

This computing process turns highly ordered electric energy into low-grade heat. Even a "perfect" computing machine, one that used the least possible electric energy to move bits around, would do this.

This paper goes into some math:

http://ieeexplore.ieee.org/document/7738677/

These authors estimate modern computer technologies use 100-10,000 times more energy than the theoretical limit.
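
For scale, here's that theoretical limit worked out in Python at room temperature. The Landauer bound k·T·ln 2 is standard; the 100x and 10,000x multipliers are just the paper's estimates from above:

import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # K, roughly room temperature

landauer = k_B * T * math.log(2)   # minimum heat per bit erased
print(landauer)                    # ~2.9e-21 J per bit

print(100 * landauer)              # ~2.9e-19 J per bit
print(10_000 * landauer)           # ~2.9e-17 J per bit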

There are also possible computer designs that would operate in a different manner, not throwing any information away. That's where things get very weird and controversial, talking about quantum computers and fully reversible computational processes.

Personally, I'd begin in the land of signal-to-noise ratios and sampling theory, which is where all this "information theory" came from.

Monty Montgomery is doing some interesting work explaining various aspects of human perception and signal theory here, in a way that's understandable without the heavy math:

https://people.xiph.org/~xiphmont/demo/neil-young.html

https://xiph.org/video/vid2.shtml

How do we understand the universe unless we are fully aware of the limitations and prejudices of our own human perceptions, especially as those perceptions relate to this thing we call "time"?

I'm not talking about anything metaphysical here. Our internal model of the "universe and everything" is clearly a product of our evolutionary history. It's not necessarily an accurate or realistic model; all it had to do was bring your genetic self to this moment.
