Entropy: A little understood concept in physics [video]
231 points by guptarohit 2 years ago | 163 comments
- javajosh 2 years ago Entropy only made sense when I learned it from the perspective of statistical thermodynamics. It's a very programmerly understanding, IMHO, and it's quite intuitive. EXCEPT that the language used is ridiculous: grand canonical ensemble indeed! Anyway, the idea that a system can be in some number of specific states, and that equilibrium is that unique situation where the number of possible specific states is at its maximum, really spoke to me.
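A minimal sketch of that counting picture (toy numbers, not from the comment above): each of N particles sits in the left or right half of a box, the macrostate "n on the left" has C(N, n) microstates, and the count, hence the entropy, peaks at the 50/50 split, i.e. at equilibrium.

```python
from math import comb, log

N = 20  # toy system: 20 particles, each in the left or right half of a box

for n_left in range(N + 1):
    W = comb(N, n_left)  # number of microstates for the macrostate "n_left on the left"
    S = log(W)           # entropy in units where k_B = 1, i.e. S = ln W
    print(f"{n_left:2d} on the left | W = {W:6d} | S = {S:5.2f}")

# The 50/50 macrostate (n_left = 10) has the most microstates, so it is the equilibrium
# macrostate: the "unique situation" where the number of possible specific states peaks.
```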
- guga42k 2 years ago If somebody needs to build an intuition about entropy, they could think about a simple problem.
You are given an insulated cylinder with a barrier in the middle. The left side of the cylinder is filled with ideal gas A, and the right side with ideal gas B. Given a single particle, one can distinguish A from B. The pressure and temperature on both sides are the same. Then you remove the barrier and the gases mix. Question: how much work do you need to do to return the system to its original state? Hint: the minimum work is the temperature times the entropy difference between the two states.
More generally, if you take a properly insulated system and leave it be for a while, you will suddenly have to do some work to get back to the original state, even though the law of energy conservation holds.
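A rough sketch of the numbers behind that puzzle, assuming ideal gases, equal volumes, and one mole on each side (the temperature below is an arbitrary illustrative value): removing the barrier lets each gas double its accessible volume, so the mixing entropy is 2nR ln 2, and the minimum reversible, isothermal work to undo it is T times that.

```python
from math import log

R = 8.314  # gas constant, J/(mol*K)
n = 1.0    # moles of gas on each side (illustrative value)
T = 300.0  # temperature in kelvin (illustrative value)

# Each gas doubles its accessible volume when the barrier is removed.
dS_mix = 2 * n * R * log(2)  # entropy of mixing, J/K
W_min = T * dS_mix           # minimum isothermal work needed to un-mix, J

print(f"Entropy of mixing: {dS_mix:.2f} J/K")
print(f"Minimum work to restore the original state: {W_min:.0f} J")
```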
- ithkuil 2 years ago If you need to do work in order to revert to the previous state, does it imply you can extract work when going from the first to the second state?
Given the scenario you just laid out, it seems no work can be extracted just by letting two substances mix that are at the same temperature and pressure. But there is something about it that doesn't quite add up to my intuition of symmetry and conservation laws. Could you please elaborate more on that?
- Lichtso 2 years ago I think you can very well extract work by using a membrane that selectively lets one substance mix into the other but not the other way around [0]. It is called osmosis [1].
[0]: https://en.wikipedia.org/wiki/Semipermeable_membrane [1]: https://en.wikipedia.org/wiki/Osmosis
- guga42k 2 years ago >If you need to do work in order to revert to the previous state, does it imply you can extract work when going from the first to the second state?
Nope. The work comes from the system going from an ordered state to a disordered one. The reason the problem above is good for intuition is that you can work out how to reverse the state. You invent a semi-magical barrier which is fully transparent to particles A and reflects particles B, then you push that barrier from left to right up to the middle, compressing gas B (and doing work on it!) and leaving the left part with gas A only; then repeat a similar exercise on the right side.
>Given the scenario you just laid out it seems no work can be extracted just by letting mix two substances that are at the same temperature and pressure. But there is something about it that doesn't quite add up to my intuition of symmetry and conservation laws. Could you please elaborate more on that?
As far as I understand, this asymmetry was the exact reason entropy was introduced. It was later explained by Boltzmann via a measure of the number of microscopic states.
Naturally, the second law of thermodynamics forbids perpetual motion machines.
- kuchenbecker 2 years agoEnter Maxwell's demon as a completely valid solution to this problem showing you can decrease entropy within that system (but you need to exclude the demon from the system).
- dekhn 2 years ago I took a stat thermo class and it was basically all about entropy, which was expressed as ln W - the log of the number of ways (permutations) that a system can be ordered, which gives a convenient denominator when calculating the probability of a specific permutation. Here's the professor's book, which was still only in latex form when we took the class: https://www.amazon.com/Molecular-Driving-Forces-Statistical-...
- javajosh 2 years ago yes, there are lots of quantitative details. I wanted to emphasize the key qualitative concept, from which the others can derive. In a similar way you can derive all of special relativity, and approach an intuition about the strangeness of spacetime, starting with only two ideas: the laws of physics are the same in all reference frames; the speed of light is constant. I prefer to start there and derive e.g. Lorentz factors rather than start with the mathy stuff.
- kergonath 2 years ago> starting with only two ideas: the laws of physics are the same in all reference frames; the speed of light is constant.
Isn’t this redundant, though? The constant velocity for light in a vacuum comes directly from the laws of (classical) electromagnetism in the form of Maxwell’s equations. So “the laws of Physics are the same in all reference frames” implies “Maxwell’s equations are valid in all reference frames”, which in turn implies “the velocity of light in vacuum is the same in all reference frames”. That’s what I understood reading Einstein himself.
I think it’s much stronger that way. Otherwise we get to why light should be a special case, which is difficult to defend. The constant velocity of light (in vacuum) being an unavoidable consequence of the laws of Physics makes it much stronger.
> I prefer to start there and derive e.g. Lorentz factors than start with the mathy stuff.
That’s how Einstein himself explained it (with trains and stuff, but still) and it makes a lot of sense to me. Much more than the professor who did start with the Lorentz transform and then lost everyone after 20 minutes of maths.
- contravariant 2 years ago That intuition is still a bit shallow though. I don't mean that in a bad way, some intuition is better than none at all. However if you start to dig you'll find out the terminology goes out of whack.
Note that you're describing equilibrium as a unique situation where the number of possible states is at a maximum. Now how can a situation be unique if it has the maximum number of possible states? Clearly the situation is as far from unique as it can be.
To resolve the contradiction requires distinguishing between features of the probability distribution and features of a random sample (i.e. a possible state) and also needs an explanation how it even makes sense to view a deterministic physical system (leave quantum mechanics for now) as a random variable.
The theory that links everything together is ergodic theory, which has a couple of handy theorems. One is that for a certain kind of dynamical system the average over time and the average over the 'possible states' agree. Such a system can also be assigned an entropy. It even suggests that generally a system will be found around states with a probability close to 2^-entropy (this is not absolutely always true ..but close enough for physicists)
Now what does such a system look like? Well we need a state space (easy) and a measure on it which is constant as the system evolves (i.e. we can pick a region in the state space, evolve it and its volume will stay constant). The last part is tricky, but as it turns out classical mechanics gives us the phase space and the canonical volume on it (basically the standard notion of volume) which fit the bill. This gives a probability distribution on the state space and an entropy equal to log(volume in phase space), which matches the definitions in statistical physics but also gives a solid foundation for some of the seemingly arbitrary choices.
So there you have it, that's why a system can have a probability distribution attached to it, despite being deterministic, why 'high entropy states' are common, and why physical systems have a uniform distribution (and therefore an entropy which is the log of the number of states).
This also explains how physicists got away with using a uniform distribution without worrying about which variables they used. By pure 'coincidence' the standard choice of variables that physicists use has this incredibly nice property that makes everything work out. I'm not sure if this is too well known, so it might be worth abusing this to 'prove' a perpetuum mobile is possible, to stop people using uniform distributions without due deliberation.
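Stated schematically, in my own notation rather than the comment's, the two ingredients being leaned on are the phase-space form of Boltzmann's entropy and the ergodic "time average equals ensemble average" theorem:

```latex
S = k_B \ln \Omega, \qquad
\Omega = \operatorname{vol}\bigl(\text{phase-space region compatible with the macrostate}\bigr),

\lim_{T\to\infty} \frac{1}{T}\int_0^T f(\varphi_t x)\,dt
  = \int f\,d\mu \quad \text{for } \mu\text{-almost every } x,
```

for a measure-preserving, ergodic flow φ_t; Liouville's theorem is what makes the phase-space volume μ invariant under the dynamics.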
- passion__desire 2 years agoEntropy is mathematical force to be honest.
- Solvency 2 years agoHuh? "Force"?
- esafak 2 years agoconcept
- floatrock 2 years agoThe part that was new to me was the bit about how a space full of life tends toward more entropy faster than the same amount of space without life.
Like the best ideas, it’s simple and makes sense if you think about it, but it’s still a really interesting framing that the complex machinery of life is really just the most efficient “entropy converter”.
If there’s something about the arrow of time that speeds towards the heat death of the universe, we’re just helping it go a tiny bit faster here on our floating speck of dust.
- quickthrower2 2 years ago Does Einstein's model of time care about entropy? In other words, if there are 2 regions, one where entropy is increasing at that time and one where it isn't as much, does it affect time?
- semi-extrinsic 2 years agoNo. Entropy can be used to explain the direction of time, as a kind of symmetry breaking of all the microscopic laws that are symmetric in time. But it does not say anything about the "speed of time". Relativity does tell us the speed of time - it's the speed of light.
- Balgair 2 years ago To be suuuuuuper pedantic here: Relativity tells us that time is a dimension, one that is a bit unique in that it has a constant attached to it. The 3 dimensions you're used to are just normal, they have no constants.
(x,y,z)
Meters of x are meters of z and meters of y. Relativity (and I'm really simplifying a lot by just saying 'relativity'), well, relativity comes along and says that time is also a dimension, just with the constant of 'c' attached (the speed of light). That way you can convert seconds into meters.
(x,y,z,ct) not just (x,y,z,t).
So now the time coordinate is scaled up enormously relative to the spatial ones: one second of time corresponds to about 300,000,000 meters, a third-ish of a billion meters.
Now, there is a lot more about relativity, like, just tons. And I skipped most of it. And trying to just say that time is a simple little conversion away from meters is just wrong. And how that all relates to entropy is a mess that we really haven't figured out yet.
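A tiny illustration of that ct bookkeeping (Python, with the usual flat-spacetime interval in the (+,-,-,-) convention; the example events are made up):

```python
c = 299_792_458.0  # speed of light, m/s

# One second of time, expressed in meters of ct:
print(c * 1.0)  # ~3.0e8 m, the "third-ish of a billion" meters per second of time

def interval_squared(dt, dx, dy, dz):
    """Flat-spacetime interval s^2 = (c*dt)^2 - dx^2 - dy^2 - dz^2."""
    return (c * dt) ** 2 - dx ** 2 - dy ** 2 - dz ** 2

# Two events separated by 1 s in time and 1 km in space: the time term dominates utterly.
print(interval_squared(1.0, 1000.0, 0.0, 0.0))
```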
- klabb3 2 years ago> Relativity does tell us the speed of time - it's the speed of light.
I forget the name of the book, but it was trying to convey intuitions about relativity (first special, then general).
It’s very easy to make the mistake of trying to understand space and time first, concepts we think we intuit, but in relativity these are somewhat higher-level concepts. Instead, start with the most fundamental part of the theory and go from there: the speed of light is constant. Accept that first. It’s the comfort zone. You can always return safely to this point.
So, when moving to space and time, the book explained it like this: everything moves at the speed of light, at all times. It’s just that instead of x,y,z – we add t, time, as well. So for an object that’s still, all its movement is through the time dimension. Conversely, an object that moves incredibly fast, like a photon, already “used” its speed in the spatial dimensions, so it doesn’t “age” in terms of time.
This is just special relativity, but I liked this approach. It’s basically embracing the theory first instead of trying to shoehorn it into the world we have so many misconceptions about.
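A small sketch of that "everything moves at c" bookkeeping, using only the standard time-dilation factor (the speeds below are arbitrary examples): the rate at which a moving clock ages is dtau/dt = sqrt(1 - v^2/c^2), which is 1 at rest and falls toward 0 as the spatial speed approaches c.

```python
from math import sqrt

c = 299_792_458.0  # m/s

def aging_rate(v):
    """Proper time per coordinate time, d(tau)/dt, for an object moving at speed v."""
    return sqrt(1.0 - (v / c) ** 2)

for v in (0.0, 0.5 * c, 0.9 * c, 0.999 * c):
    print(f"v = {v / c:.3f} c  ->  ages at {aging_rate(v):.4f} of the rate of a clock at rest")
# At rest, all the "motion" is through time; as v approaches c, almost none of it is.
```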
- danq__ 2 years ago Entropy is orthogonal to Einstein's laws of physics.
Both Newton's and Einstein's descriptions of the universe are time reversible. The physics are valid independent of the direction of time.
Entropy is literally a separate observable phenomenon with its own set of axioms.
- Piezoid 2 years agoThere is a video by PBS Space Time on that subject: https://youtu.be/GcfLZSL7YGw
- Timon3 2 years ago> Like the best ideas, it’s simple and makes sense if you think about it, but it’s still a really interesting framing that the complex machinery of life is really just the most efficient “entropy converter”.
Right? I think this could be a really interesting basis for a sci-fi novel - the first space-faring civilization in the entire universe trying to force every form of life to keep "entropy usage" to a minimum, so they can prolong their own life span.
- hunter-gatherer 2 years ago Isaac Asimov's short story "The Last Question" isn't super in line with your suggestion, but it is a short discourse on humanity's struggle against entropy.
- ineptech 2 years agoNot exactly what you're describing, but this is along those lines: https://slatestarcodex.com/2015/06/02/and-i-show-you-how-dee... It's pretty silly but the ending is both satisfying and relevant.
- franky47 2 years ago> a space full of life tends toward more entropy faster than the same amount of space without life.
And a space full of children is exponentially faster at increasing entropy.
- tgv 2 years agoIsn't there a theorem that says that the overall rate is constant in a closed system? If you look at the universe as a whole, it would be closed, and we wouldn't have any impact. However, I learnt about entropy decades ago, and never applied it, so don't take my word for it.
- ctafur 2 years ago Regarding entropy, and thermodynamics in general, I can't recommend enough the notes [0] of Prof. Zhigang Suo of Harvard. It's a new way of presenting thermodynamics and I finally get it... contrary to when I took a thermo course at university.
[0]: https://docs.google.com/document/d/10Vi8s-azYq9auysBSK3SFSWZ...
Also prof. Suo puts entropy as the main character of the "play". The other concepts (temperature, etc.) are defined from entropy.
- hospitalJail 2 years agoLast night me and my wife were deciding if we should watch The Witcher or this video.
I decided I didn't have the brainpower/mental capacity to think about The Witcher and that this video on Entropy would be easier to digest.
- SanderNL 2 years agoNot sure if you are humble bragging, but I agree The Witcher is more demanding. Layers of meaning, emotions, allegory, subtext. It’s no Shakespeare, but physics and math are simple in comparison especially in the wonderfully produced and easily digestible format of Veritasium.
- Balgair 2 years agoI mean, the author has a doctoral degree in science communication. It's his job and the point of the channel to try to make things easy to understand.
The opposite is true with fiction. You're intentionally trying to have the audience make connections themselves, like a Sherlock story or the Great Gatsby. The point is in the discovery by the viewer.
- TexanFeller 2 years agoI recently enjoyed this presentation by Sean Carroll that touches on definitional and philosophical issues with entropy. The talk made me feel less stupid for not feeling entirely comfortable with how entropy was explained to me before. Turns out there are a few different ways to define and quantify entropy that are used in different contexts and they each have some unresolved philosophical issues.
"Can you get time out of Quantum Mechanics?": https://youtu.be/nqQrGk7Vzd4
- m3affan 2 years ago I wonder how many concepts are so complicated that our brains can't formalize or even process them.
- cubefox 2 years agoAnother surprising thing is that physicists have not yet succeeded in reducing the ordinary notion of cause and effect to fundamental physics. Carroll has also worked on this issue.
- kergonath 2 years agoI would recommend reading some Carlo Rovelli, it sounds like something you might like.
- antimora 2 years ago The video did not explain why the sun is a low entropy source. I found this explanation, which I am sharing with you:
So, the sun is a low-entropy source of energy, and Earth (and everything on it) increases that entropy as it uses and then reradiates that energy. This process is entirely consistent with the second law of thermodynamics.
The relationship between light frequency and entropy comes from the fact that entropy is a measure of disorder or randomness. High-frequency light, such as ultraviolet or visible light, is more ordered and less random than lower-frequency light, such as infrared or microwave light.
This is due to how light is structured. Light is made up of particles called photons, and each photon carries a certain amount of energy. The energy of a photon is directly proportional to its frequency: higher-frequency photons carry more energy than lower-frequency ones.
So, if you have a fixed amount of energy to distribute among photons, you can do so in many more ways (i.e., with higher entropy) if you use low-energy, low-frequency photons. That's because you would need many more of them to carry the same total amount of energy.
On the other hand, if you use high-energy, high-frequency photons, you would need fewer of them to carry the same total amount of energy. There are fewer ways to distribute the energy (i.e., lower entropy), so this arrangement is more ordered and less random.
Therefore, high-frequency light is considered a lower-entropy form of energy compared to low-frequency light, because the energy is concentrated in fewer, more energetic photons.
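A back-of-the-envelope version of that argument (the two wavelengths are rough stand-ins for sunlight and Earth's thermal glow, not exact figures): for the same total energy, the low-frequency version needs many more photons, and many more photons means many more ways to share the energy out.

```python
h = 6.626e-34  # Planck constant, J*s
c = 3.0e8      # speed of light, m/s

E_total = 1.0  # one joule of radiation, for comparison

for label, wavelength in (("visible, ~500 nm (sunlight)", 500e-9),
                          ("thermal IR, ~10 um (Earth's glow)", 10e-6)):
    E_photon = h * c / wavelength   # energy carried by one photon
    n_photons = E_total / E_photon  # photons needed to carry one joule
    print(f"{label}: {E_photon:.2e} J/photon, {n_photons:.2e} photons per joule")

# Roughly 20x more photons are needed in the IR, so the same joule arrives from the sun
# in a "concentrated" form and leaves the Earth in a much more "spread out" form.
```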
- kgwxd 2 years ago> The video did not explain why the sun is a low entropy source.
Laymen to the extreme but, didn't it? The thing about the low entropy of the universe near the big bang, gravity naturally bringing things together, and such?
- guga42k 2 years ago>The video did not explain why the sun is a low entropy source. I found this explaining what I am sharing with you:
to my best understanding, to go from high entropy state to low entropy state you need work to do. The sun is a source of energy to do the work
- spuz 2 years agoI think the concept would be easier for me to understand if we talked about the inverse of entropy - i.e. some kind of measurement for the concentration of useful energy or "order". I think it would then be more intuitive to say that this measurement always decreases. Do we even have a word for the opposite of entropy?
- MaxRegret 2 years agoNegentropy? This is a concept in information theory, but maybe also in physics.
- winwang 2 years agoAnd it's closely related to the Gibbs free energy (available energy), which decreases with increasing entropy, all other things equal.
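For reference, the standard relation being invoked here, at constant temperature and pressure: the maximum useful (non-expansion) work you can extract is minus the change in Gibbs free energy, and a larger entropy term eats into it.

```latex
\Delta G = \Delta H - T\,\Delta S, \qquad W_{\text{useful,max}} = -\,\Delta G
```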
- bob1029 2 years ago> Do we even have a word for the opposite of entropy?
I've always referred to the inverse as "information" or "order".
- cshimmin 2 years ago"information" is not a good term for the opposite. Entropy is a well defined concept in information theory (and can be connected to the physical concept). More entropy means more information, not less.
- kgwgk 2 years ago> Entropy is a well defined concept in information theory (and can be connected to the physical concept). More entropy means more information, not less.
More entropy means more missing information.
https://iopscience.iop.org/book/mono/978-0-7503-3931-5/chapt...
- bob1029 2 years agoI agree. Order is a better term to describe it. Predictability. More entropy = more total possible states the system can be in.
I tend to use "information" to refer to a statistically-significant signal or data that the application/business can practically utilize. This is definitely not the same as the strict information theoretical definition.
- dangitnotagain 2 years agoEmergence.
If entropy is the distribution of potential over negative potential, emergence is that where an outcome creates potential (of some discrete domain).
The mystifying relationship with thermodynamic entropy is that through accelerating entropy in other domains (burning), the entropy in the primary domain (eternal drag) is supplanted by the potentials provided by the external domain.
- Timon3 2 years agoI might be completely off kilter here (please tell me if that's the case!), but what makes sense to me is to think about "how much entropy is left", as in "how much can entropy still increase between the current state and the highest entropy state". That flips the meaning to describe what you're talking about, and feels very intuitive to me.
- spuz 2 years agoWhen you say "how much entropy is left?" you make it sound like a quantity that is decreasing, not increasing. That seems incorrect, no?
- Timon3 2 years agoThat's why I specified that it's meant as "how much can entropy still increase between the current state and the highest entropy state". If you can't read it in that sense, either ignore the shorter question or read it as "how much increase of entropy left". Or maybe "how much entropy left until max".
- danq__ 2 years agoEnergy should never be used to introduce entropy as the concept of energy itself is highly, highly complicated and abstract.
Entropy and energy are orthogonal anyway. You can understand entropy without the need to even use the word "energy."
Entropy is an aspect of probability, it is in fact a numerical phenomenon.
- vehicles2b 2 years agoI find it intuitive to think of such “ordered” distributions as having a higher compression ratio. (Ie compare the file sizes of zipping a file with just ones or just zeros vs zipping a file with a uniform, random mixture of ones and zeros.)
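A quick way to see that in practice, with zlib standing in for "a compressor" (an ideal compressor would make the contrast even starker):

```python
import os
import zlib

ordered = b"\x00" * 100_000         # all zeros: a very short description covers it
random_bytes = os.urandom(100_000)  # uniform random bytes: nearly incompressible

print(len(zlib.compress(ordered)))       # on the order of a hundred bytes
print(len(zlib.compress(random_bytes)))  # roughly 100,000 bytes
```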
- Solvency 2 years agoHuh? I've never really understood this metaphor.
Take any photograph in Photoshop. First, save one copy of it as a compressed JPG.
Now, on the original, add a small densely repeating tiled pattern multiplied on top as a layer. Like a halftone effect, dot texture, whatever. Technically you're adding more order and less chaos. The resulting image won't compress as efficiently.
- dist-epoch 2 years agoThe idea is to use the best compressor possible. So called Kolmogorov complexity.
- tnecniv 2 years agoNormally we care not about entropy but changes in entropy or entropy measured in comparison to a reference distribution, so you can just take the negative of that difference.
- kledru 2 years agowe do have a word -- "negentropy". Physicists also sometimes find it easier to talk about "negentropy" instead of entropy.
- knolan 2 years agoExergy
- syntaxing 2 years agoNot trying to be cheeky but wouldn’t the opposite of entropy be work?
- cjs_ac 2 years agoEctropy has been suggested, but the term is not in common use.
- dangitnotagain 2 years agoPotential
- antimora 2 years ago Just recently I also watched a video by Sabine Hossenfelder called "I don't believe the 2nd law of thermodynamics": https://www.youtube.com/watch?v=89Mq6gmPo0s
I recommend this video as well.
- dist-epoch 2 years agoSabine Hossenfelder also had a video recently on entropy:
> I don't believe the 2nd law of thermodynamics.
- evouga 2 years agoWhat I really like about this explanation is that it highlights the fact that entropy is not a natural property of the physics system: entropy is only defined with respect to some coarse-graining operation applied by an imperfect observer of the system. So as Sabine points out it seems we should really be talking about multiple different entropies, each of which corresponds to a different mechanism for coarse-graining microstates into macrostates, with each different entropy changing at different rates depending on the coarse-graining mechanism and physical system. (And in particular, God observing the universe would not see entropy change at all; even if there were uncertainty in the initial conditions of the universe, God would see that uncertainty perfectly propagated with no loss of information, in a way made precise by Liouville's Theorem.)
But even this is not the full story, because I can take a mass-spring network, and no matter how I choose to coarse-grain it, I will not see the entropy corresponding to that coarse-graining increase, because the trajectory of a mass-spring system is periodic. Entropy increase requires that the system is ergodic with respect to the chosen coarse-graining operation, i.e. that over long times the trajectory visits the coarse-grained states in a "random" and uniform way. It's not at all obvious to me why the dynamics of particles bouncing around in a box have this property, and particles attached in a mass-spring network do not; and neither the Sabine nor the Veritasium videos address this or why we should expect all practical real-world physical systems to be ergodic with respect to practical coarse-graining mechanisms.
- dist-epoch 2 years ago> mass-spring system is periodic
I don't pretend to understand this stuff, but wouldn't a real mass-spring system slowly stop, due to friction, air resistance, heat dissipation, ...? So a real system wouldn't be periodic.
- sidlls 2 years agoPeriodic doesn't mean perpetual, perfect, constant periodic motion (in general).
- consilient 2 years agoYes, but they're talking about an idealized harmonic oscillator, not a physical mass-spring system.
- topologie 2 years agoBeat me to it...
I posted it without realizing somebody else had already posted it.
Gotta love Prof. Hossenfelder.
- 4ad 2 years agoI hate Veritasium's clickbait, and I think most of his videos are very poor, but this one is the exception. It's very well put together. The first ten minutes of the video is exactly how I introduce entropy to people.
Of course I can't give him a pass on how crass it was telling that woman he has a PhD in physics (he does not). The video would have been so much better without those two seconds of footage...
- andyjohnson0 2 years ago> course I can't give him a pass on how crass it was telling that women he has a PhD in physics (he does not)
To be clear, he does have a PhD but it is in physics education research, not physics.
- hilbert42 2 years ago"I think most of his videos are very poor,"
Why do you think so? (Most to me seem reasonable but one on speed of electricity stands out as badly done (he redid the video but it too could have been better).)
- pxeger1 2 years agoHe seems to exaggerate the importance of things when they make for a good story and sound interesting. This is a classic flaw in popular science but I think he's got a lot more egregious with it over the years.
The worst example I remember, which is actually what drove me to unsubscribe, was when he said that the golden ratio was "a pretty five-y number" because it can be written as 0.5 + 0.5 * (5^0.5). Anyone with a good mathematical background could tell you there's nothing five-y about 0.5 at all. I'll grant him, the golden ratio is still a little bit five-y because of the sqrt(5).
The whole context and presentation seemed like it was designed to make the viewer feel like they'd learnt something even though nothing of substance was really delivered in those 20 seconds. He does that a lot.
- hilbert42 2 years ago"The whole context and presentation seemed like it was designed to make the viewer feel like they'd learnt something even though nothing of substance was really delivered in those 20 seconds. He does that a lot."
I can't disagree with that. He's not only a YouTube presenter but also a documentary maker. His documentary Uranium: Twisting the Dragon's Tail (https://www.imdb.com/title/tt4847012/) has been repeated on TV where I am, I think, at least three times. He's also made others.
Unfortunately, that type of presentation is all too common in modern docos (probably couldn't get it past the director otherwise). As you'd know, this and similar techniques (which I find highly irritating) are also used to pad out half-hour shows to an hour, or to multiple episodes when one would do adequately (thankfully, my PVR comes to the rescue and I rarely watch them in real time).
- interestica 2 years ago> Anyone with a good mathematical background could tell you there's nothing five-y about 0.5 at all.
Um, I disagree? Visually, the 5 is memorable here. "Fivey" just seems to mean "lots of the number 5"?
- hospitalJail 2 years agoHis dandruff ad was pretty cringe IMO. But idk, it seems like his videos are really long for what they accomplish in general. He seems like the generic youtuber that milks every dollar of ad revenue and is shameless about it.
Kind of sad that an expensive camera + clickbait thumbnail/title > Experts communicating clearly and accurately.
I imagine he/his team is scouring youtube for the experts, and remaking their videos with more production value.
- AlexandrB 2 years agoI was quite disappointed by his video on self-driving cars. It presented a very one-sided view and felt like a puff piece (and, indeed, it was sponsored by Waymo). Tom Nicholas did a good job breaking down the problems with it: https://m.youtube.com/watch?v=CM0aohBfUTc&pp=ygUPdG9tIHZlcml...
- willis936 2 years agoHis video on electromagnetism is still my gold standard for Lorentz's Law. It came out while I was taking emag. I liked it so much I showed it to my professor, who didn't name drop Lorentz all semester. The class was making sure no one got a BSEE without knowing Maxwell's Equations, which does warrant a semester. I guess it was more of a failing of the physics curriculum.
- manojlds 2 years agoMost of his videos are GOOD, come on!
- kergonath 2 years agoThe problem is that on a given specific subject you can never be sure whether he’s exaggerating or misrepresenting things. Just this makes watching him a waste of time, because then you need to spend at least twice the time to fact check him. A bit like asking a question to ChatGPT. At least Wikipedia provides you links to proper sources.
- PrimeMcFly 2 years ago> I think most of his videos are very poor,
This sounds quite bitter, as does griping about him mentioning his PhD.
His videos have excellent production quality, and do a great job of communicating advanced STEM concepts to laypeople in an entertaining way.
Maybe you don't like them, but that doesn't mean they are bad. Given their popularity, it would seem they are anything but.
- Lichtso 2 years agoI think the most unintuitive even unsettling aspect of entropy is that the entropy of black holes is proportional to their surface area, not their volume [0]. That is only briefly mentioned in the video and not discussed any further.
[0] https://en.wikipedia.org/wiki/Holographic_principle#Black_ho...
- contravariant 2 years agoTo be fair black holes are just weird so that's not too damning on its own.
What makes it weird is that black holes must necessarily* have the maximum amount of entropy for a specific volume. So not only the entropy of a black hole is proportional to its surface area, but the entropy of some volume of space can not grow beyond that. In particular entropy cannot be proportional to volume without limit, the density must be 0 on average for a big enough region.
*: According to some people anyway.
- spiralx 2 years agoThe surface of a black hole is also the most efficient information storage possible - each 2x2 Planck lengths area can store a single bit. Of course there's no way to read that data...
- subroutine 2 years ago In the video, Derek says the sun provides "a more useful form of energy than earth gives back because it is more concentrated". I would argue this is not technically true. What makes the energy from the sun useful is that it is blackbody radiation with photons that are more energetic than the heat (IR) radiated back into space. Light with wavelengths between 400-700 nm is used by plants for photosynthesis. Heat radiated back into the atmosphere is mostly in the IR spectrum. This is both quantitatively and qualitatively important due to the quantized properties of electrons orbiting nuclei (IR wavelengths cannot readily elevate electrons into a higher energy state).
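Rough numbers behind that point (ballpark wavelengths; the eV scale of electronic transitions is likewise approximate):

```python
h = 6.626e-34   # Planck constant, J*s
c = 3.0e8       # speed of light, m/s
eV = 1.602e-19  # joules per electronvolt

for label, wavelength in (("red light, 700 nm", 700e-9),
                          ("blue light, 400 nm", 400e-9),
                          ("thermal IR, 10 um", 10e-6)):
    print(f"{label}: {h * c / wavelength / eV:.2f} eV per photon")

# Visible photons carry roughly 1.8 to 3.1 eV, enough to promote electrons in pigments
# like chlorophyll; a 10 um IR photon carries about 0.12 eV, which generally is not.
```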
- pcwelder 2 years ago So the law of increasing entropy is not a fundamental law of reality, because it can be derived from other fundamental equations.
Suppose I show you a snapshot of a random universe, would you be able to tell if the entropy of the universe is going to increase or decrease as the time progresses?
Let's assume that universe's entropy would increase. Consider another universe exactly the same as current universe, but all the particles' velocities reversed. Then this universe's entropy would decrease.
So you are equally likely to select either universe, and hence the original assumption of increasing entropy is wrong.
Discarding quantum properties of the particles, is it then fair to say that time's direction is unrelated to whether entropy increases or decreases?
- rixed 2 years ago If by "random universe" you mean a universe in which the states of every particle are random, then my understanding is that we would probably conclude that entropy is neither increasing nor decreasing. Our universe is not random. We could spot local phenomena where entropy is clearly going in one direction. We assume the entropy everywhere goes in the same direction (increasing), and deduce from this hypothesis that the universe started with a very low entropy.
- canjobear 2 years agoThe expected increase in entropy can be derived from laws of mechanics plus the critical stipulation that, in the past, entropy was very low. Essentially, physical systems want to be in high-entropy states. So if you observe one to be in a very low-entropy state, then you can conclude that with high probability the future of that system will go to higher-entropy states.
> Suppose I show you a snapshot of a random universe, would you be able to tell if the entropy of the universe is going to increase or decrease as the time progresses?
Yes, if it has low entropy then entropy will probably increase; if it has high entropy then the entropy will probably fluctuate up and down statistically.
> Let's assume that universe's entropy would increase. Consider another universe exactly the same as current universe, but all the particles' velocities reversed. Then this universe's entropy would decrease.
The key is that you're exponentially unlikely to find yourself in a universe where all the particles' velocities are reversed. See this: https://en.wikipedia.org/wiki/Fluctuation_theorem
The probability that a system randomly evolves in a way that reduces entropy is very very small.
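Schematically, the fluctuation theorem referenced above says something like the following (a loose statement; the precise form depends on the system and the time window considered):

```latex
\frac{P(\Delta S = +A)}{P(\Delta S = -A)} \;\approx\; e^{A / k_B}
```

so a macroscopically large entropy decrease is exponentially suppressed rather than strictly impossible.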
- cubefox 2 years ago> > Suppose I show you a snapshot of a random universe, would you be able to tell if the entropy of the universe is going to increase or decrease as the time progresses?
> Yes, if it has low entropy then entropy will probably increase
The problem is that it probably increases in both time directions, such that the state of minimum entropy is now. As you said, we have to stipulate that the entropy in the past is low, we can't (yet?) infer it from observation. Which raises the question what justifies us making this assumption in the first place.
- topologie 2 years agoRelated:
"I don't believe the 2nd law of thermodynamics. (The most uplifting video I'll ever make.)" by Sabine Hossenfelder
- lisper 2 years agoA pithier way to introduce this topic: the first law of thermodynamics, a.k.a. the law of conservation of energy, is that energy cannot be created nor destroyed, only transformed from one form to another. In light of this, how can there ever be a shortage of energy?
[Note that this is intended to be a rhetorical question advanced for the purposes of pedagogy. If you find yourself wanting to post an answer, you have missed the point.]
- dahart 2 years agoWhy is this pithier than the video? I’m not entirely sure I see added pedagogical value. Asking the rhetorical question how can there be a shortage of energy sounds a little like someone sort-of intentionally misunderstanding what that phrase “energy shortage” means in any practical economic context. “Energy shortage” is an economics phrase, not a physics phrase. The first law of thermodynamics doesn’t suggest there can’t be energy shortages on earth, because the phrase “energy shortage” is not used to suggest a loss of energy to the universe, energy shortages are all about not having enough specific forms of energy in specific places at specific times [1], and it’s no surprise that we can’t capture dissipated heat, or that a local power system has a maximum limit at any given time, for example.
Something similar could perhaps be said for the video’s approach; “what do we get from the sun?” is an ambiguous question, not necessarily a fair setup to ask a lay person when you have entropy in mind as the answer. We do get energy from the sun, that is a correct answer, and we use some of it before it goes away. But, there is the nice a-ha that all the energy from the sun eventually leaves the earth, right?
[1] “An energy crisis or energy shortage is any significant bottleneck in the supply of energy resources to an economy.“ https://en.wikipedia.org/wiki/Energy_crisis
- JumpCrisscross 2 years agoIt’s pithy, but in the way of word play. Energy, colloquially, means useful energy. The question collides the conventional and technical definitions to create the illusion of profundity.
- lisper 2 years agoPithy != profound. The intent was to get people to think about the fact that the word "energy" means different things in different contexts, and that the thing that actually has value is not energy but the absense of entropy.
- dekken_ 2 years ago> how can there ever be a shortage of energy
I think it's not so much a shortage of energy, but that there is thermodynamic equilibrium and thus no available energy to do anything.
I don't think this will ever happen tho, it's pretty clear to me that making energy more dense is a universal process.
- sghiassy 2 years agoNot really - look up “heat death” of the universe.
- jjaken 2 years ago Heat death doesn’t mean “no heat” or that energy has depleted. It just means that energy is fully dispersed. All the heat still exists, it’s just that no one place has any more than anywhere else, and so there is no longer any transfer of energy.
- credit_guy 2 years agoThe "heat death" of the universe is a concept that deserves to die. The second principle of thermodynamics is true only if you ignore gravity. In the presence of gravity, systems tend to go towards lower entropy, just see how a planetary system can form out of a gas cloud.
- sghiassy 2 years agoUsable energy is different than total energy. If energy isn’t concentrated (say something like gasoline) it’s not usable
- uoaei 2 years agoEnergy is actually a red herring -- what's relevant here is work.
Useful work, aka information, is work that can be employed in dynamics vis a vis processing. Useless work, aka heat, is the devil's share of the energy expenditure which is lost as entropy when undergoing a process.
- danq__ 2 years agoI don't think this delivers the intuition in a simple manner. It's also not fully correct. People explain entropy in over complicated ways and even in this thread many people explaining it don't get it. There is a simple way to think about this and I guarantee if you read my explanation you'll understand it more.
In essence what you need to realize is that entropy is just a label for an aspect of probability.
Things tend to become disordered over time because disordered states are more probable then ordered states. Entropy is thus simply phenomenon of probability... of things moving from a low probability state to a high probability state. That's it.
That's really all there is to it. That's all you need to digest, all the complicated math and explanations are all just surrounding the above concept.
Entropy is just a high level abstraction of probability. It just allows you to explain things without the intuition of probability bogging you down. For example, explaining life in terms of probability is harder to grasp as it's akin to rolling 10 dice and having all the dice roll a 6.
- northlondoner 2 years ago There are multiple different definitions and usages of entropy. Boltzmann's and Gibbs's physical entropy is not strictly equivalent to Shannon's information entropy, of which legend says von Neumann told Shannon to call his 'information capacity of a channel' entropy, as nobody knows what entropy is. The books of Arieh Ben-Naim are a great read and highly recommended.
- rssoconnor 2 years agoWhile this is a reasonable historical explanation of entropy, and explains that we don't gain net energy from the sun, it still misses the mark on what entropy is now known to be.
Entropy isn't a property of an object, or a system, or things in physics. Entropy is a property of our _description_ of systems. More precisely, it is a measure of how imprecise a given specification of a physical system is, i.e. given a description of a system, typically the pressure / volume / temperature of a gas or whatnot, how many different physical systems correspond to such a description.
In particular, _thermodynamic entropy is Shannon entropy_.
In the case where the description of state specifies a volume of phase space wherein a physical state lies within, then the entropy is the logarithm of the volume of this fragment of phase space. If we take this collection of states and see how they evolve in time, then Liouville’s theorem says the volume of phase space will remain constant.
If we want to build a reliable machine, i.e. an engine, that can operate in any initial state that is bounded by our description, and ends up in a final state bounded by some other description, well, in order for this machine to perform reliably, the volume of the final description needs to be greater than the volume of the description of the initial state. Otherwise, some possible initial states will fail to end up in the desired final state. This is the essence of the second law of thermodynamics.
I want to emphasize this: entropy exists in our heads, not in the world.
E.T. Jaynes illustrated this in section "5. The Gas Mixing Scenario Revisited" of https://www.damtp.cam.ac.uk/user/tong/statphys/jaynes.pdf where two imaginary variants of Argon gas are mixed together. If one engineer is ignorant of the different variants of Argon gas, it is impossible to extract work from the gas, but armed with knowledge of the difference (which must be exploitable, otherwise they wouldn't actually be different) work can be extracted.
Knowledge _is_ power.
Taking an extreme example, suppose we have two volumes of gas at different volumes / pressures / temperature. We can compute how much work can be extracted from those gases.
But, suppose someone else knows more than just the volume / pressure / temperature of these gases. This someone happens to know the precise position and velocity of every single molecule of gas (more practically, they know the quantum state of the system). This someone now gets to play the role of Maxwell's demon and separate all the high velocity and low velocity molecules of each chamber, opening and closing a gate using their perfect knowledge of where each particle is at each moment in time. From this they can now extract far more work than the ignorant person.
In both cases the gas was identical. How much useful work one can extract depends on how precise one's knowledge of the state of that gas is.
- ko27 2 years ago Entropy is very much "real" and it exists outside of our mind. The resolution to Maxwell's demon is that knowledge of every particle's state is not free: obtaining that knowledge increases the system's entropy by more than you can ever eliminate by opening chamber doors.
If it only existed in our minds and not in physical reality that would mean it would be possible to construct a device that decreases global entropy on average.
- pcwelder 2 years agoThis is eye opening. Thanks a lot for this comment and linking the pdf. I loved E.T. Jayne's Probability Theory book, so looking forward to reading this pdf too.
- kgwgk 2 years agoThere are a few chapters of an unpublished book on thermodynamics here: https://bayes.wustl.edu/etj/thermo.html
This article is also interesting: "THE EVOLUTION OF CARNOT'S PRINCIPLE" https://bayes.wustl.edu/etj/articles/ccarnot.pdf
Building on these ideas, the first five chapters of this (draft of a) book from Ariel Caticha are quite readable: https://www.arielcaticha.com/my-book-entropic-physics
- rssoconnor 2 years agoI want to add this quote that I was originally looking for, and at last found in Jaynes's "Where do we Stand on Maximum Entropy?" (page 237 of http://ndl.ethernet.edu.et/bitstream/123456789/33178/1/R.%20...)
---
In the Summer of 1951, Professor G. Uhlenbeck gave his famous course on Statistical Mechanics at Stanford, and following the lectures I had many conversations with him, over lunch, about the foundations of the theory and current progress on it. I had expected, naively, that he would be enthusiastic about Shannon's work, and as eager as I to exploit these ideas for Statistical Mechanics. Instead, he seemed to think that the basic problems were, in principle, solved by the then recent work of Bogoliubov and van Hove (which seemed to me filling in details, but not touching at all on the real basic problems)--and adamantly rejected all suggestions that there is any connection between entropy and information. His initial reaction to my remarks was exactly like my initial reaction to Shannon's: "Whose information?" His position, which I never succeeded in shaking one iota, was: "Entropy cannot be a measure of 'amount of ignorance,' because different people have different amounts of ignorance; entropy is a definite physical quantity that can be measured in the laboratory with thermometers and calorimeters." Although the answer to this was clear in my own mind, I was unable, at the time, to convey that answer to him. In trying to explain a new idea I was, like Maxwell, groping for words because the way of thinking and habits of language then current had to be broken before I could express a different way of thinking. Today, it seems trivially easy to answer Professor Uhlenbeck's objection as follows: "Certainly, different people have different amounts of ignorance. The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities Xi which define its thermodynamic state. This is a completely 'objective' quantity in the sense that it is a function only of the Xi, and does not depend on anybody's personality. There is then no reason why it cannot be measured in the laboratory."
- cubefox 2 years agoHossenfelder says the same in her entropy video. A really interesting hypothesis.
- danq__ 2 years agoThe most intuitive explanation of entropy ever:
entropy is a fancy way of explaining probability.
Things with higher probability tend to occur over things of lower probability.
Thus when certain aspects of the world like configurations of gas particles in a box are allowed to change configurations, they will move towards high probability configurations.
High probability configurations tend to be disordered. Hence the reason why we associate entropy with things becoming increasingly disordered. For example... gas particles randomly and evenly filling up an entire box is more probable than all the gas particles randomly gathering on one side of the box.
If you understand what I just explained, then you understand entropy better than the majority of people.
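To put one number on that last example (idealized: independent particles, each equally likely to be in either half of the box):

```python
N = 100  # particles in a very small "gas"

p_all_on_one_side = 0.5 ** N
print(p_all_on_one_side)  # ~7.9e-31: effectively never observed

# A real gas has on the order of 10^23 particles, so the probability is ~2^(-10^23):
# the spread-out configurations win purely because there are overwhelmingly more of them.
```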
- hilbert42 2 years agoThat's an excellent overall summary as he covers almost every aspect of the subject albeit in brief. It would be good if he produced a second video dealing with the low entropy of incoming energy from the sun and the higher entropy of radiated energy from earth and relate that to global warming.
In all the debate over global warming little is talked about why say CO2 and other greenhouse gasses increase the earth's temperature and how they shift the wavelength of the radiated energy from earth. In other words we need to explain in simple terms why the incoming and outgoing energy can remain the same yet the earth's temperature has increased.
- gorhill 2 years agoMy understanding:
Global warming occurs because the previous equilibrium between incoming and outgoing energy has been broken by changes in the composition of the atmosphere.
So until we reach a new equilibrium long after the atmosphere composition ceases to change, the outgoing energy will be less than the incoming energy.
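A zeroth-order sketch of that balance, the classic bare-rock estimate with standard textbook values (the real, greenhouse-warmed surface sits around 33 K above the result):

```python
S0 = 1361.0      # solar constant, W/m^2
albedo = 0.3     # fraction of sunlight reflected back to space
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

# At equilibrium, absorbed sunlight (averaged over the sphere) equals emitted thermal IR:
#   S0 * (1 - albedo) / 4 = sigma * T_eff**4
T_eff = (S0 * (1 - albedo) / (4 * sigma)) ** 0.25
print(f"Effective radiating temperature: {T_eff:.0f} K")  # about 255 K

# More greenhouse gases raise the altitude from which IR escapes, so the surface must warm
# (toward ~288 K today) before outgoing again matches incoming; the imbalance lasts until then.
```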
- kergonath 2 years agoThat’s exactly it. Except that equilibrium will never be reached, it’s more like tending towards a steady state.
- hilbert42 2 years agoRight, better stated than me.
It would help if we could convey this across to the lay public. No doubt, the minutiae is complex and involved and the details will be argued over for years but at the high level it's pretty straightforward.
Seems to me many balk at climate science because the figures seem to magically appear from nowhere—they just pop out of climate scientists' mouths without explanation. A simple understandable explanation might reduce some of the noisy opposition.
- jjaken 2 years agoThe important part to understand is timescales. In a day, the Earth does absorb some energy. Of course it does, plants collect it, solar panels collect, the ocean and land collect it. The amount Earth collects is a tiny fraction of what it releases. That collection isn’t permanent though and is slowly released. Within a day, the Earth absorbs some energy, but over a long enough timescale, all of that energy is released again.
The Earth is taking on energy every day from the sun. If we didn’t release it all back, the earth would be warming much much faster. It only remains relatively cool because it releases almost as much as it receives.
Another important note is that long term energy is not only stored as heat on earth. It’s stored as potential energy in the atoms of cells in plants and animals. Think of how cold a gallon of gasoline is, yet how much energy it stores.
For an example think of hot asphalt from a summer day. It gets real hot all day and slowly cools down at night. Sometimes it can be pretty warm to stand on the road even if it’s a cool night.
Within the human timescale, the Earth is retaining some (tiny fraction) of heat. That tiny fraction of heat is a very small window of heat that life can tolerate. It’s not too much and not too little. If the earth were to retain just a tiny bit more, suddenly life can’t tolerate it. On the scale of the universe, the difference between those realities is minuscule, even though it’s enormous to us.
- willis936 2 years agoWhich debate over global warming are you referring to? There are debates that involve atmospheric chemists that discuss Earth darkening cause and effects.
- hilbert42 2 years agoRight, the average person hasn't a clue about albedo, nor do they know why say CO2 and CH4 increase global warming whereas others such as O2 are more benign.
It may help lower the temperature of the debate if they did.
Edit, we're pitching this discussion at the level he has—the lay public. Scientific argument over the minutiae is another matter altogether.
- jjaken 2 years agoYeah debates in climate science are about phenomena laypeople don’t even know exist. It’s about how what we observe happens. No one is arguing about what we’re observing, eg global warming.
- martythemaniak 2 years ago Great video, but very wrong to cite Jeremy England. Ilya Prigogine came up with the concept of Dissipative Structures and won the Nobel Prize in Chemistry in 1977 for that work. He also has a couple of pop-sci books on the subject that I found super illuminating. They are a little bit challenging to read, but they're very thorough, and Order out of Chaos in particular has a fantastic summary of 400 years of philosophy of science. Highly recommend reading the OG.
- bugs_bunny 2 years ago An interesting read related to this is the following article, which begins with von Neumann's comment that "Nobody really knows what entropy really is."
https://www.researchgate.net/publication/228935581_How_physi...
- szundi 2 years agoJust after Sabine’s
- lll-o-lll 2 years agoYes, and the Sabine video contained far more information (lower entropy?) and one genuinely interesting idea I’d never heard before. The idea that heat death may not be the end of intelligent life! The concept this relies on is the idea that macro states are actually just combinations of micro states all of which have the same probability. E.g. the sequence 1, 2, 3, 4, 5 and 3, 1, 4, 2, 5 are equally likely if you are selecting 5 random numbers (1-5), but the ordered sequence is an important state to us. The Big Bang -> heat death is just super unlikely state to super likely state, but these macro states are poorly defined. They matter to us. So perhaps the universe goes on with complex life harvesting neg-entropy from a heat death configuration of micro states as they slowly transition from “extremely unlikely” to “likely” in the context of something incomprehensible to humans. I feel like the idea would need more mathematical meat on the bones to go further, but still an intriguing thought!
- cubefox 2 years agoI suspect that "macro state that is more important to us" really is special in some objective way, not just subjectively.
- wpwpwpw 2 years ago I loved this one too and I believe it is an excellent follow-up: "I don't believe the 2nd law of thermodynamics. (The most uplifting video I'll ever make.)" @ https://www.youtube.com/watch?v=89Mq6gmPo0s
- manojlds 2 years agoWhen I was watching the video I was thinking this deserves to be posted on HN and yup, someone already has.
- dangitnotagain 2 years agoEntropy should be redefined as “the distribution of potential over negative potential.”
Whether discussing what is over what may be, or thermal equilibrium, potential distribution describes it all!
- lvl102 2 years agoI’d like to think of entropy in terms of randomness or rather uniqueness of elements/compounds within a system.
- thumbuddy 2 years ago In my opinion the most misunderstood concept from physics is probably any exponential relationship. I realize we could view entropy as one of those if we flip the relationship around and solve for microstates, making my statement the superset. But generally speaking, I've seen both lay people and experts struggle to reason about them, especially with complex numbers.
- guerrilla 2 years agoIt's actually amazing how bad science popularizers are at explaining entropy. It's really not that difficult if they would just take some time and think before they speak (which I'm sure this video proves based on what people are saying about it.)
- max_ 2 years agoSad he didn't talk about Shannon Entropy
- sourcecodeplz 2 years agoIt was a great video
- friend_and_foe 2 years agoThe video talks about how the earth radiates away the same amount of energy as it gets from the sun, just red shifted. In light of this, let's talk about climate change, global warming and the greenhouse effect.