#entropy


Three types of Entropy:

Entropy is a scientific concept, most commonly associated with states of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, and information systems, including the transmission of information in telecommunication.

The "three types of entropy" generally refer to entropy as a thermodynamic concept, information entropy, and the maximum entropy principle, which are distinct but related ways of conceptualizing entropy across physics, information theory, and statistical inference. Other groupings of entropy exist, such as classifications of thermodynamic, informational, and digital entropies or types of quantum statistics-based entropies.
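The information-theoretic member of that trio is the easiest to compute directly. Here is a minimal sketch of Shannon entropy in bits; the function name and the example distributions are my own illustrative choices, not from the posts above:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing (lim p*log p -> 0), so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))    # → 1.0

# A biased coin is more predictable, hence lower entropy (≈ 0.47 bits).
print(shannon_entropy([0.9, 0.1]))
```

The same formula, multiplied by Boltzmann's constant and with a natural logarithm, gives the Gibbs entropy of statistical mechanics, which is what makes the "three types" more related than distinct.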

#Zoomposium with Prof. Dr. #Arieh #Ben-#Naim: “#Enchantment of #Entropy”

He argues that we need a new basic #understanding of the #phenomenon of entropy. In Arieh's view, entropy, which originally stems from the second #law of #thermodynamics, has been misused and incorrectly transferred as a #concept to other areas of #physics, #biology and everyday #life.

Read more at: philosophies.de/index.php/2024

or: youtu.be/Km88EreH4A8

#PhysicsJournalClub
"Temperature as joules per bit"
by C.A. Bédard, S. Berthelette, X. Coiteux-Roy, and S. Wolf

Am. J. Phys. 93, 390 (2025)
doi.org/10.1119/5.0198820

Entropy is an important but largely misunderstood quantity. A lot of this confusion arises from its original formulation within the framework of Thermodynamics. Looking at it from a microscopic point of view (i.e. approaching it as a Statistical Mechanics problem) makes it a lot more digestible, but its ties to Thermodynamics still create a lot of unnecessary complications.
In this paper the authors suggest that by removing the forced connection between entropy and the Kelvin temperature scale, one can rethink entropy purely in terms of the information capacity of a physical system, which removes many of the difficulties usually plaguing the understanding of what entropy is actually about.
I don't think the SI will ever consider their suggestion to drop the kelvin as a base unit in favour of the bit, but this paper will be a great boon to any student banging their head against the idea of entropy for the first (or second, or third) time.
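The "joules per bit" reading can be made concrete with a unit conversion. One bit of entropy corresponds to k_B·ln 2 in J/K, and at temperature T the energy scale per bit is k_B·T·ln 2 (the Landauer bound). A minimal sketch; the function names are mine, only the constants come from the SI:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def bits_to_joules_per_kelvin(bits):
    """Convert an entropy in bits to conventional thermodynamic units."""
    return bits * K_B * math.log(2)

def joules_per_bit(temperature_kelvin):
    """Temperature re-expressed as an energy cost per bit of entropy."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K), one bit of entropy corresponds to
# roughly 2.87e-21 J — the minimum heat dissipated when erasing a bit.
print(joules_per_bit(300))
```

Expressed this way, temperature is just an exchange rate between energy and information, which is the reframing the paper advocates.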