Three Types of Entropy
Entropy is a scientific concept most commonly associated with states of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields: classical thermodynamics, where entropy was first recognized; the microscopic description of nature in statistical physics; and the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, and in cosmology, economics, and information systems, including the transmission of information in telecommunication.
The "three types of entropy" generally refer to thermodynamic entropy, information entropy, and the maximum entropy principle, which are distinct but related ways of conceptualizing entropy in physics, information theory, and statistical inference, respectively. Other groupings exist as well, such as classifications into thermodynamic, informational, and digital entropies, or entropies defined in terms of quantum statistics.
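Of the three, information entropy is the easiest to make concrete. Shannon's definition, H = -Σ p(x) log2 p(x), measures the average uncertainty (in bits) of a random outcome. A minimal sketch, using a hypothetical `shannon_entropy` helper that estimates the distribution from symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Estimate Shannon entropy H = -sum p_i * log2(p_i), in bits per symbol.

    Probabilities p_i are taken as the relative frequency of each
    distinct symbol in `data`.
    """
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fair coin (two equally likely outcomes) carries 1 bit of uncertainty;
# a sequence with only one outcome carries none.
print(shannon_entropy("HTHT"))  # maximal uncertainty for two symbols: 1.0 bit
print(shannon_entropy("HHHH"))  # a certain outcome: 0 bits
```

The same quantity underlies the maximum entropy principle: among all distributions consistent with known constraints, one selects the distribution with the largest H, i.e. the one that assumes the least beyond what is known.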