Entropy is a measure of the number of distinct microscopic configurations that result in an equivalent macroscopic outcome.
It’s pure statistics. Given time-symmetric laws of nature and a starting state achievable by relatively few configurations, in the absence of potential barriers the system almost inevitably drifts toward a state achievable by a far larger number of configurations, simply because an elementary change is overwhelmingly likely to land in the latter set. Thus the arrow of time emerges.
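A minimal sketch of this statistical argument, using a toy two-chamber model (a variant of the Ehrenfest urn model, chosen here as an illustration, not taken from the text): `N` particles sit in two halves of a box, and each step a uniformly random particle hops to the other side. The per-step dynamics are symmetric, yet starting from the one-microstate "all left" macrostate the particle count drifts toward the even split, because near-equal splits correspond to vastly more configurations.

```python
import math
import random

def multiplicity(n_total, n_left):
    """Number of microstates with n_left of n_total particles in the left half."""
    return math.comb(n_total, n_left)

def simulate(n_total=100, steps=2000, seed=0):
    """Each step, one uniformly random particle hops to the opposite half."""
    rng = random.Random(seed)
    n_left = n_total  # start in a minimal-multiplicity state: everything on the left
    for _ in range(steps):
        # Picking a random particle: with probability n_left/n_total it was on
        # the left (and moves right), otherwise it moves left. Time-symmetric rule.
        if rng.randrange(n_total) < n_left:
            n_left -= 1
        else:
            n_left += 1
    return n_left

# The all-left macrostate has exactly one microstate; the even split has the most.
print(multiplicity(100, 0))   # 1
print(multiplicity(100, 50) > multiplicity(100, 25))  # True
print(simulate())  # ends near 50, despite symmetric elementary moves
```

No step of the rule prefers one direction, yet the count reliably settles near 50: the "forward" direction of the evolution is nothing but the statistics of multiplicities.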