The probabilities are epistemic in the sense that they are defined in terms of specified data and derived from those data by definite and objective rules of inference, the same for every rational investigator. Jaynes also used the word 'subjective' in this context, following the usage of earlier writers.
He accepted that, in a sense, a state of knowledge has a subjective aspect, simply because it refers to thought, which is a mental process. But he emphasized that the principle of maximum entropy refers only to thought that is rational and objective, independent of the personality of the thinker. In general, from a philosophical viewpoint, the words 'subjective' and 'objective' are not contradictory; often an entity has both subjective and objective aspects.
Jaynes explicitly rejected the criticism of some writers that, merely because thought has a subjective aspect, it is automatically non-objective.
He explicitly rejected subjectivity as a basis for scientific reasoning, the epistemology of science; he required that scientific reasoning have a fully and strictly objective basis. One writer even goes so far as to label Jaynes' approach "ultrasubjectivist", and to mention "the panic that the term subjectivism created amongst physicists". The probabilities represent both the degree of knowledge and the lack of information embodied in the data and in the model used in the analyst's macroscopic description of the system, and also what those data say about the nature of the underlying reality.
Success cannot be guaranteed a priori; for this reason MaxEnt proponents also call the method predictive statistical mechanics. The predictions can fail. But if they do, this is informative, because it signals the presence of new constraints, not previously taken into account, that are needed to capture reproducible behaviour in the system. The thermodynamic entropy at equilibrium is a function of the state variables of the model description. It is therefore as "real" as the other variables in the model description. If the model constraints in the probability assignment are a "good" description, containing all the information needed to predict reproducible experimental results, then that includes all of the results one could predict using the formulae involving entropy from classical thermodynamics.
To that extent, the MaxEnt entropy S_Th is as "real" as the entropy in classical thermodynamics. Of course, in reality there is only one real state of the system.
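The constrained maximum-entropy assignment described above can be made concrete numerically. The sketch below uses made-up values throughout (four energy levels and a target mean energy chosen for illustration): it finds the distribution p_i ∝ exp(−β E_i), which is the maximum-entropy distribution subject to a fixed mean energy, by solving for the Lagrange multiplier β with simple bisection.

```python
import math

def boltzmann(levels, beta):
    """Maximum-entropy distribution under a mean-energy constraint:
    p_i proportional to exp(-beta * E_i)."""
    weights = [math.exp(-beta * e) for e in levels]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def mean_energy(levels, p):
    return sum(e * pi for e, pi in zip(levels, p))

def solve_beta(levels, target, lo=-50.0, hi=50.0, iters=200):
    """Bisect for the Lagrange multiplier beta that satisfies the
    constraint; mean energy decreases monotonically as beta grows."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_energy(levels, boltzmann(levels, mid)) > target:
            lo = mid  # mean too high -> need larger beta
        else:
            hi = mid
    return 0.5 * (lo + hi)

levels = [0.0, 1.0, 2.0, 3.0]   # hypothetical energy levels
target = 1.2                    # hypothetical mean-energy constraint
beta = solve_beta(levels, target)
p = boltzmann(levels, beta)
entropy = -sum(pi * math.log(pi) for pi in p)
```

Each additional reproducible constraint would add one more Lagrange multiplier in the same way; this is the sense in which the assignment is fixed by the data and the model, not by the analyst's personality.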
The entropy is not a direct function of that state. It is a function of the real state only through the subjectively chosen macroscopic model description. The Gibbsian ensemble idealises the notion of repeating an experiment again and again on different systems, not again and again on the same system. So long-term time averages and the ergodic hypothesis, despite the intense interest in them in the first part of the twentieth century, are strictly speaking not relevant to the probability assignment for the state one might find the system in. However, this changes if there is additional knowledge that the system is being prepared in a particular way some time before the measurement.
One must then consider whether this gives further information which is still relevant at the time of measurement. The question of how 'rapidly mixing' different properties of the system are then becomes of considerable interest. Information about some degrees of freedom of the combined system may become unusable very quickly; information about other properties of the system may go on being relevant for a considerable time.
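How long prepared information stays relevant can be probed through time-correlation functions. A minimal sketch follows; the AR(1) process below is a made-up surrogate observable (not molecular dynamics), chosen only because its memory decays over a known correlation time.

```python
import random

random.seed(0)

# Surrogate observable: an AR(1) process x[t+1] = phi*x[t] + noise,
# whose memory decays over a correlation time ~ -1/log(phi) steps.
phi, n = 0.9, 20000
x = [0.0]
for _ in range(n - 1):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

mean = sum(x) / n

def autocorr(lag):
    """Normalized autocorrelation estimate at the given lag."""
    num = sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag))
    den = sum((xi - mean) ** 2 for xi in x)
    return num / den

c1, c10, c50 = autocorr(1), autocorr(10), autocorr(50)
# c1 is close to phi; the correlation fades as the lag grows.
```

For lags much longer than the correlation time, the initial value carries essentially no predictive information about the current one.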
If nothing else, the medium- and long-run time-correlation properties of the system are interesting subjects for experimentation in themselves. Failure to accurately predict them is a good indicator that relevant macroscopically determinable physics may be missing from the model.

According to Liouville's theorem for Hamiltonian dynamics, the hyper-volume of a cloud of points in phase space remains constant as the system evolves.
Therefore, the information entropy S_I must also remain constant, if we condition on the original information and then follow each of those microstates forward in time: S_I(t_2) = S_I(t_1). However, as time evolves, that initial information we had becomes less directly accessible. Instead of being easily summarisable in the macroscopic description of the system, it increasingly relates to very subtle correlations between the positions and momenta of individual molecules.
Compare to Boltzmann's H-theorem.
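Liouville's theorem can be checked numerically in a toy setting. The sketch below uses made-up values throughout: a unit-frequency harmonic oscillator, integrated with the symplectic leapfrog scheme (which is area-preserving for this linear system), following the corners of a small triangular patch of initial conditions.

```python
def leapfrog(q, p, dt, steps):
    """Symplectic (area-preserving) integrator for H = p^2/2 + q^2/2."""
    for _ in range(steps):
        p -= 0.5 * dt * q   # half kick: dp/dt = -dH/dq = -q
        q += dt * p         # full drift: dq/dt = dH/dp = p
        p -= 0.5 * dt * q   # half kick
    return q, p

def triangle_area(pts):
    """Area of a triangle in the (q, p) plane via the cross product."""
    (q0, p0), (q1, p1), (q2, p2) = pts
    return 0.5 * abs((q1 - q0) * (p2 - p0) - (q2 - q0) * (p1 - p0))

corners = [(1.0, 0.0), (1.01, 0.0), (1.0, 0.01)]  # small phase-space patch
area_before = triangle_area(corners)
evolved = [leapfrog(q, p, dt=0.05, steps=2000) for q, p in corners]
area_after = triangle_area(evolved)
# The patch moves around phase space, but its area is unchanged
# up to floating-point roundoff.
```

Here the patch merely rotates in phase space while its area stays fixed; for nonlinear Hamiltonians the same conserved volume gets drawn out into increasingly thin filaments.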
Equivalently, it means that the probability distribution for the whole system, in 6N-dimensional phase space, becomes increasingly irregular, spreading out into long thin fingers rather than the initial tightly defined volume of possibilities. Classical thermodynamics is built on the assumption that entropy is a state function of the macroscopic variables, i.e. that it does not depend on the history of the system.
The extended, wispy, evolved probability distribution, which still has the initial Shannon entropy S_Th(t_1), should reproduce the expectation values of the observed macroscopic variables at time t_2. However it will no longer necessarily be a maximum entropy distribution for that new macroscopic description. On the other hand, the new thermodynamic entropy S_Th(t_2) assuredly will be computed from a maximum entropy distribution, by construction.
Therefore, we expect: S_Th(t_2) ≥ S_Th(t_1). At an abstract level, this result implies that some of the information we originally had about the system has become "no longer useful" at a macroscopic level. At the level of the 6N-dimensional probability distribution, this result represents coarse graining, i.e. the loss of the fine-scale detail of the distribution that carried the initial information. Like all statistical mechanical results according to the MaxEnt school, this increase in thermodynamic entropy is only a prediction. It assumes in particular that the initial macroscopic description contains all of the information relevant to predicting the later macroscopic state.
This may not be the case, for example if the initial description fails to reflect some aspect of the preparation of the system which later becomes relevant. In that case the "failure" of a MaxEnt prediction tells us that there is something more which is relevant that we may have overlooked in the physics of the system. It is also sometimes suggested that quantum measurement, especially in the decoherence interpretation, may give an apparently unexpected reduction in entropy per this argument, as it appears to involve macroscopic information becoming available which was previously inaccessible.
However, the entropy accounting of quantum measurement is tricky, because to get full decoherence one may be assuming an infinite environment, with an infinite entropy. The argument so far has glossed over the question of fluctuations. It has also implicitly assumed that the uncertainty predicted at time t_1 for the variables at time t_2 will be much smaller than the measurement error.
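Returning to the coarse-graining picture above, the entropy increase can be seen in a toy calculation. Below, a made-up fine-grained distribution over eight cells is coarse grained by averaging over pairs of neighbouring cells; because Shannon entropy is concave, the smoothed distribution's entropy can never be lower.

```python
import math

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def coarse_grain(p, block=2):
    """Replace each block of cells by its average value: the smoothing
    that models loss of fine-scale detail in the macroscopic description."""
    out = []
    for i in range(0, len(p), block):
        avg = sum(p[i:i + block]) / block
        out.extend([avg] * block)
    return out

fine = [0.40, 0.10, 0.20, 0.05, 0.10, 0.05, 0.05, 0.05]  # made-up distribution
coarse = coarse_grain(fine)
# Coarse graining preserves normalization but can only increase
# (or preserve) the Shannon entropy.
```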
However, various universal results are known in nonequilibrium statistical mechanics. These include various fluctuation theorems. Note that the second law follows from these theorems, and the fluctuation-dissipation relation may also be derived from them.
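The claim that the second law follows can be sketched from the integral fluctuation theorem. Writing $\Delta s_{\mathrm{tot}}$ for the total entropy production along a trajectory, the theorem plus Jensen's inequality for the convex exponential gives the second law on average:

```latex
% Integral fluctuation theorem:
\left\langle e^{-\Delta s_{\mathrm{tot}}} \right\rangle = 1
% Jensen's inequality (e^{-x} is convex):
e^{-\langle \Delta s_{\mathrm{tot}} \rangle}
  \le \left\langle e^{-\Delta s_{\mathrm{tot}}} \right\rangle = 1
% Taking logarithms:
\langle \Delta s_{\mathrm{tot}} \rangle \ge 0
```

Individual trajectories may still have negative entropy production; only the average is constrained.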
A lot of research has been done in this area, so for more information I suggest reading some review articles, such as:

- Esposito, M., "Nonequilibrium fluctuations, fluctuation theorems, and counting statistics in quantum systems", Reviews of Modern Physics, 81(4).
- Campisi, M., "Colloquium: Quantum fluctuation relations: Foundations and applications", Reviews of Modern Physics, 83(3).

There are also various master equations. So, like I said, there is no universal answer to your question. In fact there is an article from F.
Bonetto, J. Lebowitz and L. Rey-Bellet, "Fourier's Law: a Challenge for Theorists" (arXiv), which states in the abstract: "There is however at present no rigorous mathematical derivation of Fourier's law […]".

I think people tend to make this sound vastly more mysterious than it is. As explained in any textbook on kinetic theory (see, e.g., Vol. X of Landau and Lifshitz), taking moments of the Boltzmann equation and assuming slowly varying distributions gives Fourier's law of heat conduction, Fick's law of diffusion, the Newton-Navier-Stokes law for viscous friction, etc. Not only that, it provides a method for computing the corresponding transport coefficients, and for reasonably dilute systems the result agrees with experiment.
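As a rough illustration of how kinetic theory fixes transport coefficients, the sketch below evaluates the elementary mean-free-path estimate κ ≈ (1/3) n v̄ λ c_v (c_v per molecule); with λ = 1/(√2 π n d²) the number density cancels. The inputs (argon-like mass, a molecular diameter of ~3.4 Å) are illustrative assumptions, and this crude estimate only aims at the right order of magnitude, not the full Chapman-Enskog result.

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # temperature, K
m = 39.95 * 1.6605e-27    # molecular mass (argon-like), kg
d = 3.4e-10               # effective molecular diameter (assumed), m

# Mean molecular speed from the Maxwell distribution.
v_bar = math.sqrt(8.0 * k_B * T / (math.pi * m))

# Elementary estimate: kappa = (1/3) n v_bar lambda c_v with
# lambda = 1/(sqrt(2) pi n d^2); the number density n cancels.
c_v = 1.5 * k_B           # per-molecule heat capacity of a monatomic gas
kappa = c_v * v_bar / (3.0 * math.sqrt(2.0) * math.pi * d * d)
# kappa comes out at a few mW/(m K), the right order of magnitude
# for a monatomic gas at room temperature.
```

A more careful Chapman-Enskog treatment of the same Boltzmann equation refines the numerical prefactor while keeping the same dependence on mass, temperature, and cross-section.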
The Boltzmann equation does not rely on the classical approximation (it works for quantum fluids, too, as explained by Landau), but it requires the existence of well-defined quasi-particles. For systems in which quantum coherence plays a role, quantum analogs of the Boltzmann equation can be derived from non-equilibrium Green functions.