The natural unit of information (symbol: nat),[1] sometimes also nit or nepit, is a unit of information or information entropy based on natural logarithms and powers of e, rather than the base 2 logarithms and powers of 2 that define the shannon. One nat is the information content of an event whose probability of occurring is 1/e.

One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.[1]
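
To make these relations concrete, here is a minimal Python sketch (the function name self_information_nats is illustrative, not a standard API): it evaluates the self-information −ln p in nats and prints the two conversion factors quoted above.

```python
import math

def self_information_nats(p: float) -> float:
    """Self-information of an event with probability p, in nats."""
    return -math.log(p)  # natural logarithm, hence nats

# An event with probability 1/e carries exactly one nat of information.
print(self_information_nats(1 / math.e))  # 1.0

# Conversion factors: 1 nat = 1/ln 2 Sh and 1 nat = 1/ln 10 Hart.
print(1 / math.log(2))   # ≈ 1.4427 shannons per nat
print(1 / math.log(10))  # ≈ 0.4343 hartleys per nat
```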

History

Boulton and Wallace used the term nit in conjunction with minimum message length,[2] which was subsequently changed by the minimum description length community to nat to avoid confusion with the nit used as a unit of luminance.[3]

Alan Turing used the natural ban.[4]

Entropy

Shannon entropy (information entropy), being the expected value of the information of an event, is inherently a quantity of the same type and with a unit of information. The International System of Units, by assigning the same unit (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = 1.[a] Systems of natural units that normalize the Boltzmann constant to 1 are effectively measuring thermodynamic entropy with the nat as its unit.
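
To make the k_B = 1 convention concrete, a minimal sketch (assuming the exact SI value of the Boltzmann constant; the helper name is illustrative) converts a thermodynamic entropy expressed in joules per kelvin into the corresponding dimensionless entropy in nats by dividing by k_B:

```python
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the SI)

def thermodynamic_entropy_in_nats(s_joule_per_kelvin: float) -> float:
    """Dimensionless entropy (in nats) corresponding to a thermodynamic entropy."""
    return s_joule_per_kelvin / K_B

# 1 J/K of thermodynamic entropy corresponds to about 7.24e22 nats.
print(thermodynamic_entropy_in_nats(1.0))
```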

When the Shannon entropy is written using a natural logarithm, H = −∑ᵢ pᵢ ln pᵢ, it is implicitly giving a number measured in nats.
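
For example, a short Python sketch (the helper below is illustrative) evaluates this formula for a fair coin, giving ln 2 ≈ 0.693 nat, which is the same quantity as 1 Sh:

```python
import math

def shannon_entropy(probs, base=math.e):
    """Shannon entropy of a discrete distribution; base e gives nats, base 2 shannons."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

fair_coin = [0.5, 0.5]
print(shannon_entropy(fair_coin))          # ln 2 ≈ 0.6931 nat
print(shannon_entropy(fair_coin, base=2))  # 1.0 Sh
```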

Notes

  1. ^ This implicitly also makes the nat the coherent unit of information in the SI.

References

  1. ^ "IEC 80000-13:2008". International Electrotechnical Commission. Retrieved 21 July 2013.
  2. ^ Boulton, D. M.; Wallace, C. S. (1970). "A program for numerical classification". Computer Journal. 13 (1): 63–69. doi:10.1093/comjnl/13.1.63.
  3. ^ Comley, J. W. & Dowe, D. L. (2005). "Minimum Message Length, MDL and Generalised Bayesian Networks with Asymmetric Languages". In Grünwald, P.; Myung, I. J. & Pitt, M. A. (eds.). Advances in Minimum Description Length: Theory and Applications. Cambridge: MIT Press. sec. 11.4.1, p. 271. ISBN 0-262-07262-9. Archived from the original on 2006-06-19. Retrieved 2006-04-18.
  4. ^ Hodges, Andrew (1983). Alan Turing: The Enigma. New York: Simon & Schuster. ISBN 0-671-49207-1. OCLC 10020685.

Further reading

  • Reza, Fazlollah M. (1994). An Introduction to Information Theory. New York: Dover. ISBN 0-486-68210-2.