The Hartley function is a measure of information introduced by Ralph Hartley in 1928. For a choice among a finite set A of equally likely possibilities, it assigns the information content H_0(A) = log_b |A|, where the base b fixes the unit: base 2 gives bits (shannons), base 10 gives hartleys. It coincides with the Shannon entropy in the special case of a uniform distribution over a discrete random variable, which is why it is sometimes called the max-entropy.
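A minimal sketch of the relationship in Python (the function names `hartley` and `shannon_entropy` are illustrative, not from any particular library):

```python
import math

def hartley(n):
    """Hartley function H_0 in bits: information in a uniform choice among n outcomes."""
    return math.log2(n)

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 8
print(hartley(n))                    # 3.0 bits to single out one of 8 symbols
print(shannon_entropy([1 / n] * n))  # 3.0: equals H_0 when all outcomes are equally likely
```

For a non-uniform distribution the Shannon entropy is strictly smaller than the Hartley function of the same support, e.g. `shannon_entropy([0.5, 0.25, 0.25])` is 1.5 bits versus `hartley(3)` ≈ 1.585 bits.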