• Aatube@kbin.melroy.org · 15 days ago

    From a cursory glance, it seems at least quite close to the definition of a bit in relation to entropy, also known as a shannon.

    Nevertheless, the term bits of information or simply bits is more often heard, even in the fields of information and communication theory, rather than shannons; just saying bits can therefore be ambiguous. Using the unit shannon is an explicit reference to a quantity of information content, information entropy or channel capacity, and is not restricted to binary data, whereas bits can as well refer to the number of binary symbols involved, as is the term used in fields such as data processing. —Wikipedia article for shannons
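
    To make that distinction concrete, here is a minimal Python sketch (my own illustration, not from the article) using the standard Shannon entropy formula: a fair coin yields exactly 1 shannon, while a biased coin is still a single binary symbol ("one bit" of storage) but carries less than 1 shannon of information.

        # Minimal sketch: Shannon entropy H = -sum(p * log2(p)),
        # measured in shannons when the logarithm is base 2.
        from math import log2

        def entropy_shannons(probabilities):
            """Entropy of a discrete distribution, in shannons (base-2)."""
            return -sum(p * log2(p) for p in probabilities if p > 0)

        # A fair coin flip carries exactly 1 shannon of information content:
        print(entropy_shannons([0.5, 0.5]))  # 1.0
        # A biased coin is still one binary symbol, but carries less information:
        print(entropy_shannons([0.9, 0.1]))  # ~0.47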