• Flying Squid@lemmy.worldOP
        15 days ago

        And now it’s “it’s the paper’s fault it’s wrong because it defined a term the way I didn’t want it defined.”

        • conciselyverbose@sh.itjust.works
          15 days ago

          Yes.

          Science is built on a shared, standardized base of knowledge. Laying claim to a standard term to mean something entirely incompatible with the actual definition makes your paper objectively incorrect and without merit.

          • Flying Squid@lemmy.worldOP
            15 days ago

            Cool. Let me know when you feel like reading the paper since Aatube already showed you they are using it properly. Or at least admitting you might not know as much about this as you think you do…

      • Aatube@kbin.melroy.org
        15 days ago

        From a cursory glance, it seems quite close to the definition of a bit in relation to entropy, also known as a shannon.

        Nevertheless, the term bits of information or simply bits is more often heard, even in the fields of information and communication theory, rather than shannons; just saying bits can therefore be ambiguous. Using the unit shannon is an explicit reference to a quantity of information content, information entropy or channel capacity, and is not restricted to binary data, whereas bits can as well refer to the number of binary symbols involved, as is the term used in fields such as data processing. —Wikipedia article for shannons
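        The distinction the quote draws can be made concrete with a small sketch (not from the paper under discussion, just an illustration of the standard definition): Shannon entropy measures information content in shannons, while "bits" in the data-processing sense just counts binary symbols.

        ```python
        import math

        def shannon_entropy(probs):
            """Shannon entropy H = -sum(p * log2(p)), measured in shannons
            (often loosely called "bits of information")."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A fair coin flip carries exactly 1 shannon of information.
        print(shannon_entropy([0.5, 0.5]))  # → 1.0

        # A biased coin carries less than 1 shannon per flip, even though
        # storing each outcome still takes 1 binary digit (a "bit" in the
        # data-processing sense).
        print(shannon_entropy([0.9, 0.1]))
        ```

        This is why "bits" alone can be ambiguous: the biased coin above uses one binary digit per outcome but conveys well under one shannon of information.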