Free Online English-Chinese Dictionary | 3Dict

entropy

Source: pyDict

Source: Webster's Revised Unabridged Dictionary (1913)

Entropy \En"tro*py\, n. [Gr. ? a turning in; ? in + ? a turn,
   fr. ? to turn.] (Thermodynamics)
   A certain property of a body, expressed as a measurable
   quantity, such that when there is no communication of heat
   the quantity remains constant, but when heat enters or leaves
   the body the quantity increases or diminishes. If a small
   amount, h, of heat enters the body when its temperature is t
   in the thermodynamic scale the entropy of the body is
   increased by h ÷ t. The entropy is regarded as measured from
   some standard temperature and pressure. Sometimes called the
   thermodynamic function.

         The entropy of the universe tends towards a maximum.
                                                  --Clausius.
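
   The "increased by h ÷ t" clause above corresponds to the Clausius
   definition of entropy change. The LaTeX fragment below is a hedged
   restatement, not part of the dictionary text: δQ stands in for the small
   heat input h, T for the absolute temperature t, and the numbers are
   purely illustrative.

      % Sketch of the Clausius definition implied by "increased by h ÷ t".
      % Symbols and figures are illustrative assumptions.
      \[
        dS = \frac{\delta Q}{T}, \qquad \Delta S = \int \frac{\delta Q}{T}
      \]
      % Example with made-up numbers: 300 J of heat entering a body held
      % at 300 K raises its entropy by 300/300 = 1 J/K.
      \[
        \Delta S = \frac{300\ \mathrm{J}}{300\ \mathrm{K}} = 1\ \mathrm{J\,K^{-1}}
      \]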

Source: WordNet®

entropy
     n 1: (communication theory) a numerical measure of the
          uncertainty of an outcome; "the signal contained
          thousands of bits of information" [syn: {information}, {selective
          information}]
     2: (thermodynamics) a thermodynamic quantity representing the
        amount of energy in a system that is no longer available
        for doing mechanical work; "entropy increases as matter
        and energy in the universe degrade to an ultimate state of
        inert uniformity" [syn: {randomness}, {S}]

Source: Free On-Line Dictionary of Computing

entropy
     
         A measure of the disorder of a system.  Systems tend
        to go from a state of order (low entropy) to a state of
        maximum disorder (high entropy).
     
        The entropy of a system is related to the amount of
        {information} it contains.  A highly ordered system can be
        described using fewer {bit}s of information than a disordered
        one.  For example, a string containing one million "0"s can be
        described using {run-length encoding} as [("0", 1000000)]
        whereas a string of random symbols (e.g. bits, or characters)
        will be much harder, if not impossible, to compress in this
        way.
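
        The Python sketch below illustrates the run-length encoding idea used
        in the example above; the function name and output shape are
        illustrative assumptions, not a fixed API.

           # Collapse runs of repeated symbols into (symbol, count) pairs.
           from itertools import groupby

           def run_length_encode(s):
               """Return a list of (symbol, run_length) pairs for string s."""
               return [(symbol, len(list(run))) for symbol, run in groupby(s)]

           print(run_length_encode("0" * 1_000_000))   # [('0', 1000000)]
           print(run_length_encode("0101"))            # [('0', 1), ('1', 1), ('0', 1), ('1', 1)]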
     
        {Shannon}'s formula gives the entropy H(M) of a message M in
        bits:
     
        	H(M) = -log2 p(M)
     
        Where p(M) is the probability of message M.
     
        (1998-11-23)
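
        As a hedged sketch of Shannon's formula quoted above, the Python
        snippet below computes H(M) = -log2 p(M), the information content in
        bits of a single message M; the probabilities used are illustrative
        assumptions.

           # Bits needed to encode a message of probability p(M) (illustrative).
           from math import log2

           def message_information_bits(p_of_m):
               """H(M) = -log2 p(M) for a message with probability p_of_m."""
               return -log2(p_of_m)

           print(message_information_bits(0.5))      # 1.0 bit  (a fair coin flip)
           print(message_information_bits(1 / 256))  # 8.0 bits (one of 256 equally likely bytes)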