
What is Shannon’s Entropy?

Encyclopedia of Information Science and Technology, Third Edition
Entropy quantifies the amount of “disorder” associated with a random variable. In information theory, this metric gives the expected value of the information conveyed by a message. The Shannon entropy measure estimates the average minimum number of bits required to encode a string of symbols, given the alphabet size and the frequencies with which the symbols occur.
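The definition above corresponds to the formula H = −Σ p(x) log₂ p(x), summed over the distinct symbols. As a minimal sketch of this calculation (the function name `shannon_entropy` is illustrative, not from the source):

```python
from collections import Counter
import math

def shannon_entropy(symbols):
    """Shannon entropy in bits per symbol: H = sum(p * log2(1/p))."""
    counts = Counter(symbols)
    total = len(symbols)
    # p = c/total for each symbol; log2(total/c) equals -log2(p)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

# A uniform two-symbol string carries 1 bit per symbol:
print(shannon_entropy("abab"))  # 1.0
# A single repeated symbol carries no information:
print(shannon_entropy("aaaa"))  # 0.0
```

Multiplying the per-symbol entropy by the string length gives the average minimum number of bits needed to encode the whole string, as described above.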
Published in Chapter:
Dynamical Systems Approach for Predator-Prey Robot Behavior Control via Symbolic Dynamics Based Communication
Sumona Mukhopadhyay (University of Calgary, Canada) and Henry Leung (University of Calgary, Canada)
Copyright: © 2015 | Pages: 12
DOI: 10.4018/978-1-4666-5888-2.ch651