The technical concept of information developed after Shannon (1948) has fueled advances in many fields, from fundamental physics to bioinformatics, but its technical precision has come at a cost: it has undermined the concept's usefulness in fields distinguished by the need to explain functional significance and reference, such as evolutionary biology, cognitive neuroscience, and the social sciences. Formulating a more adequate concept of information, ironically, requires attending to the physicality of information media. I argue that recognizing the interdependence of the two distinctively different uses of the concept of entropy (informational and thermodynamic) is the key to a concept of information that incorporates its semiotic function.
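For reference, the two uses of entropy contrasted here are standardly formalized as Shannon's informational entropy and Boltzmann's thermodynamic entropy (a textbook sketch, not the author's own formulation):

\[ H(X) = -\sum_i p(x_i)\,\log_2 p(x_i) \qquad \text{(informational entropy)} \]
\[ S = k_B \ln W \qquad \text{(thermodynamic entropy)} \]

where \(p(x_i)\) is the probability of signal state \(x_i\), \(W\) is the number of microstates compatible with a given macrostate, and \(k_B\) is Boltzmann's constant.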