Abstract: Information is the distinguishing mark of our era, permeating every facet of our lives. The ability to understand and harness information holds the potential for significant advances. Our current understanding of information dates back to Claude Shannon's revolutionary work of 1948, which produced a general mathematical theory of reliable communication: it formalized the principles of modern digital communication and storage, and paved the way for the Internet, DVDs, and iPods of today. While Shannon's information theory has had a profound impact, its application beyond storage and communication poses foundational challenges. The National Science Foundation has just established a Science & Technology Center on the Science of Information to meet the new challenges posed by rapid advances in networking, biology, and knowledge extraction. Its mission is to develop rigorous principles guiding the extraction, manipulation, and exchange of information, integrating elements of space, time, structure, semantics, and context. In this talk, after reviewing Shannon's main results, we attempt to identify some features of information encompassing its structural, spatio-temporal, and semantic facets. Time permitting, we present a new result on a fundamental lower bound for structural compression and describe a novel algorithm achieving this lower bound for graphical structures.
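As a small illustration of the Shannon quantities the abstract alludes to (this sketch is not part of the talk itself): Shannon's entropy H(X) = -Σ p_i log2 p_i measures, in bits, the average information per symbol of a source, and it is the fundamental limit for lossless compression. A minimal computation:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing (0 * log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))      # 1.0

# A biased coin carries less: roughly 0.469 bits per toss,
# so long sequences of its tosses can be compressed below 1 bit/symbol.
print(shannon_entropy([0.9, 0.1]))
```

The lower bound for structural (graph) compression mentioned at the end of the abstract plays an analogous role: it bounds from below the number of bits needed to describe a graph's structure, just as H(X) bounds the bits per symbol of a classical source.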
|Last modified: Wednesday, July 6, 2011||Contact: Cyril.Banderier at lipn.univ-paris13.fr|