Philosophy of information is a new field of research that explores the conceptual issues arising at the intersection of computer science, information technology, and philosophy. It is the philosophical field concerned with:
- the critical study of the conceptual nature and basic principles of information, including its dynamics and its use in the sciences;
- the elaboration and application of information-theoretic and computational methodologies to philosophical problems.
(Luciano Floridi, “What is the Philosophy of Information?”, Metaphilosophy 33(1–2), 2002.)
The philosophy of information emerged as an independent field of research in the 1990s, largely thanks to Luciano Floridi, who was the first to use the term in the technical sense given above and to develop a unified, coherent conceptual framework for the whole subject.
The philosophy of information builds on the technical work of Norbert Wiener, Alan Turing, W. Ross Ashby, Claude Shannon, Warren Weaver, and the many other scientists who have worked on computing and information theory since the early 1950s, and later on the work of Fred Dretske, Jon Barwise, Brian Cantwell Smith, and others.
Definition of information
The word “information” has a very broad range of meanings.
Claude Shannon, for example, was very cautious: “The word ‘information’ has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field” (Shannon, 1993). Building on Shannon’s work, Weaver defended a tripartite analysis of information in terms of:
- technical problems concerning the quantification of information, as treated by Shannon’s theory (see the sketch after this list),
- semantic problems related to meaning and truth,
- what he called “influential” problems, concerning the impact and effectiveness of information on human behavior (which he considered just as important as the other two).
And these are just two examples from the very beginnings of information theory.
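To make Weaver’s first, technical level concrete, here is a minimal Python sketch (my addition, not from the original article) that computes the Shannon entropy H = −Σ p(x) log₂ p(x) of a message’s empirical symbol distribution; the function name and sample messages are illustrative assumptions.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy of the message's empirical symbol
    distribution, in bits per symbol."""
    total = len(message)
    probs = [n / total for n in Counter(message).values()]
    return -sum(p * log2(p) for p in probs)

# A uniform alphabet carries the most information per symbol.
print(shannon_entropy("abcd"))  # 2.0 bits: four equally likely symbols
print(shannon_entropy("aaab"))  # ~0.81 bits: a heavily skewed distribution
```

Note what this level does and does not capture: the measure quantifies how surprising the symbols are, while saying nothing about what the message means, which is exactly why Weaver separated the semantic problems out as a distinct level.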
According to Floridi, four mutually compatible kinds of phenomena are commonly called “information”:
- information about something (train schedules),
- information as such (DNA, fingerprints),
- information for something (instructions, an algorithm),
- information contained in something (a constraint, a model).
The word “information” is often used in a very abstract or metaphorical way, and its meaning is not strictly defined.
Computer science and philosophy
Recent advances in computing, such as the semantic web, knowledge engineering, and modern artificial intelligence, bring new ideas, new topics, new questions, and new methodologies and modes of inquiry to philosophy. On the one hand, information technology offers traditional philosophy new opportunities and challenges and changes how philosophers view its fundamental concepts. On the other hand, only strong philosophical foundations will enable further advances in bioinformatics, software engineering, knowledge engineering, and ontology.
Classical philosophical concepts such as mind, consciousness, experience, reasoning, knowledge, truth, morality, and creativity are increasingly becoming research topics in computer science, particularly in the field of mobile and intelligent agents.
Since the time of the “founding fathers”, however, computer science and philosophy have passed the question of the nature of information back and forth, each declaring it the other’s jurisdiction. This vast notion, raised by a physicist to the rank of a universal constant, has thus remained in a “no man’s land”. Yet researchers from very different backgrounds (G. Simondon, O. Costa de Beauregard, H. Laborit, H. Atlan) converge on a “structure/action” duality, or something close to it. Moreover, “information theory” is misleadingly named, since it is in fact the mathematical theory of communication [of information], to use Shannon and Weaver’s own terms. Strictly speaking, there is no theory of information as such, despite the advances this notion has enabled in so many disciplines.
Things are changing fast, though. Since A. N. Kolmogorov and R. J. Solomonoff, and currently with G. J. Chaitin, there has been an algorithmic theory of information, potentially extended by L. Brisson and F. W. Meyerstein to knowledge in general. A. Sournia calls this step the “Chaitin–Brisson–Meyerstein breakthrough”; in addition, four other theories are said to be in the making. Finally, information is the subject of rather all-encompassing conjectures combining biology, cosmology, and metaphysics.
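As an illustration of the algorithmic view (my addition, not from the original article): the Kolmogorov complexity of a string is the length of the shortest program that produces it. It is uncomputable, but the length of a compressed encoding gives a crude upper bound. The sketch below uses Python’s zlib as a stand-in for an ideal compressor; the function and variable names are illustrative.

```python
import os
import zlib

def compressed_length(data: bytes) -> int:
    """Length in bytes of the zlib-compressed data: a crude upper
    bound on the (uncomputable) Kolmogorov complexity K(data)."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 5_000       # highly regular: a short program could print it
random_ = os.urandom(10_000)  # random bytes are incompressible with overwhelming probability

print(compressed_length(regular))  # a few dozen bytes
print(compressed_length(random_))  # close to 10,000: no shorter description found
```

The contrast between the two outputs is the core intuition: regularity means compressibility, while for maximally random data the shortest description is essentially the data itself.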
Translated from Wikipedia