progicnet: Probabilistic logic and probabilistic networks

A research project funded by the Leverhulme Trust (2006 – September 2008)

Contributors: Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler, Jon Williamson

progicnet

If we wish to reason more effectively, both probability theory and deductive logic offer guidance. However, these are very different formalisms: deductive logic tells us how the structure of sentences can be exploited to draw conclusions, while probability theory tells us how uncertainties interact. For example, deductive logic tells us that from “Jack has bronchitis” and “if Jack has bronchitis then Jack has a cough” we can conclude “Jack has a cough”. Probability theory, for its part, can tell us the probability that Jack has bronchitis given that he has a cough, provided we know the relative incidences of this symptom and disease in the population.
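
As a worked illustration of the probabilistic side (the incidence figures here are invented for the example), Bayes' theorem gives:

    \[ P(\text{bronchitis} \mid \text{cough}) = \frac{P(\text{cough} \mid \text{bronchitis})\, P(\text{bronchitis})}{P(\text{cough})} = \frac{0.9 \times 0.05}{0.2} = 0.225 \]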

A probabilistic logic offers a richer formalism, one that combines the capacity of probability theory to handle uncertainty with the capacity of deductive logic to exploit structure. Potential applications are numerous and are to be found in disciplines as diverse as the philosophy of science (where we need to model theory choice and theory change, and to understand statistical methodology), artificial intelligence (where computers need to combine statistics with structural knowledge in order to forecast the weather for instance), bioinformatics (where we need to combine deductive reasoning about chemical structure with statistical reasoning about observed biological characteristics) and legal argumentation (where we would like to model legal theory formation from case law). In each of these problem domains probabilistic information and structural knowledge needs to be combined and a probabilistic logic offers a framework for handling this combination.
The difficulty with probabilistic logics is that they tend to multiply the complexities of their probabilistic and logical components. Probabilistic logics can be hard to understand, and inference using probabilistic logics can be time-consuming and complex. In probability theory, probabilistic networks (including what are known as Bayesian nets and credal nets) have been developed to simplify the task of probabilistic reasoning. These nets afford a simple representation of complex problems and can dramatically reduce the time it takes to perform calculations.
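
To give a flavour of how such a net works, here is a minimal sketch in Python; the two-node net and its numbers are invented for illustration, and real inference engines are of course far more sophisticated:

    # A two-node Bayesian net, Bronchitis -> Cough, with invented probabilities.
    # The joint distribution factorizes as P(B, C) = P(B) * P(C | B).

    P_B = {True: 0.05, False: 0.95}                   # P(Bronchitis)
    P_C_given_B = {True: {True: 0.9, False: 0.1},     # P(Cough | Bronchitis)
                   False: {True: 0.15, False: 0.85}}

    def joint(b, c):
        """Probability of one full assignment, read off the factorization."""
        return P_B[b] * P_C_given_B[b][c]

    def query_B_given_C(c):
        """P(Bronchitis | Cough = c) by summing out and normalizing."""
        evidence = joint(True, c) + joint(False, c)
        return joint(True, c) / evidence

    print(query_B_given_C(True))   # P(bronchitis | cough) = 0.24
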
The goal of this project is to investigate the application of probabilistic networks to probabilistic logic. If successful, probabilistic networks could make probabilistic logic simpler and more perspicuous, and could at last render its applications feasible.
This project is an academic network, running from 2006 to 2008.

Writings

An executive summary of our programme:

Jon Williamson: A note on probabilistic logics and probabilistic networks, The Reasoner 2(5), 2008.

Our magnum opus:

Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler and Jon Williamson: Probabilistic logics and probabilistic networks, Synthese Library, Springer, to appear.

While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches to probabilistic logic fit into a simple unifying framework: logically complex evidence can be used to associate probability intervals or probabilities with sentences.

Specifically, we show in Part I that there is a natural way to present a question posed in probabilistic logic, and that various inferential procedures provide semantics for that question: the standard probabilistic semantics (which takes probability functions as models), probabilistic argumentation (which considers the probability of a hypothesis being a logical consequence of the available evidence), evidential probability (which handles reference classes and frequency data), classical statistical inference (in particular the fiducial argument), Bayesian statistical inference (which ascribes probabilities to statistical hypotheses), and objective Bayesian epistemology (which determines appropriate degrees of belief on the basis of available evidence).
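
In schematic form, the fundamental question of the framework asks what set Y of probabilities should attach to a conclusion sentence given premise sentences with attached probability sets:

    \[ \varphi_1^{X_1}, \varphi_2^{X_2}, \ldots, \varphi_n^{X_n} \;|\!\!\!\approx\; \psi^{?} \]

where each \(X_i \subseteq [0,1]\) is a set (typically an interval) of probabilities attached to the premise \(\varphi_i\). Each of the semantics listed above supplies a different account of when \(\psi^Y\) follows from the premises.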

Further, we argue, there is the potential to develop computationally feasible methods to mesh with this framework. In particular, we show in Part II how credal and Bayesian networks can naturally be applied as a calculus for probabilistic logic. The probabilistic network itself depends upon the chosen semantics, but once the network is constructed, common machinery can be applied to generate answers to the fundamental question introduced in Part I.

A collection on probabilistic logic and probabilistic networks:

Fabio Cozman, Rolf Haenni, Jan-Willem Romeijn, Federica Russo, Gregory Wheeler & Jon Williamson (eds): Combining probability and logic, Special Issue, Journal of Applied Logic 7(2), 2009.

An application of our approach to psychology:

Jan-Willem Romeijn, Rolf Haenni, Gregory Wheeler and Jon Williamson: Logical Relations in a Statistical Problem, to appear in B. Löwe, E. Pacuit & J.W. Romeijn (eds), FotFS’07: Foundations of the Formal Sciences VI, Reasoning about Probabilities and Probabilistic Reasoning, London: College Publications 2008.

This paper presents the progicnet programme. It proposes a general framework for probabilistic logic that can guide inference based on both logical and probabilistic input. After an introduction to the framework as such, it is illustrated by means of a toy example from psychometrics. It is shown that the framework can accommodate a number of approaches to probabilistic reasoning: Bayesian statistical inference, evidential probability, probabilistic argumentation, and objective Bayesianism. The framework thus provides insight into the relations between these approaches, it illustrates how the results of different approaches can be combined, and it provides a basis for doing efficient inference in each of the approaches.

Other progicnet papers:

Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler and Jon Williamson: Possible Semantics for a Common Framework of Probabilistic Logics, in V. N. Huynh (ed.): Interval / Probabilistic Uncertainty and Non-Classical Logics, Advances in Soft Computing Series, Springer 2008, pp. 268-279.

This paper proposes a common framework for various probabilistic logics, consisting of a set of uncertain premises with probabilities attached to them. This raises the question of what strength a conclusion has, but without imposing a particular semantics no general answer is possible. The paper discusses several possible semantics, examining the question from the perspective of probabilistic argumentation.

Rolf Haenni: Probabilistic argumentation, Journal of Applied Logic, in press.

Argumentation in the sense of a process of logical reasoning is an intuitive and general methodology for establishing conclusions from defeasible premises. The core of any argumentative process is the systematic elaboration, exhibition, and weighting of possible arguments and counter-arguments. This paper presents the formal theory of probabilistic argumentation, which is designed to deal with uncertain premises whose probabilities are known. With respect to the possible arguments for a hypothesis, this leads first to probabilistic weights, and finally to an overall probabilistic judgement of the uncertain proposition in question. The proposed probabilistic measure possesses the desired properties of non-monotonicity and non-additivity. Reasoning according to the proposed formalism is an intuitive and natural generalization of the two classical forms of logical and probabilistic reasoning.
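
In rough outline (the paper develops the definition precisely): where the premises \(\Phi\) depend on probabilistic assumptions \(\omega\) drawn from a space \(\Omega\) with measure \(P\), the degree of support of a hypothesis \(h\) is the probability that the premises entail \(h\), conditional on the assumptions being consistent with the premises:

    \[ dsp(h) = P\bigl(\{\omega \in \Omega : \Phi_\omega \models h\} \bigm| \{\omega \in \Omega : \Phi_\omega \not\models \bot\}\bigr) \]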

Rolf Haenni: Climbing the Hills of Compiled Credal Networks, in G. de Cooman, J. Vejnarová, and M. Zaffalon (editors), ISIPTA’07, 5th International Symposium on Imprecise Probabilities and Their Applications, pp. 213-222, 2007.

This paper introduces a new approximate inference algorithm for credal networks. The algorithm consists of two major steps. It starts by representing the credal network as a compiled logical theory. The resulting structure is the basis on which the subsequent steepest-ascent hill-climbing algorithm operates. The output of the algorithm is an inner approximation of the exact lower and upper posterior probabilities.

William Harper and Gregory Wheeler (eds): Probability and Inference: Essays in Honour of Henry E. Kyburg, College Publications, 2007.

Recent advances in philosophy, artificial intelligence, mathematical psychology, and the decision sciences have brought a renewed focus to the role and interpretation of probability in theories of uncertain reasoning. Henry E. Kyburg, Jr. has long resisted the now dominant Bayesian approach to the role of probability in scientific inference and practical decision. The sharp contrasts between the Bayesian approach and Kyburg’s program offer a uniquely powerful framework within which to study several issues at the heart of scientific inference, decision, and reasoning under uncertainty. The commissioned essays for this volume take measure of the scope and impact of Kyburg’s views on probability and scientific inference, and include several new and important contributions to the field. Contributors: Gert de Cooman, Clark Glymour, William Harper, Isaac Levi, Ron Loui, Enrique Miranda, John Pollock, Teddy Seidenfeld, Choh Man Teng, Mariam Thalos, Gregory Wheeler, Jon Williamson, and Henry E. Kyburg, Jr.

Jan-Willem Romeijn: The All-Too-Flexible Abductive Method: ATOM’s Normative Status, Journal of Clinical Psychology, 2008, to appear.

Jan-Willem Romeijn and Igor Douven: The Discursive Dilemma as a Lottery Paradox, Economics and Philosophy, 23(3), pp. 301-319, 2007.

List and Pettit have stated an impossibility theorem about the aggregation of individual opinion states. Building on recent work on the lottery paradox, this paper offers a variation on that result. The present result places different constraints on the voting agenda and the domain of profiles, but it covers a larger class of voting rules, which need not satisfy the proposition-wise independence of votes.

Jan-Willem Romeijn, D. Borsboom, and J.M. Wicherts: Measurement Invariance versus Selection Invariance: Is fair selection possible?, Psychological Methods, in press.

This paper shows that measurement invariance (defined in terms of an invariant measurement model in different groups) is generally inconsistent with selection invariance (defined in terms of equal sensitivity and specificity across groups). In particular, when a unidimensional measurement instrument is used, and group differences are present in the location but not in the variance of the latent distribution, sensitivity and positive predictive value will be higher in the group located at the higher end of the latent dimension, whereas specificity and negative predictive value will be higher in the group located at the lower end of the latent dimension. When latent variances are unequal, the differences in these quantities depend on the size of group differences in variances, relative to the size of group differences in means. The effect is shown to originate as a special case of Simpson’s paradox, which arises because the observed score distribution is collapsed into an accept/reject dichotomy. Simulations show that the effect can be substantial in realistic situations. It is suggested that the effect may be partly responsible for overprediction in minority groups as typically found in empirical studies on differential academic performance. A methodological solution to the problem is suggested, and social policy implications are discussed.
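
The mechanism is easy to reproduce in a small simulation. The following Python sketch uses invented parameters (two groups whose latent distributions differ in mean but not variance, with an identical measurement model in both) and compares the sensitivity and specificity of a cutoff-based selection rule across groups:

    import numpy as np

    # Invented parameters: equal latent variances, group means differ by 1.0.
    rng = np.random.default_rng(0)
    n, cutoff = 100_000, 0.0

    def rates(latent_mean):
        latent = rng.normal(latent_mean, 1.0, n)       # true suitability
        observed = latent + rng.normal(0.0, 0.5, n)    # noisy test score
        suitable = latent > cutoff                     # who should be accepted
        accepted = observed > cutoff                   # who the test accepts
        sensitivity = (accepted & suitable).sum() / suitable.sum()
        specificity = (~accepted & ~suitable).sum() / (~suitable).sum()
        return sensitivity, specificity

    print(rates(+0.5))   # higher-scoring group: higher sensitivity
    print(rates(-0.5))   # lower-scoring group: higher specificity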

Jan-Willem Romeijn, I. Douven and L. Horsten: Anti-realist Truth, under submission.

Antirealists have hitherto offered at best sketches of a theory of truth. This paper presents an antirealist theory of truth in some formal detail. It is shown that the theory is able to deal satisfactorily with some problems that are standardly taken to beset antirealism.

Jan-Willem Romeijn and R. van de Schoot: A philosophical analysis of Bayesian model selection for inequality constrained models, in Null, Alternative and Informative Hypotheses, Hoijtink, Klugkist and Boelen (eds.), to appear.

Michael Wachter and Rolf Haenni: Logical Compilation of Bayesian Networks with Discrete Variables, in K. Mellouli (ed.), ECSQARU’07, 9th European Conference on Symbolic and Quantitative Approaches to Reasoning under Uncertainty, pp. 536-547, LNAI 4724, 2007.

This paper presents a new approach to inference in Bayesian networks. The principal idea is to encode the network by logical sentences and to compile the resulting encoding into an appropriate form. From there, all possible queries are answerable in linear time relative to the size of the logical form. Therefore, our approach is a potential solution for real-time applications of probabilistic inference with limited computational resources. The underlying idea is similar to both the differential and the weighted model counting approach to inference in Bayesian networks, but at the core of the proposed encoding we avoid the transformation from discrete to Boolean variables. This alternative encoding enables a more natural solution.
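
For orientation, the differential approach mentioned here represents a network by its network polynomial; for a tiny network \(B \rightarrow C\) this is

    \[ f = \sum_{b} \sum_{c} \lambda_b \, \lambda_c \, \theta_b \, \theta_{c \mid b} \]

where the \(\lambda\)'s are evidence indicators and the \(\theta\)'s are network parameters. Once \(f\) is compiled into a circuit, a query is answered by setting the indicators according to the evidence and evaluating, which is linear in the size of the circuit.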

Michael Wachter and Rolf Haenni: Propositional DAGs: a New Graph-Based Language for Representing Boolean Functions, in P. Doherty, J. Mylopoulos, and C. Welty (eds), KR’06, 10th International Conference on Principles of Knowledge Representation and Reasoning, pp. 277-285, AAAI Press, 2006.

This paper continues the line of research on knowledge compilation in the context of Negation Normal Forms (NNF) and Binary Decision Diagrams (BDD). The idea is to analyze different target languages according to their succinctness and the classes of queries and transformations supported in polytime. We identify a new property called simple-negation, which is an implicit restriction of all NNFs and BDDs. The removal of this restriction leads to Propositional Directed Acyclic Graphs (PDAG), a more general family of graph-based languages for representing Boolean functions or propositional theories. With respect to certain NNF-based languages, we will show that corresponding PDAG-based languages are at least as succinct and support the same transformations. The most interesting language even supports the same queries and an additional transformation, making it more flexible.

Michael Wachter and Rolf Haenni: Multi-State Directed Acyclic Graphs, in Z. Kobti and D. Wu (eds), CanAI’07, 20th Canadian Conference on Artificial Intelligence, pp. 464-475, LNAI 4509, 2007.

This paper continues the line of research on the representation and compilation of propositional knowledge bases with propositional directed acyclic graphs (PDAG), negation normal forms (NNF), and binary decision diagrams (BDD). The idea is to permit variables with more than two states and to explicitly represent them in their most natural way. The resulting representation languages are analyzed according to their succinctness, supported queries, and supported transformations. The paper shows that most results from PDAGs, NNFs, and BDDs can be generalized to their corresponding multi-state extension. This implies that the entire knowledge compilation map is extensible from propositional to multi-state variables.

Michael Wachter, Rolf Haenni and Marc Pouly: Optimizing Inference in Bayesian Networks and Semiring Valuation Algebras, in A. Gelbukh and A. F. Kuri Morales (eds), MICAI’07: 6th Mexican International Conference on Artificial Intelligence, pp. 236-247, LNAI 4827, 2007.

Previous work on context-specific independence in Bayesian networks is driven by a common goal, namely to represent the conditional probability tables in a most compact way. In this paper, we argue from the viewpoint of the knowledge compilation map and conclude that the language of Ordered Binary Decision Diagrams (OBDD) is the most suitable one for representing probability tables, in addition to the language of Algebraic Decision Diagrams (ADD). This holds not only for inference in Bayesian networks, but is more generally applicable in the generic framework of semiring valuation algebras, which can be applied to solve a variety of inference and optimization problems in different domains.

Gregory Wheeler: Applied logic without psychologism, Studia Logica, 88(1): 137-156, 2008.

Logic is a celebrated representation language because of its formal generality. But there are two senses in which a logic may be considered general, one that concerns a technical ability to discriminate between different types of individuals, and another that concerns constitutive norms for reasoning as such. This essay embraces the former, permutation-invariance conception of logic and rejects the latter, Fregean conception of logic. The question of how to apply logic under this pure invariantist view is addressed, and a methodology is given. The pure invariantist view is contrasted with logical pluralism, and a methodology for applied logic is demonstrated in remarks on a variety of issues concerning non-monotonic logic and non-monotonic inference, including Charles Morgan’s impossibility results for non-monotonic logic, David Makinson’s normative constraints for non-monotonic inference, and Igor Douven and Timothy Williamson’s proposed formal constraints on rational acceptance.

Gregory Wheeler: Two puzzles concerning measures of uncertainty and the positive Boolean connectives, in Proceedings of the 13th Portuguese Conference on Artificial Intelligence (EPIA 2007), Guimarães, LNAI Series, Berlin: Springer-Verlag, 2007.

The two puzzles are the Lottery Paradox and the Amalgamation Paradox, both of which point out difficulties for aggregating uncertain information. A generalization of the lottery paradox is presented and a new form of an amalgamation reversal is introduced. Together these puzzles highlight a difficulty for introducing measures of uncertainty into a variety of logical knowledge representation frameworks. The point is illustrated by contrasting the constraints on solutions to each puzzle with the structural properties of the preferential semantics for non-monotonic logics (System P), and also with systems of normal modal logics. The difficulties illustrate several points of tension between the aggregation of uncertain information and aggregation according to the monotonically positive Boolean connectives, ∧ and ∨.
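
For readers new to the first puzzle, the arithmetic behind the lottery paradox is short. In a fair 1000-ticket lottery with exactly one winner,

    \[ P(\text{ticket } i \text{ loses}) = 1 - \tfrac{1}{1000} = 0.999 \]

for each ticket \(i\), so any acceptance threshold below 0.999 licenses accepting each sentence “ticket \(i\) loses”; yet the conjunction of these 1000 accepted sentences says that no ticket wins, which has probability 0. Aggregation across the conjunction ∧ is precisely where the trouble enters.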

Gregory Wheeler: Focused Correlation and Confirmation, British Journal for the Philosophy of Science, in press.

This essay presents results about a deviation-from-independence measure called focused correlation. This measure explicates the formal relationship between the probabilistic dependence of an evidence set and the incremental confirmation of a hypothesis, resolves a basic question underlying Peter Klein and Ted Warfield’s ‘truth-conduciveness’ problem for Bayesian coherentism, and provides a qualified rebuttal to Erik Olsson’s claim that there is no informative link between correlation and confirmation. The generality of the result is compared to recent programs in Bayesian epistemology that attempt to link correlation and confirmation by utilizing a conditional evidential independence condition. Several properties of focused correlation are also highlighted.
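
Schematically (stated here in the form in which the measure is usually presented, writing \(\mathrm{cor}(X,Y) = P(X \wedge Y)/(P(X)P(Y))\) for the correlation of two events), the focused correlation of two pieces of evidence with respect to a hypothesis \(H\) is the ratio of their correlation conditional on \(H\) to their unconditional correlation:

    \[ \mathrm{For}_H(E_1, E_2) = \frac{\mathrm{cor}(E_1, E_2 \mid H)}{\mathrm{cor}(E_1, E_2)} \]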

Gregory Wheeler & Luís Moniz Pereira: Methodological naturalism and epistemic internalism, Synthese, 163(3), pp. 315-328, 2008.

Epistemic naturalism holds that the results or methodologies from the cognitive sciences are relevant to epistemology, and some have maintained that scientific methods are more compatible with externalist theories of justification than with internalist theories. But practically all discussions about naturalized epistemology are framed exclusively in terms of cognitive psychology, which is only one of the cognitive sciences. The question addressed in this essay is whether a commitment to naturalism really does favor externalism over internalism, and we offer reasons for thinking that naturalism in epistemology is compatible with both internalist and externalist conceptions of justification. We also argue that there are some distinctively internalist aims that are currently being studied scientifically, and that these notions, among others, should be studied by scientific methods.

Gregory Wheeler & Jon Williamson: Evidential probability and objective Bayesian epistemology, in Prasanta S. Bandyopadhyay & Malcolm Forster (eds): Handbook of the philosophy of statistics, Elsevier 2009.

In this chapter we draw connections between two seemingly opposing approaches to probability and statistics: evidential probability on the one hand and objective Bayesian epistemology on the other.

Jon Williamson: Objective Bayesian probabilistic logic, Journal of Algorithms in Cognition, Informatics and Logic 63: 167-183.

This paper develops connections between objective Bayesian epistemology – which holds that the strengths of an agent’s beliefs should be representable by probabilities, should be calibrated with evidence of empirical probability, and should otherwise be equivocal – and probabilistic logic. After introducing objective Bayesian epistemology over propositional languages, the formalism is extended to handle predicate languages. A rather general probabilistic logic is formulated and then given a natural semantics in terms of objective Bayesian epistemology. The machinery of objective Bayesian nets and objective credal nets is introduced and this machinery is applied to provide a calculus for probabilistic logic that meshes with the objective Bayesian semantics.
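
Equivocation is standardly implemented by maximizing entropy, and this step lends itself to a small illustration. The following Python sketch (invented constraint, brute-force optimization; the net-based calculus of the paper is of course the efficient route) selects the maximum-entropy probability function over the four states of a two-atom propositional language subject to the single evidence constraint P(A) = 0.8:

    import numpy as np
    from scipy.optimize import minimize

    # States of the language on atoms A, B, ordered: A&B, A&~B, ~A&B, ~A&~B.
    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)        # guard against log(0)
        return float(np.sum(p * np.log(p)))

    constraints = [
        {'type': 'eq', 'fun': lambda p: p.sum() - 1.0},      # normalization
        {'type': 'eq', 'fun': lambda p: p[0] + p[1] - 0.8},  # evidence: P(A) = 0.8
    ]

    result = minimize(neg_entropy, x0=np.full(4, 0.25),
                      bounds=[(0.0, 1.0)] * 4, constraints=constraints)
    print(result.x.round(3))   # approx. [0.4 0.4 0.1 0.1]: equivocal elsewhere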

Jon Williamson: Aggregating judgements by merging evidence, Journal of Logic and Computation, in press.

The theory of belief revision and merging has recently been applied to judgement aggregation. In this paper I argue that judgements are best aggregated by merging the evidence on which they are based, rather than by directly merging the judgements themselves. This leads to a three-step strategy for judgement aggregation. First, merge the evidence bases of the various agents using some method of belief merging. Second, determine which degrees of belief one should adopt on the basis of this merged evidence base, by applying objective Bayesian theory. Third, determine which judgements are appropriate given these degrees of belief by applying a decision-theoretic account of rational judgement formation.

Jon Williamson: Inductive influence, British Journal for the Philosophy of Science, 58, pp. 689-708, 2007.

Objective Bayesianism has been criticised for not allowing learning from experience: it is claimed that an agent must give degree of belief 1/2 to the next raven being black, however many other black ravens have been observed. I argue that this objection can be overcome by appealing to *objective Bayesian nets*, a formalism for representing objective Bayesian degrees of belief. Under this account, previous observations exert an *inductive influence* on the next observation. I show how this approach can be used to capture the Johnson-Carnap continuum of inductive methods, as well as the Nix-Paris continuum, and show how inductive influence can be measured.
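
For reference, on a language with \(t\) basic types the Johnson-Carnap continuum assigns

    \[ P(\text{next individual is of type } i \mid n_i \text{ of } n \text{ observed so far}) = \frac{n_i + \lambda/t}{n + \lambda}, \qquad 0 \le \lambda \le \infty, \]

so past observations raise the probability of a repeat whenever \(\lambda\) is finite, while \(\lambda \to \infty\) yields the fixed equivocal value \(1/t\) that the objection presses.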

Jon Williamson: Objective Bayesianism with predicate languages, Synthese 163(3), pp. 341-356, 2008.

Objective Bayesian probability is normally defined over rather simple domains, e.g., finite event spaces or propositional languages. This paper investigates the extension of objective Bayesianism to first-order logical languages. It is argued that the objective Bayesian should choose a probability function, from all those that satisfy constraints imposed by background knowledge, that is closest to a particular frequency-induced probability function which generalises the λ = 0 function of Carnap’s continuum of inductive methods.

Jon Williamson: Epistemic complexity from an objective Bayesian perspective, in A. Carsetti (ed.), ‘Causality, meaningful complexity and knowledge construction’, Springer, in press.

Evidence can be complex in various ways: e.g., it may exhibit structural complexity, containing information about causal, hierarchical or logical structure as well as empirical data, or it may exhibit combinatorial complexity, containing a complex combination of kinds of information. This paper examines evidential complexity from the point of view of Bayesian epistemology, asking: how should complex evidence impact on an agent’s degrees of belief? The paper presents a high-level overview of an objective Bayesian answer: it presents the objective Bayesian norms concerning the relation between evidence and degrees of belief, and goes on to show how evidence of causal, hierarchical and logical structure leads to natural constraints on degrees of belief. The objective Bayesian network formalism is presented, and it is shown how this formalism can be used to handle both kinds of evidential complexity – structural complexity and combinatorial complexity.

Activities

Workshop: Foundations of Formal Sciences: Reasoning about probabilities and probabilistic reasoning. May 2-5 2007, Amsterdam.
Workshop: Methodological Problems of the Social Sciences, May 7 2007, Tilburg.
Conference: progic07: The Third Workshop on Combining Probability and Logic. September 5-7 2007, Canterbury.
Jan-Willem Romeijn: Progic 2007: the Third Workshop on Combining Probability and Logic, The Reasoner 1(6), 2007.

Talks

European Summer School in Logic, Language and Information, 4-15 August 2008, Hamburg.
Non-classical Logics: from Foundations to Applications, 24-26 April 2008, Pisa.
International Workshop on Interval/Probabilistic Uncertainty and Non-Classical Logics, Japan Advanced Institute of Science and Technology (JAIST), March 25-28 2008, Ishikawa, Japan.
Winter School in Logic, Indian Institute of Technology, January 14-26 2008, Kanpur.
progic07: The Third Workshop on Combining Probability and Logic. September 5-7 2007, Canterbury.
Fourth Annual Formal Epistemology Workshop, Carnegie Mellon University, May 31 – June 3 2007, Pittsburgh.
Methodological Problems of the Social Sciences, May 7 2007, Tilburg.
Foundations of Formal Sciences: Reasoning about probabilities and probabilistic reasoning. May 2-5 2007, Amsterdam.

Related Work

Rolf Haenni and Stephan Hartmann: Modeling Partially Reliable Information Sources: a General Approach based on Dempster-Shafer Theory, Information Fusion, 7(4): pp. 361-379, 2006.

Combining testimonial reports from independent and partially reliable information sources is an important epistemological problem of uncertain reasoning. Within the framework of Dempster-Shafer theory, we propose a general model of partially reliable sources, which includes several previously known results as special cases. The paper reproduces these results on the basis of a comprehensive model taxonomy. This gives a number of new insights and thereby contributes to a better understanding of this important application of reasoning with uncertain and incomplete information.
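
The simplest instance of such a model (a special case any general taxonomy must recover) treats a source of reliability \(r\) asserting \(h\) as a simple support function:

    \[ m(\{h\}) = r, \qquad m(\Theta) = 1 - r, \]

and Dempster's rule of combination then gives, for two independent sources of reliabilities \(r_1\) and \(r_2\) both asserting \(h\), the combined support \(1 - (1 - r_1)(1 - r_2)\).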

Henry E. Kyburg Jr., Choh Man Teng, and Gregory Wheeler: Conditionals and consequences, Journal of Applied Logic.

We examine the notion of conditionals and the role of conditionals in inductive logics and arguments. We identify three mistakes commonly made in the study of, or motivation for, non-classical logics. A nonmonotonic consequence relation based on evidential probability is formulated. With respect to this acceptance relation some rules of inference of System P are unsound, and we propose refinements that hold in our framework.

Gregory Wheeler: Rational acceptance and conjunctive / disjunctive absorption, Journal of Logic, Language and Information 15(1-2), pp. 49-63.

A bounded formula is a pair consisting of a propositional formula φ in the first coordinate and a real number within the unit interval in the second coordinate, interpreted to express the lower-bound probability of φ. Converting conjunctive/disjunctive combinations of bounded formulas to a single bounded formula consisting of the conjunction/disjunction of the propositions occurring in the collection, along with a newly calculated lower probability, is called absorption. This paper introduces two inference rules for effecting conjunctive and disjunctive absorption and compares the resulting logical system, called System Y, to axiom System P. Finally, we demonstrate how absorption resolves the lottery paradox and the paradox of the preface.
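
To convey the flavour of absorption (schematically; System Y states the rules precisely), lower-bound probabilities obey

    \[ P(\varphi \wedge \psi) \ge \max(0,\, x + y - 1), \qquad P(\varphi \vee \psi) \ge \max(x, y) \]

whenever \(P(\varphi) \ge x\) and \(P(\psi) \ge y\), so the bounded formulas \((\varphi, x)\) and \((\psi, y)\) absorb into a single bounded formula for the conjunction or disjunction with the newly calculated lower bound.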

Jon Williamson: Bayesian networks for logical reasoning, in Carla Gomes & Toby Walsh (eds) [2001]: Proceedings of the AAAI Fall Symposium on using Uncertainty within Computation, AAAI Press Technical Report FS-01-04, pp. 136-143.

By identifying and pursuing analogies between causal and logical influence I show how the Bayesian network formalism can be applied to reasoning about logical deductions.

Jon Williamson: Probability logic, in Dov Gabbay, Ralph Johnson, Hans Jurgen Ohlbach & John Woods (eds)[2002]: Handbook of the Logic of Inference and Argument: The Turn Toward the Practical, Studies in Logic and Practical Reasoning Volume 1, Elsevier, pp. 397-424.

I examine the idea of incorporating probability into logic for a logic of practical reasoning. I introduce probability and its interpretations, give an account of the development of the logical approach to probability, its immediate problems, and improved formulations. Then I discuss inference in probabilistic logic, and propose the use of Bayesian networks for inference in both causal logics and proof planning.

Jon Williamson: Bayesian nets and causality: philosophical and computational foundations, Oxford University Press, 2005.

Bayesian nets are widely used in artificial intelligence as a calculus for causal reasoning, enabling machines to make predictions, perform diagnoses, take decisions and even to discover causal relationships. This book, aimed at researchers and graduate students in computer science, mathematics and philosophy, brings together two important research topics: how to automate reasoning in artificial intelligence, and the nature of causality and probability in philosophy.

Links

Probabilistic Logic on Wikipedia

The Reasoner – a gazette on reasoning, inference and method.

Acknowledgements

We are very grateful to The Leverhulme Trust for providing financial support. We are also grateful to the Netherlands Organisation for Scientific Research (NWO) for supporting a related project, “Probabilistic models of scientific reasoning”.