Front Matter

2. A Formal Concept View of Abstract Argumentation
Leila Amgoud, Henri Prade

Abstract
The paper presents a parallel between two important theories for the treatment of information which address questions that are apparently unrelated and are studied by different research communities: an enriched view of formal concept analysis and abstract argumentation. Both theories exploit a binary relation (expressing object–property links in the first case, attacks between arguments in the second). We show that when an argumentation framework is considered through the complementary relation "does not attack" rather than the attack relation itself, its stable extensions can be seen as the exact counterparts of formal concepts. This leads to a cube of oppositions, a generalization of the well-known square of oppositions, between eight remarkable sets of arguments. This provides a richer view of argumentation for both bi-valued and fuzzy attack relations.
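To make the parallel concrete, the following minimal Python sketch (ours, not the paper's; the three-argument framework and all names are illustrative) enumerates the stable extensions of a toy attack relation and the formal concepts of the complementary "does not attack" context, so the claimed correspondence can be inspected on a small example.

    # Toy abstract argumentation framework (hypothetical example).
    from itertools import chain, combinations

    args = {"a", "b", "c"}
    attacks = {("a", "b"), ("b", "c")}      # a attacks b, b attacks c

    def powerset(s):
        s = list(s)
        return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

    def is_stable(ext):
        """Conflict-free set that attacks every outside argument."""
        ext = set(ext)
        conflict_free = not any((x, y) in attacks for x in ext for y in ext)
        attacks_rest = all(any((x, y) in attacks for x in ext) for y in args - ext)
        return conflict_free and attacks_rest

    # Derivation operators of the formal context (args, args, "does not attack").
    def intent(X):   # arguments attacked by no member of X
        return {y for y in args if all((x, y) not in attacks for x in X)}

    def extent(Y):   # arguments attacking no member of Y
        return {x for x in args if all((x, y) not in attacks for y in Y)}

    stable = [set(e) for e in powerset(args) if is_stable(e)]
    concepts = [(set(X), intent(set(X))) for X in powerset(args)
                if extent(intent(set(X))) == set(X)]

    print("stable extensions:", stable)
    print("formal concepts of 'does not attack':", concepts)

On this toy framework the unique stable extension {a, c} reappears as the formal concept whose extent and intent coincide, which is the kind of correspondence the abstract refers to.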

3. Approximating Credal Network Inferences by Linear Programming
Alessandro Antonucci, Cassio P. de Campos, David Huber, Marco Zaffalon

Abstract
An algorithm for approximate credal network updating is presented. The problem in its general formulation is a multilinear optimization task, which can be linearized by an appropriate rule for fixing all the local models apart from those of a single variable. This simple idea can be iterated and quickly leads to very accurate inferences. The approach can also be specialized to classification with credal networks based on the maximality criterion. A complexity analysis for both the problem and the algorithm is reported together with numerical experiments, which confirm the good performance of the method. While the inner approximation produced by the algorithm gives rise to a classifier which might return a subset of the optimal class set, preliminary empirical results suggest that the accuracy of the optimal class set is seldom affected by the approximate probabilities.
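The linearization idea can be seen in isolation on a toy problem. The sketch below is not the authors' algorithm and involves no credal networks; it only shows, under simplified assumptions, how a multilinear objective becomes a sequence of linear programs once all blocks of variables except one are fixed, and why the resulting fixed point is an inner (lower-bound) approximation of the true optimum.

    # Toy analogue: maximise the bilinear form x^T Q y over two probability
    # simplices by alternating linear programs (block coordinate ascent).
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    Q = rng.random((3, 4))

    def maximise_over_simplex(c):
        """Maximise c^T z over the probability simplex (a linear program)."""
        n = len(c)
        res = linprog(-c, A_eq=np.ones((1, n)), b_eq=[1.0], bounds=[(0, 1)] * n)
        return res.x

    x, y = np.full(3, 1 / 3), np.full(4, 1 / 4)   # uniform starting points
    for _ in range(20):
        x = maximise_over_simplex(Q @ y)          # y fixed -> linear in x
        y = maximise_over_simplex(Q.T @ x)        # x fixed -> linear in y

    print("inner approximation:", x @ Q @ y)
    print("true maximum (largest entry of Q):", Q.max())

The alternation converges quickly but may stop at a local optimum, mirroring the abstract's remark that the produced inner approximation can, in principle, miss part of the exact answer.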

4. A Comparative Study of Compilation-Based Inference Methods for Min-Based Possibilistic Networks
Raouia Ayachi, Nahla Ben Amor, Salem Benferhat

Abstract
Min-based possibilistic networks, which are compact representations of possibility distributions, are powerful tools for representing and reasoning with uncertain and incomplete information in the possibility theory framework. Inference in these graphical models has recently been the focus of several studies, especially under compilation. It consists in encoding the network into a Conjunctive Normal Form (CNF) base and compiling the latter in order to efficiently compute the impact of evidence on the variables. The encoding of such networks can be performed either locally or globally. This paper presents a comparative study of these strategies for compilation-based inference in terms of CNF parameters, compiled-base parameters and inference time.
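As background (a standard fact of possibility theory recalled for the reader, not a statement from the paper), a min-based possibilistic network over variables X_1, ..., X_n encodes a joint possibility distribution through the min-based chain rule

    \[ \pi(x_1,\dots,x_n) \;=\; \min_{i=1}^{n} \pi\bigl(x_i \mid \mathrm{pa}(x_i)\bigr), \]

where pa(X_i) denotes the parents of X_i in the network's DAG; compilation-based inference precomputes a representation of this factorization so that the effect of evidence can be read off efficiently.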

5. Qualitative Combination of Independence Models
Marco Baioletti, Davide Petturiti, Barbara Vantaggi

Abstract
We deal with the problem of combining sets of independence statements coming from different experts. It is known that the independence model induced by a strictly positive probability distribution has a graphoid structure, but the explicit computation and storage of the closure (w.r.t. the graphoid properties) of a set of independence statements is a computationally hard problem. For this reason, we rely on a compact symbolic representation of the closure, called the fast closure, and study three different strategies for combining two sets of independence statements, working directly on fast closures. We investigate when the complete DAG representability of the given models is preserved in the combined one.
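For reference, the graphoid properties mentioned above are the usual closure rules on conditional independence statements I(A, B | C) over pairwise disjoint sets of variables (standard definitions, recalled here rather than quoted from the paper):

    \[
    \begin{aligned}
    &\text{Symmetry:}      && I(A,B\mid C) \;\Rightarrow\; I(B,A\mid C)\\
    &\text{Decomposition:} && I(A,B\cup D\mid C) \;\Rightarrow\; I(A,B\mid C)\\
    &\text{Weak union:}    && I(A,B\cup D\mid C) \;\Rightarrow\; I(A,B\mid C\cup D)\\
    &\text{Contraction:}   && I(A,B\mid C)\ \wedge\ I(A,D\mid B\cup C) \;\Rightarrow\; I(A,B\cup D\mid C)\\
    &\text{Intersection:}  && I(A,B\mid C\cup D)\ \wedge\ I(A,D\mid B\cup C) \;\Rightarrow\; I(A,B\cup D\mid C)
    \end{aligned}
    \]

The closure of a set of statements under these rules is what the fast-closure representation stores compactly.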

6. A Case Study on the Application of Probabilistic Conditional Modelling and Reasoning to Clinical Patient Data in Neurosurgery
Christoph Beierle, Marc Finthammer, Nico Potyka, Julian Varghese, Gabriele Kern-Isberner

Abstract
We present a case study of applying probabilistic logic to the analysis of clinical patient data in neurosurgery. Probabilistic conditionals are used to build a knowledge base for modelling and representing clinical brain tumor data and the expert knowledge of physicians working in this area. The semantics of a knowledge base consisting of probabilistic conditionals is defined by employing the principle of maximum entropy, which chooses, among all probability distributions satisfying the conditionals, the one that is as unbiased as possible. For computing the maximum entropy distribution we use the MEcore system, which additionally provides a series of knowledge management operations such as revising, updating and querying a knowledge base. The use of the obtained knowledge base is illustrated by means of MEcore's knowledge management operations.
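The selection principle used here can be written compactly (standard maximum entropy formulation, not a quotation from the paper): for a knowledge base KB of probabilistic conditionals (B_i | A_i)[x_i], the chosen model is

    \[
    P^{*} \;=\; \arg\max_{P \,\models\, KB}\; -\sum_{\omega} P(\omega)\,\log P(\omega),
    \qquad\text{where } P \models (B\mid A)[x] \ \text{ iff } \ P(A\wedge B) = x \cdot P(A),
    \]

i.e. among all distributions satisfying the conditionals, the least biased (maximum entropy) one is selected.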

7. Causal Belief Networks: Handling Uncertain Interventions
Imen Boukhris, Salem Benferhat, Zied Elouedi

Abstract
Eliciting the cause of an event is easier if an agent can directly intervene on some variables by forcing them to take a specific value. The state of the target variable is then totally dependent on this external action and independent of its original causes. However, in real-world applications, performing such perfect interventions is not always feasible. An intervention can be uncertain in the sense that its occurrence itself may be uncertain. It can also have uncertain consequences, meaning that it may not succeed in forcing its target into one specific value. In this paper, we use belief function theory to handle uncertain interventions that may have uncertain consequences. Augmented causal belief networks are used to model uncertain interventions.

8. On Semantics of Inference in Bayesian Networks
Cory J. Butz, Wen Yan, Anders L. Madsen

Abstract
An algorithm called SI has recently been proposed for determining the semantics of the intermediate factors constructed during exact inference in discrete Bayesian networks. In this paper, we establish the soundness and completeness of SI. We also suggest an alternative version of SI, one that is perhaps more intuitive, as it is a simpler, graphical approach to deciding semantics.

9. Evaluating Asymmetric Decision Problems with Binary Constraint Trees
Rafael Cabañas, Manuel Gómez-Olmedo, Andrés Cano

Abstract
This paper proposes the use of binary constraint trees in order to represent and evaluate asymmetric decision problems with Influence Diagrams (IDs). Constraint rules are used to represent the asymmetries between the variables of the ID. These rules and the potentials involved in the ID are represented using binary trees. The application of these rules can reduce the size of the potentials of the ID, and as a consequence the efficiency of the inference algorithms is improved.

10. On the Equivalence between Logic Programming Semantics and Argumentation Semantics
Martin Caminada, Samy Sá, João Alcântara

Abstract
In this paper, we re-examine the connection between formal argumentation and logic programming from the perspective of semantics. We note that one particular translation from logic programs to instantiated argumentation (the one described by Wu, Caminada and Gabbay) can serve as a basis for describing various equivalences between logic programming semantics and argumentation semantics. In particular, we are able to provide a formal connection between regular semantics for logic programming and preferred semantics for formal argumentation. We also show that there exist logic programming semantics (L-stable semantics) that cannot be captured by any abstract argumentation semantics.

11. A Fuzzy-Rough Data Pre-processing Approach for the Dendritic Cell Classifier
Zeineb Chelly, Zied Elouedi

Abstract
The Dendritic Cell Algorithm (DCA) is an immune-inspired classification algorithm based on the behavior of natural dendritic cells. The DCA performance relies on its data pre-processing phase, which is based on the Principal Component Analysis (PCA) statistical method. However, using PCA presents a limitation, as it destroys the underlying semantics of the features after reduction. One possible solution to overcome this limitation was the application of Rough Set Theory (RST) in the DCA data pre-processing phase; still, the resulting rough DCA approach suffers from information loss, as data must be discretized beforehand. Thus, the aim of this paper is to develop a new DCA data pre-processing method based on Fuzzy Rough Set Theory (FRST), which allows dealing with real-valued data without prior quantization. In this new fuzzy-rough model, the DCA data pre-processing phase is based on FRST concepts, mainly the fuzzy lower and fuzzy upper approximations. Results show that applying FRST, instead of PCA and RST, to the DCA is more convenient for data pre-processing, yielding much better performance in terms of accuracy.
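For readers unfamiliar with FRST, the fuzzy lower and upper approximations used in the new pre-processing phase have the following standard form (recalled here in their usual generality; the paper may fix a particular implicator and t-norm): for a fuzzy relation R on the universe U and a fuzzy set A,

    \[
    (R\!\downarrow\! A)(x) \;=\; \inf_{y\in U} \mathcal{I}\bigl(R(x,y),\,A(y)\bigr),
    \qquad
    (R\!\uparrow\! A)(x) \;=\; \sup_{y\in U} \mathcal{T}\bigl(R(x,y),\,A(y)\bigr),
    \]

with \mathcal{T} a t-norm and \mathcal{I} an implicator; because R acts directly on real-valued attributes, no prior discretization of the data is required.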

12. Compiling Probabilistic Graphical Models Using Sentential Decision Diagrams
Arthur Choi, Doga Kisa, Adnan Darwiche

Abstract
Knowledge compilation is a powerful approach to exact inference in probabilistic graphical models, which is able to effectively exploit determinism and context-specific independence, allowing it to scale to highly connected models that are otherwise infeasible using more traditional methods (based on treewidth alone). Previous approaches were based on performing two steps: encode a model into CNF, then compile the CNF into an equivalent but more tractable representation (d-DNNF), where exact inference reduces to weighted model counting. In this paper, we investigate a bottom-up approach, enabled by a recently proposed representation, the Sentential Decision Diagram (SDD). We describe a novel and efficient way to encode the factors of a given model directly as SDDs, bypassing the CNF representation. To compile a given model, it now suffices to conjoin the SDD representations of its factors, using an apply operator, which d-DNNFs lack. Empirically, we find that our simpler approach to knowledge compilation is as effective as those based on d-DNNFs, and at times orders of magnitude faster.
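For context, the weighted model counting reduction mentioned above takes its usual form (standard in the knowledge compilation literature, not specific to this paper): if \Delta is the Boolean encoding of the model and W assigns each network parameter to its parameter literal and 1 to the indicator literals, then the probability of evidence e is

    \[
    P(e) \;=\; \sum_{\omega \,\models\, \Delta\wedge e}\ \prod_{\ell\,:\ \omega \models \ell} W(\ell),
    \]

and compiling \Delta, here bottom-up by conjoining per-factor SDDs, is what makes this sum tractable to evaluate.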

13. Independence in Possibility Theory under Different Triangular Norms
Giulianella Coletti, Davide Petturiti, Barbara Vantaggi

Abstract
In this paper we consider coherent T-conditional possibility assessments, with T a continuous t-norm, and introduce for them a concept of independence already studied for the minimum and strict t-norms. As a significant particular case of T-conditional possibility, we explicitly consider the conditional possibility obtained through the minimum specificity principle, introduced by Dubois and Prade.
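As background (standard definitions recalled for the reader), T-conditional possibility is tied to the equation below, and the conditioning obtained from the minimum specificity principle is its best-known instance for T = min:

    \[
    \Pi(A\wedge B) \;=\; T\bigl(\Pi(B\mid A),\,\Pi(A)\bigr),
    \qquad
    \Pi_{\min}(B\mid A) \;=\;
    \begin{cases}
    1 & \text{if } \Pi(A\wedge B)=\Pi(A),\\[2pt]
    \Pi(A\wedge B) & \text{otherwise},
    \end{cases}
    \]

the minimum specificity principle amounting to selecting the greatest solution of the first equation.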

14. Probabilistic Satisfiability and Coherence Checking through Integer Programming
Fabio Gagliardi Cozman, Lucas Fargoni di Ianni

Abstract
This paper presents algorithms, based on integer programming, for both probabilistic satisfiability and coherence checking. That is, we consider probabilistic assessments both for standard probability measures (Kolmogorovian setup) and for full conditional measures (de Finettian coherence setup), and in both cases verify satisfiability/coherence using integer programming. We present an empirical evaluation of our method, with evidence of phase transitions.
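In its classical (Kolmogorovian) form, the question being decided can be stated as a feasibility problem over truth assignments (the textbook PSAT formulation; the paper's integer-programming encoding builds on it): given sentences \varphi_1, ..., \varphi_k with assessments p_1, ..., p_k, decide whether there exists a distribution \pi over the truth assignments \omega such that

    \[
    \sum_{\omega \,\models\, \varphi_i} \pi(\omega) \;=\; p_i \quad (i=1,\dots,k),
    \qquad
    \sum_{\omega} \pi(\omega) \;=\; 1,
    \qquad
    \pi(\omega) \;\ge\; 0 .
    \]

The coherence-checking variant asks the analogous question for full conditional measures.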

15. Extreme Lower Previsions and Minkowski Indecomposability
Jasper De Bock, Gert de Cooman

Abstract
Coherent lower previsions constitute a convex set that is closed and compact under the topology of point-wise convergence, and Maaß [2] has shown that any coherent lower prevision can be written as a 'countably additive convex combination' of the extreme points of this set. We show that when the possibility space has a finite number n of elements, these extreme points are either degenerate precise probabilities, or in a one-to-one correspondence with the (Minkowski) indecomposable compact convex subsets of a Euclidean space whose dimension is determined by n.

16. Qualitative Capacities as Imprecise Possibilities
Didier Dubois, Henri Prade, Agnès Rico

Abstract
This paper studies the structure of qualitative capacities, that is, monotonic set-functions ranging on a finite totally ordered scale equipped with an order-reversing map. These set-functions correspond to general representations of uncertainty, as well as to importance levels of groups of criteria in multicriteria decision-making. More specifically, we investigate whether these qualitative set-functions can be viewed as classes of simpler set-functions, typically possibility measures, paralleling the situation of quantitative capacities with respect to imprecise probability theory. We show that any capacity is characterized by a non-empty class of possibility measures having the structure of an upper semi-lattice. The lower bounds of this class are enough to reconstruct the capacity, and their number is characteristic of its complexity. We introduce a sequence of axioms generalizing the maxitivity property of possibility measures, related to the number of possibility measures needed for this reconstruction. In the Boolean case, capacities are closely related to non-regular multi-source modal logics and their neighborhood semantics can be described in terms
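For concreteness (standard notions, recalled rather than quoted): a qualitative capacity is a set function \gamma : 2^S \to L that is monotone with respect to inclusion and satisfies \gamma(\emptyset) = 0_L and \gamma(S) = 1_L, where L is the finite ordered scale; possibility measures are the special case obeying the maxitivity axiom that the paper generalizes,

    \[ \Pi(A\cup B) \;=\; \max\bigl(\Pi(A),\,\Pi(B)\bigr) \qquad \text{for all } A,B\subseteq S . \]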

17. Conditional Preference Nets and Possibilistic Logic
Didier Dubois, Henri Prade, Fayçal Touazi

Abstract
CP-nets (Conditional preference networks) are a well-known compact graphical representation of preferences in Artificial Intelligence that can be viewed as a qualitative counterpart to Bayesian nets. In the case of binary attributes, a CP-net captures a specific partial ordering over Boolean interpretations, where strict preference statements are defined between interpretations that differ by a single flip of an attribute value. It respects the preferential independence encoded by the ceteris paribus property. The popularity of this approach has motivated comparisons with other preference representation settings such as possibilistic logic. In this paper, we focus our discussion on the possibilistic representation of CP-nets and on the question of whether it is possible to capture the CP-net partial order over interpretations by means of a possibilistic knowledge base and a suitable semantics. We show that several results in the literature on the alleged faithful representation of CP-nets by possibilistic bases are questionable. To this aim we discuss some canonical examples of CP-net topologies where the considered possibilistic approach fails to exactly capture the partial order induced by the CP-net.
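A minimal illustrative example (ours, not taken from the paper) of the single-flip ordering: take two Boolean attributes A and B with the statements a \succ \bar a and, conditionally, a : b \succ \bar b and \bar a : \bar b \succ b. Ceteris paribus comparisons between interpretations differing on one attribute then yield the partial order

    \[ ab \succ a\bar b, \qquad ab \succ \bar a b, \qquad a\bar b \succ \bar a\bar b, \qquad \bar a\bar b \succ \bar a b , \]

and the question examined above is whether a possibilistic base, under a suitable semantics, can induce exactly this partial order over the four interpretations.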

18. Many-Valued Modal Logic and Regular Equivalences in Weighted Social Networks
Tuan-Fang Fan, Churn-Jung Liau

Abstract
Social network analysis is a methodology used extensively in the social sciences. While classical social networks can only represent qualitative relationships between actors, weighted social networks can describe the degrees of connection between actors. In classical social network analysis, regular equivalence is used to capture the similarity between actors based on their linking patterns with other actors. Specifically, two actors are regularly equivalent if they are equally related to equivalent others. The definition of regular equivalence has been extended to regular similarity and generalized regular equivalence for weighted social networks. Recently, it was shown that social positions based on regular equivalence can be syntactically expressed as well-formed formulas in a kind of modal logic; thus, actors occupying the same social position based on regular equivalence satisfy the same set of modal formulas. In this paper, we present analogous results for regular similarity and generalized regular equivalence based on many-valued modal logics.

19. Zero-Probability and Coherent Betting: A Logical Point of View
Tommaso Flaminio, Lluis Godo, Hykel Hosni

Abstract
The investigation reported in this paper aims at clarifying an important yet subtle distinction between (i) the logical objects on which measure-theoretic probability can be defined, and (ii) the interpretation of the resulting values as rational degrees of belief. Our central result can be stated informally as follows: whilst all subjective degrees of belief can be expressed in terms of a probability measure, the converse does not hold, since probability measures can be defined over linguistic objects which do not admit of a meaningful betting interpretation. The logical framework capable of expressing this allows us to put forward a precise formalisation of the de Finettian notion which lies at the heart of the Bayesian approach to uncertain reasoning.

20. Conditional Random Quantities and Iterated Conditioning in the Setting of Coherence
Angelo Gilio, Giuseppe Sanfilippo

Abstract
We consider conditional random quantities (c.r.q.'s) in the setting of coherence. Given a numerical r.q. X and a non-impossible event H, based on the betting scheme we represent the c.r.q. X|H as the unconditional r.q. XH + μH̄, where μ is the prevision assessed for X|H and H̄ denotes the negation of H. We develop some elements of an algebra of c.r.q.'s, by giving a condition under which two c.r.q.'s X|H and Y|K coincide. We show that such a conditional random quantity coincides with a suitable c.r.q., and we apply this representation to the Bayesian updating of probabilities, by also deepening some aspects of Bayes' formula. Then, we introduce a notion of iterated c.r.q. (X|B)|A, analyzing its relationship with X|AB. Our notion of iterated conditional cannot formalize Bayesian updating but has an economic rationale. Finally, we define coherence for prevision assessments on iterated c.r.q.'s and we give an illustrative example.