Subjective logic is a type of probabilistic logic that explicitly takes uncertainty and source trust into account. In general, subjective logic is suitable for modeling and analysing situations involving uncertainty and relatively unreliable sources. For example, it can be used for modeling and analysing trust networks and Bayesian networks.
Arguments in subjective logic are subjective opinions about state variables which can take values from a domain (also called a state space), where a state value can be thought of as a proposition which can be true or false. A binomial opinion applies to a binary state variable, and can be represented as a Beta PDF (probability density function). A multinomial opinion applies to a state variable with multiple possible values, and can be represented as a Dirichlet PDF. Through the correspondence between opinions and Beta/Dirichlet distributions, subjective logic provides an algebra for these functions. Opinions are also related to the belief representation in Dempster–Shafer belief theory.
A fundamental aspect of the human condition is that nobody can ever determine with absolute certainty whether a proposition about the world is true or false. In addition, whenever the truth of a proposition is expressed, it is always done by an individual, and it can never be considered to represent a general and objective belief. These philosophical ideas are directly reflected in the mathematical formalism of subjective logic.
Subjective opinions express subjective beliefs about the truth of state values/propositions with degrees of uncertainty, and can explicitly indicate the source of belief whenever required. An opinion is usually denoted as $\omega^{A}_{X}$, where $A$ is the source of the opinion and $X$ is the state variable to which the opinion applies. The variable $X$ can take values from a domain (also called a state space), e.g. denoted as $\mathbb{X}$. The values of a domain are assumed to be exhaustive and mutually disjoint, and sources are assumed to have a common semantic interpretation of a domain. The source and variable are attributes of an opinion. Indication of the source can be omitted whenever irrelevant.
Let $x$ be a value in a binary domain $\mathbb{X} = \{x, \bar{x}\}$. A binomial opinion about the truth of value $x$ is the ordered quadruple $\omega_x = (b_x, d_x, u_x, a_x)$, where:

| Parameter | Name | Meaning |
|---|---|---|
| $b_x$ | belief mass | the belief that $x$ is true |
| $d_x$ | disbelief mass | the belief that $x$ is false |
| $u_x$ | uncertainty mass | the amount of uncommitted belief |
| $a_x$ | base rate | the prior probability of $x$ in the absence of belief or disbelief |

These components satisfy $b_x + d_x + u_x = 1$ and $b_x, d_x, u_x, a_x \in [0, 1]$. The characteristics of various opinion classes are listed below.
| Opinion class | Condition | Characterisation |
|---|---|---|
| Absolute opinion | $b_x = 1$ | equivalent to Boolean TRUE |
| Absolute opinion | $d_x = 1$ | equivalent to Boolean FALSE |
| Dogmatic opinion | $u_x = 0$ | equivalent to a traditional probability |
| Uncertain opinion | $0 < u_x < 1$ | expresses degrees of uncertainty |
| Vacuous opinion | $u_x = 1$ | expresses total uncertainty |
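As a sketch, the opinion classes above can be distinguished programmatically (the function name and tolerance handling are illustrative, not part of any standard library):

```python
def classify_binomial_opinion(b, d, u, tol=1e-12):
    """Classify a binomial opinion (b, d, u) into the classes listed above.

    Illustrative helper: checks the defining condition of each class
    in order, using a small tolerance for floating-point comparisons.
    """
    assert abs(b + d + u - 1.0) < 1e-9, "belief masses must sum to 1"
    if abs(b - 1.0) < tol:
        return "absolute (TRUE)"
    if abs(d - 1.0) < tol:
        return "absolute (FALSE)"
    if abs(u) < tol:
        return "dogmatic"
    if abs(u - 1.0) < tol:
        return "vacuous"
    return "uncertain"
```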
The projected probability of a binomial opinion is defined as $\mathrm{P}(x) = b_x + a_x u_x$.
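Since the projected probability is simply $b_x + a_x u_x$, it can be computed directly from the opinion components; a minimal sketch:

```python
def projected_probability(b, d, u, a):
    """Projected probability P(x) = b + a*u of a binomial opinion.

    The uncertainty mass u is distributed according to the base rate a;
    the disbelief mass d does not enter the formula.
    """
    return b + a * u
```

For example, the opinion $(0.4, 0.1, 0.5, 0.6)$ has projected probability $0.4 + 0.6 \cdot 0.5 = 0.7$, and a vacuous opinion projects to its base rate.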
Binomial opinions can be represented on an equilateral triangle as shown below. A point inside the triangle represents a $(b_x, d_x, u_x)$ triple. The $b$, $d$ and $u$ axes run from one edge to the opposite vertex, indicated by the Belief, Disbelief or Uncertainty label. For example, a strong positive opinion is represented by a point towards the bottom right Belief vertex. The base rate, also called the prior probability, is shown as a red pointer along the base line, and the projected probability $\mathrm{P}(x)$ is formed by projecting the opinion onto the base, parallel to the base rate projector line. Opinions about three values/propositions X, Y and Z are visualized on the triangle to the left, and their equivalent Beta PDFs are visualized on the plots to the right. The numerical values and verbal qualitative descriptions of each opinion are also shown.
The Beta PDF is normally denoted as $\mathrm{Beta}(p_x; \alpha, \beta)$, where $\alpha$ and $\beta$ are its two strength parameters. The Beta PDF of a binomial opinion $\omega_x = (b_x, d_x, u_x, a_x)$ is the function $\mathrm{Beta}(p_x; \alpha, \beta)$ with strength parameters $\alpha = \frac{2 b_x}{u_x} + 2 a_x$ and $\beta = \frac{2 d_x}{u_x} + 2(1 - a_x)$, where $W = 2$ is the non-informative prior weight.
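The opinion-to-Beta mapping can be sketched as follows, assuming the common non-informative prior weight $W = 2$ and a non-dogmatic opinion ($u_x > 0$); the function name is my own:

```python
def opinion_to_beta(b, d, u, a, W=2.0):
    """Map a binomial opinion (b, d, u, a) to Beta strength parameters.

    Assumes the common mapping with prior weight W = 2 and u > 0;
    a dogmatic opinion (u = 0) corresponds to a degenerate (spike) PDF.
    """
    if u <= 0:
        raise ValueError("dogmatic opinion: Beta PDF degenerates to a spike")
    r = W * b / u  # equivalent amount of positive evidence
    s = W * d / u  # equivalent amount of negative evidence
    return r + W * a, s + W * (1 - a)
```

As a sanity check, the vacuous opinion $(0, 0, 1, 0.5)$ maps to $\mathrm{Beta}(p; 1, 1)$, the uniform distribution, and the mean $\alpha / (\alpha + \beta)$ of the mapped Beta PDF equals the opinion's projected probability.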
Let $X$ be a variable which can take values $x \in \mathbb{X}$. A multinomial opinion over $X$ is the composite tuple $\omega_X = (b_X, u_X, a_X)$, where $b_X$ is a belief mass distribution over the possible values of $X$, $u_X$ is the uncertainty mass, and $a_X$ is a base rate distribution over the possible values of $X$. These parameters satisfy $u_X + \sum_{x \in \mathbb{X}} b_X(x) = 1$ and $\sum_{x \in \mathbb{X}} a_X(x) = 1$, as well as $b_X(x), u_X, a_X(x) \in [0, 1]$.
Visualising multinomial opinions can be challenging. Trinomial opinions can be simply visualised as points inside a tetrahedron. Opinions with dimensions larger than trinomial do not lend themselves to simple visualisation.
Dirichlet PDFs are normally denoted as $\mathrm{Dir}(p_X; \alpha_X)$, where $p_X$ is a probability distribution over the values of $X$, and $\alpha_X$ are the strength parameters. The Dirichlet PDF of a multinomial opinion $\omega_X = (b_X, u_X, a_X)$ is the function $\mathrm{Dir}(p_X; \alpha_X)$ where the strength parameters are given by $\alpha_X(x) = \frac{W\, b_X(x)}{u_X} + W\, a_X(x)$, with non-informative prior weight $W = 2$.
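Analogously to the binomial case, the opinion-to-Dirichlet mapping can be sketched as follows, again assuming $W = 2$ and $u_X > 0$ (the helper name and dict-based representation are my own):

```python
def opinion_to_dirichlet(belief, u, base_rate, W=2.0):
    """Map a multinomial opinion (b_X, u_X, a_X) to Dirichlet strengths.

    belief and base_rate are dicts over the domain values.
    Assumes u > 0 and the non-informative prior weight W = 2.
    """
    assert abs(u + sum(belief.values()) - 1.0) < 1e-9
    assert abs(sum(base_rate.values()) - 1.0) < 1e-9
    if u <= 0:
        raise ValueError("dogmatic opinion: Dirichlet PDF degenerates")
    return {x: W * belief[x] / u + W * base_rate[x] for x in belief}
```

The mean of the resulting Dirichlet PDF, $\alpha_X(x) / \sum_{x'} \alpha_X(x')$, equals the projected probability $b_X(x) + a_X(x)\, u_X$ of each value.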
Most operators in the table below are generalisations of binary logic and probability operators. For example, addition is simply a generalisation of addition of probabilities. Some operators are only meaningful for combining binomial opinions, and some also apply to multinomial opinions. Most operators are binary, but complement is unary, and abduction is ternary. See the referenced publications for mathematical details of each operator.
| Subjective logic operator | Operator notation | Propositional/binary logic operator |
|---|---|---|
| Multiplication | $\omega_{x \wedge y} = \omega_x \cdot \omega_y$ | Conjunction / AND |
| Division | $\omega_{x \,\overline{\wedge}\, y} = \omega_x / \omega_y$ | Unconjunction / UN-AND |
| Comultiplication | $\omega_{x \vee y} = \omega_x \sqcup \omega_y$ | Disjunction / OR |
| Codivision | $\omega_{x \,\overline{\vee}\, y} = \omega_x \,\overline{\sqcup}\, \omega_y$ | Undisjunction / UN-OR |
| Subjective Bayes' theorem | $\omega_{x \,\tilde{\vert}\, y}$ | Contraposition |
| Transitivity / discounting | $\omega^{A;B}_X = \omega^A_B \otimes \omega^B_X$ | n.a. |
| Cumulative fusion | $\omega^{A \diamond B}_X = \omega^A_X \oplus \omega^B_X$ | n.a. |
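As an illustration of one operator, binomial multiplication (the AND generalisation) can be sketched as follows, following the formulas of Jøsang and McAnally (2004); the function name is my own:

```python
def multiply(op_x, op_y):
    """Binomial multiplication ω_x · ω_y (generalised AND).

    Both arguments are (b, d, u, a) quadruples; assumes independent
    variables and a_x * a_y != 1.
    """
    bx, dx, ux, ax = op_x
    by, dy, uy, ay = op_y
    k = 1.0 - ax * ay
    b = bx * by + ((1 - ax) * ay * bx * uy + ax * (1 - ay) * ux * by) / k
    d = dx + dy - dx * dy  # disbelief combines like P(not-x or not-y)
    u = ux * uy + ((1 - ay) * bx * uy + (1 - ax) * ux * by) / k
    return (b, d, u, ax * ay)
```

For dogmatic arguments this reduces to the probability product, and in general the projected probability of the product equals the product of the projected probabilities of the arguments.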
Transitive source combination can be denoted in a compact or expanded form. For example, the transitive trust path from analyst/source $A$ via source $B$ to the variable $X$ can be denoted as $[A; B, X]$ in compact form, or as $[A, B] : [B, X]$ in expanded form. Here, $[A, B]$ expresses that $A$ has some trust/distrust in source $B$, whereas $[B, X]$ expresses that $B$ has an opinion about the state of variable $X$ which is given as advice to $A$. The expanded form is the most general, and corresponds directly to the way subjective logic expressions are formed with operators.
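The transitivity/discounting operator, in its common uncertainty-favouring form, can be sketched as follows (the function name is my own; Jøsang 2016 describes further variants):

```python
def discount(trust, advice):
    """Trust discounting: combine A's trust in B with B's advice about X.

    trust  = (b, d, u, a): A's opinion about source B
    advice = (b, d, u, a): B's opinion about variable X
    In this uncertainty-favouring form, both distrust and uncertainty
    about the source turn into uncertainty about the advised variable.
    """
    b1, d1, u1, _ = trust
    b2, d2, u2, a2 = advice
    return (b1 * b2, b1 * d2, d1 + u1 + b1 * u2, a2)
```

As a sanity check, full trust passes the advice through unchanged, while full distrust yields a vacuous opinion about the variable.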
In case the argument opinions are equivalent to Boolean TRUE or FALSE, the result of any subjective logic operator is always equal to that of the corresponding propositional/binary logic operator. Similarly, when the argument opinions are equivalent to traditional probabilities, the result of any subjective logic operator is always equal to that of the corresponding probability operator (when it exists).
In case the argument opinions contain degrees of uncertainty, the operators involving multiplication and division (including deduction, abduction and Bayes' theorem) will produce derived opinions that always have correct projected probability but possibly with approximate variance when seen as Beta/Dirichlet PDFs. All other operators produce opinions where the projected probabilities and the variance are always analytically correct.
Different logic formulas that traditionally are equivalent in propositional logic do not necessarily have equal opinions. For example, $\omega_{x \wedge (y \vee z)} \neq \omega_{(x \wedge y) \vee (x \wedge z)}$ in general, although the distributivity of conjunction over disjunction, expressed as $x \wedge (y \vee z) \Leftrightarrow (x \wedge y) \vee (x \wedge z)$, holds in binary propositional logic. This is no surprise, as the corresponding probability operators are also non-distributive. However, multiplication is distributive over addition, as expressed by $\omega_{x \wedge (y \cup z)} = \omega_{(x \wedge y) \cup (x \wedge z)}$. De Morgan's laws are also satisfied, as e.g. expressed by $\omega_{\overline{x \wedge y}} = \omega_{\bar{x} \vee \bar{y}}$.
Subjective logic gives very efficient computation of mathematically complex models. This is possible by approximating the analytically correct functions whenever needed. While it is relatively simple to analytically multiply two Beta PDFs in the form of a joint Beta PDF, anything more complex than that quickly becomes intractable. When combining two Beta PDFs with some operator/connective, the analytical result is not always a Beta PDF and can involve hypergeometric series. In such cases, subjective logic always approximates the result as an opinion that is equivalent to a Beta PDF.
Subjective logic is applicable when the situation to be analysed is characterised by considerable uncertainty and incomplete knowledge. In this way, subjective logic becomes a probabilistic logic for uncertain probabilities. The advantage is that uncertainty is preserved throughout the analysis and is made explicit in the results so that it is possible to distinguish between certain and uncertain conclusions.
Subjective trust networks
Subjective trust networks can be modelled with a combination of the transitivity and fusion operators. Let $[A, B]$ express the referral trust edge from $A$ to $B$, and let $[B, X]$ express the belief edge from $B$ to $X$. A subjective trust network can for example be expressed as $([A, B] : [B, X]) \diamond ([A, C] : [C, X])$, as illustrated in the figure below.
The indices 1, 2 and 3 indicate the chronological order in which the trust edges and advices are formed. Thus, given the set of trust edges with index 1, the origin trustor $A$ receives advice from $B$ and $C$, and is thereby able to derive belief in variable $X$. By expressing each trust edge and belief edge as an opinion, it is possible for $A$ to derive belief in $X$, expressed as $\omega^A_X = (\omega^A_B \otimes \omega^B_X) \oplus (\omega^A_C \otimes \omega^C_X)$.
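The fusion step of such a derivation can be sketched with the binomial cumulative fusion operator, here assuming non-dogmatic arguments that share a base rate (the input numbers are hypothetical discounted opinions, not from the source):

```python
def cumulative_fuse(op1, op2):
    """Cumulative fusion of two binomial opinions (non-dogmatic case).

    Sketch assuming both opinions have u > 0 and share the same base
    rate; see Jøsang (2016) for the general and dogmatic cases.
    """
    b1, d1, u1, a = op1
    b2, d2, u2, _ = op2
    k = u1 + u2 - u1 * u2
    return ((b1 * u2 + b2 * u1) / k, (d1 * u2 + d2 * u1) / k, u1 * u2 / k, a)

# Hypothetical opinions about X that the trustor has already derived
# via two advisors (i.e. the outputs of the discounting step):
via_B = (0.7, 0.1, 0.2, 0.5)
via_C = (0.4, 0.2, 0.4, 0.5)
omega_A_X = cumulative_fuse(via_B, via_C)
```

A vacuous opinion is the neutral element of this operator, and fusing two opinions always reduces (or preserves) the uncertainty mass, reflecting accumulated evidence.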
Trust networks can express the reliability of information sources, and can be used to determine subjective opinions about variables that the sources provide information about.
Subjective Bayesian networks
In the Bayesian network below, $X$ and $Y$ are parent variables and $Z$ is the child variable. The analyst must learn the set of joint conditional opinions $\omega_{Z \mid XY}$ in order to apply the deduction operator and derive the marginal opinion $\omega_{Z}$ on the variable $Z$. The conditional opinions express a conditional relationship between the parent variables and the child variable.
The deduced opinion is computed as $\omega_{Z \| XY} = \omega_{XY} \circledcirc \omega_{Z \mid XY}$. The joint evidence opinion $\omega_{XY}$ can be computed as the product of independent evidence opinions on $X$ and $Y$, or as the joint product of partially dependent evidence opinions.
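At the level of projected probabilities, deduction reduces to the law of total probability; a minimal sketch of that part (the full operator additionally propagates belief and uncertainty masses, which this sketch omits):

```python
def deduce_projected(p_parents, p_child_given_parents):
    """Projected-probability part of deduction: P(z) = sum_xy P(x,y) P(z|x,y).

    p_parents: dict mapping parent-value pairs (x, y) to joint probabilities
    p_child_given_parents: dict mapping (x, y) to P(z | x, y)
    """
    return sum(p * p_child_given_parents[xy] for xy, p in p_parents.items())
```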
The combination of a subjective trust network and a subjective Bayesian network is a subjective network. The subjective trust network can be used to obtain from various sources the opinions to be used as input opinions to the subjective Bayesian network, as illustrated in the figure below.
Traditional Bayesian networks typically do not take the reliability of the sources into account. In subjective networks, trust in the sources is explicitly taken into account.
- A. Jøsang. Subjective Logic: A formalism for reasoning under uncertainty. Springer Verlag, 2016
- A. Jøsang. Artificial Reasoning with Subjective Logic. Proceedings of the Second Australian Workshop on Commonsense Reasoning, Perth, 1997.
- A. Jøsang. A Logic for Uncertain Probabilities. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 9(3), pp. 279–311, June 2001.
- A. Jøsang. Probabilistic Logic Under Uncertainty. Proceedings of Computing: The Australasian Theory Symposium (CATS'07), Ballarat, January 2007.
- D. McAnally and A. Jøsang. Addition and Subtraction of Beliefs. Proceedings of the conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU2004), Perugia, July, 2004.
- A. Jøsang, and D. McAnally. Multiplication and Comultiplication of Beliefs. International Journal of Approximate Reasoning, 38/1, pp. 19–51, 2004.
- A. Jøsang. Generalising Bayes' Theorem in Subjective Logic. 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI 2016), Baden-Baden, Germany, 2016.
- Subjective Logic by Audun Jøsang
- Subjective Logic Experimentation Framework based on "Subjective Logic Operators in Trust Assessment: An Empirical Study" by F. Cerutti, L. M. Kaplan, T. J. Norman, N. Oren, and A. Toniolo
- Decision Point AI built on Subjective Logic and Bayesian Networks