Fuzzy Logic Proofs II: Systems of Logic

In this post I will discuss how fuzzy logic works as opposed to conventional logic. This will lay the groundwork for our discussion of fuzzy logic proofs later. Remember, the motivation for looking at these kinds of proofs is that they provide us with a means of dealing with uncertainty. Perhaps there are other motivations, but I will not mention them here.

I assume basic knowledge of first order logic (and, or, not, only if, ...) and pretty much nothing else. Don't worry, I won't get all academic on you. We'll save that for papers and articles.

Now there's a lot of theory and academic mumbo jumbo surrounding fuzzy logic, but practically speaking anyone can do it. A "fuzzy logic system" basically describes how to deal with "a & b (a AND b)", "a | b (a OR b)", and "~a (NOT a)" when we are, say, only 80% certain a is true and 50% certain b is true. There are two common ways of doing so that I will discuss here.

Defining the last operation mentioned above is the easiest. No matter what kind of logic we use, "~a", the logical complement of a, is always going to be "1 - a". This makes sense if we look at conventional logic, taking 1 to be "True" and 0 to be "False":

~1 = 1 - 1 = 0   (NOT True is False)
~0 = 1 - 0 = 1   (NOT False is True)
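
If it helps to see the complement rule in code, here is a minimal sketch in Python (fuzzy_not is just an illustrative name I made up, not anything standard):

    # Logical complement: the certainty of "~a" is 1 minus the certainty of a.
    def fuzzy_not(a):
        return 1 - a

    print(fuzzy_not(1))     # 0 -- NOT True is False
    print(fuzzy_not(0))     # 1 -- NOT False is True
    print(fuzzy_not(0.75))  # 0.25 -- 75% certain of a means 25% certain of ~a
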
Fuzzy Logic: Conservative Flavor

The first is the most common. For "a & b", where a = .5 (50% certain) and b = .8 (80% certain), it says this: "Well, a is only 50% true, and the whole statement 'a & b' can't be any more true than its least true part, so the most we can say is that 'a & b' is 50% true." This conservative attitude contributes to its popularity, and the fact that we take the smallest certainty value to be the value of the whole statement gives it a name: minimization fuzzy logic. So the certainty of the expression "a & b" is the smaller of the certainties of a and b. You may note that this works for conventional logic:

1 & 1 = min(1, 1) = 1
1 & 0 = min(1, 0) = 0
0 & 1 = min(0, 1) = 0
0 & 0 = min(0, 0) = 0

As for "a | b", we use De Morgan's laws to define the meaning of disjunction. Since "a | b" is equivalent to "~(~a & ~b)", we can just use subtraction and "min" to work out our OR operation: 1 - min(1 - a, 1 - b), which simplifies to max(a, b). In other words, the certainty of the statement "a | b" is the greater of a and b. This matches our conservative intuition: "We know a is 50% true and b is 80% true, so 'a | b' must be at least 80% true." This is again confirmed by using our conventional "absolute truth" model:

1 | 1 = max(1, 1) = 1
1 | 0 = max(1, 0) = 1
0 | 1 = max(0, 1) = 1
0 | 0 = max(0, 0) = 0
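
To make the minimization flavor concrete, here is a small sketch in Python of how I might code it up (the names min_and and min_or are just mine for illustration, not anything standard):

    # Conjunction: the whole statement is only as certain as its least certain part.
    def min_and(a, b):
        return min(a, b)

    # Disjunction via De Morgan: 1 - min(1 - a, 1 - b), which simplifies to max(a, b).
    def min_or(a, b):
        return max(a, b)

    print(min_and(0.5, 0.8))  # 0.5 -- the "a & b" example above
    print(min_or(0.5, 0.8))   # 0.8 -- the "a | b" example above

    # And it agrees with conventional logic on the values 0 and 1:
    print(min_and(1, 0), min_or(1, 0))  # 0 1
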
Furthermore, in this system of fuzzy logic (and many others), the conventional laws of association, commutation, and transitivity still apply, as well as our faithful friend, De Morgan's laws.

Although it is not generally true of fuzzy logics (though it is always true in conventional logic), minimization logic follows the law of idempotence; in other words,

"a & a" = min(a, a) = a
"a | a" = max(a, a) = a

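A quick numeric check of that, using Python's built-in min and max (which are really all the minimization flavor needs):

    a = 0.7
    print(min(a, a))  # 0.7 -- "a & a" has the same certainty as a
    print(max(a, a))  # 0.7 -- and so does "a | a"
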
Fuzzy Logic: Liberal Flavor

No pun intended, by the way, in the naming of these flavors. They're both important for our purposes, and the usefulness and limits of each will be demonstrated.

All the cool stuff except idempotence (association, commutation, transitivity, etc.) applies to this logic as well, and like the last flavor it agrees with conventional logic when we use only the values 0 and 1.

The next flavor of logic is actually defined not by its conjunction operation (AND), but by its disjunction (OR). For example, let's say that we know x is 40% true and y is 30% true. To figure out how to assign a certainty value to the phrase 'x | y', we simply add their certainties together; therefore, 'x | y' has a certainty value of 70%.

Wait! What if y were 70% certain? Then 'x | y' would be 110% certain!?!?

Nah. We'll just say that if the certainties of x and y sum to more than 1 (or just barely make it to 1), the couple has "made the cut," and their combined certainties are more than enough to make us certain that 'x | y' is absolutely true. Therefore, if the certainties of x and y are 40% and 80%, 'x | y' is still 1. That's why this logic system is referred to as bounded addition: we add the certainties, but never let the sum exceed 1. We see that this flavor of logic is quite liberal - it takes the certainty of BOTH values into account in the value of the whole.

Using De Morgan's laws, we use NOT and OR to find AND. Since 'x & y' is equivalent to '~(~x | ~y)', we get 1 - min(1, (1 - x) + (1 - y)), which works out to x + y - 1. Conjunction (AND) here has the same kind of "make the cut" attitude: "If x's and y's certainties sum to more than one, that's significant. Otherwise, we really can't say much about the certainty of 'x & y' as a statement." In other words: add the certainties of x and y together, subtract 1, and you have the value of 'x & y'. Again the question arises, "What if x and y are so uncertain that x + y - 1 comes out negative?" The answer is in the same spirit as last time. If x + y - 1 is less than 0, we'll just interpret that as meaning we are so uncertain of x and y that we have more than enough reason simply to assign 'x & y' a certainty of 0 - absolutely false.
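
Here is the same kind of Python sketch for the bounded-addition flavor (again, luk_or and luk_and are just names I made up for illustration):

    # Disjunction by bounded addition: sum the certainties, but never exceed 1.
    def luk_or(x, y):
        return min(1, x + y)

    # Conjunction via De Morgan: 1 - luk_or(1 - x, 1 - y), which works out to max(0, x + y - 1).
    def luk_and(x, y):
        return max(0, x + y - 1)

    print(luk_or(0.4, 0.3))    # 0.7 -- the 40% / 30% example (give or take float rounding)
    print(luk_or(0.4, 0.8))    # 1 -- the couple "made the cut"
    print(luk_and(0.4, 0.3))   # 0 -- the sum falls short of 1, so we call it false
    print(luk_and(0.75, 0.5))  # 0.25 -- enough combined certainty to say something positive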

This is also called Łukasiewicz (Loo-KAH-she-vich) logic, after the Polish logician who first invented this kind of many-valued logic. Yes, ladies and gentlemen - additive logic was the first fuzzy-style logic out there, invented in the early 20th century.

What makes this logic cooler than most other fuzzy logics is that Łukasiewicz logic obeys the laws of contradiction and excluded middle:
"a & ~a" = 0%, "a | ~a" = 100%.

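A line of algebra shows why: since ~a is 1 - a, "a & ~a" is max(0, a + (1 - a) - 1) = 0 and "a | ~a" is min(1, a + (1 - a)) = 1, no matter what a is. A quick check in Python (just the arithmetic, no special helpers needed):

    a = 0.375          # any certainty value works; this one is exact in floating point
    not_a = 1 - a      # 0.625
    print(max(0, a + not_a - 1))  # 0 -- contradiction: "a & ~a" is absolutely false
    print(min(1, a + not_a))      # 1 -- excluded middle: "a | ~a" is absolutely true
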
What? MOST fuzzy logic doesn't obey contradiction? How can people use minimization logic without it? How can MODUS PONENS even work without it?

Oh, but it can. I'll show you how next time. (I said in my first post that I'd save that stuff for my third post, not my second, but I think I'll just jump to the good stuff.)
