On 08 Feb 2014, at 22:05, LizR wrote:

On 8 February 2014 08:43, Bruno Marchal <marc...@ulb.ac.be> wrote:

On 07 Feb 2014, at 02:29, LizR wrote:

On 7 February 2014 09:14, Bruno Marchal <marc...@ulb.ac.be> wrote:

On 06 Feb 2014, at 07:39, LizR wrote:

<snip>
OK, having had a look at what you say below, let's have another go. Start from p -> q being equivalent to (~p V q)

That gives us ~p -> q equiv (p V q), and from the above ~p is (p -> f), so p V q is (p -> f) -> q, which I seem to remember is what you got. OK so far.

p & q --- well, p -> q is ~(p & ~q), so ~(p -> q) = (p & ~q) and ~(p -> ~q) = (p & q)

so ~(p -> (q -> f)), which I guess is ((p -> (q -> f)) -> f) = (p & q)

Does it?!?! Looking below, I see that it does. Wow.
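
(If anyone wants a machine to double-check this, here is a small Python sketch that tests all three definitions over the four valuations of p and q. The helper name "imp" is just my shorthand for material implication, nothing official.)

from itertools import product

def imp(a, b):          # material implication: a -> b
    return (not a) or b

f = False               # the constant "f" (falsity)

for p, q in product([True, False], repeat=2):
    assert imp(p, f) == (not p)                    # ~p     =  p -> f
    assert imp(imp(p, f), q) == (p or q)           # p V q  =  (p -> f) -> q
    assert imp(imp(p, imp(q, f)), f) == (p and q)  # p & q  =  (p -> (q -> f)) -> f

print("all three definitions check out on all four valuations")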
I knew you could do that.

With hints.

So with "->" and "f" we can define all connectors.

Is there a connector (like "&", "V", "->", ...) such that all connectors can be defined from it?

This is a facultative (optional) exercise, only for possible rainy Sundays. We will not use this in the sequel.





But that doesn't make sense, because & requires two arguments, so it would have to be something like ... well, p -> q is (~p V q) and it's also ~(p & ~q), which contain V and & ... I'm not sure I know what you mean.

Like for "~", the idea is to define "&" and "V" for a machine which knows only "->" and "f". You can use the "~", as you have already seen that you can define it with "->" and "f".

I reason aloud. Please tell me if you understand.

First we know that "p -> q" is just "~p V q", OK?

So the "V" already looks close to "->". Except that instead of "~p V q" (which is p -> q) we want "p V q".

Maybe we can just substitute ~p for p: then p V q might be ~p -> q.

Well, you can do the truth table of ~p -> q, and see that it is the same as p V q.

To finish, of course, we can eliminate the "~", and we have that p V q is entirely defined by (p -> f) -> q.

OK?
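
If it helps, that truth table can be written out mechanically. A small Python sketch (again, the helper name "imp" is just mine):

from itertools import product

def imp(a, b):
    return (not a) or b

print("p      q      ~p -> q   p V q")
for p, q in product([True, False], repeat=2):
    print(f"{str(p):6} {str(q):6} {str(imp(not p, q)):9} {str(p or q):6}")

The two right-hand columns come out identical on every row.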

And the "&":

Well, we already know a relationship between the "&" and the "V", OK? The De Morgan relations.

So, applying the de Morgan relation, p & q is the same as ~(~p V ~q), (the same "logically", not pragmatically, of course).

That solves the problem.

But we can verify, and perhaps simplify. We can eliminate the "V" by the definition above (A V B = ~A -> B), so ~(~p V ~q) becomes ~(~~p -> ~q), that is ~(p -> ~q). Or, to really settle things, and define & from -> and f:
p & q = ((p -> (q -> f)) -> f).
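
For the record, here is the whole chain checked step by step on all four valuations (a small Python sketch; the helper name "imp" is mine):

from itertools import product

def imp(a, b):
    return (not a) or b

f = False

for p, q in product([True, False], repeat=2):
    step1 = not ((not p) or (not q))      # ~(~p V ~q)
    step2 = not imp(p, not q)             # ~(p -> ~q)
    step3 = imp(imp(p, imp(q, f)), f)     # ((p -> (q -> f)) -> f)
    assert step1 == step2 == step3 == (p and q)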

OK?

Apparently, yes.
OK. (Not sure what you mean by "apparently", though).

Well, even though I did it, the result still looks rather strange to me!

Cantor said "I see it, but I don't believe it". It is normal. Von Neumann said that nobody understands math; mathematicians only get used to it.







Each world, once "illuminated" (that is, once each propositional letter has a value f or t), inherits the semantics of classical propositional logic.

This means that if p and q are true in some world alpha, then (p & q) is true in that world alpha, etc. In particular, all tautologies, or propositional laws, are true in all worlds of an illuminated multiverse, and this for all illuminations (that is, for all possible assignments of truth values to the worlds).

OK?

Question: If the multiverse is the set {a, b}, how many illuminated multiverses can we get?

I suppose 4, since we have a world with 2 propositions, and each can be t or f?

Answer: with the three letters p, q, r, there are eight possible valuations in a, and the same in b, making a total of 8 x 8 = 64 illuminated multiverses, if I am not too distracted. I go quickly. This is just to test whether you get the precise meanings.

Oh, OK. So a and b are worlds, not ... sorry. I see.

Good.


So that is 2^3 x 2^3 because a has p,q,r = 3 values, all t or f, as does b. OK now I see what you meant.
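
Counted explicitly, in a small Python sketch (with the letters and the two worlds named as in the example):

from itertools import product

letters = ["p", "q", "r"]
worlds = ["a", "b"]

# one valuation = a t/f assignment to each letter, in one world
valuations = list(product([True, False], repeat=len(letters)))
# one illumination of the multiverse = a valuation for each world
illuminations = list(product(valuations, repeat=len(worlds)))

print(len(valuations))      # 8   (= 2^3)
print(len(illuminations))   # 64  (= 8 * 8)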

OK.



Of course with the infinite alphabet {p, q, r, p1, q1, r1, p2, ... } we already have a continuum of multiverses.

I can't quite see why it's a continuum. Each world has a countable infinity of letters, and the number of worlds is therefore 2 ^ countable infinity! Is that a continuum?

Yes. We proved it, Liz.

Yes I had a sneaky suspicion we did. It seems familiar ... a bit.
Understanding is good.
Understanding and memorizing, even with the help of a well-presented diary, is better, as it saves possible future work.

I agree. I'm sure I started one, too, but I can't find it now. (So sometimes I have to treat you as my diary...)


Well, I hope you will not lose me too!

Memorizing is good, but only if you manage to keep the memory accessible. 'course.







Take the infinite set of propositional letters {p, q, r, p1, q1, r1, p2, ... }. They are well ordered. So a sequence of 1s and 0s (other common names for t and f) can be interpreted as a valuation. The valuations are the infinite sequences of 1s and 0s, or equivalently the functions from N to {0, 1}.

If such a set of functions were in bijection with N, i -> f_i, then the function g defined by g(n) = f_n(n) + 1 (mod 2, i.e. flipping the bit) would itself be some f_i, say f_k; and then f_k applied to k would give both f_k(k) + 1 and f_k(k), making 0 = 1.
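
Here is the same diagonal flip illustrated on a toy finite list of sequences (a Python sketch; the four sample sequences are just made up, and 1 - f_n(n) is the "+1" done mod 2):

fs = [
    lambda n: 0,            # f_0 = 0 0 0 0 ...
    lambda n: 1,            # f_1 = 1 1 1 1 ...
    lambda n: n % 2,        # f_2 = 0 1 0 1 ...
    lambda n: (n + 1) % 2,  # f_3 = 1 0 1 0 ...
]

def g(n):
    return 1 - fs[n](n)     # flip the n-th bit of the n-th sequence

for i, f_i in enumerate(fs):
    assert g(i) != f_i(i)   # g differs from f_i at position i

So g cannot appear anywhere in the list, however the list was chosen.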

OK. I think.
Hmm... OK. I think. For now. (That was quick).

I meant it's clear once you assign values to them and make binary strings that they can be diagonalised. And I remember the above proof, at least in outline.

OK.




 The world of maths kicks back!


Yeah! That's its charm.

Some would disagree...


I guess they met the bad math teacher who kicks the students before math kicks them, making it impossible for them to understand the real kicking back of math and to develop the appreciation.

That's bad for the slow students, who are sometimes slow because they are more demanding in understanding, and it is good for the quick students, who can learn to solve problems by no more than pattern matching, without any understanding. Consumerist societies favour quick students, which aggravates the situation for slow students, and for long term projects.

As a math teacher, I try to help both kinds of student, but it is not always easy, and to be honest, I favor the slow ones.

For me, a valid reasoning with a false answer is better than a false reasoning with a correct answer. I know that in real life, the contrary is true.






[]A is true in alpha means, by definition, that A is true in all worlds beta *accessible* from alpha.

And

<>A is true in alpha iff there is a world beta, accessible from alpha, where A is true.
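
If it helps to see it concretely, here is a toy illuminated multiverse in Python (the three worlds, the accessibility relation, and the values of A are all made up for illustration; the definitions are exactly the two above):

worlds = ["alpha", "beta", "gamma"]
access = {"alpha": ["beta", "gamma"], "beta": ["gamma"], "gamma": []}
A_true_in = {"alpha": False, "beta": True, "gamma": True}

def box_A(w):
    # []A at w: A true in every world accessible from w (vacuously true if none)
    return all(A_true_in[v] for v in access[w])

def diamond_A(w):
    # <>A at w: A true in at least one world accessible from w
    return any(A_true_in[v] for v in access[w])

print(box_A("alpha"))      # True  : A holds in beta and in gamma
print(diamond_A("gamma"))  # False : gamma accesses no world at all
print(box_A("gamma"))      # True  : vacuously, for the same reason

Note the last two lines: in a world that accesses no world at all, []A holds vacuously while <>A fails.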

OK. That makes sense, but I'm not sure I can use that fact to work things out...


Understanding is good.

But understanding + familiarity is better, and that comes with *some* practice.

Yes, I know. But you are trying to teach an old dog new tricks (as we say). I have recently learned how cryptic crosswords work and (maybe not quite so recently) the craft of novel writing. Plus my work requires me to learn new things, at least if I want to live in New Zealand (because NZ is too small to have the exact jobs I'm good at). And I have 2 kids and a husband to look after. So I am trying to fit in some logic as well ... luckily I have some spare time at work quite often...

Well, be careful. I would feel sorry if you lost your job because you do logic. You might try to get your husband and kids working on the problems, which gives you the good exercise of posing the problems, and posing them well sometimes gives the main hints to the solution.

But you can also ask for hints, or for solutions, or for supplementary explanations (if interested, of course). Old dogs can learn new tricks; it just takes more time (barring Alzheimer's or other health problems).





Oh dear. I don't seem to be able to get my head around this.

That happens. Tell me if the explanations above help. Ask any question.

Well it does seem to make sense now. The binary relations help, I didn't really get that.

I am a bit aware of that. But it might be because I go quickly.

But I might also go quickly because you don't ask enough questions.

That will be when I don't have time to really engage. I can do all this, but only when I concentrate on it - it doesn't come naturally.

That is why I feel a bit guilty when I go too quickly. If I give too many simple exercises, which help with familiarity, I am afraid of being boring. The difficulty in explaining AUDA is that it is necessarily technical. This is not the place to prove everything to you, but a minimal amount of logic is needed to understand the statement of the results.




Now I have to go and make breakfast :)

I am happy you don't forget what is really important in life. Bon appetit!

Soon, some summing-up definitions and a few "easy" exercises to solidify the old dog's memory, and to make all of this more familiar and more natural ... as far as possible, because logic is not something natural, and classical logic is a bit counter-intuitive.

Some others seem interested in the thread too, but might be less courageous about participating, as you need some courage to do a sort of persistent "exam" online. I can understand. But I know that if I explain everything ex cathedra, everyone will be lost somewhere, and nobody will know where. I do hope some others will participate, to make things lighter on your shoulders.

Bruno





http://iridia.ulb.ac.be/~marchal/


