Sungchul,

I lack the background in math, physics, and other sciences to make any serious assessment of your conjectures about the recurrence of Planck-like distributions in various fields. From what I've read or skimmed over the years, I'd say that most likenesses among distributions turn out not to reflect common causal roots, and often reflect only a limited mathematical analogy between the compared processes.

I didn't see a wave-particle analogy in your previously offered idea that one classificatory axis of signs concerns their wave nature and the other concerns their particle nature. It's really not as if a sign's belonging to many trichotomies at once is wavelike, while its being just one of three kinds of sign in any given trichotomy is particle-like. Or vice versa (I forget which way you did it).

The only quantumlike thing that I've noticed in Peirce is the idea that something qualitative-of-feeling is singular if one reacts with it, and general if one reflects on it, but remains an indefinite quality of feeling if one merely contemplates it passively, 'feels' it. In a classical perspective, _/singular/_ and _/general/_ constitute an exclusive alternative, and an undecidedness between them would merely reflect happenstance ignorance. It would not, again in a classical perspective, be a positive phenomenon reflecting necessary, in-principle ignorance, where quality is like a bit of feasibilism or probabilism that has managed to squeeze through the needle's eye of the present by avoiding environmental interaction, like a wave passing through a double slit so that an interference pattern gets imprinted onto a photographic plate. Yet, on the non-classical supposition that quality of feeling is like such an interference pattern, quality of feeling should be like a positive undecidedness among singulars, not a positive undecidedness between singular and general. I'm quite skeptical of the idea that logical quantities such as singular and general are like slits side by side in a barrier, or like dot-sharp hits on a screen.

Anyway, in some sense maybe you could say that the first member of each sign trichotomy resembles the interference-pattern-like, and the second resembles the particle-like. I don't know what the third would resemble; maybe the wavelike (or 'wave-packet'-like) and probability-like, in some undivided way. But this is all very airy.

Best, Ben

On 10/3/2014 7:41 AM, Sungchul Ji wrote:

Ben wrote:
                                                              (100314-1)
"Curiously, there seems more realism, more of an idea of finding the
objective truth about generals that relate waves/particles than about the
singulars or particulars, the waves/particles themselves (which are not
particularly individualistic anyway), especially when the objective truth
about a given wave/particle is supposed to be classical and
observer-independent, not quantum. Feynman's attitude seems to have been,
give up trying to understand it classically. One can imagine Peirce
surveying the scene with an amused glint in his eye. Not only was he a
modal realist, he associated individuality with falsity."

I am coming to the conclusion that the reason the Planck distribution
(PD), y = (a/(Ax + B)^5) / (exp(b/(Ax + B)) - 1), fits so many fat-tailed
distributions found in all fields of natural sciences and linguistics
(atomic physics, protein physics, cell biology, immunology, brain
physiology, glottometrics, and cosmology; see the figure attached) may be
the universality of the wave-particle duality.  This conclusion is
primarily motivated by the structure of the PD, which traces back to the
Planck radiation equation, u(lambda, T) = (2*pi*h*c^2/lambda^5) /
(exp(hc/(lambda*k*T)) - 1), which is the product of two terms, the first
reflecting the number of standing waves per unit volume and the second the
average energy of the standing waves.  It also makes physical sense to me:
standing waves in atoms are called atomic orbitals, and standing waves can
be implicated in protein folds, enzymic catalysis, metabolite
concentration gradients in cells, brain functions including
decision-making, word formation inside the mouth cavity, and within the
volume of the universe.
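
To make the curve-fitting claim concrete, here is a minimal sketch (in Python) of how one could fit such a Planck-like curve to a fat-tailed data set.  The synthetic data, the function names, and the use of scipy's curve_fit are illustrative assumptions, not the procedure actually used for the attached figure; A is fixed at 1 because rescaling a, b, A, and B together leaves the curve unchanged.

    import numpy as np
    from scipy.optimize import curve_fit

    def planck_like(x, a, b, B):
        # Planck-like distribution y = (a/(x + B)^5) / (exp(b/(x + B)) - 1),
        # with A fixed at 1 since one scale parameter is redundant in a fit.
        u = x + B
        return a / (u**5 * (np.exp(b / u) - 1.0))

    # Purely synthetic data: the model itself plus 5% noise (an illustration
    # only, not any of the data sets behind the attached figure).
    rng = np.random.default_rng(0)
    x = np.linspace(0.2, 10.0, 200)
    y = planck_like(x, 5.0, 2.0, 0.1) * (1 + 0.05 * rng.standard_normal(x.size))

    # Least-squares fit of the three free parameters from a rough guess.
    params, cov = curve_fit(planck_like, x, y, p0=[1.0, 1.0, 0.5])
    print("fitted a, b, B:", params)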

It is interesting to point out that the PD and the Menzerath-Altmann law,
y = Ax^b exp(-c/x), discovered in glottometrics in the 1950s (?), are
functionally equivalent, in that they both fit the same data sets (see a,
b, f, g, j, k, l, and m in Figure 9 attached), and this may be because
they are both products of a power law and an exponential function.  This
conclusion seems consistent with the postulate I proposed in 2012 that the
wave-particle complementarity operates not only in physics, but also in
biology and semiotics (see Table 2.13 in the chapter entitled
Complementarity, under Publications > Book Chapters at conformon.net).
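
Under the same caveat, here is a small sketch of what that functional equivalence could mean operationally: fit both forms to one synthetic data set and compare residuals.  The data, function names, and scipy usage below are illustrative assumptions, not the analysis behind Figure 9.

    import numpy as np
    from scipy.optimize import curve_fit

    def planck_like(x, a, b, B):
        # Planck-like curve with A fixed at 1, as in the sketch above.
        u = x + B
        return a / (u**5 * (np.exp(b / u) - 1.0))

    def menzerath_altmann(x, A, b, c):
        # Menzerath-Altmann law as written above: y = A * x^b * exp(-c/x).
        return A * x**b * np.exp(-c / x)

    # One synthetic fat-tailed data set generated from the Planck-like form.
    rng = np.random.default_rng(1)
    x = np.linspace(0.5, 10.0, 150)
    y = planck_like(x, 4.0, 3.0, 0.2) * (1 + 0.05 * rng.standard_normal(x.size))

    for name, f, p0 in [("Planck-like", planck_like, [1.0, 1.0, 0.5]),
                        ("Menzerath-Altmann", menzerath_altmann, [1.0, -1.0, 1.0])]:
        p, _ = curve_fit(f, x, y, p0=p0, maxfev=20000)
        rss = np.sum((y - f(x, *p)) ** 2)
        print(name, "residual sum of squares:", rss)

If both forms reach comparably small residuals on the same data sets, that is the operational sense in which they could be called functionally equivalent, even though their analytic forms differ.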

Any comments or critiques would be appreciated.

With all the best.

Sung
__________________________________________________
Sungchul Ji, Ph.D.
Associate Professor of Pharmacology and Toxicology
Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University
Piscataway, N.J. 08855
732-445-4701

www.conformon.net




Clark, list,

Maybe I've underestimated the amount of instrumentalism - it's hard for
me to discern how seriously people take their own ideas of 'useful
fictions' in practice. Often enough the phrase 'useful fiction' seems a
cynical or self-deprecating way to say "enlightening approximation." But
not always. Also, I forgot about cases of formalisms that can be
dispensed with in principle and are used for calculations - as when it
is said that, in physics, gauge invariance reflects a redundancy in the
description, so it's more mathematical than especially physical, while
Lorentz invariance is indispensable and physical. I'm at sea with gauge
invariance; the math is quite beyond me. However, the distinction
between dispensable and indispensable formalisms, and the idea that some
physical-theoretical invariance is more especially physical than another
physical-theoretical invariance, seems harder for a pure instrumentalism
about laws to deal with. But I've gotten in over my head.

Curiously, there seems more realism, more of an idea of finding the
objective truth about generals that relate waves/particles than about
the singulars or particulars, the waves/particles themselves (which are
not particularly individualistic anyway), especially when the objective
truth about a given wave/particle is supposed to be classical and
observer-independent, not quantum. Feynman's attitude seems to have
been, give up trying to understand it classically. One can imagine
Peirce surveying the scene with an amused glint in his eye. Not only was
he a modal realist, he associated individuality with falsity.

Best, Ben

On 9/30/2014 5:49 PM, Clark Goble wrote:

(Changed the thread title since we've drifted far from natural
propositions)

On Sep 30, 2014, at 11:58 AM, Benjamin Udell <bud...@nyc.rr.com> wrote:

     > [CG] Whether the "nearly real" is good enough is a reasonable
     question. Like you, I see it as good enough, but I think there
     are important caveats one has to make which is why I mentioned
     that on practical grounds for many entities they act like
     instrumentalists.
     [End quote]

I'd say that they're acting as fallibilists. They may also hold that
a theory should be evaluated not for the plausibility of its
assumptions but only for the success of its predictions, and it's
more tempting to call that approach instrumentalism. Some have even
held that it's okay and even necessary for the assumptions to be
'descriptively false'.

While related to fallibilism, I'm not sure that's a good term.
Fallibilists in practice just reject epistemic foundationalism. Since
there are very few foundationalists left, I'm not sure that gets us
much. (I only see them among theologically oriented philosophers doing
epistemology - but perhaps there are a few atheist foundationalists
left.)

Now certainly most scientists - especially since positivism largely
died - are fallibilists. I think what I'm talking about goes beyond
that.

I think many (wish there were a poll for this) physicists view laws
like the ideal gas law or even Newton's Laws as useful fictions. But
they may well be realists towards other phenomena, laws, or structures.
That whole "useful fiction" bit really goes well beyond fallibilism.

I vaguely remember Peirce discussing something like this. I'll try and
look it up tonight. It was relative to measurement and simplifications
one makes in physics and chemistry. Really that's the issue at hand.
When is a first or second order approximation good enough? (e.g.
analogy to series expansion with Fourier, Bessel, or Spherical Bessel
functions)

*Now, that could mean merely seemingly false by omission of factors
that one would have thought to be pertinent, and I do think that is
part of it.*
Yes, the first and second order approximation gets at that. But it can
also apply to simplified boundary conditions or, as with Newton's
Laws, discovering that laws one thought were universal were actually
just approximations under certain conditions, i.e. not fundamental.

*Still, I'd call that fallibilism, not instrumentalism, although it
reflects the spirit of some who call themselves instrumentalists.*
I think the difference, even beyond the useful fiction, is over what
generals one can legitimately prescind and what are more "accidental"
simplifications. To go back to the series expansion analogy: often, if
the first or second term is large and the following terms are very
small, you feel legitimate in saying this is a real structure. However,
for some simplifications you don't think the resultant structures are
really there, but that you are just making a model that gives you
useful answers.
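
Here is a toy numerical version of that series-expansion criterion (purely illustrative, added here, not anything from the thread): for a square wave the Fourier coefficients fall off as 1/n, so a truncation after the first term or two already captures most of the structure, and one can quantify how little the later terms add.

    import numpy as np

    t = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
    square = np.sign(np.sin(t))  # target signal

    def truncated_fourier(t, n_terms):
        # Square-wave Fourier series: (4/pi) * sum over odd n of sin(n*t)/n.
        y = np.zeros_like(t)
        for k in range(n_terms):
            n = 2 * k + 1
            y += (4.0 / np.pi) * np.sin(n * t) / n
        return y

    for n_terms in (1, 2, 5, 50):
        err = np.sqrt(np.mean((square - truncated_fourier(t, n_terms)) ** 2))
        print(n_terms, "term(s): rms error", round(err, 3))

The first term already does most of the work, which is the kind of situation where, on the criterion above, one feels entitled to call the dominant term a real structure rather than a mere modeling convenience.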

For even a scholastic realist of the Perigean sort I think we can make
a distinction there between useful fictions and mind independent
structures that may be obscured due to complexity. So to return to my
other example, one might see the ideal gas law as a real law that gets
obscured by other complexities or one might see it purely as a simple
model that does not get at an underlying structure.

Discerning what's a simplification and what's a real structure is
often not at all clear. It's also what makes discerning structures in
complex phenomena such as economics or psychology so hard compared to
physics. With physics we can tease out underlying phenomena from
complicating factors like friction.

*But even Peirce's idea of plausibility is more about developing a
theory than about evaluating its success. Most scientific hypotheses,
including quite a few highly plausible ones, get disconfirmed, and I
don't think that Peirce held that hypotheses that stand up to testing
generally turn out to have been the most plausible in advance.*
Verification and falsification take place over time and are a
continuing process rather than something "completed." As you say, lots
of things that seem solid (like Newton's laws prior to 1910) are
turned over. Plausibility seems always indexed to a particular time,
set of theories, and experimental results.

*The case of the incomplex hypothesis which one really doesn't expect
to be true is the closest, I think, to instrumentalism, but it's a
case of treating a hypothesis instrumentally without embracing the
view called 'instrumentalism', which holds (or originally held,
according to what we find in Peirce's account of it) that theories
don't affirm objective laws or norms but merely predict particular
results. *
When I think of instrumentalism I tend to think of Feynman rather than
the more formal philosophers of science. His focus was on calculating
rather than reality. He was a big proponent of that and famously
warned people off from trying to understand quantum mechanics at a
deep level. I don't know how influential that perspective still is.
That poll that Howard linked to unfortunately didn't directly touch on
the instrumentalist question beyond perhaps the question about whether
QM was epistemic (27%). I'm not sure that gets at the issue
sufficiently though.

It also doesn't get at what we might term "situational
instrumentalism" for lack of a better term. I suspect that's much more
common. But then you have to ask what theories one is situational about.

*Still, insofar as fallibilism applies to our beliefs, and incomplex
hypotheses aside for the moment, how does one characterize other than
as 'instrumental' one's attitude _/toward/_ the tentative or
experimental hypothesis or theory that conflicts with a belief that
one holds? I would call it 'successiblism', the attitude that said
hypothesis or theory is 'successible', i.e., it could be true, and
that one could find the real through it. Even the incomplex
hypothesis has to be granted some provisional credibility, as a kind
of possible approximation to the truth. Of course one needs both
fallibilism and successibilism about one's beliefs and one's doubts,
hypotheses, etc.; but sometimes one or the other stands out more as
what one needs. With the terms 'fallibilism' and 'successibilism'
obviously I'm trying for the kind of informative etymological
counterbalancing involved in 'verifiable' and 'falsifiable' but with
a much smaller morphological mess.*

As I said, I don't think fallibilism gets at this issue. I wonder if
degree of belief might be a fruitful Peircean notion to apply. It gets
at the issue that how we act is dependent upon how much we believe in
the structure in question. I say that because I think there are plenty
of people who might see some social or ethical norms as "useful
fictions" without believing them. (Say Voltaire's take on Christianity.)
And of course the notion of the double truth has a long if sometimes
misrepresented history. Think of the Averroists, for instance. Some
might say Strauss advocates that too.

     But this is not all which distinguishes doubt from belief. There
     is a practical difference. Our beliefs guide our desires and shape
     our actions. The Assassins, or followers of the Old Man of the
     Mountain, used to rush into death at his least command, because
     they believed that obedience to him would insure everlasting
     felicity. /*Had they doubted this, they would not have acted as
     they did. So it is with every belief, according to its degree*/.
     The feeling of believing is a more or less sure indication of
     there being established in our nature some habit which will
     determine our actions. Doubt never has such an effect. ("The
     Fixation of Belief", EP 1:114)

I'm not sure we need much more than this. The people with situational
instrumentalism will simply act differently than those with true
instrumentalism, and those who have a "near realism" or "good enough
realism" towards certain structures will act differently still. Now
the danger is that we move more towards William James' view of acting
on belief rather than Peirce's. But I think even sticking to Peirce we
can see differences in terms of how people calculate or measure or
verify.