Mr. Miller,

Thanks for the tips; they are very well founded. I confused the halting
problem with the termination property of total functions. You are
absolutely right: if you stick to those rules then all functions will
terminate, because all those verbs (modulo a type mismatch) are total over
their arguments. I was referring to a rather simple trick that Haskell can
sometimes pull off. Here's an example:

lp x = lp x + 1

This definition compiles just fine, and at runtime it will whiz off into an
infinite loop. The trick is that GHC can sometimes kill such a program with
a <<loop>> exception, as if it had "solved the halting problem"; strictly
speaking the runtime is just noticing a value being forced while it is
already under evaluation, not doing any static analysis. It's a nice trick
for development purposes, but no one relies on it for its semantics.
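
A minimal sketch of the distinction (the names are mine, and the behavior in
the comments is typical GHC runtime behavior rather than anything the
language guarantees):

loopyValue :: Int
loopyValue = loopyValue + 1
-- Forcing loopyValue re-enters its own thunk, so GHC's runtime usually
-- reports <<loop>> rather than spinning forever.

loopyFunction :: Int -> Int
loopyFunction x = loopyFunction x + 1
-- Each call builds a fresh thunk, so there is nothing for the runtime to
-- detect; this one really does run away.

main :: IO ()
main = print loopyValue   -- typically dies with: <<loop>>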

Your advice is great because it keeps you to a subset of J containing only
total and pure functions, which is most of J. Pair that with the fact that
verbs are polymorphic over the dimensions of the array (no more
(map . map . map) or any of that nonsense) and with the rank system, and
attempting to add types to any of it just breaks down. :) I still like to
think about it though, since I am a bit of a type systems and programming
language enthusiast.


Mr. Hui,

You are absolutely right, the types are rather trivial:

luDecomp :: Matrix a -> Maybe (Matrix a, Matrix a, Matrix a, a)
http://hackage.haskell.org/package/matrix-0.3.4.3/docs/Data-Matrix.html#g:15

However, even at first glance I do know a couple of things: first, that it
may fail, because of the Maybe; second, that it returns four values to me
in a tuple, three matrices and a single element. So just for the sake of
documentation, that helps. Here are the tests for the decomposition:
https://github.com/Daniel-Diaz/matrix/blob/b00f44fe9e748ca014832696130585185709547f/test/Main.hs#L109
I could maybe see something done with LiquidHaskell, but that would just be
too much. So you are right: most of the time it is just trivial things that
get encoded, but even that is nice. It then goes on to even more complex
territory. Notice I didn't even try to put down the type of ^: itself; I
just stayed with ^:_1, since that would be (at least to my puny brain)
nearly impossible. But it's still fun to think about.
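
Concretely, here is what I mean about the Maybe, as a sketch only. It
assumes the Data.Matrix API linked above and its documented convention that
a successful result satisfies P*A = L*U; lupResidual is my own name:

import Data.Matrix (Matrix, fromLists, toList, elementwise, multStd, luDecomp)

-- The type only promises the *shape* of a success: three matrices and a
-- scalar inside a Maybe.  Whether the result really is an LUP decomposition
-- still has to be checked at run time, in the spirit of LUcheck.
lupResidual :: Matrix Double -> Maybe Double
lupResidual a = do
  (u, l, p, _d) <- luDecomp a   -- tuple order (U, L, P, d) per the linked docs
  let diff = elementwise (-) (p `multStd` a) (l `multStd` u)   -- assumes P*A = L*U
  pure (maximum (map abs (toList diff)))

main :: IO ()
main = print (lupResidual (fromLists [[4, 3], [6, 3]]))

The type forces the caller to handle the Nothing case; the residual check is
the part the type cannot express.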

Still, what I would really like to see is some ability to define *closed*
sums in J: https://en.wikipedia.org/wiki/Sum_type It's easy to fake them
with tags and such; you just don't get the benefits of exhaustiveness
checking. I don't really see how it would fit into J, though, and at this
point it's just a dream. Maybe one day.
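
To make the exhaustiveness point concrete, here is the Haskell side of it
(a toy sketch; Shape and area are just made-up names):

-- A *closed* sum: Circle and Rect are the only constructors there will ever
-- be, so the compiler can check that every case is handled.
data Shape = Circle Double | Rect Double Double

area :: Shape -> Double
area (Circle r) = pi * r * r
area (Rect w h) = w * h
-- Deleting either clause makes -Wincomplete-patterns (part of -Wall) warn
-- about the missing case; a tag-and-box encoding can mimic the data, but
-- not this compile-time check.

main :: IO ()
main = print (map area [Circle 1, Rect 2 3])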


Mr. Miller and Mr. Hui,

I was wondering what your opinion is on this blog post by Mr. Harper:
https://existentialtype.wordpress.com/2011/03/19/dynamic-languages-are-static-languages/
I'm almost certain that he did not have J in mind when he was writing it.
Maybe you can give me and him a piece of your mind?

On Fri, Apr 3, 2015 at 10:25 AM, Roger Hui <[email protected]>
wrote:

> That may be fine in the abstract, but I would like to know, in the specific
> case of the LUP decomposition, how knowing that the types are correct gives
> you any confidence that the result is correct.  With the assertions in
> LUcheck in http://www.jsoftware.com/jwiki/Essays/LU_Decomposition
>
>  assert. (($p) -: ,n) *. (i.n) e. p
>  assert. (($L) -: m,m) *. ((0~:L)<:(i.m)>:/i.m) *. 1=(<0 1)|:L
>  assert. (($U) -: m,n) *.  (0~:U)<:(i.m)<:/i.n
>  assert. A -: p {"1 L mp U
>
> If these all pass, I am pretty sure that the result is in fact an LUP
> decomposition.
>
>
>
>
> On Thu, Apr 2, 2015 at 11:27 PM, Aistis Raulinaitis <[email protected]>
> wrote:
>
> > Mr. Hui,
> >
> > It's an honor. If I stick to only total functions then my code is
> > guaranteed to be either productive or to terminate. Knowing that my
> > program will never "crash" if I remain disciplined is a good starting
> > point. Modulo infinite recursion and dirty data, but apparently Haskell
> > can "solve the halting problem" with a <<loop>> exception, by doing some
> > simple code path analysis, which I rarely, if ever, bump into. For the
> > sake of utility you can define partial functions as well, since working
> > with a language with only total functions would not be very fun.
> >
> > The other thing that can increase my confidence is the fact that you can
> > wrap primitive data types with newtype wrappers, among other type-level
> > invariants. So if you have a UserID and a PostID, you can wrap Ints with
> > these wrappers and GHC will distinguish between the two at compile time
> > and won't accept any code that mixes them up, even though both have the
> > exact same representation at run time, with the wrappers being
> > completely erased.
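
(Interjecting in my own quote here: a tiny sketch of that newtype trick,
with UserID and PostID as the made-up wrappers and fetchPost invented for
illustration.)

newtype UserID = UserID Int   -- same representation as Int at run time,
newtype PostID = PostID Int   -- but a distinct type at compile time

fetchPost :: PostID -> String
fetchPost (PostID n) = "post #" ++ show n

main :: IO ()
main = putStrLn (fetchPost (PostID 7))
-- fetchPost (UserID 7) would be rejected at compile time, even though both
-- wrappers erase to a plain Int.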
> >
> > To contrast this with some issues, mainline Haskell lumps all IO into a
> > single monad. You can do things like Eff:
> > http://hackage.haskell.org/package/extensible-effects which lets you
> > break up the IO monolith into smaller effects. So I know from the type
> > that a function will open the water valves and not fire the missiles,
> > instead of everything being lumped into a single IO monad. Mainline
> > Haskell has not fully accepted Eff, or any of the other libraries or
> > proposals, yet.
> >
> > The last thing is information flow. Basically, you tag a piece of data
> > as private, and if you attempt to write code that leaks that information
> > out to the real world, the program won't compile. The only way to do it
> > is to coerce the types, implying malicious intent.
> > http://hackage.haskell.org/package/lio
> > http://hackage.haskell.org/package/seclib
> >
> > So although there is no one answer to the confidence question, wearing
> > the hair shirt of strong static types has (to me) some benefits in some
> > cases. The big benefit for me when using J is that if my problem fits
> > into tables, which it does quite a lot, then there is no shorter or
> > quicker way from data to answer than J. I've been thinking about NLP in
> > J too; the high amount of implicit mapping seems like it could be super
> > useful in a variety of cases.
> >
> > J is great for my confidence in correctness because it is so dense that
> > it can be obviously correct just by looking at it. J wins on that front
> > by leaps and bounds. Haskell has some of this property too, although it
> > is much more verbose; a lot of the time you can see that a program is
> > correct just by looking at it, unlike in many other languages. If it
> > typechecks, all the better! This is largely due to Haskell's great
> > refactoring capabilities. Laziness and purity allow you to rip out
> > pieces without any change to the rest of the code, breaking everything
> > down to its component pieces, usually with lots of whitespace to
> > emphasize the patterns in the code. If you tried to write Haskell code
> > the way you write one-liners in J, it just wouldn't work; it would
> > actually be more verbose, because you would have to use the fully
> > explicit syntax with the semicolons and curly brackets to write it all
> > on one line, since indentation is significant in Haskell.
> >
> > About assertions: the way I think about it is that I would rather think
> > about the problem for a little longer to be able to derive at least some
> > parts of the specification in the types. Now, it's also very easy to
> > over-specify and actually make things more difficult. For the things you
> > can't specify (like hashing functions, etc.) there is always testing;
> > I'm sure you've heard of the famous QuickCheck library, with its
> > property-based testing: http://hackage.haskell.org/package/QuickCheck
> > There is also tasty, which integrates a whole bunch of testing
> > libraries: http://hackage.haskell.org/package/tasty
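
(Again interjecting into my own quote: the flavor of a QuickCheck property
in miniature; prop_reverseTwice is just a toy example of mine.)

import Test.QuickCheck (quickCheck)

-- A property rather than a hand-picked test case: QuickCheck generates the
-- inputs and shrinks any counterexample it finds.
prop_reverseTwice :: [Int] -> Bool
prop_reverseTwice xs = reverse (reverse xs) == xs

main :: IO ()
main = quickCheck prop_reverseTwice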
> >
> > Thank you, Raul. I've heard that happens with explicit definitions,
> > right? If I have my model correct, as soon as the verb is interpreted
> > over the data and specialized to the columns, you are essentially
> > running on the bare metal until the data ends, at which point
> > interpretation continues, specializes again, and cycles for more data.
> >
> > On Thu, Apr 2, 2015 at 9:10 PM, Roger Hui <[email protected]>
> > wrote:
> >
> > > In my daily work I program a lot in C, and think a lot in APL/J.  IMO,
> a
> > > type system catches only the easy errors.  I mean, if a Haskell program
> > > compiles (passing all the type checks), how confident are you that the
> > > program will work?
> > >
> > > On the other hand, I am a believer in assertions such as demonstrated
> in
> > > verb LUcheck in http://www.jsoftware.com/jwiki/Essays/LU_Decomposition
> .
> > > In this case, I am confident that a solution that passes all the
> > > assertions is correct.  Assertions tend to be more robust than programs
> > > because testing is typically easier than computing.  (E.g., it's much
> > > easier to check that S is a solution to an NP-complete problem than to
> > > derive it.)
> > >
> > >
> > >
> > > On Thu, Apr 2, 2015 at 7:34 PM, Aistis Raulinaitis <
> > [email protected]>
> > > wrote:
> > >
> > > > I'm rather agnostic either way! I like J so much because of its
> > > > dynamism. J does dynamism right, bar none (IMO). I only need to say
> > > > one word: rank. I haven't found anything that compares to it in any
> > > > other language. Not even Factor comes close, and I really, really
> > > > like its stack combinators and fried quotations. What I really like
> > > > about J is that a lot of the work I do fits right into it: quick,
> > > > iterative data exploration, make a graph and forget about the code
> > > > forever. No other language lets me do that entire cycle as quickly
> > > > as J; the solution can usually fit in a tweet. What I've been
> > > > thinking about is that there must be some really nice underlying way
> > > > to give J an optional type system, or at least to make some sort of
> > > > toy language that fits the bill. That, and there is no implementation
> > > > of obverse in Haskell, aka ^:_1 :: (a -> b) -> (b -> a). Although
> > > > maybe if you added some constraints on a and b and reified J's AST,
> > > > it would go somewhere. My first thought is to do something like:
> > > > ^:_1 :: (JVal a, JVal b) => JFunc a b -> JFunc b a.
> > > >
> > > > I know Mr. Iverson really didn't like types, and I completely
> > > > understand that. It can absolutely slow down the development of the
> > > > language, but giving up sum types in J is a bit sad to me. I think
> > > > just that much would be interesting, since nil is horrible and Maybe
> > > > is bliss, and I think J would benefit from having the capability.
> > > > Then my personal bias shows in the fact that I'm sold on dependent
> > > > types, which I actually think are *necessary* for typing some of J's
> > > > trickier semantics, such as functions that dispatch on the type of
> > > > the argument. My first thought was to use type-level sums, but then I
> > > > realized that not much work has been done in that area. Compound that
> > > > with the fact that some functions would require *open* sums, *at the
> > > > type level*! That would give up all the nice things that closed sums
> > > > give you. So now my best guess is to track the types with a
> > > > type-level set, and then to dispatch to the proper function with that
> > > > information. The type would contain the possible types of the
> > > > arguments that could be passed in, essentially still maintaining 100%
> > > > of the polymorphism (if everything goes right), while also giving you
> > > > compile-time errors if you pass, say, an int into something that
> > > > expects an array. You could go one step further (and I would), and
> > > > track the dimensions of the array at the type level. So the type of
> > > > (2D) transpose would be 2dtrans :: [a][b] -> [b][a], where a and b
> > > > are the lengths of the two dimensions. Generalized N-dimensional
> > > > transpose would be a little trickier, but I think it can be done with
> > > > enough tinkering. You could then go on and say that you expect
> > > > specific sizes, someFunc :: [3][4] -> [5][4] -> [8][4], or leave some
> > > > parts polymorphic, otherFunc :: [a][5] -> [6][b] -> [(a+b)*3][7].
> > > > Yes, that is type-level computation: welcome to dependent types! ;)
> > > > It's like ASSERT but even better!
> > > >
> > > > This would essentially give you compile-time errors for a whole
> > > > suite of things that usually go wrong at runtime. It would also
> > > > probably give better error messages, or at least more verbose ones.
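
(One more interjection: this is roughly as far as plain GHC gets toward the
2dtrans idea today without real dependent types. Sized is a made-up wrapper,
and this sketch only records the dimensions in the type; a smart constructor
would be needed to actually enforce them.)

{-# LANGUAGE DataKinds, KindSignatures #-}

import GHC.TypeLits (Nat)
import qualified Data.List as L

-- A matrix whose row and column counts live in the type.  The newtype does
-- not police the invariant by itself; a smart constructor would have to.
newtype Sized (m :: Nat) (n :: Nat) a = Sized [[a]]

-- The swap of dimensions shows up in the type, as in 2dtrans :: [a][b] -> [b][a].
transpose2 :: Sized m n a -> Sized n m a
transpose2 (Sized xss) = Sized (L.transpose xss)

main :: IO ()
main = print xss   -- [[1,4],[2,5],[3,6]]
  where Sized xss = transpose2 (Sized [[1, 2, 3], [4, 5, 6]] :: Sized 2 3 Int)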
> > > >
> > > > Very interesting article!
> > > >
> > > > Devon, I hope some of the above answers a part of your question. If
> > > > you already have some experience with Haskell, I would implore you
> > > > to take a dive into Agda. http://oxij.org/note/BrutalDepTypes/ <---
> > > > My favorite tutorial on Agda. Install Emacs if you don't already use
> > > > it, and use Agda's auto proof search feature; it's a completely
> > > > different way of programming, and it's definitely the future in its
> > > > own way. If you want a good example of Agda's power, I'll shamelessly
> > > > promote some of my own code, wherein I use Agda to prove some trivial
> > > > things in physics:
> > > >
> > > > https://github.com/sheganinans/Static-Dimensions/blob/master/StaticDimensions.agda#L638
> > > >
> > > > On Thu, Apr 2, 2015 at 6:12 PM, Joey K Tuttle <[email protected]> wrote:
> > > >
> > > > > Lots of topics to chat about  ;-)
> > > > >
> > > > > Sorry to say, I don't admire your idealized child - but these
> things
> > > are
> > > > > indeed personal.
> > > > >
> > > > > In general, choice of language is intensely personal  -- e.g. in
> this
> > > > > story about a fellow "suing himself" over language license
> issues...
> > > > >
> > > > > http://www.forbes.com/sites/ryanmac/2014/06/05/on-the-verge-of-ipo-arista-networks-faces-lawusit-from-a-billionaire-cofounder/
> > > > >
> > > > > I don't know how that ended - but I thought it was an interesting
> > > > > view of passion for a language.
> > > > >
> > > > >
> > > > >
> > > > > On 2015/04/02 17:01 , Aistis Raulinaitis wrote:
> > > > >
> > > > >> Definitely. Thanks. I felt a little sad about the author's distaste
> > > > >> for static typing. As a Haskeller I actually see static typing as a
> > > > >> tool to lean into when things get tough, instead of a thing you have
> > > > >> to work against. To me programs are just data and mappings of that
> > > > >> data from input to output. I may just be one of those weirdos, but
> > > > >> (at least right now) the perfect language for me would be if J and
> > > > >> Idris/Agda had a baby and made a dependently typed array
> > > > >> language... and regained APL's symbolic syntax and left ASCII in
> > > > >> the dust.
> > > > >> On Apr 2, 2015 4:33 PM, "Joey K Tuttle" <[email protected]> wrote:
> > > > >>
> > > > >>  With this interesting article -
> > > > >>>
> > > > >>>
> > http://www.technologyreview.com/review/536356/toolkits-for-the-mind/
> > > > >>>
> > > > >>>
> > > > >>>
> > > > >
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
