[Jprogramming] Quadratic programming solvers in J

2016-03-12 Thread jonghough via Programming
Has anybody written a quadratic optimization solver in J? Or is there one in any of the packages? Examples: https://en.m.wikipedia.org/wiki/Quadratic_programming
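
For the special case of an equality-constrained QP, the KKT conditions reduce to a single linear system, so %. is enough; the lines below are a minimal sketch rather than a general solver. Q, c, A and b are made-up example data, and mp is just a local name for the matrix product.

   mp  =: +/ . *                   NB. matrix product
   Q   =: 2 2 $ 2 0 0 2            NB. minimise (1/2) x'Qx + c'x  subject to  Ax = b
   c   =: _2 _5
   A   =: 1 2 $ 1 1                NB. one equality constraint, so the zero block is 1x1
   b   =: 1
   kkt =: (Q ,. |: A) , A ,. 0     NB. [[Q, A'], [A, 0]]
   sol =: ((-c) , b) %. kkt        NB. solve the KKT system for (x, lambda)
   x   =: (#c) {. sol              NB. _0.25 1.25 for this data
   A mp x                          NB. 1, matching b

For inequality constraints you would need an active-set or interior-point loop on top of this, which is where a proper solver earns its keep.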

Re: [Jprogramming] A.

2016-04-30 Thread jonghough via Programming
A. 0 1 5 is the same as A. 2 3 4 0 1 5 so the "missing" items seem to be implicitly placed before the specified items, in order. On Sat, Apr 30, 2016 at 8:32 AM -0700, "'Pascal Jasmin' via Programming" wrote: A. 1 3 5 36 A. 5 1 3 40 A. 1 5 3 37 A
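
A quick check in a session:

   A. 0 1 5                       NB. items 2 3 4 implied at the front
   A. 2 3 4 0 1 5                 NB. same value (300, if I compute it correctly) as the line above
   (A. 0 1 5) = A. 2 3 4 0 1 5    NB. 1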

Re: [Jprogramming] Niven's constant

2016-07-05 Thread jonghough via Programming
Hi, I don't have my PC at hand now, but what is the problem with your calculation? I had never heard of Niven's constant, but looking at Wikipedia there is a formula for it containing the zeta function. Why not use this formula? Zeta of 2, 3, 4 are well-known constants, and you can calculate or h
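
For reference, the formula is C = 1 + sum over j >= 2 of (1 - 1/zeta(j)). A rough sketch in J, truncating both the zeta series and the outer sum (zetaApprox is a made-up name, and the truncations limit the accuracy):

   zetaApprox =: 3 : '+/ % y ^~ 1 + i. 100000'   NB. zeta(y) by direct summation, truncated
   1 + +/ 1 - % zetaApprox"0 ] 2 + i. 50         NB. roughly 1.7052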

Re: [Jprogramming] Greatest Increasing Subsequence

2016-09-02 Thread jonghough via Programming
Yes, Raul is absolutely correct. And the flaw (in my solution, at least) is obvious now. I'll try for a correct solution again tomorrow.

Re: [Jprogramming] simplifying im2col

2019-04-14 Thread jonghough via Programming
I had a go writing conv nets in J. See https://github.com/jonghough/jlearn/blob/master/adv/conv2d.ijs This uses ;.3 to do the convolutions. Using a version of this, with a couple of fixes, I managed to get 88% accuracy on the CIFAR-10 image set. It took several days to run, as my algorithms are no
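
For anyone curious what the cut-based approach looks like in miniature, here is a sketch of a plain 2-D "valid" convolution with ;._3; conv2d and k are throwaway names for this example, not what the linked script uses.

   k =: 3 3 $ 1                                        NB. 3x3 box filter
   conv2d =: 4 : '(1 1 ,: $x) (+/@,@:(*&x));._3 y'     NB. multiply each window by the filter, sum
   $ k conv2d i. 5 5                                   NB. 3 3: one result per complete 3x3 window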

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-17 Thread jonghough via Programming
sors. And notice the shape of the result shows it is a tensor, also. filter2 =: filter,:_1+filter cf2 =: 4 : '|:"2 |: +/ x filter2&(convFunc"3 3);._3 y' $ (1 2 2,:3 3 3) cf2"3 i,:5+i NB. 2 2 3 3 Much of my effort regarding CNN has been studying the literature tha

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-18 Thread jonghough via Programming
stage. Part of my confusion is that I would have thought the transpose would have been 0 1 3 2 |:, instead. Can you say more about that? I have yet to try to understand your verbs `forward` and `backward`, but I look forward to doing so. I could not find definitions for the following functions and w
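
One way to see the difference on shapes alone, using a throwaway 2 3 4 5 array:

   $ |:"2 |: i. 2 3 4 5          NB. 5 4 2 3: reverse all axes, then transpose each 2-cell
   $ 0 1 3 2 |: i. 2 3 4 5       NB. 2 3 5 4: only the last two axes are exchanged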

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-18 Thread jonghough via Programming
ailure: create__w | 4 = #shape I am pretty sure you are using different `create`s and are using them in unstated `cocurrent` environments. Would you mind providing the J environment at the start of this example? This most recent example with 5 3 8 8 shaped tensors is likely to be exactly what I

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-18 Thread jonghough via Programming
|:"2 |: +/ x filter&(convFunc"3 3);._3 y' > >    (1 2 2,:3 3 3) cf"3 i NB. 3 3$1 1 _2 _2 3 _7 _3 1  0 > > > > My next example makes both the `filter` and the RHA into tensors. And > > notice the shape of the result shows it is a tensor, also. > &g

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-18 Thread jonghough via Programming
Sorry, as I said in a previous email, the example I gave with runConv will not work, as it was made for a much older version of the project. Please try this as is, in jqt. NB. == A1=: 3 8 8 $ 1 1 1 1 1 1 1 1, 0 0 0 0 0 0 0 0, 0 0 0 0 0 0 0 0, 1 1 1 1 1 1 1 1, 0 0 0

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-18 Thread jonghough via Programming
il 19, 2019, 1:25:03 PM GMT+9, jonghough via Programming wrote: Sorry, as I said in a previous email, the example I gave with runConv will not work, as it was made for a much older version of the project. Please try this as is, in jqt. NB. == A1=: 3 8 8 $ 1 1 1

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-18 Thread jonghough via Programming
| 0!:0 y[4!:55<'y' load'/Users/brian/j64-807-user/projects/jlearn/init.ijs' 1 Test success Simple GMM test, diagonal covariance ... load jpath '~temp/simple_conv_test.ijs' not found: /users/brian/j64-807-user/temp/simple_conv_test.ijs On Fri, Apr 19, 2019 a

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-18 Thread jonghough via Programming
> so I should run `OUTPUT fit__pipe INPUT` 2 or 3 more times. Yes, I think so. After 2 or 3 more times, you should get everything correct: 100% accuracy. > What does the other output mean? For example what is alternating 1 and 2, > what is 1...20, what is 10? There are 15 images. When we constructe
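
If it helps, the extra passes can be wrapped in one line; OUTPUT, INPUT and pipe are the names already set up earlier in this thread, and the wrapper simply repeats the call.

   (3 : 'OUTPUT fit__pipe INPUT')^:3 ''   NB. three more passes; each pass ignores the previous result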

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-25 Thread jonghough via Programming
different. > > Thanks, > > -- > Raul > > On Thu, Apr 18, 2019 at 8:13 PM jonghough via Programming > wrote: > > > > The convolution kernel function is just a straight-up elementwise > multiply followed by a sum over all entries; it is not a dot product or matrix product. >
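
Concretely, that description amounts to something like the lines below; convFunc here is my guess at a definition consistent with the description, not necessarily what the script actually defines.

   convFunc =: +/@:,@:*                  NB. elementwise product, ravel, sum every entry
   (3 3 $ 1 0 _1) convFunc 3 3 $ i. 9    NB. _6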

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-26 Thread jonghough via Programming
07-user\projects\jlearn\init.ijs > | 0!:0 y[4!:55<'y' > |script[0] > |fn[0] > | fn fl > |load[:7] > | 0 load y > |load[0] > | > load'c:\Users\devon_mccormick\j64-807-user\projects\jlearn\init.ijs' > > The arguments to "dot"

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-28 Thread jonghough via Programming
I think you may be right. Thanks for pointing this out. However, since my networks mostly work, I am going to assume that having too many biases doesn't negatively impact the results, except for adding "useless" calculations. If you are correct, I should fix this. I have edited the source on a

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-29 Thread jonghough via Programming
under the assumption this is "wd" defined in JQt and this is some sort of progress message. Is this correct? Thanks, Devon On Sun, Apr 28, 2019 at 9:20 AM jonghough via Programming <[email protected]> wrote: > I think you may be right. Thanks for pointing this out. Howe

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-04-29 Thread jonghough via Programming
The locales may be a bit confusing, and if they are slowing down the training, then I will definitely rethink them. The main idea is that every layer is its own object and conducts its own forward and backward passes during training and prediction. Every layer, including Conv2D, LSTM, SimpleLaye
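
As a concrete (if toy) illustration of the layer-as-object idea, here is a sketch of a dense layer living in its own locale. The class name DenseSketch and everything inside it are invented for this example; they are not jlearn's actual API.

coclass 'DenseSketch'
create =: 3 : 0
'nin nout' =. y
w =: 0.01 * ? (nin,nout) $ 0   NB. small random weight matrix
''
)
forward =: 3 : 0
lastIn =: y                    NB. cache the input for the backward pass
y +/ . * w                     NB. linear forward pass
)
backward =: 3 : 0
gw =: (|: lastIn) +/ . * y     NB. gradient with respect to the weights
y +/ . * |: w                  NB. gradient handed back to the previous layer
)
cocurrent 'base'
layer =: 4 8 conew 'DenseSketch'
$ forward__layer ? 2 4 $ 0     NB. 2 8

A pipeline object then only needs to hold the list of layer references and call forward__ and backward__ on each in turn, which is essentially what the training loop does.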

Re: [Jprogramming] convolutional neural network [was simplifying im2col]

2019-05-20 Thread jonghough via Programming
This looks very interesting. Sorry, I am traveling until next week so cannot give it much more than a quick look-through at the moment. Next week I will try to run it. By the way, following your advice and the issues you discovered with my convnet (bias shape in particular), I am refactoring my sour