Something on OO is at
https://code.jsoftware.com/wiki/Vocabulary/ObjectOrientedProgramming
<https://code.jsoftware.com/wiki/Vocabulary/ObjectOrientedProgramming#Creating_an_instance_of_a_class>
If you are using lots of instances, using numeric atoms rather than
boxed strings for the locatives is faster in 9.01.
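(Editorial aside: a minimal sketch of the two locative forms being contrasted here. The class and names are hypothetical; the numeric-atom form is as described above for 9.01.)

   coclass 'Point'
   create =: 3 : 'xy =: y'
   cocurrent 'base'

   p =: conew 'Point'     NB. boxed-string locative, e.g. <'0'
   create__p 3 4
   xy__p                  NB. 3 4

   pn =: ". > p           NB. the same locale as a numeric atom
   xy__pn                 NB. same lookup; faster in 9.01 per the note above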
Henry Rich
On 4/29/2019 7:13 PM, jonghough via Programming wrote:
The locales may be a bit confusing, and if they are slowing down the
training, then I will definitely rethink them. The main idea is that
every layer is its own object and conducts its own forward and backward passes
during training and prediction.
Every layer, including Conv2D, LSTM, SimpleLayer, Activation, PoolLayer, etc., is
a child of 'NNLayer', which is itself a child of 'NN', which contains some
verbs useful to all layers (NN and NNLayer are defined in advnn.ijs).
This OO style is based on how I would do it in Python, which I am far more
comfortable with than J. The style is probably not very J-esque (if there is
such a thing), but the fact that neural nets have so many moving parts (with
lots of mutable state) means that some sort of object-oriented programming is
suitable.
My understanding of J's OO system mainly comes from here:
https://www.jsoftware.com/help/learning/25.htm (there is not much literature on
this subject on the Jsoftware website, AFAIK). So, using just coinsert and
coclass, I created object hierarchies. Any advice for improving the design
is welcome.
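(Editorial aside: a hypothetical sketch of that coclass/coinsert pattern, not the actual advnn.ijs definitions.)

   coclass 'NN'                  NB. root class: verbs shared by every layer
   relu =: 0&>.                  NB. hypothetical shared helper

   coclass 'NNLayer'
   coinsert 'NN'                 NB. NNLayer searches NN for names it lacks

   coclass 'SimpleLayer'
   coinsert 'NNLayer'            NB. path: SimpleLayer -> NNLayer -> NN
   create  =: 3 : 'w =: 0.01 * ? y $ 0'     NB. random weight init
   forward =: 3 : 'y (+/ . *) w'            NB. input times weights

   cocurrent 'base'
   L =: conew 'SimpleLayer'
   create__L 3 2                 NB. a 3-in, 2-out layer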
As for global variables, I am trying to think, but can't recall any particular
global variables being passed around.
Basically, the 'NNPipeline' object contains an array of 'NNLayer's (i.e.
layers__pipe), which it iterates over using forward and backward verbs,
passing the output of one layer to the input of the next. backward iterates in
reverse, passing the gradient values from the (n+1)th layer to the nth layer.
Each layer updates its internal parameters (i.e. weights) on each backward
pass.
So,

   L1 =: 0 { layers__pipe
   w__L1

will give you the weights of the first layer, assuming it is a 'SimpleLayer'
(i.e. a fully connected layer).
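(Editorial aside: in outline, the two passes described above might look like the following; the verb and variable names are hypothetical, not the ones in jlearn.)

   NB. Hypothetical outline of the pipeline's two passes
   forwardAll =: 3 : 0
   x =. y
   for_l. layers do.            NB. layers: boxed locale names, first to last
     x =. forward__l x          NB. each layer's output feeds the next
   end.
   x
   )

   backwardAll =: 3 : 0
   g =. y
   for_l. |. layers do.         NB. reverse order: layer n+1 feeds layer n
     g =. backward__l g         NB. each layer also updates its weights here
   end.
   g
   )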
Thanks,
Jon

On Tuesday, April 30, 2019, 3:02:44 AM GMT+9, Devon McCormick
<[email protected]> wrote:
Hi -
yes, I also have difficulty tracing through the very OOP style of coding,
though I have only attempted it manually so far. By my count, the example
model for the CIFAR10 problem creates 23 namespaces, each with three to
more than twenty values, most of them scalars. Also, this style of coding
appears to use globals to pass values, making it more difficult to figure
out the data flow.
My de-looping change decreased the execution time by such a large amount,
running hundreds of times faster, that I suspect it has to be wrong. Upon
reflection, I suspect that the use of namespaces and global scalars for
passing values essentially serializes the computation, so my attempt to
call the "fit" function with arrays fails to run all but one of the trials.
I intend to continue to work on this both because of my interest in
developing my own CNN for working with photos and for working out an
example of the difference between OOP and array programming.
Also, J's flaky handling of namespaces in debug does not help efforts to
understand this code at a high level.
Regards,
Devon
On Mon, Apr 29, 2019 at 1:16 PM Brian Schott <[email protected]> wrote:
Devon,
I'm not sure what I would have called my difficulty with understanding
Jon's fine jlearn system, but "de-looping" may be a good choice. To me, the
difficulty involves the heavy use of locales, with which I have little
experience. This has meant that trying to trace verb calls and using debug
have been difficult for me. For example, when I try to use debug for a verb
like foo__bar, the stop manager in JQt produces a paragraph of lines in the
selected verb rather than a list of lines. The paragraph does not permit
stop-line selection. I have not tried to use dbss to select stop lines, but
maybe that is possible.
I wonder if the complication that locales introduce has been a reason that I have
not been able to find a usable verb/adverb/conjunction cross-reference
script for J.
The jlearn system is really impressive, though. Its documentation is
excellent inside most verbs.
--
(B=)
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm