Hi Ambica,
If the aim is to avoid overfitting and choose a reasonable number of
parameters, then dropout might help reduce the size of the grid search
you need to do. In particular, you will likely still need to write code
to vary the number of layers, but dropout effectively varies the layer
width for you during the training phase by randomly disabling units.
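To illustrate the mechanism, here is a minimal self-contained sketch of
inverted dropout applied to a plain vector. This is not mlpack's actual
Dropout layer (see the documentation link further down the thread for
that); it only shows the idea of zeroing units with probability p during
training and rescaling the survivors so the expected activation is
unchanged:

```cpp
#include <cassert>
#include <random>
#include <vector>

// Inverted dropout: during training, each unit is zeroed with
// probability p and survivors are scaled by 1 / (1 - p), so the
// expected activation matches test time; at test time it is the
// identity, and no rescaling is needed.
std::vector<double> Dropout(const std::vector<double>& input, double p,
                            bool training, std::mt19937& rng)
{
  if (!training)
    return input;  // identity at evaluation time

  std::bernoulli_distribution drop(p);
  std::vector<double> output(input.size());
  for (size_t i = 0; i < input.size(); ++i)
    output[i] = drop(rng) ? 0.0 : input[i] / (1.0 - p);
  return output;
}
```

Because each training pass samples a different subnetwork, dropout acts
a bit like averaging over many smaller layer widths, which is why it can
shrink the grid you need to search over.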
Regards,
Benson
On 11/10/20 5:17 PM, Ambica Prasad wrote:
Hi Benson,
I am not sure how I would use DropOut to perform a grid-search over my
parameters. Could you elaborate?
Thanks,
Ambica
-----Original Message-----
From: mlpack <[email protected]> On Behalf Of Benson Muite
Sent: 08 November 2020 00:04
To: [email protected]
Subject: Re: [mlpack] Tutorial for HyperParameterTuning for FFNs (Ambica Prasad)
You may also want to examine the documentation on dropout:
https://www.mlpack.org/doc/mlpack-3.0.4/doxygen/classmlpack_1_1ann_1_1Dropout.html
On 11/7/20 9:15 PM, Aakash kaushik wrote:
Hey Ambica
There isn't a specific tutorial available for that, but you can always
put the candidate layer sizes in an array and loop over it for variable
layer sizes, or sample random integers from a range. As for the number
of layers, I believe you have to change that manually each time, though
I'm not totally sure about it.
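As a sketch of that loop, something like the following would cover both
the number of layers and each layer's width. `EvaluateConfig` here is a
hypothetical placeholder so the sketch is runnable; in real code you
would construct an mlpack FFN with those hidden-layer sizes inside it,
train it, and return a validation score:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for building an FFN with the given hidden-layer
// sizes, training it, and returning a validation score. In real code
// this would construct an mlpack model and call Train()/Evaluate().
double EvaluateConfig(const std::vector<size_t>& hiddenSizes)
{
  // Placeholder score so the sketch runs: it simply prefers fewer
  // layers; a real score would come from held-out data.
  return 1.0 / (1.0 + hiddenSizes.size());
}

// Grid search over both the number of hidden layers and their width.
std::vector<size_t> GridSearch(const std::vector<size_t>& candidateWidths,
                               size_t maxLayers)
{
  std::vector<size_t> best;
  double bestScore = -1.0;
  for (size_t depth = 1; depth <= maxLayers; ++depth)
  {
    for (size_t width : candidateWidths)
    {
      // One configuration: `depth` hidden layers, all of size `width`.
      std::vector<size_t> config(depth, width);
      const double score = EvaluateConfig(config);
      if (score > bestScore)
      {
        bestScore = score;
        best = config;
      }
    }
  }
  return best;
}
```

Swapping the inner loop for draws from a random-number generator gives
the random-search variant mentioned above.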
Best,
Aakash
On Sat, Nov 7, 2020 at 10:30 PM <[email protected]> wrote:
Today's Topics:
1. Tutorial for HyperParameterTuning for FFNs (Ambica Prasad)
---------- Forwarded message ----------
From: Ambica Prasad <[email protected]>
To: [email protected]
Date: Sat, 7 Nov 2020 02:36:39 +0000
Subject: [mlpack] Tutorial for HyperParameterTuning for FFNs
Hi Guys,
Is there an example or a tutorial that explains how to perform
hyperparameter tuning for FFNs, where I can evaluate the network with
different numbers of layers and layer sizes?
Thanks,
Ambica
IMPORTANT NOTICE: The contents of this email and any attachments are
confidential and may also be privileged. If you are not the intended
recipient, please notify the sender immediately and do not disclose
the contents to any other person, use it for any purpose, or store
or copy the information in any medium. Thank you.
_______________________________________________
mlpack mailing list
[email protected]
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack