I don't know if the hyperparameter tuner will work with vector<int>, but
give it a shot and see what happens. :)  In the worst case, you can
unpack all the elements into individual arguments (where 0 means "no
layer" I suppose).
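In case it helps, here is a rough sketch of that worst-case unpacking. The function name, the three-argument limit, and the 0-means-"no layer" rule are just illustrative assumptions, not actual mlpack API:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Collect up to three individual hidden-layer size arguments into a
// vector, treating 0 as "no layer here".  A wrapper's Train() could call
// this and then add one hidden layer per returned entry before training.
std::vector<std::size_t> CollectHiddenLayers(const std::size_t hiddenSize1,
                                             const std::size_t hiddenSize2,
                                             const std::size_t hiddenSize3)
{
  std::vector<std::size_t> layers;
  for (const std::size_t size : { hiddenSize1, hiddenSize2, hiddenSize3 })
  {
    if (size != 0) // 0 means "no layer".
      layers.push_back(size);
  }
  return layers;
}
```

That way the tuner only ever sees plain `size_t` arguments, which sidesteps the question of whether it accepts `vector<int>` at all.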

On Fri, Nov 13, 2020 at 01:27:39PM +0000, Ambica Prasad wrote:
> Hi Ryan,
> 
>  Thanks a lot. This is helpful. I will try something like this.
> 
> class FFNWrapper
> {
>   ...
> 
>   template<typename MatType, typename LabelsType>
>   void Train(const MatType& data,
>              const LabelsType& labels,
>              const vector<int>& hidden_layers)
>   {
> >     // Based on the vector, I will create the hidden layers and
> >     // start the training.
> >   }
>   ...
> };
> 
> Hope this works.
> 
> Thanks,
> Ambica
> -----Original Message-----
> From: Ryan Curtin <[email protected]>
> Sent: 13 November 2020 08:20
> To: Ambica Prasad <[email protected]>
> Cc: Benson Muite <[email protected]>; [email protected]
> Subject: Re: [mlpack] Tutorial for HyperParameterTuning for FFNs (Ambica Prasad)
> 
> Hi Ambica,
> 
> There's one more thing worth mentioning.  The hyperparameter tuner works with 
> mlpack classifiers (or regressors) whose hyperparameters are specified in the 
> Train() call.  So, for instance, you could implement a class that works a 
> little like this:
> 
> class FFNWrapper
> {
>   ...
> 
>   template<typename MatType, typename LabelsType>
>   void Train(const MatType& data,
>              const LabelsType& labels,
>              const bool addSecondLayer)
>   {
>     // In this method you would build the network, and if
>     // `addSecondLayer` is true, you would add a second layer, then do
>     // the training.
>   }
> 
>   ...
> };
> 
> Now that is just one idea with a single boolean parameter, but you could 
> extend it to search over architectures, so long as you can express the 
> architecture's parameters as arguments to Train().  Then I think the 
> hyperparameter tuner could work for that situation.
> 
> I hope this is helpful!  I know it would be a bit of implementation work, but 
> it should work (maybe with minor modifications). :)
> 
> On Wed, Nov 11, 2020 at 07:47:48PM +0000, Ambica Prasad wrote:
> > Thanks Benson, I get it now.
> >
> > Thanks,
> > Ambica
> >
> > -----Original Message-----
> > From: Benson Muite <[email protected]>
> > Sent: 12 November 2020 00:50
> > To: Ambica Prasad <[email protected]>; [email protected]
> > Subject: Re: [mlpack] Tutorial for HyperParameterTuning for FFNs
> > (Ambica Prasad)
> >
> > Hi Ambica,
> > If the aim is to avoid overfitting and to choose a reasonable number of 
> > parameters, then DropOut might help reduce the size of the grid search you 
> > need to do.  In particular, you will likely need to write code to change 
> > the number of layers, but dropout effectively changes the layer size for 
> > you during the training phase.
> > Regards,
> > Benson
> > On 11/10/20 5:17 PM, Ambica Prasad wrote:
> > > Hi Benson,
> > >
> > > I am not sure how I would use DropOut to perform a grid-search over my 
> > > parameters. Could you elaborate?
> > >
> > > Thanks,
> > > Ambica
> > >
> > > -----Original Message-----
> > > From: mlpack <[email protected]> On Behalf Of Benson
> > > Muite
> > > Sent: 08 November 2020 00:04
> > > To: [email protected]
> > > Subject: Re: [mlpack] Tutorial for HyperParameterTuning for FFNs
> > > (Ambica Prasad)
> > >
> > > You may also want to examine the documentation on dropout:
> > > https://www.mlpack.org/doc/mlpack-3.0.4/doxygen/classmlpack_1_1ann_1_1Dropout.html
> > >
> > > On 11/7/20 9:15 PM, Aakash kaushik wrote:
> > >> Hey Ambica
> > >>
> > >> So there isn't a specific tutorial available for that, but you can
> > >> always put the candidate layer sizes in an array and loop over it to
> > >> try variable layer sizes, or sample random integers from a range.
> > >> For the number of layers, I believe you have to change that manually
> > >> every time, but I'm not totally sure about it.
> > >>
> > >> Best,
> > >> Aakash
> > >>
> > >> On Sat, Nov 7, 2020 at 10:30 PM <[email protected]> wrote:
> > >>
> > >>      ---------- Forwarded message ----------
> > >>      From: Ambica Prasad <[email protected]>
> > >>      To: [email protected]
> > >>      Date: Sat, 7 Nov 2020 02:36:39 +0000
> > >>      Subject: [mlpack] Tutorial for HyperParameterTuning for FFNs
> > >>
> > >>      Hi Guys,
> > >>
> > >>      Is there an example or a tutorial that explains how to perform
> > >>      hyperparameter tuning for FFNs, where I can evaluate the network
> > >>      with different numbers of layers and layer sizes?
> > >>
> > >>      Thanks,
> > >>      Ambica
> > >>
> > >>      IMPORTANT NOTICE: The contents of this email and any attachments are
> > >>      confidential and may also be privileged. If you are not the intended
> > >>      recipient, please notify the sender immediately and do not disclose
> > >>      the contents to any other person, use it for any purpose, or store
> > >>      or copy the information in any medium. Thank you.
> > >>      _______________________________________________
> > >>      mlpack mailing list
> > >>      [email protected]
> > >>      http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
> > >>
> > >>
> > >>
> > >
> > >
> >
> >
> 
> --
> Ryan Curtin    | "Happy premise #2: There is no giant foot trying
> [email protected] | to squash me." - Kit Ramsey
> 

-- 
Ryan Curtin    | "I don't really come from outer space."
[email protected] |   - L.J. Washington
_______________________________________________
mlpack mailing list
[email protected]
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
