Github user edespino commented on a diff in the pull request:

    https://github.com/apache/incubator-madlib/pull/149#discussion_r126803370
  
    --- Diff: src/modules/convex/type/state.hpp ---
    @@ -541,6 +541,161 @@ class GLMNewtonState {
         } algo;
     };
     
    +
    +/**
    + * @brief Inter- (Task State) and intra-iteration (Algo State) state of
    + *        incremental gradient descent for multi-layer perceptron
    + *
    + * TransitionState encapsulates the transition state of the
    + * aggregate function during an iteration. To the database, the state is
    + * exposed as a single DOUBLE PRECISION array; to the C++ code it is a
    + * proper object containing scalars and vectors.
    + *
    + * Note: We assume that the DOUBLE PRECISION array is initialized by the
    + * database with length at least 6, and that at least the first element
    + * is 0 (exact values of the other elements are ignored).
    + *
    + */
    +template <class Handle>
    +class MLPIGDState {
    +    template <class OtherHandle>
    +    friend class MLPIGDState;
    +
    +public:
    +    MLPIGDState(const AnyType &inArray) : mStorage(inArray.getAs<Handle>()) {
    +        rebind();
    +    }
    +
    +    /**
    +     * @brief Convert to backend representation
    +     *
    +     * We define this function so that we can use State in the
    +     * argument list and as a return type.
    +     */
    +    inline operator AnyType() const {
    +        return mStorage;
    +    }
    +
    +    /**
    +     * @brief Allocating the incremental gradient state.
    +     */
    +    inline void allocate(const Allocator &inAllocator,
    +                         const uint16_t &inNumberOfStages,
    +                         const double *inNumbersOfUnits) {
    +        mStorage = inAllocator.allocateArray<double, dbal::AggregateContext,
    +                dbal::DoZero, dbal::ThrowBadAlloc>(
    +                        arraySize(inNumberOfStages, inNumbersOfUnits));
    +
    +        // This rebind is totally for the following lines of code to take
    --- End diff --
    
    I noticed you adjusted a similar comment in
    "src/modules/convex/type/state.hpp" by removing "totally". Is the same
    update applicable here?

