You asked: "So Markov Models are too slow, so we add backprop, at the cost
that it is now a blackbox?"

Markov Models, HMMs, MDPs, POMDPs, and search algorithms are one family of
algorithms.
They have states, actions, rewards, and learned transition probabilities.
They are easy to understand, not black boxes.
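As a concrete illustration of why they are interpretable, here is a toy
word-level bigram Markov model (the tiny corpus is made up for this example):
every "prediction" is just a normalized count you can read off directly.

```python
from collections import Counter, defaultdict

# A bigram Markov model: transition probabilities are normalized counts,
# so every prediction is directly explainable by inspection.
corpus = "the dog ran the dog sat the cat sat".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1  # count each observed transition

def p_next(prev, nxt):
    """P(nxt | prev), straight from the counts."""
    total = sum(counts[prev].values())
    return counts[prev][nxt] / total
```

Here "dog" follows "the" with probability 2/3 simply because we counted it
2 times out of 3 — no hidden mechanism to explain.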

Neural networks, perceptrons, gradient descent, GANs, convnets, RNNs, PCA,
ICA, SVD, Boltzmann Machines, and RBMs are a different family of
algorithms.
They are mostly learned projections from high-dimensional pattern spaces
into lower-dimensional pattern spaces.
They are difficult to interpret. And that's why we believe they are
black boxes.
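To make the "projection" idea concrete, here is a toy PCA via power iteration
in plain Python (a sketch for illustration, not a production implementation):
it maps 3-D points that lie along a line down to a single 1-D coordinate.

```python
# Toy PCA: find the dominant direction of a point cloud by power
# iteration on its covariance matrix, then project each point onto it
# (3-D -> 1-D dimensionality reduction).
def pca_1d(points, iters=100):
    n, d = len(points), len(points[0])
    mean = [sum(p[j] for p in points) / n for j in range(d)]
    centered = [[p[j] - mean[j] for j in range(d)] for p in points]
    # covariance matrix of the centered data
    cov = [[sum(x[i] * x[j] for x in centered) / n for j in range(d)]
           for i in range(d)]
    # power iteration: repeatedly apply cov and renormalize
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    # 1-D coordinate of each point: dot product with the direction v
    coords = [sum(x[j] * v[j] for j in range(d)) for x in centered]
    return v, coords
```

A deep network does something far richer and nonlinear, of course — this only
shows the bare bones of "squeeze many dimensions into few."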

It's difficult to find similarities between these two families of
algorithms.
However, they meet in a powerful synergy called Deep Reinforcement
Learning, which is the marriage between Deep Learning and MDPs
(stochastic search algorithms).
Deep Learning represents the complex states in MDPs.
Those states can be as complex as abstract representations of the pixels
in a video game.
That's the connection between the two families.
And yes, Deep Learning reduces the dimensionality of complex states, which
makes POMDPs computationally tractable.
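Here is a minimal sketch of that marriage (a toy I wrote for illustration,
not Deep RL proper): Q-learning on a 5-state chain MDP where Q(s, a) is a
linear function of a state encoding phi(s). In real Deep RL, phi would be a
deep network compressing raw pixels into a compact state representation.

```python
import random

# Chain MDP: states 0..4, actions move left/right, reward 1 only on
# reaching the terminal state 4. We learn Q(s, a) = w[a] . phi(s).
N_STATES = 5
ACTIONS = (-1, +1)

def phi(s):
    # Trivial one-hot state encoding; a deep net would replace this.
    v = [0.0] * N_STATES
    v[s] = 1.0
    return v

def q(w, s, a):
    return sum(wi * xi for wi, xi in zip(w[a], phi(s)))

def greedy(w, s):
    return max(ACTIONS, key=lambda a: q(w, s, a))

def train(episodes=500, alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    w = {a: [0.0] * N_STATES for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # epsilon-greedy exploration
            a = rng.choice(ACTIONS) if rng.random() < eps else greedy(w, s)
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else 0.0
            target = r if s2 == N_STATES - 1 else gamma * q(w, s2, greedy(w, s2))
            td = target - q(w, s, a)
            # gradient step on the linear approximator's weights
            for i, xi in enumerate(phi(s)):
                w[a][i] += alpha * td * xi
            s = s2
    return w
```

After training, the greedy policy moves right in every state: the "MDP half"
supplies the states, actions, and rewards; the "learning half" supplies the
function approximator for Q.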

If you want to learn more about Deep RL, watch my webinar:

Webinar on Deep Reinforcement Learning (Secure & Private AI Scholarship
Challenge)
https://youtu.be/oauLZG9nAX0
(I'm sorry for my English, which is my second language.)


On Fri, Feb 12, 2021 at 1:41 AM <[email protected]> wrote:

> Very refreshing, good article you wrote I liked it. Interesting way of
> looking at it, a FFNN made of dimensional manifold manipulators to find the
> patterns, clustering/ building a FFNN of related patterns. Analyzing, hmm,
> my mind is twisting, analyzing, a FFNN of related patterns = t, h, th, a,
> n, an, thank....wind = breeze = gust....count letters in this post = other
> task........or count > inject q everywhere > count again..........or what
> if "cat hat dog, boat tall ship, zoo wind cage, loop truck ?" is made of
> recognize item 1 as item 3 +/> recognize tuple 1 as tuple 2 as etc,  I mean
> why are the patterns related/clustered, if they are related then it must be
> either made of or translates (entails...cat=dog cuz cat eat, dog
> eat).......I know there is a LOT of patterns, most are rare, many are the
> same ones. Ok now this FFNN made of dimensional manifold manipulators to
> find the patterns, analyzing, hmm, so what does "I was walking down the
> road and saw a ?" kind of fold look like, why does it need nonlinear
> untangling? Or is it those weird patterns like cat gum dog, bike wind car,
> ..., hmm these curves/desired activations found a way to connect dimensions
> so all the samples and unseen similar samples are in the same space...
> assuming your data is 2D or 45D images with red and more if want plotted
> dots in tanglish swirls that have a seemingly recognizable pattern even you
> could see it on screen... ok maybe given a string we map it to a space as
> recognized, then map to what comes next if even need another mapping, the
> words/BPE inputs are a single red dot plotted in some space, ..... nih this
> doesn't seem right, I must go to bed but this only seems to be a way to make
> it faster, not actually empirically why the next word / how / where it
> comes from, like how markov models tell you why it comes next - cuz it was
> seen 55 times more often based on the conditional prompt given.
>
> The eigenfaces was interesting, how simply does this work, is it not just
> word2vec really? Wiki etc is giving me overload of text the answer must be
> way simpler ...
>
> Though you didn't answer the question I posted asking if backpropagation
> is just a way to make markov models fast ex. adding word2vec to MM may slow
> MM-based nets down, so maybe backprop is a way to make MM efficient. I
> mean, markov models are so easy, Transformers what the hell are those, see
> what I mean? Where does this manifold dimensional folding of sigmoids
> even come from, where is Next Word ???
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta86fa089ebd8ca28-Mbd59068b5cbfdef1d2942e65
Delivery options: https://agi.topicbox.com/groups/agi/subscription
