Holy shite, REC! Looks like pretty good KoolAid!

I cut my teeth 40 years ago on APL. Feels like what I *wished for* back then (studying Physics/Math with CS "just a tool").

As we discussed a few years ago, I have a (still open, hanging-fire) project to do real-time stitching on a 360 stereographic camera (84 cameras in a spherical array, each with more than 50% overlap with its E/W and N/S neighbors)...

- Steve

On 2/16/17 8:57 AM, Roger Critchlow wrote:
I watched the livestream from the TensorFlow Dev Summit in Mountain View yesterday. The individual talks are already packaged up as separate videos at https://events.withgoogle.com/tensorflow-dev-summit/videos-and-agenda/#content, but watching the livestream, with its enforced moments of dead time filled with vaguely familiar music (was that Philip Glass, or a network trained on him?), was very instructive.

TensorFlow is a dataflow graph language where the data is all Tensors, i.e. vectors, matrices, and higher-dimensional globs of numbers. Google open sourced it as Python scripts and a C++ kernel about a year ago, has updated it with minor releases monthly, and released 1.0 yesterday. It's been used all over the place at Google, it's the top machine learning repo on GitHub, and its products have made the cover of Nature two or three times in the past year.
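
To make the dataflow-graph idea concrete, here's a minimal sketch against
the 1.x Python API: you first describe operations on tensors, then hand
the graph to a session to actually compute anything (the shapes and values
here are just placeholders for the demo).

    import tensorflow as tf

    # Build the graph: nothing is computed yet, these lines only
    # describe operations on tensors.
    a = tf.placeholder(tf.float32, shape=[2, 3], name="a")
    b = tf.constant([[1.0], [2.0], [3.0]], name="b")   # 3x1 matrix
    product = tf.matmul(a, b)                          # 2x1 result

    # Run the graph: feed a concrete value for the placeholder and
    # ask the session to evaluate the product node.
    with tf.Session() as sess:
        print(sess.run(product, feed_dict={a: [[1, 2, 3], [4, 5, 6]]}))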

New stuff yesterday:

  * an LLVM-based compiler to native x86, ARM, NVIDIA, etc.
  * new language front ends
  * pre-built networks and network components
  * classical ML techniques in case deep learning networks aren't your
    thing
  * distributed model training on PCs, servers, and GPUs
  * a server architecture for delivering inferences at defined latency
  * embedded inference stacks for Android, iOS, and Raspberry Pi
  * a very sweet visualizer, TensorBoard, for network architectures,
    parameters, and classified sets (see the sketch after this list)
  * higher-level APIs
  * and networks trained to find network architectures for new classes
    of problems
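
To give a flavor of the TensorBoard and higher-level-API items, here's a
hedged sketch against the 1.x API that builds a tiny model with tf.layers
and logs it for TensorBoard (the loss and the /tmp/tf_demo log directory
are arbitrary stand-ins for the demo).

    import tensorflow as tf

    # A one-layer model built with the higher-level tf.layers API.
    x = tf.placeholder(tf.float32, shape=[None, 4], name="x")
    logits = tf.layers.dense(x, units=3, name="logits")
    loss = tf.reduce_mean(tf.square(logits))   # stand-in loss, demo only

    tf.summary.scalar("loss", loss)            # shows up as a curve in TensorBoard
    merged = tf.summary.merge_all()

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # FileWriter dumps the graph and summaries; point
        # `tensorboard --logdir /tmp/tf_demo` at the same directory.
        writer = tf.summary.FileWriter("/tmp/tf_demo", sess.graph)
        summary, _ = sess.run([merged, loss], feed_dict={x: [[1, 2, 3, 4]]})
        writer.add_summary(summary, global_step=0)
        writer.close()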

You can get a lot of this just by watching the keynote, even just the first 10 minutes of it.

Whether you buy the KoolAid or not, it's an impressive demonstration of the quantity and quality of KoolAid that the Google mind can produce when it decides that it needs KoolAid.

An LSTM is a Long Short-Term Memory node, a basic building block of the networks that translate languages or process other variable-length symbol strings.
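
For context, a minimal sketch of an LSTM in the 1.x Python API (the
16-dimensional embeddings and 32 units are arbitrary choices for the demo):

    import tensorflow as tf

    # A single LSTM cell unrolled over a batch of variable-length
    # symbol sequences, shaped [batch, time, features].
    inputs = tf.placeholder(tf.float32, shape=[None, None, 16])
    lengths = tf.placeholder(tf.int32, shape=[None])   # true length of each sequence

    cell = tf.contrib.rnn.BasicLSTMCell(num_units=32)
    outputs, state = tf.nn.dynamic_rnn(cell, inputs,
                                       sequence_length=lengths,
                                       dtype=tf.float32)
    # `outputs` holds the hidden state at every time step; `state`
    # carries the final (c, h) memory pair forward.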

-- rec --



============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
FRIAM-COMIC http://friam-comic.blogspot.com/ by Dr. Strangelove
