On 2019/2/24 1:04 AM, Khem Raj wrote:
On Sat, Feb 23, 2019 at 7:29 AM Richard Purdie
<richard.pur...@linuxfoundation.org> wrote:
On Fri, 2019-02-22 at 20:49 +0000, Manjukumar Harthikote Matha wrote:
You might be interested in the yocto layers for tensorflow,
tensorflow-lite and
caffe2 on github here [1]. I'm not part of the team that developed
that work but I
forwarded your announcement to them. Perhaps there is the
opportunity for some
collaboration on the platform independent parts. The maintainer
details are in the
readme.

Thanks for the layer, Hongxu. I agree with Steve; it would be good if
you could collaborate with meta-renesas-ai and introduce the layer as
meta-ai under meta-openembedded.
Please don't do the meta-openembedded part!

I would agree with not making it a sub-layer under meta-openembedded, but it can
be hosted on the OpenEmbedded git infrastructure; I don't see much problem with
that, if that's the case.

I believe that meta-oe is too large to be maintainable and that we need
a larger number of smaller layers.

There is a fine balance to be struck, as I have come to realize over the years,
but AI is large enough and segmented enough to warrant a layer of its own.

Having tensorflow in its own layer, which has a specific purpose and
specific maintainers who understand it, is in my view much more
desirable and sustainable.
I think it's a good idea to have the various AI infrastructures in one layer,
including tensorflow, unless we have a large enough dev community to maintain
each of them separately, so I like meta-ai conceptually.

I know that creating a standalone meta-ai rather than meta-tensorflow is more
reasonable; that was my initial layer naming. But:
- It would dramatically increase the maintenance burden, so I limited the scope
  to the specific framework name. There are lots of TODOs in TensorFlow, and I'm
  afraid I do not have spare attention for other AI frameworks at the moment.

- TensorFlow is standalone enough: its build system is Google's `bazel', which,
  like bitbake, has its own rules to build everything from scratch. (I've
  already sent the recipes not built with bazel to meta-openembedded.)

- Bazel is built with Java. If we do not create sub-layers in meta-ai (such as
  meta-ai/meta-tensorflow), the number of meta-ai layer dependencies will keep
  growing as other AI frameworks are added. For users of other AI frameworks,
  depending on unused layers is not a good idea.

- For future AI framework integration: if the framework is huge like TensorFlow
  (another well-known one is Facebook's PyTorch), we could create a standalone
  layer and appoint a dedicated maintainer for it; if the framework is small and
  light, or is a fundamental algorithm package used by multiple frameworks, we
  could collect it in a meta-ai layer, or add it directly to meta-openembedded.
  (For the TensorFlow integration, I added 11 fundamental recipes to
  meta-openembedded.)
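The dependency argument above can be illustrated with a sketch of what a
sub-layer's conf/layer.conf might look like. This is a hypothetical fragment,
not taken from the actual meta-tensorflow layer; the collection and dependency
names (e.g. "meta-java" for the Java layer bazel would need) are assumptions
for illustration:

```conf
# Hypothetical meta-ai/meta-tensorflow/conf/layer.conf sketch
BBPATH .= ":${LAYERDIR}"
BBFILES += "${LAYERDIR}/recipes-*/*/*.bb ${LAYERDIR}/recipes-*/*/*.bbappend"

BBFILE_COLLECTIONS += "meta-tensorflow"
BBFILE_PATTERN_meta-tensorflow = "^${LAYERDIR}/"
BBFILE_PRIORITY_meta-tensorflow = "6"

# Only this sub-layer pulls in the Java/bazel dependency chain; users of
# other AI frameworks never need to add meta-java to their bblayers.conf.
LAYERDEPENDS_meta-tensorflow = "core openembedded-layer meta-java"
```

With a flat meta-ai instead, the heavy LAYERDEPENDS would sit on the top-level
layer and every consumer would inherit them, which is the situation the
sub-layer split avoids.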

//Hongxu


Cheers,

Richard


--
_______________________________________________
yocto mailing list
yocto@yoctoproject.org
https://lists.yoctoproject.org/listinfo/yocto
