Re: [BangPypers] My proposal for Pycon India 2016

2016-06-22 Thread Akshay Aradhya
Isn't that a bit too simple for PyCon?
On 23 Jun 2016 00:07, "Annapoornima Koppad"  wrote:

> Dear All,
>
> I submitted my proposal for PyCon India 2016. Please do vote on it and
> help me get to the conference.
>
> Data Collection Using Raspberry Pi 3.0 and sensors using Python | PyCon
> India 2016
>
> https://in.pycon.org/cfp/2016/proposals/data-collection-using-raspberry-pi-30-and-sensors-using-python~aO8Gb/
>
> Thanks and regards,
> Annapoornima Koppad
___
BangPypers mailing list
BangPypers@python.org
https://mail.python.org/mailman/listinfo/bangpypers


[BangPypers] My proposal for Pycon India 2016

2016-06-22 Thread Annapoornima Koppad
Dear All,

I submitted my proposal for PyCon India 2016. Please do vote on it and
help me get to the conference.

Data Collection Using Raspberry Pi 3.0 and sensors using Python | PyCon
India 2016

https://in.pycon.org/cfp/2016/proposals/data-collection-using-raspberry-pi-30-and-sensors-using-python~aO8Gb/


Thanks and regards,
Annapoornima Koppad


Re: [BangPypers] AWS S3 Routine with Python

2016-06-22 Thread Anand Chitipothu
On Wed, 22 Jun 2016 at 22:18 Sundar N  wrote:

> Hi,
> Looking for some pointers on using Python to decompress files on AWS S3.
> I have been using the Boto 2.x library.
> I currently need to extract a compressed file that is stored in one of the
> S3 buckets. There is no direct API to handle this as of now.
>
> Any pointers on tackling this problem would be of great assistance.


S3 is just a storage service. Uncompressing an archive requires some
computation, which you have to handle separately.

You need to download the file, extract it locally and then upload the
extracted files back to S3. If the file is too big or the bandwidth is not that
great on your local machine, you can try it on a server with a fat pipe.
Remember that S3 also charges you for the transfer. If you care about those
charges, then try using an EC2 server (IIRC transfers among AWS services
are not billed). If you need to do this operation many times, try
exploring AWS Lambda.
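A minimal sketch of that download/extract/upload round trip. The in-memory
extraction part is plain standard library; the S3 calls are shown as comments
using Boto 2.x (since that's what you're on), with hypothetical bucket and key
names:

```python
import io
import zipfile

def extract_zip_bytes(data):
    """Extract a zip archive held in memory; return {member_name: bytes}."""
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        return {name: zf.read(name) for name in zf.namelist()}

# The S3 round trip with Boto 2.x (bucket and key names are placeholders):
#
#   import boto
#   conn = boto.connect_s3()
#   bucket = conn.get_bucket('my-bucket')            # hypothetical bucket
#   key = bucket.get_key('archives/data.zip')        # hypothetical key
#   members = extract_zip_bytes(key.get_contents_as_string())
#   for name, payload in members.items():
#       out = bucket.new_key('extracted/' + name)
#       out.set_contents_from_string(payload)
```

For a very large archive you'd want to stream to disk instead of holding it in
memory, and for .tar.gz you'd swap zipfile for tarfile.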

Anand


[BangPypers] AWS S3 Routine with Python

2016-06-22 Thread Sundar N
Hi,
Looking for some pointers on using Python to decompress files on AWS S3.
I have been using the Boto 2.x library.
I currently need to extract a compressed file that is stored in one of the
S3 buckets. There is no direct API to handle this as of now.

Any pointers on tackling this problem would be of great assistance.

Thanks in advance.
Sundar.