Re: [Mojolicious] Re: Problem with streaming...

2020-05-15 Thread Dan Book
Take a look at the proxy helpers added recently; at the least, they should
provide inspiration:
https://metacpan.org/pod/Mojolicious::Plugin::DefaultHelpers#proxy-%3Eget_p
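
A minimal sketch of what a route using that helper could look like in
Mojolicious::Lite (the route name and URL are placeholders, not from this
thread):

  use Mojolicious::Lite;

  get '/download' => sub {
    my $c = shift;
    # proxy->get_p streams the upstream response to the client,
    # handles backpressure internally, and returns a promise
    $c->proxy->get_p('https://example.com/big-file')->catch(sub {
      my $err = shift;
      $c->log->error("Proxy error: $err");
      $c->render(text => "Proxy error: $err", status => 502);
    });
  };

  app->start;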

-Dan

On Fri, May 15, 2020 at 1:13 AM Joseph Fridy  wrote:

> I have Mojo::UserAgent working to get the file from AWS S3 (including all
> the crypto foo), but I cannot stream its output from a controller attached
> to a Mojolicious::Lite route.  I am running into the same problem as is
> discussed in this thread from 2018:
>
> https://groups.google.com/forum/#!topic/mojolicious/fIGwnDOxl2E
>
> $ua->start($tx) is a blocking call, and as a result my chunks merely
> append to a buffer with nowhere to go until memory fills up.
> Unfortunately, if the way to solve this problem is answered in the
> responses to the thread above, said answer is too subtle for me to suss out.
>
> So, to restate my problem in words:
>
> given a Mojo::UserAgent transaction that can read a very large file, how
> can I stream its output, buffer by buffer, into a browser via a
> Mojolicious::Lite route?
>
> Regards,
>
> Joe Fridy


[Mojolicious] Re: Problem with streaming...

2020-05-14 Thread Joseph Fridy
I have Mojo::UserAgent working to get the file from AWS S3 (including all 
the crypto foo), but I cannot stream its output from a controller attached 
to a Mojolicious::Lite route.  I am running into the same problem as is 
discussed in this thread from 2018:

https://groups.google.com/forum/#!topic/mojolicious/fIGwnDOxl2E

$ua->start($tx) is a blocking call, and as a result my chunks merely append 
to a buffer with nowhere to go until memory fills up.  Unfortunately, if 
the way to solve this problem is answered in the responses to the thread 
above, said answer is too subtle for me to suss out.

So, to restate my problem in words:

given a Mojo::UserAgent transaction that can read a very large file, how 
can I stream its output, buffer by buffer, into a browser via a 
Mojolicious::Lite route?

Regards,

Joe Fridy
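
In outline, the non-blocking pattern this question is asking for looks
roughly like the sketch below: render_later keeps the response open, each
'read' event is relayed with write, and start() is given a callback so it
does not block the event loop. The URL is a placeholder, and backpressure
is omitted for brevity:

  use Mojolicious::Lite;

  # Placeholder for the presigned S3 URL (an assumption, not from the thread)
  my $url = 'https://example.com/big-file';

  get '/bigfile' => sub {
    my $c = shift;
    $c->render_later;    # we will stream the response ourselves

    my $tx = $c->ua->build_tx(GET => $url);
    # Replace the default 'read' handler so chunks are relayed
    # instead of being accumulated in memory
    $tx->res->content->unsubscribe('read')->on(read => sub {
      my ($content, $bytes) = @_;
      $c->write($bytes);
    });

    # A callback makes start() non-blocking, so the event loop
    # keeps running and can flush what write() queued up
    $c->ua->start($tx => sub { $c->finish });
  };

  app->start;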



[Mojolicious] Re: Problem with streaming...

2020-05-10 Thread Joseph Fridy
My sincere apologies.  The two scraps are:

$tx->res->content->unsubscribe('read')->on(read => sub {
  my ($content, $bytes) = @_;
  our $globalC;
  our $digester;
  our $transferLength;
  our $contentLength;
  if (!$content->headers->header('headersWritten')) {
    foreach my $name (@{$content->headers->names}) {
      my $value = $content->headers->header($name);
      # $globalC should receive the streamed file.  This is setting
      # the headers from the S3 presigned URL.
      $globalC->res->headers->header($name => $value);
      if ($name =~ /[cC]ontent-[lL]ength/) {
        $contentLength = $value;
      }
    }
    $content->headers->header('headersWritten' => 1);
    $globalC->write;
  }
  $digester->add($bytes);
  $transferLength += length($bytes);
  $globalC->write($bytes);
  print "$transferLength bytes written...\n";
});


This one fails after reading 668MB. The second scrap:


$c->proxy->get_p($url => $headers)->catch(sub {
  my $err = shift;
  print "Proxy error is $err\n";
  $c->render(text => "Error: $err\n");
});

This one fails after trying three or four times with "Connection refused".
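
For reference, the usual cure for the unbounded buffering in the first
scrap is backpressure: pause the upstream connection until the chunk just
written has drained to the client. A sketch of that idea, assuming $tx and
$globalC as above (and a non-blocking $ua->start, since a blocking start
never returns control to the event loop):

  use Mojo::IOLoop;

  $tx->res->content->unsubscribe('read')->on(read => sub {
    my ($content, $bytes) = @_;
    # Stop reading from S3 until the client catches up
    my $stream = Mojo::IOLoop->stream($tx->connection);
    $stream->stop if $stream;
    # The second argument to write() is a drain callback
    $globalC->write($bytes => sub {
      $stream->start if $stream;    # resume reading from S3
    });
  });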

Thanks,

Joe Fridy
On Sunday, May 10, 2020 at 6:53:42 AM UTC-4, Sebastian Riedel wrote:
>
> Please don't use HTML for formatting, the code is completely unreadable.
>
> --
> sebastian
>


[Mojolicious] Re: Problem with streaming...

2020-05-10 Thread Sebastian Riedel
Please don't use HTML for formatting, the code is completely unreadable.

--
sebastian


[Mojolicious] Re: Problem with streaming...

2020-05-10 Thread Joseph Fridy
I reproduced the curl command with UserAgent successfully, and fixed my 
confusion with the headers.  Now I just have to be able to stream the 
results to the browser.  Here is the heart of my initial attempt, where 
$globalC points to the controller argument that should send data to the 
browser.  This reads 688MB of data, apparently without writing any of it 
to the browser, and then fails with an "Out of memory!" error:

$tx->res->content->unsubscribe('read')->on(read => sub {
  my ($content, $bytes) = @_;
  our $globalC;
  our $digester;
  our $transferLength;
  our $contentLength;
  if (!$content->headers->header('headersWritten')) {
    foreach my $name (@{$content->headers->names}) {
      my $value = $content->headers->header($name);
      # $globalC should receive the streamed file.  This is setting
      # the headers from the S3 presigned URL.
      $globalC->res->headers->header($name => $value);
      if ($name =~ /[cC]ontent-[lL]ength/) {
        $contentLength = $value;
      }
    }
    $content->headers->header('headersWritten' => 1);
    $globalC->write;
  }
  $digester->add($bytes);
  $transferLength += length($bytes);
  $globalC->write($bytes);
  print "$transferLength bytes written...\n";
});

I attempted to use the $c->proxy->get_p helper in the following manner:

$c->proxy->get_p($url => $headers)->catch(sub {
  my $err = shift;
  print "Proxy error is $err\n";
  $c->render(text => "Error: $err\n");
});



This appears to restart three or four times before failing with a 
"Connection refused".  The headers are the same headers that work correctly 
in the download with Mojo::UserAgent.

Any guidance?  I assure you I have assiduously read the references you 
have mentioned heretofore.

Regards,

Joe Fridy



On Monday, May 4, 2020 at 3:05:11 AM UTC-4, Joseph Fridy wrote:
>
> I am attempting to stream the output of a curl command with Mojolicious.  
> The curl command looks (roughly) like this:
>
> curl -H "x-amz-server-side-encryption-customer-algorithm:AES256" \
>   -H "x-amz-server-side-encryption-customer-key:secretKey=" \
>   -H "x-amz-server-side-encryption-customer-key-MD5:HashOfASecret==" \
>   "https://mybucket.s3.amazonaws.com/obscureFileLocation?AWSAccessKeyId=secretStuff&Expires=1588568911&Signature=moreSecrets" \
>   --dump-header header.461 --silent
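>
> Purely as a sketch, the same request could be rebuilt with
> Mojo::UserAgent, passing the SSE-C headers as a hash.  The values are
> the redacted placeholders from the curl command, and $presignedUrl is a
> stand-in for the full pre-signed URL:
>
>   use Mojo::UserAgent;
>
>   # $presignedUrl stands in for the pre-signed URL above
>   my $presignedUrl = 'https://mybucket.s3.amazonaws.com/obscureFileLocation';
>   my $headers = {
>     'x-amz-server-side-encryption-customer-algorithm' => 'AES256',
>     'x-amz-server-side-encryption-customer-key'       => 'secretKey=',
>     'x-amz-server-side-encryption-customer-key-MD5'   => 'HashOfASecret==',
>   };
>
>   my $ua = Mojo::UserAgent->new;
>   # build_tx creates the transaction without sending it, leaving room
>   # to subscribe to the response's 'read' event before starting it
>   my $tx = $ua->build_tx(GET => $presignedUrl => $headers);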
>
>
> The curl command references a pre-signed URL for a file stored in AWS S3 
> using Server-Side Encryption with Customer-Provided Keys (SSE-C), and 
> supplies the necessary key material via HTTP headers (the -H options).  
> The curl command works - but I don't want my users to need a system with 
> curl on it to access their files.  The plan is to open the curl command 
> as input to a pipe and stream its output to the user's browser with 
> Mojolicious.  The curl command also dumps out the HTTP headers from 
> Amazon, so they can be reused by Mojolicious.  They look like this:
>
>
> x-amz-id-2: 
> sgMzHD2FJEGJrcbvzQwdhZK6mxUW+ePd6xdghTfgSlV45lMhliIw4prfk4cZMTHbS4fJN8N7xio=
>
> x-amz-request-id: 99B9CA56083DD9ED
>
> Date: Mon, 04 May 2020 04:57:22 GMT
>
> Last-Modified: Sat, 02 May 2020 03:47:35 GMT
>
> ETag: "b3a11409be2705e4581119fa59af79d3-1025"
>
> x-amz-server-side-encryption-customer-algorithm: AES256
>
> x-amz-server-side-encryption-customer-key-MD5: HashOfSecretKey==
>
> Content-Disposition: attachment; filename = "fiveGigFile"
>
> Accept-Ranges: bytes
>
> Content-Type: application/octet-stream; charset=UTF-8
>
> Content-Length: 5368709125
>
> Server: AmazonS3 
>
>
> Note that the file is 5Gig.
>
>
>
> This is my stab at streaming with Mojolicious:
>
>
> use strict;
> use Mojolicious::Lite;
> use FileHandle;
> use Digest::MD5;
>
> any '/' => sub {
>   my $c = shift;
>   $c->render(template => "test");
> };
>
> any 'pickup' => sub {
>   my $c = shift;
>   my $nGigs = 0;
>   my $nMegs = 0;
>   $| = 1;
>   open(CURLCMD, "curlCmd");
>   my $curlCmd = <CURLCMD>;
>   if ($curlCmd =~ /dump-header\s*(\S+)\s+/) {
>
>     my $headerFile = $1;
>     open(my $curl, "$curlCmd |");
>     binmode $curl;
>     my $initialized = 0;
>     my $digester = Digest::MD5->new;
>     my $transferLength = 0;
>
>     my $drain;
>     $drain = sub {
>       my $c = shift;
>       my $chunk;
>       sysread($curl, $chunk, 1024*1024);
>       if (!$initialized) {
>         # read the headers, and set up the transfer...
>         open(HEADERS, $headerFile);
>         while (my $line = <HEADERS>) {
>           $c->res->headers->parse($line);
>         }
>         close(HEADERS);
>         $initialized = 1;
>         print "header initialization completed for the following headers\n";
>         print join("\n", @{$c->res->headers->names}), "\n";
>       }
>       if ($initialized) {
>         while (length($chunk)) {