[JOB SEARCH] Looking for a new job

2005-07-26 Thread Slava Bizyayev
Hi Guys,

Since my contract with Edison Schools, Inc. is over, I'm looking for my
next job. I'm very thankful to Stathy Touloumis, who managed my work at
Edison. It was my pleasure to work with Stathy and the other talented
programmers on the team.

My next dream job would be to develop effective XML compression and
make it Open Source. Then I would be happy to proceed with so-called
effective content delivery through the proper combination of content
compression, client-side content caching, and implementation of
semi-static pages for dynamically generated content. Please let me know
if someone would be willing to pay me for the development of this
approach.

Alternatively, I can help you with your current needs in your commercial
projects. Please do not hesitate to ask!

I'm eligible to work in the US and other countries worldwide.

Sincerely,
-- 
Slava Bizyayev
--
== ApacheCon 2005 == Stuttgart, Germany 18-22 July 2005
http://apachecon.com/2005/EU/html/sessions.html
Improving Web Performance with Dynamic Compression
The same hardware, just 20 times faster. Friday, 22 July 2005, 10:30
--
[EMAIL PROTECTED]  +44(0)208-923-5913 http://www.dynagzip.com
[EMAIL PROTECTED]+44(0)794-766-4131 - mobile AIM: SlavaBizyayev



RE: Eagle book RandPicture.pm, $r->internal_redirect, and IE 6.0 showing same image every time

2005-07-01 Thread Slava Bizyayev
On Thu, 2005-06-30 at 22:52, David Christensen wrote:
 1.  Make a mod_perl call that tells the browser not to cache the
 upcoming document.  (Does such a call exist?)

Take a look at the Expires HTTP header. See RFC 2616 for additional details.

 
 2.  Make a mod_perl call that tells the browser that the upcoming
 document is newer than the last one requested (I might be able to
 implement this idea using time(), update_mtime(), and
 set_last_modified(), but it seems like a crude hack).

Sounds like an unnecessary hack. I would rather pay more attention to
possible proxies somewhere on the network... See the Vary: * HTTP header for
details.
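
For illustration, a minimal mod_perl 1 sketch of both headers (the package
name My::NoCache and the fixup phase are assumptions for illustration, not
something from this thread; untested here):

    package My::NoCache;
    # Hypothetical sketch: mark the upcoming document as already expired and
    # tell any intermediate caches that the response varies (see RFC 2616).
    use strict;
    use Apache::Constants qw(OK);
    use POSIX qw(strftime);

    sub handler {
        my $r = shift;
        my $now = strftime("%a, %d %b %Y %H:%M:%S GMT", gmtime(time));
        $r->header_out('Expires' => $now);   # already stale for the browser cache
        $r->header_out('Vary'    => '*');    # keep shared proxies from reusing it
        return OK;
    }
    1;

    # httpd.conf (assumed):  PerlFixupHandler My::NoCache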

Hope this helps,
-- 
Slava Bizyayev
--
Setting the pace for tomorrow's Internet today. Join us and find out
more about the future direction of this widely acknowledged server
software. == ApacheCon 2005 == Stuttgart, Germany 18-22 July 2005
http://apachecon.com/2005/EU/html/sessions.html
Improving Web Performance with Dynamic Compression
The same hardware, just 20 times faster. Friday, 22 July 2005, 10:00
--
[EMAIL PROTECTED]  +44(0)208-923-5913   http://www.dynagzip.com
[EMAIL PROTECTED]+44(0)794-766-4131 - mobile AIM: SlavaBizyayev



Re: Eagle book RandPicture.pm, $r->internal_redirect, and IE 6.0 showing same image every time

2005-07-01 Thread Slava Bizyayev
On Fri, 2005-07-01 at 09:10, Geoffrey Young wrote:
 Slava Bizyayev wrote:
  On Thu, 2005-06-30 at 22:52, David Christensen wrote:
  
 1.  Make a mod_perl call that tells the browser not to cache the
 upcoming document.  (Does such a call exist?)
  
  
  Take a look at Expires HTTP header. See rfc2616 for additional details.
 
   $r->no_cache(1) is probably simpler :)

Agreed (assuming that the implementation is sufficient and universal)...

  
  
 2.  Make a mod_perl call that tells the browser that the upcoming
 document is newer than the last one requested (I might be able to
 implement this idea using time(), update_mtime(), and
 set_last_modified(), but it seems like a crude hack).
  
  
  Sounds like an unnecessary hack. 
 
 say the current image in msie has a newer mtime on disk than the image
 you're about to redirect to.

But initially there is no locally cached content. All you need to do at
this point is to make sure that there are no caching proxies in the way.
So, Vary: * keeps it simple. ;-)

Thanks,
Slava




Re: Vanishing Requests

2005-05-12 Thread Slava Bizyayev
On Wed, 2005-05-11 at 20:22, David Marshall wrote:
 The basic problem: some requests just totally vanish.  No record
 appears of them in the apache log, and no response is returned to the
 browser.

How do you know that the request actually reaches your server?

--
Slava




Re: XML parser/unparser in Perl

2005-05-08 Thread Slava Bizyayev
Dear Igor,

Thank you for your interest in the mod_perl users list. Your contribution
might be very welcome here after some formal modifications:

1. You need to make your packaging and documentation CPAN-compatible,
because this is the only way to distribute your code to the community and
make it popular. Clear POD is _extremely_ important; it is
extracted from your code automatically when the code is packaged
properly. The following resource might be of interest:

http://www.perl.com/pub/a/2005/04/14/cpan_guidelines.html

2. You are expected to submit to the list an initial RFC (request for
comments) proposing your module, with clear explanations of why/when your
code is better than the existing alternatives available from CPAN. You
can then expect a fair discussion on the list and will be able to decide
what to do with the module.

Yes, it takes extra time, but it is worth doing, believe me...

On Sun, 2005-05-08 at 06:21, Igor Rojdestvenski wrote:
 I have made a dumb, simple and quite convenient XML parser/unparser in
 Perl, which may be quite handy for mod_perl developers. Feel free to
 download it (also free) from
 http://www.patronov.net/programs/hashparser.pm.
 Self documented. Any modifications of the code are permitted without
 any limitations. Comments will be appreciated as well as if you make a
 reference to Igor Rojdestvenski when using it.

Kind regards,
-- 
Slava Bizyayev
--
Setting the pace for tomorrow's Internet today. Join us and find out
more about the future direction of this widely acknowledged server
software. == ApacheCon 2005 == Stuttgart, Germany 18-22 July 2005
http://apachecon.com/2005/EU/html/sessions.html
The same hardware, just 20 times faster. Friday, 22 July 2005, 10:00
--
[EMAIL PROTECTED]  +44(0)208-923-5913   http://www.dynagzip.com
[EMAIL PROTECTED]+44(0)794-766-4131 - mobile AIM: SlavaBizyayev



Re: Web Content Compression FAQ - update

2005-04-19 Thread Slava Bizyayev
On Mon, 2005-04-18 at 23:08, Scott Gifford wrote:
 ... I'm just explaining why the > probably showed up.

Right, I hope I've fixed this in the POD original for now by joining the 'From'
line with the previous one. Luckily for me, it was not the beginning of
the paragraph...

As a general rule for the future, it would probably be safer to email POD
as a *.pod.gz file, or inside a *.tar.gz, wouldn't it?

Thanks,
Slava




Re: Implementing abstract URI with mod_perl?

2005-04-19 Thread Slava Bizyayev
On Tue, 2005-04-19 at 11:30, Devin Murphy wrote:
 I'm trying to implement abstract URL / URI calls.
 
 For my htdocs directory (and .html files), the
 MultiViews option works fine.  For example,
 http://localhost/address will resolve to its
 respective html file.
 
 No such luck when I'm trying to get Apache to resolve
 .pl files.  For example, http://localhost/perl/counter
 gives me a 404 error.  (The full path,
 http://localhost/perl/counter.pl, works right)
 
 The idea is to hide the .pl extension from average
 users / links / &c.
 
 I've set up a simple Location tag:
 
 <Location /perl>
SetHandler perl-script
PerlHandler ModPerl::Registry
Options +ExecCGI
PerlSendHeader On
 </Location>
 
 And it seems like MultiViews option causes problems
 with the scripts (it denies access / 403 error).
 
 Also no luck setting up a typemap / .var file.
 
 Any hints?

How about fine-tuning mod_rewrite, or writing a simple
PerlTransHandler?
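
For what it's worth, a minimal sketch of the PerlTransHandler idea
(mod_perl 1; the package name My::AddPlExt and the /perl/ prefix are
assumptions for illustration, untested here):

    package My::AddPlExt;
    # Hypothetical translation-phase sketch: map an extensionless URI such as
    # /perl/counter onto /perl/counter.pl, then let Apache finish translating.
    use strict;
    use Apache::Constants qw(DECLINED);

    sub handler {
        my $r   = shift;
        my $uri = $r->uri;
        if ($uri =~ m{^/perl/[^./]+$}) {   # bare name under /perl/, no extension
            $r->uri($uri . '.pl');
        }
        return DECLINED;                   # default URI-to-filename translation continues
    }
    1;

    # httpd.conf (assumed):  PerlTransHandler My::AddPlExt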

Thanks,
Slava




Web Content Compression FAQ - update

2005-04-18 Thread Slava Bizyayev
=for html
<blockquote>

On May 21, 2002 Peter J. Cranstone wrote to the [EMAIL PROTECTED] mailing list:

I<...With 98% of the world on a dial up modem, all they care about is how long 
it takes to download a page.  It doesn't matter if it consumes a few more CPU 
cycles if the customer is happy.  It's cheaper to buy a newer faster box, than 
it is to acquire new customers.>

=for html
</blockquote>

=head1 How hard is it to implement content compression on an existing site?

=head2 Implementing content compression on an existing site typically involves 
no more than installing and configuring an appropriate Apache handler on the 
Web server.

This approach works in most of the cases I have seen.
In some special cases you will need to take extra care with respect to the 
global architecture of your Web application,
but such cases may generally be readily addressed through various techniques.
To date I have found no fundamental barriers to practical implementation of Web 
content compression.

=head1 Does compression work with standard Web browsers?

=head2 Yes. No client side changes or settings are required.

All modern browser makers claim to be able to handle compressed content and are 
able to decompress it on the fly,
transparent to the user.
There are some known bugs in some old browsers, but these can be taken into 
account
through appropriate configuration of the Web server.

I strongly recommend use of the C<Apache::CompressClientFixup> handler in your 
server configuration
in order to prevent compression for known buggy clients.

=head1 Is it possible to combine the content compression with data encryption?

=head2 Yes.  Compressed content can be encrypted and securely transmitted over 
SSL.

On the client side, the browser transparently decrypts and decompresses the 
content for the user.
It is important to maintain the correct order of operations on server side to 
keep the transaction secure.
You must compress the content first and then apply an encryption mechanism.
This is the only order of operations current browsers support.

=head1 What software is required on the server side for content compression?

=head2 There are four known mod_perl modules/packages for Web content 
compression available to date for Apache 1.3 (in alphabetical order):

=over 4

=item * Apache::Compress

a mod_perl handler developed by Ken Williams (U.S.), C<Apache::Compress>,
can generate gzip output through the C<Apache::Filter>.
This module accumulates all incoming data and compresses the entire content 
body as a unit.

=item * Apache::Dynagzip

a mod_perl handler developed by Slava Bizyayev, C<Apache::Dynagzip> uses the 
gzip format
to compress output dynamically through the C<Apache::Filter> or through the 
internal Unix pipe.

C<Apache::Dynagzip> is most useful when one needs to compress dynamic outbound 
Web content
(generated on the fly from databases, XML, etc.) when content length is not 
known at the time of the request.

C<Apache::Dynagzip>'s features include: 

=over 4

=item * Support for both HTTP/1.0 and HTTP/1.1.

=item * Control over the chunk size on HTTP/1.1 for on-the-fly content 
compression.

=item * Support for Perl, Java, or C/C++ CGI applications.

=item * Advanced control over the proxy cache with the configurable C<Vary> 
HTTP header.

=item * Optional control over content lifetime in the client's local cache with 
the configurable C<Expires> HTTP header.

=item * Optional support for server-side caching of the dynamically generated 
(and compressed) content.

=item * Optional extra-light compression

removal of leading blank spaces and/or blank lines, which works for all 
browsers,
including older ones that cannot uncompress gzip format.

=back

=item * Apache::Gzip

an example of the mod_perl filter developed by Lincoln Stein and Doug MacEachern
for their book I<Writing Apache Modules with Perl and C> (U.S.),
which like C<Apache::Compress> works through C<Apache::Filter>.
C<Apache::Gzip> is not available from CPAN.
The source code may be found on the book's companion Web site at 
L<http://www.modperl.com/>.

=item * Apache::GzipChain

a mod_perl handler developed by Andreas Koenig (Germany),
which compresses output through C<Apache::OutputChain> using the gzip format.

C<Apache::GzipChain> currently provides in-memory compression only.
Use of this module under C<perl-5.8> or higher is appropriate for Unicode data.
UTF-8 data passed to C<Compress::Zlib::memGzip()> are converted to raw UTF-8 
before compression takes place.
Other data are simply passed through.

=back

=head1 What is the typical overhead in terms of CPU use for the content 
compression?

=head2 Typical CPU overhead that originates from content compression is 
insignificant.

In my observations of data compression of files of up to 200K, it takes less 
than 60 ms of CPU on a P4 3 GHz processor.
I could not measure the lower boundary reliably for dynamic compression, 
because there is no really measurable latency.
From the perspective of global architecture and scalability planning,
I would suggest allowing some 10 ms per request

Re: Web Content Compression FAQ - update

2005-04-18 Thread Slava Bizyayev
Thanks, Stas!
On Mon, 2005-04-18 at 12:10, Stas Bekman wrote:
 Thanks Slava, committed.
 
 I'd further suggest to drop all =head2 strings, merging the content with 
 the question, making the TOC even more useful and the text more readable.


Frankly speaking, I don't like this idea. From my point of view, it
makes all the headers messy and confusing. In the current version the TOC is
like an abstract of the main text. In some cases it might be sufficient to
read the TOC only. So again, from my point of view, it makes sense, and I
would like to keep it as is...

 Also, please fix your original:
 
 - >From the perspective of global architecture and scalability planning,
 + From the perspective of global architecture and scalability planning,

Strange... There is no '>' in my original. Nor do I find one in the
attachment that I sent...

Anyway, thanks once again,
Slava




Re: Web Content Compression FAQ - new update

2005-04-11 Thread Slava Bizyayev
Hi Stas,

On Mon, 2005-04-11 at 11:53, Stas Bekman wrote:
 It's done automatically by using the first paragraph of the
 
=head1 Description
 
 section, which exists in all pod files but yours :) I've added a simple:
 
 =head1 Description
 
 Everything you wanted to know about web content compression
 
 Let me know if you want something more elaborate.

Thanks, I will add this to my original for now in order to edit later.

 
 Also if you wish to maintain this and other documents on a more frequent 
 basis, we can give you commit access, so you won't depend on others.

Yes, it would be helpful and convenient.

 
 Finally, may I suggest dropping the Q/A format and just having the questions 
 w/o the leading Q:? It will make the Table of Contents much easier to 
 read and it won't have any impact on the document itself.

Agreed. I can clean 'em up tonight and commit tomorrow.

Thanks,
Slava




Web Content Compression FAQ - new update

2005-04-10 Thread Slava Bizyayev
 their own compression techniques that reduce the 
advantage of further compression.

=for html
<blockquote>

On May 21, 2002 Peter J. Cranstone wrote to the [EMAIL PROTECTED] mailing list:

I<...With 98% of the world on a dial up modem, all they care about is how long 
it takes to download a page.  It doesn't matter if it consumes a few more CPU 
cycles if the customer is happy.  It's cheaper to buy a newer faster box, than 
it is to acquire new customers.>

=for html
</blockquote>

=head1 Q: How hard is it to implement content compression on an existing site?

=head2 A: Implementing content compression on an existing site typically 
involves no more than installing and configuring an appropriate Apache handler 
on the Web server.

This approach works in most of the cases I have seen.
In some special cases you will need to take extra care with respect to the 
global architecture of your Web application,
but such cases may generally be readily addressed through various techniques.
To date I have found no fundamental barriers to practical implementation of Web 
content compression.

=head1 Q: Does compression work with standard Web browsers?

=head2 A: Yes. No client side changes or settings are required.

All modern browser makers claim to be able to handle compressed content and are 
able to decompress it on the fly,
transparent to the user.
There are some known bugs in some old browsers, but these can be taken into 
account
through appropriate configuration of the Web server.

I strongly recommend use of the C<Apache::CompressClientFixup> handler in your 
server configuration
in order to prevent compression for known buggy clients.

=head1 Q: Is it possible to combine the content compression with data 
encryption?

=head2 A: Yes.  Compressed content can be encrypted and securely transmitted 
over SSL.

On the client side, the browser transparently decrypts and decompresses the 
content for the user.
It is important to maintain the correct order of operations on server side to 
keep the transaction secure.
You must compress the content first and then apply an encryption mechanism.
This is the only order of operations current browsers support.

=head1 Q: What software is required on the server side for content compression?

=head2 A: There are four known mod_perl modules/packages for Web content 
compression available to date for Apache 1.3 (in alphabetical order):

=over 4

=item * Apache::Compress

a mod_perl handler developed by Ken Williams (U.S.), C<Apache::Compress>,
can generate gzip output through the C<Apache::Filter>.
This module accumulates all incoming data and compresses the entire content 
body as a unit.

=item * Apache::Dynagzip

a mod_perl handler developed by Slava Bizyayev, C<Apache::Dynagzip> uses the 
gzip format
to compress output dynamically through the C<Apache::Filter> or through the 
internal Unix pipe.

C<Apache::Dynagzip> is most useful when one needs to compress dynamic outbound 
Web content
(generated on the fly from databases, XML, etc.) when content length is not 
known at the time of the request.

C<Apache::Dynagzip>'s features include: 

=over 4

=item * Support for both HTTP/1.0 and HTTP/1.1.

=item * Control over the chunk size on HTTP/1.1 for on-the-fly content 
compression.

=item * Support for Perl, Java, or C/C++ CGI applications.

=item * Advanced control over the proxy cache with the configurable C<Vary> 
HTTP header.

=item * Optional control over content lifetime in the client's local cache with 
the configurable C<Expires> HTTP header.

=item * Optional support for server-side caching of the dynamically generated 
(and compressed) content.

=item * Optional extra-light compression

removal of leading blank spaces and/or blank lines, which works for all 
browsers,
including older ones that cannot uncompress gzip format.

=back

=item * Apache::Gzip

an example of the mod_perl filter developed by Lincoln Stein and Doug MacEachern
for their book I<Writing Apache Modules with Perl and C> (U.S.),
which like C<Apache::Compress> works through C<Apache::Filter>.
C<Apache::Gzip> is not available from CPAN.
The source code may be found on the book's companion Web site at 
L<http://www.modperl.com/>.

=item * Apache::GzipChain

a mod_perl handler developed by Andreas Koenig (Germany),
which compresses output through C<Apache::OutputChain> using the gzip format.

C<Apache::GzipChain> currently provides in-memory compression only.
Use of this module under C<perl-5.8> or higher is appropriate for Unicode data.
UTF-8 data passed to C<Compress::Zlib::memGzip()> are converted to raw UTF-8 
before compression takes place.
Other data are simply passed through.

=back

=head1 Q: What is the typical overhead in terms of CPU use for the content 
compression?

=head2 A: Typical CPU overhead that originates from content compression is 
insignificant.

In my observations of data compression of files of up to 200K, it takes less 
than 60 ms of CPU on a P4 3 GHz processor.
I could not measure the lower boundary reliably for dynamic

Re: OOP or functional?

2005-04-02 Thread Slava Bizyayev
On Sat, 2005-04-02 at 07:29, Octavian Rasnita wrote:
 As a general idea, what way do you suggest to create the modules that will
 be used with mod_perl?
 Using the functional style (with the Exporter module), or using the object
 oriented style?
 
 I am asking this because I want to maximize the speed of the execution, and
 I know that the OOP way might be slower.

It appears to be somewhat OT, but if you are really concerned about the
performance of a particular chunk of your code, you should take a look
at NASM, MASM, etc. -- whatever is appropriate for your hardware. From the C
level up, all languages and techniques are mainly targeting code
reusability.

Regards,
Slava




Re: Apache::Clean worthwhile in addition to mod_gzip ?

2005-02-24 Thread Slava Bizyayev
Hi Jonathan,

On Thu, 2005-02-24 at 11:13, Jonathan Vanasco wrote:

 a _ what is the typical overhead in terms of cpu use -- ie, 
 cpu/connection time saved by smaller/faster downloads vs those used by 
 zipping

The short answer is: the typical CPU overhead originating from
content compression is insignificant.

Actually, in my observations over files of up to 200K it was estimated
as less than 60 ms on a P4 3 GHz processor. I could not measure the lower
boundary reliably. I can suggest counting some 10 ms per request on
regular web pages for performance estimation, but I would refrain
from betting on that recommendation.

Estimating the connection time is even more complicated, because of
the variety of possible conditions on the network. The worst-case scenario is
pretty impressive: a slow dial-up user connected via an ISP with no
proxy/buffering holds your socket for a time that is proportional to
the size of the requested file. Here, gzip can make this some 3-20
times shorter. However, if the ISP buffers, you might not feel
that bad, apart from the fact that you are paying your X-Telecom for the
transmission of blank data.

 b_ just to clarify, mod_deflate is the only chain usable for apache 2 
 -- and the various Apache:: perlmods are unneeded are incompatible?

Basically, this is true for Apache::Dynagzip at least, and I thought
this was stated pretty clearly in the FAQ. Additional tiny
filters could be prepended to mod_deflate on Apache 2 in order to provide
the necessary features when required.

Thanks,
Slava
--
http://www.lastmileisp.com/




Re: Apache::Clean worthwhile in addition to mod_gzip ?

2005-02-24 Thread Slava Bizyayev
On Thu, 2005-02-24 at 11:17, Geoffrey Young wrote:
  b_ just to clarify, mod_deflate is the only chain usable for apache 2 --
  and the various Apache:: perlmods are unneeded are incompatible?
 
 there is an Apache::Clean designed for use with apache2 on cpan
 
   http://search.cpan.org/~geoff/Apache-Clean-2.00_5/
 
 you can also read about it here
 
   http://www.perl.com/pub/a/2003/04/17/filters.html
   http://www.perl.com/pub/a/2003/05/22/testing.html
 

Thanks Geoff, I got it.

Slava




Re: Apache::Clean worthwhile in addition to mod_gzip ?

2005-02-24 Thread Slava Bizyayev
On Thu, 2005-02-24 at 11:53, Mark Stosberg wrote:

 I hadn't read closely about Dynagzip before. Now I see that it
 does white space compression; I think I may stop there, and not try to
 add Apache::Clean to the mix as well.

However, please let me know if you decide to use it for some reason. It
should be compatible with Apache::Dynagzip within the Apache::Filter
chain. You can turn off the Light-Compression in that case, and use all
features of Apache::Clean instead.

 
  What question would you like to add to Web Content Compression FAQ?
 
 Well, I can tell you my question, but I can't tell you if it has been
 frequent. :) 
 
 Basically: Is it worth cleaning (safely modifying) HTML before it's
 compressed?

Thanks, I hope to edit the FAQ shortly using all mentioned questions.

Regards,
Slava




Re: [OSCon 2005 guidelines] what talks to submit

2005-02-10 Thread Slava Bizyayev
Hi everyone,

On Thu, 2005-02-03 at 02:56, Stas Bekman wrote:
  To remind: OSCON will be August 1-5 in Portland, OR.
  http://conferences.oreillynet.com/os2005/
  Proposals are due on Feb 13, so don't procrastinate and submit your 
  proposals now: http://conferences.oreillynet.com/cs/os2005/create/e_sess
  

I've submitted my proposal, but it looks like there is no way to verify
whether it arrived at its destination or not. The website does not indicate
anything, and there is no email notification either... Looks odd to me.
Am I missing something?

Thanks,
Slava




Re: [OSCon 2005] rfc Open Source Dynamic Data Compression

2005-02-03 Thread Slava Bizyayev
Thanks, Stas!

On Thu, 2005-02-03 at 02:51, Stas Bekman wrote:
 
 Looks good to me. IMHO I'd spend as little time as possible on the history. 
 When you have only 45 min, as an attendee I'd rather be interested in the 
 present and maybe a little bit about the future, but past is past... :)

I fully agree with your view on history. The initial idea was to give
a hand to the mod_perl mainstream with a clear numerical example of
how good the technology (mod_perl) is for successful further development.
On the other hand, this part (sufficiently detailed in the text of the
proposal) somehow supports the idea of a New Open Source Paradigm,
which from my point of view decreases the chance of the proposal being
rejected (again ;-). However, I do not expect to spend a lot of time on
this during the session. My approximate time-line is supposed to
include:

some 20% - compression basics with examples and dynamic features;

some 10% - how good Apache::Dynagzip is on top of mod_perl;

some 50% - what is on top of Apache::Dynagzip that makes the life of
ISPs (and their clients) easier, plus a brief overview of Internet Banking
features and the problem of rich content on every page for content
providers;

the rest of the time -- some 20% -- I hope to spend on prospective
implementations of dynamic data compression, and I hope to initiate a
discussion around this subject.


Any additional suggestions?

Thanks,
Slava
--
http://www.lastmileisp.com/




Re: [OSCon 2005] rfc Open Source Dynamic Data Compression

2005-02-03 Thread Slava Bizyayev
On Thu, 2005-02-03 at 16:12, Dan Brian wrote:
 
 I'd cite the fact that places like Google use compression for almost 
 all serving. A lot of people don't know that compression is wide-spread 
 among the big sites.
 

Yes Dan, you are right regarding Google; however, to date Google and
Yahoo are the exceptions rather than the rule for content delivery over
the web. It was recently estimated that only some 6% of the top 1000
businesses compress their web data when possible. The most common
misunderstanding among IT managers is that it is impossible (or at least
very difficult) to compress dynamically generated content. They tend
to choose dynamic content as an alternative to compressed content. I
want them to know that they can happily benefit from both of
these goodies, using open source and/or commercial software (and support)
when necessary/appropriate.

Thanks,
Slava
--
http://users.outlook.net/~sbizyaye/dynagzip/
http://users.outlook.net/~sbizyaye/dynagzip/support/




Re: [OSCon 2005] rfc Open Source Dynamic Data Compression

2005-02-03 Thread Slava Bizyayev
On Thu, 2005-02-03 at 17:11, Dan Brian wrote:
  The fact that Google uses compression 
 automatically dispels the reasons people might not find your session 
 interesting: compression is not generally compatible with most web 
 browsers, nobody is using compression, etc. My suggestion is that 
 you use the fact that some of the top sites do use it successfully to 
 create interest. I'd put that right in your session description. To 
 emphasize what Stas said, you need to advertise what you are doing. 
 That includes hooking their interest with the information that says, 
 "the best know something that the rest of you don't," rather than "only 
 6% of you are using compression," to which anybody can draw their own 
 incorrect conclusions. :-)
 

Thank you for the brilliant idea, Dan! I didn't catch on to it initially...
Let me see how to incorporate it better... Would you like to offer a patch?

Thanks,
Slava




[OSCon 2005] rfc Open Source Dynamic Data Compression

2005-01-30 Thread Slava Bizyayev
I'm going to submit the following talk proposal for a 45-minute
presentation session at OSCon 2005:

Open Source Dynamic Data Compression in Business Implementations

1. Basics of the content compression.
2. Examples of compressed files;
Benefits of data compression in numbers.
3. Dynamic compression vs. Static compression;
Latency benefits in graphics and numbers.
4. History of the open source mod_perl Apache handler Apache::Dynagzip:
a) fast and flexible development on the base of mod_perl Apache-1.3;
b) open source debug benefits, time-line of bug reports, stability;
c) run-time benefits of mod_perl;
d) community feedback on the final product.
5. Recent commercial solutions on top of the Dynagzip project:
a) The Last Mile ISP;
b) Internet Banking;
c) Semi-Static Pages for Dynamic Content.
6. Further implementations on top of the open source prototype:
a) WAP;
b) XML;
c) web services.

I tried my best to make this informative for both developers and
managers, while concentrating mainly on business objectives. I would
highly appreciate any constructive critique and/or suggestions prior to
submitting this to OSCon (approximately next weekend).

You can respond to the List, or privately, whatever you prefer.

Thanks,
Slava
--
http://www.lastmileisp.com





Re: [OSCon 2005 guidelines] what talks to submit

2005-01-29 Thread Slava Bizyayev
Thanks, Stas!

On Sat, 2005-01-29 at 17:33, Stas Bekman wrote:
... May be nobody knows 
 about your work, and you simply need to work on advertising it more. For 
 example by answering more questions here ...

I really don't mind answering the list's questions when I'm able to, but
during the last year people have often (for some reason) preferred to ask me
about Apache::Dynagzip and its associates off the list when necessary. I
don't mind that either. My response queue is empty to date (except for a
couple of strange guys who have misconfigured spam filters). So far, I still
feel myself on the team. ;-)

 
  Could it be interesting for OSCON?
 
 I think it is.

Thanks once again,
Slava
--
http://users.outlook.net/~sbizyaye/dynagzip/




Re: Apache::Dynagzip not seeing my headers from Apache::Registry script

2004-12-24 Thread Slava Bizyayev
Hi Alex,

Your problem originates from the fact that your code is not
compatible with the Apache::Filter chain that you are trying to use.
Apache::Dynagzip does not see your header because Apache::RegistryFilter
does not allow it to get through. That's why Apache::Dynagzip sets up
the default Content-Type.

Please take a look at the Apache::Filter documentation in order to
register your script with Apache::RegistryFilter properly.

I hope it helps,
Slava

On Fri, 2004-12-24 at 17:34, Alex Greg wrote:
 Hi,
 
 
 I've been trying for most of the evening to get Apache::Dynagzip
 working with Apache::Registry. I've gotten to the point whereby it
 works fine (compresses the content or not depending on the browser),
 but it sets the default Content-Type to text/html and disregards the
 headers I send from my script.
 
 
 The code below produces the following when called from a non-gzip browser:
 
 
 #!/usr/local/bin/perl
 
 use strict;
 
 my $r = Apache->request;
 
 my $var;
 
 open(FILE, "text.txt");
 
 while (<FILE>)
 {
 $var .= $_;
 }
 
 close(FILE);
 
 $r->content_type("text/plain");
 $r->send_http_header;
 
 print $var;
 
 
 Response:
 
 Date: Fri, 24 Dec 2004 23:22:57 GMT
 Server: Apache
 X-Module-Sender: Apache::Dynagzip
 Expires: Friday, 24-December-2004 23:27:57 GMT
 Transfer-Encoding: chunked
 Connection: close
 Content-Type: text/html
 
 213c
 Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Ut tempor
 bibendum ante. Donec rutrum. Cras semper neque in tellus. Pellentesque
 blandit magna in nisl. Quisque dignissim cursus ligula. Curabitur
 augue nunc, varius in, faucibus ac, ultrices quis, nisl.
 
 
 Note that Apache::Dynagzip is setting the header to the default
 text/html even though I have explicitly set it to text/plain. The
 following debug is from the error log:
 
 
 [Fri Dec 24 23:29:10 2004] [info] [client 192.168.1.6]
 Apache::Dynagzip default_content_handler is serving the main request
 for GET /cgi-bin/blah.cgi HTTP/1.1 targeting /www/cgi-bin/blah.cgi via
 /cgi-bin/blah.cgi Light Compression is Off. Source comes from Filter
 Chain. The client  does not accept GZIP.
 [Fri Dec 24 23:29:10 2004] [info] [client 192.168.1.6]
 Apache::Dynagzip default_content_handler no gzip for GET
 /cgi-bin/blah.cgi HTTP/1.1 min_chunk_size=8192
 [Fri Dec 24 23:29:10 2004] [debug]
 /usr/lib/perl5/site_perl/5.8.3/Apache/Dynagzip.pm(917): [client
 192.168.1.6] Apache::Dynagzip default_content_handler creates default
 Content-Type for GET /cgi-bin/blah.cgi HTTP/1.1
 [Fri Dec 24 23:29:10 2004] [info] [client 192.168.1.6]
 Apache::Dynagzip default_content_handler is done OK for
 /www/cgi-bin/blah.cgi 40172 bytes sent
 
 
 What's even stranger is when I use the same check that Dynagzip uses
 to see if the Content-Type is set, it fails. Appending this to my
 script:
 
 if ($r->header_out("Content-type"))
 {
 warn "Headers sent OK";
 }
 else
 {
 warn "Headers NOT sent OK";
 }
 
 
 Results in the following in the error log:
 
 
 Headers NOT sent OK at /www/cgi-bin/blah.cgi line 29.
 
 
 My Apache configuration is as follows:
 
 
 <Directory /www/cgi-bin>
 SetHandler perl-script
 PerlHandler Apache::RegistryFilter Apache::Dynagzip
 PerlSetVar Filter On
 PerlSendHeader Off
 PerlSetupEnv On
 AllowOverride AuthConfig
 Options +ExecCGI
 </Directory>
 
 
 Any advice on this would be much appreciated, as I have been banging
 my head against it all evening!
 
 
 -- Alex


-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: Apache::Dynagzip not seeing my headers from Apache::Registry script

2004-12-24 Thread Slava Bizyayev
Please Alex, reply to all in order to keep the thread on the list.

I believe that Ken Williams could give you more precise advice, since
I always write pure Apache handlers in similar cases, but just
to give it a quick try I would suggest registering the input too. I'm
not quite sure, but to the best of my knowledge your script becomes a
part of Apache::RegistryFilter once you are inside the
Apache::Filter chain, and you should get your input in accordance with
the Apache::Filter rules. However, I might be wrong on that, and we'll
have to wait for Ken then.

Good luck!
Slava

On Fri, 2004-12-24 at 19:59, Alex Greg wrote:
 On 24 Dec 2004 18:47:15 -0600, Slava Bizyayev [EMAIL PROTECTED] wrote:
  Hi Alex,
  
  Your problem is originated from the fact that your code is not
  compatible with the Apache::Filter chain that you are trying to use.
  Apache::Dynagzip does not see your header because Apache::RegistryFilter
  does not allow it to get through. That's why Apache::Dynagzip sets up
  the default Content-Type.
  
  Please take a look at the Apache::Filter documentation in order to
  register you script with the Apache::RegistryFilter properly.
 
 Hi Slava,
 
 
 Thanks for your reply. I've now registered my script with
 Apache::Filter as follows:
 
 
  #!/usr/local/bin/perl
  
  use strict;
  
  my $r = Apache->request;
  
  $r = $r->filter_register;
  
  
  my $var;
  
  open(FILE, "text.txt");
  
  while (<FILE>)
  {
  $var .= $_;
  }
  
  close(FILE);
  
  $r->content_type("text/plain");
  $r->send_http_header;
  
  print $var;
 
 
 Unfortunately, I now get this in my error log:
 
 
 Not a HASH reference at
 /usr/lib/perl5/site_perl/5.8.3/Apache/Filter.pm line 197.
 
 
 Any ideas?
 
 
 Thanks,
 
 
 -- Alex
 
 
  On Fri, 2004-12-24 at 17:34, Alex Greg wrote:
   Hi,
  
  
   I've been trying for most of the evening to get Apache::Dynagzip
   working with Apache::Registry. I've gotten to the point whereby it
   works fine (compresses the content or not depending on the browser),
   but it sets the default Content-Type to text/html and disregards the
   headers I send from my script.
  
  
   The code below produces the following when called from a non-gzip browser:
  
  
   #!/usr/local/bin/perl
  
   use strict;
  
   my $r = Apache->request;
  
   my $var;
  
   open(FILE, "text.txt");
  
   while (<FILE>)
   {
   $var .= $_;
   }
  
   close(FILE);
  
   $r->content_type("text/plain");
   $r->send_http_header;
  
   print $var;
  
  
   Response:
  
   Date: Fri, 24 Dec 2004 23:22:57 GMT
   Server: Apache
   X-Module-Sender: Apache::Dynagzip
   Expires: Friday, 24-December-2004 23:27:57 GMT
   Transfer-Encoding: chunked
   Connection: close
   Content-Type: text/html
  
   213c
   Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Ut tempor
   bibendum ante. Donec rutrum. Cras semper neque in tellus. Pellentesque
   blandit magna in nisl. Quisque dignissim cursus ligula. Curabitur
   augue nunc, varius in, faucibus ac, ultrices quis, nisl.
  
  
   Note that Apache::Dynagzip is setting the header to the default
   text/html even though I have explicitly set it to text/plain. The
   following debug is from the error log:
  
  
   [Fri Dec 24 23:29:10 2004] [info] [client 192.168.1.6]
   Apache::Dynagzip default_content_handler is serving the main request
   for GET /cgi-bin/blah.cgi HTTP/1.1 targeting /www/cgi-bin/blah.cgi via
   /cgi-bin/blah.cgi Light Compression is Off. Source comes from Filter
   Chain. The client  does not accept GZIP.
   [Fri Dec 24 23:29:10 2004] [info] [client 192.168.1.6]
   Apache::Dynagzip default_content_handler no gzip for GET
   /cgi-bin/blah.cgi HTTP/1.1 min_chunk_size=8192
   [Fri Dec 24 23:29:10 2004] [debug]
   /usr/lib/perl5/site_perl/5.8.3/Apache/Dynagzip.pm(917): [client
   192.168.1.6] Apache::Dynagzip default_content_handler creates default
   Content-Type for GET /cgi-bin/blah.cgi HTTP/1.1
   [Fri Dec 24 23:29:10 2004] [info] [client 192.168.1.6]
   Apache::Dynagzip default_content_handler is done OK for
   /www/cgi-bin/blah.cgi 40172 bytes sent
  
  
   What's even stranger is when I use the same check that Dynagzip uses
   to see if the Content-Type is set, it fails. Appending this to my
   script:
  
    if ($r->header_out("Content-type"))
    {
    warn "Headers sent OK";
    }
    else
    {
    warn "Headers NOT sent OK";
    }
  
  
   Results in the following in the error log:
  
  
   Headers NOT sent OK at /www/cgi-bin/blah.cgi line 29.
  
  
   My Apache configuration is as follows:
  
  
    <Directory /www/cgi-bin>
    SetHandler perl-script
    PerlHandler Apache::RegistryFilter Apache::Dynagzip
    PerlSetVar Filter On
    PerlSendHeader Off
    PerlSetupEnv On
    AllowOverride AuthConfig
    Options +ExecCGI
    </Directory>
  
  
   Any advice on this would be much appreciated, as I have been banging
   my head against it all evening!
  
  
   -- Alex
  
 


-- 
Report problems: http://perl.apache.org/bugs

Re: response data

2004-12-20 Thread Slava Bizyayev
Hi Vadim,

Sorry for misunderstanding your problem. I had been thinking that you were
on Apache 1.3...

I cannot help with Apache-2.

Slava

On Mon, 2004-12-20 at 08:01, Vadim wrote:

 http://perl.apache.org/docs/2.0/user/handlers/http.html#PerlCleanupHandler
 [quote]
 Using cleanup_register() acting on the request object's pool
 ...
 The important difference from using the PerlCleanupHandler handler, is that 
 here you can pass an optional arbitrary argument to the callback function, 
 and no $r argument is passed by default. 
 [/quote]
 



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: response data

2004-12-17 Thread Slava Bizyayev
Hi Vadim,

On Fri, 2004-12-17 at 06:09, Vadim wrote:
 I am trying to get the response data in the PerlCleanupHandler stage to do some 
 internal actions. These actions do not perform any modifications on the response 
 data, so I don't want to keep the user waiting. And I supposed that until the $r 
 object is alive it's possible to get the data. But it looks like I was mistaken, 
 and it's impossible.
 
 The solution is to register the cleanup handler in the PerlResponseHandler 
 script and to pass the data and the $r object as the arguments:
 
 $r->pool->cleanup_register(\&MyApache::MyProxy::handler, { r => $r, data => 
 $data } );

You cannot transfer parameters this way; however, $r is available and
_is_ correct inside the Cleanup phase of the request processing flow.
You can transfer a copy of your response to the cleanup handler using
pnotes if necessary. Alternatively, you can chain your cleanup handler
with your content generation handler using the Apache::Filter chain, so that
all the work happens inside the content generation phase.
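
As an illustration, a minimal mod_perl 1 sketch of the pnotes approach
(the handler names and the build_response() call are hypothetical, not
something from this thread; untested here):

    package My::Content;
    # Hypothetical sketch: stash a copy of the response body in pnotes and
    # register a cleanup callback; $r is still valid in the cleanup phase,
    # which runs after the response has been sent to the client.
    use strict;
    use Apache::Constants qw(OK);

    sub handler {
        my $r    = shift;
        my $body = build_response($r);          # assumed application code
        $r->pnotes(response_copy => $body);     # carry a copy to the cleanup phase
        $r->register_cleanup(\&cleanup);
        $r->send_http_header('text/html');
        $r->print($body);
        return OK;
    }

    sub cleanup {
        my $r    = shift;
        my $copy = $r->pnotes('response_copy');
        # ... post-processing here, without keeping the user waiting ...
        return OK;
    }
    1;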

Hope this helps,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: mod_gzip + LogHandler

2004-12-16 Thread Slava Bizyayev
Hi Victor,

On Wed, 2004-12-15 at 02:11, victor wrote:
 The functoin served me well until I start enabling mod_gzip for apache.  
 It appears that with mod_gzip turned on, the Environment I have setup 
 for some reason is no longer available.  I have tried to switch to notes 
 and ened with the same result.
 
 Do anyone has any idea why this is happening and/or know any work around?

You can configure mod_gzip on a separate proxy if you really need it.
Alternatively, you can use the mod_perl gzip handlers. See
http://perl.apache.org/docs/tutorials/client/compression/compression.html for 
details.

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: Gzip compression for dynamically generated content

2004-11-26 Thread Slava Bizyayev
Sorry Alex, I hit the wrong button in my previous response.

It looks like your script is a CGI script that sends its own CGI
Content-Type. In order to use the Apache::Filter chain you need to make your
script Apache::Filter compatible. It might be as simple as commenting out
the print of the Content-Type in your script, or it might be much more
complicated if you generate some other headers as well... Alternatively,
you can configure Apache::Dynagzip to serve your script as a CGI
application (with no Apache::Filter involved). This is described in the
Dynagzip documentation provided with the module distribution.
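
A minimal sketch of the first option (dropping the CGI header line and
letting the request object carry the Content-Type instead); this is only
an illustration under the configuration quoted below, not tested here:

    #!/usr/local/bin/perl
    # Hypothetical sketch: the raw CGI "Content-type:" line is commented out,
    # and the header is set through the request object, so the Apache::Filter
    # chain (Apache::RegistryFilter + Apache::Dynagzip) can see it.
    use strict;

    my $r = Apache->request;

    $r->content_type('text/html');
    $r->send_http_header;

    # print "Content-type: text/html\n\n";   # <-- the duplicated CGI header to drop
    print "<html><body>Hello from the filter chain</body></html>\n";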

Hope this helps,
Slava

On Fri, 2004-11-26 at 11:48, Alex Greg wrote:
 Hi,
 
 
 I've been looking into using gzip to compress the output from our 
 mod_perl servers. After a bit of research, I decided to use 
 Apache::Dynagzip. My configuration is as follows:
 
 <Directory /www/cgi-bin>
  SetHandler perl-script
  PerlHandler Apache::RegistryFilter Apache::Dynagzip
  PerlSetVar Filter On
  PerlSetVar LightCompression On
  PerlSendHeader Off
  PerlSetupEnv On
  AllowOverride AuthConfig
  Options +ExecCGI
 </Directory>
 
 
 This works great, but the only problem is that I'm seeing the 
 Content-Type header duplicated at the beginning of the HTTP response, 
 like so:
 
 
 [EMAIL PROTECTED] root]# (echo "GET / HTTP/1.1"; echo "Host: 
 www.digitallook.com"; echo) | nc 192.168.100.3 80
 HTTP/1.1 200 OK
 Date: Fri, 26 Nov 2004 17:44:04 GMT
 Server: Apache
 X-Module-Sender: Apache::Dynagzip
 Expires: Friday, 26-November-2004 17:49:06 GMT
 Transfer-Encoding: chunked
 Connection: close
 Content-Type: text/html
 
 200a
 Content-type: text/html
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" 
 "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
 <head>
 .
 
 
 We're already printing the Content-Type from our application, so I'm not 
 sure why it's being sent by Dynagzip as well.
 
 
 Can anyone shed any light on this, or suggest an alternative way to 
 dynamically compress the output from mod_perl?
 
 
 Regards,
 
 
 -- Alex


-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



[ANNOUNCE] Apache::Dynagzip 0.16

2004-11-07 Thread Slava Bizyayev
This is a bug-fix version. Upgrade is recommended for all users.

It was noticed that the Content-Type was overwritten with the default text/html
during the streaming of some static files through the compression,
because $r->header_out('Content-Type') was not yet set at the time when
$r->content_type was indeed set properly in a previous URI Translation
handler. This is fixed now.

Additionally, the documentation was lightly edited, especially regarding
the details of the module's installation and the appropriate mod_perl
configuration. Thanks to Kevin Austin for the hints.

The uploaded file

Apache-Dynagzip-0.16.tar.gz

has entered CPAN as

file: $CPAN/authors/id/S/SL/SLAVA/Apache-Dynagzip-0.16.tar.gz
size: 24530 bytes
md5: 9ef9de3813a9d471f334fb4c064e8fa2

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: mod_perl sometimes prints to error log instead of client

2004-09-08 Thread Slava Bizyayev
Sorry guys, I missed the beginning of this thread; would you mind
reminding me what Apache::Dynagzip is suspected of?

Slava

On Wed, 2004-09-08 at 13:19, Perrin Harkins wrote:
 On Mon, 2004-09-06 at 23:15, Ryan Underwood wrote:
  I've had this strange problem off and on ever since we started using
  mod_perl.  Occasionally, when the page is generated and printed, the
  output goes to the Apache error log instead of to the client
 [...]
 PerlHeaderParserHandler sub { tie *STDOUT, 'Apache' unless tied 
  *STDOUT; }
 
 I suspect either the above hack, or Apache::DynaGzip and
 Apache::RegistryFilter.  These are messing with where your STDOUT gets
 sent and could lead to strange results if something died part-way
 through.
 
 - Perrin
 


-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: mod_perl sometimes prints to error log instead of client

2004-09-08 Thread Slava Bizyayev

On Wed, 2004-09-08 at 14:48, Perrin Harkins wrote:
 Archived here:
 http://mathforum.org/epigone/modperl/swoxsnurcro/[EMAIL PROTECTED]

Apache::Dynagzip is commented out in the configuration in question. This is
the right first step to take when you suspect any
incompatibility. However, it is not quite clear from the message
whether the problem disappears after commenting out Apache::Dynagzip or not.

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: mod_perl sometimes prints to error log instead of client

2004-09-08 Thread Slava Bizyayev
Hi Ryan,
On Wed, 2004-09-08 at 15:56, Ryan Underwood wrote:
 No, the problem doesn't go away without Dynagzip.

So far, there is nothing about Apache::Dynagzip in this problem really.

 I couldn't get
 Dynagzip to work correctly with a nph CGI script either.  There is an
 option UseCGIHeadersFromScript but it appears to have no effect in the
 code.

Right, UseCGIHeadersFromScript should not have any effect inside the
Apache::Filter chain. There simply is no CGI anymore once you
switch to Apache::Filter, which carries its own requirements on how you should
create HTTP headers in your script. You should consider appropriate
changes to your script before switching. Otherwise, you can try to
configure Apache::Dynagzip to serve your script as a CGI binary, and
continue to enjoy the UseCGIHeadersFromScript option in your
configuration if necessary.

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



ANNOUNCE: Apache::Dynagzip 0.15

2004-05-11 Thread Slava Bizyayev
This is a clean-up version. Upgrade is not necessary.

The uploaded file

Apache-Dynagzip-0.15.tar.gz

has entered CPAN as

file: $CPAN/authors/id/S/SL/SLAVA/Apache-Dynagzip-0.15.tar.gz
size: 23843 bytes
md5: 00cfe9677721bb745bea0157a3bcde34

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



ANNOUNCE: Compress::LeadingBlankSpaces 0.04

2004-05-06 Thread Slava Bizyayev
This is a new-feature version. The <code> tag is not compressed anymore.

Upgrade is necessary for users of Apache::Dynagzip experiencing
corrupted content associated with the <code> tag.

The uploaded file

Compress-LeadingBlankSpaces-0.04.tar.gz

has entered CPAN as

file:
$CPAN/authors/id/S/SL/SLAVA/Compress-LeadingBlankSpaces-0.04.tar.gz
size: 3720 bytes
md5: ecb282cbd223510bc990fff02b2ff079

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



ANNOUNCE: Apache::Dynagzip 0.14

2004-05-01 Thread Slava Bizyayev
This is mainly a POD edition. Upgrade is necessary only if you're not happy
with what you already have.

The uploaded file

Apache-Dynagzip-0.14.tar.gz

has entered CPAN as

file: $CPAN/authors/id/S/SL/SLAVA/Apache-Dynagzip-0.14.tar.gz
size: 23642 bytes
md5: cd61ffd8eb6486bbeabb1b5fe6e16dad

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



[ANNOUNCE] Compress::LeadingBlankSpaces 0.03

2004-04-18 Thread Slava Bizyayev
This is a bug-fix version. The bug can affect Apache::Dynagzip when
'light compression' is configured 'on'.

There was a bug in capitalization of tag models. It is fixed since this
version. Upgrade is recommended for all earlier versions.

The uploaded file

Compress-LeadingBlankSpaces-0.03.tar.gz

has entered CPAN as

file:
$CPAN/authors/id/S/SL/SLAVA/Compress-LeadingBlankSpaces-0.03.tar.gz
size: 3566 bytes
md5: 76f2048c4054a9062653ec98b4c374da

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



[ANNOUNCE] Apache::Dynagzip 0.13

2004-04-06 Thread Slava Bizyayev
This is a bug-fix version. Richard Chen found a bug that affected
some web clients incapable of speaking gzip. Thanks Richard! The bug is
fixed as of this version.

Upgrade is recommended for all earlier versions.

The uploaded file

Apache-Dynagzip-0.13.tar.gz

has entered CPAN as

  file: $CPAN/authors/id/S/SL/SLAVA/Apache-Dynagzip-0.13.tar.gz
  size: 31466 bytes
   md5: fe24d567344ab8594973a118c2a90eeb

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



Re: A bug in Apache::Dynazip?

2004-04-05 Thread Slava Bizyayev
Hi Richard,

On Mon, 2004-04-05 at 16:43, Richard Chen wrote:
 [Mon Apr  5 17:23:52 2004] [info] [client 123.123.123.123] Apache::Dynagzip 
 default_content_handler is serving the main request for GET 
 /path/to/file.tmpl?scheme=http HTTP/1.1 targeting /dev.myhost.com/path/to/file.tmpl 
 via /path/to/file.tmpl Light Compression is Off. Source comes from Filter Chain. The 
 client curl/7.10.7 (i586-pc-linux-gnu) libcurl/7.10.7 OpenSSL/0.9.7c zlib/1.1.4 does 
 not accept GZIP.

Yes, this seems like a bug. Thank you for the hint. I'll release a fix
shortly.

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



ANNOUNCE: Compress::LeadingBlankSpaces 0.02

2004-03-26 Thread Slava Bizyayev
This is a bug-fix version. The bug can affect Apache::Dynagzip when
'light compression' is configured 'on'.

The problem was brought to my attention that blank lines inside the
<textarea>...</textarea> tag might be used intentionally for some
reason.

This version does not compress data inside <textarea>...</textarea>
anymore. Thanks to Igor Jovanovic for the hint.

The uploaded file

Compress-LeadingBlankSpaces-0.02.tar.gz

has entered CPAN as

file:
$CPAN/authors/id/S/SL/SLAVA/Compress-LeadingBlankSpaces-0.02.tar.gz
size: 3113 bytes
md5: e8ba805c14c075e589fe19673fb131ad

One might refrain from upgrading unless she has outstanding complaints.

Thanks,
Slava



-- 
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html



RE: [OT] [mod_perl] [Fun] telecommuting job

2003-12-12 Thread Slava Bizyayev
...When you have no sense of humor, you have to have at least a sense
that you have no sense of humor...
(Russian source: Kos'ma Prutkov - Aphorisms; translated by Slava Bizyayev)

Dear Haroon and Jonathan,

I like Christmas. It's the time when miracles used to happen in the
World. And they really happen when you are capable of seeing them.

We have nothing to discuss here. The story was not about the job; It was
about a Christmas miracle that you might see/believe or not.

Thanks,
Slava



-- 
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html



RE: [OT] [mod_perl] [Fun] telecommuting job

2003-12-12 Thread Slava Bizyayev
On Fri, 2003-12-12 at 13:32, Shaw, Matthew wrote:
 Slava:
 
   Are you an idiot?

You bet, aren't you?

Thanks,
Slava



-- 
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html



RE: [OT] [mod_perl] [Fun] telecommuting job

2003-12-12 Thread Slava Bizyayev
On Fri, 2003-12-12 at 13:11, Chris Shiflett wrote:
 I'm curious now. You posted an off-topic message on a mod_perl mailing
 list that:
 
 1. Demonstrated how rude you were to another person.
 2. Revealed private correspondence on a public mailing list.
 
 You then describe this as, "It was about a Christmas miracle that you
 might see/believe or not."
 
 Where is the Christmas miracle? You can reply privately, if you like.

Well, finally I see that the number of serious people on the list is
much larger than I expected initially. I'm really sorry for
bothering you, guys. It was just a funny Christmas story initially, and I
included the full listing of the negotiations just in order to assure
readers that the story is not a fake. It was never intended to provide a
full-fledged analysis of this situation on the list. Indeed, I can explain
my every word, providing proof of my innocence at every stage of
those negotiations.

Why are you blaming me for the offensive behavior that ruined the
negotiations? I have no clue. All business negotiations were already
over after the Lady's first response. That first response
was offensive and stupid. She did not hesitate to provide a lie
about my experience as the basis for discarding my application in a
message addressed directly to me. Could you imagine this happening to
you?

I never deal with mafia, criminals, and idiots; that's my rule, and I make
no exceptions based on race, gender, and/or nationality/ethnicity of the
party. I'm extremely serious about that. (That's why I decided to leave
Russia one day in the past.)

Back again: since the arrival of the Lady's first response I was
no longer interested in pursuing this position. I've been wondering
about the Lady's business only. How good would it be for her client to
receive falsified information about prospective employees? Why
does it happen in her office? Chances were that the Lady was just
unaware of some internal problems with her helpers. At a time when
all Americans stand united against the world of terror, I could not
resist helping the poor Lady.

So, I gave her one last chance in order to open her eyes,
to double-check her office procedures, and to provide me with a clear and
unambiguous answer to the key question that I'm always so concerned
about. And so she did, didn't she?

That's what I call miracle...

Thanks for your attention,

Slava



-- 
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html



[Fun] telecommuting job

2003-12-11 Thread Slava Bizyayev
I've recently been looking for a telecommuting job in order to provide
some Christmas support to my family. Naturally, I tried one ad from the
Perl Telecommuting Job List. Negotiations did not take long. The full
listing is provided below. In particular, I would underline the funny way
that Laurie J. Roth - Director of Search & Consulting - chose to answer my
innocent question. Enjoy:

-

 -Original Message-
 From: Slava Bizyayev [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, December 10, 2003 11:32 PM
 To: [EMAIL PROTECTED]
 Subject: Re: Sr. Perl Developer


 Dear Sir/Lady,

 I have no XSLT experience. Other your requirements are OK. My resume
is
 attached in MS Word format. Please let me know if you would like to
 talk.

 Sincerely,
 Slava (Vyacheslav) Bizyayev


On Thu, 2003-12-11 at 08:39, Laurie Roth wrote:
 Slava,

 Thank you for sending your resume along, however, my client wants 4
solid
 years of perl development.  They are really looking for a Perl guru. 
Good
 luck!

 Laurie J. Roth
  Director of Search & Consulting
 561-347-6421 ext:215
 mailto:[EMAIL PROTECTED]


-Original Message-
From: Slava Bizyayev [mailto:[EMAIL PROTECTED]
Sent: Thursday, December 11, 2003 3:55 PM
To: Laurie Roth
Subject: RE: Sr. Perl Developer


Are you idiot?

Regards,
Slava




From: Laurie Roth
[EMAIL PROTECTED]
To: Slava Bizyayev
[EMAIL PROTECTED]
Subject: RE: Sr. Perl Developer
Date: Thu, 11 Dec 2003 16:08:07 -0500

I must have read another resume when I responded.  I apologize for the
confusion.  Your people skills are horrific and I could never consider
you for an opportunity with any of my clients.

Laurie J. Roth
Director of Search & Consulting
561-347-6421 ext:215
mailto:[EMAIL PROTECTED]

-

Happy Christmas to everyone,

Slava



-- 
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html



Re: [mp1] why PERL5LIB = PERL5LIB=/.../mod_perl-1.29/lib

2003-11-12 Thread Slava Bizyayev
It is a very good question about the way mod_perl 1 creates @INC. I
would additionally point out the '.' that in fact transforms into '/'
for each mod_perl handler. I would appreciate it if somebody could direct me
to the appropriate docs concerning the main idea under the hood of creating
@INC this way (and how it could be changed around the patch). It is
closely related to a problem I had about a month ago with shadowed
modules on one of my servers...

Thanks,
Slava


On Wed, 2003-11-12 at 16:28, Stas Bekman wrote:
 Anybody has an idea why mp1 build does:
 
 PERL5LIB = PERL5LIB=/path/to/mod_perl-1.29/lib,
 
 (see the top-level Makefile).
 
 I can't see how is it going to work:
 
  PERL5LIB=PERL5LIB=/tmp perl-5.8.1 -le 'print join "\n", @INC'
 PERL5LIB=/tmp
 /home/stas/perl/5.8.1/lib/5.8.1/i686-linux
 /home/stas/perl/5.8.1/lib/5.8.1
 /home/stas/perl/5.8.1/lib/site_perl/5.8.1/i686-linux
 /home/stas/perl/5.8.1/lib/site_perl/5.8.1
 /home/stas/perl/5.8.1/lib/site_perl
 .
 
 it doesn't really add /tmp to @INC. Do I miss something? Looks like some 
 ancient workaround for some problems...
 
 __
 Stas BekmanJAm_pH -- Just Another mod_perl Hacker
 http://stason.org/ mod_perl Guide --- http://perl.apache.org
 mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
 http://modperlbook.org http://apache.org   http://ticketmaster.com
 


-- 
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html



Re: [mp1] why PERL5LIB = PERL5LIB=/.../mod_perl-1.29/lib

2003-11-12 Thread Slava Bizyayev
On Wed, 2003-11-12 at 19:02, Stas Bekman wrote:
 I'm afraid you are talking about a totally different thing.

Yes, of course. I'm sorry Stas, my head has been spinning a bit today...

Slava



-- 
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html