Re: [squid-users] YouTube and other streaming media (caching)
Hello! Aside from the redirect bug, are there any known issues with YouTube caching? Would I risk too much if I put this on the production Squid?

On Saturday 08 November 2008 18:31, Kinkie wrote:
> On Mon, Nov 3, 2008 at 6:32 PM, Horacio H. <[EMAIL PROTECTED]> wrote:
> > Hi everybody,
> >
> > regarding this issue:
> >
> > http://wiki.squid-cache.org/WikiSandBox/Discussion/YoutubeCaching
>
> For those interested, that page has been moved to
> http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube/Discussion
Re: [squid-users] YouTube and other streaming media (caching)
On Mon, Nov 3, 2008 at 6:32 PM, Horacio H. <[EMAIL PROTECTED]> wrote:
> Hi everybody,
>
> regarding this issue:
>
> http://wiki.squid-cache.org/WikiSandBox/Discussion/YoutubeCaching

For those interested, that page has been moved to
http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube/Discussion

-- 
/kinkie
Re: [squid-users] YouTube and other streaming media (caching)
Hi everybody,

regarding this issue:

http://wiki.squid-cache.org/WikiSandBox/Discussion/YoutubeCaching

I came up with a workaround: a rewriter script in PHP (sorry, I'm not good at Perl, but maybe someone would be kind enough to later share a transcoded version... jeje)

NOTE 1: Use this script for testing purposes only; it may not work as expected. I've tested it with only a very few URLs. If you can improve it, please share.

NOTE 2: To use this script you need the PHP command-line interface. On Ubuntu you can install it with this command: sudo apt-get install php5-cli

NOTE 3: Make sure the log file is writable by the script.

And now the script:

#!/usr/bin/php -q
<?php
## Log file: adjust the path to taste ##
$log = fopen("/tmp/storeurl.log", "a");
while ( $line = fgets(STDIN) ) {
    ## Squid passes "URL ip/fqdn ident method"; keep only the URL ##
    $X = explode(" ", trim($line));
    $url = $X[0];
    $rep = array("");
    if ( preg_match('@^http://[^/]+/(get_video|videodownload|videoplayback)\?@', $url) ) {
        ## Get reply headers ##
        $rep = get_headers($url);
        ## If reply is a redirect, make its store-URL unique to avoid matching the store-URL of a video ##
        $rnd = "";
        if ( preg_match('/ 30[123] /', $rep[0]) ) {
            $rnd = "&REDIR=" . rand(1,9);
        }
        $url = preg_replace('@.*id=([^&]*)&?.*$@', "http://videos.SQUIDINTERNAL/ID=$1$rnd", $url);
    }
    ## Return rewritten URL ##
    print $url . "\n";
    ## Record what we did in the log ##
    fwrite($log, "$url $rep[0]\n");
    ## May do some good, but I'm not sure ##
    flush();
}
fclose($log);
?>
## END OF SCRIPT ##

The trick here is using the get_headers function to learn whether the URL is a redirect (301, 302 or 303). It would be nice if the Squid process passed the HTTP status to the script, maybe as a key=value pair, but I'm not even a programmer, so that is way beyond my knowledge...

Regards,
Horacio H.
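For readers who would rather follow the control flow than the PHP, here is a rough Python sketch of the same idea, with the network lookup factored out so the logic can be exercised offline. The function name and the status_line parameter are assumptions of this sketch (the original reads the status from get_headers()); the videos.SQUIDINTERNAL canonical host mirrors the script above.

```python
import random
import re

def rewrite_store_url(url, status_line=""):
    """Map a video-download URL to a canonical store URL.

    status_line is the first HTTP reply header line (what the PHP
    script gets from get_headers()), e.g. "HTTP/1.1 302 Found".
    Redirect replies get a random salt so their store URL cannot
    collide with the store URL of the actual video bytes.
    """
    if not re.match(r'http://[^/]+/(get_video|videodownload|videoplayback)\?', url):
        return url  # not a video URL: pass through unchanged
    salt = ""
    if re.search(r' 30[123] ', status_line):
        salt = "&REDIR=%d" % random.randint(1, 9)
    m = re.search(r'id=([^&]*)', url)
    if m:
        return "http://videos.SQUIDINTERNAL/ID=%s%s" % (m.group(1), salt)
    return url
```

For example, rewrite_store_url("http://v1.cache.youtube.com/get_video?video_id=abc123&t=x") yields http://videos.SQUIDINTERNAL/ID=abc123, while the same call with a 3xx status line appends a &REDIR= salt.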
Re: [squid-users] YouTube and other streaming media (caching)
Thanks! I've opened the bugzilla ticket. BTW, the script is in Perl; I forgot the first line of the script (sorry!):

#!/usr/bin/perl

$|=1;
while (<>) {
    @X = split;
    $url = $X[0];
    $url =~ s@^http://(.*?)/get_video\?(.*)video_id=(.*?)&(.*)@squid://videos.youtube.INTERNAL/ID=$3@;
    $url =~ s@^http://(.*?)/get_video\?(.*)video_id=(.*?)$@squid://videos.youtube.INTERNAL/ID=$3@;
    $url =~ s@^http://(.*?)/videodownload\?(.*)docid=(.*?)$@squid://videos.google.INTERNAL/ID=$3@;
    $url =~ s@^http://(.*?)/videodownload\?(.*)docid=(.*?)&(.*)@squid://videos.google.INTERNAL/ID=$3@;
    # Uncomment this if the use of "squid://" is unacceptable:
    # $url =~ s@^squid://@http://@;
    print "$url\n";
}
Re: [squid-users] YouTube and other streaming media (caching)
Cool! Could you drop this stuff into a bugzilla ticket so it doesn't get lost?

Thanks!

Adrian

On Mon, Jun 09, 2008, Horacio Herrera Gonzalez wrote:
> Hi Adrian, Ray and everyone...
>
> Here is a little contribution for the store_url_rewrite script; this
> part deals with YouTube and Google Video (at least, it works for now)...
>
> NOTES:
>
> 1) This code is different because it is based on the examples for the
> url_rewrite_program (http://wiki.squid-cache.org/SquidFaq/SquidRedirectors),
> and that way was easier for me.
>
> 2) Warning! This code may match other sites not related to YT or GV.
>
> 3) I used "squid://" at the beginning of the rewritten string to avoid
> matching another rule. I know it's not standard and I'm still hoping
> it doesn't cause any problems... Dear developers, please forgive my
> insolence... :-)
>
> $|=1;
> while (<>) {
>     @X = split;
>     $url = $X[0];
>     $url =~ s@^http://(.*?)/get_video\?(.*)video_id=(.*?)&(.*)@squid://videos.youtube.INTERNAL/ID=$3@;
>     $url =~ s@^http://(.*?)/get_video\?(.*)video_id=(.*?)$@squid://videos.youtube.INTERNAL/ID=$3@;
>     $url =~ s@^http://(.*?)/videodownload\?(.*)docid=(.*?)$@squid://videos.google.INTERNAL/ID=$3@;
>     $url =~ s@^http://(.*?)/videodownload\?(.*)docid=(.*?)&(.*)@squid://videos.google.INTERNAL/ID=$3@;
>     print "$url\n";
> }
>
> 4) This is the relevant part of my squid.conf:
>
> acl store_rewrite_list url_regex ^http://(.*?)/get_video\?
> acl store_rewrite_list url_regex ^http://(.*?)/videodownload\?
> cache allow store_rewrite_list
>
> # Had to uncomment this again, because I couldn't log in to Google Mail
> # using IE6 (Firefox had no trouble):
> acl QUERY urlpath_regex cgi-bin \?
> cache deny QUERY
>
> refresh_pattern ^http://(.*?)/get_video\? 10080 90% 99 override-expire ignore-no-cache ignore-private
> refresh_pattern ^http://(.*?)/videodownload\? 10080 90% 99 override-expire ignore-no-cache ignore-private
>
> storeurl_access allow store_rewrite_list
> storeurl_access deny all
>
> storeurl_rewrite_program /usr/local/bin/store_url_rewrite
>
> Regards,

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -
Re: [squid-users] YouTube and other streaming media (caching)
Hi Adrian, Ray and everyone...

Here is a little contribution for the store_url_rewrite script; this part deals with YouTube and Google Video (at least, it works for now)...

NOTES:

1) This code is different because it is based on the examples for the url_rewrite_program (http://wiki.squid-cache.org/SquidFaq/SquidRedirectors), and that way was easier for me.

2) Warning! This code may match other sites not related to YT or GV.

3) I used "squid://" at the beginning of the rewritten string to avoid matching another rule. I know it's not standard and I'm still hoping it doesn't cause any problems... Dear developers, please forgive my insolence... :-)

$|=1;
while (<>) {
    @X = split;
    $url = $X[0];
    $url =~ s@^http://(.*?)/get_video\?(.*)video_id=(.*?)&(.*)@squid://videos.youtube.INTERNAL/ID=$3@;
    $url =~ s@^http://(.*?)/get_video\?(.*)video_id=(.*?)$@squid://videos.youtube.INTERNAL/ID=$3@;
    $url =~ s@^http://(.*?)/videodownload\?(.*)docid=(.*?)$@squid://videos.google.INTERNAL/ID=$3@;
    $url =~ s@^http://(.*?)/videodownload\?(.*)docid=(.*?)&(.*)@squid://videos.google.INTERNAL/ID=$3@;
    print "$url\n";
}

4) This is the relevant part of my squid.conf:

acl store_rewrite_list url_regex ^http://(.*?)/get_video\?
acl store_rewrite_list url_regex ^http://(.*?)/videodownload\?
cache allow store_rewrite_list

# Had to uncomment this again, because I couldn't log in to Google Mail
# using IE6 (Firefox had no trouble):
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

refresh_pattern ^http://(.*?)/get_video\? 10080 90% 99 override-expire ignore-no-cache ignore-private
refresh_pattern ^http://(.*?)/videodownload\? 10080 90% 99 override-expire ignore-no-cache ignore-private

storeurl_access allow store_rewrite_list
storeurl_access deny all

storeurl_rewrite_program /usr/local/bin/store_url_rewrite

Regards,
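The four Perl substitutions above can be condensed into two order-independent rules; here is an illustrative Python rendering for readers who don't speak Perl. This is a sketch, not the helper Squid actually runs; the squid:// scheme and .INTERNAL hostnames are the made-up canonical forms from note 3.

```python
import re

# Each rule: (pattern over the request URL, template for the store URL).
# The [^&]+ capture grabs the id value whether or not more query
# parameters follow, covering both the "&" and end-of-string variants.
RULES = [
    (re.compile(r'^http://[^/]+/get_video\?.*video_id=([^&]+)'),
     r'squid://videos.youtube.INTERNAL/ID=\1'),
    (re.compile(r'^http://[^/]+/videodownload\?.*docid=([^&]+)'),
     r'squid://videos.google.INTERNAL/ID=\1'),
]

def store_url(url):
    for pattern, template in RULES:
        m = pattern.match(url)
        if m:
            return m.expand(template)
    return url  # everything else passes through untouched
```

For example, store_url("http://v123.cache.youtube.com/get_video?video_id=AbC&t=1") gives squid://videos.youtube.INTERNAL/ID=AbC.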
Re: [squid-users] Youtube and other streaming media (caching)
> Amos and group,
>
> Thanks for your answers! :-)
>
> Does anybody know how I can reserve 4 MB of bandwidth, out of a 16 MB
> internet link, just for my users' access to YouTube sites?
>
> I use FreeBSD 7.0 and squid 3.0-STABLE5.
>
> Thanks.
>
> Rodrigo Gomes:.

Sounds to me like an issue for QoS and the ZPH patch to Squid. The ZPH patch for 3.0 was withdrawn only for policy reasons, and can be found at:

http://www.squid-cache.org/Versions/v3/3.0/changesets/b8770.patch

It has not been extensively tested.

There should not be any kernel patches needed for what you want to do, just configuring the outbound QoS marker on outbound YouTube requests (dstdomain .youtube.com).

Amos
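In squid.conf terms, the request-side marking Amos describes might look like the following sketch. The TOS value and ACL name are illustrative assumptions; the actual 4 Mbit reservation has to be enforced by the router/firewall QoS policy that matches the mark (the ZPH patch adds the corresponding reply-side marking, not shown here):

```
# Illustrative sketch, untested values: mark outbound requests to
# YouTube so upstream QoS equipment can classify and reserve for them.
acl youtube_dst dstdomain .youtube.com
tcp_outgoing_tos 0x30 youtube_dst
```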
Re: [squid-users] Youtube and other streaming media (caching)
Amos and group,

Thanks for your answers! :-)

Does anybody know how I can reserve 4 MB of bandwidth, out of a 16 MB internet link, just for my users' access to YouTube sites?

I use FreeBSD 7.0 and squid 3.0-STABLE5.

Thanks.

Rodrigo Gomes:.
Re: [squid-users] Youtube and other streaming media (caching)
Rodrigo de Oliveira Gomes wrote:
> Henrik and group,
>
> Thanks for answering my question... But I have another :-)
>
> I have a server with FreeBSD 7.0 and squid 3.0-STABLE5... Just fine :-)
>
> Today I have approximately 400 users on my proxy and 16 MB of link
> (2 links of 8 MB with load balancing). Can I use delay pools to
> guarantee 6 MB only to YouTube videos? If yes, how can I do this?

With the delay_access and rep_mime_type ACL. Though keep in mind delay pools are not a 'guarantee'; they are limits. Network service may be much slower than the pool allows, but cannot be faster.

http://www.squid-cache.org/Versions/v3/3.0/cfgman/delay_access.html
http://www.squid-cache.org/Versions/v3/3.0/cfgman/acl.html

The videos all have a predictable MIME type, "video/flv", which can be used in the ACL and access controls to assign a delay pool.

Amos

-- 
Please use Squid 2.7.STABLE1 or 3.0.STABLE6
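What Amos describes might look like the following squid.conf sketch. The pool layout, ACL name and numbers are illustrative assumptions, not tested values; 750000 bytes/s is roughly 6 Mbit/s. Note rep_mime_type is a reply-time ACL, so verify on your Squid version that it is honoured where you use it:

```
# Illustrative, untested sketch: one class-1 delay pool limiting
# matched traffic to ~6 Mbit/s. Delay pools cap throughput; they do
# not reserve or guarantee it.
acl flv_reply rep_mime_type video/flv
delay_pools 1
delay_class 1 1
delay_parameters 1 750000/750000
delay_access 1 allow flv_reply
delay_access 1 deny all
```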
Re: [squid-users] Youtube and other streaming media (caching)
Henrik Nordstrom escreveu:
> On mån, 2008-06-02 at 18:27 -0300, Rodrigo de Oliveira Gomes wrote:
> > Hello Guys!
> >
> > Please, does the information in this message
> > (http://www.squid-cache.org/mail-archive/squid-users/200804/0420.html#replies)
> > work in squid version 3.0 stable 5?
>
> No, only 2.7 so far. But it is likely to be seen in 3.1 or 3.2 as well.
>
> Regards
> Henrik

Henrik and group,

Thanks for answering my question... But I have another :-)

I have a server with FreeBSD 7.0 and squid 3.0-STABLE5... Just fine :-)

Today I have approximately 400 users on my proxy and 16 MB of link (2 links of 8 MB with load balancing). Can I use delay pools to guarantee 6 MB only to YouTube videos? If yes, how can I do this?

Thanks again.

Best Regards,
Rodrigo de Oliveira Gomes:.
Re: [squid-users] Youtube and other streaming media (caching)
On mån, 2008-06-02 at 18:27 -0300, Rodrigo de Oliveira Gomes wrote:
> Hello Guys!
>
> Please, does the information in this message
> (http://www.squid-cache.org/mail-archive/squid-users/200804/0420.html#replies)
> work in squid version 3.0 stable 5?

No, only 2.7 so far. But it is likely to be seen in 3.1 or 3.2 as well.

Regards
Henrik
[squid-users] Youtube and other streaming media (caching)
Hello Guys!

Does the information in this message
(http://www.squid-cache.org/mail-archive/squid-users/200804/0420.html#replies)
work in squid version 3.0 stable 5?

thanks
Re: [squid-users] YouTube and other streaming media (caching)
On Wed, Apr 16, 2008, Ray Van Dolson wrote: > And here is the store_url_rewrite script. I added some logging: Cool! > Could likely remove the last elsif block at this point as it's catching > on the previous one now. But this is working great! Probably some > tuning yet to be done. Maybe someone could update the wiki with the > new regexp syntax. I'm keeping a slightly updated version of this stuff in my customer site. That way I can (try!) to keep on top of changes in the rules and notify customers when they need to update their scripts. The last thing I want to see is 100 different versions of my youtube caching hackery installed in places and causing trouble in the future. Adrian -- - Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support - - $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -
Re: [squid-users] YouTube and other streaming media (caching)
On Thu, Apr 17, 2008 at 08:11:51AM +0800, Adrian Chadd wrote:
> The problem with caching Youtube (and other CDN content) is that
> the same content is found at lots of different URLs/hosts. This
> unfortunately means you'll end up caching multiple copies of the
> same content and (almost!) never see hits.
>
> Squid-2.7 -should- be quite stable. I'd suggest just running it from
> source. Hopefully Henrik will find some spare time to roll 2.6.STABLE19
> and 2.7.STABLE1 soon so 2.7 will appear in distributions.

Thanks Adrian. FYI, I got this to work with 2.7 (latest) based on the instructions you provided earlier. Here is my final config and the Perl script used to generate the storage URL:

http_port 3128
append_domain .esri.com
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
maximum_object_size 4194240 KB
maximum_object_size_in_memory 1024 KB
access_log /usr/local/squid/var/logs/access.log squid

# Some refresh patterns including YouTube -- although YouTube probably needs to
# be adjusted.
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i \.flv$ 10080 90% 99 ignore-no-cache override-expire ignore-private
refresh_pattern ^http://sjl-v[0-9]+\.sjl\.youtube\.com 10080 90% 99 ignore-no-cache override-expire ignore-private
refresh_pattern get_video\?video_id 10080 90% 99 ignore-no-cache override-expire ignore-private
refresh_pattern youtube\.com/get_video\? 10080 90% 99 ignore-no-cache override-expire ignore-private
refresh_pattern . 0 20% 4320

acl all src 0.0.0.0/0.0.0.0
acl esri src 10.0.0.0/255.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT

# Some Youtube ACL's
acl youtube dstdomain .youtube.com .googlevideo.com .video.google.com .video.google.com.au
acl youtubeip dst 74.125.15.0/24
acl youtubeip dst 64.15.0.0/16
cache allow youtube
cache allow youtubeip
cache allow esri

# These are from http://wiki.squid-cache.org/Features/StoreUrlRewrite
acl store_rewrite_list dstdomain mt.google.com mt0.google.com mt1.google.com mt2.google.com
acl store_rewrite_list dstdomain mt3.google.com
acl store_rewrite_list dstdomain kh.google.com kh0.google.com kh1.google.com kh2.google.com
acl store_rewrite_list dstdomain kh3.google.com
acl store_rewrite_list dstdomain kh.google.com.au kh0.google.com.au kh1.google.com.au
acl store_rewrite_list dstdomain kh2.google.com.au kh3.google.com.au
# This needs to be narrowed down quite a bit!
acl store_rewrite_list dstdomain .youtube.com

storeurl_access allow store_rewrite_list
storeurl_access deny all
storeurl_rewrite_program /usr/local/bin/store_url_rewrite

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow esri
http_access deny all
http_reply_access allow all
icp_access allow all
coredump_dir /usr/local/squid/var/cache

# YouTube options.
quick_abort_min -1 KB

# This will block other streaming media. Maybe we don't want this, but using
# it for now.
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

And here is the store_url_rewrite script. I added some logging:

#!/usr/bin/perl

use IO::File;
use IO::Socket::INET;
use IO::Pipe;

$| = 1;
$fh = new IO::File("/tmp/debug.log", "a");
$fh->print("Hello!\n");
$fh->flush();

while (<>) {
    chomp;
    #print LOG "Orig URL: " . $_ . "\n";
    $fh->print("Orig URL: " . $_ . "\n");
    if (m/kh(.*?)\.google\.com(.*?)\/(.*?) /) {
        print "http://keyhole-srv.google.com" . $2 . ".SQUIDINTERNAL/" . $3 . "\n";
        # print STDERR "KEYHOLE\n";
    } elsif (m/mt(.*?)\.google\.com(.*?)\/(.*?) /) {
        print "http://map-srv.google.com" . $2 . ".SQUIDINTERNAL/" . $3 . "\n";
        # print STDERR "MAPSRV\n";
    } elsif (m/^http:\/\/([A-Za-z]*?)-(.*?)\.(.*)\.youtube\.com\/get_video\?video_id=([^&]+).* /) {
        print "http://video-srv.youtube.com.SQUIDINTERNAL/get_video?video_id=" . $4 . "\n";
        $fh->print("http://video-srv.youtube.com.SQUIDINTERNAL/get_video?video_id=" . $4 . "\n");
        $fh->flush();
Re: [squid-users] YouTube and other streaming media (caching)
The problem with caching Youtube (and other CDN content) is that the same content is found at lots of different URLs/hosts. This unfortunately means you'll end up caching multiple copies of the same content and (almost!) never see hits.

Squid-2.7 -should- be quite stable. I'd suggest just running it from source. Hopefully Henrik will find some spare time to roll 2.6.STABLE19 and 2.7.STABLE1 soon so 2.7 will appear in distributions.

Adrian

On Wed, Apr 16, 2008, Ray Van Dolson wrote:
> Hello all, I'm beginning to implement a Squid setup and am in
> particular looking to cache YouTube, as it is a significant chunk of
> our traffic and we don't want to outright block it (yet).
>
> I'm using squid-2.6.STABLE6 from RHEL 5.1 (latest errata). I've been
> reading around a lot and am seeking a bit of clarification on the
> current status of caching YouTube and potentially other streaming media.
> Specifically:
>
> * Adrian mentions support for YouTube caching in 2.7 -- which seems
>   to correspond with this changeset:
>
>   http://www.squid-cache.org/Versions/v2/2.7/changesets/11905.patch
>
>   Which would seem to be only a configuration file change. Is there
>   any reason YouTube caching won't work correctly in my 2.6 version
>   with a similar setup (and the rewriting script as well, I guess)?
>
> * If there are additional changes to the 2.7 codebase that make YouTube
>   caching possible, are they insignificant enough that they could
>   easily be backported to 2.6? I'm trying to decide how I will
>   convince Red Hat to incorporate this, as I doubt they'll want to
>   move to 2.7. The alternative of course is to build from source,
>   which I am open to.
>
> My config file is as follows:
>
> http_port 3128
> append_domain .esri.com
> acl apache rep_header Server ^Apache
> broken_vary_encoding allow apache
> maximum_object_size 4194240 KB
> maximum_object_size_in_memory 1024 KB
> access_log /var/log/squid/access.log squid
> refresh_pattern ^ftp: 1440 20% 10080
> refresh_pattern ^gopher: 1440 0% 1440
> refresh_pattern . 0 20% 4320
>
> acl all src 0.0.0.0/0.0.0.0
> acl esri src 10.0.0.0/255.0.0.0
> acl manager proto cache_object
> acl localhost src 127.0.0.1/255.255.255.255
> acl to_localhost dst 127.0.0.0/8
> acl SSL_ports port 443
> acl Safe_ports port 80 # http
> acl Safe_ports port 21 # ftp
> acl Safe_ports port 443 # https
> acl Safe_ports port 70 # gopher
> acl Safe_ports port 210 # wais
> acl Safe_ports port 1025-65535 # unregistered ports
> acl Safe_ports port 280 # http-mgmt
> acl Safe_ports port 488 # gss-http
> acl Safe_ports port 591 # filemaker
> acl Safe_ports port 777 # multiling http
> acl CONNECT method CONNECT
> # Some Youtube ACL's
> acl youtube dstdomain .youtube.com .googlevideo.com .video.google.com .video.google.com.au
> acl youtubeip dst 74.125.15.0/24 64.15.0.0/16
> cache allow youtube
> cache allow youtubeip
> cache allow esri
>
> http_access allow manager localhost
> http_access deny manager
> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports
> http_access allow localhost
> http_access allow esri
> http_access deny all
> http_reply_access allow all
> icp_access allow all
> coredump_dir /var/spool/squid
>
> # YouTube options.
> refresh_pattern -i \.flv$ 10080 90% 99 ignore-no-cache override-expire ignore-private
> quick_abort_min -1 KB
>
> # This will block other streaming media. Maybe we don't want this, but
> # using it for now.
> hierarchy_stoplist cgi-bin ?
> acl QUERY urlpath_regex cgi-bin \?
> cache deny QUERY
>
> I see logfile entries (and cached objects) that indicate my youtube
> videos are being saved to disk. However, they are never "HIT", even
> when the same server is used. I wonder if the refresh_pattern needs to
> be updated? The GET requests for the video do not have .flv in their
> filenames. What does refresh_pattern search for a match? The request
> URL? The resulting MIME type?
>
> That's it for now. :) Thanks in advance.
>
> Ray

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -
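Adrian's point about the same content living under many URLs/hosts is exactly why Ray's objects are cached but never HIT: the cache key is derived from the request URL, so each per-server hostname creates a distinct entry. A toy Python illustration (not Squid's real key function, which hashes the method plus URL):

```python
import re

def cache_key(url):
    # Toy stand-in for Squid's store key: with no normalization the key
    # is effectively the whole URL, so v1/v7/... hostnames split one
    # video across many cache entries.
    return url

def normalized_key(url):
    # What storeurl_rewrite achieves: collapse every variant of the
    # same video onto one canonical key.
    m = re.search(r'video_id=([^&]+)', url)
    return "videos.INTERNAL/ID=" + m.group(1) if m else url

a = "http://v1.cache.youtube.com/get_video?video_id=XYZ"
b = "http://v7.cache.youtube.com/get_video?video_id=XYZ"
assert cache_key(a) != cache_key(b)            # same video, two entries: always a MISS
assert normalized_key(a) == normalized_key(b)  # one entry after store-URL rewriting
```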
[squid-users] YouTube and other streaming media (caching)
Hello all, I'm beginning to implement a Squid setup and am in particular looking to cache YouTube, as it is a significant chunk of our traffic and we don't want to outright block it (yet).

I'm using squid-2.6.STABLE6 from RHEL 5.1 (latest errata). I've been reading around a lot and am seeking a bit of clarification on the current status of caching YouTube and potentially other streaming media. Specifically:

* Adrian mentions support for YouTube caching in 2.7 -- which seems
  to correspond with this changeset:

  http://www.squid-cache.org/Versions/v2/2.7/changesets/11905.patch

  Which would seem to be only a configuration file change. Is there
  any reason YouTube caching won't work correctly in my 2.6 version
  with a similar setup (and the rewriting script as well, I guess)?

* If there are additional changes to the 2.7 codebase that make YouTube
  caching possible, are they insignificant enough that they could
  easily be backported to 2.6? I'm trying to decide how I will
  convince Red Hat to incorporate this, as I doubt they'll want to
  move to 2.7. The alternative of course is to build from source, which
  I am open to.

My config file is as follows:

http_port 3128
append_domain .esri.com
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
maximum_object_size 4194240 KB
maximum_object_size_in_memory 1024 KB
access_log /var/log/squid/access.log squid
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320

acl all src 0.0.0.0/0.0.0.0
acl esri src 10.0.0.0/255.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT

# Some Youtube ACL's
acl youtube dstdomain .youtube.com .googlevideo.com .video.google.com .video.google.com.au
acl youtubeip dst 74.125.15.0/24 64.15.0.0/16
cache allow youtube
cache allow youtubeip
cache allow esri

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow esri
http_access deny all
http_reply_access allow all
icp_access allow all
coredump_dir /var/spool/squid

# YouTube options.
refresh_pattern -i \.flv$ 10080 90% 99 ignore-no-cache override-expire ignore-private
quick_abort_min -1 KB

# This will block other streaming media. Maybe we don't want this, but using
# it for now.
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

I see logfile entries (and cached objects) that indicate my youtube videos are being saved to disk. However, they are never "HIT", even when the same server is used. I wonder if the refresh_pattern needs to be updated? The GET requests for the video do not have .flv in their filenames. What does refresh_pattern search for a match? The request URL? The resulting MIME type?

That's it for now. :) Thanks in advance.

Ray
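To Ray's closing question: refresh_pattern rules are regular expressions matched against the request URL, in configuration order, first match wins; the reply MIME type is never consulted. That is why a rule like -i \.flv$ never fires for /get_video?video_id=... requests. A toy Python illustration of that matching discipline (the patterns are examples, not a full squid.conf):

```python
import re

# (regex, rule name) in configuration order, mimicking refresh_pattern's
# first-match-wins scan over the request URL text.
PATTERNS = [
    (r'\.flv$', "flv rule"),
    (r'get_video\?video_id', "youtube rule"),
    (r'.', "default rule"),
]

def matching_rule(url):
    for regex, name in PATTERNS:
        if re.search(regex, url, re.IGNORECASE):
            return name
    return None

# A bare "video.flv" URL matches the flv rule, but a real YouTube
# download URL has no ".flv" suffix and falls through to the
# get_video pattern instead.
```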