php-general Digest 25 Sep 2006 17:01:38 -0000 Issue 4367

Topics (messages 242181 through 242198):

Re: Alternative to FCKeditor
        242181 by: Lester Caine
        242183 by: Kae Verens

Re: Download files outside DocumentRoot Dir
        242182 by: Kae Verens
        242188 by: Miles Thompson
        242195 by: Christopher Weldon

Re: Error Reporting for file commands
        242184 by: James Nunnerley
        242197 by: Robert Cummings

Print or Echo takes lots of time
        242185 by: Sancar Saran
        242187 by: Colin Guthrie

Re: array_sum($result)=100
        242186 by: Ahmad Al-Twaijiry
        242194 by: Robin Vickery
        242196 by: Martin Alterisio
        242198 by: Robert Cummings

Re: Frustrated trying to get help from your site
        242189 by: Michelle Konzack
        242191 by: Ray Hauge
        242192 by: David Robley

How would you do this ?
        242190 by: Jad madi

Re: File Upload Security and chmod
        242193 by: tedd

Administrivia:

To subscribe to the digest, e-mail:
        [EMAIL PROTECTED]

To unsubscribe from the digest, e-mail:
        [EMAIL PROTECTED]

To post to the list, e-mail:
        php-general@lists.php.net


----------------------------------------------------------------------
--- Begin Message ---
Jon Anderson wrote:

John Taylor-Johnston wrote:

Anyone know of a good alternative to FCKeditor? Or a decent file uploader?
Even after paying for a little help, I get zip for FCK.
I need another solution, another editor with an active forum or support,
John

TinyMCE...I don't know how good/bad TinyMCE is, but if you can't use FCK, you could try it.

http://tinymce.moxiecode.com/

Looks like we are all looking for something that actually works :(
I switched TinyMCE off because it kept messing up good HTML, and I've been told to try FCKeditor ;) Currently I seem to be stuck with using Composer locally and dumping the results back to the site :(

--
Lester Caine - G8HFL
-----------------------------
L.S.Caine Electronic Services - http://home.lsces.co.uk
Model Engineers Digital Workshop - http://home.lsces.co.uk/ModelEngineersDigitalWorkshop/
Treasurer - Firebird Foundation Inc. - http://www.firebirdsql.org/index.php

--- End Message ---
--- Begin Message ---
John Taylor-Johnston wrote:
Anyone know of a good alternative to FCKeditor? Or a decent file uploader?
Even after paying for a little help, I get zip for FCK.
I need another solution, another editor with an active forum or support,
John

try my KFM addon for FCKeditor (http://kfm.verens.com/)

use the stable release for now, as the nightly release is being converted to use SQLite at the moment.

Kae

--- End Message ---
--- Begin Message ---
Ramiro wrote:
Hi,
I'm trying to find a good solution to this problem. I want to download files from a directory outside DocumentRoot.

These files cannot be downloaded through a direct URL like http://site/test.zip. They must only be downloadable after user login.

I know I can do that using functions like fread() + fopen() or readfile(), then echoing the file buffer to the browser with the correct headers. But reading and then dumping the file to the browser is a big problem for the server.


http://ie2.php.net/fpassthru
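For reference, a minimal sketch of the fpassthru() approach; the helper name and the assumption that a login check happens before calling it are mine, not part of the original suggestion:

```php
<?php
// Hypothetical helper: stream a file that lives outside DocumentRoot.
// fpassthru() sends the remaining bytes of an open stream straight to
// the output, so memory use stays flat regardless of file size
// (unlike reading the whole file into a string first).
function send_protected_file(string $path): void
{
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    header('Content-Length: ' . filesize($path));

    $fp = fopen($path, 'rb');
    fpassthru($fp); // stream in chunks; no large buffer is built in PHP
    fclose($fp);
}
```

Call it only after verifying the user's login (e.g. a session check), since the whole point is that the file is not reachable by direct URL.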

Kae

--- End Message ---
--- Begin Message ---
At 01:44 AM 9/25/2006, Ramiro wrote:

Hi,
I'm trying to find a good solution to this problem. I want to download files
from a directory outside DocumentRoot.

These files cannot be downloaded through a direct URL like
http://site/test.zip. They must only be downloadable after user login.

I know I can do that using functions like fread() + fopen() or
readfile(), then echoing the file buffer to the browser with the correct headers.
But reading and then dumping the file to the browser is a big problem for the server.

I've made one test that shows I will "eat" 1.8% of RAM (I used "ps
aux" on Linux, on a server with 2 GB of RAM) to download a 30 MB file at
60 KB/s. So, imagine what a file-dumping PHP script can do with 50 to 100
concurrent downloads. I would probably need a terabyte of RAM to provide
downloads ;)

Here's my question: is there another way to protect files against direct
downloading (obliging users to log in and denying direct URLs)?

I also know I can check the referer using mod_rewrite in Apache. But that isn't
secure, since the referer may not be sent, or may be faked.

Please, help me ;)

Thank you !
--------
Script i used to test:
<?php
$url = "test.tar.gz";

header('Content-Description: File Transfer');
header('Content-Type: application/force-download');
header("Content-Disposition: attachment; filename=\"".basename($url)."\";");
header('Content-Length: ' . filesize($url));
@readfile($url) OR die();
?>

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

This is the contents of a script used to fetch .swf's. The script is called from a Flash movie.

        $filenam = basename($_REQUEST["filenam"]); // basename() blocks ../ path traversal
        if ($filenam) {
                $contents = file_get_contents( "../above_root/" . $filenam );
                echo $contents;
        } else {
                echo "Not found";
        }

HTH - Miles



--- End Message ---
--- Begin Message ---
Ramiro wrote:
Hi,
I'm trying to find a good solution to this problem. I want to download files from a directory outside DocumentRoot.

These files cannot be downloaded through a direct URL like http://site/test.zip. They must only be downloadable after user login.

I know I can do that using functions like fread() + fopen() or readfile(), then echoing the file buffer to the browser with the correct headers. But reading and then dumping the file to the browser is a big problem for the server.

I've made one test that shows I will "eat" 1.8% of RAM (I used "ps aux" on Linux, on a server with 2 GB of RAM) to download a 30 MB file at 60 KB/s. So, imagine what a file-dumping PHP script can do with 50 to 100 concurrent downloads. I would probably need a terabyte of RAM to provide downloads ;)

Here's my question: is there another way to protect files against direct downloading (obliging users to log in and denying direct URLs)?

I also know I can check the referer using mod_rewrite in Apache. But that isn't secure, since the referer may not be sent, or may be faked.

Please, help me ;)

Thank you !
--------
Script i used to test:
<?php
$url = "test.tar.gz";

header('Content-Description: File Transfer');
header('Content-Type: application/force-download');
header("Content-Disposition: attachment; filename=\"".basename($url)."\";");
header('Content-Length: ' . filesize($url));
@readfile($url) OR die();
?>

What you can do is put the downloads in a separate directory actually in
your webroot. Then, use a .htaccess file to include a PHP file which
checks for authentication.

ie:

File in : /var/www/htdocs/downloads/file.zip
Accessible by: http://site/downloads/file.zip

.htaccess:

php_value auto_prepend_file "/var/www/htdocs/authenticate.php"

authenticate.php would theoretically have some code to check that the
user is authenticated, and if not, redirect to a login screen before any
headers are sent to the user.
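A sketch of what such an authenticate.php might contain; the 'user_id' session key and the /login.php URL are assumptions for illustration:

```php
<?php
// Hypothetical authenticate.php, prepended to every request under
// /downloads via auto_prepend_file. Returns false (after emitting a
// redirect header) when the visitor is not logged in.
function require_login(array $session): bool
{
    if (empty($session['user_id'])) {
        header('Location: /login.php'); // bounce to the login screen
        return false; // caller must exit before any file bytes are sent
    }
    return true;
}
```

The prepended file would then call session_start() and exit immediately when require_login($_SESSION) returns false, so Apache never serves the file to an unauthenticated visitor.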

--
Christopher Weldon, ZCE
President & CEO
Cerberus Interactive, Inc.
[EMAIL PROTECTED]
979.739.5874

--- End Message ---
--- Begin Message ---
Some time ago, I posted the email below regarding some problems with a file
manager we have developed for our users.

The problem still exists, and is now starting to cause "complaints".  Mainly
from my manager, who's fed up with receiving the error emails from the
system, but it must be annoying users too!

The scenario is: we have a fairly large Linux cluster, with a RAID disc.
Obviously, because of the number of disc calls, there's a fair amount of
caching.

We think the problem is related to this, and would ideally just like to
suppress the error, as it normally occurs when the script is trying to delete
something that has already been deleted.

I've tried turning off errors (error_reporting(0)) and indeed using @ on all
php file system calls, but it still triggers an error.

Does anyone know of any issues with suppressing errors on the following
functions?
- unlink
- mkdir
- copy
- scandir

All of these fail sporadically, even with the above error stuff turned off,
and trigger an error.

Anyone's thoughts gratefully received....

Thanks
Nunners

-----Original Message-----
From: James Nunnerley [mailto:[EMAIL PROTECTED] 
Sent: 25 July 2006 16:33
To: 'php-general@lists.php.net'
Subject: Error Reporting for file commands

We've created a file manager which allows users to access their web space on
a server.  It's working brilliantly, except that there would seem to be
some caching issues, either in the system cache or the web server cache, that
are causing us a headache.

When the script tries to delete a file, we always check (using file_exists)
to see whether the file exists before it's deleted.

The check comes back true, but the unlink then fails, saying there is no such
file or directory!

We've tried turning off all errors (using error_reporting(0)) but this
seems to make little difference: the call still comes back with
a failure.

We are using our own error handling, but before the command is carried out,
there is this error_reporting(0) call...

Does anyone know how we can stop these errors?

Cheers
Nunners

--- End Message ---
--- Begin Message ---
On Mon, 2006-09-25 at 11:53 +0100, James Nunnerley wrote:
> Sometime ago, I posted the email below, regarding some problems with a file
> manager we have developed for our users.
> 
> The problem still exists, and is now starting to cause "complaints".  Mainly
> from my manager, who's fed-up with receiving the error emails from the
> system, but it must be annoying users!
> 
> The scenario is; we have a fairly large Linux cluster, with a RAID disc.
> Obviously because of the number of disc calls, there's a fair amount of
> caching.
> 
> We think the problem is related to this, and would ideally like purely to
> suppress the error, as it's normally when the script is trying to delete
> something already deleted.
> 
> I've tried turning off errors (error_reporting(0)) and indeed using @ on all
> php file system calls, but it still triggers an error.
> 
> Does anyone know of any issues with suppressing errors on the following
> functions?
> - unlink
> - mkdir
> - copy
> - scandir
> 
> All of these fail sporadically, even with the above error stuff turned off,
> and trigger an error.
> 
> Anyone's thoughts gratefully received....

Are you using a custom error handler? If so, your error handler has to
properly handle errors that come in with the suppression operator
enabled.

You can detect suppression for any given error by checking if
error_reporting() is set to 0.
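As a sketch of that advice (the handler body and the $handled log are illustrative), checking the current error_reporting() value inside the handler is what makes @ work again. On PHP 4/5 the value is literally 0 under @; on later versions it is a reduced mask, so testing the mask against $errno covers both:

```php
<?php
// Sketch of a custom error handler that honours the @ operator.
// Inside the handler, error_reporting() reflects any suppression in
// effect at the call site; a suppressed level is absent from the mask.
$handled = [];

function my_error_handler(int $errno, string $errstr): bool
{
    global $handled;
    if ((error_reporting() & $errno) === 0) {
        return true; // @-suppressed (or below the reporting level): swallow it
    }
    $handled[] = $errstr; // normal path: log it, e-mail the admin, etc.
    return true; // tell PHP the error has been dealt with
}

set_error_handler('my_error_handler');
```

With this in place, @unlink(...) no longer reaches the reporting path, while unsuppressed failures still do.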

Cheers,
Rob.
-- 
.------------------------------------------------------------.
| InterJinn Application Framework - http://www.interjinn.com |
:------------------------------------------------------------:
| An application and templating framework for PHP. Boasting  |
| a powerful, scalable system for accessing system services  |
| such as forms, properties, sessions, and caches. InterJinn |
| also provides an extremely flexible architecture for       |
| creating re-usable components quickly and easily.          |
`------------------------------------------------------------'

--- End Message ---
--- Begin Message ---
Hi,

When I was checking the performance of my system I found some interesting results.

My code stores the HTML output in a variable. When page creation is complete, I
print out the variable.

The problem is that generating the HTML code takes 0.5 seconds, while just
echoing $strPage takes 2.0 seconds or more.

My code structure is:

$strPage = "<html> yada dayda";
...
$strPage .= " another html tags";
...
$strPage .= getSqlDataAndCreateSomeHtmlCOde();
...
// end of page creation
// Current total execution time: 0.5 seconds
print $strPage;
// Current total execution time: 2.5 seconds

$strPage carries the entire HTML structure (for example, the equivalent of 100K of HTML code).

Excluding cookies, other kinds of header transfers, and error messages,
no other print or echo command is issued.

Any ideas about this latency, or how to track down the problem?

Regards

Sancar Saran

--- End Message ---
--- Begin Message ---
Sancar Saran wrote:
> Hi,
> 
> When I was check the performance of my system I found interesting resuts.
> 
> My code stores html output into a variable. When page creation complete I 
> printed out the variable.
> 
> Problem was generation html code takes 0.5 second and just 
> echo $strPage takes 2.0 or more second.

I read a while ago that it is more efficient to echo as you go rather
than store up a large variable with the content in it.

You may want to investigate output buffering instead, as 9 times out of 10
the same results as using a big variable can be achieved.

It does seem odd nonetheless that it takes so long.
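A minimal sketch of the output-buffering variant; render_part() is a hypothetical stand-in for the code that currently appends to $strPage:

```php
<?php
// Echo as you go, but keep "whole page at once" delivery by writing
// into an output buffer instead of concatenating one giant string.
function render_part(): string
{
    return '<p>yada yada</p>'; // hypothetical page fragment
}

function render_page(): void
{
    ob_start();             // start buffering; echoes go to the buffer
    echo '<html><body>';
    echo render_part();     // echo incrementally instead of .= into $strPage
    echo '</body></html>';
    ob_end_flush();         // send the whole buffer in one write
}
```

This avoids repeatedly reallocating an ever-growing string while still letting you control when the page is actually sent.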

Col

--- End Message ---
--- Begin Message ---
Hi,

This seems to be a nice method. I tried to translate it to PHP, but
since I'm bad at English I couldn't :)

Could you give me a prototype of the method?



On 9/25/06, Google Kreme <[EMAIL PROTECTED]> wrote:
On 24 Sep 2006, at 10:41 , Penthexquadium wrote:
> On Sun, 24 Sep 2006 19:06:11 +0300, "Ahmad Al-Twaijiry"
> <[EMAIL PROTECTED]> wrote:
>> I have array of numbers and I want to get out of it a list of numbers
>> that if I sum them it will be 100, here is my list (for example ) :
>
> I think you can try to sort the array in reverse order, and then
> calculate the sum of numbers in loops (end loop when the sum is larger
> than target sum).

That seems like a very slow way to do it, I think.

First thing, sort the array, then scan the array backwards for the
first number under the target sum (in this case, 50).  Then look for
another number that will add to make the target sum.  So, if you find
a 90, look for a 10.  If you find it, you're done.  If you don't find
it, then search for the next smallest number (33) and add it.  Then
repeat.  If you don't find a match at 83, then add the next smallest
number, 20.  But that puts you over your target, so you discard 33
and start over with 50 and the next lowest number, 20.  Now you are
only looking for numbers <=20.

This might be the perfect place to use a recursive function, as long
as you are careful to limit its iteration cycles.

No way is this going to be done quickly.
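The backtracking idea described above can be sketched as a recursive function; the name and shape are my own, and it remains a plain exponential-time search, as the thread warns:

```php
<?php
// Recursive subset-sum sketch: returns one subset of $nums summing to
// $target, or null if none exists. Assumes non-negative numbers (the
// $target < 0 pruning depends on that). Worst case is exponential.
function subset_sum(array $nums, int $target): ?array
{
    if ($target === 0) {
        return []; // the empty subset reaches a zero target
    }
    if ($target < 0 || $nums === []) {
        return null; // overshot the target, or nothing left to try
    }
    $head = array_shift($nums); // $nums is a local copy, safe to mutate
    // Branch 1: include $head and solve for the remainder.
    $with = subset_sum($nums, $target - $head);
    if ($with !== null) {
        array_unshift($with, $head);
        return $with;
    }
    // Branch 2: skip $head entirely.
    return subset_sum($nums, $target);
}
```

With the poster's list, subset_sum($list, 100) returns one qualifying subset whose array_sum() is exactly 100.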

--
And, while it was regarded as pretty good evidence of criminality to
be living in a slum, for some reason owning a whole street of them
merely got you invited to the very best social occasions.





--

Ahmad Fahad AlTwaijiry

--- End Message ---
--- Begin Message ---
On 24/09/06, Ahmad Al-Twaijiry <[EMAIL PROTECTED]> wrote:
Hi everyone

I have an array of numbers and I want to get from it a list of numbers
that sum to 100. Here is my list (for example):

$list = array(10,20,10,10,30,50,33,110,381,338,20,11,200,100);


I want the result to be :

$result = array( 10,20,10,10,50);

as you can see in the array $result, if we array_sum($result) the
result will be 100.

Is there any algorithm to do this?

Ah, the Subset Sum Problem - this isn't school homework by any chance?

http://en.wikipedia.org/wiki/Subset_sum_problem

-robin

--- End Message ---
--- Begin Message ---
2006/9/25, Robin Vickery <[EMAIL PROTECTED]>:

On 24/09/06, Ahmad Al-Twaijiry <[EMAIL PROTECTED]> wrote:
> Hi everyone
>
> I have array of numbers and I want to get out of it a list of numbers
> that if I sum them it will be 100, here is my list (for example ) :
>
> $list = array(10,20,10,10,30,50,33,110,381,338,20,11,200,100);
>
>
> I want the result to be :
>
> $result = array( 10,20,10,10,50);
>
> as you can see in the array $result , if we array_sum($result) the
> result will be 100.
>
> is they any algorithm to do this ?

Ah, the Subset Sum Problem - this isn't school homework by any chance?


You're surely mistaken; there are many practical uses of the subset sum
problem in website development, like.... errr... shipping optimization!

--- End Message ---
--- Begin Message ---
On Mon, 2006-09-25 at 16:42 +0100, Robin Vickery wrote:
> On 24/09/06, Ahmad Al-Twaijiry <[EMAIL PROTECTED]> wrote:
> > Hi everyone
> >
> > I have array of numbers and I want to get out of it a list of numbers
> > that if I sum them it will be 100, here is my list (for example ) :
> >
> > $list = array(10,20,10,10,30,50,33,110,381,338,20,11,200,100);
> >
> >
> > I want the result to be :
> >
> > $result = array( 10,20,10,10,50);
> >
> > as you can see in the array $result , if we array_sum($result) the
> > result will be 100.
> >
> > is they any algorithm to do this ?
> 
> Ah, the Subset Sum Problem - this isn't school homework by any chance?
> 
> http://en.wikipedia.org/wiki/Subset_sum_problem

Cool, I didn't know it had a specific name; all I could think of was
that it sounded a lot like the knapsack problem. The Wikipedia article
indicates it's a special case of the knapsack problem.

Cheers,
Rob.
-- 
.------------------------------------------------------------.
| InterJinn Application Framework - http://www.interjinn.com |
:------------------------------------------------------------:
| An application and templating framework for PHP. Boasting  |
| a powerful, scalable system for accessing system services  |
| such as forms, properties, sessions, and caches. InterJinn |
| also provides an extremely flexible architecture for       |
| creating re-usable components quickly and easily.          |
`------------------------------------------------------------'

--- End Message ---
--- Begin Message ---
On 2006-09-22 13:36:40, Arno Kuhl wrote:

> I'm not sure which examples you're referring to but if you mean the user
> contributed notes then the download documentation does include this - at

Yes

> least one of the .chm versions does. It's great, but you need to download it

.chm ?  -  Windows ??? Meeeeeeeeeeeee??? <plof>@@@@

> regularly if you want the latest notes (obviously). Use one of the skins and
> it's even better (I use the phpZ skin which displays a tab for the user
> notes).

You mean in winhlp32.exe ? Right ?

It is a really nice tool, but unfortunately for the wrong OS.

I was already thinking of coding a "linhelp" program;
it seems there is one, but I have not found it.

I would like to have HTML files which I can put on my internal
documentation server.

Greetings
    Michelle Konzack
    Systemadministrator
    Tamay Dogan Network
    Debian GNU/Linux Consultant


-- 
Linux-User #280138 with the Linux Counter, http://counter.li.org/
##################### Debian GNU/Linux Consultant #####################
Michelle Konzack   Apt. 917                  ICQ #328449886
                   50, rue de Soultz         MSM LinuxMichi
0033/6/61925193    67100 Strasbourg/France   IRC #Debian (irc.icq.com)

--- End Message ---
--- Begin Message ---
On Friday 22 September 2006 16:48, Michelle Konzack wrote:
> Am 2006-09-22 13:36:40, schrieb Arno Kuhl:
> > I'm not sure which examples you're referring to but if you mean the user
> > contributed notes then the download documentation does include this - at
>
> Yes
>
> > least one of the .chm versions does. It's great, but you need to download
> > it
>
> .chm ?  -  Windows ??? Meeeeeeeeeeeee??? <plof>@@@@
>
> > regularly if you want the latest notes (obviously). Use one of the skins
> > and it's even better (I use the phpZ skin which displays a tab for the
> > user notes).
>
> You mean in winhlp32.exe ? Right ?
>
> It is a realy nice tool but unfortunatly for the false OS.
>
> I was already thinking on coding a "linhelp" program, but
> it seems there is one but I have not found it.
>
> I like to have html files which I can put on my internal
> documentation server.
>
> Greetings
>     Michelle Konzack
>     Systemadministrator
>     Tamay Dogan Network
>     Debian GNU/Linux Consultant

It's not the best in the world, but it works.

http://xchm.sourceforge.net/index.html

But since the documentation is online and always updated that way, I prefer to 
just use the website.

-- 
Ray Hauge
Programmer/Systems Administrator
American Student Loan Services
www.americanstudentloan.com
1.800.575.1099

--- End Message ---
--- Begin Message ---
Michelle Konzack wrote:

> Am 2006-09-22 13:36:40, schrieb Arno Kuhl:
> 
>> I'm not sure which examples you're referring to but if you mean the user
>> contributed notes then the download documentation does include this - at
> 
> Yes
> 
>> least one of the .chm versions does. It's great, but you need to download
>> it
> 
> .chm ?  -  Windows ??? Meeeeeeeeeeeee??? <plof>@@@@
> 
>> regularly if you want the latest notes (obviously). Use one of the skins
>> and it's even better (I use the phpZ skin which displays a tab for the
>> user notes).
> 
> You mean in winhlp32.exe ? Right ?
> 
> It is a realy nice tool but unfortunatly for the false OS.
> 
> I was already thinking on coding a "linhelp" program, but
> it seems there is one but I have not found it.
> 
> I like to have html files which I can put on my internal
> documentation server.

xchm is probably what you are thinking of - xchm.sourceforge.net



Cheers
-- 
David Robley

"Use your own toothbrush!" Tom bristled.
Today is Pungenday, the 49th day of Bureaucracy in the YOLD 3172. 

--- End Message ---
--- Begin Message ---
I'm building an RSS aggregator, so I'm trying to find the best way to
parse users' account feeds fairly. Let's say we have 20,000 users with an
average of 10 feeds per account, so we have about
200,000 feeds.

How would you schedule the parsing process to keep all accounts always
updated without killing the server? NOTE: some of the 200,000 feeds
might be shared between more than one user.

Now, what I was thinking of is to split users into:
1-) Idle users (check their account once a week; no traffic on their RSS
feeds)
2-) Idle++ users (check their account once a week, but with traffic on their
RSS feeds)
3-) Active users (check their accounts regularly; they have traffic on
their RSS feeds)

NOTE: The week is just an example; in the end it's going to be a
dynamic ratio.

With this classification I can split the parsing power and time into:
1-) 10% idle users
2-) 20% idle++ users
3-) 70% active users.

NOTE: There are other factors that should be included, but I don't want
to muddle the idea now (CPU usage, memory usage, connectivity issues
if a feed's site is down); in general, the MAX execution time for the
continuous parsing loop shouldn't be more than 30 to 60 minutes.
Actually I'm thinking of writing a daemon to do it: "just keep checking
CPU/memory" and execute whenever a reasonable amount of resources is
available, without killing the server.
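For what it's worth, the tier split and the 10/20/70 budget above could be sketched like this; the 7-day threshold and both function names are invented for illustration:

```php
<?php
// Classify a user into the three tiers described above. The 7-day
// threshold stands in for the "dynamic ratio" the poster mentions.
function classify_user(int $daysSinceLastVisit, int $feedTraffic): string
{
    if ($daysSinceLastVisit > 7) {
        return $feedTraffic > 0 ? 'idle++' : 'idle';
    }
    return 'active';
}

// Split a per-run parsing budget (feeds per cycle) across the tiers
// using the 10/20/70 ratio. intdiv() keeps the arithmetic exact.
function allocate_budget(int $feedsPerRun): array
{
    return [
        'idle'   => intdiv($feedsPerRun * 10, 100),
        'idle++' => intdiv($feedsPerRun * 20, 100),
        'active' => intdiv($feedsPerRun * 70, 100),
    ];
}
```

A daemon could then pull at most the allocated number of feeds from each tier's queue per cycle, which naturally favours active users without ever starving idle ones.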


Please elaborate.

--- End Message ---
--- Begin Message ---
At 9:32 PM -0600 9/24/06, Andy Hultgren wrote:
Hi Tedd,

Yes, when I browse to www.myDomain.com I get the index.html file, and so I
have been leaving the .public_html/ directory alone since it is not my
root.  I'm curious, what you described is exactly what I'm trying to do -
what permissions do you set the parent folder at when you are finished
uploading/saving/downloading/etc.?  I have my "uploaded_images/"
directory set at chmod 0100 and I can still browse to an uploaded image from
my file upload page...  Thanks for your response,


Andy:

I ran into the same problem trying to work with, and understand, permissions on a virtual host. When I asked this gang about permissions some time back, I received answers that ranged from RTFM to calling me stupid for using 0777, but none answered my question. No fault of the gang, I probably didn't ask the question correctly. In any event, I felt too stupid to ask the question again, so I went elsewhere looking for answers and eventually found something that works for me.

Some consider me a novice, so I'll ask the gang to review my comments to make sure that I'm not guiding you down the wrong path.

As you know, the key to setting the permissions of a file depends upon the permissions of the parent folder. If the parent folder permission is set to 0777, then we can change any files inside the folder as we want. However, that also presents a major security hole, because then anyone can use that folder to upload and run evil code.

So, the key problem is how to alter parent folder permissions.

With virtual hosting, we can upload, manage, and set permissions as we want via our FTP client software. So, I thought perhaps PHP had something like that, and along the way I discovered how to make an FTP connection via PHP.

Now, not all PHP ftp_* commands are available in PHP 4, but you can connect to your site and change the permissions of folders, which is what we actually need. So, if you want to do something with a file: change the permissions of the folder that holds it; do whatever you want with the file; and then change the folder permissions back to something safe.

You can also create new folders if you want using the command ftp_mkdir().

Note that the beginning of the FTP paths differs from the URL paths we would normally use to locate a file. For example:

An example web path:

http://www.yourdomain.com/rw/tmp/text.txt

An example symbolic link:

public_html/rw/tmp/text.txt

The following code will show you an example of how this works. Just put in your own domain, user id, password, and correct paths and try it out. Change the permissions in the code and watch how the file permissions change.

Please let me know if this works for you -- watch for line breaks.

hth's

tedd

PS: I don't know what to say about your ".public_html/" directory, but I would just leave it alone.

---

// how to call the function

<?php

$ftp_path = "public_html/rw/";  // note the ftp path
$theDir = "tmp";
$theFile ="text.txt";
FtpPerms($ftp_path, $theDir, $theFile);
?>


// the function

<?php
// create directory and change permissions via FTP connection

function FtpPerms($path, $theDir, $theFile)
{
    $server = 'ftp.yourdomain.com'; // ftp server
    $connection = ftp_connect($server); // connection

    $user = "you";
    $pass = "yourpassword";
    $result = ftp_login($connection, $user, $pass); // login to ftp server

    if ((!$connection) || (!$result))
    {
        echo("No connection<br/>");
        return false;
    }
    else
    {
        echo("Made connection<br/>");
        ftp_chdir($connection, $path); // go to destination dir

        echo("Change permission<br/>");
        $str = "CHMOD 0755 " . $theDir; // change permissions for dir (note the space after 0755)
        ftp_site($connection, $str);
        echo("$str<br/>");

        $filename = "$theDir/$theFile";
        $contents = "This is the contents of the file.";

        echo("<hr><br/>Writing file <br/><br/>");

        $file = fopen($filename, "w");
        fwrite($file, $contents);
        fclose($file);
        chmod($filename, 0755);

        echo("Change permission<br/>");
        $str = "CHMOD 0600 " . $theDir; // change permissions back for dir
        ftp_site($connection, $str);
        echo("$str<br/>");

        echo("Close connection<br/>");
        ftp_close($connection); // close connection
    }
}
?>
--
-------
http://sperling.com  http://ancientstones.com  http://earthstones.com

--- End Message ---
