php-general Digest 7 Aug 2008 09:01:10 -0000 Issue 5611
Topics (messages 277778 through 277800):
[scalability, performance, DoS] To or not to process images at runtime
277778 by: Marcelo de Moraes Serpa
277792 by: Per Jessen
277793 by: Nathan Nobbe
277796 by: Per Jessen
277798 by: Marcelo de Moraes Serpa
277799 by: Per Jessen
Re: Version Control Software
277779 by: Waynn Lue
277781 by: Shawn McKenzie
277785 by: Colin Guthrie
277786 by: Ross McKay
277789 by: Robert Cummings
Re: Variable number of parameters
277780 by: Shawn McKenzie
Re: PHP Brain Teasers
277782 by: tedd
Re: An appeal to your better nature
277783 by: Børge Holen
277787 by: Shawn McKenzie
277788 by: Robert Cummings
277794 by: Yeti
Uploading Large Files - Strange Issue
277784 by: Anna Vester
277790 by: Jay Blanchard
277791 by: Anna Vester
php File upload
277795 by: Tom
277800 by: Per Jessen
Anyone use FileMaker
277797 by: Micah Gersten
Administrivia:
To subscribe to the digest, e-mail:
[EMAIL PROTECTED]
To unsubscribe from the digest, e-mail:
[EMAIL PROTECTED]
To post to the list, e-mail:
[EMAIL PROTECTED]
----------------------------------------------------------------------
--- Begin Message ---
Hello,
My next project will be a kind of online photo viewer. All of these photos
will need to have a watermark applied to them. The problem is that, depending
on the picture, different watermarks need to be applied. The easiest
solution would be to process these pictures at runtime using GD, apply the
watermark(s) and serve them. The other approach would be to pre-process
them (maybe using GD) and create different copies on disk, the obvious
advantage being that they could be served directly by the webserver (Apache),
but it would be much harder to manage (need to fix a watermark error?
Re-process and re-create the images on disk...) and would take much more
disk space. I would rather process them at runtime, per request; however,
this site will probably have lots of traffic. So, I've reached a dead end.
Could someone share his/her experiences and thoughts and help me decide? :)
FYI, the application would be custom-built from the ground up using PHP 5
(not sure if we will use a framework; if we do, it will probably be
CakePHP). At first, there would be no clusters, proxies or balancers,
just a plain dedicated server with a good CPU, about 4 GB RAM and
lots of disk space.
PS: I've put DoS in the subject tagline meaning Denial of Service, as I
think that dynamic processing of images times lots of requests could
result in a DoS.
Thanks in advance,
Marcelo.
--- End Message ---
--- Begin Message ---
Marcelo de Moraes Serpa wrote:
> My next project will be a kind of online photo viewer. All of these
> photos will need to have watermark applied to them. The problem is
> that, depending on the picture, different watermarks need to be
> applied. The easiest solution would be to process these picture at
> runtime using GD, apply the watermark(s) and serve them. The other
> approach, would be to pre-process them (maybe using GD) and create
> different copies on the disk, the obvious advantage being that it
> could be served directly via the webserver (apache), but, it would be
> much harder to manage (need to fix a watermark error? Re-process and
> re-create the images on the disk...) and would take much more disk
> space. I would rather process them at runtime, per request, however,
> this site will probably have lots of traffic. So, I've reached a
> deadend. Could someone share his/her experiences and thoughts and help
> me decide? :)
I think it depends on the amount of traffic you expect -
high - off-line
low-to-medium - on-line, on-demand, but cached.
Disk-space is cheap, especially if you don't need to be worried about
backup etc. I'm not sure why you think applying watermarks in an
off-line process would be any less manageable than doing it on-line.
> FYI, The application would be custom built from the ground up using
> PHP 5 (Not sure if we will use a framework, if we happen to use, it
> will be probably CakePHP). At first, there would be no clusters,
> proxies or balancers, just a plain dedicated server with a good CPU,
> about 4GB RAM and lots of disk space.
Sounds like you are planning to do the processing off-line then. You
could even do a mix - if you've got a lot of photos (millions and
millions), applying the watermarks could take a while in itself, so you
could leave that running slowly in the background, but combine it with
an on-line process that does on-demand watermarking (when the photo is
displayed).
/Per Jessen, Zürich
--- End Message ---
--- Begin Message ---
On Wed, Aug 6, 2008 at 3:04 PM, Marcelo de Moraes Serpa <[EMAIL PROTECTED]
> wrote:
> Hello,
>
> My next project will be a kind of online photo viewer. All of these photos
> will need to have watermark applied to them. The problem is that, depending
> on the picture, different watermarks need to be applied. The easiest
> solution would be to process these picture at runtime using GD, apply the
> watermark(s) and serve them. The other approach, would be to pre-process
> them (maybe using GD) and create different copies on the disk, the obvious
> advantage being that it could be served directly via the webserver
> (apache),
> but, it would be much harder to manage (need to fix a watermark error?
> Re-process and re-create the images on the disk...) and would take much
> more
> disk space. I would rather process them at runtime, per request, however,
> this site will probably have lots of traffic. So, I've reached a deadend.
> Could someone share his/her experiences and thoughts and help me decide? :)
>
> FYI, The application would be custom built from the ground up using PHP 5
> (Not sure if we will use a framework, if we happen to use, it will be
> probably CakePHP). At first, there would be no clusters, proxies or
> balancers, just a plain dedicated server with a good CPU, about 4GB RAM and
> lots of disk space.
>
> PS: I've put DoS in the subject tagline meaning Denial of Service as I
> think
> that maybe dynamic processing of images X lots of request could result in
> DoS.
for the code that will invoke the watermarking, put it behind another layer,
so that you can easily alter it in the future as the site grows. for
example, you might use strategy pattern, and your initial strategy will use
the current webserver directly. however, as the site begins to grow, you
can add additional webservers, dedicated to running gd on top of php. you
can then write a strategy which will pass the requests off to those box(es),
and it will be transparent to your existing code that knows only of the
strategy interface.
also, as you grow, distributed filesystems are key. for example, your
front-end webserver can handle requests from users on the site, dispatch a
request (restful for instance) to another box, dedicated to gd. since both
boxes share a common filesystem via nfs (or other) the gd box can create the
watermark, which will then be immediately available to the front-end box,
which it could signal w/ another request to say 'hey, the watermark is
ready'.
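[editor's note] The strategy layering described above could be sketched roughly like this; every class and method name here is made up for illustration, and the bodies are stubs:

```php
<?php
// Hypothetical names throughout -- a sketch of the layering described above.
interface WatermarkStrategy
{
    /** Watermark $srcPath and write the result to $dstPath. */
    public function watermark($srcPath, $dstPath);
}

// Initial strategy: run GD directly on the current webserver.
class LocalGdStrategy implements WatermarkStrategy
{
    public function watermark($srcPath, $dstPath)
    {
        // ... imagecreatefromjpeg($srcPath), stamp, imagejpeg(..., $dstPath) ...
        return true;
    }
}

// Later strategy: dispatch the job to a dedicated image box (e.g. RESTfully),
// with shared storage (NFS etc.) making $dstPath visible to the front end.
class RemoteBoxStrategy implements WatermarkStrategy
{
    public function watermark($srcPath, $dstPath)
    {
        // ... HTTP request to the GD box here ...
        return true;
    }
}

// Calling code knows only the interface, so strategies can be swapped freely.
function publishPhoto(WatermarkStrategy $s, $src, $dst)
{
    return $s->watermark($src, $dst);
}
```

Because publishPhoto() depends only on the interface, adding the remote boxes later touches no existing calling code.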
-nathan
--- End Message ---
--- Begin Message ---
Bernhard Kohl wrote:
> I think it also depends on the size of your images. If they are huge
> megapixel files processing them on the fly might cause severe lag.
> Still adding a watermark to an image with 100-200 thousand pixels is
> done within milliseconds on a modern machine.
>
(You probably meant to send this to the list)
The OP spoke about "a kind of online photo viewer", so I assumed e.g.
JPEGs at 1024x768 as a typical size, so about 700K pixels.
/Per Jessen, Zürich
--- End Message ---
--- Begin Message ---
@Per Jessen
Disk-space is cheap, especially if you don't need to be worried about
backup etc. I'm not sure why you think applying watermarks in an
off-line process would any less manageable than doing it on-line.
Well, the processing will be "online" in the sense that it will be triggered
via an admin interface. The pictures will then be batch-processed by a php
script using GD and saved to the disk and later served statically, without
the overhead of applying the watermark per-request, at runtime.
Less manageable because I would have to keep copies of the pictures on the
disk. If I ever want to change these watermarks, I would have to somehow
recreate them. It is more work than if I used the per-request runtime
watermarking approach, since in that case I would just apply the
watermarks I wanted and then serve the stream directly from memory.
Sounds like you are planning to do the processing off-line then. You
> could even do a mix - if you've got a lot of photos (millions and
> millions), applying the watermarks could take a while in itself, so you
> could leave that running slowly in the background, but combine it with
> an on-line process that does on-demand watermarking (when the photo is
> displayed).
>
Yes, applying the watermarks "offline" in a batch to lots of images could
take a while, but the album wouldn't be published before this process is
done. So, I don't really understand what you mean by mixing the two
approaches.
@Nathan
for the code that will invoke the watermarking, put it behind another layer,
> so that you can easily alter it in the future as the site grows. for
> example, you might use strategy pattern, and your initial strategy will use
> the current webserver directly. however, as the site begins to grow, you
> can add additional webservers, dedicated to running gd on top of php. you
> can then write a strategy which will pass the requests off to those boxe(s),
> and it will be transparent to your existing code that knows only of the
> strategy interface.
>
> also, as you grow, distributed filesystems are key. for example, your
> front-end webserver can handle requests from users on the site, dispatch a
> request (restful for instance) to another box, dedicated to gd. since both
> boxes share a common filesystem via nfs (or other) the gd box can create the
> watermark, which will then be immediately available to the front-end box,
> which it could signal w/ another request to say 'hey, the watermark is
> ready'.
>
You have come up with some great insights; the strategy idea seems nice and
could work. Adding dedicated "image processing" boxes is a good idea, even
better if the software to apply it is written in C, but I don't think my use
case justifies such an investment of time and money.
Another thing you mentioned that is of great interest to me is the use
of a distributed filesystem. Since I think I will just pre-process the
images in batch to add the watermark, the use of HDD space will grow
considerably as time goes by and the app grows. Is this approach transparent
enough that architectural changes to the app wouldn't be necessary?
Thank you all for the replies!
Marcelo.
On Thu, Aug 7, 2008 at 3:52 AM, Per Jessen <[EMAIL PROTECTED]> wrote:
> Bernhard Kohl wrote:
>
> > I think it also depends on the size of your images. If they are huge
> > megapixel files processing them on the fly might cause severe lag.
> > Still adding a watermark to an image with 100-200 thousand pixels is
> > done within milliseconds on a modern machine.
> >
>
> (You probably meant to send this to the list)
>
> The OP spoke about "a kind of online photo viewer", so I assumed e.g.
> JPEGs at 1024x768 as a typical size, so about 700K pixels.
>
>
> /Per Jessen, Zürich
>
>
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>
--- End Message ---
--- Begin Message ---
Marcelo de Moraes Serpa wrote:
> Less manageable because I would have to keep copies of the pictures on
> the disk. If I ever want to change these watermarks, I would have to
> somehow recreate them. It is more work to do than if I used the
> per-request runtime applying of watermark approach, since in this
> case, I would just apply the watermarks I wanted and then serve the
> stream directly from memory.
Hmm, I don't usually think "more work" = "less manageable", but that's a
matter for you.
My personal take on this type of thing -
I would go for the on-demand watermarking, but with a cached copy of
everything that is watermarked. "on-demand" = "when a photo is
published the first time". Like Bernhard said earlier, it probably
takes a few milliseconds to apply a watermark, so the very first time a
photo is viewed, the viewer might just experience the slightest delay.
With apache this is really easy to do:
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-s
RewriteRule ^(.+)$ apply_watermark.php?name=$1
This means: if <photo-with-watermark> doesn't exist,
run "apply-watermark.php" to apply a watermark, write the
<photo-with-watermark> to cache/disk, and then output the watermarked
photo.
If you need to change the watermark, just erase the cached copies and
they're regenerated next time someone wants to view a photo. To save
on disk-space if that is a concern, you can run regular purges of
cached copies that haven't been viewed for a while:
find <cachedir> -atime +30 -type f | xargs rm
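[editor's note] A sketch of what that apply_watermark.php might contain; the directory layout, watermark text and JPEG-only handling are assumptions, not the poster's actual script:

```php
<?php
// Sketch only: the /srv/photos layout and the watermark text are assumptions.
function watermark_and_cache($srcFile, $dstFile, $text = '(c) example.com')
{
    $img = imagecreatefromjpeg($srcFile);
    if ($img === false) {
        return false;
    }
    // A simple text stamp; a PNG logo via imagecopy() works the same way.
    $white = imagecolorallocate($img, 255, 255, 255);
    imagestring($img, 5, 10, 10, $text, $white);
    $ok = imagejpeg($img, $dstFile, 85);   // write the cached copy to disk
    imagedestroy($img);
    return $ok;
}

// Request handling as the RewriteRule implies; basename() blocks ../ tricks.
if (isset($_GET['name'])) {
    $name = basename($_GET['name']);
    $src  = '/srv/photos/original/' . $name;  // assumed source location
    $dst  = '/srv/photos/cache/' . $name;     // the path Apache checked with !-s
    if (watermark_and_cache($src, $dst)) {
        header('Content-Type: image/jpeg');
        readfile($dst);                        // serve the fresh cached copy
    }
}
```

On the next request Apache finds the cached file and serves it statically, never reaching PHP.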
/Per Jessen, Zürich
--- End Message ---
--- Begin Message ---
Does subversion meet your needs? You can check out a working copy
that's your dev copy, then check in changes and push to production
whenever you want.
Waynn
On 8/6/08, Benjamin Darwin <[EMAIL PROTECTED]> wrote:
> After reading a topic on the list here about someone losing their website,
> and having a minor mistake on my own that cost me a week's work on a file
> (basically, tested the file, then uploaded to the live site and took the
> daily backup off the live site.. only to find the file was messed up.. and
> had to go to the weekly backup off cd to recover it, losing a week of
> work)..
>
> I'm wondering if anybody knows of a version control software program that
> may fit my needs.
>
> Basically, I'm looking for something that runs locally, not on the live
> site, that I can edit the files on the dev computer, and store old versions
> on the dev computer, and then just publish off of the local onto the live
> site whenever I need to.
>
> Anybody have any suggestons/ideas on how this should be done, and what
> program is a good fit?
>
> Thanks for any help,
> Ben
>
--- End Message ---
--- Begin Message ---
Benjamin Darwin wrote:
After reading a topic on the list here about someone losing their website,
and having a minor mistake on my own that cost me a week's work on a file
(basically, tested the file, then uploaded to the live site and took the
daily backup off the live site.. only to find the file was messed up.. and
had to go to the weekly backup off cd to recover it, losing a week of
work)..
I'm wondering if anybody knows of a version control software program that
may fit my needs.
Basically, I'm looking for something that runs locally, not on the live
site, that I can edit the files on the dev computer, and store old versions
on the dev computer, and then just publish off of the local onto the live
site whenever I need to.
Anybody have any suggestons/ideas on how this should be done, and what
program is a good fit?
Thanks for any help,
Ben
I use SVN for my local development. It is very easy I think. I use
Aptana IDE which has SVN support. I use linux, but I'm sure there is
SVN for winbloze.
-Shawn
--- End Message ---
--- Begin Message ---
Benjamin Darwin wrote:
Anybody have any suggestons/ideas on how this should be done, and what
program is a good fit?
Personally I think subversion is best suited for web projects (which can
have high graphics churn) and git is best suited for code projects
(which have text differences).
I'm really loving git the more I use it for various projects, but I
really don't think I'd be bothered with its complexity if you are new
to VCS.
Subversion is great. You can either run a repository locally or
remotely and check out your "working copy" to do the actual changes, and
commit back to the repository when you are done. If you keep your
repository store locally, then make sure you have a good backup policy
there!
Personally, I keep my subversion repository on a central server in the
office. It's connected via a standard ADSL but I can SSH in from the
outside world. Live servers connect directly into it to do their
checkouts for actually running the site. I do this on demand so it
doesn't matter if the office server is offline etc. as I just make sure
it goes online before updating. That way I'll usually have a full local
checkout on my dev machine, the master repository and of course the live
servers. This is quite reassuring from a backup perspective :D
If you do go for subversion, I can recommend Trac as an excellent web
based frontend to the repository to allow you to view it nicely. It also
has a wiki for keeping notes and a ticketing system; with a few plugins
I wrote (WorkLogPlugin, ClientsPlugin) it is ideal for tracking time
spent on various tasks for various clients in order to issue invoices etc.
As for frontends, on Winblows, Tortoise SVN is the defacto one, but if
you use Eclipse there are a few options there (Subclipse and Subversive)
too.
HTHs
Col
--- End Message ---
--- Begin Message ---
On Wed, 6 Aug 2008 16:42:23 -0400, "Benjamin Darwin" wrote:
>[...]
>I'm wondering if anybody knows of a version control software program that
>may fit my needs.
>
>Basically, I'm looking for something that runs locally, not on the live
>site, that I can edit the files on the dev computer, and store old versions
>on the dev computer, and then just publish off of the local onto the live
>site whenever I need to. [...]
A couple of very easy-to-use ones are Subversion and CVS. Both are very
easy to use from a shell / command line, and both have nice GUIs
available for both Windows and *nix. Many editors and IDEs will work
with CVS directly, and some with Subversion.
I chose Subversion because I was trying to move SWMBO off Windows onto
Linux, and the GUIs for Subversion were similar enough and simple enough
on both (TortoiseSVN on Windows, RapidSVN on Linux). Subversion has some
nice options for setting up network servers if you need to go down that
path too (although you probably would get by nicely using local file
storage).
Under Windows, TortoiseSVN comes with a pretty good diff / merge tool
built-in. Under Linux, you'll want to grab Meld.
If you're doing website development by yourself with no self-built
common code libraries (or frameworks!) then you probably won't even need
to worry about stuff like branching. If you have set up some common code
libraries, then it's a good idea to look at branching so that you can
support older sites on older versions of the libraries whilst further
developing them for newer sites.
http://subversion.tigris.org/
http://tortoisesvn.tigris.org/
http://rapidsvn.tigris.org/
http://meld.sourceforge.net/
Of course, a good IT professional would probably tell you to use git,
with its 132-odd shell commands... ;)
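[editor's note] For reference, the basic local-repository cycle described above looks something like this; all paths are examples (temp directories stand in for the dev machine and the release area):

```shell
# Create a local repository, work in a checkout, commit, then export a
# clean copy (no .svn metadata) suitable for publishing to the live site.
REPO=$(mktemp -d)/repo
WC=$(mktemp -d)/wc
REL=$(mktemp -d)/release

svnadmin create "$REPO"                          # the repository itself
svn checkout -q "file://$REPO" "$WC"             # your working copy
echo '<?php echo "hello"; ?>' > "$WC/index.php"
svn add -q "$WC/index.php"
svn commit -q -m "first version" "$WC"
svn export -q "file://$REPO" "$REL"              # publishable snapshot
ls "$REL"
```

`svn export` is the usual publish step since it omits the `.svn` directories a plain checkout would drag onto the live site.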
--
Ross McKay, Toronto, NSW Australia
"And don't forget ladies and gentlemen
you have to buy this new thing that you don't have
and if you have it
well actually
the new better version of the thing that you have
well it just came out" - Jackson Jackson
--- End Message ---
--- Begin Message ---
On Thu, 2008-08-07 at 09:43 +1000, Ross McKay wrote:
> On Wed, 6 Aug 2008 16:42:23 -0400, "Benjamin Darwin" wrote:
>
> >[...]
> >I'm wondering if anybody knows of a version control software program that
> >may fit my needs.
> >
> >Basically, I'm looking for something that runs locally, not on the live
> >site, that I can edit the files on the dev computer, and store old versions
> >on the dev computer, and then just publish off of the local onto the live
> >site whenever I need to. [...]
>
> A couple of very easy-to-use ones are Subversion and CVS. Both are very
> easy to use from a shell / command line, and both have nice GUIs
> available for both Windows and *nix. Many editors and IDEs will work
> with CVS directly, and some with Subversion.
While I currently use CVS, I probably wouldn't choose it going forward
since Subversion solves many of the problems it has... as does GIT if I
recall. I'm still using CVS because it works for me and I haven't
allocated the time yet to switch over.
Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP
--- End Message ---
--- Begin Message ---
Philip Thompson wrote:
Is it possible to grab a variable number of parameters and send the
appropriate amount to another function?
<?php
// Some class
$this->db->prepare("SELECT * FROM `table` WHERE (`id`=?)");
$this->db->bind('i', $id1);

$this->db->prepare("SELECT * FROM `table` WHERE (`id`=? AND `other_id`=?)");
$this->db->bind('ii', $id1, $id2);

// DB class
function bind () {
    $args = func_get_args();
    $this->statement->bind_param($args[0], $args[1], ...);
}
?>
Ok, is it possible to send any number of variables to db->bind() in
order to send those to statement->bind_param()?
Or, if someone else has a better db abstraction method, feel free to
educate...
Thanks,
~Phil
I'm confused as your code looks like it's already doing what you're
asking. It's hard to tell without seeing what bind_param() looks like.
But just a thought to use arrays in one of two ways:
// 1.
$this->db->bind('ii', $id1, $id2);

function bind () {
    $args = func_get_args();
    $this->statement->bind_param($args);
    // then bind_param() can count the number of args and use them
}

// 2.
$this->db->bind('ii', array($id1, $id2));

function bind ($var, $ids) {
    // pass through to bind_param()
    $this->statement->bind_param($var, $ids);
    // then bind_param() can use the $ids array
    // or count the ids and send individual args to bind_param()
}
As I said, it's kind of hard to tell without knowing exactly what you
want to achieve.
One of the more elegant ways that I have seen is to pass required args
and then an array of args that the receiving function can use, like:
$this->db->bind('ii', array('someoption'=>$id1, 'feature'=>$id2,
'action'=>'doit'));
Need more info on the desired outcome.
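[editor's note] The usual PHP answer to "forward however many arguments I received" is call_user_func_array(). A stand-alone sketch, with a plain function standing in for the statement object's bind_param():

```php
<?php
// Stand-in for $this->statement->bind_param(); just reports what arrived.
function fake_bind_param()
{
    return func_num_args();
}

// bind() accepts any number of arguments and forwards them all onward.
function bind(/* $types, $var1, $var2, ... */)
{
    $args = func_get_args();
    // call_user_func_array() calls the target with exactly these arguments,
    // however many there are.
    return call_user_func_array('fake_bind_param', $args);
}
```

One caveat if the real target is mysqli_stmt::bind_param(): it requires its variables by reference, so the $args array has to be rebuilt as an array of references before the call_user_func_array() call.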
-Shawn
--- End Message ---
--- Begin Message ---
At 12:51 PM -0400 8/5/08, Daniel Brown wrote:
On Tue, Jun 26, 2007 at 1:44 PM, Daniel Brown <[EMAIL PROTECTED]> wrote:
Try to use some really common phrases that all of us around the
world should recognize, but feel free to get really elaborate with the
> code.
function getToDoList($wifeList)
{
    list($doWhat, $doWhen) = explode("\r", $wifeList);
    $anarray1 = split(" ", $doWhat);
    $anarray2 = split(" ", $doWhen);
    $tomorrow = mktime(1, 0, 0, $m, $d + 1, $y);
    for ($i = 0; $i < count($anarray1); $i++)
    {
        if (in_array($anarray2[$i], $tomorrow))
        {
            $putoff = $tomorrow;
        }
        else
        {
            $putoff = rand($excuse);
        }
        return $putoff;
    }
}
Cheers,
tedd
PS: Not guaranteed to work.
--
-------
http://sperling.com http://ancientstones.com http://earthstones.com
--- End Message ---
--- Begin Message ---
On Wednesday 06 August 2008 20:00:50 tedd wrote:
> At 9:11 AM -0400 8/5/08, Daniel Brown wrote:
> >On Tue, Aug 5, 2008 at 8:53 AM, Aschwin Wesselius
> >
> ><[EMAIL PROTECTED]> wrote:
> >> I wouldn't like to lose my stuff, but I can't afford much for the best
> >> solutions either. It's not that my job depends on it, but personal data
> >> is a big loss too.
> >
> > Tell me about it. One of the sickest feelings in the world comes
> >when you hear your hard drive start going "click.... click....
> >choke.... click...."
>
> No question about it.
>
> That's the reason why I have three backup systems. One in my garage
> in a fireproof safe that's in another fireproof safe; One hidden in
> my house in another fireproof safe; And one attached to my main
> computer that I backup everyday, or more often, which is stored (when
> not in use) in a yet another waterproof and fireproof safe. (I'm big
> on fireproof safes) All of which are protected by me and my guns. The
> only way I'm going to lose any data is if my place is struck by a
> meteor.
>
> Sure it takes a lot of time to backup, but less than the alternative.
I'm just gonna comment some here. Hell, it's a bitch to lose data, but I give
you this... These "pros" here ain't getting things done, no time for it
between backups.
In the time they use each year on backups you could write new code tenfold. ;D
muhaha
>
> Cheers,
>
> tedd
>
> --
> -------
> http://sperling.com http://ancientstones.com http://earthstones.com
--
---
Børge Holen
http://www.arivene.net
--- End Message ---
--- Begin Message ---
Richard Heyes wrote:
Hi,
Seems my 1and1 server has finally gone kaput taking my website with
it, and in the tradition of all good IT professionals, I have no
backups. :( So this is an appeal to you to ask if you have downloaded
anything from phpguru.org at all, could you please send it to me so I
can try to rebuild my site.
A big thanks.
Wooo, thanks for the reminder. Off to do my annual backups.
-Shawn
--- End Message ---
--- Begin Message ---
On Thu, 2008-08-07 at 00:46 +0200, Børge Holen wrote:
> On Wednesday 06 August 2008 20:00:50 tedd wrote:
> > At 9:11 AM -0400 8/5/08, Daniel Brown wrote:
> > >On Tue, Aug 5, 2008 at 8:53 AM, Aschwin Wesselius
> > >
> > ><[EMAIL PROTECTED]> wrote:
> > >> I wouldn't like to lose my stuff, but I can't afford much for the best
> > >> solutions either. It's not that my job depends on it, but personal data
> > >> is a big loss too.
> > >
> > > Tell me about it. One of the sickest feelings in the world comes
> > >when you hear your hard drive start going "click.... click....
> > >choke.... click...."
> >
> > No question about it.
> >
> > That's the reason why I have three backup systems. One in my garage
> > in a fireproof safe that's in another fireproof safe; One hidden in
> > my house in another fireproof safe; And one attached to my main
> > computer that I backup everyday, or more often, which is stored (when
> > not in use) in a yet another waterproof and fireproof safe. (I'm big
> > on fireproof safes) All of which are protected by me and my guns. The
> > only way I'm going to lose any data is if my place is struck by a
> > meteor.
> >
> > Sure it takes a lot of time to backup, but less than the alternative.
>
> I'm just gonna comment some here. Hell, it's a bitch to lose data, but I
> give
> you this... These "pro's" here, ain't getting things done, no time for it
> between backups.
> The time they use each year on backup you can write new code tenfold. ;D
> muhaha
What are you yammering on about? A simple cron job can do the backup
while I sleep-- "Look ma... I work while I sleep!"
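[editor's note] For the record, such a cron job is one line in a crontab; paths are examples, and note that % must be escaped as \% inside crontab entries:

```cron
# m h dom mon dow  command -- nightly site backup at 03:15
15 3 * * * tar czf /backup/site-$(date +\%F).tar.gz /var/www/mysite
```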
Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP
--- End Message ---
--- Begin Message ---
Backups? What's that?
--- End Message ---
--- Begin Message ---
Hello group,
I have a very strange issue coming up when uploading large files ( about
30MB). The problem is it works fine on my computer (and two others that I've
tested), but it doesn't work on my client's laptop. It comes up with error
code - 0 (which is upload successful), but the actual file is not on the
server. Here is my error checking code:
if ($sizeOK && $typeOK) {
    switch ($_FILES['items']['error'][$number]) {
        case 0:
            if (!file_exists(UPLOAD_DIR.$file)) {
                $success = move_uploaded_file(
                    $_FILES['items']['tmp_name'][$number], UPLOAD_DIR.$file);
            }
            else {
                $success = move_uploaded_file(
                    $_FILES['items']['tmp_name'][$number],
                    UPLOAD_DIR.$postDate.$file);
                $cp = true;
            }
            if ($success) {
                $result[] = "$file uploaded successfully";
            }
            else {
                $result[] = "Error uploading $file. Please try again.";
            }
            break;
        case 3:
            $result[] = "Error uploading $file. Please try again.";
            break; // note: without this break, case 3 also falls into default
        default:
            $result[] = "System error uploading $file. Contact Webmaster.";
    }
}
elseif ($_FILES['items']['error'][$number] == 4) {
    $result[] = 'You chose not to add this file.';
}
else {
    $result[] = "$file cannot be uploaded. Maximum size: $max.<br />" .
        "Acceptable file types: pdf and mp3.<br />Error number: " .
        $_FILES['items']['error'][$number] . "<br />";
}
=====================================
So for some reason on his computer it doesn't go to the switch statement
(case 0), but goes to the very last else statement.
So he always get this message:
>bigfiles.mp3 cannot be uploaded. Maximum size: 51,000.00KB.
>Acceptable file types: pdf and mp3
>Error number: 0
Yet, it always seems to work when I do it on my computer.
Any insight into this issue would be very helpful.
Thank you.
Anna Vester
--- End Message ---
--- Begin Message ---
[snip]
I have a very strange issue coming up when uploading large files ( about
30MB). The problem is it works fine on my computer (and two others that
I've
tested), but it doesn't work on my client's laptop. It comes up with
error
code - 0 (which is upload successful), but the actual file is not on the
server. Here is my error checking code:
if ($sizeOK && $typeOK) {
switch($_FILES['items']['error'][$number]) {
case 0:
if(!file_exists(UPLOAD_DIR.$file)) {
$success =
move_uploaded_file($_FILES['items']['tmp_name'][$number],
UPLOAD_DIR.$file);
}
else {
$success =
move_uploaded_file($_FILES['items']['tmp_name'][$number],
UPLOAD_DIR.$postDate.$file);
$cp = true;
}
if ($success) {
$result[] = "$file uploaded
successfully";
}
else {
$result[] = "Error uploading $file.
Please
try again.";
}
break;
case 3:
$result[] = "Error uploading $file. Please try
again.";
default:
$result[] = "System error uploading $file.
Contact
Webmaster.";
}
}
elseif ($_FILES['items']['error'][$number] == 4) {
$result[] = 'You chose not to add this file.';
}
else {
$result[] = "$file cannot be uploaded. Maximum size: $max.<br />
Acceptable file types: pdf and mp3.<br />Error number: " .
$_FILES['items']['error'][$number]."<br />";
}
=====================================
So for some reason on his computer it doesn't go to the switch statement
(case 0), but goes to the very last else statement.
So he always get this message:
>bigfiles.mp3 cannot be uploaded. Maximum size: 51,000.00KB.
>Acceptable file types: pdf and mp3
>Error number: 0
Yet, it always seems to work when I do it on my computer.
Any insight into this issue would be very helpful.
[/snip]
It is likely that it is not PHP causing the issue. What browser is he
using? What are his security settings for the browser? Have you viewed
the source of the upload form on his browser?
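[editor's note] One more thing worth checking on the failing machine: PHP reports upload problems via the error codes in $_FILES, and if I remember right a request that exceeds post_max_size arrives with $_FILES effectively empty rather than carrying a nonzero code. A small helper to make the built-in codes readable (the message strings are my own):

```php
<?php
// Map PHP's built-in UPLOAD_ERR_* constants to readable text.
function upload_error_message($code)
{
    switch ($code) {
        case UPLOAD_ERR_OK:        return 'no error';                         // 0
        case UPLOAD_ERR_INI_SIZE:  return 'exceeds upload_max_filesize';      // 1
        case UPLOAD_ERR_FORM_SIZE: return 'exceeds MAX_FILE_SIZE form field'; // 2
        case UPLOAD_ERR_PARTIAL:   return 'only partially uploaded';          // 3
        case UPLOAD_ERR_NO_FILE:   return 'no file was uploaded';             // 4
        default:                   return 'unknown error';
    }
}
```

Logging upload_error_message() output on the server side may show whether the client's request is even reaching PHP intact.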
--- End Message ---
--- Begin Message ---
[snip]
-----Original Message-----
From: Jay Blanchard [mailto:[EMAIL PROTECTED]
Sent: Wednesday, August 06, 2008 9:44 PM
To: Anna Vester; [EMAIL PROTECTED]
Subject: RE: [PHP] Uploading Large Files - Strange Issue
It is likely that it is not PHP causing the issue. What browser is he
using? What are his security settings for the browser? Have you viewed
the source of the upload form on his browser?
[/snip]
He is using IE7 on Vista, he has a number of fishing add-ons (I don't
remember which exactly since I've seen his laptop only once before this
problem arose). Also he's been using "You send it" service just fine. That's
what bums me!
Thanks for such a quick response.
Anna
--- End Message ---
--- Begin Message ---
Hi,
on a Linux system (SuSE 10.2) with 1 GB memory it's not possible to upload
a 1 GB file via HTTP. That's no limit problem in my PHP config. I can watch
the memory stats while uploading, and the growing tmp file. When the temp
file reaches 900 MB, free main memory is 0, the script aborts and PHP
deletes the tmp file.
Why doesn't PHP use swap memory?
Greets, Tom
--- End Message ---
--- Begin Message ---
Tom wrote:
> Hi,
>
> on a linux system (Suese 10.2) with 1 GB memory its not possible to
> upload via http a 1 Gb File. Thats no limit problem on my php
> config. i can look the mem stats when uploading and the growing tmp
> file. If the temp file has 900 MB, Main Memory free is 0 and the
> script aborts and php deletes the tmp file.
>
> Why don't php use swap memory ?
It doesn't need to - as you've noticed, the uploaded file is being
written to disk, it's not being kept in memory.
This sounds like a php limit problem to me.
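[editor's note] These are the php.ini settings that usually cap large uploads, for comparison against the failing box; the values here are illustrative only:

```ini
; php.ini -- uploads are streamed to a temp file on disk, so memory_limit
; normally does not need to cover the file size.
upload_max_filesize = 1100M
post_max_size       = 1200M   ; must be >= upload_max_filesize
max_input_time      = 600     ; seconds allowed for receiving the request
```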
/Per Jessen, Zürich
--- End Message ---
--- Begin Message ---
I'm wondering if anyone here has experience integrating FileMaker with
PHP using either ODBC or JDBC.
--
Thank you,
Micah Gersten
onShore Networks
Internal Developer
http://www.onshore.com
--- End Message ---