At 10:42 10/02/2003, you wrote:
Penny Bamborough wrote:
> We did write a
> perl version of the streetmap engine (with some help from some very nice
> people I might add) however performance tests on the system indicated that
> the processing power required would cripple the server and our site would
> be very short lived.

[...]

> The site has grown considerably since that time, we do use Win2k servers
> with our own IIS extensions written in C++ to power the site

I'm a little surprised by that.  Although I must admit that I've never
written IIS extensions in C++, I'm surprised that it offers significantly
better performance than a mod_perl solution.

So that makes me wonder whether your Perl solution was based on mod_perl or
whether you were using CGI scripts?  The latter would certainly explain the
discrepancy.

Any information you can share about your Perl (non-)solution would help us
understand why Perl didn't work for you.

Hi guys,

Firstly, as this was some time ago, you will have to forgive me as I may be somewhat rusty on the whole thing.

Basically, the script simply took in the geo search, converted it to a coordinate, and output HTML in the form of the results page. In isolation and under low load the script worked fine; however, when we actually tested it on the site we found that it used an extremely large amount of CPU and the site proved slow under load. I think we basically got a cascade effect: beyond a certain point, the Perl interpreter could not cope with the number of requests.

We tried both flat files and (I think) MySQL as the data repository, but neither had a noticeable effect on the overall result.

That said, I am *not* an expert in Perl. My native skills are C and C++ (plus experience of a myriad of other languages that have not been used for years :), and I have a reasonable amount of experience with databases (for example, I know what a clustered index is :).

The IIS extensions are DLLs that are loaded when the IIS process starts and are designed to handle request handling, parsing, and HTML generation. Because they are already loaded and resident in memory, there is no overhead of starting a process. We can also take advantage of storing certain things in global memory at startup to make some operations even faster. We use SQL Server 2000 for our now very large gazetteer, with tables indexed on just about every combination we use (there are very few updates done on these tables). Finally, we now run on a server farm, i.e. the site is handled by a cluster of servers rather than an individual server.

We have been writing the extensions in C++ for quite some time now, as this is a familiar environment, easy to debug interactively, with ready access to OS-level functions, and relatively fast. We deliver around 300GB/day through our server farm, and it is not unusual for us to have around 10,000 simultaneous connections to a server.

I am not saying that this could not be done in perl, rather that *we* could not do this in perl.

Penny
