Have you looked at the GC output during the test?
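The quickest check is to add -verbose:gc to the JAVA_OPTS you were already tuning, e.g.:

-Xms200m -Xmx300m -verbose:gc

That prints one line per collection with the heap occupancy before/after and the pause time, so you can see whether the time is going into GC at all. If full collections dominate, the 1.4 throughput collector (-server -XX:+UseParallelGC) might be worth a try; if GC turns out to be quiet, the time is going somewhere else, most likely the per-document XML-RPC round trips.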
On Thursday, November 20, 2003, at 03:46 PM, Jill Rhoads wrote:
Ok, I chose a slightly larger file (15k) and inserted it into Xindice 500 times. (Remember I have a 2.7 GHz Celeron and 512 MB RAM.) Here are my results:
1st trial:
Elapsed time (InsertDocument): 45.3489589691
Elapsed time (RemoveDocument): 7.29849600792
2nd trial:
Elapsed time (InsertDocument): 45.0858030319
Elapsed time (RemoveDocument): 6.85884797573
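(Both trials come out to roughly 500/45 ≈ 11 inserts per second and 500/7 ≈ 70 removes per second.)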
I did notice that Java only utilized at most 38% of my CPU at any time and 3.3% of my memory (512 MB). So I started playing with Java's options; here are my results with the following JAVA_OPTS set:
-Xms100m -Xmx150m
Elapsed time (InsertDocument): 34.5685210228
Elapsed time (RemoveDocument): 8.1172440052
-Xms200m -Xmx300m
Elapsed time (InsertDocument): 35.4143769741
Elapsed time (RemoveDocument): 8.80774796009
-Xms200m -Xmx300m -XX:MaxHeapFreeRatio=90 -XX:MinHeapFreeRatio=20
Elapsed time (InsertDocument): 36.0503109694
Elapsed time (RemoveDocument): 8.67905592918
-Xmx500m -Xms500m
Elapsed time (InsertDocument): 34.4147530794
Elapsed time (RemoveDocument): 7.81289899349
Not much of an improvement. BUT it does look scarily close to what you got with the smaller file, which could mean that the Java VM isn't using my machine any better than yours. There should be a way to streamline Java further, but I'm no expert at that. Perhaps someone else has some experience optimizing the Java VM. Anyone? I would be really interested in learning how to do this myself.
/Jill
Don Stocks said:
He he. I'm hardly in any position to laugh. Your box is running 8X faster than my old brick! ;)
I've attached a copy of the document I did my test with. I just put the contents of this file into a variable and inserted it into the database with an incrementing name (i.e. insertTest0 - insertTest499) inside a for loop.
Then I did a listDocuments call and looped through the array, deleting the documents. I timed both loops. My initial test was with a document about 1K in size, and the inserts were taking 90 - 100 seconds.
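In PHP it's roughly the following (simplified; xindice_call() is a made-up stand-in for however your XML-RPC library builds and posts the request, the filename is just illustrative, and the message/parameter names are what I'd expect from the Xindice XML-RPC docs, so double-check them against your setup):

<?php
// Read the attached test document into a variable (PHP 4 style).
$doc = implode('', file('insert_test.xml'));

// microtime() in PHP 4 returns "usec sec" as a string.
list($usec, $sec) = explode(' ', microtime());
$start = (float)$usec + (float)$sec;

// Insert 500 copies under an incrementing name: insertTest0 .. insertTest499.
for ($i = 0; $i < 500; $i++) {
    xindice_call('InsertDocument',
                 array('collection' => '/db/test',
                       'name'       => 'insertTest' . $i,
                       'document'   => $doc));
}

list($usec, $sec) = explode(' ', microtime());
echo 'Elapsed time (InsertDocument): ' . ((float)$usec + (float)$sec - $start) . "\n";

// List everything back and delete it, timing that loop the same way.
list($usec, $sec) = explode(' ', microtime());
$start = (float)$usec + (float)$sec;

$names = xindice_call('ListDocuments', array('collection' => '/db/test'));
foreach ($names as $name) {
    xindice_call('RemoveDocument',
                 array('collection' => '/db/test', 'name' => $name));
}

list($usec, $sec) = explode(' ', microtime());
echo 'Elapsed time (RemoveDocument): ' . ((float)$usec + (float)$sec - $start) . "\n";
?>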
Thanks for taking a peek at it. I'm curious to see how much of an impact the faster CPU has on performance.
- Don
Jill Rhoads <[EMAIL PROTECTED]> wrote:
Can you lend out those XML files? I can test to see if the performance issues can be solved by a better processor. I have a 2.7 GHz Celeron (don't laugh) with 512 MB of DDR.
Here's where my work is with the php browser, if you're interested.
http://www.rhoads.nu:8080/~jrhoads/2D1517/project/xindice_browser.php
As you can see from the bottom of the page, I am making sure performance statistics are going to be provided.
/Jill
Don Stocks said:
I'm curious to know what type of performance I can expect from Xindice when inserting and deleting documents. I am planning an application that will be managing a lot of very small documents; perhaps several thousand a minute. So I did some tests. I'm using PHP to communicate with Xindice running on the localhost via XML-RPC. Here's what I found.
Inserting 500 476-byte documents: 47 seconds
Deleting 500 476-byte documents: 31 seconds
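(That's only about 500/47 ≈ 11 inserts per second, while "several thousand a minute" means at least 2000/60 ≈ 33 per second, so I'm short by a factor of three before the app does anything else.)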
Unfortunately this is far too slow. I am running this test on my old junker 333 MHz laptop w/ 128 MB RAM, but inserting 500 rows with the same data into PostgreSQL on the same system only takes a couple of seconds.
I'm just looking for hints or tips to improve performance. I also want to validate Xindice as an appropriate choice for my data store; the native XML storage is really what I need.
Thanks, Don
Charles Hsiao
MRX Solutions Corp.
Smart Practice, Smart Solutions
http://www.mrxsolutions.com