Hi,
The details are as follows:
Solaris version: Solaris 10 U5 and U6
For the Java setup, I have tried:
Sun JDK 1.5 (32-bit and 64-bit)
Sun JDK 1.6 (32-bit and 64-bit)
Heap space: 2 GB for the 32-bit JVM and 4 GB for the 64-bit JVM, with -Xms and
-Xmx set to the same value (sample flags below)
Disk: tried both ZFS (U6) and UFS (U5)
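For reference, the JVMs were launched with flags along these lines (the jar
name is only a placeholder, not the exact command used):
  java -d64 -server -Xms4g -Xmx4g -jar app.jar   (64-bit, fixed 4 GB heap)
  java -server -Xms2g -Xmx2g -jar app.jar        (32-bit, fixed 2 GB heap)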
I reduced the inode density on the partition because page faults were occurring
initially. The block size was 8 KB (see the newfs sketch below).
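A sketch of how such a UFS filesystem would be created (the device and the
bytes-per-inode value are placeholders; -i lowers the inode density and -b sets
the 8 KB block size):
  newfs -b 8192 -i 65536 /dev/rdsk/c1t0d0s6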
The following reports are on Solaris 10 U5 with UFS (8K block size).
The vmstat report is below:
# vmstat 3
kthr memory page disk faults cpu
r b w swap free re mf pi po fr de sr s0 s2 -- -- in sy cs us sy id
0 0 0 43408368 29603112 13 107 4 7 6 0 0 2 -0 0 0 458 589 321 0 0 99
0 0 0 22517072 30241872 830 7709 0 0 0 0 0 0 0 0 0 2537 30699 3899 36 3 61
0 0 0 24666416 30483248 860 7942 0 0 0 0 0 0 0 0 0 2235 31886 3362 28 3 69
0 0 0 24184008 30450728 791 7813 0 0 0 0 0 0 0 0 0 2335 31426 3740 33 3 65
0 0 0 23472024 30442168 813 7822 0 0 0 0 0 0 0 0 0 2494 31312 3923 35 3 62
0 0 0 22751768 30297240 787 7654 0 0 0 0 0 0 0 0 0 2498 31820 3892 35 3 62
0 0 0 25385568 30527600 835 7963 0 0 0 0 0 0 0 0 0 2360 32144 3691 29 3 68
0 0 0 22741056 30281376 787 7643 0 0 0 0 0 0 0 0 0 2249 30008 3484 37 3 60
0 0 0 22753800 30373072 847 7923 0 0 0 0 0 0 0 0 0 2501 31375 3854 33 3 64
0 0 0 23725600 30431312 841 8049 0 0 0 0 0 0 0 0 0 2362 32514 3752 32 3 65
0 0 0 22743008 30256368 826 7719 0 0 0 0 0 0 0 0 0 2427 32101 3773 34 3 63
0 0 0 23950336 30405312 818 7925 0 0 0 0 0 0 0 0 0 2337 32095 3606 31 3 66
0 0 0 22752872 30319312 778 7486 0 0 0 0 0 0 0 0 0 2435 30277 3769 37 3 60
0 0 0 24657848 30401408 836 7797 0 0 0 0 0 0 0 0 0 2469 31879 3770 32 3 65
0 0 0 23460408 30331056 776 7560 0 0 0 0 0 0 0 0 0 2461 31344 3883 37 3 60
0 0 0 22748352 30302248 792 7534 0 0 0 0 0 0 0 0 0 2424 30683 3733 36 3 61
0 0 0 23953848 30389280 837 7868 0 0 0 0 0 0 0 0 0 2280 30710 3511 34 3 63
0 0 0 22736592 30249672 810 7716 0 0 0 0 0 0 0 0 0 2345 30980 3593 34 3 63
kthr memory page disk faults cpu
r b w swap free re mf pi po fr de sr s0 s2 -- -- in sy cs us sy id
0 0 0 23951544 30400016 834 7992 0 0 0 0 0 0 0 0 0 2431 32647 3776 31 3 66
0 0 0 21797992 30245536 812 7635 0 0 0 0 0 0 0 0 0 2402 30213 3693 39 3 59
0 0 0 22745264 30307728 781 7539 0 0 0 0 0 0 0 0 0 2437 30744 3860 38 3 59
0 0 0 22748536 30286456 851 7987 0 0 0 0 0 0 0 0 0 2433 32842 3678 29 3 68
0 0 0 22747680 30297544 791 7514 0 0 0 0 0 0 0 0 0 2369 30625 3655 38 3 59
0 0 0 23104848 30309488 793 7524 0 0 0 0 0 0 0 0 0 2566 31190 3942 36 3 61
0 0 0 23460368 30340688 810 7784 0 0 0 0 0 0 0 0 0 2302 31430 3688 33 3 64
0 0 0 22031344 30215504 792 7397 0 0 0 0 0 0 0 0 0 2477 30733 3850 39 3 58
0 0 0 22744168 30277312 834 7829 0 0 0 0 0 0 0 0 0 2370 29926 3592 35 3 63
0 0 0 24177728 30432600 806 7789 0 0 0 0 0 0 0 0 0 2379 31870 3731 32 3 65
0 0 0 23467896 30382728 802 7810 0 0 0 0 0 0 0 0 0 2527 31866 3980 33 3 64
0 0 0 22757912 30331080 823 7847 0 0 0 0 0 0 0 0 0 2496 31580 3871 33 3 64
0 0 0 22043312 30215904 819 7558 0 0 0 0 0 0 0 0 0 2253 30252 3455 36 3 61
0 0 0 22744952 30275008 793 7657 0 0 0 0 0 0 0 0 0 2499 31840 3937 36 3 61
0 0 0 23715200 30419008 843 7906 0 0 0 0 0 0 0 0 0 2404 31741 3707 33 3 64
0 0 0 21307184 30148472 788 7515 0 0 0 0 0 0 0 0 0 2430 31340 3819 39 3 58
0 0 0 23459544 30332160 817 7687 0 0 0 0 0 0 0 0 0 2353 31011 3526 34 3 63
0 0 0 23467304 30376392 819 7846 0 0 0 0 0 0 0 0 0 2522 31918 3933 32 3 65
0 0 0 23473464 30421992 830 7854 0 0 0 0 0 0 0 0 0 2407 31714 3827 33 3 65
kthr memory page disk faults cpu
r b w swap free re mf pi po fr de sr s0 s2 -- -- in sy cs us sy id
0 0 0 22508648 30240256 799 7447 0 0 0 0 0 0 0 0 0 2524 30483 3911 39 3 58
0 0 0 22737496 30276864 837 7731 0 0 0 0 0 0 0 0 0 2310 30289 3473 36 3 61
0 0 0 22515488 30312728 792 7646 0 0 0 0 0 0 0 0 0 2472 31610 3867 35 3 62
0 0 0 22756248 30251584 823 7669 0 0 0 0 0 0 0 0 0 2637 31413 4119 36 3 60
0 0 0 23940168 30379336 833 7973 0 0 0 0 0 0 0 0 0 2240 32325 3472 31 3 66
0 0 0 24184736 30455928 846 7838 0 0 0 0 0 0 0 0 0 2288 31508 3538 32 3 65
0 0 0 22751808 30309984 776 7614 0 0 0 0 0 0 0 0 0 2480 31486 3906 36 3 61
0 0 0 22989624 30309536 813 7626 0 0 0 0 0 0 0 0 0 2537 30347 3988 38 3 59
0 0 0 22739224 30276936 814 7692 0 0 0 0 0 0 0 0 0 2261 31528 3409 34 3 63
0 0 0 23236200 30350216 853 7968 0 0 0 0 0 0 0 0 0 2676 32636 4167 32 3 65
0 0 0 24194016 30506960 836 8005 0 0 0 0 0 0 0 0 0 2222 31503 3382 30 3 67
0 0 0 23480504 30424536 834 7848 0 0 0 0 0 0 0 0 0 2390 32039 3736 32 3 65
0 0 0 22739048 30233176 812 7742 0 0 0 0 0 0 0 0 0 2604 31693 4029 35 3 62
0 0 0 23461664 30358528 805 7700 0 0 0 0 0 0 0 0 0 2463 30936 3922 36 3 61
0 0 0 22031616 30257808 796 7533 0 0 0 0 0 0 0 0 0 2226 29829 3377 37 3 60
0 0 0 23458168 30327688 813 7773 0 0 0 0 0 0 0 0 0 2388 31519 3763 33 3 64
0 0 0 23465184 30371904 834 7796 0 0 0 0 0 0 0 0 0 2510 32101 3933 34 3 63
0 0 0 22513648 30312240 800 7596 0 0 0 0 0 0 0 0 0 2416 30538 3734 38 3 59
0 0 0 22272824 30228912 802 7640 0 0 0 0 0 0 0 0 0 2292 31395 3469 34 3 63
kthr memory page disk faults cpu
r b w swap free re mf pi po fr de sr s0 s2 -- -- in sy cs us sy id
0 0 0 22744032 30253480 830 7758 0 0 0 0 0 0 0 0 0 2568 31896 3935 35 3 62
0 0 0 23287696 30368432 808 7736 0 0 0 0 0 0 0 0 0 2439 30929 3844 35 3 62
0 0 0 22038504 30253112 794 7480 0 0 0 0 0 0 0 0 0 2213 29774 3433 37 3 60
0 0 0 24660144 30480240 846 8103 0 0 0 0 0 0 0 0 0 2548 32825 3944 30 3 66
0 0 0 24183648 30445120 823 7895 0 0 0 0 0 0 0 0 0 2441 32699 3822 31 3 66
0 0 0 23469136 30391744 830 7916 0 0 0 0 0 0 0 0 0 2441 31731 3813 32 3 65
0 0 0 22516312 30289688 796 7592 0 0 0 0 0 0 0 0 0 2318 30310 3616 36 3 61
0 0 0 23943448 30363248 821 7821 0 0 0 0 0 0 0 0 0 2355 31789 3641 31 3 66
0 0 0 22742768 30284896 795 7602 0 0 0 0 0 0 0 0 0 2387 30608 3675 37 3 60
0 0 0 23475456 30416136 852 7991 0 0 0 0 0 0 0 0 0 2489 32066 3871 31 3 66
0 0 0 22756848 30233968 798 7588 0 0 0 0 0 0 0 0 0 2464 31106 3836 37 3 60
The cooltst report said virtually nothing; its output is below.
System Configuration
Host name: mmiAS02
System name: SUNW,SPARC-Enterprise-T5120
Effective UID: 0
Cooltst version: 3.0.1
OS: Solaris
OS release: 5.10
OS version: Generic_127127-11
Distro: Solaris
BIOS/PROM: OBP 4.28.0 2008/01/22 21:10
Memory: 32640 MB
Chip: UltraSPARC-T2
MHz: 1165
Architecture: SPARC
# of Virtual CPUs: 64
P0: 1165 MHz UltraSPARC-T2
P1: 1165 MHz UltraSPARC-T2
P2: 1165 MHz UltraSPARC-T2
P3: 1165 MHz UltraSPARC-T2
P4: 1165 MHz UltraSPARC-T2
P5: 1165 MHz UltraSPARC-T2
P6: 1165 MHz UltraSPARC-T2
P7: 1165 MHz UltraSPARC-T2
P8: 1165 MHz UltraSPARC-T2
P9: 1165 MHz UltraSPARC-T2
P10: 1165 MHz UltraSPARC-T2
P11: 1165 MHz UltraSPARC-T2
P12: 1165 MHz UltraSPARC-T2
P13: 1165 MHz UltraSPARC-T2
P14: 1165 MHz UltraSPARC-T2
P15: 1165 MHz UltraSPARC-T2
P16: 1165 MHz UltraSPARC-T2
P17: 1165 MHz UltraSPARC-T2
P18: 1165 MHz UltraSPARC-T2
P19: 1165 MHz UltraSPARC-T2
P20: 1165 MHz UltraSPARC-T2
P21: 1165 MHz UltraSPARC-T2
P22: 1165 MHz UltraSPARC-T2
P23: 1165 MHz UltraSPARC-T2
P24: 1165 MHz UltraSPARC-T2
P25: 1165 MHz UltraSPARC-T2
P26: 1165 MHz UltraSPARC-T2
P27: 1165 MHz UltraSPARC-T2
P28: 1165 MHz UltraSPARC-T2
P29: 1165 MHz UltraSPARC-T2
P30: 1165 MHz UltraSPARC-T2
P31: 1165 MHz UltraSPARC-T2
P32: 1165 MHz UltraSPARC-T2
P33: 1165 MHz UltraSPARC-T2
P34: 1165 MHz UltraSPARC-T2
P35: 1165 MHz UltraSPARC-T2
P36: 1165 MHz UltraSPARC-T2
P37: 1165 MHz UltraSPARC-T2
P38: 1165 MHz UltraSPARC-T2
P39: 1165 MHz UltraSPARC-T2
P40: 1165 MHz UltraSPARC-T2
P41: 1165 MHz UltraSPARC-T2
P42: 1165 MHz UltraSPARC-T2
P43: 1165 MHz UltraSPARC-T2
P44: 1165 MHz UltraSPARC-T2
P45: 1165 MHz UltraSPARC-T2
P46: 1165 MHz UltraSPARC-T2
P47: 1165 MHz UltraSPARC-T2
P48: 1165 MHz UltraSPARC-T2
P49: 1165 MHz UltraSPARC-T2
P50: 1165 MHz UltraSPARC-T2
P51: 1165 MHz UltraSPARC-T2
P52: 1165 MHz UltraSPARC-T2
P53: 1165 MHz UltraSPARC-T2
P54: 1165 MHz UltraSPARC-T2
P55: 1165 MHz UltraSPARC-T2
P56: 1165 MHz UltraSPARC-T2
P57: 1165 MHz UltraSPARC-T2
P58: 1165 MHz UltraSPARC-T2
P59: 1165 MHz UltraSPARC-T2
P60: 1165 MHz UltraSPARC-T2
P61: 1165 MHz UltraSPARC-T2
P62: 1165 MHz UltraSPARC-T2
P63: 1165 MHz UltraSPARC-T2
OS release detail:
Solaris 10 5/08 s10s_u5wos_10 SPARC Copyright 2008 Sun Microsystems, Inc. All Rights Reserved. Use is subject to license terms. Assembled 24 March 2008
Workload Measurements
Observed system for 10 min in intervals of 10 sec
Cycles: 44768051692942
Instructions: 3980806371547
CPI: 11.25 **
FP instructions: 5821938521
Emulated FP instructions: 0
FP percentage: 0.1%
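(For reference, the CPI figure is simply Cycles / Instructions:
44768051692942 / 3980806371547 ≈ 11.25.)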
The following applies to the measurement interval with the busiest single thread or process:
Peak thread utilization at: 2009-02-13 14:36:16
Corresponding file name: 1234515976
CPU utilization: 39.4%
Command: java
PID/LWPID: 16396/1
Thread utilization: 0.6%
More detail on processes and threads is in data/process.out
**Cycles per Instruction (CPI) is not comparable between UltraSPARC
T1 and T2 processors and conventional processors. Conventional
processors execute an idle loop when there is no work to do, so
CPI may be artificially low, especially when the system is
somewhat idle. The UltraSPARC T1 and T2 "park" idle threads,
consuming no energy, when there is no work to do, so CPI may
be artificially high, especially when the system is somewhat idle.
Advice
Floating Point: GREEN
Observed floating point content was not excessive for an UltraSPARC T1 processor. Floating point content is not a limitation for UltraSPARC T2.
Parallelism: GREEN
The observed workload has sufficient threads of execution to efficiently utilize the multiple cores and threads of an UltraSPARC T1 or UltraSPARC T2 processor.
Varun Dhussa
Product Architect
CE InfoSystems (P) Ltd
http://www.mapmyindia.com
Glen Newton wrote:
Could you give some configuration details:
- Solaris version
- Java VM version, heap size, and any other flags
- disk setup
You should also consider using huge pages (see
http://zzzoot.blogspot.com/2009/02/java-mysql-increased-performance-with.html)
I will also be posting performance gains from huge pages for large-scale Java
Lucene indexing on Linux in the next week or so...
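(On Solaris HotSpot JVMs, large pages would typically be requested with flags
roughly like the following; the page size shown is only an example, and the
sizes the T2 actually supports can be listed with "pagesize -a":)
  java -d64 -Xms4g -Xmx4g -XX:+UseLargePages -XX:LargePageSizeInBytes=256m ...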
-glen
2009/2/18 Varun Dhussa <va...@mapmyindia.com>:
Hi,
I have had a bad experience migrating my application from Intel Xeon based
servers to Sun UltraSPARC T2 T5120 servers. Lucene fuzzy search just does not
perform: a search that took approximately 500 ms now takes more than 6 seconds
to execute.
The index has about 100,000,000 records, so I tried splitting it into 10
indices and used the ParallelSearcher on them, but still got similar results.
I am guessing that this is because the edit-distance implementation used by
Lucene depends on single-thread clock speed and can't be parallelized much.
Please advise.
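For context, a minimal sketch of the sharded fuzzy search described above,
assuming a Lucene 2.x-era API and ParallelMultiSearcher (which I take to be
the class meant by ParallelSearcher); the index paths, field, term, and
prefixLength are placeholders. A non-zero prefixLength on FuzzyQuery is the
usual lever for cutting the term-enumeration cost:

  import org.apache.lucene.index.Term;
  import org.apache.lucene.search.FuzzyQuery;
  import org.apache.lucene.search.IndexSearcher;
  import org.apache.lucene.search.ParallelMultiSearcher;
  import org.apache.lucene.search.Searchable;
  import org.apache.lucene.search.TopDocs;

  public class ShardedFuzzySearch {
      public static void main(String[] args) throws Exception {
          // Placeholder paths for the index shards
          String[] shards = { "/data/index0", "/data/index1" }; // ... up to /data/index9
          Searchable[] searchers = new Searchable[shards.length];
          for (int i = 0; i < shards.length; i++) {
              searchers[i] = new IndexSearcher(shards[i]);
          }
          // Searches the shards in parallel, one thread per shard
          ParallelMultiSearcher searcher = new ParallelMultiSearcher(searchers);

          // prefixLength = 2 forces the first two characters to match exactly,
          // which greatly shrinks the set of terms the fuzzy enumerator scores
          FuzzyQuery query = new FuzzyQuery(new Term("name", "bangalore"), 0.6f, 2);
          TopDocs hits = searcher.search(query, null, 10);
          System.out.println("total hits: " + hits.totalHits);
          searcher.close();
      }
  }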
--
Varun Dhussa
Product Architect
CE InfoSystems (P) Ltd
http://www.mapmyindia.com
---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
For additional commands, e-mail: java-user-h...@lucene.apache.org