Karl Rossing wrote:
> Ghee,
>
> b121 is available now, did the GSD bug fix make it in?
It went into glib 2.20 according to the upstream maintainer, and we have 
2.20.4 in b121, so you should have it :)
Let me know if it still causes problems.

-Ghee
>
> Karl
>
> Ghee Teo wrote:
>> Hi Karl,
>>
>> Thanks for all the information you have provided; it helps. As Lin 
>> has said, your problem looks like 6718912, although the original 
>> submitter there saw the problem in gnome-panel rather than 
>> gnome-settings-daemon; the behaviour is similar. I have done some 
>> investigation and narrowed down the root cause, but the solution is 
>> not clear yet. What I have found so far is this:
>>
>> - Since gsd 2.24.0, font monitoring has changed from xft2 to 
>> fontconfig, and changes to fonts are monitored using the gio API. 
>> These changes appear to cause (number of font directories, 88 on 
>> snv b111 [1]) x (number of entries in /etc/mnttab) calls to 
>> ioctl(19, MNTIOC_GETMNTENT, 0x08047614) whenever /etc/mnttab is 
>> updated.
>> - This is particularly bad in a Sun Ray environment, since a user 
>> login is likely to cause an update of /etc/mnttab, and the total 
>> number of ioctl calls is multiplied by the number of users logged 
>> in to the system at the same time.
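>>
>> If you want to confirm this on your own system, a rough (untested) 
>> DTrace sketch along these lines should show how many ioctl calls 
>> each process makes over a one-minute window; look for the 
>> gnome-settings-daemon rows in the output. Treat it as a starting 
>> point, not a polished script:
>>
>>   #!/usr/sbin/dtrace -s
>>   /* Count ioctl calls per process name and pid for 60 seconds. */
>>   syscall::ioctl:entry
>>   {
>>           @calls[execname, pid] = count();
>>   }
>>
>>   /* Stop after one minute; the aggregation is printed on exit. */
>>   tick-60sec
>>   {
>>           exit(0);
>>   }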
>>
>>
>> I have logged http://bugzilla.gnome.org/show_bug.cgi?id=585360 for 
>> now and will continue to look into this.
>>
>> -Ghee
>>
>> [1] The number of font directories is made up of the entries in the 
>> <!-- Font directory list --> section of /etc/fonts/fonts.conf plus 
>> all the directories pulled in via /etc/fonts/conf.d/.
>> If you are brave enough to play around with these font lists, you can 
>> most likely reduce the number of ioctl calls significantly.
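>>
>> For reference, the directory list section of /etc/fonts/fonts.conf 
>> looks something like this (the paths below are only illustrative; 
>> the exact entries differ from system to system):
>>
>>   <!-- Font directory list -->
>>   <dir>/usr/share/fonts</dir>
>>   <dir>~/.fonts</dir>
>>
>> Every directory listed there (and every one added via 
>> /etc/fonts/conf.d/) counts toward the multiplier above, so trimming 
>> the list should reduce the number of ioctl calls accordingly.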
>>
>>
>> Karl Rossing wrote:
>>> Lin Ma wrote:
>>>> On 06/03/09 05:11, Karl Rossing wrote:
>>>>> Lin,
>>>>>
>>>>> It seems that all 56 gnome-settings-daemon processes issue an 
>>>>> ioctl at the same time every 5 seconds. That drives the load 
>>>>> averages up.
>>>>>
>>>>> We currently have 36 gnome-settings-daemon processes running, and 
>>>>> they seem to issue an ioctl every minute or so.
>>>>>
>>>>> I'm not sure at what point you want a pstack done, or of which 
>>>>> gnome-settings-daemon process.
>>>> I don't know the best way to find out what the highest-frequency 
>>>> ioctl is related to, so could you use dtrace aggregations to find 
>>>> out the most frequent ustack (ustack is a dtrace function)?
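>>>> For example, something along these lines (untested; <pid> is just 
>>>> a placeholder for one of the gnome-settings-daemon pids) should 
>>>> show which user stacks are behind the ioctl calls most often when 
>>>> you stop it with Ctrl-C:
>>>>
>>>>   dtrace -n 'syscall::ioctl:entry /pid == $target/
>>>>       { @[ustack()] = count(); }' -p <pid>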
>>> I ran dtruss and have the output if anyone is interested. I don't 
>>> think I can attach files to this list.
>>>
>>> Karl
>>>
>>>
>>
>
>
>