[ccp4bb] Change of Email address

2016-12-20 Thread Ranvir Singh
I am subscribed to the ccp4bb mailing list with my current Yahoo mail
account. I am going to discontinue this account; kindly change my
subscription to my Gmail address, ranvirsin...@gmail.com
Ranvir Singh
Dept cum NCHGSR
Panjab University
Chandigarh, INDIA


Re: [ccp4bb] Rfree and unaccounted density

2007-09-05 Thread Ranvir Singh
Dear Vineet,
The following points should be kept in mind:
1. If you have been able to model most of the protein into the electron
density, then this is a bit strange; your refinement parameters suggest
that refinement is still far from convergence.
2. You might be trapped in a local minimum, in which case simulated
annealing might help.
3. As far as picking waters goes, it is perhaps a bit early to pick
waters at this stage of refinement.
4. You can also look at the Ramachandran plot to see which residues need
attention.
5. Something might have co-crystallized along with your protein; check
your crystallization conditions.
Ranvir

--- Vineet Gaur <[EMAIL PROTECTED]> wrote:

> Hi all,
> 
> I am solving a structure at 2.2 A using MIR. Right now my Rfree is
> 30.55% and Rcryst is 28.51%. I have already carried out B-factor
> refinement, and waters have also been picked up to a 2.3 sigma cutoff.
> Now I am not able to refine the structure any further.
> 
> The protein has been purified directly from the source. At this point
> of refinement I am able to see a lot of unaccounted densities, far too
> big to be accounted for by water or sugar, and we don't have any
> information about the possible molecules that can interact with this
> protein.
> 
> Could this unaccounted density be the reason why I am not able to
> refine the structure any further?
> 
> Thanks in advance,
> 
> Vineet Gaur
> 




   



Re: [ccp4bb] Highest shell standards

2007-03-22 Thread Ranvir Singh
I agree with Ulrich. Even at 3.0 A, it is possible to have a structure
with reasonable accuracy which can explain the biological function or is
consistent with available biochemical data.
Ranvir
--- Ulrich Genick <[EMAIL PROTECTED]> wrote:

> Here are my 2-3 cents worth on the topic:
> 
> The first thing to keep in mind is that the goal of a structure
> determination is not to get the best stats or to claim the highest
> possible resolution. The goal is to get the best possible structure and
> to be confident that observed features in a structure are real and not
> the result of noise.
> 
> From that perspective, if any of the conclusions one draws from a
> structure change depending on whether one includes data with an I/sigI
> in the highest resolution shell of 2 or 1, one probably treads on thin
> ice.
> 
> The general guideline that one should include only data for which the
> shell's average I/sigI > 2 comes from the following simple
> consideration.
> 
> 
> Since I is proportional to F^2, the relative error in F is half the
> relative error in I, so
> 
> F/sigF = 2 I/sigI
> 
> So if you include data with an I/sigI of 2, then your F/sigF = 4. In
> other words, you will have roughly a 25% experimental uncertainty in
> your F.
> Now assume that you actually knew the true structure of your protein
> and calculated the crystallographic R-factor between the Fcalcs from
> that structure and the observed F. In this situation, you would expect
> a crystallographic R-factor around 25%, simply because of the average
> error in your experimental structure factors.
> Since most macromolecular structures have R-factors around 20%, it
> makes little sense to include data where the experimental uncertainty
> alone will guarantee that your R-factor will be worse.
> Of course, these days maximum-likelihood refinement will just
> down-weight such data, and all you do is burn CPU cycles.
> 
> 
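The error-propagation arithmetic above can be checked numerically. A minimal illustration in plain Python (not output from any crystallography package), assuming only the stated relation sigF/F = 0.5 * sigI/I:

```python
# Error propagation from I/sigI to F/sigF, assuming I ~ F^2 so that the
# relative error in F is half the relative error in I.
def f_over_sigf(i_over_sigi):
    return 2.0 * i_over_sigi

def relative_f_error(i_over_sigi):
    # fractional experimental uncertainty in F
    return 1.0 / f_over_sigf(i_over_sigi)

print(f_over_sigf(2.0))       # 4.0
print(relative_f_error(2.0))  # 0.25 -> ~25% uncertainty in F
```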
> If you actually want to do a semi-rigorous test of where you should
> stop including data, simply include increasingly higher-resolution data
> in your refinement and see if your structure improves.
> If you have really high-resolution data (i.e. better than 1.2 Angstrom)
> you can do matrix inversion in SHELX and get estimated standard
> deviations (esds) for your refined parameters. As you include more and
> more data the esds should initially decrease. Simply keep including
> higher-resolution data until your esds start to increase again.
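
The stopping rule described above can be sketched as a tiny selection loop. The esd values below are made-up placeholders for illustration, not output from SHELX or any real refinement:

```python
# Sketch of the cutoff test: refine at successively higher resolution
# cutoffs, record a representative esd at each, and keep the cutoff
# where the esd is minimal (esds rise again past that point).
# (cutoff in Angstrom, esd) pairs -- illustrative values only.
cutoffs_and_esds = [
    (1.6, 0.0120), (1.5, 0.0100), (1.4, 0.0090),
    (1.3, 0.0085), (1.2, 0.0088), (1.1, 0.0095),
]

# Pick the cutoff with the smallest esd.
best_cutoff, best_esd = min(cutoffs_and_esds, key=lambda t: t[1])
print(best_cutoff)  # 1.3 -> include data to 1.3 A; beyond that, esds rise
```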
> 
> Similarly, for lower-resolution data you can monitor some molecular
> parameters which are not included in the stereochemical restraints and
> see if the inclusion of higher-resolution data makes the agreement
> between the observed and expected parameters better. For example, SHELX
> does not restrain torsion angles in the aliphatic portions of side
> chains. If your structure improves, those angles should cluster more
> tightly around +60, -60 and 180 degrees...
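
The clustering check can be made concrete: measure each torsion angle's circular distance to the nearest staggered value and track the RMS deviation as data are added. A minimal sketch with made-up angles, not values from a real structure:

```python
# Circular distance from an aliphatic torsion angle to the nearest
# staggered conformation (+60, -60 or 180 degrees).
def nearest_staggered_deviation(angle_deg):
    devs = []
    for target in (60.0, -60.0, 180.0):
        # wrap the difference into [-180, 180) and take its magnitude
        d = abs((angle_deg - target + 180.0) % 360.0 - 180.0)
        devs.append(d)
    return min(devs)

# Illustrative angles; a tighter cluster gives a smaller RMS deviation.
angles = [58.0, -63.0, 175.0, 170.0, 65.0]
rms = (sum(nearest_staggered_deviation(a) ** 2 for a in angles)
       / len(angles)) ** 0.5
print(round(rms, 2))  # 5.71
```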
> 
> 
> 
> 
> Cheers,
> 
> Ulrich
> 
> 
> > Could someone point me to some standards for data quality,
> > especially for publishing structures? I'm wondering in particular
> > about highest shell completeness, multiplicity, sigma and Rmerge.
> >
> > A co-worker pointed me to a '97 article by Kleywegt and Jones:
> >
> > http://xray.bmc.uu.se/gerard/gmrp/gmrp.html
> >
> > "To decide at which shell to cut off the resolution, we nowadays
> > tend to use the following criteria for the highest shell:
> > completeness > 80 %, multiplicity > 2, more than 60 % of the
> > reflections with I > 3 sigma(I), and Rmerge < 40 %. In our opinion,
> > it is better to have a good 1.8 Å structure, than a poor 1.637 Å
> > structure."
> >
> > Are these recommendations still valid with maximum likelihood
> > methods? We tend to use more data, especially in terms of the
> > Rmerge and sigma cutoff.
> >
> > Thanks in advance,
> >
> > Shane Atwell
> >
> 
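
The Kleywegt & Jones highest-shell criteria quoted in Shane's message translate directly into a simple check. The function name and argument names are my own; the thresholds are the ones quoted above:

```python
# Highest-shell acceptance test per the quoted 1997 Kleywegt & Jones
# criteria: completeness > 80%, multiplicity > 2, more than 60% of
# reflections with I > 3 sigma(I), and Rmerge < 40%.
def passes_kleywegt_jones(completeness_pct, multiplicity,
                          pct_i_gt_3sig, rmerge_pct):
    return (completeness_pct > 80.0 and
            multiplicity > 2.0 and
            pct_i_gt_3sig > 60.0 and
            rmerge_pct < 40.0)

print(passes_kleywegt_jones(85.0, 3.1, 65.0, 35.0))  # True
print(passes_kleywegt_jones(85.0, 3.1, 50.0, 35.0))  # False: too few I > 3 sigma
```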



 
