Re: [ccp4bb] Error message while refining protein-DNA complex structure in Refmac5

2009-03-16 Thread Yusuf Akhter
Hi Rajkumar,

A few days back a similar problem was happening to me too.

I just changed the version of Coot and it worked.

In my case, only Coot (ver. 4.1) did the job correctly.

Please try Coot (version 4.1) in case you are using any other version.

HTH,

Y



On Mon, Mar 16, 2009 at 11:43 PM, E rajakumar  wrote:

> Dear All
> I am refining a protein-DNA complex structure in
> Refmac5. When I used a coordinate file containing 2
> fewer bases, the refinement ran smoothly and
> perfectly. But when I built 2 extra bases onto the
> existing DNA in Coot, the refinement failed with the
> following error message.
>
> /usr/local/ccp4-6.1.0/bin/refmac5
> XYZIN"/usr6/rajkumar/APS/hem/mar9/H2/molrep/BCDNA-built2-NCS-refm2.pdb"
> XYZOUT "/tmp/rajkumar/hemCG_19_2_pdb_1.tmp" HKLIN
> "/usr6/rajkumar/APS/hem/mar9/H2/molrep/P6122.mtz"
> HKLOUT "/tmp/rajkumar/hemCG_19_3_mtz_1.tmp" LIBOUT
> "/usr6/rajkumar/APS/hem/mar9/H2/molrep/hemCG_19_lib.cif"
>
> has failed with error message
> At line 2486 of file
> /usr/local/xtal/ccp4-6.1.0/src/refmac5_/make_PDB.f
> Fortran runtime error: Bad value during floating point
> read
>
> It seems there is an error in LIB file generation.
> The coordinate format and atom labelling are according to
> the Refmac convention.
>
> Please can anybody suggest how I should troubleshoot this?
>
> Thanking you
> Rajakumara
>
>
>
>
> E. Rajakumara
> Postdoctoral Fellow
>  Structural Biology Program
>  Memorial Sloan-Kettering Cancer Center
>  New York-10021
>  NY
>  001 212 639 7986 (Lab)
>  001 917 674 6266 (Mobile)
>
>
>
>



-- 
*
Present Address:

Yusuf Akhter, DAAD Fellow
EMBL Hamburg c/o DESY, Notkestraße 85,
22603 Hamburg, Germany
Mobile: +49-17676339706



[ccp4bb] Error message while refining protein-DNA complex structure in Refmac5

2009-03-16 Thread E rajakumar
Dear All
I am refining a protein-DNA complex structure in
Refmac5. When I used a coordinate file containing 2
fewer bases, the refinement ran smoothly and
perfectly. But when I built 2 extra bases onto the
existing DNA in Coot, the refinement failed with the
following error message.
 
/usr/local/ccp4-6.1.0/bin/refmac5 
XYZIN"/usr6/rajkumar/APS/hem/mar9/H2/molrep/BCDNA-built2-NCS-refm2.pdb"
XYZOUT "/tmp/rajkumar/hemCG_19_2_pdb_1.tmp" HKLIN
"/usr6/rajkumar/APS/hem/mar9/H2/molrep/P6122.mtz"
HKLOUT "/tmp/rajkumar/hemCG_19_3_mtz_1.tmp" LIBOUT
"/usr6/rajkumar/APS/hem/mar9/H2/molrep/hemCG_19_lib.cif"

has failed with error message
At line 2486 of file
/usr/local/xtal/ccp4-6.1.0/src/refmac5_/make_PDB.f
Fortran runtime error: Bad value during floating point
read

It seems there is an error in LIB file generation.
The coordinate format and atom labelling are according to
the Refmac convention.

Please can anybody suggest how I should troubleshoot this?

Thanking you
Rajakumara




E. Rajakumara
Postdoctoral Fellow
  Structural Biology Program
  Memorial Sloan-Kettering Cancer Center
  New York-10021
  NY
  001 212 639 7986 (Lab)
  001 917 674 6266 (Mobile)





[ccp4bb] arp/warp ligand

2009-03-16 Thread Sangeetha Vedula
Hi all,

I am trying to fit a ligand into density using ARP/wARP 7.0.1 in CCP4 suite
6.0.2 on CCP4interface 1.4.4.2.

I get an error message telling me to look for the error in a
"##_warp_ligand_details.log".

_

Running Refmac5 to refine the protein PDB without the search ligand.

 After refmac, R = 0.177 (Rfree = 0.000)


 The difference electron density map has been calculated.


Segmentation fault

QUITTING ... ARP/wARP module stopped with an error message:
MAPREAD_MODE_GRIDMAKER

*** Look for error message in the file:
29_warp_ligand_details.log


#CCP4I TERMINATION STATUS 0 All done
#CCP4I TERMINATION TIME 16 Mar 2009  14:43:30
#CCP4I MESSAGE Task failed

***
* Information from CCP4Interface script
***
 Error during script execution.
***

When I look at the details file, all I see at the end is (no error message):


 ## COORDINATE READING ##

 Reading apo protein ... done.
 Identifying N and O atoms for h-bond investigations ... done.
 Reading clean search ligand(s) ... (PDBfmt)  done

___

The details file ends this way regardless of whether or not I read in a
library file for the ligand, and regardless of whether the library was
generated by PRODRG or by Refmac.

Funnily enough, the program ends the same way even with input files that I
had used previously, with an earlier version of ARP/wARP; input files that
worked before.

Help, please!

Thanks a ton!

Sangeetha.


Re: [ccp4bb] precipitation of deglycosylated protein

2009-03-16 Thread Filip Van Petegem
Dear Simon,

this may be an isolated case, but you might want to try to drastically
change the pH of the reaction conditions.
In our hands, a protein with 3 hyperglycosylated sites, expressed in Pichia
pastoris, could be deglycosylated readily with Endo H. However, at the
recommended reaction pH (5-5.5), the protein rapidly precipitated in an
irreversible manner.  After testing the reaction in a range of different
buffers, nearly all of the protein could be deglycosylated at pH 7.5, with
no visible precipitation. Curiously, the buffer could easily be exchanged to
pH 5.5 and below after the reaction was finished (the protein even
crystallized at pH 4).

This of course only has a small chance of working in your case, but it is
quick to run mini-deglycosylation experiments across an array of
conditions.

Cheers

Filip Van Petegem

On Mon, Mar 16, 2009 at 9:29 AM, Yue Li  wrote:

>   Hi everyone,
>
> Recently, I obtained a soluble glyco-protein. Unfortunately, after I added
> PNGase or Endo Hf to remove the glycans, the deglycosylated protein is
> precipitated. Is there any method to avoid this kind of precipitation?
>
> Thanks,
>
> Simon
>
>


-- 
Filip Van Petegem, PhD
Assistant Professor
The University of British Columbia
Dept. of Biochemistry and Molecular Biology
2350 Health Sciences Mall - Rm 2.356
Vancouver, V6T 1Z3

phone: +1 604 827 4267
email: filip.vanpete...@gmail.com
http://crg.ubc.ca/VanPetegem/


Re: [ccp4bb] LSQKAB error

2009-03-16 Thread Anita Lewit-Bentley

Dear Norman,

That did the trick! And was easy to do...

Thanks a lot,

Anita


Dear Anita

You could try adding the following additional line of input:

rotate matrix 1 0 0 0 1 0 0 0 1

This multiplies the data in XYZINM by the identity matrix (so that  
the data should be unchanged) but has the side effect of forcing the  
program to read in the XYZINF input file.
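
As a minimal sketch of how the workaround slots into a complete run
(driving LSQKAB from Python; the file names are placeholders, and the
exact spelling of the "radius" keyword is an assumption based on the log
output quoted below - only the "rotate matrix" line is taken verbatim
from this thread):

import subprocess

# XYZINF (fixed) / XYZINM (moving) are standard LSQKAB logical names;
# fixed.pdb / moving.pdb / fitted.pdb are hypothetical file names.
keywords = """\
rotate matrix 1 0 0 0 1 0 0 0 1
radius 30.0 46.52 37.89 40.28
end
"""
subprocess.run(
    ["lsqkab", "XYZINF", "fixed.pdb", "XYZINM", "moving.pdb",
     "XYZOUT", "fitted.pdb"],
    input=keywords, text=True, check=True)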


Norman

From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf  
Of Anita Lewit-Bentley

Sent: 16 March 2009 14:59
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] LSQKAB error

Dear all,

I am trying to compare several related structures using LSQKAB. In  
order to refine the superpositions, I'd like to use the option  
"radius", to see the relative postion of certain residues within a  
given distance from a common point.


The programme reads the commands OK:
"   ALL ATOMS MORE THAN 30.000 ANGSTROMS FROM REFERENCE POINT 46.520  
37.890 40.280 EXCLUDED"


as well as the fixed input coordinate file:

"  Logical name: XYZIN2  File name: wtC_on_MnA_SSM.pdb
  PDB file is being opened on unit 1 for INPUT."

No sign of any other file being opened/read, but the following error  
message is output instead:


" LSQKAB:   ERROR: in XYZADVANCE file has not been opened
 LSQKAB:   ERROR: in XYZADVANCE file has not been opened   "

and the job stops.

This happens with both CCP4 versions 6.0.2 and 6.1.1.

Is there a working version of the programme somewhere? If not, what  
other programme could do the same thing (in a simple way...)?


Thanks for any suggestions!

Anita



Anita Lewit-Bentley
Unité d'Immunologie Structurale
CNRS URA 2185
Département de Biologie Structurale & Chimie
Institut Pasteur
25 rue du Dr. Roux
75724 Paris cedex 15
FRANCE

Tel: 33- (0)1 45 68 88 95
FAX: 33-(0)1 40 61 30 74
email: ale...@pasteur.fr






Re: [ccp4bb] precipitation of deglycosylated protein

2009-03-16 Thread Pascal Egea
My apologies Simon, I should have been more thorough answering your
question.
Yes, the protein was shown to be quite homogeneously glycosylated using
mass spectrometry.
The ED maps showed the first two ordered, fully occupied NAG units and
residual density for a third sugar unit, although it is very poorly defined.

Pascal Egea


Re: [ccp4bb] precipitation of deglycosylated protein

2009-03-16 Thread Pascal Egea
Hi Simon,
Although they are a source of heterogeneity for
crystallization, glycosylations usually stabilize proteins.
There are a couple of things that may be important to consider before you
deglycosylate your protein.

Do you know how many natural sequons/glycosylation sites your protein has?
And in what system/organism are you expressing your protein?
I ask because I have the case of a bovine enzyme which has three
sites. Inactivating all the sequons by mutating Asn to Asp (they are standard
N-glycosylation sites) and expressing in Pichia pastoris results in a
fully deglycosylated protein which is unstable and precipitates. However, if
one specific sequon is kept intact, the protein obtained is glycosylated by
Pichia at this functional site, behaves very well and yields good-quality
crystals. The electron density maps show the first two N-acetylglucosamine
units N-linked to the Asn residue. Interestingly, the protein is quite
homogeneously glycosylated by Pichia.

I know this may sound a little bizarre, but you may get around this problem
by keeping some sequons (or the single sequon, if there is only one) active
and trying to work with a "glycosylation-light" protein, as you put it. This
depends on what your expression system is.

You can try adding stabilizing agents like glycerol, ethylene glycol or some
disaccharides like trehalose.

I hope this helps,
Cheers,

Pascal F. Egea, PhD
University of California San Francisco
Department of Biochemistry and Biophysics



On Mon, Mar 16, 2009 at 9:29 AM, Yue Li  wrote:

> Hi everyone,
>
> Recently, I obtained a soluble glyco-protein. Unfortunately, after I added
> PNGase or Endo Hf to remove the glycans, the deglycosylated protein is
> precipitated. Is there any method to avoid this kind of precipitation?
>
> Thanks,
>
> Simon
>
>


[ccp4bb] precipitation of deglycosylated protein

2009-03-16 Thread Yue Li
Hi everyone,

Recently, I obtained a soluble glyco-protein. Unfortunately, after I added 
PNGase or Endo Hf to remove the glycans, the deglycosylated protein is 
precipitated. Is there any method to avoid this kind of precipitation?

Thanks,

Simon   



  

[ccp4bb] LSQKAB error

2009-03-16 Thread Anita Lewit-Bentley

Dear all,

I am trying to compare several related structures using LSQKAB. In  
order to refine the superpositions, I'd like to use the option  
"radius", to see the relative postion of certain residues within a  
given distance from a common point.


The programme reads the commands OK:
"   ALL ATOMS MORE THAN 30.000 ANGSTROMS FROM REFERENCE POINT 46.520  
37.890 40.280 EXCLUDED"


as well as the fixed input coordinate file:

"  Logical name: XYZIN2  File name: wtC_on_MnA_SSM.pdb
  PDB file is being opened on unit 1 for INPUT."

No sign of any other file being opened/read, but the following error  
message is output instead:


" LSQKAB:   ERROR: in XYZADVANCE file has not been opened
 LSQKAB:   ERROR: in XYZADVANCE file has not been opened   "

and the job stops.

This happens with both CCP4 versions 6.0.2 and 6.1.1.

Is there a working version of the programme somewhere? If not, what  
other programme could do the same thing (in a simple way...)?


Thanks for any suggestions!

Anita



Anita Lewit-Bentley
Unité d'Immunologie Structurale
CNRS URA 2185
Département de Biologie Structurale & Chimie
Institut Pasteur
25 rue du Dr. Roux
75724 Paris cedex 15
FRANCE

Tel: 33- (0)1 45 68 88 95
FAX: 33-(0)1 40 61 30 74
email: ale...@pasteur.fr



[ccp4bb] Twinned P62

2009-03-16 Thread Ed Pozharski
Some time ago, I had a dataset which turned out to be P31 with a dimer
sitting on the three-fold axis.  The only way I found to process it was
to run twinned refinement in CNS with (-h,-k,l) and a twinning fraction of
0.5.  R/Rfree are 20/24% at 2.4A resolution, so the model must be right
at least to some extent.  The good news is that I can figure out the dimer;
the bad news is that the ligand binding site is not quite interpretable (it
was missing a loop in the MR model and I can't build it from the density I
get).  So my question is: what is the right way (if any) to improve maps
in such a case?

Of course, it's quite possible that the aforementioned loop is simply
disordered in which case nothing can be done, I presume.

Thanks for your help.

-- 
Edwin Pozharski, PhD, Assistant Professor
University of Maryland, Baltimore
--
When the Way is forgotten duty and justice appear;
Then knowledge and wisdom are born along with hypocrisy.
When harmonious relationships dissolve then respect and devotion arise;
When a nation falls to chaos then loyalty and patriotism are born.
--   / Lao Tse /


Re: [ccp4bb] AW: [ccp4bb] Twinned data and maps

2009-03-16 Thread Ed Pozharski
But wouldn't detwinning be problematic with nearly perfectly twinned
data?  I'll post my own question separately so as not to hijack the
thread...
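
(For concreteness - a minimal sketch, not any program's actual code, of
the standard detwinning algebra for a twin-related pair of observed
intensities J1, J2 with twin fraction alpha; it shows exactly why alpha
near 0.5 is hopeless:)

# J1 = (1-alpha)*I1 + alpha*I2 and J2 = alpha*I1 + (1-alpha)*I2, so
# I1 = ((1-alpha)*J1 - alpha*J2) / (1 - 2*alpha).  As alpha -> 0.5 the
# denominator -> 0 and any noise in J1, J2 is amplified without bound.
def detwin(J1, J2, alpha):
    d = 1.0 - 2.0 * alpha
    if abs(d) < 1e-6:
        raise ValueError("alpha ~ 0.5: detwinning is numerically singular")
    I1 = ((1.0 - alpha) * J1 - alpha * J2) / d
    I2 = ((1.0 - alpha) * J2 - alpha * J1) / d
    return I1, I2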

On Mon, 2009-03-16 at 11:24 +0100, Clemens Steegborn wrote:
> Hi Walter,
> 
> You should definitely detwin data for map calculation if you have a
> significant twinning fraction (and only for maps; keep using the twinned
> data set for refinement). We use the CCP4 program detwin. BUT if Shelxl
> gives bad density, maybe that's simply what you have, a bad density map -
> because output from Shelx is already detwinned!
> BTW, we observed that different programs handled different cases differently
> well; I would suggest ALWAYS to try more than one program, and also to try
> Phenix ...
>  
> Best
> Clemens
> 
> 
> -Original Message-
> From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf Of
> Walter Kim
> Sent: Monday, March 16, 2009 7:22 AM
> To: CCP4BB@JISCMAIL.AC.UK
> Subject: [ccp4bb] Twinned data and maps
> 
> Hi again,
> 
> Thanks for your insight into refinement tools for twinned data. I have a
> couple of twinned data sets that are nearly perfectly pseudomerohedrally
> twinned. I've begun to refine my data in Refmac5 (using the automated twin
> refinement), CNS (using the twin inputs) and Shelxl; I'm testing out the
> different refinement programs to evaluate the best strategy for the
> refinement. However, I would like to start making maps.
> 
> 1. Refmac5 - outputs an mtz that is model-biased
> 2. CNS - maps made via model_map_twin.inp are poor
> 3. Shelxl - the maps generated in Coot from the .fcf file are poor
> 
> Are there better ways to make cleaner, less model-biased maps from my
> twinned data that I can try to build into? Should I detwin the data and
> make maps from that (but continue to refine against the twinned data)?
> 
> Thanks,
> Walter
-- 
Edwin Pozharski, PhD, Assistant Professor
University of Maryland, Baltimore
--
When the Way is forgotten duty and justice appear;
Then knowledge and wisdom are born along with hypocrisy.
When harmonious relationships dissolve then respect and devotion arise;
When a nation falls to chaos then loyalty and patriotism are born.
--   / Lao Tse /


[ccp4bb] Software development posts at Diamond Light Source, UK

2009-03-16 Thread Ashton, AW (Alun)
See below details of two software development posts available at Diamond. For 
full details please go to the web pages at:
http://www.diamond.ac.uk/Jobs/CurrentVacancies/Scientific/default.htm


Ref: DIA0504/SB: Software Engineer/Scientist, Macromolecular Crystallography 
(MX)
Post details: Fixed Term (2-3 years)
Salary information: Circa £32k (based on experience and qualifications; a 
higher salary is available for an exceptionally experienced and qualified 
candidate).
Application deadline: 17th April 2009 

To support and enhance software for Data Acquisition on MX beamlines.
The role holder will work with the beamline scientists and users of the 
Macromolecular Crystallography (MX) beamlines at Diamond to examine their user 
operation and establish methods by which this could be enhanced and 
streamlined. They would be actively involved in scripting, GUI development and 
the optimization of the acquisition and analysis of MX data.


Ref: DIA0505/SB - Software Engineer, e-Science/GRID techniques
Post details: Fixed Term, 3 years 
Salary information: Circa £32k (based on experience and qualifications; a 
higher salary is available for an exceptionally experienced and qualified 
candidate). 
Application deadline: 17th April 2009 

To support and enhance software for e-Science and GRID integration.
The role holder will work with the Data Acquisition and Scientific Computing 
and Computer administration teams at Diamond to ensure appropriate exploitation 
of e-Science and GRID technologies.

Alun Ashton
___
Data Acquisition Group,   http://www.diamond.ac.uk/
Diamond Light Source, Chilton, Didcot, Oxon, OX11 0DE, U.K.




[ccp4bb] AW: [ccp4bb] AW: [ccp4bb] Twinned data and maps

2009-03-16 Thread Clemens Steegborn
Yes, Phenix default output (for twinned data) is detwinned, like the one
from Shelxl; I didn't know about the new Refmac twin option - but I understand
from this posting and Garib's that it also detwins if the twin option is
used ...
So considering that the Shelxl-derived density looked bad, I definitely
agree with Tassos (and apparently didn't make that point clear enough) that
reasons other than twinning for the bad density have to be considered ...

Best 
Clemens

-Original Message-
From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf Of
Eleanor Dodson
Sent: Monday, March 16, 2009 12:52 PM
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] AW: [ccp4bb] Twinned data and maps

But don't all twinned refinement programs output detwinned terms for a map?
Certainly REFMAC and SHELX do.


Eleanor

Clemens Steegborn wrote:
> Hi Walter,
>
> You should definitely detwin data for map calculation if you have a
> significant twinning fraction (and only for maps; keep using the twinned
> data set for refinement). We use the CCP4 program detwin. BUT if Shelxl
> gives bad density, maybe that's simply what you have, a bad density map -
> because output from Shelx is already detwinned!
> BTW, we observed that different programs handled different cases
differently
> well; I would suggest ALWAYS to try more than one program, and also to try
> Phenix ...
>  
> Best
> Clemens
>
>
> -Original Message-
> From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf Of
> Walter Kim
> Sent: Monday, March 16, 2009 7:22 AM
> To: CCP4BB@JISCMAIL.AC.UK
> Subject: [ccp4bb] Twinned data and maps
>
> Hi again,
>
> Thanks for your insight into refinement tools for twinned data. I have a
> couple of twinned data sets that are nearly perfectly pseudomerohedrally
> twinned. I've begun to refine my data in Refmac5 (using the automated twin
> refinement), CNS (using the twin inputs) and Shelxl; I'm testing out the
> different refinement programs to evaluate the best strategy for the
> refinement. However, I would like to start making maps.
>
> 1. Refmac5 - outputs an mtz that is model-biased
> 2. CNS - maps made via model_map_twin.inp are poor
> 3. Shelxl - the maps generated in Coot from the .fcf file are poor
> 
> Are there better ways to make cleaner, less model-biased maps from my
> twinned data that I can try to build into? Should I detwin the data and
> make maps from that (but continue to refine against the twinned data)?
>
> Thanks,
> Walter
>
>
>
>   


Re: [ccp4bb] AW: [ccp4bb] Twinned data and maps

2009-03-16 Thread Garib Murshudov
As far as I know, the map coefficients correspond to detwinned data. But
using detwinned data may not be a good idea.
It would be good to see what the R factors are. Another thing to consider
is that different programs may use different flags for free R, and that
may cause problems.
What are the R factors, the completeness and the percentage of free-R
reflections? These are printed by all programs.


If your solution is wrong and you are using twin refinement, then even if
you do not have twinned crystals your R factor can be as low as 50%
(that is the theoretical limit for the random R factor when one set of data
is from a twinned and the other from an untwinned crystal). If you have a
perfect twin and you are modelling the twinning (using twin refinement),
then your random R factors can be even smaller.
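
(For reference, the conventional R factor referred to above - a minimal
illustrative sketch over (Fobs, Fcalc) amplitude pairs, not any program's
actual code:)

# R = sum(| |Fobs| - |Fcalc| |) / sum(|Fobs|)
def r_factor(pairs):
    num = sum(abs(fo - fc) for fo, fc in pairs)
    den = sum(fo for fo, _ in pairs)
    return num / den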



regards
Garib


On 16 Mar 2009, at 11:51, Eleanor Dodson wrote:

But don't all twinned refinement programs output detwinned terms for
a map?

Certainly REFMAC and SHELX do.


Eleanor

Clemens Steegborn wrote:

Hi Walter,

You should definitely detwin data for map calculation if you have a
significant twinning fraction (and only for maps; keep using the  
twinned
data set for refinement). We use the CCP4 program detwin. BUT if  
Shelxl
gives bad density, maybe that's simply what you have, a bad density  
map -

because output from Shelx is already detwinned!
BTW, we observed that different programs handled different cases  
differently
well; I would suggest ALWAYS to try more than one program, and also  
to try

Phenix ...
Best
Clemens


-Original Message-
From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf Of
Walter Kim
Sent: Monday, March 16, 2009 7:22 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Twinned data and maps

Hi again,

Thanks for your insight into refinement tools for twinned data. I  
have a
couple of twinned data sets that are nearly perfectly  
pseudomerohedrally
twinned. I've begun to refine my data in Refmac5 (using the  
automated twin
refinement), CNS (using the twin inputs) and Shelxl; I'm testing  
out the

different refinement programs to evaluate the best strategy for the
refinement. However, I would like to start making maps.

1. Refmac5 - outputs an mtz that is model-biased
2. CNS - maps made via model_map_twin.inp are poor
3. Shelxl - the maps generated in Coot from the .fcf file are poor

Are there better ways to make cleaner, less model-biased maps from my
twinned data that I can try to build into? Should I detwin the data and
make maps from that (but continue to refine against the twinned data)?


Thanks,
Walter









Re: [ccp4bb] AW: [ccp4bb] Twinned data and maps

2009-03-16 Thread Eleanor Dodson

But don't all twinned refinement programs output detwinned terms for a map?
Certainly REFMAC and SHELX do.


Eleanor

Clemens Steegborn wrote:

Hi Walter,

You should definitely detwin data for map calculation if you have a
significant twinning fraction (and only for maps; keep using the twinned
data set for refinement). We use the CCP4 program detwin. BUT if Shelxl
gives bad density, maybe that's simply what you have, a bad density map -
because output from Shelx is already detwinned!
BTW, we observed that different programs handled different cases differently
well; I would suggest ALWAYS to try more than one program, and also to try
Phenix ...
 
Best

Clemens


-Original Message-
From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf Of
Walter Kim
Sent: Monday, March 16, 2009 7:22 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Twinned data and maps

Hi again,

Thanks for your insight into refinement tools for twinned data. I have a
couple of twinned data sets that are nearly perfectly pseudomerohedrally
twinned. I've begun to refine my data in Refmac5 (using the automated twin
refinement), CNS (using the twin inputs) and Shelxl; I'm testing out the
different refinement programs to evaluate the best strategy for the
refinement. However, I would like to start making maps.

1. Refmac5 - outputs an mtz that is model-biased
2. CNS - maps made via model_map_twin.inp are poor
3. Shelxl - the maps generated in Coot from the .fcf file are poor

Are there better ways to make cleaner, less model-biased maps from my
twinned data that I can try to build into? Should I detwin the data and
make maps from that (but continue to refine against the twinned data)?

Thanks,
Walter



  


Re: [ccp4bb] images

2009-03-16 Thread Herbert J. Bernstein

Dear Colleagues,

  The issue Harry is describing, of people writing multiple variations of 
"image formats" even though all of them are imgCIF, is not really a 
problem with the images themselves.  Rather it is a lack of agreement on 
the metadata to go with the images.  This is similar to the problem of 
lack of consistency in REMARKS for early PDB data sets, which eventually 
required the adoption of standardized REMARKS and reprocessing of almost 
all data sets.  I don't think it would have been easier to reprocess those 
data sets if the original data sets had also had their coordinates and 
sequences recorded with wide variations in formats.


  The advantage of using imgCIF for an archive is not that it would force 
everybody to do their experiments using precisely the same format, but 
that, because it is capable of faithfully representing all the wide 
variations in current formats, it would allow what we now have to be 
captured and preserved and, when someone needed a dataset back, to be 
recast in a format appropriate to the use.


  Think of it as that little figure-8 plug and socket we are able to use 
to adapt our power cords for travel around the world.  There are other 
possible hub formats (NeXus, DICOM, etc.), but the sensible thing for an 
archive is to choose one of them for internal use, just as the PDB uses a 
variation on mmCIF for its internal use to allow it to easily deliver 
valid PDB, CIF and XML versions of sets of coordinates.  For an archive, 
the advantage of using imgCIF internally, no matter which of the more 
than 200 current formats were to be used at beam lines and in labs, is 
that it would not be necessary to discard any of the metadata people 
provided, and it could be made to interoperate easily with the systems used 
internally by the PDB for coordinate data sets.
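
To illustrate the hub-and-spoke idea in schematic form (the format names
and header fields below are hypothetical, not the actual imgCIF data
model): N readers and M writers meet at one internal "hub" record, so
adding a format costs one converter rather than N*M, and no metadata need
be discarded in transit.

def read_vendor_a(header_text):
    # hypothetical reader: parse "key=value" header lines into the hub,
    # keeping *every* key, even vendor-specific ones
    meta = dict(line.split("=", 1) for line in header_text.splitlines())
    return {"metadata": meta}

def write_vendor_b(hub):
    # hypothetical writer: emit the hub metadata in another header style
    return "\n".join(f"{k}: {v}" for k, v in sorted(hub["metadata"].items()))

hub = read_vendor_a("wavelength=0.9795\ndistance=250.0\nvendor_flag=7")
print(write_vendor_b(hub))   # nothing is lost on the round trip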


  For many of the formats in current use, there is no place to store some 
of the information people provide and translation to other formats can 
sometimes be much more difficult than one might expect unless additional 
metadata is provided.  Even such obvious things as image orientations are 
sometimes carried separately from the images themselves and can easily get 
lost.


  Don't let the perfect be the enemy of the good.  Archiving images in a 
common format, such as imgCIF, or, if you prefer, say, in the NeXus 
transliteration of imgCIF, would help to make some very useful information 
accessible for future use.  It may not be a perfect solution, but it is a 
good one.


  This is a good time to start a major crystallographic image archiving 
effort.  Money may well be available now that will not be available six 
months from now, and we have good, if not perfect, solutions available for 
many, if not all, of the technical issues involved.  Is it really wise to 
let this opportunity pass us by?


  Regards,
Herbert
=
 Herbert J. Bernstein, Professor of Computer Science
   Dowling College, Kramer Science Center, KSC 121
Idle Hour Blvd, Oakdale, NY, 11769

 +1-631-244-3035
 y...@dowling.edu
=

On Mon, 16 Mar 2009, Harry Powell wrote:


Hi

I'm afraid the adoption of imgCIF (or CBF, its useful binary equivalent) 
doesn't help a lot - I know of three different manufacturers of detectors 
who, between them, write out four different image formats, all of which 
apparently conform to the agreed IUCr imgCIF standard. Each manufacturer has 
its own good and valid reasons for doing this. It's actually less work for me 
as a developer of integration software to write new code to incorporate a new 
format than to make sure I can read all the different imgCIFs properly.



On 16 Mar 2009, at 09:32, Eleanor Dodson wrote:

The deposition of images would be possible providing some consistent 
imgCIF format was agreed.
This would of course be of great use to developers for certain pathological 
cases, but not, I suspect, of much value to the user community - I download 
structure factors all the time for test purposes, but I probably would not 
bother to go through the data processing, and unless there were extensive 
notes associated with each set of images I suspect it would be hard to 
reproduce sensible results.


The research council policy in the UK is that original data is meant to be 
archived for publicly funded projects. Maybe someone should test the 
reality of this by asking the PI for the data sets?

Eleanor


Garib Murshudov wrote:

Dear Gerard and all MX crystallographers

As I see there are two problems.
1) Minor problem: Sanity, semantic and other checks for currently 
available data. It should not be difficult to do. Things like I/sigma, 
some statistical analysis expected vs "observed" statistical behaviour 
should sort out many of these problems (Eleanor mentioned some and they 
can be used). I do not think that depositors should be blamed for 
mista

[ccp4bb] EMBO / MAX INF2 Practical Course on Structure Determination in Macromolecular Crystallography held at the ESRF, June 15-19, 2009.

2009-03-16 Thread Matthew BOWLER

EMBO / MAX INF2 2009 Practical Course on
Structure Determination in Macromolecular Crystallography
ESRF-EMBL, Grenoble, France, 15 - 19 June 2009


An "EMBO / MAX INF2 2009 course on Structure Determination in 
Macromolecular Structure " will be hosted by the ESRF in Grenoble, 
France from 15 to 19, June 2009. This practical course addresses young 
scientists who intend to apply single- and multiple-wavelength anomalous 
scattering (SAD & MAD) methods in macromolecular structure 
determination. The course aims to impart the theoretical and practical 
basis for the 3-dimensional structure determination of 
bio-macromolecules using these techniques. Through a series of lectures, 
software demonstrations, practicals and tutorials, participants will get 
insights into all aspects of the structure determination process 
including beamline instrumentation, data collection and processing, 
heavy atom substructure determination, phasing and model building. There 
will also be sessions focusing on automated structure solution 
procedures and newer methods which exploit small anomalous scattering 
signals from crystals of native macomolecules. The number of 
participants is limited to 20 and the deadline for application is May 
1^st , 2009. More detailed information, the course programme and 
instructions as to how to apply to attend the course can be found in the 
webpages at the URL: http://cwp.embo.org/pc09-05/.




--
Matthew Bowler
Macromolecular Crystallography Group
European Synchrotron Radiation Facility
B.P. 220, 6 rue Jules Horowitz
F-38043 GRENOBLE CEDEX
FRANCE
=
Tel: +33 (0) 4.76.88.29.28
Fax: +33 (0) 4.76.88.29.04

http://www.esrf.fr/UsersAndScience/Experiments/MX/About_our_beamlines/ID14-2/
=



Re: [ccp4bb] AW: [ccp4bb] Twinned data and maps

2009-03-16 Thread Anastassis Perrakis

Hi -

I thought that both Phenix and Refmac output map coefficients
corresponding to detwinned data - or do I have this wrong?


I am also wondering what the context of "poor" is, and whether it has to do
with twinning or simply with the starting model not being so good. In what
way are these "poor" maps different from the Refmac5 "model-biased" maps?

From what I have seen, also from your previous email, it's hard to advise.
It's not clear whether the Phaser solution is correct, how good the search
model was, or how the refinement is going. I would suggest posting these
details so that we can send more detailed comments.

Tassos

On Mar 16, 2009, at 11:24, Clemens Steegborn wrote:


Hi Walter,

You should definitely detwin data for map calculation if you have a
significant twinning fraction (and only for maps; keep using the  
twinned
data set for refinement). We use the CCP4 program detwin. BUT if  
Shelxl
gives bad density, maybe that's simply what you have, a bad density  
map -

because output from Shelx is already detwinned!
BTW, we observed that different programs handled different cases  
differently
well; I would suggest ALWAYS to try more than one program, and also  
to try

Phenix ...

Best
Clemens


-Original Message-
From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf Of
Walter Kim
Sent: Monday, March 16, 2009 7:22 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Twinned data and maps

Hi again,

Thanks for your insight into refinement tools for twinned data. I  
have a
couple of twinned data sets that are nearly perfectly  
pseudomerohedrally
twinned. I've begun to refine my data in Refmac5 (using the  
automated twin
refinement), CNS (using the twin inputs) and Shelxl; I'm testing out  
the

different refinement programs to evaluate the best strategy for the
refinement. However, I would like to start making maps.

1. Refmac5 - outputs an mtz that is model-biased
2. CNS - maps made via model_map_twin.inp are poor
3. Shelxl - the maps generated in Coot from the .fcf file are poor

Are there better ways to make cleaner, less model-biased maps from my
twinned data that I can try to build into? Should I detwin the data and
make maps from that (but continue to refine against the twinned data)?

Thanks,
Walter


Please don't print this e-mail unless you really need to
Anastassis (Tassos) Perrakis, Principal Investigator / Staff Member
Department of Biochemistry (B8)
Netherlands Cancer Institute,
Dept. B8, 1066 CX Amsterdam, The Netherlands
Tel: +31 20 512 1951 Fax: +31 20 512 1954 Mobile / SMS: +31 6 28 597791






[ccp4bb] AW: [ccp4bb] Twinned data and maps

2009-03-16 Thread Clemens Steegborn
Hi Walter,

You should definitely detwin data for map calculation if you have a
significant twinning fraction (and only for maps; keep using the twinned
data set for refinement). We use the CCP4 program detwin. BUT if Shelxl
gives bad density, maybe that's simply what you have, a bad density map -
because output from Shelx is already detwinned!
BTW, we observed that different programs handled different cases differently
well; I would suggest ALWAYS to try more than one program, and also to try
Phenix ...
 
Best
Clemens


-Original Message-
From: CCP4 bulletin board [mailto:ccp...@jiscmail.ac.uk] On Behalf Of
Walter Kim
Sent: Monday, March 16, 2009 7:22 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Twinned data and maps

Hi again,

Thanks for your insight into refinement tools for twinned data. I have a
couple of twinned data sets that are nearly perfectly pseudomerohedrally
twinned. I've begun to refine my data in Refmac5 (using the automated twin
refinement), CNS (using the twin inputs) and Shelxl; I'm testing out the
different refinement programs to evaluate the best strategy for the
refinement. However, I would like to start making maps.

1. Refmac5 - outputs an mtz that is model-biased
2. CNS - maps made via model_map_twin.inp are poor
3. Shelxl - the maps generated in Coot from the .fcf file are poor

Are there better ways to make cleaner, less model-biased maps from my
twinned data that I can try to build into? Should I detwin the data and
make maps from that (but continue to refine against the twinned data)?

Thanks,
Walter


Re: [ccp4bb] images

2009-03-16 Thread Harry Powell

Hi

I'm afraid the adoption of imgCIF (or CBF, its useful binary  
equivalent) doesn't help a lot - I know of three different  
manufacturers of detectors who, between them, write out four different  
image formats, all of which apparently conform to the agreed IUCr  
imgCIF standard. Each manufacturer has its own good and valid reasons  
for doing this. It's actually less work for me as a developer of  
integration software to write new code to incorporate a new format  
than to make sure I can read all the different imgCIFs properly.



On 16 Mar 2009, at 09:32, Eleanor Dodson wrote:

The deposition of images would be possible providing some consistent
imgCIF format was agreed.
This would of course be of great use to developers for certain
pathological cases, but not, I suspect, of much value to the user
community - I download structure factors all the time for test
purposes, but I probably would not bother to go through the data
processing, and unless there were extensive notes associated with
each set of images I suspect it would be hard to reproduce sensible
results.


The research council policy in the UK is that original data is meant  
to be archived for publicly funded projects. Maybe someone should  
test the reality of this by asking the PI for the data sets?

 Eleanor


Garib Murshudov wrote:

Dear Gerard and all MX crystallographers

As I see there are two problems.
1) Minor problem: Sanity, semantic and other checks for currently
available data. It should not be difficult to do. Things like I/sigma and
some statistical analysis of expected vs "observed" statistical
behaviour should sort out many of these problems (Eleanor mentioned
some and they can be used). I do not think that depositors should
be blamed for mistakes. They are doing their best to produce and
deposit. There should be a proper mechanism to reduce the number of
mistakes.

You should agree that the situation is now much better than a few years ago.

2) A fundamental problem: What are observed data? I agree with you
(Gerard) that images are the only true observations. All the others
(intensities, amplitudes etc.) have undergone some processing using
some assumptions, and they cannot be considered true
observations. Data processing is an irreversible process. I hope
your effort will be supported by the community. I personally get
excited by the idea that images may be available. There are
exciting possibilities: for example, modular crystals, OD, twinning in
general and space-group uncertainty cannot be truly modelled without
images (this does not mean refinement against images). Radiation
damage is another example where, after processing and merging,
information is lost and cannot be recovered fully. You can extend
the list of cases where images would be very helpful.


I do not know any reason (apart from a technical one - the size of the
files) why images should not be deposited and archived. I think this
problem is very important.


regards
Garib


On 12 Mar 2009, at 14:03, Gerard Bricogne wrote:


Dear Eleanor,

   That is a useful suggestion, but in the case of 3ftt it would not have
helped: the amplitudes would have looked as healthy as can be (they were
calculated!), and it was the associated Sigmas that had absurd values, being
in fact phases in degrees. A sanity check on some (recalculated) I/sig(I)
statistics could have detected that something was fishy.

   Looking forward to the archiving of the REAL data ... i.e. the images.
Using any other form of "data" is like having to eat out of someone else's
dirty plate!


   With best wishes,

Gerard.

--
On Thu, Mar 12, 2009 at 09:22:26AM +, Eleanor Dodson wrote:
It would be possible for the deposition sites to run a few simple tests to
at least find cases where intensities are labelled as amplitudes or vice
versa - the truncate plots of moments and cumulative intensities at least
would show something was wrong.

Eleanor




--

Gerard Bricogne   g...@globalphasing.com
Global Phasing Ltd.
Sheraton House, Castle Park   Tel: +44-(0)1223-353033
Cambridge CB3 0AX, UK         Fax: +44-(0)1223-366889






Harry
--
Dr Harry Powell, MRC Laboratory of Molecular Biology, MRC Centre,  
Hills Road, Cambridge, CB2 0QH


Re: [ccp4bb] Twinned data and maps

2009-03-16 Thread Eleanor Dodson
I am not sure there is any way of avoiding model bias if the 
coordinates are included, regardless of whether there is twinning or not. 
My preferred method is to set the occupancies of suspect regions to 
0.00, then refine the better parts of the model, and then check the 
maps again and rebuild as the maps indicate.
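
A minimal sketch of that occupancy trick (PDB occupancy lives in columns
55-60; the chain and residue range here are hypothetical placeholders -
adapt the selection to your own suspect regions):

# Zero the occupancies of a residue range so those atoms no longer
# contribute to Fcalc during refinement.
def zero_occupancy(pdb_in, pdb_out, chain="A", first=100, last=110):
    with open(pdb_in) as fin, open(pdb_out, "w") as fout:
        for line in fin:
            if (line.startswith(("ATOM", "HETATM")) and line[21] == chain
                    and first <= int(line[22:26]) <= last):
                line = line[:54] + "  0.00" + line[60:]
            fout.write(line)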

Eleanor

Walter Kim wrote:

Hi again,

Thanks for your insight into refinement tools for twinned data. I have a
couple of twinned data sets that are nearly perfectly pseudomerohedrally
twinned. I've begun to refine my data in Refmac5 (using the automated twin
refinement), CNS (using the twin inputs) and Shelxl; I'm testing out the
different refinement programs to evaluate the best strategy for the
refinement. However, I would like to start making maps.

1. Refmac5 - outputs an mtz that is model-biased
2. CNS - maps made via model_map_twin.inp are poor
3. Shelxl - the maps generated in Coot from the .fcf file are poor

Are there better ways to make cleaner, less model-biased maps from my
twinned data that I can try to build into? Should I detwin the data and
make maps from that (but continue to refine against the twinned data)?

Thanks,
Walter



  


Re: [ccp4bb] images

2009-03-16 Thread Graeme Winter
Hi Folks,

We have two problems here, which are orthogonal and should probably
not be muddled. The first is the archival / making available of
diffraction images. Although this is (in theory) currently possible, it
has not been enforced, so practice is variable. I have, however, found
that when someone has an interesting data set and I ask for it, they can
usually dig it out.

The second problem is the single format to rule them all. At the
moment we store data in whatever format the detector was minded to use
in recording. This is actually fine, as the data reduction programs we
all use will work with them, provided that the images are corrected.
The point about the associated documentation is far more important.
However, an analysis performed with just the information provided in
the paper, assuming that the contents of the image headers are
somewhere near correct, helps to independently verify the results. If
they are not correct, the values actually used should be included.

These are both valuable discussions, but should (IMnsHO) be carried
out separately so as to avoid the one causing problems for the other.

To a large extent, simply recovering the images from a DAT and finding
somewhere to put them appears to be the biggest problem - I found that
having an FTP incoming available was often the one thing which made
this possible. Therefore, having a central repository would be
excellent. I have a couple of comments about this too...

At the moment you can pretty much fit the raw data stored in the PDB
on some DVDs or a firewire disk or something - not the derived
tables, which I expect are huge, but the source files are fairly
small. This means that backing them up is tractable, and some quality
of service can be provided.

When you back up your data to say a firewire disk, and it fails, you
can just take the hit and not worry about it. If a service takes
responsibility for the data it must be curated, backed up ideally to
multiple locations, be available, provide sufficient space / bandwidth
etc. Much more expensive, much more complicated. You then also get the
problem previously mentioned of ensuring that these images are from
*this* pdb not the mutant you are working on, which has much the same
cell and symmetry. Now that's hard!

Just because something is hard does not mean that it should not be
done, but this can't be done ad hoc - if it is going to be done and be
useful, it would have to be done properly. I'd be delighted to see it
happen mind.

Cheers,

Graeme





2009/3/16 Eleanor Dodson :
> The deposition of images would be possible providing some consistent
> imgCIF format was agreed.
> This would of course be of great use to developers for certain pathological
> cases, but not, I suspect, of much value to the user community - I download
> structure factors all the time for test purposes, but I probably would not
> bother to go through the data processing, and unless there were extensive
> notes associated with each set of images I suspect it would be hard to
> reproduce sensible results.
>
> The research council policy in the UK is that original data is meant to be
> archived for publicly funded projects. Maybe someone should test the reality
> of this by asking the PI for the data sets?
>  Eleanor
>
>
> Garib Murshudov wrote:
>>
>> Dear Gerard and all MX crystallographers
>>
>> As I see there are two problems.
>> 1) Minor problem: Sanity, semantic and other checks for currently
>> available data. It should not be difficult to do. Things like I/sigma and some
>> statistical analysis of expected vs "observed" statistical behaviour should
>> sort out many of these problems (Eleanor mentioned some and they can be
>> used). I do not think that depositors should be blamed for mistakes. They
>> are doing their best to produce and deposit. There should be a proper
>> mechanism to reduce the number of mistakes.
>> You should agree that the situation is now much better than a few years ago.
>>
>> 2) A fundamental problem: What are observed data? I agree with you
>> (Gerard) that images are the only true observations. All the others (intensities,
>> amplitudes etc.) have undergone some processing using some assumptions, and
>> they cannot be considered true observations. Data processing is an
>> irreversible process. I hope your effort will be supported by the community. I
>> personally get excited by the idea that images may be available. There are
>> exciting possibilities: for example, modular crystals, OD, twinning in general and
>> space-group uncertainty cannot be truly modelled without images (this does not
>> mean refinement against images). Radiation damage is another example where,
>> after processing and merging, information is lost and cannot be recovered
>> fully. You can extend the list of cases where images would be very helpful.
>>
>> I do not know any reason (apart from a technical one - the size of the files) why
>> images should not be deposited and archived. I think this problem is very
>> important.
>>
>> regards
>> Garib
>>
>>
>> On 12 Mar 2009, at 14:03,

Re: [ccp4bb] images

2009-03-16 Thread Eleanor Dodson
The deposition of images would be possible providing some consistent 
imgCIF format was agreed.
This would of course be of great use to developers for certain 
pathological cases, but not, I suspect, of much value to the user community - 
I download structure factors all the time for test purposes, but I 
probably would not bother to go through the data processing, and unless 
there were extensive notes associated with each set of images I suspect 
it would be hard to reproduce sensible results.


The research council policy in the UK is that original data is meant to 
be archived for publicly funded projects. Maybe someone should test the 
reality of this by asking the PI for the data sets? 


  Eleanor


Garib Murshudov wrote:

Dear Gerard and all MX crystallographers

As I see there are two problems.
1) Minor problem: Sanity, semantic and other checks for currently 
available data. It should not be difficult to do. Things like I/sigma and 
some statistical analysis of expected vs "observed" statistical behaviour 
should sort out many of these problems (Eleanor mentioned some and 
they can be used). I do not think that depositors should be blamed for 
mistakes. They are doing their best to produce and deposit. There 
should be a proper mechanism to reduce the number of mistakes.

You should agree that the situation is now much better than a few years ago.

2) A fundamental problem: What are observed data? I agree with you 
(Gerard) that images are the only true observations. All the others 
(intensities, amplitudes etc.) have undergone some processing using 
some assumptions, and they cannot be considered true observations. 
Data processing is an irreversible process. I hope your effort will be 
supported by the community. I personally get excited by the idea that 
images may be available. There are exciting possibilities: for example, 
modular crystals, OD, twinning in general and space-group uncertainty 
cannot be truly modelled without images (this does not mean refinement 
against images). Radiation damage is another example where, after 
processing and merging, information is lost and cannot be recovered 
fully. You can extend the list of cases where images would be very helpful.


I do not know any reason (apart from a technical one - the size of the 
files) why images should not be deposited and archived. I think this problem 
is very important.


regards
Garib


On 12 Mar 2009, at 14:03, Gerard Bricogne wrote:


Dear Eleanor,

That is a useful suggestion, but in the case of 3ftt it would not have
helped: the amplitudes would have looked as healthy as can be (they were
calculated!), and it was the associated Sigmas that had absurd values, being
in fact phases in degrees. A sanity check on some (recalculated) I/sig(I)
statistics could have detected that something was fishy.

Looking forward to the archiving of the REAL data ... i.e. the images.
Using any other form of "data" is like having to eat out of someone else's
dirty plate!


With best wishes,

 Gerard.

--
On Thu, Mar 12, 2009 at 09:22:26AM +, Eleanor Dodson wrote:
It would be possible for the deposition sites to run a few simple tests to
at least find cases where intensities are labelled as amplitudes or vice
versa - the truncate plots of moments and cumulative intensities at least
would show something was wrong.

Eleanor




--

Gerard Bricogne   g...@globalphasing.com
Global Phasing Ltd.
Sheraton House, Castle Park   Tel: +44-(0)1223-353033
Cambridge CB3 0AX, UK         Fax: +44-(0)1223-366889






Re: [ccp4bb] ANISOU

2009-03-16 Thread Pavel Afonine

Dear friends,

if the data are of high enough resolution, wouldn't it be more reasonable to 
attempt anisotropic refinement (constrained with TLS, or refining 
individual anisotropic ADPs), or a mixed one - some atoms isotropic and 
some anisotropic - rather than struggle with file conversions and getting 
rid of ANISOU in order to escape reality? -:)


There is a bunch of programs out there that can do it for you!

Cheers,
Pavel.


On 3/15/09 10:55 PM, Tim Fenn wrote:

On Mon, 16 Mar 2009 11:01:34 +0800
Sheng Li  wrote:

  

Please read the coordinate file with Alwyn's O, and then save it to
another file. The ANISOU lines will be removed.




only if you use s_a_i - pdb_read will preserve ANISOU.

It might be easier to just grep them out:

grep -v "^ANISOU" foo.pdb > bar.pdb

-Tim