And again :)  This time it is the
0005-Added-check-for-number-of-supplied-spectra-ID-s-and-.patch file
(git hash 7cc9b7407e77a3f5526336dc54f854b4e82c7a4f).  This check should be
independent of the peak list format.  I suggest that it goes after:

    for i in range(len(intensity_data)):
        # Extract the data.
        H_name, X_name, spin_id, intensity, line = intensity_data[i]

        # Convert the intensity data and spectrum IDs to lists if needed.
        if not isinstance(intensity, list):
            intensity = [intensity]
        if not isinstance(spectrum_id, list):
            spectrum_id = [spectrum_id]

This is at the bottom of the pipe_control.spectrum.read() function.
At this point you can simply check if the intensity and spectrum_id
lists are of the same length.  I thought I had already written this
check, but obviously not.  For keeping the data pipe clean if the user
makes a mistake, I would recommend shifting the code:

    # Add the spectrum id (and ncproc) to the relax data store.
    spectrum_ids = spectrum_id
    if isinstance(spectrum_id, str):
        spectrum_ids = [spectrum_id]
    if not hasattr(cdp, 'spectrum_ids'):
        cdp.spectrum_ids = []
        if ncproc != None:
            cdp.ncproc = {}
    for i in range(len(spectrum_ids)):
        if not spectrum_ids[i] in cdp.spectrum_ids:
            cdp.spectrum_ids.append(spectrum_ids[i])
            if ncproc != None:
                cdp.ncproc[spectrum_ids[i]] = ncproc

This should go after the "for i in range(len(intensity_data)):" loop,
i.e. after the new length check.  Shifting it later will prevent the
cdp.spectrum_ids list from being populated when the user calls the
user function incorrectly.
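
To illustrate the two points together, here is a minimal, self-contained
sketch of the suggested ordering (validate first, only then touch the data
pipe).  Note the Pipe class and check_and_register() names are hypothetical
stand-ins for illustration only; in relax the real container is cdp inside
pipe_control.spectrum.read(), and a RelaxError would be raised rather than
a ValueError:

```python
class Pipe:
    """Hypothetical stand-in for the current data pipe (cdp)."""


def check_and_register(cdp, intensity, spectrum_id, ncproc=None):
    """Sketch: length check first, then spectrum ID registration."""

    # Convert the intensity data and spectrum IDs to lists if needed,
    # as in the read() function.
    if not isinstance(intensity, list):
        intensity = [intensity]
    if not isinstance(spectrum_id, list):
        spectrum_id = [spectrum_id]

    # The missing check:  the intensities and spectrum IDs must pair up.
    if len(intensity) != len(spectrum_id):
        raise ValueError("%i intensity values were supplied for %i "
                         "spectrum IDs." % (len(intensity), len(spectrum_id)))

    # Only now modify the data pipe, so a user mistake leaves it clean.
    if not hasattr(cdp, 'spectrum_ids'):
        cdp.spectrum_ids = []
        if ncproc is not None:
            cdp.ncproc = {}
    for i in range(len(spectrum_id)):
        if spectrum_id[i] not in cdp.spectrum_ids:
            cdp.spectrum_ids.append(spectrum_id[i])
            if ncproc is not None:
                cdp.ncproc[spectrum_id[i]] = ncproc


cdp = Pipe()
check_and_register(cdp, [1.0, 2.0], ['sp1', 'sp2'])
print(cdp.spectrum_ids)
```

With this ordering, a mismatched call raises before cdp.spectrum_ids is
created or extended, so the pipe is left exactly as it was.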

Cheers,

Edward


On 2 August 2013 17:44, Edward d'Auvergne <[email protected]> wrote:
> Hi again,
>
> A small fix is needed for the
> 0004-Added-system-test-for-reading-of-a-multi-column-form.patch file
> (git hash 034ec73171270182afe6c4907179d46eebe1b759).  Spaces are
> needed after the commas in the self.assertAlmostEqual() calls.  The
> 2to3 conversion program (for Python 2 to Python 3 conversion)
> highlights this issue.  Also, the first few tests should be against
> floats rather than integers, i.e. simply add '.0' to the end of each
> number.
>
> Cheers,
>
> Edward
>
>
>
> On 2 August 2013 17:39, Edward d'Auvergne <[email protected]> wrote:
>> Hi Troels,
>>
>> I'm getting through your patches and have applied the first two.  These
>> are ok.  I have an issue with
>> 0003-The-ID-of-spins-in-seriesTab_multi.ser-was-not-forma.patch though
>> (git hash f49b4f096773cb2f7ecb70293122e03b0b40e889).  It says that you
>> are changing the seriesTab_multi.ser file's 'ASS' column format to
>> that of Sparky.  Why is this so?  Is there an NMRPipe seriesTab format
>> which is different from Sparky?  If so, we should support that.
>>
>> Cheers,
>>
>> Edward
>>
>>
>> On 2 August 2013 17:29, Troels E. Linnet
>> <[email protected]> wrote:
>>> Follow-up Comment #53, sr #3043 (project relax):
>>>
>>> Multi-column import is now supported.
>>>
>>> Note, I made it possible to autogenerate spectrum IDs when
>>> spectrum_id='auto' is passed to the function:
>>>
>>> spectrum.read_intensities(file="seriesTab_multi.ser",
>>>     dir=status.install_path+sep+'test_suite'+sep+'shared_data'+sep+'peak_lists',
>>>     spectrum_id='auto', int_method='point sum')
>>>
>>> (file #18619)
>>>     _______________________________________________________
>>>
>>> Additional Item Attachment:
>>>
>>> File name: stab.patch.tar.gz              Size:4 KB
>>>
>>>
>>>     _______________________________________________________
>>>
>>> Reply to this item at:
>>>
>>>   <http://gna.org/support/?3043>
>>>
>>> _______________________________________________
>>>   Message sent via/by Gna!
>>>   http://gna.org/
>>>

_______________________________________________
relax (http://www.nmr-relax.com)

This is the relax-devel mailing list
[email protected]

To unsubscribe from this list, get a password
reminder, or change your subscription options,
visit the list information page at
https://mail.gna.org/listinfo/relax-devel

Reply via email to