How bizarre. This does seem like a bit of a show stopper to me in terms of
testing in multiple environments.

Hopefully, we can figure it out at some point.
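For anyone else who hits this, here is a minimal sketch of the workaround Dan describes below: splitting any ':'-joined libdir entries in $LOAD_PATH into separate elements (the path names here are made up for illustration):

```ruby
# Illustrative only: expand any ':'-joined entries in $LOAD_PATH so that
# each libdir component becomes its own array element.
load_path = ['normal_path_1', 'normal_path_2', 'libdir_part1:libdir_part2']
expanded  = load_path.flat_map { |entry| entry.split(':') }
# expanded => ['normal_path_1', 'normal_path_2', 'libdir_part1', 'libdir_part2']
```

In a real spec_helper you would presumably apply the same flat_map/split to $LOAD_PATH itself before the specs try to require anything from fixtures.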

On Fri, Nov 28, 2014 at 11:30 AM, Dan Bode <[email protected]> wrote:

>
>
> On Tue, Nov 25, 2014 at 5:25 PM, Trevor Vaughan <[email protected]>
> wrote:
>
>> Hi All,
>>
>> I've got a couple of issues running rspec-puppet tests under Ruby 2 that
>> I was hoping someone could shed some light on.
>>
>> First, under Ruby 2, any validation functions in the code appear to be
>> getting parsed *after* the inline templates. I like to stick my validation
>> at the bottom of the file to prevent users from wading through a sea of
>> garbage to get to the meat of the code, so this isn't thrilling.
>>
>> Second, even though I have code properly stashed under Puppetx::Me in
>> the various lib directories, I don't seem to be able to access it across
>> modules under Ruby 2.
>>
>
> I have also seen this issue in Ruby 1.9.3. It's actually on my list of
> things to root cause. I was able to track it down to the following:
>
> rspec-puppet dynamically adds all of the lib directories to puppet's libdir (
> https://github.com/rodjek/rspec-puppet/blob/master/lib/rspec-puppet/support.rb#L140);
> I'm not sure exactly why it can't just rely on Puppet to do the right
> thing. I printed $LOAD_PATH at the top of the files that were trying to
> load external dependencies and noticed that the portion of $LOAD_PATH
> derived from Puppet[:libdir] has the form:
>
> [
>   'some_normal_path_1', 'some_normal_path_2', 'libdir_part1:libdir_part2:...'
> ]
>
> I added some code to split those ':'-delimited libdir entries into
> separate array elements, and loading external libraries from fixtures
> started working.
>
> As an FYI, I did attempt to modify that rspec-puppet code to set libdir
> as an array instead of a ':'-delimited string, but that led to failures
> from Puppet itself.
>
>
>>
>> Under 1.8.7 both of these work just fine.
>>
>> Note: I have *not* seen any issues in production relating to this at this
>> time.
>>
>> Thanks,
>>
>> Trevor
>>
>> --
>> Trevor Vaughan
>> Vice President, Onyx Point, Inc
>> (410) 541-6699
>> [email protected]
>>
>> -- This account not approved for unencrypted proprietary information --
>>
>> --
>> You received this message because you are subscribed to the Google Groups
>> "Puppet Developers" group.
>> To unsubscribe from this group and stop receiving emails from it, send an
>> email to [email protected].
>> To view this discussion on the web visit
>> https://groups.google.com/d/msgid/puppet-dev/CANs%2BFoXBGgij%2BWcnMcR-BTz7EjRRCOoLuFp162nwFh7YAeREJA%40mail.gmail.com
>> .
>> For more options, visit https://groups.google.com/d/optout.
>>
>
>  --
> You received this message because you are subscribed to the Google Groups
> "Puppet Developers" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/puppet-dev/CA%2B0t2Lz06Oct%3DvTTnB_YGxZ%3DCn0HBMHw%3DyiD42m56H5dP_64iw%40mail.gmail.com
> <https://groups.google.com/d/msgid/puppet-dev/CA%2B0t2Lz06Oct%3DvTTnB_YGxZ%3DCn0HBMHw%3DyiD42m56H5dP_64iw%40mail.gmail.com?utm_medium=email&utm_source=footer>
> .
> For more options, visit https://groups.google.com/d/optout.
>



-- 
Trevor Vaughan
Vice President, Onyx Point, Inc
(410) 541-6699
[email protected]

-- This account not approved for unencrypted proprietary information --
