Re: karaf-maven-plugin and version ranges

2023-12-05 Thread Bengt Rodehav
Hello JB,

I updated the JIRA. I am now on Karaf 4.4.4 and it still does not work. It
would be great if a fix could be part of the next Karaf release.

/Bengt

On Thu 28 Apr 2022 at 15:59, Jean-Baptiste Onofré wrote:

> Hi Bengt,
>
> Thanks, I saw the Jira. I will work on it asap.
>
> Regards
> JB
>
> On Thu, Apr 28, 2022 at 3:33 PM Bengt Rodehav  wrote:
> >
> > I created this issue:
> >
> > https://issues.apache.org/jira/browse/KARAF-7428
> >
> > /Bengt
> >
> >> On Mon 25 Apr 2022 at 10:47, Jean-Baptiste Onofré <
> j...@nanthrax.net> wrote:
> >>
> >> Hi Bengt,
> >>
> >> Unfortunately, I missed this one. Can you please create a Jira like
> >> "version range on Windows" quickly describing the problem? I won't
> >> miss it this time, I promise ;)
> >>
> >> Regards
> >> JB
> >>
> >> On Mon, Apr 25, 2022 at 10:22 AM Bengt Rodehav 
> wrote:
> >> >
> >> > Hello JB,
> >> >
> >> > I noticed that you have now released Karaf 4.4.0. Did you get a
> chance to look at this issue?
> >> >
> >> > /Bengt
> >> >
> >> > On Thu 24 Mar 2022 at 13:32, Bengt Rodehav wrote:
> >> >>
> >> >> OK - thanks,
> >> >>
> >> >> /Bengt
> >> >>
> >> >> On Wed 23 Mar 2022 at 07:19, Jean-Baptiste Onofré <
> j...@nanthrax.net> wrote:
> >> >>>
> >> >>> Hi
> >> >>>
> >> >>> Not yet, I will as part of 4.4.0 release preparation.
> >> >>> I will keep you posted soon.
> >> >>>
> >> >>> Regards
> >> >>> JB
> >> >>>
> >> >>> On Tue, Mar 22, 2022 at 5:30 PM Bengt Rodehav 
> wrote:
> >> >>> >
> >> >>> > Did you have a chance to test this on Windows JB?
> >> >>> >
> >> >>> > /Bengt
> >> >>> >
> >> >>> > On Fri 11 Mar 2022 at 17:17, Bengt Rodehav <
> be...@rodehav.com> wrote:
> >> >>> >>
> >> >>> >> OK - thanks.
> >> >>> >>
> >> >>> >> /Bengt
> >> >>> >>
> >> >>> >> On Fri 11 Mar 2022 at 15:53, Jean-Baptiste Onofré <
> j...@nanthrax.net> wrote:
> >> >>> >>>
> >> >>> >>> Hi
> >> >>> >>>
> >> >>> >>> I think it’s more a karaf-maven-plugin issue on Windows.
> >> >>> >>>
> >> >>> >>> We don’t have this issue on Unix. Let me try on a Windows VM.
> >> >>> >>>
> >> >>> >>> Regards
> >> >>> >>> JB
> >> >>> >>>
> >> >>> >>> On Fri 11 Mar 2022 at 15:38, Bengt Rodehav
> wrote:
> >> >>> >>>>
> >> >>> >>>> Do you think this is a Camel problem? The more I think about
> it, the more I wonder how it could ever work with a version range in the
> repository tag like that. Doesn't it have to be a specific version, since it
> identifies a repository to be searched for artifacts?
> >> >>> >>>>
> >> >>> >>>> /Bengt
> >> >>> >>>>
> >> >>> >>>> On Fri 11 Mar 2022 at 09:23, Bengt Rodehav <
> be...@rodehav.com> wrote:
> >> >>> >>>>>
> >> >>> >>>>> Yes, that's correct.
> >> >>> >>>>>
> >> >>> >>>>> /Bengt
> >> >>> >>>>>
> >> >>> >>>>> On Thu 10 Mar 2022 at 18:55, Jean-Baptiste Onofré <
> j...@nanthrax.net> wrote:
> >> >>> >>>>>>
> >> >>> >>>>>> Hi Bengt,
> >> >>> >>>>>>
> >> >>> >>>>>> I guess you are on Windows, right?
> >> >>> >>>>>>
> >> >>> >>>>>> Regards
> >> >>> >>>>>> JB
> >> >>> >>>>>>
> >> >>> >>>>>> On Thu, Mar 10, 2022 at 3:56 PM Bengt Rodehav <
> be...@rodehav.com> wrote:
> >> >>> >>>>>>>
> >> >>> >>>>>>> I use Karaf 4.3.6 and I'm trying to upgrade our Camel
> version to the latest (3.14.2). It turns out I get a problem with the
> karaf-maven-plugin's features-add-to-repository goal.
> >> >>> >>>>>>>
> >> >>> >>>>>>> The new Karaf feature descriptor for Camel now starts with
> the following three lines:
> >> >>> >>>>>>>
> >> >>> >>>>>>> <repository>mvn:org.apache.cxf.karaf/apache-cxf/[3.4,3.4]/xml/features</repository>
> >> >>> >>>>>>> <repository>mvn:org.ops4j.pax.cdi/pax-cdi-features/[1,2)/xml/features</repository>
> >> >>> >>>>>>> <repository>mvn:org.hibernate.validator/hibernate-validator-osgi-karaf-features/[6.2,6.3)/xml/features</repository>
> >> >>> >>>>>>>
> >> >>> >>>>>>> But this causes the karaf-maven-plugin to try to download a
> file called
> "org/apache/cxf/karaf/apache-cxf/%5B3.4,3.4%5D/apache-cxf-%5B3.4,3.4%5D-features.xml"
> which of course fails.
> >> >>> >>>>>>>
> >> >>> >>>>>>> Is this a problem with the karaf-maven-plugin or with
> Camel's karaf feature descriptor?
> >> >>> >>>>>>>
> >> >>> >>>>>>> If I comment out the above three lines it seems to work (it
> builds anyway - haven't actually tested it yet).
> >> >>> >>>>>>>
> >> >>> >>>>>>> /Bengt
>
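For illustration only (this is not the plugin's actual code), a minimal sketch of how a mvn: URL maps to a Maven repository path when the version field is used literally. Left unresolved, a range like [3.4,3.4] is URL-encoded into exactly the broken path reported in this thread:

```python
from urllib.parse import quote

def mvn_url_to_repo_path(mvn_url: str) -> str:
    """Naively map a mvn:group/artifact/version[/type[/classifier]] URL to a
    repository path, using the version field as-is (no range resolution)."""
    group, artifact, version, *rest = mvn_url[len("mvn:"):].split("/")
    ext = rest[0] if rest else "jar"
    classifier = f"-{rest[1]}" if len(rest) > 1 else ""
    path = (f"{group.replace('.', '/')}/{artifact}/{version}/"
            f"{artifact}-{version}{classifier}.{ext}")
    return quote(path, safe="/,")  # '[' and ']' become %5B and %5D

# An unresolved range produces the broken path from the original report:
print(mvn_url_to_repo_path(
    "mvn:org.apache.cxf.karaf/apache-cxf/[3.4,3.4]/xml/features"))
# org/apache/cxf/karaf/apache-cxf/%5B3.4,3.4%5D/apache-cxf-%5B3.4,3.4%5D-features.xml
```

A correct resolution step would first pick a concrete version for the range (e.g. from the repository's maven-metadata.xml) and only then build this path; per the thread, that resolution apparently happens on Unix but not on Windows.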


Re: Difference between installing a bundle and a feature

2022-03-10 Thread Bengt Rodehav
Tried your advice and it worked perfectly - thanks!

/Bengt

On Thu 10 Mar 2022 at 09:16, Bengt Rodehav wrote:

> Sorry, just saw that you answered how to provide capability via the
> feature - will try that.
>
> /Bengt
>
> On Thu 10 Mar 2022 at 09:06, Bengt Rodehav wrote:
>
>> Thanks a lot JB - this really helps!
>>
>> Do you also know if it is possible to make pax-jdbc generate the
>> corresponding "provides"? Otherwise I might do as you suggest and disable
>> the capabilities.
>>
>> /Bengt
>>
> >> On Wed 9 Mar 2022 at 21:02, Jean-Baptiste Onofré wrote:
>>
>>> Exactly: maven-bundle-plugin 2.3.7 doesn't generate the
>>> Requirement/Capability headers. That's why it works. If you deploy your
>>> bundles built with this version, you won't have the problem.
>>>
>>> maven-bundle-plugin 5.1.4 generates the req/cap headers, used by the
>>> features service. If you want, you can disable req/cap headers generation.
>>> Just pass the following instructions to maven-bundle-plugin:
>>>
> >>> <_norequirements>true</_norequirements>
> >>> <_nocapabilities>true</_nocapabilities>
>>>
> >>> So, it's what I said in my previous email: it's not directly related to
> >>> Karaf (the Karaf features resolver has used req/cap since Karaf 4.2.x); it's
> >>> because your bundles' MANIFEST changed (when you upgraded the
> >>> maven-bundle-plugin version).
>>>
>>> Regards
>>> JB
>>>
>>> On Wed, Mar 9, 2022 at 8:37 PM Bengt Rodehav  wrote:
>>>
> >>>> Aha - thanks a lot. I thought I was going crazy. I use pax-jdbc for the
>>>> data source. I'll need to check if it can provide the capability. Or, maybe
>>>> the problem is in the other end. I updated to a newer version of maven
>>>> bundle plugin (from 2.3.7 to 5.1.4) - maybe the old one didn't require the
>>>> capability.
>>>>
>>>> /Bengt
>>>>
>>>> On Wed, 9 Mar 2022, 18:27 Jean-Baptiste Onofré, 
>>>> wrote:
>>>>
>>>>> No, you didn't get my point.
>>>>>
>>>>> The fact the datasource is there or not doesn't matter at runtime if
>>>>> it doesn't provide the capability.
>>>>>
>>>>> Let me explain.
>>>>>
> >>>>> Your history-stuff bundle contains, in its META-INF/MANIFEST:
>>>>>
>>>>> Require-Capability:
>>>>> osgi.service;effective:=active;objectClass="javax.sql.DataSource";filter=
>>>>>
>>>>> When you install with bundle:install, this header is simply ignored.
>>>>>
> >>>>> BUT, when you install using the features service, the feature resolver
> >>>>> checks all bundles' requirements/capabilities.
>>>>>
>>>>> So, the feature resolver is looking for:
> >>>>> - a bundle containing a Provide-Capability header in its MANIFEST
> >>>>> matching the requirement
> >>>>> - a feature containing a <capability> matching the requirement
>>>>>
>>>>> It's nothing related to runtime (the actual bundle installed and
> >>>>> running); only the MANIFEST cap/req matter.
>>>>>
>>>>> So, in your case, you can just provide the capability in the feature
>>>>> providing the datasource. It's exactly what you can see in the
>>>>> karaf-jpa-example:
>>>>> https://github.com/apache/karaf/blob/main/examples/karaf-jpa-example/karaf-jpa-example-features/src/main/feature/feature.xml#L28
>>>>>
>>>>> Regards
>>>>> JB
>>>>>
>>>>> On Wed, Mar 9, 2022 at 5:30 PM Bengt Rodehav 
>>>>> wrote:
>>>>>
> >>>>>> But the filetransfoerhistoryjta datasource is there. I've verified
>>>>>> that. And if I install the bundle directly instead of going through the
>>>>>> feature it works fine and connects to the datasource. Why can it not be
>>>>>> found when I go through the feature but when I install the bundle 
>>>>>> directly?
>>>>>>
>>>>>> /Bengt
>>>>>>
> >>>>>> On Wed 9 Mar 2022 at 17:10, Jean-Baptiste Onofré <
> >>>>>> j...@nanthrax.net> wrote:
>>>>>>
> >>>>>>> I don't think anything changed on the resolver, maybe you updated
> >>>>>>> maven-bundle-plugin or bnd to create your bundle, and now it includes the
> >>>>>>> requirement (whereas your bundles didn't contain requirement in MANIFEST
> >>>>>>> before).
> 
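For reference, a hedged sketch of where the two instructions JB mentions would sit in a POM. The plugin coordinates are the standard org.apache.felix ones and the version is the one named in the thread; adapt as needed:

```xml
<!-- Hypothetical POM fragment: disable Require/Provide-Capability header
     generation in maven-bundle-plugin, as suggested in the thread above. -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <version>5.1.4</version>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <_norequirements>true</_norequirements>
      <_nocapabilities>true</_nocapabilities>
    </instructions>
  </configuration>
</plugin>
```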

Re: Difference between installing a bundle and a feature

2022-03-10 Thread Bengt Rodehav
Thanks a lot JB - this really helps!

Do you also know if it is possible to make pax-jdbc generate the
corresponding "provides"? Otherwise I might do as you suggest and disable
the capabilities.

/Bengt

On Wed 9 Mar 2022 at 21:02, Jean-Baptiste Onofré wrote:

> Exactly: maven-bundle-plugin 2.3.7 doesn't generate the
> Requirement/Capability headers. That's why it works. If you deploy your
> bundles built with this version, you won't have the problem.
>
> maven-bundle-plugin 5.1.4 generates the req/cap headers, used by the
> features service. If you want, you can disable req/cap headers generation.
> Just pass the following instructions to maven-bundle-plugin:
>
> <_norequirements>true</_norequirements>
> <_nocapabilities>true</_nocapabilities>
>
> So, it's what I said in my previous email: it's not directly related to
> Karaf (the Karaf features resolver has used req/cap since Karaf 4.2.x); it's
> because your bundles' MANIFEST changed (when you upgraded the
> maven-bundle-plugin version).
>
> Regards
> JB
>
> On Wed, Mar 9, 2022 at 8:37 PM Bengt Rodehav  wrote:
>
> >> Aha - thanks a lot. I thought I was going crazy. I use pax-jdbc for the
>> data source. I'll need to check if it can provide the capability. Or, maybe
>> the problem is in the other end. I updated to a newer version of maven
>> bundle plugin (from 2.3.7 to 5.1.4) - maybe the old one didn't require the
>> capability.
>>
>> /Bengt
>>
>> On Wed, 9 Mar 2022, 18:27 Jean-Baptiste Onofré,  wrote:
>>
>>> No, you didn't get my point.
>>>
>>> The fact the datasource is there or not doesn't matter at runtime if it
>>> doesn't provide the capability.
>>>
>>> Let me explain.
>>>
> >>> Your history-stuff bundle contains, in its META-INF/MANIFEST:
>>>
>>> Require-Capability:
>>> osgi.service;effective:=active;objectClass="javax.sql.DataSource";filter=
>>>
>>> When you install with bundle:install, this header is simply ignored.
>>>
> >>> BUT, when you install using the features service, the feature resolver
> >>> checks all bundles' requirements/capabilities.
>>>
>>> So, the feature resolver is looking for:
> >>> - a bundle containing a Provide-Capability header in its MANIFEST matching
> >>> the requirement
> >>> - a feature containing a <capability> matching the requirement
>>>
>>> It's nothing related to runtime (the actual bundle installed and
> >>> running); only the MANIFEST cap/req matter.
>>>
>>> So, in your case, you can just provide the capability in the feature
>>> providing the datasource. It's exactly what you can see in the
>>> karaf-jpa-example:
>>> https://github.com/apache/karaf/blob/main/examples/karaf-jpa-example/karaf-jpa-example-features/src/main/feature/feature.xml#L28
>>>
>>> Regards
>>> JB
>>>
>>> On Wed, Mar 9, 2022 at 5:30 PM Bengt Rodehav  wrote:
>>>
>>>> But the filetransferhistoryjta datasource is there. I've verified
>>>> that. And if I install the bundle directly instead of going through the
>>>> feature it works fine and connects to the datasource. Why can it not be
>>>> found when I go through the feature but when I install the bundle directly?
>>>>
>>>> /Bengt
>>>>
>>>> On Wed, Mar 9, 2022 at 17:10, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
>>>>
>>>>> I don't think anything changed in the resolver; maybe you updated
>>>>> maven-bundle-plugin or bnd to create your bundle, and now it includes the
>>>>> requirement (whereas your bundles didn't contain the requirement in their
>>>>> MANIFEST before).
>>>>>
>>>>> On Wed, Mar 9, 2022 at 4:59 PM Bengt Rodehav 
>>>>> wrote:
>>>>>
>>>>>> I didn't have any problems with this using Karaf 4.3.3. Do you know
>>>>>> if something has changed? I'm using Karaf 4.3.6 now.
>>>>>>
>>>>>> /Bengt
>>>>>>
>>>>>> On Wed, Mar 9, 2022 at 16:46, Bengt Rodehav wrote:
>>>>>>
>>>>>>> Is there any way to stop the feature installer from using the
>>>>>>> resolver?
>>>>>>>
>>>>>>> /Bengt
>>>>>>>
>>>>>>> On Wed, Mar 9, 2022 at 16:37, Bengt Rodehav wrote:
>>>>>>>
>>>>>>>> Unfortunately I didn't get any extra information. I got the same as
>>>>>

Re: Difference between installing a bundle and a feature

2022-03-09 Thread Bengt Rodehav
Aha - thanks a lot. I thought I was going crazy. I use pax-jdbc for the data
source. I'll need to check if it can provide the capability. Or, maybe the
problem is at the other end. I updated to a newer version of
maven-bundle-plugin (from 2.3.7 to 5.1.4) - maybe the old one didn't require
the capability.

/Bengt

On Wed, 9 Mar 2022, 18:27 Jean-Baptiste Onofré,  wrote:

> No, you didn't get my point.
>
> Whether or not the datasource is there at runtime doesn't matter if
> nothing provides the capability.
>
> Let me explain.
>
> Your history-stuff bundle contains, in its META-INF/MANIFEST:
>
> Require-Capability:
> osgi.service;effective:=active;objectClass="javax.sql.DataSource";filter=
>
> When you install with bundle:install, this header is simply ignored.
>
> BUT, when you install using the features service, the feature resolver
> checks all bundles' requirements/capabilities.
>
> So, the feature resolver is looking for:
> - a bundle containing a Provide-Capability header in its MANIFEST matching
> the requirement
> - a feature containing <capability> matching the requirement
>
> It's nothing related to runtime (the actual bundle installed and running);
> only the MANIFEST cap/req matter.
>
> So, in your case, you can just provide the capability in the feature
> providing the datasource. It's exactly what you can see in the
> karaf-jpa-example:
> https://github.com/apache/karaf/blob/main/examples/karaf-jpa-example/karaf-jpa-example-features/src/main/feature/feature.xml#L28
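Applied to this thread's datasource, that pattern would look roughly like the sketch below (the feature name and version are assumptions; the capability string mirrors the requirement in the resolver error):

```xml
<feature name="connect-filetransfer-datasource" version="3.1.0-SNAPSHOT">
  <!-- advertise the DataSource service that pax-jdbc will register at
       runtime, so the feature resolver can satisfy the bundle's
       osgi.service requirement at resolution time -->
  <capability>
    osgi.service;objectClass=javax.sql.DataSource;effective:=active;osgi.jndi.service.name=jdbc/filetransferhistoryjta
  </capability>
</feature>
```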
>
> Regards
> JB
>
> On Wed, Mar 9, 2022 at 5:30 PM Bengt Rodehav  wrote:
>
>> But the filetransferhistoryjta datasource is there. I've verified that.
>> And if I install the bundle directly instead of going through the feature
>> it works fine and connects to the datasource. Why can it not be found when
>> I go through the feature but when I install the bundle directly?
>>
>> /Bengt
>>
>> On Wed, Mar 9, 2022 at 17:10, Jean-Baptiste Onofré wrote:
>>
>>> I don't think anything changed in the resolver; maybe you updated
>>> maven-bundle-plugin or bnd to create your bundle, and now it includes the
>>> requirement (whereas your bundles didn't contain the requirement in their
>>> MANIFEST before).
>>>
>>> On Wed, Mar 9, 2022 at 4:59 PM Bengt Rodehav  wrote:
>>>
>>>> I didn't have any problems with this using Karaf 4.3.3. Do you know if
>>>> something has changed? I'm using Karaf 4.3.6 now.
>>>>
>>>> /Bengt
>>>>
>>>> On Wed, Mar 9, 2022 at 16:46, Bengt Rodehav wrote:
>>>>
>>>>> Is there any way to stop the feature installer from using the resolver?
>>>>>
>>>>> /Bengt
>>>>>
>>>>> On Wed, Mar 9, 2022 at 16:37, Bengt Rodehav wrote:
>>>>>
>>>>>> Unfortunately I didn't get any extra information. I got the same as
>>>>>> before which is:
>>>>>>
>>>>>> 2022-03-09T16:26:02,494 | INFO  | pipe-feature:install -v
>>>>>> connect-filetransfer-history-db | FeaturesServiceImpl  | 18 -
>>>>>> org.apache.karaf.features.core - 4.3.6 | Adding features:
>>>>>> connect-filetransfer-history-db/[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]
>>>>>> 2022-03-09T16:26:02,901 | ERROR | Karaf local console user karaf |
>>>>>> ShellUtil| 70 - org.apache.karaf.shell.core - 
>>>>>> 4.3.6
>>>>>> | Exception caught while executing command
>>>>>> org.apache.felix.resolver.reason.ReasonException: Unable to resolve
>>>>>> root: missing requirement [root] osgi.identity;
>>>>>> osgi.identity=connect-filetransfer-history-db; type=karaf.feature;
>>>>>> version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
>>>>>> filter:="(&(osgi.identity=connect-filetransfer-history-db)(type=karaf.feature)(version>=3.1.0.SNAPSHOT)(version<=3.1.0.SNAPSHOT))"
>>>>>> [caused by: Unable to resolve
>>>>>> connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
>>>>>> [connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
>>>>>> osgi.identity=se.digia.connect.services.filetransfer.history-domain;
>>>>>> type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
>>>>>> resolution:=mandatory [caused by: Unable to resolve
>>>>>> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
>>>>>> missing requi

Re: Difference between installing a bundle and a feature

2022-03-09 Thread Bengt Rodehav
But the filetransferhistoryjta datasource is there. I've verified that.
And if I install the bundle directly instead of going through the feature
it works fine and connects to the datasource. Why can it not be found when
I go through the feature but when I install the bundle directly?

/Bengt

On Wed, Mar 9, 2022 at 17:10, Jean-Baptiste Onofré wrote:

> I don't think anything changed in the resolver; maybe you updated
> maven-bundle-plugin or bnd to create your bundle, and now it includes the
> requirement (whereas your bundles didn't contain the requirement in their
> MANIFEST before).
>
> On Wed, Mar 9, 2022 at 4:59 PM Bengt Rodehav  wrote:
>
>> I didn't have any problems with this using Karaf 4.3.3. Do you know if
>> something has changed? I'm using Karaf 4.3.6 now.
>>
>> /Bengt
>>
>> On Wed, Mar 9, 2022 at 16:46, Bengt Rodehav wrote:
>>
>>> Is there any way to stop the feature installer from using the resolver?
>>>
>>> /Bengt
>>>
>>> On Wed, Mar 9, 2022 at 16:37, Bengt Rodehav wrote:
>>>
>>>> Unfortunately I didn't get any extra information. I got the same as
>>>> before which is:
>>>>
>>>> 2022-03-09T16:26:02,494 | INFO  | pipe-feature:install -v
>>>> connect-filetransfer-history-db | FeaturesServiceImpl  | 18 -
>>>> org.apache.karaf.features.core - 4.3.6 | Adding features:
>>>> connect-filetransfer-history-db/[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]
>>>> 2022-03-09T16:26:02,901 | ERROR | Karaf local console user karaf |
>>>> ShellUtil| 70 - org.apache.karaf.shell.core - 4.3.6
>>>> | Exception caught while executing command
>>>> org.apache.felix.resolver.reason.ReasonException: Unable to resolve
>>>> root: missing requirement [root] osgi.identity;
>>>> osgi.identity=connect-filetransfer-history-db; type=karaf.feature;
>>>> version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
>>>> filter:="(&(osgi.identity=connect-filetransfer-history-db)(type=karaf.feature)(version>=3.1.0.SNAPSHOT)(version<=3.1.0.SNAPSHOT))"
>>>> [caused by: Unable to resolve
>>>> connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
>>>> [connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
>>>> osgi.identity=se.digia.connect.services.filetransfer.history-domain;
>>>> type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
>>>> resolution:=mandatory [caused by: Unable to resolve
>>>> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
>>>> missing requirement
>>>> [se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
>>>> osgi.service; objectClass=javax.sql.DataSource; effective:=active;
>>>> filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"]]
>>>> at
>>>> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
>>>> ~[?:?]
>>>> at
>>>> org.apache.felix.resolver.ResolverImpl.doResolve(ResolverImpl.java:433)
>>>> ~[?:?]
>>>> at
>>>> org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:420) 
>>>> ~[?:?]
>>>> at
>>>> org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:374) 
>>>> ~[?:?]
>>>> at
>>>> org.apache.karaf.features.internal.region.SubsystemResolver.resolve(SubsystemResolver.java:257)
>>>> ~[?:?]
>>>> at
>>>> org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:399)
>>>> ~[?:?]
>>>> at
>>>> org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1069)
>>>> ~[?:?]
>>>> at
>>>> org.apache.karaf.features.internal.service.FeaturesServiceImpl.lambda$doProvisionInThread$13(FeaturesServiceImpl.java:1004)
>>>> ~[?:?]
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
>>>> ~[?:?]
>>>> at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
>>>> ~[?:?]
>>>> at java.lang.Thread.run(Thread.java:833) [?:?]
>>>> Caused by: org.apache.felix.resolver.reason.ReasonException: Unable to
>>>> resolve connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
>>>> [con

Re: Difference between installing a bundle and a feature

2022-03-09 Thread Bengt Rodehav
I didn't have any problems with this using Karaf 4.3.3. Do you know if
something has changed? I'm using Karaf 4.3.6 now.

/Bengt

On Wed, Mar 9, 2022 at 16:46, Bengt Rodehav wrote:

> Is there any way to stop the feature installer from using the resolver?
>
> /Bengt
>
> On Wed, Mar 9, 2022 at 16:37, Bengt Rodehav wrote:
>
>> Unfortunately I didn't get any extra information. I got the same as
>> before which is:
>>
>> 2022-03-09T16:26:02,494 | INFO  | pipe-feature:install -v
>> connect-filetransfer-history-db | FeaturesServiceImpl  | 18 -
>> org.apache.karaf.features.core - 4.3.6 | Adding features:
>> connect-filetransfer-history-db/[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]
>> 2022-03-09T16:26:02,901 | ERROR | Karaf local console user karaf |
>> ShellUtil| 70 - org.apache.karaf.shell.core - 4.3.6
>> | Exception caught while executing command
>> org.apache.felix.resolver.reason.ReasonException: Unable to resolve root:
>> missing requirement [root] osgi.identity;
>> osgi.identity=connect-filetransfer-history-db; type=karaf.feature;
>> version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
>> filter:="(&(osgi.identity=connect-filetransfer-history-db)(type=karaf.feature)(version>=3.1.0.SNAPSHOT)(version<=3.1.0.SNAPSHOT))"
>> [caused by: Unable to resolve
>> connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
>> [connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
>> osgi.identity=se.digia.connect.services.filetransfer.history-domain;
>> type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
>> resolution:=mandatory [caused by: Unable to resolve
>> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
>> missing requirement
>> [se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
>> osgi.service; objectClass=javax.sql.DataSource; effective:=active;
>> filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"]]
>> at
>> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
>> ~[?:?]
>> at
>> org.apache.felix.resolver.ResolverImpl.doResolve(ResolverImpl.java:433)
>> ~[?:?]
>> at org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:420)
>> ~[?:?]
>> at org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:374)
>> ~[?:?]
>> at
>> org.apache.karaf.features.internal.region.SubsystemResolver.resolve(SubsystemResolver.java:257)
>> ~[?:?]
>> at
>> org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:399)
>> ~[?:?]
>> at
>> org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1069)
>> ~[?:?]
>> at
>> org.apache.karaf.features.internal.service.FeaturesServiceImpl.lambda$doProvisionInThread$13(FeaturesServiceImpl.java:1004)
>> ~[?:?]
>> at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
>> at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
>> ~[?:?]
>> at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
>> ~[?:?]
>> at java.lang.Thread.run(Thread.java:833) [?:?]
>> Caused by: org.apache.felix.resolver.reason.ReasonException: Unable to
>> resolve connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
>> [connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
>> osgi.identity=se.digia.connect.services.filetransfer.history-domain;
>> type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
>> resolution:=mandatory [caused by: Unable to resolve
>> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
>> missing requirement
>> [se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
>> osgi.service; objectClass=javax.sql.DataSource; effective:=active;
>> filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"]
>> at
>> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
>> ~[?:?]
>> ... 12 more
>> Caused by: org.apache.felix.resolver.reason.ReasonException: Unable to
>> resolve
>> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
>> missing requirement
>> [se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
>> osgi.service; objectClass=javax.sql.DataSource; effective:=active;
>> filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"
>> at
>> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Cand

Re: Difference between installing a bundle and a feature

2022-03-09 Thread Bengt Rodehav
Is there any way to stop the feature installer from using the resolver?

/Bengt

On Wed, Mar 9, 2022 at 16:37, Bengt Rodehav wrote:

> Unfortunately I didn't get any extra information. I got the same as before
> which is:
>
> 2022-03-09T16:26:02,494 | INFO  | pipe-feature:install -v
> connect-filetransfer-history-db | FeaturesServiceImpl  | 18 -
> org.apache.karaf.features.core - 4.3.6 | Adding features:
> connect-filetransfer-history-db/[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]
> 2022-03-09T16:26:02,901 | ERROR | Karaf local console user karaf |
> ShellUtil| 70 - org.apache.karaf.shell.core - 4.3.6
> | Exception caught while executing command
> org.apache.felix.resolver.reason.ReasonException: Unable to resolve root:
> missing requirement [root] osgi.identity;
> osgi.identity=connect-filetransfer-history-db; type=karaf.feature;
> version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
> filter:="(&(osgi.identity=connect-filetransfer-history-db)(type=karaf.feature)(version>=3.1.0.SNAPSHOT)(version<=3.1.0.SNAPSHOT))"
> [caused by: Unable to resolve
> connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
> [connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
> osgi.identity=se.digia.connect.services.filetransfer.history-domain;
> type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
> resolution:=mandatory [caused by: Unable to resolve
> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
> missing requirement
> [se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
> osgi.service; objectClass=javax.sql.DataSource; effective:=active;
> filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"]]
> at
> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
> ~[?:?]
> at org.apache.felix.resolver.ResolverImpl.doResolve(ResolverImpl.java:433)
> ~[?:?]
> at org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:420)
> ~[?:?]
> at org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:374)
> ~[?:?]
> at
> org.apache.karaf.features.internal.region.SubsystemResolver.resolve(SubsystemResolver.java:257)
> ~[?:?]
> at
> org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:399)
> ~[?:?]
> at
> org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1069)
> ~[?:?]
> at
> org.apache.karaf.features.internal.service.FeaturesServiceImpl.lambda$doProvisionInThread$13(FeaturesServiceImpl.java:1004)
> ~[?:?]
> at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
> ~[?:?]
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
> ~[?:?]
> at java.lang.Thread.run(Thread.java:833) [?:?]
> Caused by: org.apache.felix.resolver.reason.ReasonException: Unable to
> resolve connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
> [connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
> osgi.identity=se.digia.connect.services.filetransfer.history-domain;
> type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
> resolution:=mandatory [caused by: Unable to resolve
> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
> missing requirement
> [se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
> osgi.service; objectClass=javax.sql.DataSource; effective:=active;
> filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"]
> at
> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
> ~[?:?]
> ... 12 more
> Caused by: org.apache.felix.resolver.reason.ReasonException: Unable to
> resolve
> se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
> missing requirement
> [se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
> osgi.service; objectClass=javax.sql.DataSource; effective:=active;
> filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"
> at
> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
> ~[?:?]
> at
> org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
> ~[?:?]
> ... 12 more
>
> I tried the "--store" option, which gave me quite a large file that I
> unfortunately did not fully understand. Not sure if it helps any.
>
> /Bengt
>
> On Wed, Mar 9, 2022 at 16:22, Bengt Rodehav wrote:
>
>> OK - thanks. Will try that.
>>
>> /Bengt
>>
>> On Wed, Mar 9, 2022 at 16:16, Jean-Baptiste Onofré wrote:

Re: Difference between installing a bundle and a feature

2022-03-09 Thread Bengt Rodehav
Unfortunately I didn't get any extra information. I got the same as before
which is:

2022-03-09T16:26:02,494 | INFO  | pipe-feature:install -v
connect-filetransfer-history-db | FeaturesServiceImpl  | 18 -
org.apache.karaf.features.core - 4.3.6 | Adding features:
connect-filetransfer-history-db/[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]
2022-03-09T16:26:02,901 | ERROR | Karaf local console user karaf |
ShellUtil| 70 - org.apache.karaf.shell.core - 4.3.6
| Exception caught while executing command
org.apache.felix.resolver.reason.ReasonException: Unable to resolve root:
missing requirement [root] osgi.identity;
osgi.identity=connect-filetransfer-history-db; type=karaf.feature;
version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
filter:="(&(osgi.identity=connect-filetransfer-history-db)(type=karaf.feature)(version>=3.1.0.SNAPSHOT)(version<=3.1.0.SNAPSHOT))"
[caused by: Unable to resolve
connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
[connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
osgi.identity=se.digia.connect.services.filetransfer.history-domain;
type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
resolution:=mandatory [caused by: Unable to resolve
se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
missing requirement
[se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
osgi.service; objectClass=javax.sql.DataSource; effective:=active;
filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"]]
at
org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
~[?:?]
at org.apache.felix.resolver.ResolverImpl.doResolve(ResolverImpl.java:433)
~[?:?]
at org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:420)
~[?:?]
at org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:374)
~[?:?]
at
org.apache.karaf.features.internal.region.SubsystemResolver.resolve(SubsystemResolver.java:257)
~[?:?]
at
org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:399)
~[?:?]
at
org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1069)
~[?:?]
at
org.apache.karaf.features.internal.service.FeaturesServiceImpl.lambda$doProvisionInThread$13(FeaturesServiceImpl.java:1004)
~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
~[?:?]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
~[?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: org.apache.felix.resolver.reason.ReasonException: Unable to
resolve connect-filetransfer-history-db/3.1.0.SNAPSHOT: missing requirement
[connect-filetransfer-history-db/3.1.0.SNAPSHOT] osgi.identity;
osgi.identity=se.digia.connect.services.filetransfer.history-domain;
type=osgi.bundle; version="[3.1.0.SNAPSHOT,3.1.0.SNAPSHOT]";
resolution:=mandatory [caused by: Unable to resolve
se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
missing requirement
[se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
osgi.service; objectClass=javax.sql.DataSource; effective:=active;
filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"]
at
org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
~[?:?]
... 12 more
Caused by: org.apache.felix.resolver.reason.ReasonException: Unable to
resolve
se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT:
missing requirement
[se.digia.connect.services.filetransfer.history-domain/3.1.0.SNAPSHOT]
osgi.service; objectClass=javax.sql.DataSource; effective:=active;
filter:="(osgi.jndi.service.name=jdbc/filetransferhistoryjta)"
at
org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
~[?:?]
at
org.apache.felix.resolver.Candidates$MissingRequirementError.toException(Candidates.java:1341)
~[?:?]
... 12 more

I tried the "--store" option, which gave me quite a large file that I
unfortunately did not fully understand. Not sure if it helps any.

/Bengt

On Wed, Mar 9, 2022 at 16:22, Bengt Rodehav wrote:

> OK - thanks. Will try that.
>
> /Bengt
>
> On Wed, Mar 9, 2022 at 16:16, Jean-Baptiste Onofré wrote:
>
>> Hi
>>
>> The main difference is that a feature installation uses the resolver to
>> optimize the installation. Bundle installation doesn't use the feature
>> resolver.
>>
>> You can use feature:install -v to get resolver output and you will
>> probably see the chain found by the resolver.
>>
>> Regards
>> JB
>>
>> On Wed, Mar 9, 2022 at 15:22, Bengt Rodehav wrote:
>>
>>> I have a very strange problem (in Karaf 4.3.6). I use JPA and have a
>>> bundle containing a persistence.xml in which a 

Re: Difference between installing a bundle and a feature

2022-03-09 Thread Bengt Rodehav
OK - thanks. Will try that.

/Bengt

On Wed, Mar 9, 2022 at 16:16, Jean-Baptiste Onofré wrote:

> Hi
>
> The main difference is that a feature installation uses the resolver to
> optimize the installation. Bundle installation doesn't use the feature
> resolver.
>
> You can use feature:install -v to get resolver output and you will
> probably see the chain found by the resolver.
>
> Regards
> JB
>
> On Wed, Mar 9, 2022 at 15:22, Bengt Rodehav wrote:
>
>> I have a very strange problem (in Karaf 4.3.6). I use JPA and have a
>> bundle containing a persistence.xml in which a datasource is referenced:
>>
>>   osgi:service/javax.sql.DataSource/(osgi.jndi.service.name=jdbc/filetransferhistoryjta)
>>
>> If the datasource is not available when I install this bundle then I will
>> get an error complaining that the datasource is not present. The strange
>> thing is that it seems to be dependent on how I install this bundle -
>> directly installing the bundle or doing it via a feature. In the first case
>> it works but in the latter it doesn't.
>>
>> If I first install all the prerequisites I need and then issue the
>> following command in the Karaf shell:
>>
>>   bundle:install -s
>> mvn:se.digia.connect.services.filetransfer/history-domain/3.1-SNAPSHOT
>>
>> Then it works fine. It even works if I remove the "-s" and start the
>> bundle afterwards instead.
>>
>> However, if I use the following feature:
>>
>>   <feature name="connect-filetransfer-history-db">
>>     <bundle>mvn:se.digia.connect.services.filetransfer/history-domain/3.1-SNAPSHOT</bundle>
>>   </feature>
>>
>> And then issue the following command:
>>
>>   feature:install connect-filetransfer-history-db
>>
>> Then the datasource cannot be found and the install fails. This happens
>> consistently. I am using Pax-Jdbc for exposing the datasource via JNDI.
>>
>> First I thought that there might be a timing problem and that you have to
>> wait a while to get the datasource published but it doesn't seem to have
>> anything to do with that at all. I can wait 5 minutes after installing the
>> datasource. I also check with the command "jndi:names" that it is
>> published. But it still doesn't work using a feature.
>>
>> Can anyone tell me what is being done differently when I use a feature
>> compared to when I just install the bundle directly? There is apparently
>> some kind of difference.
>>
>> /Bengt
>>
>>


Difference between installing a bundle and a feature

2022-03-09 Thread Bengt Rodehav
I have a very strange problem (in Karaf 4.3.6). I use JPA and have a bundle
containing a persistence.xml in which a datasource is referenced:

  osgi:service/javax.sql.DataSource/(osgi.jndi.service.name=jdbc/filetransferhistoryjta)

If the datasource is not available when I install this bundle then I will
get an error complaining that the datasource is not present. The strange
thing is that it seems to be dependent on how I install this bundle -
directly installing the bundle or doing it via a feature. In the first case
it works but in the latter it doesn't.

If I first install all the prerequisites I need and then issue the
following command in the Karaf shell:

  bundle:install -s
mvn:se.digia.connect.services.filetransfer/history-domain/3.1-SNAPSHOT

Then it works fine. It even works if I remove the "-s" and start the bundle
afterwards instead.

However, if I use the following feature:

  <feature name="connect-filetransfer-history-db">
    <bundle>mvn:se.digia.connect.services.filetransfer/history-domain/3.1-SNAPSHOT</bundle>
  </feature>

And then issue the following command:

  feature:install connect-filetransfer-history-db

Then the datasource cannot be found and the install fails. This happens
consistently. I am using Pax-Jdbc for exposing the datasource via JNDI.

First I thought that there might be a timing problem and that you have to
wait a while to get the datasource published but it doesn't seem to have
anything to do with that at all. I can wait 5 minutes after installing the
datasource. I also check with the command "jndi:names" that it is
published. But it still doesn't work using a feature.

Can anyone tell me what is being done differently when I use a feature
compared to when I just install the bundle directly? There is apparently
some kind of difference.

/Bengt
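For context, the setup described above corresponds to a persistence.xml roughly like this sketch (the persistence-unit name and schema version are assumptions; the osgi:service URL is the one quoted above):

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
  <!-- JTA persistence unit whose datasource is resolved through the
       OSGi JNDI service URL exposed by pax-jdbc -->
  <persistence-unit name="filetransfer-history" transaction-type="JTA">
    <jta-data-source>
      osgi:service/javax.sql.DataSource/(osgi.jndi.service.name=jdbc/filetransferhistoryjta)
    </jta-data-source>
  </persistence-unit>
</persistence>
```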


Re: Relative paths in karaf-wrapper.conf for Karaf 4.3.0

2020-12-18 Thread Bengt Rodehav
I think I've got this working now although it took quite a bit of
experimenting.

In the past, the current directory was the directory where the wrapper
executable was run from (the bin directory). But now it seems to be one
step up (where you unpack the Karaf distribution). I used to have:

set.default.KARAF_HOME=..
set.default.KARAF_BASE=..

But now I have:

set.default.KARAF_HOME=.
set.default.KARAF_BASE=%KARAF_HOME%

But this doesn't seem to be the case when it comes to the wrapper
properties. I used to have:

wrapper.java.command=%JAVA_HOME%/bin/java
wrapper.java.mainclass=org.apache.karaf.wrapper.internal.service.Main
wrapper.java.classpath.1=%KARAF_BASE%/lib/boot/*.jar
wrapper.java.classpath.2=%KARAF_BASE%/lib/wrapper/*.jar
wrapper.java.library.path.1=%KARAF_BASE%/lib/wrapper/

But I now have:

wrapper.working.dir=..
wrapper.java.command=%JAVA_HOME%/bin/java
wrapper.java.mainclass=org.apache.karaf.wrapper.internal.service.Main
wrapper.java.classpath.1=./lib/boot/*.jar
wrapper.java.classpath.2=./lib/jdk9plus/*.jar
wrapper.java.classpath.3=./lib/wrapper/*.jar
wrapper.java.library.path.1=./lib/wrapper/

I guess the reason for this is the "wrapper.working.dir" parameter. I had
forgotten that we used to comment it out, but this time I didn't. I think
this was the main source of confusion for me.

Also, I had problems getting the wrapper's log file to the correct place -
it always ended up in the root directory and was called wrapper.log even if
I tried calling it something else. It turned out that the problem is that
if the directory of the wrapper's log file does not exist then it defaults
to wrapper.log in the root. As I often delete the data directory when
testing, the data/log directory did not exist and the default was used.
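The gotcha above can be made explicit in karaf-wrapper.conf; a sketch (wrapper.logfile is the standard Tanuki wrapper property; the exact path is an assumption based on the data/log directory mentioned):

```properties
# Relative to wrapper.working.dir; if data/log does not exist, the
# wrapper silently falls back to wrapper.log in the working directory.
wrapper.logfile=data/log/wrapper.log
```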

All well now,

/Bengt

On Thu, Dec 17, 2020 at 17:46, Bengt Rodehav wrote:

> Just a thought...
>
> The wrapper sets the current directory to the folder where its executable
> resides. But what current directory does Karaf have? If the two don't agree
> on the current directory, yet both resolve the KARAF_HOME and KARAF_BASE
> variables, then it becomes impossible to use relative paths.
>
> /Bengt
>
> On Thu, Dec 17, 2020 at 17:33, Bengt Rodehav wrote:
>
>> Hello JB,
>>
>> I've tested and experimented more with this issue. First I replaced the
>> wrapper bundled in Karaf 4.3.0 with the one bundled in Karaf 4.0.7. I get
>> the same result, so the problem is probably not with the wrapper itself.
>>
>> Next, I replaced all the *uses* of the variables KARAF_HOME and KARAF_BASE
>> with hard-coded absolute paths. But it still doesn't work. I get:
>>
>> INFO   | jvm 1| 2020/12/17 17:10:20 | Could not create framework:
>> java.lang.NumberFormatException: null
>> INFO   | jvm 1| 2020/12/17 17:10:20 |
>> java.lang.NumberFormatException: null
>> INFO   | jvm 1| 2020/12/17 17:10:20 | at
>> java.base/java.lang.Integer.parseInt(Integer.java:614)
>> INFO   | jvm 1| 2020/12/17 17:10:20 | at
>> java.base/java.lang.Integer.parseInt(Integer.java:770)
>> INFO   | jvm 1| 2020/12/17 17:10:20 | at
>> org.apache.karaf.main.ConfigProperties.<init>(ConfigProperties.java:251)
>> INFO   | jvm 1| 2020/12/17 17:10:20 | at
>> org.apache.karaf.main.Main.launch(Main.java:262)
>> INFO   | jvm 1| 2020/12/17 17:10:20 | at
>> org.apache.karaf.wrapper.internal.service.Main.start(Main.java:55)
>> INFO   | jvm 1| 2020/12/17 17:10:20 | at
>> org.tanukisoftware.wrapper.WrapperManager$12.run(WrapperManager.java:2788)
>>
>> Note that I still had KARAF_HOME and KARAF_BASE set to "..", but I didn't
>> *use* those variables anywhere. So it's not the wrapper that has problems;
>> some part of Karaf seems to read those variables (are they environment
>> variables?) and cannot handle relative paths.
>>
>> /Bengt
>>
>>> On Thu, Dec 17, 2020 at 13:54, Bengt Rodehav wrote:
>>
>>> I am using openjdk15 on Windows 7 and Windows 10 in case that could
>>> affect this issue.
>>>
>>> /Bengt
>>>
>>>> On Thu, Dec 17, 2020 at 08:51, Bengt Rodehav wrote:
>>>
>>>> Thanks JB - I appreciate it.
>>>>
>>>> /Bengt
>>>>
>>>> On Thu, Dec 17, 2020 at 06:08, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> There were lot of changes between 4.0.x and 4.3.x. I don’t remember
>>>>> changes about the service wrapper.
>>>>>
>>>>> Let me reproduce and do a git bisect to identify the change.
>>>>>
>>>>> I will keep you posted.
>>>>>
>> >>>>

Re: Relative paths in karaf-wrapper.conf for Karaf 4.3.0

2020-12-17 Thread Bengt Rodehav
Just a thought...

The wrapper sets the current directory to the folder where its executable
resides. But what is Karaf's current directory? If the two disagree, yet
both resolve the KARAF_HOME and KARAF_BASE variables, then
it becomes impossible to use relative paths.
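To illustrate the point (a sketch only, with a hypothetical install path, not actual Karaf or wrapper code): in Java, a relative path like ".." is resolved against whatever working directory the process happens to have, so the wrapper and Karaf can come to different answers:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class RelativePathDemo {
    public static void main(String[] args) {
        // If the working directory is the wrapper's bin directory,
        // ".." resolves to the install root...
        Path binDir = Paths.get("/opt/karaf/bin"); // hypothetical install path
        System.out.println(binDir.resolve("..").normalize()); // -> /opt/karaf

        // ...but resolution always happens against the *current* working
        // directory, which may differ between the wrapper and Karaf itself:
        Path cwd = Paths.get(System.getProperty("user.dir"));
        System.out.println(cwd.resolve("..").normalize());
    }
}
```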

/Bengt

Den tors 17 dec. 2020 kl 17:33 skrev Bengt Rodehav :

> Hello JB,
>
> I've tested and experimented more about this issue. First I replaced the
> wrapper bundled in Karaf 4.3.0 with the one bundled in Karaf 4.0.7. I get
> the same result so the problem is probably not with the wrapper itself.
>
> Next, I replaced all the *use* of variables KARAF_HOME and KARAF_BASE
> with hard-coded absolute paths. But it still doesn't work. I get:
>
> INFO   | jvm 1| 2020/12/17 17:10:20 | Could not create framework:
> java.lang.NumberFormatException: null
> INFO   | jvm 1| 2020/12/17 17:10:20 | java.lang.NumberFormatException:
> null
> INFO   | jvm 1| 2020/12/17 17:10:20 | at
> java.base/java.lang.Integer.parseInt(Integer.java:614)
> INFO   | jvm 1| 2020/12/17 17:10:20 | at
> java.base/java.lang.Integer.parseInt(Integer.java:770)
> INFO   | jvm 1| 2020/12/17 17:10:20 | at
> org.apache.karaf.main.ConfigProperties.<init>(ConfigProperties.java:251)
> INFO   | jvm 1| 2020/12/17 17:10:20 | at
> org.apache.karaf.main.Main.launch(Main.java:262)
> INFO   | jvm 1| 2020/12/17 17:10:20 | at
> org.apache.karaf.wrapper.internal.service.Main.start(Main.java:55)
> INFO   | jvm 1| 2020/12/17 17:10:20 | at
> org.tanukisoftware.wrapper.WrapperManager$12.run(WrapperManager.java:2788)
>
> Note that I still had KARAF_HOME and KARAF_BASE set to ".." but I didn't
> *use* those variables anywhere. So it's not the wrapper that has problems;
> some part of Karaf seems to read those variables (are they environment
> variables?) and cannot handle relative paths.
>
> /Bengt
>
> Den tors 17 dec. 2020 kl 13:54 skrev Bengt Rodehav :
>
>> I am using openjdk15 on Windows 7 and Windows 10 in case that could
>> affect this issue.
>>
>> /Bengt
>>
>> Den tors 17 dec. 2020 kl 08:51 skrev Bengt Rodehav :
>>
>>> Thanks JB - I appreciate it.
>>>
>>> /Bengt
>>>
>>> Den tors 17 dec. 2020 kl 06:08 skrev Jean-Baptiste Onofre <
>>> j...@nanthrax.net>:
>>>
>>>> Hi,
>>>>
>>>> There were a lot of changes between 4.0.x and 4.3.x. I don’t remember
>>>> changes about the service wrapper.
>>>>
>>>> Let me reproduce and do a git bisect to identify the change.
>>>>
>>>> I will keep you posted.
>>>>
>>>> Regards
>>>> JB
>>>>
>>>> > Le 16 déc. 2020 à 16:16, Bengt Rodehav  a écrit :
>>>> >
>>>> > I am upgrading from Karaf 4.0.7 to 4.3.0 and have run into problems
>>>> starting Karaf as a service.
>>>> >
>>>> > We want to be able to unpack our Karaf based application anywhere so
>>>> we need to avoid absolute paths everywhere. In the past, our
>>>> karaf-wrapper.conf has contained the following lines:
>>>> >
>>>> > set.default.KARAF_HOME=..
>>>> > set.default.KARAF_BASE=..
>>>> > set.default.KARAF_DATA=../data
>>>> > set.default.KARAF_ETC=../etc
>>>> >
>>>> > This has worked fine since the Wrapper always sets the directory in
>>>> which karaf-wrapper.exe resides as the working directory. Therefore, since
>>>> karaf-wrapper.exe resides in the %KARAF_HOME%/bin directory, ".." takes us
>>>> back to %KARAF_HOME%.
>>>> >
>>>> > However, this does not seem to work in Karaf 4.3.0. I have tried
>>>> several relative paths but I cannot figure out what directory ".." seems to
>>>> point to in Karaf 4.3.0.
>>>> >
>>>> > Has anything related to this been changed from Karaf 4.0.7 to Karaf
>>>> 4.3.0?
>>>>
>>>>


Re: Relative paths in karaf-wrapper.conf for Karaf 4.3.0

2020-12-17 Thread Bengt Rodehav
Hello JB,

I've tested and experimented more about this issue. First I replaced the
wrapper bundled in Karaf 4.3.0 with the one bundled in Karaf 4.0.7. I get
the same result so the problem is probably not with the wrapper itself.

Next, I replaced all the *use* of variables KARAF_HOME and KARAF_BASE with
hard-coded absolute paths. But it still doesn't work. I get:

INFO   | jvm 1| 2020/12/17 17:10:20 | Could not create framework:
java.lang.NumberFormatException: null
INFO   | jvm 1| 2020/12/17 17:10:20 | java.lang.NumberFormatException:
null
INFO   | jvm 1| 2020/12/17 17:10:20 | at
java.base/java.lang.Integer.parseInt(Integer.java:614)
INFO   | jvm 1| 2020/12/17 17:10:20 | at
java.base/java.lang.Integer.parseInt(Integer.java:770)
INFO   | jvm 1| 2020/12/17 17:10:20 | at
org.apache.karaf.main.ConfigProperties.<init>(ConfigProperties.java:251)
INFO   | jvm 1| 2020/12/17 17:10:20 | at
org.apache.karaf.main.Main.launch(Main.java:262)
INFO   | jvm 1| 2020/12/17 17:10:20 | at
org.apache.karaf.wrapper.internal.service.Main.start(Main.java:55)
INFO   | jvm 1| 2020/12/17 17:10:20 | at
org.tanukisoftware.wrapper.WrapperManager$12.run(WrapperManager.java:2788)

Note that I still had KARAF_HOME and KARAF_BASE set to ".." but I didn't
*use* those variables anywhere. So it's not the wrapper that has problems;
some part of Karaf seems to read those variables (are they environment
variables?) and cannot handle relative paths.
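For what it's worth, the exception message in the log is exactly what Integer.parseInt produces when handed null, which suggests some numeric config property resolves to null rather than to a malformed number. A minimal reproduction (not Karaf's actual code path):

```java
public class NfeDemo {
    public static void main(String[] args) {
        try {
            // Passing null reproduces the "NumberFormatException: null"
            // seen in the wrapper log above.
            Integer.parseInt((String) null);
        } catch (NumberFormatException e) {
            System.out.println(e);
        }
    }
}
```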

/Bengt

Den tors 17 dec. 2020 kl 13:54 skrev Bengt Rodehav :

> I am using openjdk15 on Windows 7 and Windows 10 in case that could affect
> this issue.
>
> /Bengt
>
> Den tors 17 dec. 2020 kl 08:51 skrev Bengt Rodehav :
>
>> Thanks JB - I appreciate it.
>>
>> /Bengt
>>
>> Den tors 17 dec. 2020 kl 06:08 skrev Jean-Baptiste Onofre <
>> j...@nanthrax.net>:
>>
>>> Hi,
>>>
>>> There were a lot of changes between 4.0.x and 4.3.x. I don’t remember
>>> changes about the service wrapper.
>>>
>>> Let me reproduce and do a git bisect to identify the change.
>>>
>>> I will keep you posted.
>>>
>>> Regards
>>> JB
>>>
>>> > Le 16 déc. 2020 à 16:16, Bengt Rodehav  a écrit :
>>> >
>>> > I am upgrading from Karaf 4.0.7 to 4.3.0 and have run into problems
>>> starting Karaf as a service.
>>> >
>>> > We want to be able to unpack our Karaf based application anywhere so
>>> we need to avoid absolute paths everywhere. In the past, our
>>> karaf-wrapper.conf has contained the following lines:
>>> >
>>> > set.default.KARAF_HOME=..
>>> > set.default.KARAF_BASE=..
>>> > set.default.KARAF_DATA=../data
>>> > set.default.KARAF_ETC=../etc
>>> >
>>> > This has worked fine since the Wrapper always sets the directory in
>>> which karaf-wrapper.exe resides as the working directory. Therefore, since
>>> karaf-wrapper.exe resides in the %KARAF_HOME%/bin directory, ".." takes us
>>> back to %KARAF_HOME%.
>>> >
>>> > However, this does not seem to work in Karaf 4.3.0. I have tried
>>> several relative paths but I cannot figure out what directory ".." seems to
>>> point to in Karaf 4.3.0.
>>> >
>>> > Has anything related to this been changed from Karaf 4.0.7 to Karaf
>>> 4.3.0?
>>>
>>>


Re: Relative paths in karaf-wrapper.conf for Karaf 4.3.0

2020-12-17 Thread Bengt Rodehav
I am using openjdk15 on Windows 7 and Windows 10 in case that could affect
this issue.

/Bengt

Den tors 17 dec. 2020 kl 08:51 skrev Bengt Rodehav :

> Thanks JB - I appreciate it.
>
> /Bengt
>
> Den tors 17 dec. 2020 kl 06:08 skrev Jean-Baptiste Onofre  >:
>
>> Hi,
>>
>> There were a lot of changes between 4.0.x and 4.3.x. I don’t remember
>> changes about the service wrapper.
>>
>> Let me reproduce and do a git bisect to identify the change.
>>
>> I will keep you posted.
>>
>> Regards
>> JB
>>
>> > Le 16 déc. 2020 à 16:16, Bengt Rodehav  a écrit :
>> >
>> > I am upgrading from Karaf 4.0.7 to 4.3.0 and have run into problems
>> starting Karaf as a service.
>> >
>> > We want to be able to unpack our Karaf based application anywhere so we
>> need to avoid absolute paths everywhere. In the past, our
>> karaf-wrapper.conf has contained the following lines:
>> >
>> > set.default.KARAF_HOME=..
>> > set.default.KARAF_BASE=..
>> > set.default.KARAF_DATA=../data
>> > set.default.KARAF_ETC=../etc
>> >
>> > This has worked fine since the Wrapper always sets the directory in
>> which karaf-wrapper.exe resides as the working directory. Therefore, since
>> karaf-wrapper.exe resides in the %KARAF_HOME%/bin directory, ".." takes us
>> back to %KARAF_HOME%.
>> >
>> > However, this does not seem to work in Karaf 4.3.0. I have tried
>> several relative paths but I cannot figure out what directory ".." seems to
>> point to in Karaf 4.3.0.
>> >
>> > Has anything related to this been changed from Karaf 4.0.7 to Karaf
>> 4.3.0?
>>
>>


Re: Relative paths in karaf-wrapper.conf for Karaf 4.3.0

2020-12-16 Thread Bengt Rodehav
Thanks JB - I appreciate it.

/Bengt

Den tors 17 dec. 2020 kl 06:08 skrev Jean-Baptiste Onofre :

> Hi,
>
> There were a lot of changes between 4.0.x and 4.3.x. I don’t remember
> changes about the service wrapper.
>
> Let me reproduce and do a git bisect to identify the change.
>
> I will keep you posted.
>
> Regards
> JB
>
> > Le 16 déc. 2020 à 16:16, Bengt Rodehav  a écrit :
> >
> > I am upgrading from Karaf 4.0.7 to 4.3.0 and have run into problems
> starting Karaf as a service.
> >
> > We want to be able to unpack our Karaf based application anywhere so we
> need to avoid absolute paths everywhere. In the past, our
> karaf-wrapper.conf has contained the following lines:
> >
> > set.default.KARAF_HOME=..
> > set.default.KARAF_BASE=..
> > set.default.KARAF_DATA=../data
> > set.default.KARAF_ETC=../etc
> >
> > This has worked fine since the Wrapper always sets the directory in
> which karaf-wrapper.exe resides as the working directory. Therefore, since
> karaf-wrapper.exe resides in the %KARAF_HOME%/bin directory, ".." takes us
> back to %KARAF_HOME%.
> >
> > However, this does not seem to work in Karaf 4.3.0. I have tried several
> relative paths but I cannot figure out what directory ".." seems to point
> to in Karaf 4.3.0.
> >
> > Has anything related to this been changed from Karaf 4.0.7 to Karaf
> 4.3.0?
>
>


Relative paths in karaf-wrapper.conf for Karaf 4.3.0

2020-12-16 Thread Bengt Rodehav
I am upgrading from Karaf 4.0.7 to 4.3.0 and have run into problems
starting Karaf as a service.

We want to be able to unpack our Karaf based application anywhere so we
need to avoid absolute paths everywhere. In the past, our
karaf-wrapper.conf has contained the following lines:

set.default.KARAF_HOME=..
set.default.KARAF_BASE=..
set.default.KARAF_DATA=../data
set.default.KARAF_ETC=../etc

This has worked fine since the Wrapper always sets the directory in which
karaf-wrapper.exe resides as the working directory. Therefore, since
karaf-wrapper.exe resides in the %KARAF_HOME%/bin directory, ".." takes us
back to %KARAF_HOME%.
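One fallback, at the cost of relocatability, would be to spell the paths out absolutely; the Tanuki wrapper also documents a wrapper.working.dir property for pinning the working directory explicitly. A sketch (an assumption on my part, not a verified fix, and C:/karaf is a placeholder install path):

```
# karaf-wrapper.conf (sketch only; C:/karaf is a placeholder)
wrapper.working.dir=C:/karaf/bin
set.default.KARAF_HOME=C:/karaf
set.default.KARAF_BASE=C:/karaf
set.default.KARAF_DATA=C:/karaf/data
set.default.KARAF_ETC=C:/karaf/etc
```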

However, this does not seem to work in Karaf 4.3.0. I have tried several
relative paths but I cannot figure out what directory ".." seems to point
to in Karaf 4.3.0.

Has anything related to this been changed from Karaf 4.0.7 to Karaf 4.3.0?


Re: Apache Shiro in Karaf

2016-11-21 Thread Bengt Rodehav
Thanks for your explanation, Achim.

As you know I use a WebContainer in Pax-Web today. It suits me very well
since I also use iPojo and do a lot of dynamic initialization that way. I
have a feeling that it would be much harder to do this if I simply deployed
a WAR/WAB. Wouldn't I have to go back to the dreaded (static) XML deployment
descriptors instead of being able to do my initialization dynamically with
API calls to the WebContainer?

/Bengt

2016-11-21 9:59 GMT+01:00 Achim Nierbeck <bcanh...@googlemail.com>:

> Hi Bengt,
>
> WABs are basically nothing other than standard WARs with an OSGi Manifest
> declaring the Web-ContextPath.
>
> The problem you currently run into is a very simple one. Well, easy to
> explain, but hard to work around.
> So let's take a look at the way a web container works and how web
> applications are usually deployed (std. WARs)
> When you deploy a war you usually end up with an isolated Web Context, so
> everything inside this context is bound to the root context path,
> for example "my-war" context will take care of everything with this
> context, filters, servlets etc.
> Now if you deploy an application on "/" and on "my-war" the separation of
> those contexts already makes sure you have a separation of concern.
> For example filters bound to context "/" will only match for everything in
> "/" which doesn't match "/my-war"
>
> Now let us take a look at how the HttpService works and where those things
> come into play.
> The HttpService doesn't know of those boundaries, so everything deployed
> with the HttpService (this also applies to the webcontainer in Pax-Web as
> it's
> just an enhanced HttpService) is bound to "/" as this is the context of
> it.
> So everything inside this context "/" will be managed by everything we got
> there. Therefore a filter matching "/*" will always match for every other
> root folder. As those are just aliases on top of context "/". That's the
> main difference between HttpServices and regular wars.
>
> regards, Achim
>
>
>
> 2016-11-21 9:37 GMT+01:00 Bengt Rodehav <be...@rodehav.com>:
>
>> OK - thanks.
>>
>> I had missed your replies - just saw them.
>>
>> Where can I read about how to use WAB's?
>>
>> The work-around I'm currently using is simply to not install Felix Web
>> console at all but I'd rather fix this in a better way. I don't want to
>> give up Shiro - don't really think that Shiro is the problem here.
>>
>> /Bengt
>>
>> 2016-11-12 14:04 GMT+01:00 Achim Nierbeck <bcanh...@googlemail.com>:
>>
>>> Hi,
>>>
>>> afaik the webconsole registers a filter on "/", therefore it will match
>>> any other path registered on "/".
>>> At this point it might be better to use WAB's as those have another
>>> HttpContext.
>>>
>>> regards, Achim
>>>
>>>
>>> 2016-11-11 19:25 GMT+01:00 Pratt, Jason <jason.pr...@windriver.com>:
>>>
>>>> I ran into the same issue and eventually gave up using Shiro
>>>>
>>>>
>>>>
>>>> *From:* bengt.rode...@gmail.com [mailto:bengt.rode...@gmail.com] *On
>>>> Behalf Of *Bengt Rodehav
>>>> *Sent:* Sunday, November 06, 2016 11:57 PM
>>>> *To:* user@karaf.apache.org
>>>> *Subject:* Re: Apache Shiro in Karaf
>>>>
>>>>
>>>>
>>>> It seems like the webconsole is what causes me problems. If I install
>>>> the "webconsole" feature, then I'm prompted for basic authentication when I
>>>> use the anonymous filter in Shiro. If I do not install the "webconsole"
>>>> feature, then this doesn't happen. It seems like the webconsole installs
>>>> some filter that will kick in when I use the anonymous filter.
>>>>
>>>>
>>>>
>>>> Anyone has an idea about this?
>>>>
>>>>
>>>>
>>>> I guess as a workaround I'll have to skip the webconsole. Normally I
>>>> would like it installed though since it is very useful.
>>>>
>>>>
>>>>
>>>> /Bengt
>>>>
>>>>
>>>>
>>>> 2016-11-07 8:40 GMT+01:00 Bengt Rodehav <be...@rodehav.com>:
>>>>
>>>> Thanks for your reply Steinar,
>>>>
>>>>
>>>>
>>>> I think the difference is that you don't use the anonymous filter
>>>> (keyword "anon" in shiro.ini). I need to use that 

Re: Apache Shiro in Karaf

2016-11-21 Thread Bengt Rodehav
OK - thanks.

I had missed your replies - just saw them.

Where can I read about how to use WAB's?

The work-around I'm currently using is simply to not install Felix Web
console at all but I'd rather fix this in a better way. I don't want to
give up Shiro - don't really think that Shiro is the problem here.

/Bengt

2016-11-12 14:04 GMT+01:00 Achim Nierbeck <bcanh...@googlemail.com>:

> Hi,
>
> afaik the webconsole registers a filter on "/", therefore it will match
> any other path registered on "/".
> At this point it might be better to use WAB's as those have another
> HttpContext.
>
> regards, Achim
>
>
> 2016-11-11 19:25 GMT+01:00 Pratt, Jason <jason.pr...@windriver.com>:
>
>> I ran into the same issue and eventually gave up using Shiro
>>
>>
>>
>> *From:* bengt.rode...@gmail.com [mailto:bengt.rode...@gmail.com] *On
>> Behalf Of *Bengt Rodehav
>> *Sent:* Sunday, November 06, 2016 11:57 PM
>> *To:* user@karaf.apache.org
>> *Subject:* Re: Apache Shiro in Karaf
>>
>>
>>
>> It seems like the webconsole is what causes me problems. If I install the
>> "webconsole" feature, then I'm prompted for basic authentication when I use
>> the anonymous filter in Shiro. If I do not install the "webconsole"
>> feature, then this doesn't happen. It seems like the webconsole installs
>> some filter that will kick in when I use the anonymous filter.
>>
>>
>>
>> Anyone has an idea about this?
>>
>>
>>
>> I guess as a workaround I'll have to skip the webconsole. Normally I
>> would like it installed though since it is very useful.
>>
>>
>>
>> /Bengt
>>
>>
>>
>> 2016-11-07 8:40 GMT+01:00 Bengt Rodehav <be...@rodehav.com>:
>>
>> Thanks for your reply Steinar,
>>
>>
>>
>> I think the difference is that you don't use the anonymous filter
>> (keyword "anon" in shiro.ini). I need to use that on a couple of pages that
>> need to be accessible by anyone without having to login.
>>
>>
>>
>> What happens if you try using "anon"? Will Karaf require basic
>> authentication?
>>
>>
>>
>> Note also that I have the Karaf web console installed. I think it might
>> interfere with this.
>>
>>
>>
>> /Bengt
>>
>>
>>
>> 2016-11-04 17:27 GMT+01:00 Steinar Bang <s...@dod.no>:
>>
>> >>>>> Bengt Rodehav <be...@rodehav.com>:
>>
>> > It seems that if I comment away the following line in
>> etc/system.properties
>> > then the basic authentication goes away:
>>
>> > *karaf.local.roles = admin,manager,viewer,systembundles*
>>
>> > Not sure how this works. Would appreciate if someone could explain.
>>
>> Except for the fact that one of my karaf installations is failing
>> mysteriously I have successfully used shiro basic authentication in
>> karaf.
>>
>> The changes were:
>>  1. Added the ShiroFilter to the web.xml of my webapp
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.bundle/src/main/webapp/WEB-INF/web.xml
>>  2. Added a shiro.ini file to the webapp
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.bundle/src/main/webapp/WEB-INF/shiro.ini
>>  3. Added a custom realm (maybe I can replace this by the JDBC realm...?
>> But I was trying out stuff and learning as I created it)
>>   https://github.com/steinarb/ukelonn/blob/using-primefaces/uk
>> elonn.bundle/src/main/java/no/priv/bang/ukelonn/impl/UkelonnRealm.java
>>  4. Added a redirection in the main JSF page redirecting the admins to a
>> different page (that's the preRenderView  tag)
>>   https://github.com/steinarb/ukelonn/blob/using-primefaces/uk
>> elonn.bundle/src/main/webapp/ukelonn.xhtml
>>  5. Added a redirect method to the bean serving the main JSF page
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.bundle/src/main/java/no/priv/bang/ukelonn/impl/Ukelon
>> nController.java
>>  6. Pulled in shiro-core and shiro-web as runtime dependencies
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.karaf/ukelonn/pom.xml
>>
>> And that was it, basically.
>>
>> I basically just followed the directions I found here, and adapted them
>> to a pax-web setting:
>>  http://balusc.omnifaces.org/2013/01/apache-shiro-is-it-read
>> y-for-java-ee-6.html
>>
>>
>>
>>
>>
>
>
>
> --
>
> Apache Member
> Apache Karaf <http://karaf.apache.org/> Committer & PMC
> OPS4J Pax Web <http://wiki.ops4j.org/display/paxweb/Pax+Web/> Committer &
> Project Lead
> blog <http://notizblog.nierbeck.de/>
> Co-Author of Apache Karaf Cookbook <http://bit.ly/1ps9rkS>
>
> Software Architect / Project Manager / Scrum Master
>
>


Re: Apache Shiro in Karaf

2016-11-06 Thread Bengt Rodehav
It seems like the webconsole is what causes me problems. If I install the
"webconsole" feature, then I'm prompted for basic authentication when I use
the anonymous filter in Shiro. If I do not install the "webconsole"
feature, then this doesn't happen. It seems like the webconsole installs
some filter that will kick in when I use the anonymous filter.
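That would be consistent with all HttpService registrations sharing the single root "/" context: a filter mapped with a trailing wildcard matches every alias beneath it. A toy sketch of servlet-style mapping (an illustration only, not Pax-Web or webconsole code):

```java
public class FilterMatchDemo {
    // Simplified servlet-style "/*" wildcard matching.
    static boolean matches(String mapping, String path) {
        if (mapping.endsWith("/*")) {
            String prefix = mapping.substring(0, mapping.length() - 2);
            return path.startsWith(prefix);
        }
        return mapping.equals(path);
    }

    public static void main(String[] args) {
        // A filter on the shared "/" context sees every registered alias:
        System.out.println(matches("/*", "/api/getCurrentUser")); // true
        // An isolated context only sees its own paths:
        System.out.println(matches("/my-war/*", "/api/getCurrentUser")); // false
    }
}
```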

Anyone has an idea about this?

I guess as a workaround I'll have to skip the webconsole. Normally I would
like it installed though since it is very useful.

/Bengt

2016-11-07 8:40 GMT+01:00 Bengt Rodehav <be...@rodehav.com>:

> Thanks for your reply Steinar,
>
> I think the difference is that you don't use the anonymous filter (keyword
> "anon" in shiro.ini). I need to use that on a couple of pages that need to
> be accessible by anyone without having to login.
>
> What happens if you try using "anon"? Will Karaf require basic
> authentication?
>
> Note also that I have the Karaf web console installed. I think it might
> interfere with this.
>
> /Bengt
>
> 2016-11-04 17:27 GMT+01:00 Steinar Bang <s...@dod.no>:
>
>> >>>>> Bengt Rodehav <be...@rodehav.com>:
>>
>> > It seems that if I comment away the following line in
>> etc/system.properties
>> > then the basic authentication goes away:
>>
>> > *karaf.local.roles = admin,manager,viewer,systembundles*
>>
>> > Not sure how this works. Would appreciate if someone could explain.
>>
>> Except for the fact that one of my karaf installations is failing
>> mysteriously I have successfully used shiro basic authentication in
>> karaf.
>>
>> The changes were:
>>  1. Added the ShiroFilter to the web.xml of my webapp
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.bundle/src/main/webapp/WEB-INF/web.xml
>>  2. Added a shiro.ini file to the webapp
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.bundle/src/main/webapp/WEB-INF/shiro.ini
>>  3. Added a custom realm (maybe I can replace this by the JDBC realm...?
>> But I was trying out stuff and learning as I created it)
>>   https://github.com/steinarb/ukelonn/blob/using-primefaces/uk
>> elonn.bundle/src/main/java/no/priv/bang/ukelonn/impl/UkelonnRealm.java
>>  4. Added a redirection in the main JSF page redirecting the admins to a
>> different page (that's the preRenderView  tag)
>>   https://github.com/steinarb/ukelonn/blob/using-primefaces/uk
>> elonn.bundle/src/main/webapp/ukelonn.xhtml
>>  5. Added a redirect method to the bean serving the main JSF page
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.bundle/src/main/java/no/priv/bang/ukelonn/impl/Ukelon
>> nController.java
>>  6. Pulled in shiro-core and shiro-web as runtime dependencies
>>  https://github.com/steinarb/ukelonn/blob/using-primefaces/u
>> kelonn.karaf/ukelonn/pom.xml
>>
>> And that was it, basically.
>>
>> I basically just followed the directions I found here, and adapted them
>> to a pax-web setting:
>>  http://balusc.omnifaces.org/2013/01/apache-shiro-is-it-read
>> y-for-java-ee-6.html
>>
>>
>


Re: Apache Shiro in Karaf

2016-11-06 Thread Bengt Rodehav
Thanks for your reply Steinar,

I think the difference is that you don't use the anonymous filter (keyword
"anon" in shiro.ini). I need to use that on a couple of pages that need to
be accessible by anyone without having to login.

What happens if you try using "anon"? Will Karaf require basic
authentication?

Note also that I have the Karaf web console installed. I think it might
interfere with this.

/Bengt

2016-11-04 17:27 GMT+01:00 Steinar Bang <s...@dod.no>:

> >>>>> Bengt Rodehav <be...@rodehav.com>:
>
> > It seems that if I comment away the following line in
> etc/system.properties
> > then the basic authentication goes away:
>
> > *karaf.local.roles = admin,manager,viewer,systembundles*
>
> > Not sure how this works. Would appreciate if someone could explain.
>
> Except for the fact that one of my karaf installations is failing
> mysteriously I have successfully used shiro basic authentication in
> karaf.
>
> The changes were:
>  1. Added the ShiroFilter to the web.xml of my webapp
>  https://github.com/steinarb/ukelonn/blob/using-primefaces/
> ukelonn.bundle/src/main/webapp/WEB-INF/web.xml
>  2. Added a shiro.ini file to the webapp
>  https://github.com/steinarb/ukelonn/blob/using-primefaces/
> ukelonn.bundle/src/main/webapp/WEB-INF/shiro.ini
>  3. Added a custom realm (maybe I can replace this by the JDBC realm...?
> But I was trying out stuff and learning as I created it)
>   https://github.com/steinarb/ukelonn/blob/using-primefaces/
> ukelonn.bundle/src/main/java/no/priv/bang/ukelonn/impl/UkelonnRealm.java
>  4. Added a redirection in the main JSF page redirecting the admins to a
> different page (that's the preRenderView  tag)
>   https://github.com/steinarb/ukelonn/blob/using-primefaces/
> ukelonn.bundle/src/main/webapp/ukelonn.xhtml
>  5. Added a redirect method to the bean serving the main JSF page
>  https://github.com/steinarb/ukelonn/blob/using-primefaces/
> ukelonn.bundle/src/main/java/no/priv/bang/ukelonn/impl/
> UkelonnController.java
>  6. Pulled in shiro-core and shiro-web as runtime dependencies
>  https://github.com/steinarb/ukelonn/blob/using-primefaces/
> ukelonn.karaf/ukelonn/pom.xml
>
> And that was it, basically.
>
> I basically just followed the directions I found here, and adapted them
> to a pax-web setting:
>  http://balusc.omnifaces.org/2013/01/apache-shiro-is-it-
> ready-for-java-ee-6.html
>
>


Re: Apache Shiro in Karaf

2016-11-04 Thread Bengt Rodehav
It seems that if I comment out the following line in etc/system.properties
then the basic authentication goes away:

*karaf.local.roles = admin,manager,viewer,systembundles*

Not sure how this works. Would appreciate if someone could explain.

/Bengt

2016-11-04 16:42 GMT+01:00 Bengt Rodehav <be...@rodehav.com>:

> Hi,
>
> I'm using Apache Shiro in Karaf 4.0.7. Not sure if the problem I have is a
> Karaf-related problem or just a Pax-Web-related problem, so I'm posting in
> both forums.
>
> Here is an extract of my Shiro ini file:
>
> [urls]
> /api/getCurrentUser = anon
> /login = authc
> /logout = logout
> /admin/** = authc
>
> The intention is that the first url (the one associated with "anon")
> should be accessible without the user being authenticated.
>
> When I deploy my application in Karaf, an HTTP status code 401 is returned
> and basic authentication is triggered in the browser. If I enter
> user=password=karaf then I get through.
>
> Does anyone have any idea why this happens? Is it so that if the url is
> not stopped by Shiro then it continues to a filter that Karaf/Pax-Web has
> set up that requires basic authentication?
>
> How can I get around this?
>
> /Bengt
>


Apache Shiro in Karaf

2016-11-04 Thread Bengt Rodehav
Hi,

I'm using Apache Shiro in Karaf 4.0.7. Not sure if the problem I have is a
Karaf-related problem or just a Pax-Web-related problem, so I'm posting in
both forums.

Here is an extract of my Shiro ini file:

[urls]
/api/getCurrentUser = anon
/login = authc
/logout = logout
/admin/** = authc

The intention is that the first url (the one associated with "anon") should
be accessible without the user being authenticated.

When I deploy my application in Karaf, an HTTP status code 401 is returned
and basic authentication is triggered in the browser. If I enter
user=password=karaf then I get through.

Does anyone have any idea why this happens? Is it so that if the url is not
stopped by Shiro then it continues to a filter that Karaf/Pax-Web has set
up that requires basic authentication?

How can I get around this?

/Bengt


Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-07-12 Thread Bengt Rodehav
OK - thanks for trying.

/Bengt

2016-07-12 11:12 GMT+02:00 Achim Nierbeck <bcanh...@googlemail.com>:

> Hi Bengt,
>
> sorry been very busy lately. I did give it a try but couldn't find a
> reason why Log4j2 should be used or Pax Logging should be triggered to be
> restarted. There isn't any reason for the pax web bundles to do so ... :/
> sorry didn't get any further on this ..
>
> regards, Achim
>
>
>
>
> 2016-07-11 14:47 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>
>> Did you have a chance to look at this Achim? If there is a problem with
>> the pax-jetty feature it would be nice to have it fixed in Karaf 4.0.6
>> which I understand is in the works.
>>
>> /Bengt
>>
>> 2016-07-07 9:13 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>
>>> OK - thanks Achim,
>>>
>>> /Bengt
>>>
>>> 2016-07-06 22:08 GMT+02:00 Achim Nierbeck <bcanh...@googlemail.com>:
>>>
>>>> Hi Bengt,
>>>>
>>>> I'll try to find out if one of the bundles in that feature depends on
>>>> log4j2 ... but I'm not aware of such a dependency.
>>>>
>>>> Your suspicion about dynamic loading of DLLs is correct when the DLL is
>>>> located inside a bundle and has dependencies on another DLL. If the DLL
>>>> is loaded via the root classloader, that shouldn't be an issue.
>>>>
>>>> regards, Achim
>>>>
>>>>
>>>> 2016-07-04 16:04 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>>>
>>>>> Another theory: Looking at the stack trace this seems to be triggered
>>>>> by a configuration update. Could the problem be that Pax-logging is trying
>>>>> to load the DLL again and failing since it is already loaded? Perhaps the
>>>>> initial load works but subsequent configuration updates does not?
>>>>>
>>>>> I tried to verify this...
>>>>>
>>>>> After successful start of Karaf (after step 9 in my previous post), I
>>>>> edit org.ops4j.pax.logging.cfg (by changing the root logger between INFO
>>>>> and DEBUG). This causes no error.
>>>>>
>>>>> But after having installed feature pax-jetty (after step 10 in my
>>>>> previous post), every change in org.ops4j.pax.logging.cfg causes the same
>>>>> error to appear (the stack trace included in my previous post).
>>>>>
>>>>> It's as if installing the pax-jetty feature gives control of
>>>>> org.ops4j.pax.logging.cfg to someone who cannot load the DLL. I have no
>>>>> idea how this could happen.
>>>>>
>>>>> Anyone else has an idea?
>>>>>
>>>>> /Bengt
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> /Bengt
>>>>>
>>>>> 2016-07-04 15:51 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>>>>
>>>>>> A theory: Could one of the bundles installed by feature pax-jetty be
>>>>>> using log4j 2.x directly without using Pax-logging? If so, would it too 
>>>>>> try
>>>>>> to read the log4j configuration file? I guess it would fail to load the 
>>>>>> DLL
>>>>>> since it is probably not compatible with log4j 2.x.
>>>>>>
>>>>>> Could this happen? If so, how can I find out which bundle?
>>>>>>
>>>>>> /Bengt
>>>>>>
>>>>>> 2016-07-04 15:15 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>>>>>
>>>>>>> Back to the Karaf mailing list
>>>>>>>
>>>>>>> I can actually get this problem on a standard vanilla Karaf 4.0.5.
>>>>>>> It seems to be triggered when installing the feature pax-jetty.
>>>>>>>
>>>>>>> *1. Install standard Karaf 4.0.5*
>>>>>>>
>>>>>>> *2. Replace org.ops4j.pax.logging.cfg with the following:*
>>>>>>>
>>>>>>> log4j.rootLogger=INFO, stdout
>>>>>>>
>>>>>>> # CONSOLE appender
>>>>>>> log4j.appender.stdout=org.apache.log4j.ConsoleAppender
>>>>>>> log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
>>>>>

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-07-11 Thread Bengt Rodehav
Did you have a chance to look at this Achim? If there is a problem with the
pax-jetty feature it would be nice to have it fixed in Karaf 4.0.6 which I
understand is in the works.

/Bengt

2016-07-07 9:13 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:

> OK - thanks Achim,
>
> /Bengt
>
> 2016-07-06 22:08 GMT+02:00 Achim Nierbeck <bcanh...@googlemail.com>:
>
>> Hi Bengt,
>>
>> I'll try to find out if one of the bundles in that feature depends on
>> log4j2 ... but I'm not aware of such a dependency.
>>
>> Your suspicion about dynamic loading of DLLs is correct when the DLL is
>> located inside a bundle and has dependencies on another DLL. If the DLL
>> is loaded via the root classloader, that shouldn't be an issue.
>>
>> regards, Achim
>>
>>
>> 2016-07-04 16:04 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>
>>> Another theory: Looking at the stack trace, this seems to be triggered by
>>> a configuration update. Could the problem be that Pax-logging is trying to
>>> load the DLL again and failing since it is already loaded? Perhaps the
>>> initial load works but subsequent configuration updates do not?
>>>
>>> I tried to verify this...
>>>
>>> After successful start of Karaf (after step 9 in my previous post), I
>>> edit org.ops4j.pax.logging.cfg (by changing the root logger between INFO
>>> and DEBUG). This causes no error.
>>>
>>> But after having installed feature pax-jetty (after step 10 in my
>>> previous post), every change in org.ops4j.pax.logging.cfg causes the same
>>> error to appear (the stack trace included in my previous post).
>>>
>>> It's as if installing the pax-jetty feature gives control of
>>> org.ops4j.pax.logging.cfg to someone who cannot load the DLL. I have no
>>> idea how this could happen.
>>>
>>> Does anyone else have an idea?
>>>
>>> /Bengt
>>>
>>> 2016-07-04 15:51 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>>
>>>> A theory: Could one of the bundles installed by feature pax-jetty be
>>>> using log4j 2.x directly without using Pax-logging? If so, would it too try
>>>> to read the log4j configuration file? I guess it would fail to load the DLL
>>>> since it is probably not compatible with log4j 2.x.
>>>>
>>>> Could this happen? If so, how can I find out which bundle?
>>>>
>>>> /Bengt
>>>>
>>>> 2016-07-04 15:15 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>>>
>>>>> Back to the Karaf mailing list
>>>>>
>>>>> I can actually get this problem on a standard vanilla Karaf 4.0.5. It
>>>>> seems to be triggered when installing the feature pax-jetty.
>>>>>
>>>>> *1. Install standard Karaf 4.0.5*
>>>>>
>>>>> *2. Replace org.ops4j.pax.logging.cfg with the following:*
>>>>>
>>>>> log4j.rootLogger=INFO, stdout
>>>>>
>>>>> # CONSOLE appender
>>>>> log4j.appender.stdout=org.apache.log4j.ConsoleAppender
>>>>> log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
>>>>> log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} | %-5.5p |
>>>>> %-16.16t | %-32.32c{1} | %-32.32C %4L | %m%n
>>>>> log4j.appender.stdout.threshold=ERROR
>>>>>
>>>>> # Windows event log
>>>>> log4j.appender.nteventlog=org.apache.log4j.nt.NTEventLogAppender
>>>>> log4j.appender.nteventlog.source=Test source
>>>>> log4j.appender.nteventlog.layout=org.apache.log4j.PatternLayout
>>>>> log4j.appender.nteventlog.layout.ConversionPattern=Time:
>>>>> %d{ISO8601}%n%nSeverity: %p%n%nThread: %t%n%n%m%n
>>>>> log4j.appender.nteventlog.threshold=DEBUG
>>>>>
>>>>> *3. Start Karaf: "bin\karaf clean"*
>>>>>
>>>>> This should work.
>>>>>
>>>>> *4. Exit Karaf*
>>>>>
>>>>> *5. Change the root logger line to:*
>>>>>
>>>>> log4j.rootLogger=INFO, stdout, nteventlog
>>>>>
>>>>> *6. Start Karaf again*
>>>>>
>>>>> I get the following error:
>>>>>
>>>>> 2016-07-04 15:05:39,53

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-07-07 Thread Bengt Rodehav
OK - thanks Achim,

/Bengt

2016-07-06 22:08 GMT+02:00 Achim Nierbeck <bcanh...@googlemail.com>:

> Hi Bengt,
>
> I'll try to find out if one of the bundles in that feature depends on
> log4j2 ... but I'm not aware of such a dependency.
>
> Your suspicion about dynamic loading of DLLs is correct when the
> DLL is located inside a bundle and has dependencies on
> another DLL. If it's a DLL loaded via the root classloader, that shouldn't
> be an issue.
>
> regards, Achim
>
>
> 2016-07-04 16:04 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>
>> Another theory: Looking at the stack trace, this seems to be triggered by
>> a configuration update. Could the problem be that Pax-logging is trying to
>> load the DLL again and failing since it is already loaded? Perhaps the
>> initial load works but subsequent configuration updates do not?
>>
>> I tried to verify this...
>>
>> After successful start of Karaf (after step 9 in my previous post), I
>> edit org.ops4j.pax.logging.cfg (by changing the root logger between INFO
>> and DEBUG). This causes no error.
>>
>> But after having installed feature pax-jetty (after step 10 in my
>> previous post), every change in org.ops4j.pax.logging.cfg causes the same
>> error to appear (the stack trace included in my previous post).
>>
>> It's as if installing the pax-jetty feature gives control of
>> org.ops4j.pax.logging.cfg to someone who cannot load the DLL. I have no
>> idea how this could happen.
>>
>> Does anyone else have an idea?
>>
>> /Bengt
>>
>> 2016-07-04 15:51 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>
>>> A theory: Could one of the bundles installed by feature pax-jetty be
>>> using log4j 2.x directly without using Pax-logging? If so, would it too try
>>> to read the log4j configuration file? I guess it would fail to load the DLL
>>> since it is probably not compatible with log4j 2.x.
>>>
>>> Could this happen? If so, how can I find out which bundle?
>>>
>>> /Bengt
>>>
>>> 2016-07-04 15:15 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>>
>>>> Back to the Karaf mailing list
>>>>
>>>> I can actually get this problem on a standard vanilla Karaf 4.0.5. It
>>>> seems to be triggered when installing the feature pax-jetty.
>>>>
>>>> *1. Install standard Karaf 4.0.5*
>>>>
>>>> *2. Replace org.ops4j.pax.logging.cfg with the following:*
>>>>
>>>> log4j.rootLogger=INFO, stdout
>>>>
>>>> # CONSOLE appender
>>>> log4j.appender.stdout=org.apache.log4j.ConsoleAppender
>>>> log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
>>>> log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} | %-5.5p |
>>>> %-16.16t | %-32.32c{1} | %-32.32C %4L | %m%n
>>>> log4j.appender.stdout.threshold=ERROR
>>>>
>>>> # Windows event log
>>>> log4j.appender.nteventlog=org.apache.log4j.nt.NTEventLogAppender
>>>> log4j.appender.nteventlog.source=Test source
>>>> log4j.appender.nteventlog.layout=org.apache.log4j.PatternLayout
>>>> log4j.appender.nteventlog.layout.ConversionPattern=Time:
>>>> %d{ISO8601}%n%nSeverity: %p%n%nThread: %t%n%n%m%n
>>>> log4j.appender.nteventlog.threshold=DEBUG
>>>>
>>>> *3. Start Karaf: "bin\karaf clean"*
>>>>
>>>> This should work.
>>>>
>>>> *4. Exit Karaf*
>>>>
>>>> *5. Change the root logger line to:*
>>>>
>>>> log4j.rootLogger=INFO, stdout, nteventlog
>>>>
>>>> *6. Start Karaf again*
>>>>
>>>> I get the following error:
>>>>
>>>> 2016-07-04 15:05:39,534 | ERROR | s4j.pax.logging) | configadmin
>>>>| ?? | [org.osgi.service.log.LogService,
>>>> org.knopflerfish.service.log.LogService,
>>>> org.ops4j.pax.logging.PaxLoggingService,
>>>> org.osgi.service.cm.ManagedService, id=12,
>>>> bundle=6/mvn:org.ops4j.pax.logging/pax-logging-service/1.8.5]: Unexpected
>>>> problem updating configuration org.ops4j.pax.logging
>>>> java.lang.UnsatisfiedLinkError: no NTEventLogAppender in
>>>> java.library.path
>>>> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)
>>>> at java.la

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-07-04 Thread Bengt Rodehav
Another theory: Looking at the stack trace, this seems to be triggered by a
configuration update. Could the problem be that Pax-logging is trying to
load the DLL again and failing since it is already loaded? Perhaps the
initial load works but subsequent configuration updates do not?

I tried to verify this...

After successful start of Karaf (after step 9 in my previous post), I edit
org.ops4j.pax.logging.cfg (by changing the root logger between INFO and
DEBUG). This causes no error.

But after having installed feature pax-jetty (after step 10 in my previous
post), every change in org.ops4j.pax.logging.cfg causes the same error to
appear (the stack trace included in my previous post).

It's as if installing the pax-jetty feature gives control of
org.ops4j.pax.logging.cfg to someone who cannot load the DLL. I have no
idea how this could happen.

Does anyone else have an idea?

/Bengt

2016-07-04 15:51 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:

> A theory: Could one of the bundles installed by feature pax-jetty be using
> log4j 2.x directly without using Pax-logging? If so, would it too try to
> read the log4j configuration file? I guess it would fail to load the DLL
> since it is probably not compatible with log4j 2.x.
>
> Could this happen? If so, how can I find out which bundle?
>
> /Bengt
>
> 2016-07-04 15:15 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>
>> Back to the Karaf mailing list
>>
>> I can actually get this problem on a standard vanilla Karaf 4.0.5. It
>> seems to be triggered when installing the feature pax-jetty.
>>
>> *1. Install standard Karaf 4.0.5*
>>
>> *2. Replace org.ops4j.pax.logging.cfg with the following:*
>>
>> log4j.rootLogger=INFO, stdout
>>
>> # CONSOLE appender
>> log4j.appender.stdout=org.apache.log4j.ConsoleAppender
>> log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
>> log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} | %-5.5p |
>> %-16.16t | %-32.32c{1} | %-32.32C %4L | %m%n
>> log4j.appender.stdout.threshold=ERROR
>>
>> # Windows event log
>> log4j.appender.nteventlog=org.apache.log4j.nt.NTEventLogAppender
>> log4j.appender.nteventlog.source=Test source
>> log4j.appender.nteventlog.layout=org.apache.log4j.PatternLayout
>> log4j.appender.nteventlog.layout.ConversionPattern=Time:
>> %d{ISO8601}%n%nSeverity: %p%n%nThread: %t%n%n%m%n
>> log4j.appender.nteventlog.threshold=DEBUG
>>
>> *3. Start Karaf: "bin\karaf clean"*
>>
>> This should work.
>>
>> *4. Exit Karaf*
>>
>> *5. Change the root logger line to:*
>>
>> log4j.rootLogger=INFO, stdout, nteventlog
>>
>> *6. Start Karaf again*
>>
>> I get the following error:
>>
>> 2016-07-04 15:05:39,534 | ERROR | s4j.pax.logging) | configadmin
>>  | ?? | [org.osgi.service.log.LogService,
>> org.knopflerfish.service.log.LogService,
>> org.ops4j.pax.logging.PaxLoggingService,
>> org.osgi.service.cm.ManagedService, id=12,
>> bundle=6/mvn:org.ops4j.pax.logging/pax-logging-service/1.8.5]: Unexpected
>> problem updating configuration org.ops4j.pax.logging
>> java.lang.UnsatisfiedLinkError: no NTEventLogAppender in java.library.path
>> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)
>> at java.lang.Runtime.loadLibrary0(Runtime.java:870)
>> at java.lang.System.loadLibrary(System.java:1122)
>> at
>> org.apache.log4j.nt.NTEventLogAppender.(NTEventLogAppender.java:179)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>> at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at java.lang.Class.newInstance(Class.java:442)
>> at
>> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:336)
>> at
>> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:123)
>> at
>> org.apache.log4j.PaxLoggingConfigurator.parseAppender(PaxLoggingConfigurator.java:97)
>> at
>> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)
>> at
>> org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)
>> at
>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyCo

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-07-04 Thread Bengt Rodehav
A theory: Could one of the bundles installed by feature pax-jetty be using
log4j 2.x directly without using Pax-logging? If so, would it too try to
read the log4j configuration file? I guess it would fail to load the DLL
since it is probably not compatible with log4j 2.x.

Could this happen? If so, how can I find out which bundle?

/Bengt

2016-07-04 15:15 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:

> Back to the Karaf mailing list
>
> I can actually get this problem on a standard vanilla Karaf 4.0.5. It
> seems to be triggered when installing the feature pax-jetty.
>
> *1. Install standard Karaf 4.0.5*
>
> *2. Replace org.ops4j.pax.logging.cfg with the following:*
>
> log4j.rootLogger=INFO, stdout
>
> # CONSOLE appender
> log4j.appender.stdout=org.apache.log4j.ConsoleAppender
> log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
> log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} | %-5.5p |
> %-16.16t | %-32.32c{1} | %-32.32C %4L | %m%n
> log4j.appender.stdout.threshold=ERROR
>
> # Windows event log
> log4j.appender.nteventlog=org.apache.log4j.nt.NTEventLogAppender
> log4j.appender.nteventlog.source=Test source
> log4j.appender.nteventlog.layout=org.apache.log4j.PatternLayout
> log4j.appender.nteventlog.layout.ConversionPattern=Time:
> %d{ISO8601}%n%nSeverity: %p%n%nThread: %t%n%n%m%n
> log4j.appender.nteventlog.threshold=DEBUG
>
> *3. Start Karaf: "bin\karaf clean"*
>
> This should work.
>
> *4. Exit Karaf*
>
> *5. Change the root logger line to:*
>
> log4j.rootLogger=INFO, stdout, nteventlog
>
> *6. Start Karaf again*
>
> I get the following error:
>
> 2016-07-04 15:05:39,534 | ERROR | s4j.pax.logging) | configadmin
>| ?? | [org.osgi.service.log.LogService,
> org.knopflerfish.service.log.LogService,
> org.ops4j.pax.logging.PaxLoggingService,
> org.osgi.service.cm.ManagedService, id=12,
> bundle=6/mvn:org.ops4j.pax.logging/pax-logging-service/1.8.5]: Unexpected
> problem updating configuration org.ops4j.pax.logging
> java.lang.UnsatisfiedLinkError: no NTEventLogAppender in java.library.path
> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)
> at java.lang.Runtime.loadLibrary0(Runtime.java:870)
> at java.lang.System.loadLibrary(System.java:1122)
> at
> org.apache.log4j.nt.NTEventLogAppender.(NTEventLogAppender.java:179)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at java.lang.Class.newInstance(Class.java:442)
> at
> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:336)
> at
> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:123)
> at
> org.apache.log4j.PaxLoggingConfigurator.parseAppender(PaxLoggingConfigurator.java:97)
> at
> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)
> at
> org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)
> at
> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)
> at
> org.apache.log4j.PaxLoggingConfigurator.doConfigure(PaxLoggingConfigurator.java:72)
> at
> org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl.updated(PaxLoggingServiceImpl.java:214)
> at
> org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl$1ManagedPaxLoggingService.updated(PaxLoggingServiceImpl.java:362)
> at
> org.apache.felix.cm.impl.helper.ManagedServiceTracker.updated(ManagedServiceTracker.java:189)
> at
> org.apache.felix.cm.impl.helper.ManagedServiceTracker.updateService(ManagedServiceTracker.java:152)
> at
> org.apache.felix.cm.impl.helper.ManagedServiceTracker.provideConfiguration(ManagedServiceTracker.java:85)
> at
> org.apache.felix.cm.impl.ConfigurationManager$UpdateConfiguration.run(ConfigurationManager.java:1753)
> at
> org.apache.felix.cm.impl.UpdateThread.run0(UpdateThread.java:143)
> at org.apache.felix.cm.impl.UpdateThread.run(UpdateThread.java:110)
> at java.lang.Thread.run(Thread.java:745)
>
> This makes sense since I haven't provided the DLL yet.
>
> *7. Exit Karaf*
>
> *8. Put the file NTEventLogAppender.amd64.dll in KARAF_HOME/lib (I attach
> the file for 64 bit Windows)*
>
> *9. Start Karaf again*
>

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-07-01 Thread Bengt Rodehav
52)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.helper.ManagedServiceTracker.provideConfiguration(ManagedServiceTracker.java:85)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.ConfigurationManager$UpdateConfiguration.run(ConfigurationManager.java:1753)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.UpdateThread.run0(UpdateThread.java:143)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.UpdateThread.run(UpdateThread.java:110)[7:org.apache.felix.configadmin:1.8.8]
at java.lang.Thread.run(Thread.java:745)[:1.8.0_74]
ERROR: Bundle org.apache.karaf.features.core [9] Error starting
mvn:org.apache.karaf.features/org.apache.karaf.features.core/4.0.5
(org.osgi.framework.BundleException: Unable to resolve
org.apache.karaf.features.core [9](R 9.0): missing requirement
[org.apache.karaf.features.core [9](R 9.0)] osgi.wiring.package;
(&(osgi.wiring.package=org.ops4j.pax.url.mvn)(version>=2.4.0)(!(version>=3.0.0)))
Unresolved requirements: [[org.apache.karaf.features.core [9](R 9.0)]
osgi.wiring.package;
(&(osgi.wiring.package=org.ops4j.pax.url.mvn)(version>=2.4.0)(!(version>=3.0.0)))])org.osgi.framework.BundleException:
Unable to resolve org.apache.karaf.features.core [9](R 9.0): missing
requirement [org.apache.karaf.features.core [9](R 9.0)]
osgi.wiring.package;
(&(osgi.wiring.package=org.ops4j.pax.url.mvn)(version>=2.4.0)(!(version>=3.0.0)))
Unresolved requirements: [[org.apache.karaf.features.core [9](R 9.0)]
osgi.wiring.package;
(&(osgi.wiring.package=org.ops4j.pax.url.mvn)(version>=2.4.0)(!(version>=3.0.0)))]
at
org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:4111)
at org.apache.felix.framework.Felix.startBundle(Felix.java:2117)
at
org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1371)
at
org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:308)
at java.lang.Thread.run(Thread.java:745)

So, the DLL seems to be loaded and logging works with Pax-logging 1.8.1
but not with Pax-logging 1.8.5.

I will re-post this conversation to the OPS4J mailing list.

/Bengt

2016-07-01 8:55 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:

> OK - I guess I misunderstood this then.
>
> Looking in the POM's I now see that there are dependencies to both log4j
> 1.2.16 and log4j 2.x.
>
> I wonder then why the NTEventLogAppender can't be used in Karaf 4.0.5.
> For a while I thought it might be a java version problem. I now use Java 8
> instead of Java 7 like I did before. But even if I run Karaf 4.0.5 using
> Java 7 I still get the same problem.
>
> I will try to use Karaf 4.0.5 with Pax-logging 1.8.1 to see if it makes
> any difference. What is the best way to accomplish that?
>
> /Bengt
>
> 2016-06-30 16:54 GMT+02:00 Achim Nierbeck <bcanh...@googlemail.com>:
>
>> Hi Bengt,
>>
>> newer versions of Pax-Logging don't use log4j2 by default, so this should
>> still work ...
>> the underlying impl is still log4j 1 unless someone changed it in a minor
>> version update ...
>>
>> regards, Achim
>>
>>
>> 2016-06-30 16:23 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>>
>>> Thanks JB,
>>>
>>> Tried it, though, and it made no difference.
>>>
>>> When investigating this, it seems like newer versions of pax-logging use
>>> log4j2. Unfortunately the NTEventLogAppender is incompatible with
>>> log4j2.
>>>
>>> I've found the project log4jna that seems to target this. Unfortunately
>>> I cannot find a released version that supports log4j2.
>>>
>>> Anyone else encountered this?
>>>
>>> /Bengt
>>>
>>> 2016-06-30 14:48 GMT+02:00 Jean-Baptiste Onofré <j...@nanthrax.net>:
>>>
>>>> In Karaf 4, the dll should go in lib/ext.
>>>>
>>>> Regards
>>>> JB
>>>>
>>>> On 06/30/2016 02:16 PM, Bengt Rodehav wrote:
>>>>
>>>>> I have a feeling that I need to put the NTEventLogAppender.amd64.dll in
>>>>> another directory in Karaf 4.0.5 than in Karaf 2.4.1.
>>>>>
>>>>> I have always put it in the directory %KARAF_HOME%/lib which works for
>>>>> Karaf 2.4.1. Where should DLL's be put in Karaf 4.0.5?
>>>>>
>>>>> /Bengt
>>>>>
>>>>> 2016-06-29 17:37 GMT+02:00 Bengt Rodehav <be...@rodehav.com
>>>>> <mailto:be...@rodehav.com>>:
>>>>>
>>>>>
>>>>> I'm trying to upgrade from Karaf 2.4.1 to

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-07-01 Thread Bengt Rodehav
OK - I guess I misunderstood this then.

Looking in the POM's I now see that there are dependencies to both log4j
1.2.16 and log4j 2.x.

I wonder then why the NTEventLogAppender can't be used in Karaf 4.0.5. For
a while I thought it might be a Java version problem, since I now use Java 8
instead of Java 7, which I used before. But even if I run Karaf 4.0.5 using
Java 7, I still get the same problem.

I will try to use Karaf 4.0.5 with Pax-logging 1.8.1 to see if it makes any
difference. What is the best way to accomplish that?

/Bengt

2016-06-30 16:54 GMT+02:00 Achim Nierbeck <bcanh...@googlemail.com>:

> Hi Bengt,
>
> newer versions of Pax-Logging don't use log4j2 by default, so this should
> still work ...
> the underlying impl is still log4j 1 unless someone changed it in a minor
> version update ...
>
> regards, Achim
>
>
> 2016-06-30 16:23 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:
>
>> Thanks JB,
>>
>> Tried it, though, and it made no difference.
>>
>> When investigating this, it seems like newer versions of pax-logging use
>> log4j2. Unfortunately the NTEventLogAppender is incompatible with log4j2.
>>
>> I've found the project log4jna that seems to target this. Unfortunately I
>> cannot find a released version that supports log4j2.
>>
>> Anyone else encountered this?
>>
>> /Bengt
>>
>> 2016-06-30 14:48 GMT+02:00 Jean-Baptiste Onofré <j...@nanthrax.net>:
>>
>>> In Karaf 4, the dll should go in lib/ext.
>>>
>>> Regards
>>> JB
>>>
>>> On 06/30/2016 02:16 PM, Bengt Rodehav wrote:
>>>
>>>> I have a feeling that I need to put the NTEventLogAppender.amd64.dll in
>>>> another directory in Karaf 4.0.5 than in Karaf 2.4.1.
>>>>
>>>> I have always put it in the directory %KARAF_HOME%/lib which works for
>>>> Karaf 2.4.1. Where should DLL's be put in Karaf 4.0.5?
>>>>
>>>> /Bengt
>>>>
>>>> 2016-06-29 17:37 GMT+02:00 Bengt Rodehav <be...@rodehav.com
>>>> <mailto:be...@rodehav.com>>:
>>>>
>>>>
>>>> I'm trying to upgrade from Karaf 2.4.1 to 4.0.5 and I run into
>>>> problems regarding NTEventLogAppender. I get the following on
>>>> startup:
>>>>
>>>> 2016-06-29 17:16:05,354 | ERROR | 4j.pax.logging]) | configadmin
>>>>   | ?
>>>>  ? | [org.osgi.service.log.LogService,
>>>> org.knopflerfish.service.log.LogService,
>>>> org.ops4j.pax.logging.PaxLoggingService,
>>>> org.osgi.service.cm.ManagedService, id=34,
>>>> bundle=6/mvn:org.ops4j.pax.logging/pax-logging-service/1.8.5]:
>>>> Unexpected problem updating configuration org.ops4j.pax.logging
>>>> java.lang.UnsatisfiedLinkError: no NTEventLogAppender in
>>>> java.library.path
>>>>  at
>>>> java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)[:1.8.0_74]
>>>>  at
>>>> java.lang.Runtime.loadLibrary0(Runtime.java:870)[:1.8.0_74]
>>>>  at
>>>> java.lang.System.loadLibrary(System.java:1122)[:1.8.0_74]
>>>>  at
>>>>
>>>> org.apache.log4j.nt.NTEventLogAppender.(NTEventLogAppender.java:179)
>>>>  at
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>> Method)[:1.8.0_74]
>>>>  at
>>>>
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)[:1.8.0_74]
>>>>  at
>>>>
>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)[:1.8.0_74]
>>>>  at
>>>>
>>>> java.lang.reflect.Constructor.newInstance(Constructor.java:423)[:1.8.0_74]
>>>>  at java.lang.Class.newInstance(Class.java:442)[:1.8.0_74]
>>>>  at
>>>>
>>>> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:336)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>>>  at
>>>>
>>>> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:123)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>>>  at
>>>>
>>>> org.apache.log4j.PaxLoggingConfigurator.parseAppender(PaxLoggingConfigurator

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-06-30 Thread Bengt Rodehav
Thanks JB,

I tried it, though, and it made no difference.

When investigating this, it seems like newer versions of pax-logging use
log4j2. Unfortunately the NTEventLogAppender is incompatible with log4j2.

I've found the project log4jna that seems to target this. Unfortunately I
cannot find a released version that supports log4j2.

Anyone else encountered this?

/Bengt

2016-06-30 14:48 GMT+02:00 Jean-Baptiste Onofré <j...@nanthrax.net>:

> In Karaf 4, the dll should go in lib/ext.
>
> Regards
> JB
>
> On 06/30/2016 02:16 PM, Bengt Rodehav wrote:
>
>> I have a feeling that I need to put the NTEventLogAppender.amd64.dll in
>> another directory in Karaf 4.0.5 than in Karaf 2.4.1.
>>
>> I have always put it in the directory %KARAF_HOME%/lib which works for
>> Karaf 2.4.1. Where should DLL's be put in Karaf 4.0.5?
>>
>> /Bengt
>>
>> 2016-06-29 17:37 GMT+02:00 Bengt Rodehav <be...@rodehav.com
>> <mailto:be...@rodehav.com>>:
>>
>>
>> I'm trying to upgrade from Karaf 2.4.1 to 4.0.5 and I run into
>> problems regarding NTEventLogAppender. I get the following on startup:
>>
>> 2016-06-29 17:16:05,354 | ERROR | 4j.pax.logging]) | configadmin
>>   | ?
>>  ? | [org.osgi.service.log.LogService,
>> org.knopflerfish.service.log.LogService,
>> org.ops4j.pax.logging.PaxLoggingService,
>> org.osgi.service.cm.ManagedService, id=34,
>> bundle=6/mvn:org.ops4j.pax.logging/pax-logging-service/1.8.5]:
>> Unexpected problem updating configuration org.ops4j.pax.logging
>> java.lang.UnsatisfiedLinkError: no NTEventLogAppender in
>> java.library.path
>>  at
>> java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)[:1.8.0_74]
>>  at
>> java.lang.Runtime.loadLibrary0(Runtime.java:870)[:1.8.0_74]
>>  at java.lang.System.loadLibrary(System.java:1122)[:1.8.0_74]
>>  at
>>
>> org.apache.log4j.nt.NTEventLogAppender.(NTEventLogAppender.java:179)
>>  at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)[:1.8.0_74]
>>  at
>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)[:1.8.0_74]
>>  at
>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)[:1.8.0_74]
>>  at
>>
>> java.lang.reflect.Constructor.newInstance(Constructor.java:423)[:1.8.0_74]
>>  at java.lang.Class.newInstance(Class.java:442)[:1.8.0_74]
>>  at
>>
>> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:336)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:123)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.apache.log4j.PaxLoggingConfigurator.parseAppender(PaxLoggingConfigurator.java:97)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:639)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:504)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.apache.log4j.PaxLoggingConfigurator.doConfigure(PaxLoggingConfigurator.java:72)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl.updated(PaxLoggingServiceImpl.java:214)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl$1ManagedPaxLoggingService.updated(PaxLoggingServiceImpl.java:362)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
>>  at
>>
>> org.apache.felix.cm.impl.helper.ManagedServiceTracker.updated(ManagedServiceTracker.java:189)[7:org.apache.felix.configadmin:1.8.8]
>>  at
>>
>> org.apache.felix.cm.impl.helper.ManagedServiceTracker.updateService(ManagedServiceTracker.java:152)[7:org.apache.felix.configadmin:1.8.8]
>>  at
>>
>> org.apache.felix.cm.impl.helper.ManagedServiceTracker.provideConfiguration(M

Re: Different log level for different Karaf Bundles

2016-06-30 Thread Bengt Rodehav
You can do this by using MDC combined with filters (I implemented that in
Pax logging a few years back).

E.g., if you use this root logger:

log4j.rootLogger=INFO, stdout, info, error, bundle, context, osgi:*

And you define the "bundle" log as follows:

log4j.appender.bundle=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.bundle.key=bundle.name
log4j.appender.bundle.default=karaf
log4j.appender.bundle.appender=org.apache.log4j.RollingFileAppender
log4j.appender.bundle.appender.MaxFileSize=1MB
log4j.appender.bundle.appender.MaxBackupIndex=2
log4j.appender.bundle.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.bundle.appender.layout.ConversionPattern=%d{ISO8601} |
%-5.5p | %-16.16t | %-32.32c{1} | %-32.32C %4L | %m%n
log4j.appender.bundle.appender.file=${logdir}/bundles/$\\{bundle.name\\}.log
log4j.appender.bundle.appender.append=true
log4j.appender.bundle.threshold=INFO

You will end up with a separate log file per bundle (named after the
bundle). I use a custom variable (${logdir}) to specify where to
create the log file, but you can do as you wish. In this case, these log
files will be at INFO level.

Sometimes I want TRACE logging on a specific bundle. I can then do as
follows:

log4j.rootLogger=TRACE, stdout, info, error, bundle, context, osgi:*,
bundle_trace

log4j.appender.bundle_trace=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.bundle_trace.key=bundle.name
log4j.appender.bundle_trace.default=karaf
log4j.appender.bundle_trace.appender=org.apache.log4j.RollingFileAppender
log4j.appender.bundle_trace.appender.MaxFileSize=10MB
log4j.appender.bundle_trace.appender.MaxBackupIndex=2
log4j.appender.bundle_trace.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.bundle_trace.appender.layout.ConversionPattern=%d{ISO8601} |
%-5.5p | %-16.16t | %-32.32c{1} | %-32.32C %4L | %m%n
log4j.appender.bundle_trace.appender.file=${logdir}/bundles/trace/$\\{bundle.name\\}.log
log4j.appender.bundle_trace.appender.append=true
log4j.appender.bundle_trace.threshold=TRACE
log4j.appender.bundle_trace.filter.a=org.apache.log4j.filter.MDCMatchFilter
log4j.appender.bundle_trace.filter.a.exactMatch=false
log4j.appender.bundle_trace.filter.a.keyToMatch=bundle.name
log4j.appender.bundle_trace.filter.a.valueToMatch=org.apache.aries.blueprint.core
# DenyAllFilter should always be the last filter
log4j.appender.bundle_trace.filter.z=org.apache.log4j.varia.DenyAllFilter

In the above example I create a separate TRACE log for the bundle with the
name "org.apache.aries.blueprint.core".

It is also possible to configure custom logging for a particular Camel
context, which we do in our integration platform based on Karaf and Camel.
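The filter chain above can be sketched in plain Java (a simplified stand-in of my own, not the actual Pax Logging MDCMatchFilter implementation): an event passes the bundle_trace appender only when the MDC entry for bundle.name equals the configured valueToMatch; everything else falls through to the final DenyAllFilter.

```java
import java.util.HashMap;
import java.util.Map;

// Conceptual sketch of the MDC-match decision, mirroring the
// filter.a / filter.z configuration shown above.
public class MdcMatchSketch {

    // Accept an event only when the MDC entry for keyToMatch equals valueToMatch;
    // a non-match corresponds to falling through to DenyAllFilter.
    public static boolean accepts(Map<String, String> mdc,
                                  String keyToMatch, String valueToMatch) {
        return valueToMatch.equals(mdc.get(keyToMatch));
    }

    public static void main(String[] args) {
        // Pax Logging sets bundle.name in the MDC for each logging bundle.
        Map<String, String> mdc = new HashMap<>();
        mdc.put("bundle.name", "org.apache.aries.blueprint.core");
        System.out.println(accepts(mdc, "bundle.name", "org.apache.aries.blueprint.core"));
        System.out.println(accepts(mdc, "bundle.name", "some.other.bundle"));
    }
}
```

The DenyAllFilter at the end matters: without it, events that fail the MDC match would still be logged by the appender's default accept behavior.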

/Bengt

2016-06-30 13:59 GMT+02:00 Jean-Baptiste Onofré :

> Then it's different sift appenders that you have to define.
>
> Generally speaking, you don't need sift for what you want: if your bundles
> use different loggers, then, just create the logger category in the
> pax-logging config.
>
> Regards
> JB
>
> On 06/30/2016 01:56 PM, Debraj Manna wrote:
>
>>
>> Yeah if I enable sifting appender let's say with a config  and add it to
>> rootLogger
>>
>> log4j.appender.sift.threshold=DEBUG
>>
>>
>> Then this will make the log level DEBUG for all bundles. What I am trying
>> to ask is: let's say I have two bundles, bundle1 and bundle2, and I want
>> bundle1's log level to be DEBUG and bundle2's log level to be ERROR.
>>
>>
>> On Thu, Jun 30, 2016 at 2:12 PM, Jean-Baptiste Onofré wrote:
>>
>> Hi,
>>
>> I don't see the sift appender enable for the root logger.
>>
>> You should have:
>>
>> log4j.rootLogger=DEBUG, async, sift, osgi:*
>>
>> Regards
>> JB
>>
>> On 06/30/2016 08:23 AM, Debraj Manna wrote:
>>
>> In Karaf 3.0.5 running under Servicemix 6.1.0 my
>> org.ops4j.pax.logging.cfg looks like below:
>>
>> # Root logger
>> log4j.rootLogger=DEBUG, async, osgi:*
>> log4j.throwableRenderer=org.apache.log4j.OsgiThrowableRenderer
>> # To avoid flooding the log when using DEBUG level on an ssh connection
>> # and doing log:tail
>> log4j.logger.org.apache.sshd.server.channel.ChannelSession = INFO
>> # CONSOLE appender not used by default
>> log4j.appender.stdout=org.apache.log4j.ConsoleAppender
>> log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
>> log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} | %-5.5p |
>> %-16.16t | %-32.32c{1} | %X{bundle.id} - %X{bundle.name} -
>> %X{bundle.version} | %X | %m%n
>> # File appender
>> log4j.appender.out=org.apache.log4j.RollingFileAppender
>> log4j.appender.out.layout=org.apache.log4j.PatternLayout
>> log4j.appender.out.layout.ConversionPattern=%d{ISO8601} | %-5.5p |

Re: Log4j NTEventLogAppender in Karaf 4.0.5

2016-06-30 Thread Bengt Rodehav
I have a feeling that I need to put the NTEventLogAppender.amd64.dll in
another directory in Karaf 4.0.5 than in Karaf 2.4.1.

I have always put it in the directory %KARAF_HOME%/lib, which works for
Karaf 2.4.1. Where should DLLs be put in Karaf 4.0.5?
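As a generic JVM check (not Karaf-specific advice): the UnsatisfiedLinkError in the quoted mail below means the JVM could not find the DLL on java.library.path, so printing that property from the running process shows exactly which directories are searched:

```java
// Print the directories the JVM searches when System.loadLibrary() tries
// to resolve a native library such as NTEventLogAppender.dll.
public class LibPathDemo {
    public static void main(String[] args) {
        String path = System.getProperty("java.library.path", "");
        System.out.println("java.library.path set: " + !path.isEmpty());
        for (String entry : path.split(java.io.File.pathSeparator)) {
            System.out.println("  searched: " + entry);
        }
    }
}
```

If the directory holding the DLL is not in that list, the load will fail no matter where the file sits inside the Karaf installation.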

/Bengt

2016-06-29 17:37 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:

> I'm trying to upgrade from Karaf 2.4.1 to 4.0.5 and I run into problems
> regarding NTEventLogAppender. I get the following on startup:
>
> 2016-06-29 17:16:05,354 | ERROR | 4j.pax.logging]) | configadmin
>| ?
> ? | [org.osgi.service.log.LogService,
> org.knopflerfish.service.log.LogService,
> org.ops4j.pax.logging.PaxLoggingService,
> org.osgi.service.cm.ManagedService, id=34,
> bundle=6/mvn:org.ops4j.pax.logging/pax-logging-service/1.8.5]: Unexpected
> problem updating configuration org.ops4j.pax.logging
> java.lang.UnsatisfiedLinkError: no NTEventLogAppender in java.library.path
> at
> java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)[:1.8.0_74]
> at java.lang.Runtime.loadLibrary0(Runtime.java:870)[:1.8.0_74]
> at java.lang.System.loadLibrary(System.java:1122)[:1.8.0_74]
> at
> org.apache.log4j.nt.NTEventLogAppender.<init>(NTEventLogAppender.java:179)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)[:1.8.0_74]
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)[:1.8.0_74]
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)[:1.8.0_74]
> at
> java.lang.reflect.Constructor.newInstance(Constructor.java:423)[:1.8.0_74]
> at java.lang.Class.newInstance(Class.java:442)[:1.8.0_74]
> at
> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:336)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:123)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.apache.log4j.PaxLoggingConfigurator.parseAppender(PaxLoggingConfigurator.java:97)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:639)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:504)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.apache.log4j.PaxLoggingConfigurator.doConfigure(PaxLoggingConfigurator.java:72)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl.updated(PaxLoggingServiceImpl.java:214)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl$1ManagedPaxLoggingService.updated(PaxLoggingServiceImpl.java:362)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
> at
> org.apache.felix.cm.impl.helper.ManagedServiceTracker.updated(ManagedServiceTracker.java:189)[7:org.apache.felix.configadmin:1.8.8]
> at
> org.apache.felix.cm.impl.helper.ManagedServiceTracker.updateService(ManagedServiceTracker.java:152)[7:org.apache.felix.configadmin:1.8.8]
> at
> org.apache.felix.cm.impl.helper.ManagedServiceTracker.provideConfiguration(ManagedServiceTracker.java:85)[7:org.apache.felix.configadmin:1.8.8]
> at
> org.apache.felix.cm.impl.ConfigurationManager$ManagedServiceUpdate.provide(ConfigurationManager.java:1444)[7:org.apache.felix.configadmin:1.8.8]
> at
> org.apache.felix.cm.impl.ConfigurationManager$ManagedServiceUpdate.run(ConfigurationManager.java:1400)[7:org.apache.felix.configadmin:1.8.8]
> at
> org.apache.felix.cm.impl.UpdateThread.run0(UpdateThread.java:143)[7:org.apache.felix.configadmin:1.8.8]
> at
> org.apache.felix.cm.impl.UpdateThread.run(UpdateThread.java:110)[7:org.apache.felix.configadmin:1.8.8]
> at java.lang.Thread.run(Thread.java:745)[:1.8.0_74]
>
> Like I did on Karaf 2.4.1, I have put the
> file NTEventLogAppender.amd64.dll in the "lib" directory under Karaf. It
> has the version 1.2.16.1.
>
> Does anyone know how to get the NTEventLogAppender to work with Karaf
> 4.0.5?
>
> /Bengt
>
>
>
>
>


Log4j NTEventLogAppender in Karaf 4.0.5

2016-06-29 Thread Bengt Rodehav
I'm trying to upgrade from Karaf 2.4.1 to 4.0.5 and I run into problems
regarding NTEventLogAppender. I get the following on startup:

2016-06-29 17:16:05,354 | ERROR | 4j.pax.logging]) | configadmin
   | ?
? | [org.osgi.service.log.LogService,
org.knopflerfish.service.log.LogService,
org.ops4j.pax.logging.PaxLoggingService,
org.osgi.service.cm.ManagedService, id=34,
bundle=6/mvn:org.ops4j.pax.logging/pax-logging-service/1.8.5]: Unexpected
problem updating configuration org.ops4j.pax.logging
java.lang.UnsatisfiedLinkError: no NTEventLogAppender in java.library.path
at
java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)[:1.8.0_74]
at java.lang.Runtime.loadLibrary0(Runtime.java:870)[:1.8.0_74]
at java.lang.System.loadLibrary(System.java:1122)[:1.8.0_74]
at
org.apache.log4j.nt.NTEventLogAppender.<init>(NTEventLogAppender.java:179)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)[:1.8.0_74]
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)[:1.8.0_74]
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)[:1.8.0_74]
at
java.lang.reflect.Constructor.newInstance(Constructor.java:423)[:1.8.0_74]
at java.lang.Class.newInstance(Class.java:442)[:1.8.0_74]
at
org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:336)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:123)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.apache.log4j.PaxLoggingConfigurator.parseAppender(PaxLoggingConfigurator.java:97)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:639)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:504)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.apache.log4j.PaxLoggingConfigurator.doConfigure(PaxLoggingConfigurator.java:72)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl.updated(PaxLoggingServiceImpl.java:214)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.ops4j.pax.logging.service.internal.PaxLoggingServiceImpl$1ManagedPaxLoggingService.updated(PaxLoggingServiceImpl.java:362)[6:org.ops4j.pax.logging.pax-logging-service:1.8.5]
at
org.apache.felix.cm.impl.helper.ManagedServiceTracker.updated(ManagedServiceTracker.java:189)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.helper.ManagedServiceTracker.updateService(ManagedServiceTracker.java:152)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.helper.ManagedServiceTracker.provideConfiguration(ManagedServiceTracker.java:85)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.ConfigurationManager$ManagedServiceUpdate.provide(ConfigurationManager.java:1444)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.ConfigurationManager$ManagedServiceUpdate.run(ConfigurationManager.java:1400)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.UpdateThread.run0(UpdateThread.java:143)[7:org.apache.felix.configadmin:1.8.8]
at
org.apache.felix.cm.impl.UpdateThread.run(UpdateThread.java:110)[7:org.apache.felix.configadmin:1.8.8]
at java.lang.Thread.run(Thread.java:745)[:1.8.0_74]

Like I did on Karaf 2.4.1, I have put the file NTEventLogAppender.amd64.dll
in the "lib" directory under Karaf. It has the version 1.2.16.1.

Does anyone know how to get the NTEventLogAppender to work with Karaf 4.0.5?

/Bengt


Re: add-features-to-repo failure when upgrading to karaf-maven-plugin

2016-06-29 Thread Bengt Rodehav
Got it to work as follows:

<features xmlns="http://karaf.apache.org/xmlns/features/v1.4.0"
name="enterprise-4.0.5">

<repository>mvn:org.apache.karaf.features/standard/${karaf-version}/xml/features</repository>

I didn't have a namespace in the <features> tag and I hadn't added the
Karaf standard repository descriptor.

That's all it took,

/Bengt

2016-06-29 16:08 GMT+02:00 Bengt Rodehav <be...@rodehav.com>:

> Did you ever get an answer to this?
>
> I get the same problem when upgrading from Karaf 2.4.1 to 4.0.5 and would
> like to know how you solved it.
>
> /Bengt
>
> 2013-09-27 14:55 GMT+02:00 A. Rothman <amich...@amichais.net>:
>
>> Hi,
>>
>> I'm trying to create an offline installation of karaf 2.3.3 containing
>> the dosgi and activemq features. The following configuration worked (when
>> adding the springsource release and ops4j sonatype snapshot repositories):
>>
>>
>> <plugin>
>>   <groupId>org.apache.karaf.tooling</groupId>
>>   <artifactId>features-maven-plugin</artifactId>
>>   <version>2.3.3</version>
>>   <executions>
>>     <execution>
>>       <id>add-features-to-repo</id>
>>       <phase>generate-resources</phase>
>>       <goals>
>>         <goal>add-features-to-repo</goal>
>>       </goals>
>>       <configuration>
>>         <karafVersion>2.3.3</karafVersion>
>>         <addTransitiveFeatures>true</addTransitiveFeatures>
>>         <descriptors>
>>           <descriptor>mvn:org.apache.activemq/activemq-karaf/5.8.0/xml/features</descriptor>
>>           <descriptor>mvn:org.apache.cxf.dosgi/cxf-dosgi/1.6-SNAPSHOT/xml/features</descriptor>
>>         </descriptors>
>>         <features>
>>           <feature>cxf-dosgi-discovery-distributed</feature>
>>           <feature>activemq-broker</feature>
>>         </features>
>>         <repository>target/features-repo</repository>
>>       </configuration>
>>     </execution>
>>   </executions>
>> </plugin>
>>
>>
>> Next I tried upgrading to the new karaf-maven-plugin 3.0.0-SNAPSHOT: I
>> changed the plugin name and version to the new ones, and the goal name to
>> features-add-to-repository, so now it should be working with the new
>> plugin. However now the build fails:
>>
>>
>> [INFO] --- karaf-maven-plugin:3.0.0-SNAPSHOT:features-add-to-repository
>> (features-add-to-repository) @ com.intellitradegroup.custom-karaf ---
>> [INFO] Copying artifact:
>> org.apache.karaf.features:enterprise:xml:features:2.3.3
>> [WARNING] Can't add
>> mvn:org.apache.karaf.features/enterprise/2.3.3/xml/features in the
>> descriptors set
>> [INFO] Copying artifact:
>> org.apache.karaf.features:standard:xml:features:2.3.3
>> [WARNING] Can't add
>> mvn:org.apache.karaf.features/standard/2.3.3/xml/features in the
>> descriptors set
>> [INFO] Copying artifact:
>> org.apache.karaf.features:standard:xml:features:2.3.3
>> [WARNING] Can't add
>> mvn:org.apache.karaf.features/standard/2.3.3/xml/features in the
>> descriptors set
>> [INFO] Copying artifact:
>> org.apache.activemq:activemq-karaf:xml:features:5.8.0
>> [INFO] Copying artifact:
>> org.apache.cxf.dosgi:cxf-dosgi:xml:features:1.6-SNAPSHOT
>> [INFO] Copying artifact:
>> org.apache.cxf.karaf:apache-cxf:xml:features:2.7.6
>> [INFO]
>> 
>> [INFO] BUILD FAILURE
>> [INFO]
>> 
>> [INFO] Total time: 7.156s
>> [INFO] Finished at: Fri Sep 27 15:46:59 IDT 2013
>> [INFO] Final Memory: 26M/527M
>> [INFO]
>> 
>> [ERROR] Failed to execute goal
>> org.apache.karaf.tooling:karaf-maven-plugin:3.0.0-SNAPSHOT:features-add-to-repository
>> (features-add-to-repository) on project example.custom-karaf: Error
>> populating repository: Unable to find the feature 'http-whiteboard' ->
>> [Help 1]
>>
>>
>> I also tried removing the addTransitiveFeatures option (docs don't
>> mention it, though it seems to still be supported) and also changing the
>> karafVersion option to 3.0.0-SNAPSHOT (in case there's an issue with
>> cross-version repo creation), but the error persists.
>>
>> What else needs to be updated for the new plugin to work?
>>
>> Thanks,
>>
>> Amichai
>>
>>
>>
>


Re: add-features-to-repo failure when upgrading to karaf-maven-plugin

2016-06-29 Thread Bengt Rodehav
Did you ever get an answer to this?

I get the same problem when upgrading from Karaf 2.4.1 to 4.0.5 and would
like to know how you solved it.

/Bengt

2013-09-27 14:55 GMT+02:00 A. Rothman :

> Hi,
>
> I'm trying to create an offline installation of karaf 2.3.3 containing the
> dosgi and activemq features. The following configuration worked (when
> adding the springsource release and ops4j sonatype snapshot repositories):
>
>
> <plugin>
>   <groupId>org.apache.karaf.tooling</groupId>
>   <artifactId>features-maven-plugin</artifactId>
>   <version>2.3.3</version>
>   <executions>
>     <execution>
>       <id>add-features-to-repo</id>
>       <phase>generate-resources</phase>
>       <goals>
>         <goal>add-features-to-repo</goal>
>       </goals>
>       <configuration>
>         <karafVersion>2.3.3</karafVersion>
>         <addTransitiveFeatures>true</addTransitiveFeatures>
>         <descriptors>
>           <descriptor>mvn:org.apache.activemq/activemq-karaf/5.8.0/xml/features</descriptor>
>           <descriptor>mvn:org.apache.cxf.dosgi/cxf-dosgi/1.6-SNAPSHOT/xml/features</descriptor>
>         </descriptors>
>         <features>
>           <feature>cxf-dosgi-discovery-distributed</feature>
>           <feature>activemq-broker</feature>
>         </features>
>         <repository>target/features-repo</repository>
>       </configuration>
>     </execution>
>   </executions>
> </plugin>
>
>
> Next I tried upgrading to the new karaf-maven-plugin 3.0.0-SNAPSHOT: I
> changed the plugin name and version to the new ones, and the goal name to
> features-add-to-repository, so now it should be working with the new
> plugin. However now the build fails:
>
>
> [INFO] --- karaf-maven-plugin:3.0.0-SNAPSHOT:features-add-to-repository
> (features-add-to-repository) @ com.intellitradegroup.custom-karaf ---
> [INFO] Copying artifact:
> org.apache.karaf.features:enterprise:xml:features:2.3.3
> [WARNING] Can't add
> mvn:org.apache.karaf.features/enterprise/2.3.3/xml/features in the
> descriptors set
> [INFO] Copying artifact:
> org.apache.karaf.features:standard:xml:features:2.3.3
> [WARNING] Can't add
> mvn:org.apache.karaf.features/standard/2.3.3/xml/features in the
> descriptors set
> [INFO] Copying artifact:
> org.apache.karaf.features:standard:xml:features:2.3.3
> [WARNING] Can't add
> mvn:org.apache.karaf.features/standard/2.3.3/xml/features in the
> descriptors set
> [INFO] Copying artifact:
> org.apache.activemq:activemq-karaf:xml:features:5.8.0
> [INFO] Copying artifact:
> org.apache.cxf.dosgi:cxf-dosgi:xml:features:1.6-SNAPSHOT
> [INFO] Copying artifact: org.apache.cxf.karaf:apache-cxf:xml:features:2.7.6
> [INFO]
> 
> [INFO] BUILD FAILURE
> [INFO]
> 
> [INFO] Total time: 7.156s
> [INFO] Finished at: Fri Sep 27 15:46:59 IDT 2013
> [INFO] Final Memory: 26M/527M
> [INFO]
> 
> [ERROR] Failed to execute goal
> org.apache.karaf.tooling:karaf-maven-plugin:3.0.0-SNAPSHOT:features-add-to-repository
> (features-add-to-repository) on project example.custom-karaf: Error
> populating repository: Unable to find the feature 'http-whiteboard' ->
> [Help 1]
>
>
> I also tried removing the addTransitiveFeatures option (docs don't mention
> it, though it seems to still be supported) and also changing the
> karafVersion option to 3.0.0-SNAPSHOT (in case there's an issue with
> cross-version repo creation), but the error persists.
>
> What else needs to be updated for the new plugin to work?
>
> Thanks,
>
> Amichai
>
>
>


Re: Dependencies matrix

2016-06-29 Thread Bengt Rodehav
OK - thanks.

/Bengt

2016-06-29 15:02 GMT+02:00 Jean-Baptiste Onofré <j...@nanthrax.net>:

> Hi Bengt,
>
> yes, I have a Jira about that. We have the short version on the download
> page but I will re-add the extended version (with all dependencies) soon.
>
> Thanks,
> Regards
> JB
>
>
> On 06/29/2016 10:45 AM, Bengt Rodehav wrote:
>
>> I haven't visited the Karaf web site for a while. It looks much better
>> these days.
>>
>> However, I was looking for the dependencies matrix for the latest
>> version (4.0.5) and couldn't find it. It is normally of great help when
>> upgrading to a new Karaf version.
>>
>> /Bengt
>>
>
> --
> Jean-Baptiste Onofré
> jbono...@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>


Dependencies matrix

2016-06-29 Thread Bengt Rodehav
I haven't visited the Karaf web site for a while. It looks much better these
days.

However, I was looking for the dependencies matrix for the latest version
(4.0.5) and couldn't find it. It is normally of great help when upgrading
to a new Karaf version.

/Bengt


Re: XML unmarshalling performance in Karaf

2014-03-20 Thread Bengt Rodehav
OK - thanks JB,

I guess that means that the next version of Karaf will have this property
set to 0 by default?

/Bengt


2014-03-19 17:30 GMT+01:00 Jean-Baptiste Onofré j...@nanthrax.net:

 Hi guys,

 the reason is explained in:
 https://issues.apache.org/jira/browse/KARAF-2269

 I already set it to 0, but, as you can see in this commit:
 http://mail-archives.apache.org/mod_mbox/karaf-commits/201312.mbox/%
 3c7dd9a5793f88433d8dbfaed8ebd14...@git.apache.org%3E

 I think we fixed the 0 timeout support in ServiceMix Specs 2.3.0 (the
 issue was with 1.9.0). I'm gonna check and I will set the timeout to 0 if
 it's correctly supported by the specs.

 Regards
 JB


 On 03/19/2014 02:00 PM, Guillaume Nodet wrote:

 Mmh, I would have liked to get away without the explanation ;-)

 So the problem is related to java specifications, like jaxp, stax,
 jaxws, jaxb, etc...
 The packages are usually provided by the JRE and the discovery is done
 using META-INF/services usually.
 Unfortunately, this does not work well in OSGi because the application
 classloader is used for discovery.

 The servicemix-specs project makes this integration of java specs work
 in OSGi.
 Over the years, it has improved in various ways, but now, we have the
 following:
* the discovery part of the various specs is enhanced to be OSGi
 friendly
* those enhanced specs are in the ${karaf.home}/endorsed folder so
 that they are used instead of the JRE ones
* the discovery mechanism will look for an implementation bundle in
 OSGi, then default to the JRE implementation

 Historically, the discovery of JRE implementations was not possible, so
 implementations had to be deployed as bundles.
 However, given there's no way to order the resolution of bundles
 sufficiently, we needed a timeout so that when a bundle was using one of
 the spec, for example the xml parser, the discovery would wait for an
 implementation bundle to become available.  Without that timeout, the
 discovery could fail during the karaf startup.

 However, since the JRE implementations can now be leveraged (mostly
 because the specs are now endorsed instead of being deployed as
 bundles), that timeout can be safely set to 0 so that the specs won't
 wait until a bundle implementation is available, but will delegate to
 the JRE directly if no bundle implementation is present.

 Some weeks ago, I had a quick chat with Jean-Baptiste about setting back
 the timeout to 0, but we delayed it for some reason I don't recall.
   Maybe JB remembers ...
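In configuration terms, the change being discussed is a one-liner in etc/system.properties (the property and value named earlier in the thread):

```properties
# etc/system.properties
# 0 = don't wait for an implementation bundle to appear; delegate to the
# JRE implementation directly if no bundle implementation is present
org.apache.servicemix.specs.timeout=0
```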


 2014-03-19 13:28 GMT+01:00 Bengt Rodehav be...@rodehav.com
 mailto:be...@rodehav.com:


 Hello Guillaume!

 That made all the difference in the world. The CPU now finally
 started to work and processing time for 1000 of my exchanges went
 down to 9s from 38s.

 But you've got some explaining to do :-)

 What is the purpose of this property and why is it set as high as
 100 ms? Also, what can go wrong if I set it to 0?

 /Bengt


 2014-03-19 11:43 GMT+01:00 Guillaume Nodet gno...@apache.org
 mailto:gno...@apache.org:


 I don't think your problem is concurrency, but just wait.
 Make sure the org.apache.servicemix.specs.timeout property is set to
 0 in etc/system.properties and retry.


 2014-03-19 10:25 GMT+01:00 Bengt Rodehav be...@rodehav.com
 mailto:be...@rodehav.com:


 To clarify, again looking at the sampler info, the locate()
 method spends all its time waiting which indicates a
 concurrency/synchronization problem.

 When googling about this I found the following JIRA:

 https://issues.apache.org/jira/browse/SMX4-803

 This seems to have been fixed though.

 I'm not exactly sure how the locator works and if I install
 a locator myself or not. I've tried to see what bundle
 exports the org.apache.servicemix.specs.locator package but
 no bundle does that. So I guess it's a private package in
 some bundle.

 Perhaps someone can inform me how this works so I can check
 if I need an updated version of some bundle?

 /Bengt




 2014-03-19 9:56 GMT+01:00 Bengt Rodehav be...@rodehav.com
 mailto:be...@rodehav.com:


 Thanks to Guillaume I managed to get the VisualVM to
 work with Karaf.

 I then ran my transformations a number of times while
 sampling to see where the time is spent. I attach an
 image from a snapshot from VisualVM. I'm not sure if it
 will be accepted by the mailing list but if anyone wants
 to look at it I can send it to your email directly if
 you want.

 I'm not an expert on interpreting VisualVM profiling
 info but it seems to me that in the thread doing the
 transformation, more

Re: XML unmarshalling performance in Karaf

2014-03-20 Thread Bengt Rodehav
Great!


2014-03-20 10:01 GMT+01:00 j...@nanthrax.net j...@nanthrax.net:

 Hi Bengt,

 Yes it's already done.

 Regards
 JB


 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://wwx.talend.com


 - Reply message -
 From: Bengt Rodehav be...@rodehav.com
 To: user@karaf.apache.org
 Subject: XML unmarshalling performance in Karaf
 Date: Thu, Mar 20, 2014 9:38 am


 OK - thanks JB,

 I guess that means that the next version of Karaf will have this property
 set to 0 by default?

 /Bengt


 2014-03-19 17:30 GMT+01:00 Jean-Baptiste Onofré j...@nanthrax.net:

 Hi guys,

 the reason is explained in:
 https://issues.apache.org/jira/browse/KARAF-2269

 I already set it to 0, but, as you can see in this commit:
 http://mail-archives.apache.org/mod_mbox/karaf-commits/201312.mbox/%
 3c7dd9a5793f88433d8dbfaed8ebd14...@git.apache.org%3E

 I think we fixed the 0 timeout support in ServiceMix Specs 2.3.0 (the
 issue was with 1.9.0). I'm gonna check and I will set the timeout to 0 if
 it's correctly supported by the specs.

 Regards
 JB


 On 03/19/2014 02:00 PM, Guillaume Nodet wrote:

 Mmh, I would have liked to get away without the explanation ;-)

 So the problem is related to java specifications, like jaxp, stax,
 jaxws, jaxb, etc...
 The packages are usually provided by the JRE and the discovery is done
 using META-INF/services usually.
 Unfortunately, this does not work well in OSGi because the application
 classloader is used for discovery.

 The servicemix-specs project makes this integration of java specs work
 in OSGi.
 Over the years, it has improved in various ways, but now, we have the
 following:
* the discovery part of the various specs is enhanced to be OSGi
 friendly
* those enhanced specs are in the ${karaf.home}/endorsed folder so
 that they are used instead of the JRE ones
* the discovery mechanism will look for an implementation bundle in
 OSGi, then default to the JRE implementation

 Historically, the discovery of JRE implementations was not possible, so
 implementations had to be deployed as bundles.
 However, given there's no way to order the resolution of bundles
 sufficiently, we needed a timeout so that when a bundle was using one of
 the spec, for example the xml parser, the discovery would wait for an
 implementation bundle to become available.  Without that timeout, the
 discovery could fail during the karaf startup.

 However, since the JRE implementations can now be leveraged (mostly
 because the specs are now endorsed instead of being deployed as
 bundles), that timeout can be safely set to 0 so that the specs won't
 wait until a bundle implementation is available, but will delegate to
 the JRE directly if no bundle implementation is present.

 Some weeks ago, I had a quick chat with Jean-Baptiste about setting back
 the timeout to 0, but we delayed it for some reason I don't recall.
   Maybe JB remembers ...


 2014-03-19 13:28 GMT+01:00 Bengt Rodehav be...@rodehav.com
 mailto:be...@rodehav.com:


 Hello Guillaume!

 That made all the difference in the world. The CPU now finally
 started to work and processing time for 1000 of my exchanges went
 down to 9s from 38s.

 But you've got some explaining to do :-)

 What is the purpose of this property and why is it set as high as
 100 ms? Also, what can go wrong if I set it to 0?

 /Bengt


 2014-03-19 11:43 GMT+01:00 Guillaume Nodet gno...@apache.org
 mailto:gno...@apache.org:


 I don't think your problem is concurrency, but just wait.
 Make sure the org.apache.servicemix.specs.timeout property is set to
 0 in etc/system.properties and retry.


 2014-03-19 10:25 GMT+01:00 Bengt Rodehav be...@rodehav.com
 mailto:be...@rodehav.com:


 To clarify, again looking at the sampler info, the locate()
 method spends all its time waiting which indicates a
 concurrency/synchronization problem.

 When googling about this I found the following JIRA:

 https://issues.apache.org/jira/browse/SMX4-803

 This seems to have been fixed though.

 I'm not exactly sure how the locator works and if I install
 a locator myself or not. I've tried to see what bundle
 exports the org.apache.servicemix.specs.locator package but
 no bundle does that. So I guess it's a private package in
 some bundle.

 Perhaps someone can inform me how this works so I can check
 if I need an updated version of some bundle?

 /Bengt




 2014-03-19 9:56 GMT+01:00 Bengt Rodehav be...@rodehav.com
 mailto:be...@rodehav.com:


 Thanks to Guillaume I managed to get the VisualVM to
 work with Karaf.

 I then ran my transformations a number of times while
 sampling to see where the time is spent. I attach

Re: Using Java VisualVM profiler in Karaf

2014-03-19 Thread Bengt Rodehav
I added ,org.netbeans.lib.profiler.server to the property
org.osgi.framework.bootdelegation and the profiling now seems to work!
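Concretely, the change is an append to the boot delegation list in etc/config.properties (or etc/custom.properties); the leading `...` below stands for whatever packages the property already lists:

```properties
# etc/config.properties (append to the existing value)
org.osgi.framework.bootdelegation=...,org.netbeans.lib.profiler.server
```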

Thanks Guillaume,

/Bengt


2014-03-18 16:12 GMT+01:00 Bengt Rodehav be...@rodehav.com:

 I'll try that. Thanks for the tip.

 /Bengt
 Den 18 mar 2014 16:04 skrev Guillaume Nodet gno...@apache.org:

 Have you tried adding the needed package to
 the org.osgi.framework.bootdelegation property ? It may help in that case.


 2014-03-18 15:49 GMT+01:00 Bengt Rodehav be...@rodehav.com:

 I'm using Karaf 2.3.4 currently with Java 6 / 64 bit (1.6.0_32). I've
 been using Java VisualVM to monitor threads. But I haven't managed to get
 the profiler to work with a Karaf process. As soon as I start the profiling
 I get a lot of NoClassDefFoundError's in the Karaf console. I think most of
 them refer to the same class:

 Exception in thread qtp1732535673-50 java.lang.NoClassDefFoundError:
 org/netbeans/lib/profiler/server/ProfilerRuntimeCPUFullInstr
 at
 org.eclipse.jetty.io.nio.SelectorManager$SelectSet$1.run(SelectorManager.java:716)
 at
 org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
 at
 org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
 at java.lang.Thread.run(Thread.java:662)

 Has anyone got the VisualVM to work with Karaf?

 /Bengt





Re: Using Java VisualVM profiler in Karaf

2014-03-19 Thread Bengt Rodehav
I agree. VisualVM is a big helper. It actually helped me to locate the
problem (I think). I'll post about that in my previous thread.

Guillaume, perhaps you can add this to the Karaf documentation?

/Bengt


2014-03-19 9:10 GMT+01:00 Claus Ibsen claus.ib...@gmail.com:

 On Wed, Mar 19, 2014 at 8:48 AM, Bengt Rodehav be...@rodehav.com wrote:
  I added ,org.netbeans.lib.profiler.server to the property
  org.osgi.framework.bootdelegation and the profiling now seems to work!
 

 Ah thanks for sharing.

 Wonder if this can be added to the Karaf documentation somehwere  - as
 visualvm is a lovely tool, and other people will hit this problem with
 Karaf.

  Thanks Guillaume,
 
  /Bengt
 
 
  2014-03-18 16:12 GMT+01:00 Bengt Rodehav be...@rodehav.com:
 
  I'll try that. Thanks for the tip.
 
  /Bengt
 
  Den 18 mar 2014 16:04 skrev Guillaume Nodet gno...@apache.org:
 
  Have you tried adding the needed package to the
  org.osgi.framework.bootdelegation property ? It may help in that case.
 
 
  2014-03-18 15:49 GMT+01:00 Bengt Rodehav be...@rodehav.com:
 
  I'm using Karaf 2.3.4 currently with Java 6 / 64 bit (1.6.0_32). I've
  been using Java VisualVM to monitor threads. But I haven't managed to
 get
  the profiler to work with a Karaf process. As soon as I start the
 profiling
  I get a lot of NoClassDefFoundError's in the Karaf console. I think
 most of
  them refer to the same class:
 
  Exception in thread qtp1732535673-50 java.lang.NoClassDefFoundError:
  org/netbeans/lib/profiler/server/ProfilerRuntimeCPUFullInstr
  at
 
 org.eclipse.jetty.io.nio.SelectorManager$SelectSet$1.run(SelectorManager.java:716)
  at
 
 org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
  at
 
 org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
  at java.lang.Thread.run(Thread.java:662)
 
  Has anyone got the VisualVM to work with Karaf?
 
  /Bengt
 
 
 



 --
 Claus Ibsen
 -
 Red Hat, Inc.
 Email: cib...@redhat.com
 Twitter: davsclaus
 Blog: http://davsclaus.com
 Author of Camel in Action: http://www.manning.com/ibsen
 Make your Camel applications look hawt, try: http://hawt.io



XML unmarshalling performance in Karaf

2014-03-18 Thread Bengt Rodehav
I'm using karaf 2.3.4 and Camel 2.13.3.

I've been investigating performance problems with Camel's sjms component.
Here is the discussion:

http://camel.465427.n5.nabble.com/sjms-and-multiple-threads-td5748836.html

However, at the end I discovered that my real problem was the unmarshalling
of an XML file in Karaf. For some reason, if I unmarshall a certain XML
file it takes about 105 ms in Karaf. If I do the same from my JUnit test in
Eclipse it takes around 10 ms. In fact, in Eclipse it starts with around 30
ms but consecutive calls gradually go down to 7-8 ms. In Karaf it doesn't
matter how many times I do the unmarshalling. It stays at about 105 ms
every time.

I'm very confused about this.

The actual code looks like this (approximately):

  public MmlMessage unmarshallMmlMessage(String theXml) throws JAXBException {
    final Unmarshaller unMarshaller = cMmlMessageJAXBcontext.createUnmarshaller();
    StreamSource ss = new StreamSource(new StringReader(theXml));
    long t0 = System.nanoTime();
    JAXBElement<?> mmlMessageElement = unMarshaller.unmarshal(ss, MmlMessage.class);
    long t1 = System.nanoTime();
    MmlMessage mmlMessage = (MmlMessage) mmlMessageElement.getValue();
    System.out.println("t1: " + (t1 - t0) + " ns");
    return mmlMessage;
  }

The MmlMessage class is generated from an XML schema
using maven-jaxb2-plugin. But it shouldn't matter since the same classes
are used within Karaf as outside of Karaf.

I assumed that for some reason I'm not running the same code in Karaf as
outside Karaf. When logging the actual class of the unMarshaller variable I
get: com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl both
within and outside Karaf.

The classloader for the unMarshaller in Karaf is:
org.apache.felix.framework.BundleWiringImpl@5740eacd.

I thought I had the answer when I noted that outside Karaf I use the JAXB
implementation listed in camel-jaxb's dependencies. This is version
2.1.13. In Karaf I had installed a JAXB version from the ServiceMix
bundles, namely:

  <groupId>org.apache.servicemix.bundles</groupId>
  <artifactId>org.apache.servicemix.bundles.jaxb-impl</artifactId>
  <version>2.2.1.1_2</version>

So I forced my JUnit test to use the same ServiceMix bundles version but it
was still equally fast as before. Nowhere near the 105 ms I get in Karaf.

I realize that this probably is not a Karaf problem per se. But I know
there are probably lots of people on this mailing list who have handled XML a
lot in Karaf. Do you have any tips on what to look at? What could cause
this performance problem?
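One way to separate JIT warm-up (which would explain the Eclipse numbers dropping from 30 ms to 7-8 ms) from a genuine per-call cost is to time the call over many iterations and look at the minimum rather than the first measurement. A minimal, generic sketch of that idea — the `Bench` class and `minNanos` method names are my own, not from the original code:

```java
/** Tiny timing helper: runs a task several times and reports the best
    (minimum) duration, so one-time JIT/classloading costs do not dominate. */
public final class Bench {
    private Bench() {}

    public static long minNanos(Runnable task, int runs) {
        long best = Long.MAX_VALUE;
        for (int i = 0; i < runs; i++) {
            long t0 = System.nanoTime();
            task.run();
            long t1 = System.nanoTime();
            if (t1 - t0 < best) {
                best = t1 - t0;
            }
        }
        return best;
    }
}
```

If the minimum stays around 105 ms in Karaf, the cost is per-call rather than warm-up, which points at something happening on every unmarshal (classloader lookups, SPI resolution, etc.).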

/Bengt


Using Java VisualVM profiler in Karaf

2014-03-18 Thread Bengt Rodehav
I'm using Karaf 2.3.4 currently with Java 6 / 64 bit (1.6.0_32). I've been
using Java VisualVM to monitor threads. But I haven't managed to get the
profiler to work with a Karaf process. As soon as I start the profiling I
get a lot of NoClassDefFoundError's in the Karaf console. I think most of
them refer to the same class:

Exception in thread qtp1732535673-50 java.lang.NoClassDefFoundError:
org/netbeans/lib/profiler/server/ProfilerRuntimeCPUFullInstr
at
org.eclipse.jetty.io.nio.SelectorManager$SelectSet$1.run(SelectorManager.java:716)
at
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at
org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:662)

Has anyone got the VisualVM to work with Karaf?

/Bengt


Re: Using Java VisualVM profiler in Karaf

2014-03-18 Thread Bengt Rodehav
I'll try that. Thanks for the tip.

/Bengt
Den 18 mar 2014 16:04 skrev Guillaume Nodet gno...@apache.org:

 Have you tried adding the needed package to
 the org.osgi.framework.bootdelegation property ? It may help in that case.
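For reference, that property lives in Karaf's etc/config.properties. A sketch of what the change might look like — the default package list varies between Karaf versions, and the profiler package name is taken from the NoClassDefFoundError above:

```properties
# etc/config.properties (excerpt) -- append the NetBeans profiler packages
# to whatever defaults your Karaf version already ships with
org.osgi.framework.bootdelegation = \
    sun.*, com.sun.*, \
    org.netbeans.lib.profiler, org.netbeans.lib.profiler.*
```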


 2014-03-18 15:49 GMT+01:00 Bengt Rodehav be...@rodehav.com:

 I'm using Karaf 2.3.4 currently with Java 6 / 64 bit (1.6.0_32). I've
 been using Java VisualVM to monitor threads. But I haven't managed to get
 the profiler to work with a Karaf process. As soon as I start the profiling
 I get a lot of NoClassDefFoundError's in the Karaf console. I think most of
 them refer to the same class:

 Exception in thread qtp1732535673-50 java.lang.NoClassDefFoundError:
 org/netbeans/lib/profiler/server/ProfilerRuntimeCPUFullInstr
 at
 org.eclipse.jetty.io.nio.SelectorManager$SelectSet$1.run(SelectorManager.java:716)
 at
 org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
 at
 org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
 at java.lang.Thread.run(Thread.java:662)

 Has anyone got the VisualVM to work with Karaf?

 /Bengt





Re: Can't find artifact mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE

2013-10-15 Thread Bengt Rodehav
But...

I installed a fresh Karaf 2.3.3. I then used features:chooseurl to get
access to the latest Camel features. Then I did features:install
camel-spring. This succeeded and in that process spring-aop version
3.2.4.RELEASE was installed. How is this possible if that version doesn't
exist?

I noticed that the org.ops4j.pax.url.mvn.cfg file contains the following:

org.ops4j.pax.url.mvn.repositories= \
http://repo1.maven.org/maven2@id=central, \
http://svn.apache.org/repos/asf/servicemix/m2-repo@id=servicemix, \

http://repository.springsource.com/maven/bundles/release@id=springsource.release,
\

http://repository.springsource.com/maven/bundles/external@id=springsource.external,
\
http://oss.sonatype.org/content/repositories/releases/@id=sonatype

So I guess it is downloading directly from the SpringSource EBR.

Can you explain this Achim?

/Bengt



2013/10/14 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 it might very well be that this is an issue with Camel, cause as you can
 see at [1].
 We upgraded to Spring 3.2.4 but no OSGi bundles are available at the EBR.

 regards, Achim

 [1] - https://issues.apache.org/jira/browse/KARAF-2458



 2013/10/14 Bengt Rodehav be...@rodehav.com

 See what you mean Achim... I can only find up to version 3.2.3 in the
 Spring EBR not version 3.2.4. Then I would definitely call it a bug since 
 standard-2.3.3-features.xm
 is referencing an artifact version that doesn't exist.

 I don't reference the 3.2.4 version myself but I think my upgrade to
 Camel 2.12.1 caused this to happen.

 I will investigate this further but could this mean that Camel 2.12.1
 (its features) is incompatible with Karaf 2.3.3?

 /Bengt


 2013/10/14 Bengt Rodehav be...@rodehav.com

 OK - thanks. Will try adding the Spring repo tomorrow. Really strange
 that they don't use Maven central anymore. It complicates things for
 everyone.

 /Bengt


 2013/10/14 Minto van der Sluis mi...@xup.nl

 Hi Bengt,

 No it's not a bug.

 Have a look at https://issues.apache.org/jira/browse/KARAF-2430

 Regards,

 Minto

 Bengt Rodehav schreef op 14-10-2013 17:30:
  I just upgraded to from Karaf 2.3.1 to 2.3.3. When I build my custom
  server I try to download all the dependencies. I tried to use Spring
  version 3.2.4.RELEASE since that is now available in Karaf.
 
  Below is an excerpt from the standard-2.3.3-features.xml included in
  Karaf 2.3.3:
 
  !-- Spring 3.2 support --
 
  feature name=spring description=Spring 3.2.x support
  version=3.2.4.RELEASE resolver=(obr)
  bundle dependency=true
 
 start-level=30mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.aopalliance/1.0_6/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.core/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.expression/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.beans/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context.support/3.2.4.RELEASE/bundle
  /feature
 
  All artifact names are org.springframework.xyz. But in Maven central
  they seem to be called spring-aop etc. Is this a bug?
 
  /Bengt






 --

 Apache Karaf http://karaf.apache.org/ Committer  PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer 
 Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home
 Commiter  Project Lead
 blog http://notizblog.nierbeck.de/



Re: Can't find artifact mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE

2013-10-15 Thread Bengt Rodehav
FYI.

I proxied the following repositories in our Nexus repository server (they
are listed in the FAQ on the SpringSource EBR):

<repository>
    <id>com.springsource.repository.bundles.release</id>
    <name>SpringSource Enterprise Bundle Repository - SpringSource Bundle Releases</name>
    <url>http://repository.springsource.com/maven/bundles/release</url>
</repository>

<repository>
    <id>com.springsource.repository.bundles.external</id>
    <name>SpringSource Enterprise Bundle Repository - External Bundle Releases</name>
    <url>http://repository.springsource.com/maven/bundles/external</url>
</repository>

I then browsed them via Nexus Web GUI and found that the 3.2.4.RELEASE is
available there. I was also able to build correctly now.

So, in short:

- The 3.2.4.RELEASE version of Spring IS available in the Springsource EBR
even if you can't find it when you use their search facility. So, no bug in
Karaf's feature file. It's just a little sad that you can't find these
artifacts on Maven Central.

- The camel-spring feature uses the version range [3.1,3.3) for the
spring feature which means that it will try to pick up the 3.2.4.RELEASE
version of the spring feature which is what caused my problems at first.
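In a feature file, such a range is expressed on the dependent feature element. A paraphrased sketch of what the camel-spring feature's dependency on the spring feature looks like (element details abbreviated, not copied from camel's actual features.xml):

```xml
<feature name="camel-spring" version="2.12.1">
    <!-- any "spring" feature in [3.1,3.3) satisfies this, so the resolver
         picks the highest available version, here 3.2.4.RELEASE -->
    <feature version="[3.1,3.3)">spring</feature>
</feature>
```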

My problems are solved. Hope this helps anyone else encountering these
problems.

/Bengt



2013/10/15 Achim Nierbeck bcanh...@googlemail.com

 plainly, no.
 I've no idea what spring is up to.

 I just can help you with stuff I've seen already asked. And people told us
 that 3.2.4 is not available as OSGi bundles from EBR.
 As you can see at the issue that I linked for your convenience.


 regards, Achim




 2013/10/15 Bengt Rodehav be...@rodehav.com

 But...

 I installed a fresh Karaf 2.3.3. I then used features:chooseurl to get
 access to latest Camel features. Then I did features:install
 camel-spring. This succeeded and in that process spring-aop version
 3.2.4.RELEASE was installed. How is this possible if that version doesn't
 exist?

 I noticed that the org.ops4j.pax.url.mvn.cfg file contains the following:

 org.ops4j.pax.url.mvn.repositories= \
 http://repo1.maven.org/maven2@id=central, \
 http://svn.apache.org/repos/asf/servicemix/m2-repo@id=servicemix, \

 http://repository.springsource.com/maven/bundles/release@id=springsource.release,
 \

 http://repository.springsource.com/maven/bundles/external@id=springsource.external,
 \
 http://oss.sonatype.org/content/repositories/releases/@id=sonatype

 So I guess it is downloading directly from the Spring source EBR.

 Can you explain this Achim?

 /Bengt



 2013/10/14 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 it might very well be that this is an issue with Camel, cause as you can
 see at [1].
 We upgraded to Spring 3.2.4 but no OSGi bundles are available at the
 EBR.

 regards, Achim

 [1] - https://issues.apache.org/jira/browse/KARAF-2458



 2013/10/14 Bengt Rodehav be...@rodehav.com

 See what you mean Achim... I can only find up to version 3.2.3 in the
 Spring EBR not version 3.2.4. Then I would definitely call it a bug since 
 standard-2.3.3-features.xm
 is referencing an artifact version that doesn't exist.

 I don't reference the 3.2.4 version myself but I think my upgrade to
 Camel 2.12.1 caused this to happen.

 I will investigate this further but could this mean that Camel 2.12.1
 (its features) is incompatible with Karaf 2.3.3?

 /Bengt


 2013/10/14 Bengt Rodehav be...@rodehav.com

 OK - thanks. Will try adding the Spring repo tomorrow. Really strange
 that they don't use Maven central anymore. It complicates things for
 everyone.

 /Bengt


 2013/10/14 Minto van der Sluis mi...@xup.nl

 Hi Bengt,

 No it's not a bug.

 Have a look at https://issues.apache.org/jira/browse/KARAF-2430

 Regards,

 Minto

 Bengt Rodehav schreef op 14-10-2013 17:30:
  I just upgraded to from Karaf 2.3.1 to 2.3.3. When I build my custom
  server I try to download all the dependencies. I tried to use Spring
  version 3.2.4.RELEASE since that is now available in Karaf.
 
  Below is an excerpt from the standard-2.3.3-features.xml included
 in
  Karaf 2.3.3:
 
  !-- Spring 3.2 support --
 
  feature name=spring description=Spring 3.2.x support
  version=3.2.4.RELEASE resolver=(obr)
  bundle dependency=true
 
 start-level=30mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.aopalliance/1.0_6/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.core/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.expression/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.beans/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context.support/3.2.4.RELEASE/bundle

Can't find artifact mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE

2013-10-14 Thread Bengt Rodehav
I just upgraded from Karaf 2.3.1 to 2.3.3. When I build my custom server
I try to download all the dependencies. I tried to use Spring version
3.2.4.RELEASE since that is now available in Karaf.

Below is an excerpt from the standard-2.3.3-features.xml included in
Karaf 2.3.3:

<!-- Spring 3.2 support -->

<feature name="spring" description="Spring 3.2.x support"
         version="3.2.4.RELEASE" resolver="(obr)">
    <bundle dependency="true" start-level="30">mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.aopalliance/1.0_6</bundle>
    <bundle start-level="30">mvn:org.springframework/org.springframework.core/3.2.4.RELEASE</bundle>
    <bundle start-level="30">mvn:org.springframework/org.springframework.expression/3.2.4.RELEASE</bundle>
    <bundle start-level="30">mvn:org.springframework/org.springframework.beans/3.2.4.RELEASE</bundle>
    <bundle start-level="30">mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE</bundle>
    <bundle start-level="30">mvn:org.springframework/org.springframework.context/3.2.4.RELEASE</bundle>
    <bundle start-level="30">mvn:org.springframework/org.springframework.context.support/3.2.4.RELEASE</bundle>
</feature>

All artifact names are org.springframework.xyz. But in Maven central they
seem to be called spring-aop etc. Is this a bug?

/Bengt


Re: Can't find artifact mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE

2013-10-14 Thread Bengt Rodehav
OK - thanks. Will try adding the Spring repo tomorrow. Really strange that
they don't use Maven central anymore. It complicates things for everyone.

/Bengt


2013/10/14 Minto van der Sluis mi...@xup.nl

 Hi Bengt,

 No it's not a bug.

 Have a look at https://issues.apache.org/jira/browse/KARAF-2430

 Regards,

 Minto

 Bengt Rodehav schreef op 14-10-2013 17:30:
  I just upgraded to from Karaf 2.3.1 to 2.3.3. When I build my custom
  server I try to download all the dependencies. I tried to use Spring
  version 3.2.4.RELEASE since that is now available in Karaf.
 
  Below is an excerpt from the standard-2.3.3-features.xml included in
  Karaf 2.3.3:
 
  !-- Spring 3.2 support --
 
  feature name=spring description=Spring 3.2.x support
  version=3.2.4.RELEASE resolver=(obr)
  bundle dependency=true
 
 start-level=30mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.aopalliance/1.0_6/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.core/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.expression/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.beans/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context.support/3.2.4.RELEASE/bundle
  /feature
 
  All artifact names are org.springframework.xyz. But in Maven central
  they seem to be called spring-aop etc. Is this a bug?
 
  /Bengt




Re: Can't find artifact mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE

2013-10-14 Thread Bengt Rodehav
See what you mean Achim... I can only find up to version 3.2.3 in the
Spring EBR not version 3.2.4. Then I would definitely call it a bug
since standard-2.3.3-features.xml
is referencing an artifact version that doesn't exist.

I don't reference the 3.2.4 version myself but I think my upgrade to Camel
2.12.1 caused this to happen.

I will investigate this further but could this mean that Camel 2.12.1 (its
features) is incompatible with Karaf 2.3.3?

/Bengt


2013/10/14 Bengt Rodehav be...@rodehav.com

 OK - thanks. Will try adding the Spring repo tomorrow. Really strange that
 they don't use Maven central anymore. It complicates things for everyone.

 /Bengt


 2013/10/14 Minto van der Sluis mi...@xup.nl

 Hi Bengt,

 No it's not a bug.

 Have a look at https://issues.apache.org/jira/browse/KARAF-2430

 Regards,

 Minto

 Bengt Rodehav schreef op 14-10-2013 17:30:
  I just upgraded to from Karaf 2.3.1 to 2.3.3. When I build my custom
  server I try to download all the dependencies. I tried to use Spring
  version 3.2.4.RELEASE since that is now available in Karaf.
 
  Below is an excerpt from the standard-2.3.3-features.xml included in
  Karaf 2.3.3:
 
  !-- Spring 3.2 support --
 
  feature name=spring description=Spring 3.2.x support
  version=3.2.4.RELEASE resolver=(obr)
  bundle dependency=true
 
 start-level=30mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.aopalliance/1.0_6/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.core/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.expression/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.beans/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.aop/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context/3.2.4.RELEASE/bundle
  bundle
 
 start-level=30mvn:org.springframework/org.springframework.context.support/3.2.4.RELEASE/bundle
  /feature
 
  All artifact names are org.springframework.xyz. But in Maven central
  they seem to be called spring-aop etc. Is this a bug?
 
  /Bengt





Re: How to configure managed service factory with blueprint?

2013-10-10 Thread Bengt Rodehav
OK.

I normally use iPOJO for anything that is a bit more advanced. iPOJO has
many more possibilities than Blueprint. So far I have used iPOJO for all
managed service factories but was looking at Blueprint for the simpler,
more straightforward, cases. Unfortunately it doesn't seem to fit the bill.

Also, when encountering problems/questions regarding iPOJO you always get
prompt responses on the Felix mailing list. Not so on Aries unfortunately.

I will probably stick with iPOJO although it is sometimes overkill.

/Bengt


2013/10/8 SvS dumpacco...@solcon.nl

 Yes. Similar problem.

 No idea how to solve this issue! After debugging, it seems that we need
 managed-properties, because I miss a BeanProcessor.

 I can publish a service, but the configuration properties are never used.

 Thanks,
 Sam.



 --
 View this message in context:
 http://karaf.922171.n3.nabble.com/How-to-configure-managed-service-factory-with-blueprint-tp4029857p4029875.html
 Sent from the Karaf - User mailing list archive at Nabble.com.



Re: How to configure managed service factory with blueprint?

2013-10-07 Thread Bengt Rodehav
You and me seem to have similar problems. I posted the following on the
Aries user list:

http://www.mail-archive.com/user@aries.apache.org/msg01043.html

So far no one has responded which is a bit disappointing.

I also have problems with using the configured properties. But I also
couldn't get the service to be published. An instance of my class is
created but no service is published.

Did you get a service published?

/Bengt


2013/10/7 SvS dumpacco...@solcon.nl

 I use the following configuration in blueprint and I can create the service
 with the Apache Karaf Web Console Configuration. But how can I use the
 specified properties (initial and when updated)?

 <?xml version="1.0" encoding="UTF-8" standalone="no"?>
 <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0"
     xsi:schemaLocation="http://www.osgi.org/xmlns/blueprint/v1.0.0
         http://www.osgi.org/xmlns/blueprint/v1.0.0/blueprint.xsd
         http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0
         http://aries.apache.org/schemas/blueprint-cm/blueprint-cm-1.1.0.xsd">

   <reference id="eventAdmin" interface="org.osgi.service.event.EventAdmin" />

   <cm:managed-service-factory factory-pid="xxx.ServiceFactory">
     <interfaces>
       <value>xxx.Service</value>
     </interfaces>

     <cm:managed-component class="xxx.ServiceImpl" init-method="start"
         destroy-method="stop">
       <property name="eventAdmin" ref="eventAdmin" />
     </cm:managed-component>
   </cm:managed-service-factory>
 </blueprint>

 Also when I specify a ManagedServiceFactory. The ManagedServiceFactory is
 not invoked.

 <?xml version="1.0" encoding="UTF-8" standalone="no"?>
 <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0"
     xsi:schemaLocation="http://www.osgi.org/xmlns/blueprint/v1.0.0
         http://www.osgi.org/xmlns/blueprint/v1.0.0/blueprint.xsd
         http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0
         http://aries.apache.org/schemas/blueprint-cm/blueprint-cm-1.1.0.xsd">

   <bean id="serviceFactory" class="xxx.ServiceFactory" init-method="start"
       destroy-method="stop">
   </bean>

   <service ref="serviceFactory"
       interface="org.osgi.service.cm.ManagedServiceFactory" />

   <reference id="eventAdmin" interface="org.osgi.service.event.EventAdmin" />

   <cm:managed-service-factory ref="serviceFactory"
       factory-pid="xxx.ServiceFactory">
     <interfaces>
       <value>xxx.Service</value>
     </interfaces>

     <cm:managed-component class="xxx.ServiceImpl" init-method="start"
         destroy-method="stop">
       <property name="eventAdmin" ref="eventAdmin" />
     </cm:managed-component>
   </cm:managed-service-factory>
 </blueprint>

 What is wrong / missing?

 Thanks,
 Sam.



 --
 View this message in context:
 http://karaf.922171.n3.nabble.com/How-to-configure-managed-service-factory-with-blueprint-tp4029857.html
 Sent from the Karaf - User mailing list archive at Nabble.com.



Re: JNDI and Weblogic

2013-09-11 Thread Bengt Rodehav
An update...

When looking further into this, it does not seem to involve service
discovery but only old fashioned classloading problems. It seems I have to
set the TCCL in order to get this to work. The following code works:

env.put(Context.PROVIDER_URL, "t3://127.0.0.1:7001");
env.put(Context.INITIAL_CONTEXT_FACTORY,
    "weblogic.jndi.WLInitialContextFactory");
ClassLoader orgCl = Thread.currentThread().getContextClassLoader();
InitialContext ctx = null;
ConnectionFactory cf = null;
try {
  Thread.currentThread().setContextClassLoader(
      WLInitialContextFactory.class.getClassLoader());
  ctx = new InitialContext(env);
  cf = (ConnectionFactory) ctx.lookup(Q_CONN_FACTORY);
}
finally {
  Thread.currentThread().setContextClassLoader(orgCl);
}

Note that I first tried to use this bundle's classloader, but it didn't
work. When I do an actual lookup, I get back an internal WebLogic
class (in this case weblogic.jms.client.JMSConnectionFactory) that cannot
be unmarshalled unless that class can be found on the classpath. Thus, I
must set the TCCL to the bundle that wraps wlthint3client.jar.
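The save/restore pattern above generalizes nicely; a small helper keeps the try/finally in one place. The `Tccl` class name and `callWith` method are my own illustration, not part of the original code or any WebLogic API:

```java
import java.util.function.Supplier;

/** Runs a task with a temporary thread-context classloader (TCCL),
    restoring the original one afterwards, even on failure. */
public final class Tccl {
    private Tccl() {}

    public static <T> T callWith(ClassLoader cl, Supplier<T> task) {
        Thread current = Thread.currentThread();
        ClassLoader original = current.getContextClassLoader();
        try {
            current.setContextClassLoader(cl);
            return task.get();
        } finally {
            // restore whatever was set before, even if the task throws
            current.setContextClassLoader(original);
        }
    }
}
```

With such a helper the JNDI lookup becomes a one-liner passing the classloader of the bundle that wraps wlthint3client.jar.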

/Bengt


2013/9/10 Bengt Rodehav be...@rodehav.com

 I'm trying to connect to Weblogic JMS from within Karaf. I've
 wrapped wlthint3client.jar Weblogic from Weblogic in an OSGi bundle. I use
 the following code:

 HashtableString, String env = new HashtableString, String();
 env.put(Context.PROVIDER_URL, t3://127.0.0.1:7001);
 env.put(Context.INITIAL_CONTEXT_FACTORY,
 weblogic.jndi.WLInitialContextFactory);
 InitialContext ctx = new InitialContext(env);

 This works perfectly outside OSGi but in Karaf I get:

 Caused by: java.lang.ClassNotFoundException: Failed to load class
 weblogic.jms.client.JMSConnectionFactory
  at
 weblogic.rmi.utils.WLRMIClassLoaderDelegate.loadClass(WLRMIClassLoaderDelegate.java:208)
 at
 weblogic.rmi.utils.WLRMIClassLoaderDelegate.loadClass(WLRMIClassLoaderDelegate.java:135)
  at weblogic.rmi.utils.Utilities.loadClass(Utilities.java:305)
 at
 weblogic.rjvm.MsgAbbrevInputStream.resolveClass(MsgAbbrevInputStream.java:433)
  at
 weblogic.utils.io.ChunkedObjectInputStream$NestedObjectInputStream.resolveClass(ChunkedObjectInputStream.java:268)
 at
 java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1574)[:1.6.0_32]
  at
 java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)[:1.6.0_32]
 at
 java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1731)[:1.6.0_32]
  at
 java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)[:1.6.0_32]
 at
 java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)[:1.6.0_32]
  at
 weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:208)
 at
 weblogic.rjvm.MsgAbbrevInputStream.readObject(MsgAbbrevInputStream.java:596)
  at
 weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:204)
 at weblogic.rmi.internal.ObjectIO.readObject(ObjectIO.java:62)
  at weblogic.rjvm.ResponseImpl.unmarshalReturn(ResponseImpl.java:240)
 ... 35 more

 It looks like the the classloader being used is the frameworks classloader
 (:1.6.0_32) which has not imported these packages. I have a feeling that it
 is the service discovery mechanism that is failing. I know that Servicemix
 has wrapped several bundles and made it possible for Java's service
 discovery mechanism to work.

 Am I doing this all wrong or should my code work? Also, if I want to
 support service discovery the way ServiceMix bundles do - how do I do this?
 Is there a good example bundle I can look at?

 /Bengt





JNDI and Weblogic

2013-09-10 Thread Bengt Rodehav
I'm trying to connect to WebLogic JMS from within Karaf. I've
wrapped wlthint3client.jar from WebLogic in an OSGi bundle. I use
the following code:

Hashtable<String, String> env = new Hashtable<String, String>();
env.put(Context.PROVIDER_URL, "t3://127.0.0.1:7001");
env.put(Context.INITIAL_CONTEXT_FACTORY,
    "weblogic.jndi.WLInitialContextFactory");
InitialContext ctx = new InitialContext(env);

This works perfectly outside OSGi but in Karaf I get:

Caused by: java.lang.ClassNotFoundException: Failed to load class
weblogic.jms.client.JMSConnectionFactory
at
weblogic.rmi.utils.WLRMIClassLoaderDelegate.loadClass(WLRMIClassLoaderDelegate.java:208)
at
weblogic.rmi.utils.WLRMIClassLoaderDelegate.loadClass(WLRMIClassLoaderDelegate.java:135)
at weblogic.rmi.utils.Utilities.loadClass(Utilities.java:305)
at
weblogic.rjvm.MsgAbbrevInputStream.resolveClass(MsgAbbrevInputStream.java:433)
at
weblogic.utils.io.ChunkedObjectInputStream$NestedObjectInputStream.resolveClass(ChunkedObjectInputStream.java:268)
at
java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1574)[:1.6.0_32]
at
java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)[:1.6.0_32]
at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1731)[:1.6.0_32]
at
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)[:1.6.0_32]
at
java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)[:1.6.0_32]
at
weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:208)
at
weblogic.rjvm.MsgAbbrevInputStream.readObject(MsgAbbrevInputStream.java:596)
at
weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:204)
at weblogic.rmi.internal.ObjectIO.readObject(ObjectIO.java:62)
at weblogic.rjvm.ResponseImpl.unmarshalReturn(ResponseImpl.java:240)
... 35 more

It looks like the classloader being used is the framework's classloader
(:1.6.0_32), which has not imported these packages. I have a feeling that it
is the service discovery mechanism that is failing. I know that ServiceMix
has wrapped several bundles and made it possible for Java's service
discovery mechanism to work.

Am I doing this all wrong or should my code work? Also, if I want to
support service discovery the way ServiceMix bundles do - how do I do this?
Is there a good example bundle I can look at?

/Bengt


Re: Karaf webconsole exports wrong package version

2013-05-30 Thread Bengt Rodehav
Thanks a lot JB - you are fast...

/Bengt


2013/5/30 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 I fixed it (with a couple of other issues around org.json package, etc).
 It will be included in 2.3.2, etc.

 Regards
 JB


 On 05/30/2013 09:56 AM, Bengt Rodehav wrote:

 I'm using Karaf 2.3.1. I just attempted to upgrade iPOJO to the latest
 version, including the new (version 1.7.0) iPOJO web console plugin. This
 fails because it requires a version of the package
 org.apache.felix.webconsole that is less than 4.0.0.

 After discussing this on Felix user mailing list, it seems like the
 Karaf webconsole exports this package using the wrong version. Karaf
 seems to export the package without a specific version which causes the
 bundle's version to be used for the package. Karaf should export this
 package at version 3.1.2 explicitly.
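For context, the fix amounts to giving that package an explicit version in the bundle's manifest; assuming the webconsole bundle is built with maven-bundle-plugin, the instruction might look roughly like this (a sketch, not the actual Karaf POM):

```xml
<!-- maven-bundle-plugin instruction (sketch): export the embedded Felix
     webconsole API at its own version, not the enclosing bundle's version -->
<Export-Package>
    org.apache.felix.webconsole;version="3.1.2"
</Export-Package>
```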

 The discussion can be seen at:

 http://www.mail-archive.com/users@felix.apache.org/msg13910.html

 I'll create a JIRA if someone can verify this analysis.

 /Bengt


 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://www.talend.com



Re: Karaf webconsole exports wrong package version

2013-05-30 Thread Bengt Rodehav
Thanks.


2013/5/30 Jean-Baptiste Onofré j...@nanthrax.net

 FYI, just for tracking and fixVersion:

 https://issues.apache.org/jira/browse/KARAF-2346

 The other one that can interest you:

 https://issues.apache.org/jira/browse/KARAF-2297

 Regards
 JB


 On 05/30/2013 11:08 AM, Bengt Rodehav wrote:

 Thanks a lot JB - you are fast...

 /Bengt


 2013/5/30 Jean-Baptiste Onofré j...@nanthrax.net mailto:j...@nanthrax.net


 Hi Bengt,

 I fixed it (with a couple of other issues around org.json package,
 etc). It will be included in 2.3.2, etc.

 Regards
 JB


 On 05/30/2013 09:56 AM, Bengt Rodehav wrote:

 I'm using Karaf 2.3.1. I just attempted to upgrade iPojo to the
 latest
 version incuding the new (version 1.7.0) iPojo web console
 plugin. This
 fails because it requires a version of the package
 org.apache.felix.webconsole that is less than 4.0.0.

 After discussing this on Felix user mailing list, it seems like
 the
 Karaf webconsole exports this package using the wrong version.
 Karaf
 seems to export the package without a specific version which
 causes the
 bundle's version to be used for the package. Karaf should export
 this
 package at version 3.1.2 explicitly.

 The discussion can be seen at:

 http://www.mail-archive.com/__**us...@felix.apache.org/__**
 msg13910.htmlhttp://www.mail-archive.com/__users@felix.apache.org/__msg13910.html

 http://www.mail-archive.com/**us...@felix.apache.org/**
 msg13910.htmlhttp://www.mail-archive.com/users@felix.apache.org/msg13910.html
 

 I'll create a JIRA if someone can verify this analysis.

 /Bengt


 --
 Jean-Baptiste Onofré
 jbono...@apache.org mailto:jbono...@apache.org

 http://blog.nanthrax.net
 Talend - http://www.talend.com



 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://www.talend.com



Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
I've been using Karaf 2.3.0 for a while. I now tried to upgrade to Karaf
2.3.1 but ran into problems with CXF.

I use cxf-codegen-plugin to generate code from a WSDL file so that I can
call the web service via a proxy. However, after upgrading to Karaf 2.3.1 I
get the following exception:

2013-04-02 09:19:03,317 | ERROR | rint Extender: 3 | BlueprintContainerImpl
  | container.BlueprintContainerImpl  393 | Unable to start
blueprint container for bundle se.digia.connect.services.iso20022.iws-client
org.osgi.service.blueprint.container.ComponentDefinitionException: Error
when instantiating bean iwsService of class class
se.digia.connect.iso20022.iwsclient.Client
at
org.apache.aries.blueprint.container.BeanRecipe.getInstance(BeanRecipe.java:333)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:806)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.container.BeanRecipe.internalCreate(BeanRecipe.java:787)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.di.AbstractRecipe$1.call(AbstractRecipe.java:79)[7:org.apache.aries.blueprint.core:1.1.0]
at
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)[:1.6.0_32]
at java.util.concurrent.FutureTask.run(FutureTask.java:138)[:1.6.0_32]
at
org.apache.aries.blueprint.di.AbstractRecipe.create(AbstractRecipe.java:88)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.container.BlueprintRepository.createInstances(BlueprintRepository.java:245)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.container.BlueprintRepository.createAll(BlueprintRepository.java:183)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:668)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:370)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:261)[7:org.apache.aries.blueprint.core:1.1.0]
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)[:1.6.0_32]
at
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)[:1.6.0_32]
at java.util.concurrent.FutureTask.run(FutureTask.java:138)[:1.6.0_32]
at
org.apache.aries.blueprint.container.ExecutorServiceWrapper.run(ExecutorServiceWrapper.java:106)[7:org.apache.aries.blueprint.core:1.1.0]
at
org.apache.aries.blueprint.utils.threading.impl.DiscardableRunnable.run(DiscardableRunnable.java:48)[7:org.apache.aries.blueprint.core:1.1.0]
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)[:1.6.0_32]
at
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)[:1.6.0_32]
at java.util.concurrent.FutureTask.run(FutureTask.java:138)[:1.6.0_32]
at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:98)[:1.6.0_32]
at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:206)[:1.6.0_32]
at
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)[:1.6.0_32]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)[:1.6.0_32]
at java.lang.Thread.run(Thread.java:662)[:1.6.0_32]
Caused by: javax.xml.ws.spi.FactoryFinder$ConfigurationError: Provider
org.apache.cxf.jaxws.spi.ProviderImpl not found
at javax.xml.ws.spi.FactoryFinder$2.run(FactoryFinder.java:130)
at
javax.xml.ws.spi.FactoryFinder.doPrivileged(FactoryFinder.java:229)[:1.6.0_32]
at
javax.xml.ws.spi.FactoryFinder.newInstance(FactoryFinder.java:124)[:1.6.0_32]
at
javax.xml.ws.spi.FactoryFinder.access$200(FactoryFinder.java:44)[:1.6.0_32]
at javax.xml.ws.spi.FactoryFinder$3.run(FactoryFinder.java:220)
at
javax.xml.ws.spi.FactoryFinder.doPrivileged(FactoryFinder.java:229)[:1.6.0_32]
at javax.xml.ws.spi.FactoryFinder.find(FactoryFinder.java:160)[:1.6.0_32]
at javax.xml.ws.spi.Provider.provider(Provider.java:43)[:1.6.0_32]
at javax.xml.ws.Service.init(Service.java:35)[:1.6.0_32]
at
se.digia.connect.iso20022.iwsclient.iws.IntegrationWebService.init(IntegrationWebService.java:30)
at se.digia.connect.iso20022.iwsclient.Client.createProxy(Client.java:198)
at se.digia.connect.iso20022.iwsclient.Client.init(Client.java:35)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)[:1.6.0_32]
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)[:1.6.0_32]
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)[:1.6.0_32]
at
java.lang.reflect.Constructor.newInstance(Constructor.java:513)[:1.6.0_32]
at
org.apache.aries.blueprint.utils.ReflectionUtils.newInstance(ReflectionUtils.java:329)
at

Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
Hello JB,

No I don't use jre.properties.cxf - where can I find it? It doesn't seem to
be part of the distribution.

Everything works fine under Karaf 2.3.0 - without using jre.properties.cxf.

I'm also using Camel 2.10.3. I did this on Karaf 2.3.0 too without any
problems. It seems like the org.apache.cxf.jaxws.spi package is exported by
the bundle org.apache.cxf.cxf-rt-frontend-jaxws in Cxf 2.6.3. However, no
bundle is importing the package.

I've experimented with setting the TCCL in my bundle but I then get other
problems instead. Also, I didn't have to do this in Karaf 2.3.0.

How is this supposed to work? Do I need to manipulate the TCCL myself, or
should I add some (dynamic?) imports to get this to work? Previously I
didn't have to do anything special at all.
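To illustrate, the TCCL experiment amounts to this save/set/restore pattern (a minimal sketch of what I tried - the classloader and the action are placeholders, not my actual proxy-creation code):

```java
import java.util.concurrent.Callable;

public class TcclScope {
    // Run an action with the given classloader installed as the thread
    // context classloader (TCCL), restoring the previous one afterwards.
    static <T> T withTccl(ClassLoader cl, Callable<T> action) throws Exception {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        current.setContextClassLoader(cl);
        try {
            return action.call();
        } finally {
            current.setContextClassLoader(previous);
        }
    }
}
```

In the constructor I wrapped the Service/proxy creation in something like withTccl(getClass().getClassLoader(), ...) - but as I said, that led to other problems instead.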

Looking at the stack trace it seems to be the system bundle (class
javax.xml.ws.spi.FactoryFinder) that tries to instantiate the ProviderImpl.
Thus, it doesn't matter if my bundle imports the org.apache.cxf.jaxws.spi
or not.

Not really sure how this is supposed to work...

/Bengt



2013/4/2 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 do you use the jre.properties.cxf ?

 No problem with 2.3.0 ?

 Regards
 JB


 On 04/02/2013 09:30 AM, Bengt Rodehav wrote:


Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
Hello Freeman,

It would be a lot of work for me to narrow down my application to a simple
test case. I'd really like to try other possibilities first, like:

- Understanding how the factory pattern is supposed to work, especially for
Cxf
- What has been changed in Karaf 2.3.1 that could affect this
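For what it's worth, my (possibly wrong) understanding of the lookup that fails is roughly this: FactoryFinder reads a META-INF/services descriptor through the TCCL and falls back to a default provider class name. A simplified, self-contained sketch of that step:

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class SpiLookupSketch {
    // Simplified version of what javax.xml.ws.spi.FactoryFinder does:
    // read the provider class name from a META-INF/services descriptor
    // visible to the context classloader, else fall back to a default.
    static String findProviderName(String serviceId, String fallback) throws Exception {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        InputStream in = cl.getResourceAsStream("META-INF/services/" + serviceId);
        if (in == null) {
            return fallback; // descriptor not visible -> default implementation
        }
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            return r.readLine().trim();
        }
    }
}
```

The class name found this way must then be loadable by whoever calls Class.forName - which seems to be where "Provider org.apache.cxf.jaxws.spi.ProviderImpl not found" comes from.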

/Bengt


2013/4/2 Freeman Fang freeman.f...@gmail.com

 Hi,

 No concrete idea now, but could you please attach a test case that we can
 build and use to reproduce it?
 Thanks
 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 下午3:30, Bengt Rodehav wrote:


Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
I found this blog post by Dan Kulp:
http://www.dankulp.com/blog/2011/11/apache-cxf-in-osgi/

I modified the jre.properties accordingly but I still get the exact same
stack trace.

/Bengt



Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
Thanks JB,

I did notice that Karaf provides servicemix-specs API bundles in
lib\endorsed, including jaxws. I thought that these jars overrode the ones in
the JVM. If so, you shouldn't need to modify the jre.properties - or have
I got it all wrong?

/Bengt


2013/4/2 Jean-Baptiste Onofré j...@nanthrax.net

 It's the jre.properties that I'm talking about. It should be provided in
 etc/jre.properties.cxf.

 So the problem is not about a JRE package mismatch.

 I'm going to take a deeper look (especially around the dependency updates
 between 2.3.0 and 2.3.1).

 Regards
 JB


 On 04/02/2013 10:38 AM, Bengt Rodehav wrote:


Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
I'll see if I can get a test case done for this, although it might take a
while. Meanwhile, can you explain what mechanism is used for resolving the
implementation classes? I mean, how is the system bundle supposed to
resolve a class that resides in another bundle? (In this case the
cxf-rt-frontend bundle.)

/Bengt


2013/4/2 Freeman Fang freeman.f...@gmail.com

 Hi,

 No, that blog is a little bit out of date and not applicable to Karaf
 2.3.x anymore.

 With Karaf 2.3.x we endorse the specs (like jaxws/jaxb) jars, so we need to
 export those packages from the system bundle 0 - so don't comment it out.

 Those endorsed specs jars can then load the jaxws impl bundle (the cxf jaxws
 frontend bundle in this case) at runtime OOTB.

 No concrete idea why it doesn't work for you (also, I don't think there's
 any real difference between Karaf 2.3 and Karaf 2.3.1 in terms of the jaxws
 loading mechanism) - that's why I asked for a test case; it would definitely
 help resolve the issue faster.


 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 下午4:38, Bengt Rodehav wrote:


Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
Thanks, will read the blog.


2013/4/2 Freeman Fang freeman.f...@gmail.com

 Hi,
 Good question - yeah, the traditional Java SPI mechanism generally doesn't
 work in an OSGi container.
 So the ServiceMix Specs project[1] uses an OSGi locator to resolve this
 problem; Guillaume wrote a blog about it years ago[2]

 [1]https://svn.apache.org/repos/asf/servicemix/smx4/specs/trunk/
 [2]http://gnodet.blogspot.com/2008/05/jee-specs-in-osgi.html
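Very roughly, the locator idea is: implementation bundles register a factory for a spec class name, and the endorsed spec jar asks that registry first instead of scanning a flat classpath. A minimal sketch (the names are illustrative only, not the real servicemix-specs API):

```java
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.ConcurrentHashMap;

public class OsgiLocatorSketch {
    // Registry populated by implementation bundles (in real OSGi this is
    // driven by bundle activators / bundle listeners in the spec jar).
    private static final Map<String, Callable<Class<?>>> FACTORIES = new ConcurrentHashMap<>();

    // An impl bundle registers a factory for a spec class name.
    public static void register(String specClassName, Callable<Class<?>> factory) {
        FACTORIES.put(specClassName, factory);
    }

    // The endorsed spec jar consults the registry instead of the classpath.
    public static Class<?> locate(String specClassName) throws Exception {
        Callable<Class<?>> f = FACTORIES.get(specClassName);
        return f == null ? null : f.call();
    }
}
```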

 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 下午5:07, Bengt Rodehav wrote:


Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
I read the blog but it didn't go into details regarding how the
implementation classes are found - I guess I'll have to look in the code.

One observation regarding my problem: What gives me an exception is when
blueprint attempts to instantiate my class. In my class constructor I try
to create the web service proxy. I did try to import the
org.apache.cxf.jaxws.spi package to the bundle that contains the class that
cannot be instantiated. I can see in runtime that the bundle does indeed
import the org.apache.cxf.jaxws.spi package from the cxf-rt-frontend
bundle. But I still get the exception indicating that it cannot find the
org.apache.cxf.jaxws.spi.ProviderImpl class.

Thus, putting the ProviderImpl class in my bundle's classpath doesn't help.
It seems like it has to be in the system bundle's classpath. I really don't
know how that mechanism is supposed to work - but it did work in Karaf
2.3.0. Does the new blueprint version do some classloading magic?

/Bengt


2013/4/2 Bengt Rodehav be...@rodehav.com

 Thanks, will read the blog.


 2013/4/2 Freeman Fang freeman.f...@gmail.com

 Hi,
 Good question. Yeah, the traditional Java SPI mechanism generally doesn't
 work in an OSGi container.
 So the ServiceMix Specs project [1] uses an OSGi locator to resolve this
 problem; Guillaume wrote a blog about it years ago [2].

 [1] https://svn.apache.org/repos/asf/servicemix/smx4/specs/trunk/
 [2] http://gnodet.blogspot.com/2008/05/jee-specs-in-osgi.html
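[Editor's note] The lookup Freeman describes can be illustrated with a small sketch. This is a simplified stand-in for the JAX-WS-style factory finder, not the real javax.xml.ws code: the spec classes resolve an implementation class by name through the thread context class loader (TCCL), which works on a flat classpath but not from an ordinary OSGi bundle class loader — the ServiceMix Specs "locator" replaces exactly this step with an OSGi-aware one. The class and method names here are hypothetical.

```java
// Simplified sketch of a spec-style factory lookup via the TCCL.
public class FactoryFinderSketch {

    static Object find(String fallbackClassName) throws Exception {
        // 1. Use the TCCL if one is set, else fall back to our own loader.
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        if (cl == null) {
            cl = FactoryFinderSketch.class.getClassLoader();
        }
        // 2. (The real finder first consults a system property and a
        //    META-INF/services resource; omitted here for brevity.)
        Class<?> clazz = Class.forName(fallbackClassName, true, cl);
        return clazz.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in "implementation" so the sketch runs anywhere; in the CXF
        // case the name would be org.apache.cxf.jaxws.spi.ProviderImpl.
        Object impl = find("java.util.ArrayList");
        System.out.println(impl.getClass().getName()); // prints java.util.ArrayList
    }
}
```

In OSGi the `Class.forName` in step 2 fails unless the TCCL (or the caller's loader) can actually see the implementation bundle, which is the failure reported in this thread.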

  -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 5:07 PM, Bengt Rodehav wrote:

 I'll see if I can get a test case done for this although it might take a
 while. Meanwhile, can you explain what mechanism is used for resolving the
 implementation classes. I mean, how is the system bundle supposed to
 resolve a class that resides in another bundle? (In this case the
 cxf-rt-frontend).

 /Bengt


 2013/4/2 Freeman Fang freeman.f...@gmail.com

 Hi,

 No, that blog is a little bit out of date and not applicable to Karaf
 2.3.x anymore.

 With Karaf 2.3.x we endorse the specs jars (like jaxws/jaxb), so we need to
 export those packages from system bundle 0; don't comment that out.

 Those endorsed specs jars can then load the jaxws impl bundle (the cxf jaxws
 frontend bundle in this case) at runtime OOTB.

 No concrete idea why it doesn't work for you (also, I don't think there's
 any real difference between Karaf 2.3 and Karaf 2.3.1 in terms of the jaxws
 loading mechanism); that's why I asked for a test case, which would definitely
 help resolve the issue faster.


  -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 4:38 PM, Bengt Rodehav wrote:

 I found this blog post by Dan Kulp:
 http://www.dankulp.com/blog/2011/11/apache-cxf-in-osgi/

 I modified the jre.properties accordingly but I still get the exact same
 stack trace.

 /Bengt


 2013/4/2 Bengt Rodehav be...@rodehav.com

 Hello Freeman,

 It would be a lot of work for me to narrow down my application to a
 simple test case. I'd really like to try other possibilities first, like:

  - Understanding how the factory pattern is supposed to work, especially
 for Cxf
 - What has been changed in Karaf 2.3.1. that could affect this

 /Bengt


 2013/4/2 Freeman Fang freeman.f...@gmail.com

 Hi,

 No concrete idea now. Could you please attach a test case that we can
 build to reproduce the problem?
 Thanks
  -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 3:30 PM, Bengt Rodehav wrote:

 I've been using Karaf 2.3.0 for a while. I now tried to upgrade to
 Karaf 2.3.1 but ran into problems with CXF.

 I use cxf-codegen-plugin to generate code from a WSDL file so that I
 can call the web service via a proxy. However, after upgrading to Karaf
 2.3.1 I get the following exception:

 2013-04-02 09:19:03,317 | ERROR | rint Extender: 3 |
 BlueprintContainerImpl   | container.BlueprintContainerImpl  393 |
 Unable to start blueprint container for bundle
 se.digia.connect.services.iso20022.iws-client
 org.osgi.service.blueprint.container.ComponentDefinitionException:
 Error when instantiating bean iwsService of class class
 se.digia.connect.iso20022.iwsclient.Client
 at
 org.apache.aries.blueprint.container.BeanRecipe.getInstance(BeanRecipe.java:333)[7:org.apache.aries.blueprint.core:1.1.0]
  at
 org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:806)[7

Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
Another observation.

In the Cxf 2.6.3 feature descriptor, the cxf-specs feature includes, among
others, the 2.2 versions of jaxb-api and jaxws-api. Since the corresponding
APIs are only at the 2.1 level in Karaf 2.3.0's lib\endorsed, the bundles are
used instead. But Karaf 2.3.1 has updated the endorsed libraries to
version 2.2, which means that they are now used by Cxf instead of the
bundles listed in the cxf-specs feature.

In other words, the Cxf feature descriptor must (or should) be changed in
order to work under Karaf 2.3.1. Perhaps this has been done in later
versions of Cxf - I haven't checked.

Even if I fix this I'm still not done since I don't know how to get the
endorsed versions to work with Cxf. Hopefully someone knows how to get that
to work.

/Bengt


2013/4/2 Bengt Rodehav be...@rodehav.com

 No, I did not update the Cxf version. I use Cxf 2.6.3 in both cases.

 I wonder why the cxf-api bundle imports the org.apache.cxf.jaxws.spi
 package. Maybe it's been done by accident but it still is what makes it
 work under Karaf 2.3.0...

 I'm currently looking at the source code for the FactoryFinder class that
 does the actual loading of the factory class. It does seem like it uses the
 TCCL. Do you know if I'm supposed to set the TCCL manually? What is the
 actual usage pattern here?

 /Bengt
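[Editor's note] Since the FactoryFinder consults the TCCL, a common workaround in OSGi is to set the TCCL to a loader that can see the implementation for the duration of the call, and always restore it afterwards. The helper below is a hypothetical sketch, not a CXF or Karaf API:

```java
import java.util.concurrent.Callable;

// Hypothetical helper: run a piece of work with a specific thread
// context class loader (TCCL), restoring the previous one afterwards.
public class TcclSwap {

    static <T> T withTccl(ClassLoader cl, Callable<T> work) throws Exception {
        Thread t = Thread.currentThread();
        ClassLoader old = t.getContextClassLoader();
        t.setContextClassLoader(cl);
        try {
            return work.call();
        } finally {
            // Restore the previous TCCL even if the work throws.
            t.setContextClassLoader(old);
        }
    }

    public static void main(String[] args) throws Exception {
        ClassLoader before = Thread.currentThread().getContextClassLoader();
        // In the mail above, 'work' would be the JAX-WS proxy creation done in
        // the bean constructor, and 'cl' the bundle's own class loader.
        String seen = withTccl(TcclSwap.class.getClassLoader(),
                () -> Thread.currentThread().getContextClassLoader().toString());
        System.out.println("ran with: " + seen);
        System.out.println("TCCL restored: "
                + (Thread.currentThread().getContextClassLoader() == before));
    }
}
```

Whether this is appropriate here depends on how the spec jars resolve the impl; it is the usual pattern when a library's SPI lookup "does seem like it uses the TCCL".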


 2013/4/2 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 it can but it doesn't come from Karaf. I guess that you updated the CXF
 version as well ?

 Regards
 JB


 On 04/02/2013 02:01 PM, Bengt Rodehav wrote:

 I compared Karaf 2.3.0 with 2.3.1 and noticed that (with my
 configuration) in Karaf 2.3.0, the org.apache.cxf.cxf-api bundle imports
 the org.apache.cxf.jaxws.spi package from the cxf-rt-frontend bundle.
 This does not happen in Karaf 2.3.1. Could this cause the problem?

 /Bengt




 2013/4/2 Bengt Rodehav be...@rodehav.com


 I read the blog but it didn't go into details regarding how the
 implementation classes are found - I guess I'll have to look in the
 code.

 One observation regarding my problem: What gives me an exception is
 when blueprint attempts to instantiate my class. In my class
 constructor I try to create the web service proxy. I did try to
 import the org.apache.cxf.jaxws.spi package to the bundle that
 contains the class that cannot be instantiated. I can see in runtime
 that the bundle does indeed import the org.apache.cxf.jaxws.spi
 package from the cxf-rt-frontend bundle. But I still get the
 exception indicating that it cannot find the
 org.apache.cxf.jaxws.spi.ProviderImpl class.

 Thus, putting the ProviderImpl class in my bundle's classpath
 doesn't help. It seems like it has to be in the system bundle's
 classpath. I really don't know how that mechanism is supposed to
 work - but it did work in Karaf 2.3.0. Does the new blueprint
 version do some classloading magic?

 /Bengt


 2013/4/2 Bengt Rodehav be...@rodehav.com


 Thanks, will read the blog.


 2013/4/2 Freeman Fang freeman.f...@gmail.com


 Hi,
 Good question. Yeah, the traditional Java SPI mechanism
 generally doesn't work in an OSGi container.
 So the ServiceMix Specs project [1] uses an OSGi locator to resolve
 this problem; Guillaume wrote a blog about it years ago [2]

 [1] https://svn.apache.org/repos/asf/servicemix/smx4/specs/trunk/
 [2] http://gnodet.blogspot.com/2008/05/jee-specs-in-osgi.html

 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 5:07 PM, Bengt Rodehav wrote:

  I'll see if I can get a test case done for this although
 it might take a while. Meanwhile, can you explain what
 mechanism is used for resolving the implementation
 classes. I mean, how is the system bundle supposed to
 resolve a class that resides in another bundle? (In this
 case the cxf-rt-frontend).

 /Bengt


 2013/4/2 Freeman Fang freeman.f...@gmail.com


 Hi,

 No, that blog is a little bit out of date and not
 applicable for Karaf 2.3.x anymore.

 With Karaf 2.3.x we endorse specs(like jaxws/jaxb)
 jars, so we need export those

Re: Problems with Karaf 2.3.1 and Cxf

2013-04-02 Thread Bengt Rodehav
Just checked the feature descriptor for Cxf 2.7.3. It still includes the
2.2 api's. I wonder if anyone has gotten this to work on Karaf 2.3.1 and,
if so, how they did it.

/Bengt


2013/4/2 Bengt Rodehav be...@rodehav.com

 Another observation.

 In Cxf 2.6.3 feature descriptor, the feature cxf-specs include, among
 others, 2.2 versions of jaxb-api and jaxws-api. Since the corresponding
 api's are only on 2.1 level in Karaf 2.3.0 lib\endorsed, the bundles are
 used instead. But, Karaf 2.3.1 has updated the endorsed libraries to
 version 2.2, which means that they are now used by Cxf instead of the
 bundles listed in the cxf-spec feature.

 In other words, the Cxf feature descriptor must (or should) be changed in
 order to work under Karaf 2.3.1. Perhaps this has been done in later
 versions of Cxf - I haven't checked.

 Even if I fix this I'm still not done since I don't know how to get the
 endorsed versions to work with Cxf. Hopefully someone knows how to get that
 to work.

 /Bengt


 2013/4/2 Bengt Rodehav be...@rodehav.com

 No, I did not update the Cxf version. I use Cxf 2.6.3 in both cases.

 I wonder why the cxf-api bundle imports the org.apache.cxf.jaxws.spi
 package. Maybe it's been done by accident but it still is what makes it
 work under Karaf 2.3.0...

 I'm currently looking at the source code for the FactoryFinder class that
 does the actual loading of the factory class. It does seem like it uses the
 TCCL. Do you know if I'm supposed to set the TCCL manually? What is the
 actual usage pattern here?

 /Bengt


 2013/4/2 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 it can but it doesn't come from Karaf. I guess that you updated the CXF
 version as well ?

 Regards
 JB


 On 04/02/2013 02:01 PM, Bengt Rodehav wrote:

 I compared Karaf 2.3.0 with 2.3.1 and noticed that (with my
 configuration) in Karaf 2.3.0, the org.apache.cxf.cxf-api bundle imports
 the org.apache.cxf.jaxws.spi package from the cxf-rt-frontend bundle.
 This does not happen in Karaf 2.3.1. Could this cause the problem?

 /Bengt




 2013/4/2 Bengt Rodehav be...@rodehav.com


 I read the blog but it didn't go into details regarding how the
 implementation classes are found - I guess I'll have to look in the
 code.

 One observation regarding my problem: What gives me an exception is
 when blueprint attempts to instantiate my class. In my class
 constructor I try to create the web service proxy. I did try to
 import the org.apache.cxf.jaxws.spi package to the bundle that
 contains the class that cannot be instantiated. I can see in runtime
 that the bundle does indeed import the org.apache.cxf.jaxws.spi
 package from the cxf-rt-frontend bundle. But I still get the
 exception indicating that it cannot find the
 org.apache.cxf.jaxws.spi.ProviderImpl class.

 Thus, putting the ProviderImpl class in my bundle's classpath
 doesn't help. It seems like it has to be in the system bundle's
 classpath. I really don't know how that mechanism is supposed to
 work - but it did work in Karaf 2.3.0. Does the new blueprint
 version do some classloading magic?

 /Bengt


 2013/4/2 Bengt Rodehav be...@rodehav.com


 Thanks, will read the blog.


 2013/4/2 Freeman Fang freeman.f...@gmail.com


 Hi,
 Good question. Yeah, the traditional Java SPI mechanism
 generally doesn't work in an OSGi container.
 So the ServiceMix Specs project [1] uses an OSGi locator to resolve
 this problem; Guillaume wrote a blog about it years ago [2]

 [1] https://svn.apache.org/repos/asf/servicemix/smx4/specs/trunk/
 [2] http://gnodet.blogspot.com/2008/05/jee-specs-in-osgi.html

 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2013-4-2, at 5:07 PM, Bengt Rodehav wrote:

  I'll see if I can get a test case done for this although
 it might take a while. Meanwhile, can you explain what
 mechanism is used for resolving the implementation
 classes. I mean, how is the system bundle supposed to
 resolve a class that resides in another bundle? (In this
 case the cxf-rt-frontend).

 /Bengt


 2013/4/2 Freeman Fang freeman.f...@gmail.com


 Hi

Re: My Karaf PaxExam test mysteriously fails after new year

2013-01-03 Thread Bengt Rodehav
I've had the exact same problem and have really been scratching my head...

Dan, can you share exactly what you put in the dependency management
section?

/Bengt


2013/1/3 mikevan mvangeert...@comcast.net

 Great catch, Dan!

 On 1/2/2013 9:47 PM, dantran [via Karaf] wrote:
  Looks like it is a bug under pax-exam 2.4; forcing my maven build
  to pick up pax-exam 2.6 via dependencyManagement fixes the issue.
 
  My guess is that karaf 2.3.1-SNAPSHOT is also seeing the same issue, so
  upgrading to pax-exam 2.6 would fix it as well
 
  -Dan
 
 
   On Wed, Jan 2, 2013 at 7:13 PM, Dan Tran [hidden email] wrote:
 
   Hi
  
   My pax-exam test started to fail with the following trace
  
   java.lang.Exception: Could not start bundle
   mvn:org.ops4j.pax.swissbox/pax-swissbox-core/ in feature(s)
   exam-2.4.0: Unresolved constraint in bundle
   org.ops4j.pax.swissbox.core [68]: Unable to resolve
   68.0: missing requirement [68.0] osgi.wiring.package;
   ((osgi.wiring.package=org.ops4j.lang)(version=1.4.0))
   at
 
 org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:476)[20:org.apache.karaf.features.core:2.3.0]
   at
 
 org.apache.karaf.features.internal.FeaturesServiceImpl$2.run(FeaturesServiceImpl.java:1141)[20:org.apache.karaf.features.core:2.3.0]
   Caused by: org.osgi.framework.BundleException: Unresolved constraint
   in bundle org.ops4j.pax.swissbox.core [68]: Unable to resolve 68.0:
   missing requirement [68.0] osgi.wiring.package; ((osgi.wiring.
   package=org.ops4j.lang)(version=1.4.0))
   at
 
 org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:3826)[org.apache.felix.framework-4.0.3.jar:]
   at
 
 org.apache.felix.framework.Felix.startBundle(Felix.java:1868)[org.apache.felix.framework-4.0.3.jar:]
   at
 
 org.apache.felix.framework.BundleImpl.start(BundleImpl.java:944)[org.apache.felix.framework-4.0.3.jar:]
   at
 
 org.apache.felix.framework.BundleImpl.start(BundleImpl.java:931)[org.apache.felix.framework-4.0.3.jar:]
   at
 
 org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:473)[20:org.apache.karaf.features.core:2.3.0]
   ... 1 more
  
  
   Further investigation points to a generated feature file under my exam
   directory with this name examfeatures.xml
  
    <?xml version="1.0" encoding="UTF-8"?>
    <features name="pax-exam-features-2.4.0">
    <feature name="exam" version="2.4.0">

    <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-lang/1.3.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-monitors/1.3.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-net/1.3.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-store/1.3.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-io/1.3.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-spi/1.3.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-util-property/1.3.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-core/</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-extender/</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-lifecycle/</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-framework/</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam/2.4.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-extender-service/2.4.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-container-rbc/2.4.0</bundle>
    <bundle start-level='5'>wrap:mvn:junit/junit/4.10</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-invoker-junit/2.4.0</bundle>
    <bundle start-level='5'>mvn:org.apache.karaf.tooling.exam/org.apache.karaf.tooling.exam.options/2.3.0</bundle>
    <bundle start-level='5'>mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.0</bundle>
    <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-inject/2.4.0</bundle>
    </feature>
    </features>
  
  
    The pax-swissbox-xxx bundles do not have an associated version, so by
    default the latest version, 1.6 (released on
    12/26/2012), is picked up instead of 1.5.1, which causes the failure.
   
    How do I fix this? Adding those artifacts to my test dependencies
    does not work either.
  
   Thanks
  
   -Dan
 
 
  
  If you reply to this email, your message will be added to the
  discussion below:
 
 http://karaf.922171.n3.nabble.com/My-Karaf-PaxExam-test-mysteriously-fails-after-new-year-tp4027175p4027176.html
 
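[Editor's note] The failure above comes from the version-less mvn: URLs in the generated examfeatures.xml; with no version segment, the resolver fetches the newest available release. A hypothetical corrected entry, pinning the swissbox bundle to the 1.5.1 release the test was built against, would look like:

```xml
<!-- Pinning the version keeps the resolver from silently picking up a
     newer release (1.6 in this case) once it is published. -->
<bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-core/1.5.1</bundle>
```

The same applies to the other pax-swissbox entries; the fix actually adopted in this thread was overriding the pax-exam version via dependencyManagement.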

Re: My Karaf PaxExam test mysteriously fails after new year

2013-01-03 Thread Bengt Rodehav
OK - I'll wait until you've committed this in Karaf then.

Thanks,

/Bengt


2013/1/3 Jean-Baptiste Onofré j...@nanthrax.net

 I think that pax-exam 2.6 (and dependencies like swissbox, etc) should be
 enough.

 Anyway, I gonna update for Karaf 2.3.1.

 Regards
 JB


 On 01/03/2013 10:10 AM, Bengt Rodehav wrote:

 I've had the exact same problem and have really been scratching my head...

 Dan, can you share exactly what you put in the dependency management
 section?

 /Bengt


 2013/1/3 mikevan mvangeert...@comcast.net


 Great catch, Dan!

 On 1/2/2013 9:47 PM, dantran [via Karaf] wrote:
    Looks like it is a bug under pax-exam 2.4; forcing my maven build
    to pick up pax-exam 2.6 via dependencyManagement fixes the issue.
   
    My guess is that karaf 2.3.1-SNAPSHOT is also seeing the same issue, so
    upgrading to pax-exam 2.6 would fix it as well
   
    -Dan
   
   
    On Wed, Jan 2, 2013 at 7:13 PM, Dan Tran [hidden email] wrote:
   
      Hi
     
      My pax-exam test started to fail with the following trace
     
      java.lang.Exception: Could not start bundle
      mvn:org.ops4j.pax.swissbox/pax-swissbox-core/ in feature(s)
      exam-2.4.0: Unresolved constraint in bundle
      org.ops4j.pax.swissbox.core [68]: Unable to resolve
      68.0: missing requirement [68.0] osgi.wiring.package;
      ((osgi.wiring.package=org.ops4j.lang)(version=1.4.0))
      at
      org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:476)[20:org.apache.karaf.features.core:2.3.0]
      at
      org.apache.karaf.features.internal.FeaturesServiceImpl$2.run(FeaturesServiceImpl.java:1141)[20:org.apache.karaf.features.core:2.3.0]
      Caused by: org.osgi.framework.BundleException: Unresolved constraint
      in bundle org.ops4j.pax.swissbox.core [68]: Unable to resolve 68.0:
      missing requirement [68.0] osgi.wiring.package; ((osgi.wiring.
      package=org.ops4j.lang)(version=1.4.0))
      at
      org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:3826)[org.apache.felix.framework-4.0.3.jar:]
      at
      org.apache.felix.framework.Felix.startBundle(Felix.java:1868)[org.apache.felix.framework-4.0.3.jar:]
      at
      org.apache.felix.framework.BundleImpl.start(BundleImpl.java:944)[org.apache.felix.framework-4.0.3.jar:]
      at
      org.apache.felix.framework.BundleImpl.start(BundleImpl.java:931)[org.apache.felix.framework-4.0.3.jar:]
      at
      org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:473)[20:org.apache.karaf.features.core:2.3.0]
      ... 1 more
     
     
      Further investigation points to a generated feature file under my exam
      directory with this name examfeatures.xml
     
      <?xml version="1.0" encoding="UTF-8"?>
      <features name="pax-exam-features-2.4.0">
      <feature name="exam" version="2.4.0">
     
      <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-lang/1.3.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-monitors/1.3.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-net/1.3.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-store/1.3.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-io/1.3.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-spi/1.3.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-util-property/1.3.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-core/</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-extender/</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-lifecycle/</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-framework/</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam/2.4.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-extender-service/2.4.0</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-container-rbc/2.4.0</bundle>
      <bundle start-level='5'>wrap:mvn:junit/junit/4.10</bundle>
      <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-invoker-junit/2.4.0</bundle>
      <bundle start-level='5'>mvn:org.apache.karaf.tooling.exam/org.apache.karaf.tooling.exam.options/2.3.0</bundle>
      <bundle start

Re: My Karaf PaxExam test mysteriously fails after new year

2013-01-03 Thread Bengt Rodehav
Works perfectly - thanks for the workaround,

/Bengt


2013/1/3 Dan Tran dant...@gmail.com

 <dependencyManagement>
   <dependencies>


   <!-- this overrides the one under karaf-exam to pick up a fix -->
   <!-- remove this when karaf picks up pax-exam 2.6+ -->
   <dependency>
 <groupId>org.ops4j.pax.exam</groupId>
 <artifactId>pax-exam</artifactId>
 <version>${pax-exam.version}</version>
   </dependency>

...

   </dependencies>

 </dependencyManagement>

 On Thu, Jan 3, 2013 at 1:10 AM, Bengt Rodehav be...@rodehav.com wrote:
  I've had the exact same problem and have really been scratching my
 head...
 
  Dan, can you share exactly what you put in the dependency management
  section?
 
  /Bengt
 
 
  2013/1/3 mikevan mvangeert...@comcast.net
 
  Great catch, Dan!
 
  On 1/2/2013 9:47 PM, dantran [via Karaf] wrote:
   Looks like it is a bug under pax-exam 2.4; forcing my maven build
   to pick up pax-exam 2.6 via dependencyManagement fixes the issue.
  
   My guess is that karaf 2.3.1-SNAPSHOT is also seeing the same issue, so
   upgrading to pax-exam 2.6 would fix it as well
  
   -Dan
  
  
    On Wed, Jan 2, 2013 at 7:13 PM, Dan Tran [hidden email] wrote:
  
Hi
   
My pax-exam test started to fail with the following trace
   
java.lang.Exception: Could not start bundle
mvn:org.ops4j.pax.swissbox/pax-swissbox-core/ in feature(s)
exam-2.4.0: Unresolved constraint in bundle
org.ops4j.pax.swissbox.core [68]: Unable to resolve
68.0: missing requirement [68.0] osgi.wiring.package;
((osgi.wiring.package=org.ops4j.lang)(version=1.4.0))
at
  
  
 org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:476)[20:org.apache.karaf.features.core:2.3.0]
at
  
  
 org.apache.karaf.features.internal.FeaturesServiceImpl$2.run(FeaturesServiceImpl.java:1141)[20:org.apache.karaf.features.core:2.3.0]
Caused by: org.osgi.framework.BundleException: Unresolved constraint
in bundle org.ops4j.pax.swissbox.core [68]: Unable to resolve 68.0:
missing requirement [68.0] osgi.wiring.package; ((osgi.wiring.
package=org.ops4j.lang)(version=1.4.0))
at
  
  
 org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:3826)[org.apache.felix.framework-4.0.3.jar:]
at
  
  
 org.apache.felix.framework.Felix.startBundle(Felix.java:1868)[org.apache.felix.framework-4.0.3.jar:]
at
  
  
 org.apache.felix.framework.BundleImpl.start(BundleImpl.java:944)[org.apache.felix.framework-4.0.3.jar:]
at
  
  
 org.apache.felix.framework.BundleImpl.start(BundleImpl.java:931)[org.apache.felix.framework-4.0.3.jar:]
at
  
  
 org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:473)[20:org.apache.karaf.features.core:2.3.0]
... 1 more
   
   
Further investigation points to a generated feature file under my
 exam
directory with this name examfeatures.xml
   
 <?xml version="1.0" encoding="UTF-8"?>
 <features name="pax-exam-features-2.4.0">
 <feature name="exam" version="2.4.0">

 <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-lang/1.3.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-monitors/1.3.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-net/1.3.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-store/1.3.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-io/1.3.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-spi/1.3.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.base/ops4j-base-util-property/1.3.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-core/</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-extender/</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-lifecycle/</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.swissbox/pax-swissbox-framework/</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam/2.4.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-extender-service/2.4.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-container-rbc/2.4.0</bundle>
 <bundle start-level='5'>wrap:mvn:junit/junit/4.10</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-invoker-junit/2.4.0</bundle>
 <bundle start-level='5'>mvn:org.apache.karaf.tooling.exam/org.apache.karaf.tooling.exam.options/2.3.0</bundle>
 <bundle start-level='5'>mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.0</bundle>
 <bundle start-level='5'>mvn:org.ops4j.pax.exam/pax-exam-inject/2.4.0</bundle>
 </feature>
 </features>
   
   
Where pax-swissbox-xxx bundles do not have associated version, by
default it would pickup the latest version 1.6 ( release on
12/26/2012), instead of 1.5.1

Re: Problems with jetty.xml

2012-12-17 Thread Bengt Rodehav
I've experimented a bit more and found something strange. I wanted to know
whether the jetty.xml is actually being read. So I changed the starting
root element Configure to xConfigure and sure enough I get the
following message in my console on startup:

karaf@root 2012-12-17 09:01:36,481 | ERROR | g.ops4j.pax.web) |
JettyServerImpl  | e.jetty.internal.JettyServerImpl  100 |
org.xml.sax.SAXParseException: The element type "xConfigure" must be
terminated by the matching end-tag "</xConfigure>".

This is as expected as the jetty.xml does not contain a valid XML document.

But, then I tried to use my own configuration file (org.ops4j.pax.web.cfg)
with the following contents:

org.apache.karaf.features.configKey=org.ops4j.pax.web
org.osgi.service.http.port=${seco.httpPort}
org.osgi.service.http.port.secure=${seco.httpsPort}
org.ops4j.pax.web.session.timeout=30
javax.servlet.context.tempdir=${karaf.data}/pax-web-jsp
org.ops4j.pax.web.config.file=${karaf.base}/etc/jetty2.xml

The variables for the ports are set in custom.properties and work as they
should. However, I do not get any error on startup. This indicates that
jetty2.xml is not parsed at all (since it contains the same error that I
put in jetty.xml). Furthermore, it also means that jetty.xml is not parsed
either since it also contains an invalid XML document.

So, for some reason, if you use your own org.ops4j.pax.web.cfg, no jetty
configuration file is read at all. Sounds like a bug somewhere.

I don't think this is the cause for my problem since I've tried to skip my
own org.ops4j.pax.web.cfg but still haven't managed to set my special
property correctly.

/Bengt
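[Editor's note] Since the messages above turn on whether etc/jetty.xml is parsed at all, a minimal well-formed skeleton may be useful for comparison. This is a sketch based on standard Jetty XML configuration syntax; the attribute shown is purely illustrative, not something from this thread:

```xml
<?xml version="1.0"?>
<!DOCTYPE Configure PUBLIC "-//Jetty//Configure//EN"
    "http://www.eclipse.org/jetty/configure.dtd">
<Configure class="org.eclipse.jetty.server.Server">
    <!-- pax-web applies this file to the embedded Server it has already
         created, so it can only invoke setters/calls on that instance. -->
    <Call name="setAttribute">
        <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
        <Arg>100000</Arg>
    </Call>
</Configure>
```

If even an intentionally malformed root element in this file produces no SAXParseException on startup, the file is not being read, which is the bug Bengt describes.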



2012/12/17 Bengt Rodehav be...@rodehav.com

 Good idea,

 I already have my own org.ops4j.pax.web.cfg but it's easy to forget to
 include the org.ops4j.pax.web.config.file attribute causing jetty.xml not
 to be used at all.

 BTW do you use the metadata services? If not, I suggest to do so since
 it's then easy to look at the configuration in the web console and see all
 possible values.

 /Bengt


 2012/12/17 Jean-Baptiste Onofré j...@nanthrax.net

 FYI, in order to give more visibility to the users:

 https://issues.apache.org/jira/browse/KARAF-2053

 Regards
 JB


 On 12/17/2012 07:55 AM, Bengt Rodehav wrote:

 Thanks for the advice Freeman - I'll think about that.

 /Bengt


 2012/12/17 Freeman Fang freeman.f...@gmail.com


 Hi,

 As you also have your own etc/org.ops4j.pax.web.cfg, it means it
 will override the configuration for http feature
 <config name="org.ops4j.pax.web">
   org.osgi.service.http.port=8181
   javax.servlet.context.tempdir=${karaf.data}/pax-web-jsp
   org.ops4j.pax.web.config.file=${karaf.base}/etc/jetty.xml
 </config>

 So you need ensure your own etc/org.ops4j.pax.web.cfg has something
 like
 org.ops4j.pax.web.config.file=Your_karaf_kit_path/etc/jetty.xml

 So that the etc/jetty.xml could be picked up.

 Freeman
 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2012-12-14, at 下午11:21, Bengt Rodehav wrote:

  Thanks for your reply Achim.

 However, I don't quite understand how this works - is this
 described somewhere? Things that would be nice to understand are:

 - What role does the jettyconfig file have?
 - What role does etc/jetty.xml have? Is it generated?
 - How is the final jetty configuration built up?
 - When do I have to use a fragment (as described on the wiki)?

 To top it off I also have my own etc/org.ops4j.pax.web.cfg file.
 I'm not sure how it works together with the default configuration
 in the feature.

 Just trying to get a grasp on this...

 /Bengt




 2012/12/14 Achim Nierbeck bcanh...@googlemail.com


 Hi Bengt,

 since the Jetty.xml isn't the lead configuration for the
 jetty file and since the jetty is started in the embedded
 style you need to get a hold of this a bit different, or
 you use a jetty-web.xml file.

 I'm not sure about the right syntax right now, but since it
 doesn't work and the jetty.xml is interpreted after the server
 is configured you probably need some getAttribute first.
 A maybe not so good matching example can be found at [1]

 regards, Achim

 [1] - http://nierbeck.de/cgi-bin/weblog_basic/index.php?p=165



 2012/12/14

Re: Problems with jetty.xml

2012-12-17 Thread Bengt Rodehav
Perfect - it makes life easier,

/Bengt


2012/12/17 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 Pax-Web does use the Metadata service.

 regards, Achim


 2012/12/17 Bengt Rodehav be...@rodehav.com

 Good idea,

 I already have my own org.ops4j.pax.web.cfg but it's easy to forget to
 include the org.ops4j.pax.web.config.file attribute causing jetty.xml
 not to be used at all.

 BTW do you use the metadata services? If not, I suggest to do so since
 it's then easy to look at the configuration in the web console and see all
 possible values.

 /Bengt


 2012/12/17 Jean-Baptiste Onofré j...@nanthrax.net

 FYI, in order to give more visibility to the users:

 https://issues.apache.org/jira/browse/KARAF-2053

 Regards
 JB


 On 12/17/2012 07:55 AM, Bengt Rodehav wrote:

 Thanks for the advice Freeman - I'll think about that.

 /Bengt


 2012/12/17 Freeman Fang freeman.f...@gmail.com


 Hi,

 As you also have your own etc/org.ops4j.pax.web.cfg, it means it
 will override the configuration for http feature
 <config name="org.ops4j.pax.web">
   org.osgi.service.http.port=8181
   javax.servlet.context.tempdir=${karaf.data}/pax-web-jsp
   org.ops4j.pax.web.config.file=${karaf.base}/etc/jetty.xml
 </config>

 So you need to ensure your own etc/org.ops4j.pax.web.cfg has something like
 org.ops4j.pax.web.config.file=Your_karaf_kit_path/etc/jetty.xml

 So that the etc/jetty.xml could be picked up.

 Freeman
 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2012-12-14, at 下午11:21, Bengt Rodehav wrote:

  Thanks for your reply Achim.

 However, I don't quite understand how this works - is this
 described somewhere? Things that would be nice to understand are:

 - What role does the jettyconfig file have?
 - What role does etc/jetty.xml have? Is it generated?
 - How is the final jetty configuration built up?
 - When do I have to use a fragment (as described on the wiki)?

 To top it off I also have my own etc/org.ops4j.pax.web.cfg file.
 I'm not sure how it works together with the default configuration
 in the feature.

 Just trying to get a grasp on this...

 /Bengt




 2012/12/14 Achim Nierbeck bcanh...@googlemail.com


 Hi Bengt,

 since the Jetty.xml isn't the lead configuration for the
 jetty file and since the jetty is started in the embedded
 style you need to get a hold of this a bit different, or
 you use a jetty-web.xml file.

 I'm not sure about the right syntax right now, but since it
 doesn't work and the jetty.xml is interpreted after the server
 is configured you probably need some getAttribute first.
 A maybe not so good matching example can be found at [1]

 regards, Achim

 [1] - http://nierbeck.de/cgi-bin/weblog_basic/index.php?p=165



 2012/12/14 Bengt Rodehav be...@rodehav.com


 I'm running a web application on Karaf 2.2.8. I need to
 send quite a lot of data to the server using the POST
 method. I get the following error message on the web
 browser side:

 Form too large15920

 After googling I found how to reconfigure this on
 http://wiki.eclipse.org/Jetty/Howto/Configure_Form_Size.

 I therefore modified the etc/jetty.xml as follows:

 ...
 <Configure class="org.eclipse.jetty.server.Server">
   <Call name="setAttribute">
     <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
     <Arg>200</Arg>
   </Call>
 ...

 But I still get the same error message. The configuration
 hasn't changed. Am I doing this the wrong way?

 /Bengt




 --

 Apache Karaf http://karaf.apache.org/ Committer & PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer & Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home Commiter & Project Lead
 blog http://notizblog.nierbeck.de/





 --
 Jean-Baptiste

Re: Problems with jetty.xml

2012-12-17 Thread Bengt Rodehav
I've done some more research. I found this line in Jetty's ContextHandler
class:

private int _maxFormContentSize =
Integer.getInteger("org.eclipse.jetty.server.Request.maxFormContentSize", 20).intValue();

This implies that the default value is taken from a system property. So, I
set that system property to a higher value and it worked. I guess this is a
good-enough workaround for me.
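For reference, the mechanism behind this workaround is `Integer.getInteger`, which reads a named system property and falls back to a supplied default when the property is unset. A minimal sketch (the class name and the `2000000` value are illustrative, not from the thread; the property key matches the Jetty attribute discussed above):

```java
public class MaxFormSizeDemo {
    static final String KEY = "org.eclipse.jetty.server.Request.maxFormContentSize";

    // Mirrors the pattern in Jetty's source: system property first, fallback otherwise.
    static int readLimit(int fallback) {
        return Integer.getInteger(KEY, fallback);
    }

    public static void main(String[] args) {
        // Property unset: the fallback default is returned.
        System.out.println(readLimit(200000));
        // Setting the property (normally via -D on the command line, or
        // Karaf's etc/system.properties) overrides the fallback.
        System.setProperty(KEY, "2000000");
        System.out.println(readLimit(200000));
    }
}
```

In a Karaf installation the equivalent would be adding the property to etc/system.properties so it is set before Pax-Web starts Jetty.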

I think it should also be possible to override the system property by
setting the corresponding attribute on the server. But adding the following
does not seem to work:

<Call name="setAttribute">
  <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
  <Arg>200</Arg>
</Call>

I think that in a normal Jetty server this would work but I can't seem to
get it to work with Pax-Web.

I believe the issue where the jetty configuration file is not read at all when
you provide your own org.ops4j.pax.web.cfg must be fixed.

/Bengt


2012/12/17 Bengt Rodehav be...@rodehav.com

 Perfect - it makes life easier,

 /Bengt


 2012/12/17 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 Pax-Web does use the Metadata service.

 regards, Achim


 2012/12/17 Bengt Rodehav be...@rodehav.com

 Good idea,

 I already have my own org.ops4j.pax.web.cfg but it's easy to forget to
 include the org.ops4j.pax.web.config.file attribute causing jetty.xml
 not to be used at all.

 BTW do you use the metadata services? If not, I suggest to do so since
 it's then easy to look at the configuration in the web console and see all
 possible values.

 /Bengt


 2012/12/17 Jean-Baptiste Onofré j...@nanthrax.net

 FYI, in order to give more visibility to the users:

 https://issues.apache.org/jira/browse/KARAF-2053

 Regards
 JB


 On 12/17/2012 07:55 AM, Bengt Rodehav wrote:

 Thanks for the advice Freeman - I'll think about that.

 /Bengt


 2012/12/17 Freeman Fang freeman.f...@gmail.com


 Hi,

 As you also have your own etc/org.ops4j.pax.web.cfg, it means it
 will override the configuration for http feature
 <config name="org.ops4j.pax.web">
   org.osgi.service.http.port=8181
   javax.servlet.context.tempdir=${karaf.data}/pax-web-jsp
   org.ops4j.pax.web.config.file=${karaf.base}/etc/jetty.xml
 </config>

 So you need to ensure your own etc/org.ops4j.pax.web.cfg has something like
 org.ops4j.pax.web.config.file=Your_karaf_kit_path/etc/jetty.xml

 So that the etc/jetty.xml could be picked up.

 Freeman
 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2012-12-14, at 下午11:21, Bengt Rodehav wrote:

  Thanks for your reply Achim.

 However, I don't quite understand how this works - is this
 described somewhere? Things that would be nice to understand are:

 - What role does the jettyconfig file have?
 - What role does etc/jetty.xml have? Is it generated?
 - How is the final jetty configuration built up?
 - When do I have to use a fragment (as described on the wiki)?

 To top it off I also have my own etc/org.ops4j.pax.web.cfg file.
 I'm not sure how it works together with the default configuration
 in the feature.

 Just trying to get a grasp on this...

 /Bengt




 2012/12/14 Achim Nierbeck bcanh...@googlemail.com


 Hi Bengt,

 since the Jetty.xml isn't the lead configuration for the
 jetty file and since the jetty is started in the embedded
 style you need to get a hold of this a bit different, or
 you use a jetty-web.xml file.

 I'm not sure about the right syntax right now, but since it
 doesn't work and the jetty.xml is interpreted after the server
 is configured you probably need some getAttribute first.
 A maybe not so good matching example can be found at [1]

 regards, Achim

 [1] - http://nierbeck.de/cgi-bin/weblog_basic/index.php?p=165



 2012/12/14 Bengt Rodehav be...@rodehav.com


 I'm running a web application on Karaf 2.2.8. I need to
 send quite a lot of data to the server using the POST
 method. I get the following error message on the web
 browser side:

 Form too large15920

 After googling I found how to reconfigure this on
 http://wiki.eclipse.org/Jetty/Howto/Configure_Form_Size.

Re: Problems with jetty.xml

2012-12-16 Thread Bengt Rodehav
Thanks for the advice Freeman - I'll think about that.

/Bengt


2012/12/17 Freeman Fang freeman.f...@gmail.com

 Hi,

 As you also have your own etc/org.ops4j.pax.web.cfg, it means it will
 override the configuration for http feature
   <config name="org.ops4j.pax.web">
     org.osgi.service.http.port=8181
     javax.servlet.context.tempdir=${karaf.data}/pax-web-jsp
     org.ops4j.pax.web.config.file=${karaf.base}/etc/jetty.xml
   </config>

 So you need to ensure your own etc/org.ops4j.pax.web.cfg has something like
 org.ops4j.pax.web.config.file=Your_karaf_kit_path/etc/jetty.xml

 So that the etc/jetty.xml could be picked up.
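Pulling the advice above together, a complete etc/org.ops4j.pax.web.cfg might look like the following. This is a sketch assembled from the feature defaults quoted in this thread; the port and paths are the stock values, not requirements:

```properties
# etc/org.ops4j.pax.web.cfg
# Providing this file replaces the feature's default configuration, so the
# config.file property must be repeated here or etc/jetty.xml is not read.
org.osgi.service.http.port=8181
javax.servlet.context.tempdir=${karaf.data}/pax-web-jsp
org.ops4j.pax.web.config.file=${karaf.base}/etc/jetty.xml
```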

 Freeman
 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2012-12-14, at 下午11:21, Bengt Rodehav wrote:

 Thanks for your reply Achim.

 However, I don't quite understand how this works - is this described
 somewhere? Things that would be nice to understand are:

 - What role does the jettyconfig file have?
 - What role does etc/jetty.xml have? Is it generated?
 - How is the final jetty configuration built up?
 - When do I have to use a fragment (as described on the wiki)?

 To top it off I also have my own etc/org.ops4j.pax.web.cfg file. I'm not
 sure how it works together with the default configuration in the feature.

 Just trying to get a grasp on this...

 /Bengt




 2012/12/14 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 since the Jetty.xml isn't the lead configuration for the jetty file and
 since the jetty is started in the embedded style you need to get a hold
 of this a bit different, or
 you use a jetty-web.xml file.

 I'm not sure about the right syntax right now, but since it doesn't work
 and the jetty.xml is interpreted after the server is configured you
 probably need some getAttribute first.
 A maybe not so good matching example can be found at [1]

 regards, Achim

 [1] - http://nierbeck.de/cgi-bin/weblog_basic/index.php?p=165



 2012/12/14 Bengt Rodehav be...@rodehav.com

 I'm running a web application on Karaf 2.2.8. I need to send quite a lot
 of data to the server using the POST method. I get the following error
 message on the web browser side:

 Form too large15920

 After googling I found how to reconfigure this on
 http://wiki.eclipse.org/Jetty/Howto/Configure_Form_Size.

 I therefore modified the etc/jetty.xml as follows:

 ...
 <Configure class="org.eclipse.jetty.server.Server">
   <Call name="setAttribute">
     <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
     <Arg>200</Arg>
   </Call>
 ...

 But I still get the same error message. The configuration hasn't
 changed. Am I doing this the wrong way?

 /Bengt




 --

 Apache Karaf http://karaf.apache.org/ Committer & PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer & Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home Commiter & Project Lead
 blog http://notizblog.nierbeck.de/






Re: Problems with jetty.xml

2012-12-16 Thread Bengt Rodehav
Good idea,

I already have my own org.ops4j.pax.web.cfg but it's easy to forget to
include the org.ops4j.pax.web.config.file attribute causing jetty.xml not
to be used at all.

BTW do you use the metadata services? If not, I suggest to do so since it's
then easy to look at the configuration in the web console and see all
possible values.

/Bengt


2012/12/17 Jean-Baptiste Onofré j...@nanthrax.net

 FYI, in order to give more visibility to the users:

 https://issues.apache.org/jira/browse/KARAF-2053

 Regards
 JB


 On 12/17/2012 07:55 AM, Bengt Rodehav wrote:

 Thanks for the advice Freeman - I'll think about that.

 /Bengt


 2012/12/17 Freeman Fang freeman.f...@gmail.com


 Hi,

 As you also have your own etc/org.ops4j.pax.web.cfg, it means it
 will override the configuration for http feature
 <config name="org.ops4j.pax.web">
   org.osgi.service.http.port=8181
   javax.servlet.context.tempdir=${karaf.data}/pax-web-jsp
   org.ops4j.pax.web.config.file=${karaf.base}/etc/jetty.xml
 </config>

 So you need to ensure your own etc/org.ops4j.pax.web.cfg has something like
 org.ops4j.pax.web.config.file=Your_karaf_kit_path/etc/jetty.xml

 So that the etc/jetty.xml could be picked up.

 Freeman
 -
 Freeman(Yue) Fang

 Red Hat, Inc.
 FuseSource is now part of Red Hat
 Web: http://fusesource.com | http://www.redhat.com/
 Twitter: freemanfang
 Blog: http://freemanfang.blogspot.com
 http://blog.sina.com.cn/u/1473905042
 weibo: @Freeman小屋

 On 2012-12-14, at 下午11:21, Bengt Rodehav wrote:

  Thanks for your reply Achim.

 However, I don't quite understand how this works - is this
 described somewhere? Things that would be nice to understand are:

 - What role does the jettyconfig file have?
 - What role does etc/jetty.xml have? Is it generated?
 - How is the final jetty configuration built up?
 - When do I have to use a fragment (as described on the wiki)?

 To top it off I also have my own etc/org.ops4j.pax.web.cfg file.
 I'm not sure how it works together with the default configuration
 in the feature.

 Just trying to get a grasp on this...

 /Bengt




 2012/12/14 Achim Nierbeck bcanh...@googlemail.com


 Hi Bengt,

 since the Jetty.xml isn't the lead configuration for the
 jetty file and since the jetty is started in the embedded
 style you need to get a hold of this a bit different, or
 you use a jetty-web.xml file.

 I'm not sure about the right syntax right now, but since it
 doesn't work and the jetty.xml is interpreted after the server
 is configured you probably need some getAttribute first.
 A maybe not so good matching example can be found at [1]

 regards, Achim

 [1] - http://nierbeck.de/cgi-bin/weblog_basic/index.php?p=165



 2012/12/14 Bengt Rodehav be...@rodehav.com


 I'm running a web application on Karaf 2.2.8. I need to
 send quite a lot of data to the server using the POST
 method. I get the following error message on the web
 browser side:

 Form too large15920

 After googling I found how to reconfigure this on
 http://wiki.eclipse.org/Jetty/Howto/Configure_Form_Size.

 I therefore modified the etc/jetty.xml as follows:

 ...
 <Configure class="org.eclipse.jetty.server.Server">
   <Call name="setAttribute">
     <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
     <Arg>200</Arg>
   </Call>
 ...

 But I still get the same error message. The configuration
 hasn't changed. Am I doing this the wrong way?

 /Bengt




 --

 Apache Karaf http://karaf.apache.org/ Committer & PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer & Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home Commiter & Project Lead
 blog http://notizblog.nierbeck.de/





 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://www.talend.com



Re: Problems with jetty.xml

2012-12-15 Thread Bengt Rodehav
Thanks for the explanation Achim. Will see if I can get further now.

/Bengt


2012/12/14 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 let's try to clarify this. Pax-Web starts the jetty container and
 configures it by
 1) using the configuration through the configuration admin service. As
 it's a requirement by the OSGi spec that the service port is configured
 that way.
 2) reading the jetty.xml file either through the configured config.file
 property or if available from the class-space (attached by a fragment)

 So if pax-web is running in Karaf you can stick to the jetty.xml in the
 etc folder, you don't need a fragment bundle for this case.

 The configuration you have in your cfg file will override the
 configuration of the feature file.

 Finally to your last question of how to use the jetty.xml file.
 Basically it's the way described at [1]. For certain configurations you
 need to change the way to get a hold on it, because the jetty.xml file is
 interpreted after the server has already been configured. So basically it's
 a re-configuration of the existing instance so if there are examples on
 how to use it for jetty-web.xml you should try that kind of configuration
 also. :)

 regards, Achim


 2012/12/14 Bengt Rodehav be...@rodehav.com

 Thanks for your reply Achim.

 However, I don't quite understand how this works - is this described
 somewhere? Things that would be nice to understand are:

 - What role does the jettyconfig file have?
 - What role does etc/jetty.xml have? Is it generated?
 - How is the final jetty configuration built up?
 - When do I have to use a fragment (as described on the wiki)?

 To top it off I also have my own etc/org.ops4j.pax.web.cfg file. I'm not
 sure how it works together with the default configuration in the feature.

 Just trying to get a grasp on this...

 /Bengt




 2012/12/14 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 since the Jetty.xml isn't the lead configuration for the jetty file
 and since the jetty is started in the embedded style you need to get a
 hold of this a bit different, or
 you use a jetty-web.xml file.

 I'm not sure about the right syntax right now, but since it doesn't work
 and the jetty.xml is interpreted after the server is configured you
 probably need some getAttribute first.
 A maybe not so good matching example can be found at [1]

 regards, Achim

 [1] - http://nierbeck.de/cgi-bin/weblog_basic/index.php?p=165



 2012/12/14 Bengt Rodehav be...@rodehav.com

 I'm running a web application on Karaf 2.2.8. I need to send quite a
 lot of data to the server using the POST method. I get the following error
 message on the web browser side:

 Form too large15920

 After googling I found how to reconfigure this on
 http://wiki.eclipse.org/Jetty/Howto/Configure_Form_Size.

 I therefore modified the etc/jetty.xml as follows:

 ...
 <Configure class="org.eclipse.jetty.server.Server">
   <Call name="setAttribute">
     <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
     <Arg>200</Arg>
   </Call>
 ...

 But I still get the same error message. The configuration hasn't
 changed. Am I doing this the wrong way?

 /Bengt




 --

 Apache Karaf http://karaf.apache.org/ Committer & PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer & Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home Commiter & Project Lead
 blog http://notizblog.nierbeck.de/





 --

 Apache Karaf http://karaf.apache.org/ Committer & PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer & Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home Commiter & Project Lead
 blog http://notizblog.nierbeck.de/



Re: Using variables in org.apache.karaf.features.cfg

2012-12-14 Thread Bengt Rodehav
https://issues.apache.org/jira/browse/KARAF-2060

Thanks,

/Bengt


2012/12/13 Andreas Pieber anpie...@gmail.com

 I meant xxx.logging.cfg and custom.properties; was just at a customer with
 no local karaf to check :-)

 OK, can you please provide a bug report? I can check on it tomorrow.

 Kind regards,
 Andreas


 On Thu, Dec 13, 2012 at 2:22 PM, Bengt Rodehav be...@rodehav.com wrote:

 Hello Andreas,

 Yes, the example with the features file (org.apache.karaf.features.cfg)
 doesn't work but should.

 I'm not sure what the log.cfg and the custom.settings files are. However,
 I do this in my org.ops4j.pax.logging.cfg:

 log4j.appender.info.file=${logdir}/info.log

 And I put this in the custom.properties:

 logdir=data/log

 I use that mechanism in several configuration files, e g:

 - org.ops4j.pax.web.cfg
 - org.apache.karaf.management.cfg
 - org.apache.karaf.shell.cfg

 The above gives me the possibility to manage all the ports in one place
 (custom.properties) - which is very convenient.

 But for some reason this mechanism doesn't work
 for org.apache.karaf.features.cfg.

 /Bengt





 2012/12/13 Andreas Pieber anpie...@gmail.com

 well, checking the code I would say there's no difference to the other
 .cfg files. I'm not even sure if it's a bootstrap error. How exactly can I
 reproduce the problem?

 in the features file:
 featuresRepositories = ${var}

 and in custom.properties
 var =
 mvn:org.apache.karaf.features/standard/3.0.0-SNAPSHOT/xml/features,mvn:org.apache.karaf.features/enterprise/3.0.0-SNAPSHOT/xml/features,mvn:org.apache.karaf.features/spring/3.0.0-SNAPSHOT/xml/features

 this shouldn't work, but e.g.

 in log.cfg
 pattern = ${abc}

 and in custom.settings
 abc = %d{ISO8601} | %-5.5p | %-16.16t | %-32.32c{1} | %X{bundle.id} -
 %X{bundle.name} - %X{bundle.version} | %m%n

 works?

 Kind regards,
 Andreas



 On Thu, Dec 13, 2012 at 8:22 AM, Bengt Rodehav be...@rodehav.comwrote:

 What I've tried to do in org.apache.karaf.features.cfg works with other
 configuration files. File install does support this. It's the way I handle
 most of my tailored configurations in my custom server.

 However, there seems to be something special
 with org.apache.karaf.features.cfg since the same mechanisms don't work
 there. That's why I wondered whether file install was used for installing
 the features feature or if it was done by some other means. Could it be a
 bootstrap problem?

 /Bengt


 2012/12/12 Andreas Pieber anpie...@gmail.com

 Hey,

 I'm afraid this is currently not really possible. The
 custom.properties is written into the System.setProperty while the
 fileinstall (but I've only checked the code only shortly) does not access
 this sort. I think to make this available would require a patch to
 fileinstall.

 @Everybody with more knowhow about the fileinstall internals: feel
 free to correct me :-)

 Kind regards,
 Andreas


 On Tue, Dec 11, 2012 at 10:42 AM, Bengt Rodehav be...@rodehav.comwrote:

 I have a use case where I want to move the list of boot features (the
 featuresBoot property) from org.apache.karaf.features.cfg into
 custom.properties. The reason is that our custom server comes with a
 great number of features but each customer only uses some of them. To 
 allow
 for easy customisation (and upgrades) I put everything related to a
 specific installation in a custom.properties file (that I put outside the
 Karaf home directory). I can then  easily see how this installation is
 customised and I can easily upgrade by simply replacing the entire Karaf
 installation and keep the customisation (since it is located outside 
 Karaf).

 However, it seems I cannot use variables defined in
 custom.properties in org.apache.karaf.features.cfg. In fact, I cannot
 even define a variable in org.apache.karaf.features.cfg and then use it
 in the same file.

 How come? Isn't FileInstall used for
 parsing org.apache.karaf.features.cfg?

 How can I use custom variables in org.apache.karaf.features.cfg?

 I use Karaf 2.3.0 with Java 6 on Windows 7.

 /Bengt









Problems with jetty.xml

2012-12-14 Thread Bengt Rodehav
I'm running a web application on Karaf 2.2.8. I need to send quite a lot of
data to the server using the POST method. I get the following error message
on the web browser side:

Form too large15920

After googling I found how to reconfigure this on
http://wiki.eclipse.org/Jetty/Howto/Configure_Form_Size.

I therefore modified the etc/jetty.xml as follows:

...
<Configure class="org.eclipse.jetty.server.Server">
  <Call name="setAttribute">
    <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
    <Arg>200</Arg>
  </Call>
...

But I still get the same error message. The configuration hasn't changed.
Am I doing this the wrong way?

/Bengt


Re: Problems with jetty.xml

2012-12-14 Thread Bengt Rodehav
Thanks for your reply Achim.

However, I don't quite understand how this works - is this described
somewhere? Things that would be nice to understand are:

- What role does the jettyconfig file have?
- What role does etc/jetty.xml have? Is it generated?
- How is the final jetty configuration built up?
- When do I have to use a fragment (as described on the wiki)?

To top it off I also have my own etc/org.ops4j.pax.web.cfg file. I'm not
sure how it works together with the default configuration in the feature.

Just trying to get a grasp on this...

/Bengt




2012/12/14 Achim Nierbeck bcanh...@googlemail.com

 Hi Bengt,

 since the Jetty.xml isn't the lead configuration for the jetty file and
 since the jetty is started in the embedded style you need to get a hold
 of this a bit different, or
 you use a jetty-web.xml file.

 I'm not sure about the right syntax right now, but since it doesn't work
 and the jetty.xml is interpreted after the server is configured you
 probably need some getAttribute first.
 A maybe not so good matching example can be found at [1]

 regards, Achim

 [1] - http://nierbeck.de/cgi-bin/weblog_basic/index.php?p=165



 2012/12/14 Bengt Rodehav be...@rodehav.com

 I'm running a web application on Karaf 2.2.8. I need to send quite a lot
 of data to the server using the POST method. I get the following error
 message on the web browser side:

 Form too large15920

 After googling I found how to reconfigure this on
 http://wiki.eclipse.org/Jetty/Howto/Configure_Form_Size.

 I therefore modified the etc/jetty.xml as follows:

 ...
 <Configure class="org.eclipse.jetty.server.Server">
   <Call name="setAttribute">
     <Arg>org.eclipse.jetty.server.Request.maxFormContentSize</Arg>
     <Arg>200</Arg>
   </Call>
 ...

 But I still get the same error message. The configuration hasn't changed.
 Am I doing this the wrong way?

 /Bengt




 --

 Apache Karaf http://karaf.apache.org/ Committer & PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer & Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home Commiter & Project Lead
 blog http://notizblog.nierbeck.de/



Re: Using variables in org.apache.karaf.features.cfg

2012-12-13 Thread Bengt Rodehav
Hello Andreas,

Yes, the example with the features file (org.apache.karaf.features.cfg)
doesn't work but should.

I'm not sure what the log.cfg and the custom.settings files are. However, I
do this in my org.ops4j.pax.logging.cfg:

log4j.appender.info.file=${logdir}/info.log

And I put this in the custom.properties:

logdir=data/log

I use that mechanism in several configuration files, e g:

- org.ops4j.pax.web.cfg
- org.apache.karaf.management.cfg
- org.apache.karaf.shell.cfg

The above gives me the possibility to manage all the ports in one place
(custom.properties) - which is very convenient.

But for some reason this mechanism doesn't work
for org.apache.karaf.features.cfg.

/Bengt





2012/12/13 Andreas Pieber anpie...@gmail.com

 well, checking the code I would say there's no difference to the other
 .cfg files. I'm not even sure if it's a bootstrap error. How exactly can I
 reproduce the problem?

 in the features file:
 featuresRepositories = ${var}

 and in custom.properties
 var =
 mvn:org.apache.karaf.features/standard/3.0.0-SNAPSHOT/xml/features,mvn:org.apache.karaf.features/enterprise/3.0.0-SNAPSHOT/xml/features,mvn:org.apache.karaf.features/spring/3.0.0-SNAPSHOT/xml/features

 this shouldn't work, but e.g.

 in log.cfg
 pattern = ${abc}

 and in custom.settings
 abc = %d{ISO8601} | %-5.5p | %-16.16t | %-32.32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n

 works?

 Kind regards,
 Andreas



 On Thu, Dec 13, 2012 at 8:22 AM, Bengt Rodehav be...@rodehav.com wrote:

 What I've tried to do in org.apache.karaf.features.cfg works with other
 configuration files. File install does support this. It's the way I handle
 most of my tailored configurations in my custom server.

 However, there seems to be something special
 with org.apache.karaf.features.cfg since the same mechanisms don't work
 there. That's why I wondered whether file install was used for installing
 the features feature or if it was done by some other means. Could it be a
 bootstrap problem?

 /Bengt


 2012/12/12 Andreas Pieber anpie...@gmail.com

 Hey,

 I'm afraid this is currently not really possible. The custom.properties
 is written into the System.setProperty while the fileinstall (but I've only
 checked the code only shortly) does not access this sort. I think to make
 this available would require a patch to fileinstall.

 @Everybody with more knowhow about the fileinstall internals: feel free
 to correct me :-)

 Kind regards,
 Andreas


 On Tue, Dec 11, 2012 at 10:42 AM, Bengt Rodehav be...@rodehav.comwrote:

 I have a use case where I want to move the list of boot features (the
 featuresBoot property) from org.apache.karaf.features.cfg into
 custom.properties. The reason is that our custom server comes with a
 great number of features but each customer only uses some of them. To allow
 for easy customisation (and upgrades) I put everything related to a
 specific installation in a custom.properties file (that I put outside the
 Karaf home directory). I can then  easily see how this installation is
 customised and I can easily upgrade by simply replacing the entire Karaf
 installation and keep the customisation (since it is located outside 
 Karaf).

 However, it seems I cannot use variables defined in custom.properties
 in org.apache.karaf.features.cfg. In fact, I cannot even define a
 variable in org.apache.karaf.features.cfg and then use it in the same
 file.

 How come? Isn't FileInstall used for
 parsing org.apache.karaf.features.cfg?

 How can I use custom variables in org.apache.karaf.features.cfg?

 I use Karaf 2.3.0 with Java 6 on Windows 7.

 /Bengt







Re: Using variables in org.apache.karaf.features.cfg

2012-12-12 Thread Bengt Rodehav
What I've tried to do in org.apache.karaf.features.cfg works with other
configuration files. File Install does support this. It's the way I handle
most of my tailored configurations in my custom server.

However, there seems to be something special
about org.apache.karaf.features.cfg, since the same mechanisms don't work
there. That's why I wondered whether File Install is used for installing
the features feature or if it is done by some other means. Could it be a
bootstrap problem?

/Bengt


2012/12/12 Andreas Pieber anpie...@gmail.com

 Hey,

 I'm afraid this is currently not really possible. The custom.properties
 entries are written via System.setProperty, while fileinstall (though I've
 only briefly checked the code) does not read those system properties. I
 think making this available would require a patch to fileinstall.

 @Everybody with more knowhow about the fileinstall internals: feel free to
 correct me :-)

 Kind regards,
 Andreas


 On Tue, Dec 11, 2012 at 10:42 AM, Bengt Rodehav be...@rodehav.com wrote:

 I have a use case where I want to move the list of boot features (the
 featuresBoot property) from org.apache.karaf.features.cfg into
 custom.properties. The reason is that our custom server comes with a
 great number of features but each customer only uses some of them. To allow
 for easy customisation (and upgrades) I put everything related to a
 specific installation in a custom.properties file (that I put outside the
 Karaf home directory). I can then easily see how this installation is
 customised and I can easily upgrade by simply replacing the entire Karaf
 installation and keep the customisation (since it is located outside Karaf).

 However, it seems I cannot use variables defined in custom.properties
 in org.apache.karaf.features.cfg. In fact, I cannot even define a
 variable in org.apache.karaf.features.cfg and then use it in the same
 file.

 How come? Isn't FileInstall used for
 parsing org.apache.karaf.features.cfg?

 How can I use custom variables in org.apache.karaf.features.cfg?

 I use Karaf 2.3.0 with Java 6 on Windows 7.

 /Bengt





Using variables in org.apache.karaf.features.cfg

2012-12-11 Thread Bengt Rodehav
I have a use case where I want to move the list of boot features (the
featuresBoot property) from org.apache.karaf.features.cfg into
custom.properties. The reason is that our custom server comes with a
great number of features but each customer only uses some of them. To allow
for easy customisation (and upgrades) I put everything related to a
specific installation in a custom.properties file (that I put outside the
Karaf home directory). I can then easily see how this installation is
customised and I can easily upgrade by simply replacing the entire Karaf
installation and keep the customisation (since it is located outside Karaf).

However, it seems I cannot use variables defined in custom.properties
in org.apache.karaf.features.cfg. In fact, I cannot even define a
variable in org.apache.karaf.features.cfg and then use it in the same
file.

How come? Isn't FileInstall used for
parsing org.apache.karaf.features.cfg?

How can I use custom variables in org.apache.karaf.features.cfg?

I use Karaf 2.3.0 with Java 6 on Windows 7.

/Bengt
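[Editor's note] To make the setup concrete, here is a hedged sketch of what is being attempted. The property name my.boot.features is invented for illustration, and whether the ${...} substitution actually resolves inside org.apache.karaf.features.cfg is exactly the open question of this thread:

```properties
# etc/custom.properties -- loaded into system properties at Karaf boot.
# The variable name below is hypothetical.
my.boot.features = config,ssh,management

# etc/org.apache.karaf.features.cfg -- the hoped-for substitution, using the
# ${...} placeholder syntax that works in the other .cfg files mentioned above:
featuresBoot = ${my.boot.features}
```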


Strange log and startup problems with Karaf 2.3.0

2012-11-26 Thread Bengt Rodehav
I have the strangest of problems on one of my installations. I have a
custom server based on Karaf 2.3.0. I have installed it in several places
with no problem. But, on one of our virtual servers running Windows Server
2003 SP2, I cannot start Karaf correctly due to the following:

log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to a
org.apache.log4j.Layout variable.
log4j:ERROR The class org.apache.log4j.Layout was loaded by
log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c] whereas
object of type
log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
[sun.misc.Launcher$AppClassLoader@360be0].
log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to a
org.apache.log4j.Layout variable.
log4j:ERROR The class org.apache.log4j.Layout was loaded by
log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c] whereas
object of type
log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
[sun.misc.Launcher$AppClassLoader@360be0].
log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to a
org.apache.log4j.Layout variable.
log4j:ERROR The class org.apache.log4j.Layout was loaded by
log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c] whereas
object of type
log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
[sun.misc.Launcher$AppClassLoader@360be0].

I then tried with a standard Karaf 2.3.0 and got the exact same error.

The above errors only show up in the console since the logging system
cannot initialize properly. It seems like a strange classloading issue but
I fail to understand how it can happen. It looks like the JVM itself has
loaded log4j classes making them incompatible with the real ones.

Has anyone seen this before? Any clues?

It's 32 bit Windows and I've tried with java 1.6.0_29 as well as 1.6.0_37.

/Bengt
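[Editor's note] The "loaded by AppClassLoader" lines suggest the JVM itself is loading log4j classes. One plausible check (an assumption, not something established in this thread) is to look for a stray log4j jar in the JVM's extension directories, which the application class loader would pick up ahead of the pax-logging bundle:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class FindStrayLog4j {
    // Hypothetical diagnostic: list log4j jars in a directory, e.g. the JVM
    // extension dirs, where a stray copy would be loaded by the app class
    // loader and clash with the bundle-loaded org.apache.log4j classes.
    public static List<String> findLog4j(File dir) {
        List<String> hits = new ArrayList<String>();
        File[] files = dir.listFiles();
        if (files != null) {
            for (File f : files) {
                if (f.getName().toLowerCase().contains("log4j")) {
                    hits.add(f.getName());
                }
            }
        }
        return hits;
    }

    public static void main(String[] args) {
        // On Java 6, java.ext.dirs lists the extension directories whose jars
        // are visible to every class loader in the JVM.
        String extDirs = System.getProperty("java.ext.dirs", "");
        for (String d : extDirs.split(File.pathSeparator)) {
            if (!d.isEmpty()) {
                System.out.println(d + " -> " + findLog4j(new File(d)));
            }
        }
    }
}
```

An empty result for every directory would rule this particular cause out; a hit would explain why the problem appears on only one server.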


Re: Strange log and startup problems with Karaf 2.3.0

2012-11-26 Thread Bengt Rodehav
JB and Achim,

I've tried with a standard Karaf 2.3.0 - without any customizations - and I
still get this problem. Therefore, the etc/org.ops4j.pax.logging.cfg and
the etc/jre.properties are the ones bundled with Karaf.

There seems to be no other bundle exporting log4j:

karaf@root exports | grep -i log4j
 4 org.apache.log4j; version=1.2.15
 4 org.apache.log4j.spi; version=1.2.15
 4 org.apache.log4j.xml; version=1.2.15

Bundle #4 is the pax-logging-api bundle (version 1.7.0).

I realize that there must be a problem - or at least something very
unusual - with the server I'm trying to install to since it works
everywhere else. Just can't figure out what...

/Bengt



2012/11/26 Achim Nierbeck bcanh...@googlemail.com

 Another possibility, is there another bundle installed that also exports
 log4j classes?

 you'll be able to find such bundles with a packages:exports on the shell

 regards, Achim


 2012/11/26 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 it may require some tweak on the etc/jre.properties to prevent the JVM
 from loading some classes.

 Could you share your etc/org.ops4j.pax.logging.cfg file to try to
 reproduce the issue ?

 Thanks,
 Regards
 JB


 On 11/26/2012 09:34 AM, Bengt Rodehav wrote:

 I have the strangest of problems on one of my installations. I have a
 custom server based on Karaf 2.3.0. I have installed it in several
 places with no problem. But, on one of our virtual servers running
 Windows Server 2003 SP2, I cannot start Karaf correctly due to the
 following:

 log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to
 a org.apache.log4j.Layout variable.
 log4j:ERROR The class org.apache.log4j.Layout was loaded by
 log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c]
 whereas
 object of type
 log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
 [sun.misc.Launcher$AppClassLoader@360be0].
 log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to
 a org.apache.log4j.Layout variable.
 log4j:ERROR The class org.apache.log4j.Layout was loaded by
 log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c]
 whereas
 object of type
 log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
 [sun.misc.Launcher$AppClassLoader@360be0].
 log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to
 a org.apache.log4j.Layout variable.
 log4j:ERROR The class org.apache.log4j.Layout was loaded by
 log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c]
 whereas
 object of type
 log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
 [sun.misc.Launcher$AppClassLoader@360be0].

 I then tried with a standard Karaf 2.3.0 and got the exact same error.

 The above errors only show up in the console since the logging system
 cannot initialize properly. It seems like a strange classloading issue
 but I fail to understand how it can happen. It looks like the JVM itself
 has loaded log4j classes making them incompatible with the real ones.

 Has anyone seen this before? Any clues?

 It's 32 bit Windows and I've tried with java 1.6.0_29 as well as
 1.6.0_37.

 /Bengt


 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://www.talend.com




 --

 Apache Karaf http://karaf.apache.org/ Committer  PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer 
 Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home
 Commiter  Project Lead
 blog http://notizblog.nierbeck.de/



Re: Strange log and startup problems with Karaf 2.3.0

2012-11-26 Thread Bengt Rodehav
Interestingly, if I use Equinox instead of Felix, Karaf starts with no
problems. I tried this since the error messages indicate that there seems to
be some problem between Felix classloading and the JVM default class loader
(I think).

/Bengt


2012/11/26 Bengt Rodehav be...@rodehav.com

 JB and Achim,

 I've tried with a standard Karaf 2.3.0 - without any customizations - and
 I still get this problem. Therefore, the etc/org.ops4j.pax.logging.cfg and
 the etc/jre.properties are the ones bundled with Karaf.

 There seems to be no other bundle exporting log4j:

 karaf@root exports | grep -i log4j
  4 org.apache.log4j; version=1.2.15
  4 org.apache.log4j.spi; version=1.2.15
  4 org.apache.log4j.xml; version=1.2.15

 Bundle #4 is the pax-logging-api bundle (version 1.7.0).

 I realize that there must be a problem - or at least something very
 unusual - with the server I'm trying to install to since it works
 everywhere else. Just can't figure out what...

 /Bengt



 2012/11/26 Achim Nierbeck bcanh...@googlemail.com

 Another possibility, is there another bundle installed that also exports
 log4j classes?

 you'll be able to find such bundles with a packages:exports on the shell

 regards, Achim


 2012/11/26 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

  it may require some tweak on the etc/jre.properties to prevent the JVM
  from loading some classes.

 Could you share your etc/org.ops4j.pax.logging.cfg file to try to
 reproduce the issue ?

 Thanks,
 Regards
 JB


 On 11/26/2012 09:34 AM, Bengt Rodehav wrote:

 I have the strangest of problems on one of my installations. I have a
 custom server based on Karaf 2.3.0. I have installed it in several
 places with no problem. But, on one of our virtual servers running
 Windows Server 2003 SP2, I cannot start Karaf correctly due to the
 following:

 log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to
 a org.apache.log4j.Layout variable.
 log4j:ERROR The class org.apache.log4j.Layout was loaded by
 log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c]
 whereas
 object of type
 log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
 [sun.misc.Launcher$AppClassLoader@360be0].
 log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to
 a org.apache.log4j.Layout variable.
 log4j:ERROR The class org.apache.log4j.Layout was loaded by
 log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c]
 whereas
 object of type
 log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
 [sun.misc.Launcher$AppClassLoader@360be0].
 log4j:ERROR A org.apache.log4j.TTCCLayout object is not assignable to
 a org.apache.log4j.Layout variable.
 log4j:ERROR The class org.apache.log4j.Layout was loaded by
 log4j:ERROR [org.apache.felix.framework.BundleWiringImpl@73305c]
 whereas
 object of type
 log4j:ERROR org.apache.log4j.TTCCLayout was loaded by
 [sun.misc.Launcher$AppClassLoader@360be0].

 I then tried with a standard Karaf 2.3.0 and got the exact same error.

 The above errors only show up in the console since the logging system
 cannot initialize properly. It seems like a strange classloading issue
 but I fail to understand how it can happen. It looks like the JVM itself
 has loaded log4j classes making them incompatible with the real ones.

 Has anyone seen this before? Any clues?

 It's 32 bit Windows and I've tried with java 1.6.0_29 as well as
 1.6.0_37.

 /Bengt


 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://www.talend.com




 --

 Apache Karaf http://karaf.apache.org/ Committer  PMC
 OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer
  Project Lead
 OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home
 Commiter  Project Lead
 blog http://notizblog.nierbeck.de/





Re: Xpath in Karaf

2012-11-15 Thread Bengt Rodehav
I have solved this problem now. I posted my findings on the Camel mailing
list.

http://osdir.com/ml/users-camel-apache/2012-11/msg00384.html


/Bengt


2012/11/13 Bengt Rodehav be...@rodehav.com

 FYI, I just posted a question on the Camel list as well although it was
 more about how to get Saxon to work with Camel.

 /Bengt


 2012/11/13 Bengt Rodehav be...@rodehav.com

 I'm trying to use xpath from Camel 2.10.2 in Karaf 2.3.0. I get the
 following exception:

 2012-11-13 13:20:38,307 | ERROR | rfaces/fundorder |
 DefaultErrorHandler  | rg.apache.camel.util.CamelLogger  215 |
 Failed delivery for (MessageId: ID-IT-D-FQR815J-56524-1352809143728-0-1 on
 ExchangeId: ID-IT-D-FQR815J-56524-1352809143728-0-2). Exhausted after
 delivery attempt: 1 caught: org.apache.camel.RuntimeExpressionException:
 Cannot create xpath expression. Processed by failure processor:
 FatalFallbackErrorHandler[Channel[Wrap[se.digia.connect.service.fundorder.FundOrderService$NotificationProcessor@7a587427]
 -
 se.digia.connect.service.fundorder.FundOrderService$NotificationProcessor@7a587427]]
 org.apache.camel.RuntimeExpressionException: Cannot create xpath expression
 at org.apache.camel.builder.xml.XPathBuilder.evaluateAs(XPathBuilder.java:689)[114:org.apache.camel.camel-core:2.10.2]
 ...
 at java.lang.Thread.run(Thread.java:662)[:1.6.0_32]
 Caused by: java.lang.RuntimeException: XPathFactory#newInstance()
 failed to create an XPathFactory for the default object model:
 http://java.sun.com/jaxp/xpath/dom with the
 XPathFactoryConfigurationException:
 javax.xml.xpath.XPathFactoryConfigurationException: No XPathFactory
 implementation found for the object model:
 http://java.sun.com/jaxp/xpath/dom
 at javax.xml.xpath.XPathFactory.newInstance(Unknown Source)[:2.1.0]
 at org.apache.camel.builder.xml.XPathBuilder.initDefaultXPathFactory(XPathBuilder.java:1046)[114:org.apache.camel.camel-core:2.10.2]
 at org.apache.camel.builder.xml.XPathBuilder.getXPathFactory(XPathBuilder.java:424)[114:org.apache.camel.camel-core:2.10.2]
 at org.apache.camel.builder.xml.XPathBuilder.createXPathExpression(XPathBuilder.java:829)[114:org.apache.camel.camel-core:2.10.2]
 at org.apache.camel.builder.xml.XPathBuilder.evaluateAs(XPathBuilder.java:685)[114:org.apache.camel.camel-core:2.10.2]
 ... 44 more

 I haven't posted this on the Camel list yet because I have a hunch that
 this is about OSGi/Karaf.

 What xpath implementation should be used under Karaf? Am I required to do
 something in a bundle that uses xpath (like importing certain packages)?

 /Bengt
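[Editor's note] The failure above comes from XPathFactory.newInstance() relying on a ServiceLoader-style lookup that often cannot see an implementation from inside an OSGi bundle. A common workaround sketch (an assumption on my part, not the fix Bengt posted to the Camel list) is to name the factory class and class loader explicitly; the JDK-internal class name below is illustrative, and Saxon users would name net.sf.saxon.xpath.XPathFactoryImpl instead:

```java
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

public class XPathInOsgi {
    // Create an XPath without depending on the thread context class loader:
    // name the factory implementation and the class loader explicitly.
    public static XPath newXPath() {
        try {
            return XPathFactory.newInstance(
                    XPathFactory.DEFAULT_OBJECT_MODEL_URI,
                    "com.sun.org.apache.xpath.internal.jaxp.XPathFactoryImpl",
                    XPathInOsgi.class.getClassLoader()).newXPath();
        } catch (Exception e) {
            // Outside OSGi (or if the class name is wrong for this JVM),
            // fall back to the default lookup.
            return XPathFactory.newInstance().newXPath();
        }
    }

    public static void main(String[] args) throws Exception {
        // Evaluate a trivial expression to show the factory works.
        String result = newXPath().evaluate("1 + 1",
                new org.xml.sax.InputSource(new java.io.StringReader("<root/>")));
        System.out.println(result); // "2"
    }
}
```

Importing the javax.xml.xpath (and implementation) packages in the bundle's manifest is the other half of the usual advice, which matches Bengt's question about package imports.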





Re: dev:watch problems

2012-11-10 Thread Bengt Rodehav
Good morning Andreas.

I have a Nexus repository specified in my settings.xml - could that be a
problem? However, doing an update 97 works fine showing that the bundle
location can be found. In this case the bundle resides in my local maven
repo (not in Nexus).

/Bengt
Den 10 nov 2012 07:31 skrev Andreas Pieber anpie...@gmail.com:

 Hey Bengt,

 I've just checked again, but I can confirm that dev:watch basically does
 what it should do. How do you install your bundles? Have you configured any
 alternative maven repositories? Any other unusual settings?

 Kind regards,
 Andreas


 On Fri, Nov 9, 2012 at 4:47 PM, Bengt Rodehav be...@rodehav.com wrote:

 Thanks,

 /Bengt


 2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net

 OK thanks for the update, I take a look just after your other issue ;)

 Regards
 JB


 On 11/09/2012 04:39 PM, Bengt Rodehav wrote:

 I get the exact same results using Karaf 2.2.9.

 /Bengt


 2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net
 


 It should be done automatically.

 Could you test the same with Karaf 2.2.9 ?

 Regards
 JB


 On 11/09/2012 02:58 PM, Bengt Rodehav wrote:

 I tried dev:watch * but the bundle still doesn't get updated.

 BTW do I need to execute dev:watch --start or is it being done
 automatically after I've done dev:watch 97?

 /Bengt


 2012/11/9 Andreas Pieber anpie...@gmail.com


 Good question. Does dev:watch * work as expected?

  Kind regards,
  Andreas


 On Fri, Nov 9, 2012 at 9:40 AM, Bengt Rodehav be...@rodehav.com wrote:

   It looks like this in the log:

   2012-11-09 09:34:21,416 | DEBUG | Thread-50 | BundleWatcher | af.shell.dev.watch.BundleWatcher 81 | Bundle watcher thread started
   2012-11-09 09:34:21,416 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
   2012-11-09 09:34:21,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
   2012-11-09 09:34:22,421 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
   2012-11-09 09:34:22,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
   2012-11-09 09:34:23,421 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
   2012-11-09 09:34:23,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework

   Thus, every second the package org.apache.felix.framework seems to be
   refreshed. Nothing about bundle 97 though. When I manually
   do an update 97, the bundle is refreshed properly.

  BTW, I'm running on Windows 7.

  /Bengt



   2012/11/9 j...@nanthrax.net

   Hi,

   Do you have something in the log ?

   Regards
   JB

   --
   Jean-Baptiste Onofré
   jbono...@apache.org
   http://blog.nanthrax.net
   Talend - http://wwx.talend.com

   - Reply message -
   From: Bengt Rodehav be...@rodehav.com
   To: user@karaf.apache.org
   Subject: dev:watch problems
   Date: Fri, Nov 9, 2012 8:55 am


  I'm trying to get the dev:watch command to work
 but I
  haven't succeeded

Re: dev:watch problems

2012-11-10 Thread Bengt Rodehav
Andreas and JB,

I'll look further into this when I'm back to work (that's where I have
Nexus). I'll temporarily disable Nexus by editing my settings.xml. That
will prove whether this is a Nexus problem or not.

However, JB, a mvn install works the same way regardless if you're using
Nexus or not. The artifact is installed in the local repo. In order to
publish to Nexus you do a mvn deploy. But when retrieving an artifact and
Maven cannot find it in the local repo, it will ask Nexus. Regarding
snapshot versions (which this is), I think maven will download snapshots
from Nexus (or maven central) once every 24 h even if you have a snapshot
locally.

How does update work? Does it check if the bundle has been changed (by
looking at its size or modify time) or does it always update the bundle
regardless? I'm trying to figure out why update works but dev:watch
doesn't. It must be because the modify time is not correctly determined -
don't you agree?

Has any of you tried dev:watch on Windows 7? I know a lot of people run
Karaf on Linux and detecting a file's modify time might be different on
Windows than on Linux.

/Bengt
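[Editor's note] The update-vs-watch difference asked about here comes down to a timestamp comparison. A minimal sketch of the check as this thread describes it (illustrative names, not Karaf's actual BundleWatcher code):

```java
import java.io.File;

public class WatchCheck {
    // Sketch of the dev:watch decision as described in this thread: update
    // the bundle only when the artifact in the local Maven repository is
    // newer than the installed bundle's last-modified time.
    public static boolean needsUpdate(File localArtifact, long bundleLastModified) {
        return localArtifact.exists()
                && localArtifact.lastModified() > bundleLastModified;
    }

    public static void main(String[] args) throws Exception {
        File artifact = File.createTempFile("bundle", ".jar");
        artifact.deleteOnExit();
        // Bundle older than the artifact -> update; newer -> no update.
        System.out.println(needsUpdate(artifact, artifact.lastModified() - 1000)); // true
        System.out.println(needsUpdate(artifact, artifact.lastModified() + 1000)); // false
    }
}
```

If a manual update works but this comparison never fires, a filesystem timestamp mismatch is the likely culprit, which fits the Windows suspicion raised below.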


2012/11/10 Jean-Baptiste Onofré j...@nanthrax.net

 I think it could be related to Nexus.

 Maybe I get Bengt wrong, but dev:watch only watches the local repository
 (.m2/repository) and compares the last modification date of the local
 location and the bundle location itself.

 So, if you do mvn install, it will work (as the bundle location is a MVN
 URL which can be found locally), whereas if someone else does a mvn deploy
 and pushes the bundle to Nexus, Karaf won't see any change (as the local
 repository has not been updated).

 My 0.02€ (and I certainly missed what Bengt means ;))

 Regards
 JB


 On 11/10/2012 10:00 AM, Andreas Pieber wrote:

 Since I can't reproduce it locally anyhow, it's kind of tricky... Looking
 at the code again, I would say the only reason it fails while update
 works is that something is messed up on your system (timestamps do not
 match). Would you mind attaching a remote debugger to your system and
 setting a breakpoint at org.apache.karaf.shell.dev.watch.BundleWatcher
 line 85? The code there is really simple and you should see the problem
 within minutes.

 Sorry for not being of any more help :-(

 Kind regards,
 Andreas


 On Sat, Nov 10, 2012 at 9:18 AM, Bengt Rodehav be...@rodehav.com wrote:

 Good morning Andreas.

 I have a Nexus repository specified in my settings.xml - could that
 be a problem? However, doing an update 97 works fine showing that
 the bundle location can be found. In this case the bundle resides in
 my local maven repo (not in Nexus).

 /Bengt

 Den 10 nov 2012 07:31 skrev Andreas Pieber anpie...@gmail.com:


 Hey Bengt,

 I've just checked again, but I can confirm that dev:watch
 basically does what it should do. How do you install your
 bundles? Have you configured any alternative maven repositories?
 Any other unusual settings?

 Kind regards,
 Andreas


 On Fri, Nov 9, 2012 at 4:47 PM, Bengt Rodehav be...@rodehav.com wrote:

 Thanks,

 /Bengt


 2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net


 OK thanks for the update, I take a look just after your
 other issue ;)

 Regards
 JB


 On 11/09/2012 04:39 PM, Bengt Rodehav wrote:

 I get the exact same results using Karaf 2.2.9.

 /Bengt


 2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net


  It should be done automatically.

  Could you test the same with Karaf 2.2.9 ?

  Regards
  JB


  On 11/09/2012 02:58 PM, Bengt Rodehav wrote:

  I tried dev:watch * but the bundle still
 doesn't get updated.

  BTW do I need to execute dev:watch
 --start or is it being done
  automatically after I've done dev:watch 97?

  /Bengt


  2012/11/9 Andreas Pieber anpie...@gmail.com


   good question. Does a dev:watch

Re: Problems with ipojo in Karaf 2.3.0

2012-11-09 Thread Bengt Rodehav
OK - thanks.

Sorry for being pushy. I know you're a busy guy.

/Bengt


2012/11/9 j...@nanthrax.net j...@nanthrax.net

 Hi Bengt,

 I'm at the airport (back from ApacheCon and W-JAX). I will take a look
 beginning of this afternoon.

 Regards
 JB


 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://wwx.talend.com


 - Reply message -
 From: Bengt Rodehav be...@rodehav.com
 To: user@karaf.apache.org
 Subject: Problems with ipojo in Karaf 2.3.0
 Date: Fri, Nov 9, 2012 8:59 am


 JB,

 I will have to release our product very shortly. Do you think the
 workaround I'm currently using is safe for production or should I wait
 for your analysis? As far as I've tested it seems to work perfectly but I'm
 a bit worried about the underlying cause to this.

 /Bengt


 2012/11/8 Bengt Rodehav be...@rodehav.com

 I just tried doing the same thing using Karaf 2.2.9, that is:

- Add an ipojo feature
- Add the jpa and ipojo feature to featuresBoot

 This works without any problems. Of course, Karaf 2.2.9 uses
 org.apache.aries.util version 0.3.1 while Karaf 2.3.0 uses version 1.0.0.
 One of my main reasons for upgrading Karaf is in fact to upgrade Aries to a
 modern version.

 Don't know if this means that the problem lies within Aries or not. I
 still think that there is something fishy going on in Karaf.

 /Bengt


 2012/11/8 Bengt Rodehav be...@rodehav.com

 Hello JB,

 Just wanted to check whether you've managed to recreate this and
 possibly explain what is happening. I'm wondering if there might be a
 problem with the implementation of the feature functionality which is why I
 don't want this in production yet (but I have to upgrade our production
 servers very soon).

 My reasoning is as follows: If the org.apache.aries.util bundle is
 already installed (and possibly active - don't know what the timing looks
 like) then installing a feature containing the org.apache.aries.util bundle
 should be a noop - right? But apparently the feature functionality does
 something regarding this bundle anyway. What should it do? Why should it do
 anything?

 /Bengt


 2012/11/7 Bengt Rodehav be...@rodehav.com

 The workaround I'm currently using is to modify the
 enterprise-2.3.0-features.xml so that the jpa feature and the
 jndi feature no longer include the org.apache.aries.util bundle. Then
 everything seems to work (the org.apache.aries.util bundle is installed
 anyway thanks to startup.properties).

 However, I still don't feel comfortable putting this into production
 until I know what is happening.

 /Bengt




 2012/11/5 Bengt Rodehav be...@rodehav.com

 Thanks a lot JB,

 /Bengt


 2012/11/5 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 thanks for the detailed explanation.

 I will try to create a use case (without iPojo) to reproduce the
 issue (in combination with jpa feature).

 Regards
 JB


 On 11/05/2012 04:59 PM, Bengt Rodehav wrote:

 Some more findings...

 It seems like the Karaf Shell (org.apache.karaf.shell.console) bundle
 uses packages from Aries (e.g. org.apache.aries.blueprint), which in turn
 uses packages from org.apache.aries.util. Could it be that when
 the org.apache.aries.util bundle is installed as part of the jpa
 feature, it somehow causes a refresh which causes dependent bundles
 (such as the org.apache.karaf.shell.console bundle) to be rewired? This
 in turn would probably reinitialize the console (I'm probably using the
 wrong terminology here, but you know what I mean...).

 If that is the case, then it seems highly undesirable to include
 the org.apache.aries.util bundle in the jpa feature.

 I don't have an explanation as to why this problem only occurs
 together
 with iPojo but I assume that it somehow triggers the refresh.

 /Bengt


 2012/11/5 Bengt Rodehav be...@rodehav.com mailto:be...@rodehav.com
 


 BTW, I tried using iPojo 1.6.8 instead to see if this is a
 problem
 introduced in later iPojo versions. I do, however, get the same
 problems using iPojo 1.6.8 which implies that it's not a newly
 introduced iPojo problem.

 /Bengt


 2012/11/5 Bengt Rodehav be...@rodehav.com mailto:
 be...@rodehav.com


 I'm trying to upgrade my custom Karaf distribution to Karaf
 2.3.0 but have ran into some problems. It seems there is some
 kind of conflict between ipojo 1.8.2 and the jpa feature -
 specifically the org.apache.aries.util bundle in the jpa
 feature.

 I install ipojo as a feature (not listed in
 startup.properties).
 But when I do this I get the following exception:

  2012-11-05 15:51:20,251 | INFO  | l Console Thread | Console
  | araf.shell.console.jline.Console   199
  | 14 - org.apache.karaf.shell.console - 2.3.0 | Exception caught
  while executing command
  java.lang.UnsupportedOperationException: read() with timeout
  cannot be called as non-blocking operation

Re: dev:watch problems

2012-11-09 Thread Bengt Rodehav
I tried dev:watch * but the bundle still doesn't get updated.

BTW do I need to execute dev:watch --start or is it being done
automatically after I've done dev:watch 97?

/Bengt


2012/11/9 Andreas Pieber anpie...@gmail.com

 Good question. Does dev:watch * work as expected?

 Kind regards,
 Andreas


 On Fri, Nov 9, 2012 at 9:40 AM, Bengt Rodehav be...@rodehav.com wrote:

 It looks like this in the log:

 2012-11-09 09:34:21,416 | DEBUG | Thread-50 | BundleWatcher | af.shell.dev.watch.BundleWatcher 81 | Bundle watcher thread started
 2012-11-09 09:34:21,416 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
 2012-11-09 09:34:21,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
 2012-11-09 09:34:22,421 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
 2012-11-09 09:34:22,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
 2012-11-09 09:34:23,421 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
 2012-11-09 09:34:23,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework

 Thus, every second the package org.apache.felix.framework seems to be
 refreshed. Nothing about bundle 97 though. When I manually do an update
 97, the bundle is refreshed properly.

 BTW, I'm running on Windows 7.

 /Bengt



 2012/11/9 j...@nanthrax.net j...@nanthrax.net

 Hi,

 Do you have something in the log ?

 Regards
 JB

 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://wwx.talend.com


 - Reply message -
 From: Bengt Rodehav be...@rodehav.com
 To: user@karaf.apache.org
 Subject: dev:watch problems
 Date: Fri, Nov 9, 2012 8:55 am


 I'm trying to get the dev:watch command to work but I haven't succeeded
 yet. If I want to watch the bundle with id 97, I do as follows:

 dev:watch -i 1000
 dev:watch 97
 dev:watch --start

 I could probably do all that in one go but the above is for clarity.

 If I then rebuild (using maven) the bundle with id 97, I expect that
 bundle to be updated within approximately 1 s. However, it never happens.
 If I then do a update 97 then it works.

 The command dev:watch --list shows the following:


 karaf@root dev:watch --list
 URL  ID Bundle Name
 97   97 Service-Container :: web-service-plugin


 What am I doing wrong?

 I'm using Karaf 2.3.0.

 /Bengt






Re: Problems with ipojo in Karaf 2.3.0

2012-11-09 Thread Bengt Rodehav
Thanks for the info and your effort,

/Bengt


2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 just back home ;)

  We found an issue around Aries Blueprint (especially around grace period
  and deadlock). I'm going to work on your issue this afternoon and see
  about fixing it in Aries.

 Regards
 JB


 On 11/08/2012 10:41 AM, Bengt Rodehav wrote:

 I just tried doing the same thing using Karaf 2.2.9, that is:

   * Add an ipojo feature
   * Add the jpa and ipojo feature to featuresBoot


 This works without any problems. Of course, Karaf 2.2.9 uses
 org.apache.aries.util version 0.3.1 while Karaf 2.3.0 uses version
 1.0.0. One of my main reasons for upgrading Karaf is in fact to upgrade
 Aries to a modern version.

 Don't know if this means that the problem lies within Aries or not. I
 still think that there is something fishy going on in Karaf.

 /Bengt


 2012/11/8 Bengt Rodehav be...@rodehav.com


 Hello JB,

 Just wanted to check whether you've managed to recreate this and
 possibly explain what is happening. I'm wondering if there might be
 a problem with the implementation of the feature functionality which
 is why I don't want this in production yet (but I have to upgrade
 our production servers very soon).

 My reasoning is as follows: If the org.apache.aries.util bundle is
 already installed (and possibly active - don't know what the timing
 looks like) then installing a feature containing the
 org.apache.aries.util bundle should be a no-op, right? But
 apparently the feature functionality does something regarding this
 bundle anyway. What should it do? Why should it do anything?

 /Bengt


 2012/11/7 Bengt Rodehav be...@rodehav.com
 


 The workaround I'm currently using is to modify the
 enterprise-2.3.0-features.xml so that the jpa feature and the
 jndi feature no longer include the org.apache.aries.util

 bundle. Then everything seems to work (the org.apache.aries.util
 bundle is installed anyway thanks to startup.properties).

 However, I still don't feel comfortable putting this into
 production until I know what is happening.

 /Bengt




 2012/11/5 Bengt Rodehav be...@rodehav.com


 Thanks a lot JB,

 /Bengt


 2012/11/5 Jean-Baptiste Onofré j...@nanthrax.net


 Hi Bengt,

 thanks for the detailed explanation.

 I will try to create a use case (without iPojo) to
 reproduce the issue (in combination with jpa feature).

 Regards
 JB


 On 11/05/2012 04:59 PM, Bengt Rodehav wrote:

 Some more findings...

 It seems like the Karaf Shell
 (org.apache.karaf.shell.console) bundle

 uses packages from aries (e g
 org.apache.aries.blueprint) which in turn
 uses packages from org.apache.aries.util. Could it
 be that when
 the org.apache.aries.util bundle is installed as
 part of the jpa
 feature, it somehow causes a refresh which causes
 dependent bundles
 (such as the org.apache.karaf.shell.console bundle)
 to be rewired. This
 in turn would probably reinitialize the console (I'm
 probably using the
 wrong terminology here but you know what I mean...).

 If that is the case, then it seems highly
 undesirable to include
 the org.apache.aries.util bundle in the jpa feature.

 I don't have an explanation as to why this problem
 only occurs together
 with iPojo but I assume that it somehow triggers the
 refresh.

 /Bengt


 2012/11/5 Bengt Rodehav be...@rodehav.com


  BTW, I tried using iPojo 1.6.8 instead to see
 if this is a problem
  introduced in later iPojo versions. I do,
 however, get the same
  problems using iPojo 1.6.8 which implies that
 it's not a newly
  introduced iPojo problem.

  /Bengt


 2012/11/5 Bengt Rodehav be...@rodehav.com

Re: dev:watch problems

2012-11-09 Thread Bengt Rodehav
I get the exact same results using Karaf 2.2.9.

/Bengt


2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net

 It should be done automatically.

 Could you test the same with Karaf 2.2.9 ?

 Regards
 JB


 On 11/09/2012 02:58 PM, Bengt Rodehav wrote:

 I tried dev:watch * but the bundle still doesn't get updated.

 BTW, do I need to execute dev:watch --start, or is it done
 automatically after I've run dev:watch 97?

 /Bengt


 2012/11/9 Andreas Pieber anpie...@gmail.com


 good question. Does dev:watch * work as expected?

 Kind regards,
 Andreas


 On Fri, Nov 9, 2012 at 9:40 AM, Bengt Rodehav be...@rodehav.com wrote:

 It looks like this in the log:

 2012-11-09 09:34:21,416 | DEBUG | Thread-50        | BundleWatcher | af.shell.dev.watch.BundleWatcher 81 | Bundle watcher thread started
 2012-11-09 09:34:21,416 | DEBUG | Thread-50        | configadmin   | ? ? | getProperties()
 2012-11-09 09:34:21,421 | DEBUG | lixDispatchQueue | framework     | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
 2012-11-09 09:34:22,421 | DEBUG | Thread-50        | configadmin   | ? ? | getProperties()
 2012-11-09 09:34:22,421 | DEBUG | lixDispatchQueue | framework     | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
 2012-11-09 09:34:23,421 | DEBUG | Thread-50        | configadmin   | ? ? | getProperties()
 2012-11-09 09:34:23,421 | DEBUG | lixDispatchQueue | framework     | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework


 Thus, every second the package org.apache.felix.framework seems
 to be refreshed. Nothing about bundle 97 though. When I manually
 do an update 97, the bundle is refreshed properly.

 BTW, I'm running on Windows 7.

 /Bengt



 2012/11/9 j...@nanthrax.net


 Hi,

 Do you have something in the log ?

 Regards
 JB

 --
 Jean-Baptiste Onofré
 jbono...@apache.org

 http://blog.nanthrax.net
 Talend - http://www.talend.com


 - Reply message -
 From: Bengt Rodehav be...@rodehav.com
 To: user@karaf.apache.org
 Subject: dev:watch problems
 Date: Fri, Nov 9, 2012 8:55 am


 I'm trying to get the dev:watch command to work but I
 haven't succeeded yet. If I want to watch the bundle with id
 97, I do as follows:

 dev:watch -i 1000
 dev:watch 97
 dev:watch --start

 I could probably do all that in one go but the above is for
 clarity.

 If I then rebuild the bundle with id 97 (using Maven), I
 expect that bundle to be updated within approximately 1 second.
 However, it never happens. If I then run update 97, it
 works.

 The command dev:watch --list shows the following:


 karaf@root dev:watch --list
 URL                  ID  Bundle Name
 97                   97  Service-Container :: web-service-plugin



 What am I doing wrong?

 I'm using Karaf 2.3.0.

 /Bengt





 --
 Jean-Baptiste Onofré
 jbono...@apache.org
 http://blog.nanthrax.net
 Talend - http://www.talend.com



Re: Problems with ipojo in Karaf 2.3.0

2012-11-08 Thread Bengt Rodehav
Hello JB,

Just wanted to check whether you've managed to recreate this and possibly
explain what is happening. I'm wondering if there might be a problem with
the implementation of the feature functionality which is why I don't want
this in production yet (but I have to upgrade our production servers very
soon).

My reasoning is as follows: if the org.apache.aries.util bundle is already
installed (and possibly active - I don't know what the timing looks like),
then installing a feature containing the org.apache.aries.util bundle
should be a no-op, right? But apparently the feature functionality does
something with this bundle anyway. What should it do? Why should it do
anything?

/Bengt


2012/11/7 Bengt Rodehav be...@rodehav.com

 The workaround I'm currently using is to modify the
 enterprise-2.3.0-features.xml so that the jpa feature and the jndi
 feature no longer include the org.apache.aries.util bundle. Then everything
 seems to work (the org.apache.aries.util bundle is installed anyway thanks
 to startup.properties).
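
 [Editor's note: the kind of feature override described above would look
 roughly like this. This is a sketch, not the actual contents of
 enterprise-2.3.0-features.xml; the bundle URLs and version numbers shown
 are illustrative, only the commented-out org.apache.aries.util line is
 the point.]

```xml
<features name="enterprise-2.3.0-patched">
  <!-- Sketch of a jpa feature with the org.apache.aries.util bundle
       removed; it is already installed via startup.properties, so the
       feature no longer needs to (re)install it. -->
  <feature name="jpa" version="2.3.0">
    <bundle>mvn:org.apache.aries.jpa/org.apache.aries.jpa.api/1.0.0</bundle>
    <bundle>mvn:org.apache.aries.jpa/org.apache.aries.jpa.container/1.0.0</bundle>
    <!-- removed: <bundle>mvn:org.apache.aries/org.apache.aries.util/1.0.0</bundle> -->
  </feature>
</features>
```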

 However, I still don't feel comfortable putting this into production until
 I know what is happening.

 /Bengt




 2012/11/5 Bengt Rodehav be...@rodehav.com

 Thanks a lot JB,

 /Bengt


 2012/11/5 Jean-Baptiste Onofré j...@nanthrax.net

 Hi Bengt,

 thanks for the detailed explanation.

 I will try to create a use case (without iPojo) to reproduce the issue
 (in combination with jpa feature).

 Regards
 JB


 On 11/05/2012 04:59 PM, Bengt Rodehav wrote:

 Some more findings...

 It seems like the Karaf Shell (org.apache.karaf.shell.console) bundle
 uses packages from Aries (e.g. org.apache.aries.blueprint), which in turn
 uses packages from org.apache.aries.util. Could it be that, when
 the org.apache.aries.util bundle is installed as part of the jpa
 feature, it somehow causes a refresh that causes dependent bundles
 (such as the org.apache.karaf.shell.console bundle) to be rewired? This
 in turn would probably reinitialize the console (I'm probably using the
 wrong terminology here but you know what I mean...).

 If that is the case, then it seems highly undesirable to include
 the org.apache.aries.util bundle in the jpa feature.

 I don't have an explanation as to why this problem only occurs together
 with iPojo but I assume that it somehow triggers the refresh.

 /Bengt


 2012/11/5 Bengt Rodehav be...@rodehav.com


 BTW, I tried using iPojo 1.6.8 instead to see if this is a problem
 introduced in later iPojo versions. I do, however, get the same
 problems using iPojo 1.6.8 which implies that it's not a newly
 introduced iPojo problem.

 /Bengt


 2012/11/5 Bengt Rodehav be...@rodehav.com


 I'm trying to upgrade my custom Karaf distribution to Karaf
 2.3.0 but have run into some problems. It seems there is some
 kind of conflict between ipojo 1.8.2 and the jpa feature -
 specifically the org.apache.aries.util bundle in the jpa
 feature.

 I install ipojo as a feature (not listed in startup.properties).
 But when I do this I get the following exception:

 2012-11-05 15:51:20,251 | INFO  | l Console Thread | Console | araf.shell.console.jline.Console 199 | 14 - org.apache.karaf.shell.console - 2.3.0 | Exception caught while executing command
 java.lang.UnsupportedOperationException: read() with timeout cannot be called as non-blocking operation is disabled
     at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:134)[14:org.apache.karaf.shell.console:2.3.0]
     at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:246)[14:org.apache.karaf.shell.console:2.3.0]
     at jline.internal.InputStreamReader.read(InputStreamReader.java:259)[14:org.apache.karaf.shell.console:2.3.0]
     at jline.internal.InputStreamReader.read(InputStreamReader.java:196)[14:org.apache.karaf.shell.console:2.3.0]
     at jline.console.ConsoleReader.readCharacter(ConsoleReader.java:1974)[14:org.apache.karaf.shell.console:2.3.0]
     at jline.console.ConsoleReader.readLine(ConsoleReader.java:2174)[14:org.apache.karaf.shell.console:2.3.0]
     at jline.console.ConsoleReader.readLine(ConsoleReader.java:2098)[14:org.apache.karaf.shell.console:2.3.0]
     at org.apache.karaf.shell.console.jline.Console.readAndParseCommand(Console.java:235)[14:org.apache.karaf.shell.console:2.3.0]
     at org.apache.karaf.shell.console.jline.Console.run(Console.java:171)[14:org.apache.karaf.shell.console:2.3.0]
     at java.lang.Thread.run(Thread.java:662)[:1.6.0_32]


 Then it seems like Karaf (or Felix) restarts somehow since I get
