Re: How about using JDK 21 in the official docker image?

2023-10-26 Thread Houston Putman
I've changed the nightly docker jobs to also create "-java21" images for
main, branch_9x and branch_9_4, so we can start testing this out before
making it official.

The new images should become available in the next 24 hours.

e.g. apache/solr-nightly:9.4.1-SNAPSHOT-java21
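
For anyone who wants to kick the tires once the tag shows up, a quick sanity
check along these lines should confirm the bundled runtime (the tag below is
just the example above and may differ; adjust to whatever the nightly job
actually produces):

  # Pull the nightly -java21 image and print the JDK version it ships with.
  # The tag name is the example from this email, not necessarily the final one.
  docker pull apache/solr-nightly:9.4.1-SNAPSHOT-java21
  docker run --rm --entrypoint java apache/solr-nightly:9.4.1-SNAPSHOT-java21 -version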

- Houston

On Thu, Oct 26, 2023 at 12:19 PM Houston Putman  wrote:

> We could also use Java 21 for the 9x and main nightly images! It's easy to
> change in the Jenkins jobs.
>
> - Houston
>
> On Wed, Oct 25, 2023 at 6:22 PM Jan Høydahl  wrote:
>
>> I agree on being conservative here. But if it turns out to work well, we
>> could consider publishing an additional solr:9.4.0-jre21 tag. That way
>> early adopters have a choice. If I remember correctly, Java 21 has some
>> improvements that can benefit certain vector workloads, so I see
>> a benefit in getting it out there. We could alternatively opt to push
>> temporary images like this to our own apache/solr docker namespace for
>> folks to try out.
>>
>> Jan
>>
>> > On 24 Oct 2023, at 18:17, Shawn Heisey wrote:
>> >
>> > On 10/18/2023 10:11 AM, Tomasz Elendt wrote:
>> >> I noticed that JDK 21 LTS was released some time ago. Is there any
>> reason why official docker images still use JDK 17?
>> >> I'm asking because I know there are some preview JDK features that
>> Lucene utilizes, and Solr enables them when it detects a newer Java
>> version (e.g. SOLR-16500).
>> >> Does it make sense to switch now that there is a new LTS version?
>> >
>> > I have no desire to stand in the way of progress, but Java 21 has only
>> been out for a month.  I don't think it's a good idea to rely on a new
>> major version of *anything* that soon after its release.  Test with it, but
>> don't switch to it.
>> >
>> > I do not think we should be planning on such a major upgrade to the
>> docker image until Java 21 has been out for a while.  I was going to
>> upgrade my Solr server to Java 21 to try it out since it's not a mission
>> critical install, but Ubuntu doesn't yet have OpenJDK packages for it. The
>> eclipse-temurin:21-jre-jammy docker image was pushed 11 days ago.
>> >
>> > My thought on it is to wait until at least the release of Java 22,
>> which will happen six months after Java 21 was released.
>> >
>> > Thanks,
>> > Shawn
>> >
>> >
>>
>>
>>
>>


Re: Jenkins Build Errors

2023-10-26 Thread Houston Putman
Ok, I think I fixed the docker tests. The other issues all still apply
though.

- Houston

On Thu, Oct 26, 2023 at 12:16 PM Houston Putman  wrote:

> The Jenkins builds aren't in a great state right now.
>
> Currently the Solr-Check-main
>  build is
> failing consistently because of random Solr processes being found on the
> box (when the integration tests expect nothing else to be running). Now
> that we have port randomization for the integration tests, it's a very good
> sign that the found Solr processes all use port 8983, meaning that we
> aren't leaking Solr processes from the integration tests.
>
> Because of this, the culprit seems to be that the smoke tests (which still
> start a Solr on port 8983) are leaking processes, and looking at the logs,
> that seems to be the case (Solr-Smoketest-9.4
> ,
> Solr-Smoketest-9.x
> ). So
> stopping the smoke tests from leaking Solr processes will in turn fix
> both the smoke test builds and the main check.
>
> As for the Solr-Check-9.x
>  build, it is
> running on Crave, so it doesn't have the same issue with leaked Solr
> processes. However, on Crave, there seems to be an issue with the mTLS
> tests. (Solr-Check-main also has this issue, but strangely only on the
> lucene-solr-1 machine, not lucene-solr-2.) We need to investigate why the
> TLS tests pass locally for everyone (and on one of the two Jenkins boxes),
> but not on Crave.
>
> Lastly, the Docker tests are broken in a very strange way. A while ago, I
> added tests to make sure that the prometheus exporter can communicate
> correctly in docker. This test seems to fail on both
> Solr-Docker-Nightly-main
>  and
> Solr-Docker-Nightly-9.x
> . At
> first I thought the issue was that the Jenkins servers had different Docker
> networking that didn't support these tests, and I let it be for a bit. Now
> we are running Solr-Docker-Nightly-9.4
> ,
> which includes the same tests, and it passes. So it does seem like the
> Jenkins servers allow us to use Docker networking in the ways we want, but
> for some reason 9.x and 9.4 (which should be relatively identical) don't
> behave the same way. Looking at the err logs, the problem is
>
>> /opt/solr/docker/scripts/docker-entrypoint.sh: line 48: exec:
>> solr-exporter: not found
>>
> Off the top of my head, I think this might be because the job is using the
> slim docker image? Because otherwise there's no reason why the solr
> exporter wouldn't be there... (Also no idea why it wouldn't work the same
> on the 9.4 build...)
>
> Anyways, this is just a list of what's going on. I'll try to fix the
> docker stuff, but would love help with the other builds!
>
> - Houston
>


Re: How about using JDK 21 in the official docker image?

2023-10-26 Thread Houston Putman
We could also use Java 21 for the 9x and main nightly images! It's easy to
change in the Jenkins jobs.

- Houston

On Wed, Oct 25, 2023 at 6:22 PM Jan Høydahl  wrote:

> I agree on being conservative here. But if it turns out to work well, we
> could consider publishing an additional solr:9.4.0-jre21 tag. That way
> early adopters have a choice. If I remember correctly, Java 21 has some
> improvements that can benefit certain vector workloads, so I see
> a benefit in getting it out there. We could alternatively opt to push
> temporary images like this to our own apache/solr docker namespace for
> folks to try out.
>
> Jan
>
> > On 24 Oct 2023, at 18:17, Shawn Heisey wrote:
> >
> > On 10/18/2023 10:11 AM, Tomasz Elendt wrote:
> >> I noticed that JDK 21 LTS was released some time ago. Is there any
> reason why official docker images still use JDK 17?
> >> I'm asking because I know there are some preview JDK features that
> Lucene utilizes, and Solr enables them when it detects a newer Java
> version (e.g. SOLR-16500).
> >> Does it make sense to switch now that there is a new LTS version?
> >
> > I have no desire to stand in the way of progress, but Java 21 has only
> been out for a month.  I don't think it's a good idea to rely on a new
> major version of *anything* that soon after its release.  Test with it, but
> don't switch to it.
> >
> > I do not think we should be planning on such a major upgrade to the
> docker image until Java 21 has been out for a while.  I was going to
> upgrade my Solr server to Java 21 to try it out since it's not a mission
> critical install, but Ubuntu doesn't yet have OpenJDK packages for it. The
> eclipse-temurin:21-jre-jammy docker image was pushed 11 days ago.
> >
> > My thought on it is to wait until at least the release of Java 22, which
> will happen six months after Java 21 was released.
> >
> > Thanks,
> > Shawn
> >
> >
>
>
>
>


Jenkins Build Errors

2023-10-26 Thread Houston Putman
The Jenkins builds aren't in a great state right now.

Currently the Solr-Check-main
 build is
failing consistently because of random Solr processes being found on the
box (when the integration tests expect nothing else to be running). Now
that we have port randomization for the integration tests, it's a very good
sign that the found Solr processes all use port 8983, meaning that we
aren't leaking Solr processes from the integration tests.

Because of this, the culprit seems to be that the smoke tests (which still
start a Solr on port 8983) are leaking processes, and looking at the logs,
that seems to be the case (Solr-Smoketest-9.4
,
Solr-Smoketest-9.x
). So stopping
the smoke tests from leaking Solr processes will in turn fix both the smoke
test builds and the main check.
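
In case it helps whoever picks this up, a post-build check along these lines
would at least make any leak obvious (this is just a sketch; the actual
cleanup or assertion step in the Jenkins jobs may need to look different):

  # Hypothetical post-build check: fail loudly if something is still
  # listening on Solr's default port after the smoke test finishes.
  if lsof -i :8983 -sTCP:LISTEN > /dev/null 2>&1; then
    echo "Leaked Solr process still listening on 8983:"
    lsof -i :8983 -sTCP:LISTEN
    exit 1
  fi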

As for the Solr-Check-9.x
 build, it is
running on Crave, so it doesn't have the same issue with leaked Solr
processes. However, on Crave, there seems to be an issue with the mTLS
tests. (Solr-Check-main also has this issue, but strangely only on the
lucene-solr-1 machine, not lucene-solr-2.) We need to investigate why the
TLS tests pass locally for everyone (and on one of the two Jenkins boxes),
but not on Crave.
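
For anyone who wants to dig into this, reproducing locally should just be a
matter of running the affected tests with a filter, something like the
following (the module and name pattern are guesses on my part, not the exact
classes the build runs):

  # Run SSL/TLS-related test classes from solr/core locally; adjust the
  # module and pattern to match the classes that actually fail on Crave.
  ./gradlew -p solr/core test --tests "*SSL*"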

Lastly, the Docker tests are broken in a very strange way. A while ago, I
added tests to make sure that the prometheus exporter can communicate
correctly in docker. This test seems to fail on both
Solr-Docker-Nightly-main
 and
Solr-Docker-Nightly-9.x
. At
first I thought the issue was that the Jenkins servers had different Docker
networking that didn't support these tests, and I let it be for a bit. Now
we are running Solr-Docker-Nightly-9.4
, which
includes the same tests, and it passes. So it does seem like the Jenkins
servers allow us to use Docker networking in the ways we want, but for some
reason 9.x and 9.4 (which should be relatively identical) don't behave the
same way. Looking at the err logs, the problem is

> /opt/solr/docker/scripts/docker-entrypoint.sh: line 48: exec:
> solr-exporter: not found
>
Off the top of my head, I think this might be because the job is using the
slim docker image? Because otherwise there's no reason why the solr exporter
wouldn't be there... (Also no idea why it wouldn't work the same on the 9.4
build...)
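
If someone wants to check the slim-image theory before I get to it, something
like this should tell us whether the exporter is in the image at all (the tag
and path here are assumptions on my part; substitute the tag the failing
nightly job actually builds):

  # Does the image under test actually ship the prometheus exporter script?
  # Tag name and path are assumptions; adjust to the nightly job's output.
  docker run --rm --entrypoint bash apache/solr-nightly:9.4.1-SNAPSHOT-slim \
    -c 'ls -l /opt/solr/prometheus-exporter/bin/ || echo "solr-exporter not bundled"'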

Anyways, this is just a list of what's going on. I'll try to fix the docker
stuff, but would love help with the other builds!

- Houston


Re: Issue with marking replicas down at startup

2023-10-26 Thread Vincent Primault
Hello, I created a JIRA to track this:
https://issues.apache.org/jira/browse/SOLR-17049

On Thu, Oct 26, 2023 at 3:30 PM rajani m  wrote:

> Is this an issue in that case? If so, should we create a jira to address
> it?
>
> On Sat, Oct 7, 2023 at 8:32 PM Mark Miller  wrote:
>
> > Yeah, it’s not going to fix the problem of updates coming in too early if
> > you just delay when the replica publishes itself as active. It’s still
> > going to show up as active when it’s not. That gets rectified if you end
> > up replicating the index; it’s when you peer sync that it can be a
> > persistent problem. And in both cases, you can end up with a window of
> > incorrect queries.
> >
> > The most straightforward way to handle it is to use the cluster state
> > rather than the cores from the core container when publishing down (where
> > it doesn’t currently use the downnode command) and when waiting to see
> > the state.
> >
>


Re: Issue with marking replicas down at startup

2023-10-26 Thread rajani m
Is this an issue in that case? If so, should we create a jira to address it?

On Sat, Oct 7, 2023 at 8:32 PM Mark Miller  wrote:

> Yeah, it’s not going to fix the problem of updates coming in too early if
> you just delay when the replica publishes itself as active. It’s still going
> to show up as active when it’s not. That gets rectified if you end up
> replicating the index; it’s when you peer sync that it can be a persistent
> problem. And in both cases, you can end up with a window of incorrect
> queries.
>
> The most straightforward way to handle it is to use the cluster state
> rather than the cores from the core container when publishing down (where
> it doesn’t currently use the downnode command) and when waiting to see the
> state.
>


Help Solr Newsletter October 2023 with links, blogs, articles

2023-10-26 Thread Arrieta, Alejandro
Hello Solr Community,

We are working on the October Newsletter, and Solr/Lucene/search-related
content is welcome.

Please check the draft here:
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=272927879

If the content you want to suggest is not there, please add it to the wiki
or reply to this email thread with the following:

Title
A two-line description
Link

What we are looking for is:

- Meetups or conferences taking place in the coming months, starting in
November 2023.
- Blogs, photos, videos, and recordings from Search/Solr/Lucene-related
meetups or conferences held in October 2023 or earlier. If a talk was good,
please send us the link.
- Any article, blog post, or link you think is exciting and relevant to the
Solr/Search/Lucene community, even if it is older.

We plan to finish the Solr Newsletter October 2023 edition in the first
days of November, so you still have five days to contribute.

Kind Regards,
Alejandro Arrieta