Oh no, completely forgot about that one -.- 
Sorry about that!

I don't think it really matters whether it's changed after tagging 3.4.0 
or not, as it only affects the tags on hub.docker.com. It would just be good 
if we could change it before 3.4.0 gets deployed. Should I make a CTR commit 
to master, or would you prefer another approach since we're currently in the 
release process?

Alternatively, you could also add the tag manually this one time and then we'll 
resolve the ticket later for the next release. These commands should do the 
trick (assuming that you built the 3.4.0 images already, e.g., with 'mvn clean 
install -pl gremlin-server,gremlin-console -DdockerImages'):
docker tag tinkerpop/gremlin-console:3.4.0 tinkerpop/gremlin-console:latest
docker push tinkerpop/gremlin-console:latest
docker tag tinkerpop/gremlin-server:3.4.0 tinkerpop/gremlin-server:latest
docker push tinkerpop/gremlin-server:latest

This works as long as 3.3.5 is already deployed, as that release would 
otherwise apply the latest tag again and thereby overwrite latest for the 
3.4.0 images.
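In case it helps to sanity-check the retagging, here's a small sketch (assuming Docker is available locally and the two images from the commands above have been built) that compares the local image IDs behind the 3.4.0 and latest tags; they should be identical once the retagging is done:

```shell
#!/bin/sh
# Verify that the 'latest' tag points at the same local image as '3.4.0'.
# Assumes the tinkerpop/gremlin-console and tinkerpop/gremlin-server
# images from the commands above are present locally.
for img in tinkerpop/gremlin-console tinkerpop/gremlin-server; do
  id_340=$(docker inspect --format '{{.Id}}' "$img:3.4.0")
  id_latest=$(docker inspect --format '{{.Id}}' "$img:latest")
  if [ "$id_340" = "$id_latest" ]; then
    echo "$img: OK (latest matches 3.4.0)"
  else
    echo "$img: MISMATCH (latest does not match 3.4.0)" >&2
    exit 1
  fi
done
```
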

-----Original Message-----
From: Stephen Mallette <spmalle...@gmail.com> 
Sent: Monday, January 7, 2019 13:07
To: dev@tinkerpop.apache.org
Subject: Re: [DISCUSS] code freeze 3.2.11/3.3.5/3.4.0

Florian - i suppose this one needs to be done?

https://issues.apache.org/jira/browse/TINKERPOP-1980

should it have been done before we tagged 3.4.0 and built all the artifacts? 
does it matter?

On Fri, Jan 4, 2019 at 4:52 PM Stephen Mallette <spmalle...@gmail.com>
wrote:

> well - that was absolute insanity right down to the
> validate-distribution.sh on 3.4.0. Please make sure you pull master
> before testing 3.4.0 as kuppitz had to drop in a patch..........
>
> On Wed, Jan 2, 2019 at 4:50 PM Stephen Mallette <spmalle...@gmail.com>
> wrote:
>
>> so, 3.2.11 and 3.3.5 releases are prepared - vote threads were just sent.
>> i would have had 3.4.0 done too but i flubbed the 2018/2019 date on
>> 3.3.5 and had to redo the whole thing basically - so that was rotten.
>> I'll have 3.4.0 out tomorrow.
>>
>> On Wed, Jan 2, 2019 at 7:34 AM Stephen Mallette 
>> <spmalle...@gmail.com>
>> wrote:
>>
>>> yes. i can also use a specific commit, but it invariably leaves out 
>>> commits for a release. i guess those commits left out will just be 
>>> release manager commits typically, but i dunno, it just bugged me 
>>> that it wasn't the actual 3.2.10..3.2.11
>>>
>>> On Wed, Jan 2, 2019 at 7:15 AM Robert Dale <robd...@gmail.com> wrote:
>>>
>>>> Can't you just use HEAD?
>>>>
>>>> git shortlog -sn 3.2.10..HEAD
>>>>
>>>> Robert Dale
>>>>
>>>>
>>>> On Wed, Jan 2, 2019 at 7:11 AM Stephen Mallette 
>>>> <spmalle...@gmail.com>
>>>> wrote:
>>>>
>>>> > All PRs appear merged....I've started the release process
>>>> > beginning with 3.2.11. Should see VOTE threads coming soon.
>>>> >
>>>> > Separately, I already found a problem with using git shortlog to
>>>> > generate the "contributor list" as there is no 3.2.11 tag to use in
>>>> > the arguments given to it. It's a bit of a cart before the horse
>>>> > problem. I will come up with something else to deal with that.
>>>> > Probably related to the website which can be updated out of band
>>>> > with the release. Just wanted to point out the problem so that no
>>>> > one would be expecting to see that content in the release notes
>>>> > when the VOTE threads go out.
>>>> >
>>>> > On Tue, Jan 1, 2019 at 8:31 AM Stephen Mallette
>>>> > <spmalle...@gmail.com> wrote:
>>>> >
>>>> > > I don't know what is going on but yesterday i spent a few minutes
>>>> > > messing with the failing python build with radish and after a few
>>>> > > failures it just started working again locally. Then I re-ran
>>>> > > failed travis jobs and they started passing. I guess the problem
>>>> > > was fixed externally - weird. I expect to start the release
>>>> > > process soon - there are just a couple more longstanding PRs to
>>>> > > merge that delayed a little during the holiday period.
>>>> > >
>>>> > >
>>>> > >
>>>> > > On Fri, Dec 28, 2018 at 6:10 PM Daniel Kuppitz
>>>> > > <m...@gremlin.guru> wrote:
>>>> > >
>>>> > >> A quick & dirty fix for the broken radish libs to at least get
>>>> > >> a clean build in docker:
>>>> > >>
>>>> > >> diff --git a/docker/scripts/build.sh b/docker/scripts/build.sh
>>>> > >> index e172ddaaa9..8b56cf53c3 100755
>>>> > >> --- a/docker/scripts/build.sh
>>>> > >> +++ b/docker/scripts/build.sh
>>>> > >> @@ -78,7 +78,10 @@ if [ -r "settings.xml" ]; then
>>>> > >>    cp settings.xml ~/.m2/
>>>> > >>  fi
>>>> > >>
>>>> > >> -mvn clean install process-resources ${TINKERPOP_BUILD_OPTIONS} || exit 1
>>>> > >> +mvn clean install -DskipTests
>>>> > >> +sed -i 's/background=background,/background=background/g' gremlin-python/target/python2/env/local/lib/python2.7/site-packages/radish/parser.py
>>>> > >> +
>>>> > >> +mvn install process-resources ${TINKERPOP_BUILD_OPTIONS} || exit 1
>>>> > >>  [ -z "${BUILD_JAVA_DOCS}" ] || mvn process-resources -Djavadoc || exit 1
>>>> > >>
>>>> > >>  if [ ! -z "${BUILD_USER_DOCS}" ]; then
>>>> > >>
>>>> > >>
>>>> > >>
>>>> > >> Cheers,
>>>> > >> Daniel
>>>> > >>
>>>> > >>
>>>> > >> On Thu, Dec 27, 2018 at 6:18 AM Stephen Mallette
>>>> > >> <spmalle...@gmail.com> wrote:
>>>> > >>
>>>> > >> > Hi all, just checking in during the holiday period. My laptop
>>>> > >> > will return to the off position shortly, but I wanted to point
>>>> > >> > out that we seem to have a problem with Python:
>>>> > >> >
>>>> > >> > https://travis-ci.org/apache/tinkerpop/jobs/472444735
>>>> > >> >
>>>> > >> > every current PR seems to be in fail mode right now. We will
>>>> > >> > want to sort that out before release. Looks like something in
>>>> > >> > radish. If anyone can have a look this week that would be
>>>> > >> > helpful as I won't have time to dig on it too deeply until
>>>> > >> > next week (which is when we are supposed to be releasing).
>>>> > >> >
>>>> > >> > On Mon, Dec 17, 2018 at 7:09 AM Stephen Mallette
>>>> > >> > <spmalle...@gmail.com> wrote:
>>>> > >> >
>>>> > >> > > While I think we could go on forever adding things to 3.4.0
>>>> > >> > > I think it's time to cut it off and release. There's too
>>>> > >> > > many good things in there to hold any longer. We are looking
>>>> > >> > > at releasing 3.2.11, 3.3.5 and 3.4.0.
>>>> > >> > >
>>>> > >> > > I'd propose we polish up remaining items this week, set for
>>>> > >> > > code freeze 12/22 and then build the release for VOTE the
>>>> > >> > > week of the 31st. I assume that there are enough PMC members
>>>> > >> > > around the holiday period to VOTE on the release artifacts.
>>>> > >> > > If the VOTE has to stay open a bit longer than is typical
>>>> > >> > > then that's ok. I'm happy to just do all three releases
>>>> > >> > > myself this time as it might be hard to coordinate with
>>>> > >> > > others during the holiday period.
>>>> > >> > >
>>>> > >> > > We still have a number of important things to finish -
>>>> > >> > > specifically:
>>>> > >> > >
>>>> > >> > > 1. code reviews on open PRs with ids > 1000
>>>> > >> > > 2. finish up the GraphBinary - jorge is adding two more
>>>> > >> > >    serializers and then i think we can call this a day and
>>>> > >> > >    put it up for review.
>>>> > >> > > 3. documentation review
>>>> > >> > > 4. anything else?
>>>> > >> > >
>>>> > >> > > As usual, let's continue to use this thread for release
>>>> > >> > > coordination heading into code freeze.
>>>> > >> > >
>>>> > >> > >
>>>> > >> > >
>>>> > >> >
>>>> > >>
>>>> > >
>>>> >
>>>>
>>>
