Hi Roman,

Commenting on the hotspot changes ...

On 8/01/2015 9:01 PM, Roman Kennke wrote:
Hi Erik,

I'm CC-ing hotspot-dev for review of Hotspot code related changes.

Yes, some additional changes to Hotspot are required. This is the full
set of changes needed to build and run Shark:

http://cr.openjdk.java.net/~rkennke/shark-build-hotspot/webrev.01/

In detail:

- In the Makefile fix a typo to be able to build unzipped debuginfo.

Ok.

- In ciTypeFlow.cpp, include some files and code only when building C2. I
don't think that code makes sense outside of C2. (That's the issue that
you've seen.)

Looks okay, but someone from the compiler team needs to comment. There may be other code that needs adjusting.
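
For illustration, the guard would be along these lines (just a sketch, using the header named in Erik's error output below; the webrev may guard a different set of includes):

    // ciTypeFlow.cpp -- pull in the C2 headers only when C2 is being
    // built, so Zero/Shark builds (which have no generated ad-files)
    // still compile.
    #ifdef COMPILER2
    #include "opto/compile.hpp"
    #endif // COMPILER2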

- In systemDictionary.cpp, exclude for Shark the code that creates and
checks native wrappers for method handle intrinsics. Invokedynamic is
not yet implemented in Shark, so we can't do this there.

Ok.
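
Just to make the shape of that exclusion concrete (illustrative only; the helper name below is a placeholder, not the actual code in the webrev):

    // systemDictionary.cpp -- Shark has no invokedynamic support yet,
    // so skip creating and checking native wrappers for method handle
    // intrinsics in Shark builds.
    #ifndef SHARK
      create_and_check_native_wrapper(m);   // placeholder for the guarded code
    #endif // SHARK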

- In allocation.hpp, exclude the operator overrides for new etc.; LLVM
does its own allocations, and this would blow up.

I'm intrigued, as the allocation strategy is not tied to the compiler but is pervasive throughout the whole VM runtime.
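
For context, the exclusion follows roughly this pattern (a sketch of the idea, not the exact webrev contents): the VM's override of the global operator new is meant to force VM code through its own allocators, but LLVM allocates through the global operator new itself, so under Shark that override would fire.

    // Sketch only: keep HotSpot's global operator new override out of
    // Shark builds, since LLVM itself allocates via the global operator new.
    #ifndef SHARK
    void* operator new(size_t size) throw() {
      fatal("Should not call global operator new");
      return NULL;
    }
    #endif // SHARK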

- In handles.hpp and handles.inline.hpp, I added an argument
check_in_stack so that I can disable the in-stack check for the
SharkNativeWrapper. The SharkNativeWrapper is allocated on-heap, and so
is the methodHandle inside it. I could have excluded that check
altogether using an #ifndef SHARK, but the way I implemented it seemed
more fine-grained.

I'd prefer an #ifndef SHARK rather than changing the API.
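
Purely to make the two options concrete (illustrative stand-in types only, not the real handles.hpp code):

    #include <cassert>

    struct Thread {
      // Stand-in for Thread::is_in_stack(); trivially true here.
      bool is_in_stack(const void*) const { return true; }
    };

    // Variant 1 (the webrev): an extra check_in_stack argument lets
    // on-heap users such as SharkNativeWrapper opt out of the assert.
    struct HandleA {
      HandleA(Thread* t, bool check_in_stack = true) {
        assert(!check_in_stack || t->is_in_stack(this));
      }
    };

    // Variant 2 (the #ifndef SHARK approach): keep the constructor
    // signature unchanged and compile the assert out of Shark builds.
    struct HandleB {
      explicit HandleB(Thread* t) {
    #ifndef SHARK
        assert(t->is_in_stack(this));
    #endif
      }
    };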

Thanks,
David

- In SharkCompiler, I pulled out the code to initialize LLVM into its
own method, and moved the call to set_state(initialized) out of the
execution-engine lock; the VM would otherwise complain about wrong
lock ordering.
- In SharkRuntime, I changed the cast from intptr_t to oop to work with
the new picky compilers ;-) (Both changes are sketched below.)
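
Roughly, the two changes have this shape (a sketch with illustrative names; the actual diff may differ):

    // SharkCompiler -- lock-ordering fix: hold the execution-engine lock
    // only around the LLVM setup, and mark the compiler initialized
    // after the lock has been released.
    void SharkCompiler::initialize() {
      {
        MutexLocker locker(execution_engine_lock());
        initialize_llvm();   // illustrative name for the extracted method
      }
      set_state(initialized);
    }

    // SharkRuntime -- the cast fix. A plain C-style cast breaks once oop
    // is a real class type (e.g. CHECK_UNHANDLED_OOPS builds), so go
    // through the cast_to_oop helper instead.
    // before: oop obj = (oop) raw_value;
    // after:  oop obj = cast_to_oop(raw_value);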

Please review.

Thanks and best regards,
Roman

On Thursday, 08.01.2015 at 11:03 +0100, Erik Joelsson wrote:
Hello Roman,

This addition looks good to me.

Thinking about what the others said, it might be inconvenient to have
all this pushed to different forests. I tried applying all the changes
to jdk9/hs-rt, but I can't seem to build zeroshark. Did you have more
hotspot changes needed to finish the build?

My failure is:

   ciTypeFlow.o
/localhome/hg/jdk9-hs-rt/hotspot/src/share/vm/ci/ciTypeFlow.cpp
In file included from
/localhome/hg/jdk9-hs-rt/hotspot/src/share/vm/opto/regmask.hpp:29:0,
                   from
/localhome/hg/jdk9-hs-rt/hotspot/src/share/vm/opto/compile.hpp:40,
                   from
/localhome/hg/jdk9-hs-rt/hotspot/src/share/vm/ci/ciTypeFlow.cpp:38:
/localhome/hg/jdk9-hs-rt/hotspot/src/share/vm/opto/optoreg.hpp:40:39:
fatal error: adfiles/adGlobals_zero.hpp: No such file or directory

From what I can see, adfiles are not generated for zero or zeroshark
builds, so the include should probably be removed.

Would you still like me to push what you currently have to hs-rt?

/Erik

On 2015-01-07 21:21, Roman Kennke wrote:
Hi Erik,

When I built Zero and Shark on my Raspberry Pi, I noticed another
problem when copying jvm.cfg into the right places. I fixed it in a
similar way as I did for the SA stuff:

http://cr.openjdk.java.net/~rkennke/shark-build-jdk/webrev.02/

I think that should be all for now.

Please push that into JDK9 if you think that's fine.

Best regards,
Roman

On Wednesday, 07.01.2015 at 17:49 +0100, Erik Joelsson wrote:
On 2015-01-07 17:29, Roman Kennke wrote:
On Wednesday, 07.01.2015 at 17:16 +0100, Erik Joelsson wrote:
On 2015-01-07 17:11, Roman Kennke wrote:
Hi Erik,

Do you have a bug for this?
No.

I haven't pushed any changes to JDK in a while. Is it possible in the
meantime for me to create my own bugs? Otherwise, please file one for
me :-)
You should be able to log in to https://bugs.openjdk.java.net and create
bugs since you have an OpenJDK identity.
Done:

https://bugs.openjdk.java.net/browse/JDK-8068598

While I'm at it, is it possible for me to push my own changes (except
hotspot, of course)? If yes, what needs to be done to regenerate the
configure files? Simply run autogen.sh in common/autoconf with whatever
version of autotools I have? Or does it not make sense at all because you
need to regenerate your closed scripts?
It requires you to run common/autogen.sh, yes, and that will require you
to have autoconf 2.69 installed. But since we also need to regenerate
the closed version, I can take care of the push for you. Will do it
tomorrow if that's ok?

/Erik
Roman




