Re: Proposal: jtreg tests with native components

2014-04-30 Thread David Holmes

On 30/04/2014 9:39 PM, Staffan Larsen wrote:


On 30 apr 2014, at 11:39, David Holmes  wrote:


Hi Staffan,

On 25/04/2014 10:02 PM, Staffan Larsen wrote:

There are a couple of jtreg tests today that depend on native components 
(either JNI libraries or executables). These are handled in one of two ways:

1) The binaries are pre-compiled and checked into the repository (often inside 
jar files).
2) The test will try to invoke a compiler (gcc, cl, …) when the test is being 
run.
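
For concreteness, such a native component is often just a small JNI library.
A minimal sketch in C (the class and function names are hypothetical, purely
for illustration):

#include <jni.h>

/* Native half of a hypothetical jtreg test class TestNative; compiled
 * into a shared library that the test loads via System.loadLibrary. */
JNIEXPORT jint JNICALL
Java_TestNative_add(JNIEnv *env, jclass cls, jint a, jint b) {
    return a + b;
}

It is exactly this compile-into-a-shared-library step that options #1 and #2
handle in different ways.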

Neither of these is a very good solution. #1 makes it hard to set up the 
test for all platforms and requires binaries in the source control system. #2 
is hit-and-miss: the correct compiler may or may not be installed on the test 
machine, and the approach requires platform-specific logic to be maintained.


#2 is far from perfect but ...


I would like to propose that these native components are instead compiled when 
the product is built, using the same makefile logic as the product. At product 
build time we know we have access to the (correct) compilers and we have 
excellent support in the makefiles for building on all platforms.

If we build the native test components together with the product, we also have 
to take care of distributing the result together with the product when we do 
testing across a larger number of machines. We will also need a way to tell the 
jtreg tests where these pre-built binaries are located.


don't underestimate the complexity involved in building and then "distributing" 
the test binaries.


I don’t. It will be complicated, but I’m sure we can do it.


The question is whether it is worth it relative to the size of the problem.



You will still need to maintain platform specific logic as you won't 
necessarily be able to use the CFLAGS etc that the main build process uses.


Can you explain more? Why can’t I use CFLAGS as it is?


You _may_ be able to, you may not. I know we already had issues where 
the CFLAGS being used for the JDK sources also got applied to 
building the code-generator utility programs, and that didn't work 
correctly. Here's a sample CFLAGS from a JDK build:


CFLAGS_JDKLIB := -W -Wall -Wno-unused -Wno-parentheses -pipe \
    -D_GNU_SOURCE -D_REENTRANT -D_LARGEFILE64_SOURCE \
    -fno-omit-frame-pointer -D_LITTLE_ENDIAN -DLINUX -DNDEBUG \
    -DARCH='"i586"' -Di586 -DRELEASE='"$(RELEASE)"' \
    -I/export/users/dh198349/ejdk8u-dev/build/b13/linux-i586-ea/jdk/include \
    -I/export/users/dh198349/ejdk8u-dev/build/b13/linux-i586-ea/jdk/include/linux \
    -I/export/users/dh198349/jdk8u-dev/jdk/src/share/javavm/export \
    -I/export/users/dh198349/jdk8u-dev/jdk/src/solaris/javavm/export \
    -I/export/users/dh198349/jdk8u-dev/jdk/src/share/native/common \
    -I/export/users/dh198349/jdk8u-dev/jdk/src/solaris/native/common \
    -m32 -fno-strict-aliasing -fPIC


Does that make sense for compiling a test? Does it depend on whether we 
are building a native library or a native executable?
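
For reference, part of the answer is clearly yes: flags like -fPIC in the
dump above matter when producing a shared JNI library but do nothing useful
for a standalone test executable. A hedged sketch of the two compile lines,
makefile-style, with file names invented for illustration:

gcc $(CFLAGS_JDKLIB) -shared -o libTestNative.so TestNative.c  # shared JNI library: -fPIC/-shared apply
gcc -W -Wall -o TestLauncher TestLauncher.c                    # plain executable: most JDKLIB flags don't apply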




Also talk to SQE as I'm pretty sure there is an existing project to look at how 
to better handle this, at least for the internal test suites.


I have talked to SQE. I don’t know of any other projects to handle this.


:) It wasn't SQE, it was your project as referenced in a few bug reports 
last August/September.


David



/Staffan




David
-


I suggest that at the end of a distributed build run, the pre-built test 
binaries are packaged in a zip or tar file (just like the product bits) and 
stored next to the product bundles. When we run distributed tests, we need to 
pick up the product bundle and the test bundle before the testing is started.

To tell the tests where the native code is, I would like to add a flag to jtreg 
to point out the path to the binaries. This should cause jtreg to set 
java.library.path before invoking a test and also set a test.* property which 
can be used by the test to find its native components.
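
A hypothetical invocation under this proposal (the flag and property names
below are illustrative, not existing jtreg options) might look like:

jtreg -nativepath:/path/to/test/natives ...

with jtreg setting java.library.path to that directory and exporting it to
the test as, say, test.nativepath.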

This kind of setup would make it easier to add and maintain tests that have a 
native component. I think this will be especially important as more tests are 
written using jtreg in the hotspot repository.

Thoughts on this? Is the general approach ok? There are lots of details to be 
figured out, but at this stage I would like to hear feedback on the idea as 
such.

Thanks,
/Staffan





Re: RFR: 8034094: SA agent can't compile when jni_x86.h is used

2014-04-30 Thread Dmitry Samersoff
Erik,

Sorry, missed the thread.

The changes (option 2) look good to me.

-Dmitry

On 2014-02-10 19:21, Erik Helin wrote:
> Sigh, I forgot the subject...
> 
> "RFR: 8034094: SA agent can't compile when jni_x86.h is used"
> 
> Thanks,
> Erik
> 
> On 2014-02-10 16:08, Erik Helin wrote:
>> Hi all,
>>
>> this patch fixes an issue with HotSpot's makefiles, IMPORT_JDK and
>> jni_md.h.
>>
>> The bug manifests itself when using an IMPORT_JDK whose
>> include/linux/jni_md.h has a timestamp that is older than
>> hotspot/src/cpu/x86/jni_x86.h. When this happens, the Makefiles will
>> copy hotspot/src/cpu/x86/jni_x86.h to
>> hotspot/build/jdk-linux-amd64/fastdebug/include/linux/jni_md.h.
>>
>> The issue is that hotspot/src/cpu/x86/jni_x86.h differs slightly from
>> jdk/include/jni.h, since it is used for all operating systems:
>>
>> #if defined(SOLARIS) || defined(LINUX) || defined(_ALLBSD_SOURCE)
>> ... // common stuff
>> #else
>> ... // windows stuff
>> #endif
>>
>> We compile the SA agent, see make/linux/makefiles/saproc.make, without
>> defining LINUX (LINUX is hotspot's own define, gcc uses __linux__).
>>
>> In my opinion, there are two ways to solve this:
>> 1. Add -DLINUX to make/linux/makefiles/saproc.make (and corresponding
>> defines for Solaris and BSD).
>> 2. Rewrite the #if check in jni_x86.h to use platform specific "native"
>> defines.
>>
>> I've created a patch for each alternative:
>> 1: http://cr.openjdk.java.net/~ehelin/8034094/webrev.1/
>> 2: http://cr.openjdk.java.net/~ehelin/8034094/webrev.2/
>>
>> For the second patch, note that I've inverted the #if check so that it
>> checks whether _WIN32 is defined and treats all other operating systems as
>> "#else".
>>
>> Bug:
>> https://bugs.openjdk.java.net/browse/JDK-8034094
>>
>> Testing:
>> - Compiled both versions locally and made sure they worked
>> - JPRT
>>
>> Thanks,
>> Erik


-- 
Dmitry Samersoff
Oracle Java development team, Saint Petersburg, Russia
* I would love to change the world, but they won't give me the sources.


Re: Cross-building Windows binaries using the mingw toolchain

2014-04-30 Thread Volker Simonis
On Wed, Apr 30, 2014 at 6:31 PM, Florian Weimer  wrote:
> On 04/30/2014 06:16 PM, Volker Simonis wrote:
>
>> The first one is to make the OpenJDK compile on Windows with the MinGW
>> toolchain (instead of Cygwin). This currently doesn't work out of the
>> box but is relatively easy to achieve (see for example "8022177:
>> Windows/MSYS builds broken"
>> https://bugs.openjdk.java.net/browse/JDK-8022177). Magnus and I are
>> working on this (and actually I have an internal build which works
>> with MinGW). Hopefully we can fix this in the OpenJDK soon.
>
>
> Thanks for your input.  If you say "MinGW toolchain", you mean the scripting
> environment, but not the compiler and linker, right?
>
>
>> The second one is to cross compile the whole OpenJDK on Linux using
>> gcc and MinGW. If I understood you right, that's what you actually
>> wanted.
>
>
> Yes, that's what I'm interested in.
>
>
>> I personally think that would be nice to have, but at the same
>> time I also think it would be quite hard to get there and probably not
>> worth doing, because even if you succeeded, nobody would
>> maintain it and it would break quite soon (see for example the
>> GCC/Solaris build or the Clang/Linux build).
>
>
> It's clear to me that this is worthwhile only if I set up a builder which
> detects bit rot quickly.
>
>
>> If you want to try it nevertheless, some of the problems you will face
>> will be at least the following ones:
>> - convert the HotSpot nmake makefiles to GNU syntax (notice that this
>> project is currently underway under the umbrella of the new build
>> system anyway, so you'd probably want to wait to avoid doing double
>> work)
>
>
> Ah, interesting.
>
>
>> - convert Visual Studio intrinsics, inline assembler and compiler
>> idiosyncrasies to GCC syntax
>
>
> Ahh, I wonder how much I will encounter there.  That would be a prerequisite
> for a pure-mingw build on Windows as well, right?
>
>
>> - you'll probably also need to cross compile dependencies like
>> libfreetype with GCC/MinGW
>
>
> Fedora already covers those, although the paths are somewhat unexpected.
>
>
>> - I'm actually not an expert here, but the OpenJDK is linked against some
>> native Windows libraries like DirectX and runtime libraries from the
>> Microsoft SDKs, and I don't know how that would
>> work for a cross-compile.
>
>
> We supposedly have the headers and import libraries in Fedora.
>
>
>> I personally think we should rather focus on further improving the
>> current Windows build. It's already a huge improvement compared to the
>> old JDK7 Windows build. From what I see, the main remaining problems
>> are to somehow make it possible to get a stable, defined and free
>> version of the Microsoft development tools which are "known to work".
>
>
> Yes, I tried to set up a Windows development environment, but quickly got
> confused.
>
> My background here is that I want to contribute some new features and I
> expect that feature parity for Windows will increase the likelihood of
> acceptance.
>

But why can't you install Cygwin and the free Microsoft Express/SDK
compilers and do a native build? From my experience that's a matter of
some hours (and you can find quite a bit of documentation, tutorials and
help on the web and on the various mailing lists). Doing that, you could be
sure that you really test what others (especially Oracle) will
get. Cross-compiling your new feature with a MinGW toolchain doesn't
mean that others will be able to compile and run that code with a
native Windows build toolchain (it would actually be quite easy to
introduce changes which work for the MinGW cross-build but break the
native build), so I don't see how that would increase the
confidence in your change.

From my experience, native OpenJDK changes (and often even trivial
ones) should be built and tested at least on Linux and Windows (this
already exercises your changes on two different OSs with two different
compilers). Bigger shared changes should also be tested on Mac OS X
(which is quite "cheap" and gives you a third OS and compiler) and
Solaris (which is hard nowadays).

> I need to think about it, but for my purposes, a pure mingw environment
> running on Windows would work as well.  It would be less specialized, so
> perhaps there is more interest in that.
>
>
> --
> Florian Weimer / Red Hat Product Security Team


Re: Cross-building Windows binaries using the mingw toolchain

2014-04-30 Thread Florian Weimer

On 04/30/2014 06:16 PM, Volker Simonis wrote:


The first one is to make the OpenJDK compile on Windows with the MinGW
toolchain (instead of Cygwin). This currently doesn't work out of the
box but is relatively easy to achieve (see for example "8022177:
Windows/MSYS builds broken"
https://bugs.openjdk.java.net/browse/JDK-8022177). Magnus and I are
working on this (and actually I have an internal build which works
with MinGW). Hopefully we can fix this in the OpenJDK soon.


Thanks for your input.  If you say "MinGW toolchain", you mean the 
scripting environment, but not the compiler and linker, right?



The second one is to cross compile the whole OpenJDK on Linux using
gcc and MinGW. If I understood you right, that's what you actually
wanted.


Yes, that's what I'm interested in.

I personally think that would be nice to have, but at the same
time I also think it would be quite hard to get there and probably not
worth doing, because even if you succeeded, nobody would
maintain it and it would break quite soon (see for example the
GCC/Solaris build or the Clang/Linux build).


It's clear to me that this is worthwhile only if I set up a builder 
which detects bit rot quickly.



If you want to try it nevertheless, some of the problems you will face
will be at least the following ones:
- convert the HotSpot nmake makefiles to GNU syntax (notice that this
project is currently underway under the umbrella of the new build
system anyway, so you'd probably want to wait to avoid doing double
work)


Ah, interesting.


- convert Visual Studio intrinsics, inline assembler and compiler
idiosyncrasies to GCC syntax


Ahh, I wonder how much I will encounter there.  That would be a 
prerequisite for a pure-mingw build on Windows as well, right?



- you'll probably also need to cross compile dependencies like
libfreetype with GCC/MinGW


Fedora already covers those, although the paths are somewhat unexpected.


- I'm actually not an expert here, but the OpenJDK is linked against some
native Windows libraries like DirectX and runtime libraries from the
Microsoft SDKs, and I don't know how that would
work for a cross-compile.


We supposedly have the headers and import libraries in Fedora.


I personally think we should rather focus on further improving the
current Windows build. It's already a huge improvement compared to the
old JDK7 Windows build. From what I see, the main remaining problems
are to somehow make it possible to get a stable, defined and free
version of the Microsoft development tools which are "known to work".


Yes, I tried to set up a Windows development environment, but quickly 
got confused.


My background here is that I want to contribute some new features and I 
expect that feature parity for Windows will increase the likelihood of 
acceptance.


I need to think about it, but for my purposes, a pure mingw environment 
running on Windows would work as well.  It would be less specialized, so 
perhaps there is more interest in that.


--
Florian Weimer / Red Hat Product Security Team


Re: Cross-building Windows binaries using the mingw toolchain

2014-04-30 Thread Volker Simonis
Hi Florian,

there are two different points to consider here.

The first one is to make the OpenJDK compile on Windows with the MinGW
toolchain (instead of Cygwin). This currently doesn't work out of the
box but is relatively easy to achieve (see for example "8022177:
Windows/MSYS builds broken"
https://bugs.openjdk.java.net/browse/JDK-8022177). Magnus and I are
working on this (and actually I have an internal build which works
with MinGW). Hopefully we can fix this in the OpenJDK soon.

The second one is to cross compile the whole OpenJDK on Linux using
gcc and MinGW. If I understood you right, that's what you actually
wanted. I personally think that would be nice to have, but at the same
time I also think it would be quite hard to get there and probably not
worth doing, because even if you succeeded, nobody would
maintain it and it would break quite soon (see for example the
GCC/Solaris build or the Clang/Linux build).

If you want to try it nevertheless, some of the problems you will face
will be at least the following ones:
- convert the HotSpot nmake makefiles to GNU syntax (notice that this
project is currently underway under the umbrella of the new build
system anyway, so you'd probably want to wait to avoid doing double
work)
- convert Visual Studio intrinsics, inline assembler and compiler
idiosyncrasies to GCC syntax (see the sketch after this list)
- you'll probably also need to cross compile dependencies like
libfreetype with GCC/MinGW
- I'm actually not an expert here, but the OpenJDK is linked against some
native Windows libraries like DirectX and runtime libraries from the
Microsoft SDKs, and I don't know how that would
work for a cross-compile.
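
To give a feel for the intrinsics point above, here is a hedged sketch
(illustrative only, not actual HotSpot code) of the kind of dual-toolchain
guard such a port tends to accumulate:

#ifdef _MSC_VER
  #define NOINLINE __declspec(noinline)
  /* MSVC compiler-barrier intrinsic */
  #define compiler_barrier() _ReadWriteBarrier()
#else
  #define NOINLINE __attribute__((noinline))
  /* GCC equivalent: empty asm with a memory clobber */
  #define compiler_barrier() __asm__ volatile("" ::: "memory")
#endif

Every such spot in the Windows-specific sources would need an equivalent
#else branch before a GCC/MinGW build could work.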

I personally think we should rather focus on further improving the
current Windows build. It's already a huge improvement compared to the
old JDK7 Windows build. From what I see, the main remaining problems
are to somehow make it possible to get a stable, defined and free
version of the Microsoft development tools which are "known to work".
But of course this is a problem which may need cooperation from
Microsoft. Another improvement would be to make the build more
agnostic to various Cygwin/MinGW versions and to improve the
build speed. Again, this issue partially depends on improving
Cygwin/MinGW themselves (which would be at least theoretically
possible).

Regards,
Volker


On Wed, Apr 30, 2014 at 4:36 PM, Florian Weimer  wrote:
> I noticed that cross-building Windows binaries is currently not supported.
> It seems that Hotspot in particular assumes that the host and target
> operating systems are the same (for example, Linux-to-Linux cross builds
> are supported).  Assuming I can get it to work within the current build
> system, would you be interested in integrating patches and carrying them
> forward?
>
> Cross-build support would make it easier for GNU/Linux developers to work on
> features that should have parity on Windows, but lack shared native sources
> (e.g. networking-related features).
>
> --
> Florian Weimer / Red Hat Product Security Team


Re: OS X configure ignores --with-tools-dir

2014-04-30 Thread Henry Jen

On 04/30/2014 12:42 AM, Erik Joelsson wrote:


On 2014-04-30 00:51, Dan Smith wrote:

Thanks Henry, that will force it to choose my referenced compiler.

Still not clear whether this is intended behavior or not: is the
default toolchain-type (clang, apparently) supposed to trump an
explicit tools-dir?  I.e., is this a bug, or just surprising but
intentional?

I think this is intentional, but it could certainly still be discussed.
I'm surprised clang is already picked as the default, however. Perhaps there
is something else that's not working as intended that causes this.


We use 'xcodebuild -version' to determine the Xcode version, and choose 
clang as the default for 5.0 and later.


http://hg.openjdk.java.net/jdk9/jdk9/rev/77c150b417d8

--with-tools-dir specifies where to find the toolchain; in this case, we 
would hope configure could correctly identify it as Xcode 4, but it does not.


Cheers,
Henry



/Erik

—Dan

On Apr 25, 2014, at 1:43 PM, Henry Jen  wrote:


For JDK9, try specifying the toolchain using --with-toolchain-type=gcc.
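
For instance, an untested combination of the flags discussed in this thread:

sh configure --with-boot-jdk=$JAVA8_HOME --with-toolchain-type=gcc \
    --with-tools-dir=/Applications/Xcode4.app/Contents/Developer/usr/bin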

Cheers,
Henry

On 04/25/2014 10:41 AM, Dan Smith wrote:

I'm using --with-tools-dir on OS X Mavericks to point to an old copy
of Xcode 4.  I configure jdk9 as follows:


make dist-clean
hg update -d "<2014-03-17"
sh configure --with-boot-jdk=$JAVA8_HOME \
    --with-tools-dir=/Applications/Xcode4.app/Contents/Developer/usr/bin

Running generated-configure.sh
...
Tools summary:
* Boot JDK:   java version "1.8.0" Java(TM) SE Runtime
Environment (build 1.8.0-b132) Java HotSpot(TM) 64-Bit Server VM
(build 25.0-b70, mixed mode)  (at
/Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home)
* Toolchain:  gcc (GNU Compiler Collection)
* C Compiler: Version 4.2.1 (at
/Applications/Xcode4.app/Contents/Developer/usr/bin/gcc)
* C++ Compiler:   Version 4.2.1 (at
/Applications/Xcode4.app/Contents/Developer/usr/bin/g++)
...

As of March 18, this no longer works.


make dist-clean
hg update -d "<2014-03-18"
sh configure --with-boot-jdk=$JAVA8_HOME \
    --with-tools-dir=/Applications/Xcode4.app/Contents/Developer/usr/bin

Running generated-configure.sh
...
Tools summary:
* Boot JDK:   java version "1.8.0" Java(TM) SE Runtime
Environment (build 1.8.0-b132) Java HotSpot(TM) 64-Bit Server VM
(build 25.0-b70, mixed mode)  (at
/Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home)
* Toolchain:  clang (clang/LLVM)
* C Compiler: Version Apple LLVM version 5.1 (clang-503.0.40)
(based on LLVM 3.4svn) Target: x86_64-apple-darwin13.1.0 Thread
model: posix (at /usr/bin/clang)
* C++ Compiler:   Version Apple LLVM version 5.1 (clang-503.0.40)
(based on LLVM 3.4svn) Target: x86_64-apple-darwin13.1.0 Thread
model: posix (at /usr/bin/clang++)
...

I appreciate the effort to get clang to work, but I should still be
able to pick my compiler using --with-tools-dir.

Should I report a bug?

(Note on my motivation: I'm getting build errors due to
-Wformat-nonliteral.  I've heard this is a known issue, but I'd like
to be able to work around it in the mean time.)

—Dan





Cross-building Windows binaries using the mingw toolchain

2014-04-30 Thread Florian Weimer
I noticed that cross-building Windows binaries is currently not 
supported.  It seems that Hotspot in particular assumes that the host 
and target operating systems are the same (for example, Linux-to-Linux 
cross builds are supported).  Assuming I can get it to work within the 
current build system, would you be interested in integrating patches and 
carrying them forward?


Cross-build support would make it easier for GNU/Linux developers to 
work on features that should have parity on Windows, but lack shared 
native sources (e.g. networking-related features).


--
Florian Weimer / Red Hat Product Security Team


Re: RFR: JDK-8042213: Freetype detection fails on Solaris sparcv9 when using devkit

2014-04-30 Thread Tim Bell

Hello Erik:

Please review this small fix to freetype detection in configure. On 
Solaris, libs are typically found in the "isadir" for 64-bit builds. 
This was only handled for amd64 for freetype. This patch makes it more 
universal.


Bug: https://bugs.openjdk.java.net/browse/JDK-8042213
Patch inline:


Looks good to me.

Tim


diff -r de3a6b2a6904 common/autoconf/libraries.m4
--- a/common/autoconf/libraries.m4
+++ b/common/autoconf/libraries.m4
@@ -286,9 +286,10 @@
          AC_MSG_NOTICE([Could not find $POTENTIAL_FREETYPE_LIB_PATH/freetype.lib. Ignoring location.])
          FOUND_FREETYPE=no
        fi
-      elif test "x$OPENJDK_TARGET_OS" = xsolaris && test "x$OPENJDK_TARGET_CPU" = xx86_64 && test -s "$POTENTIAL_FREETYPE_LIB_PATH/amd64/$FREETYPE_LIB_NAME"; then
-        # On solaris-x86_86, default is (normally) PATH/lib/amd64. Update our guess!
-        POTENTIAL_FREETYPE_LIB_PATH="$POTENTIAL_FREETYPE_LIB_PATH/amd64"
+      elif test "x$OPENJDK_TARGET_OS" = xsolaris \
+          && test -s "$POTENTIAL_FREETYPE_LIB_PATH$OPENJDK_TARGET_CPU_ISADIR/$FREETYPE_LIB_NAME"; then
+        # Found lib in isa dir, use that instead.
+        POTENTIAL_FREETYPE_LIB_PATH="$POTENTIAL_FREETYPE_LIB_PATH$OPENJDK_TARGET_CPU_ISADIR"
      fi
    fi
  fi

/Erik




Re: RFR: JDK-8042208: Build fails on Solaris using devkit when X isn't installed

2014-04-30 Thread Tim Bell

Hi Erik:

Please review this small fix to the build when linking 
libfontmanager.so on Solaris. Further explanation in the bug.


Bug:  https://bugs.openjdk.java.net/browse/JDK-8042208
Patch inline:


Looks good to me.

Tim


diff -r 830cc367f41b make/lib/Awt2dLibraries.gmk
--- a/make/lib/Awt2dLibraries.gmk
+++ b/make/lib/Awt2dLibraries.gmk
@@ -798,6 +798,10 @@
   BUILD_LIBFONTMANAGER_ExtensionSubtables.cpp_CXXFLAGS := -fno-strict-aliasing
 endif

+# Libfontmanager doesn't actually need X_LIBS to link, but if building
+# on a Solaris machine without X installed, using a devkit, linking
+# to libawt_xawt will fail without the -L parameters from X_LIBS. Filter
+# out the -R parameters since they aren't needed.
 $(eval $(call SetupNativeCompilation,BUILD_LIBFONTMANAGER, \
     LIBRARY := fontmanager, \
     OUTPUT_DIR := $(INSTALL_LIBRARIES_HERE), \
@@ -816,7 +820,8 @@
     $(call SET_SHARED_LIBRARY_ORIGIN), \
     LDFLAGS_SUFFIX := $(BUILD_LIBFONTMANAGER_FONTLIB), \
     LDFLAGS_SUFFIX_linux := -lawt $(LIBM) $(LIBCXX) -ljava -ljvm -lc, \
-    LDFLAGS_SUFFIX_solaris := -lawt -lawt_xawt -lc $(LIBM) $(LIBCXX) -ljava -ljvm, \
+    LDFLAGS_SUFFIX_solaris := $(filter-out -R%, $(X_LIBS)) \
+        -lawt -lawt_xawt -lc $(LIBM) $(LIBCXX) -ljava -ljvm, \
     LDFLAGS_SUFFIX_aix := -lawt -lawt_xawt $(LIBM) $(LIBCXX) -ljava -ljvm,\
     LDFLAGS_SUFFIX_macosx := -lawt $(LIBM) $(LIBCXX) -undefined dynamic_lookup \
         -ljava -ljvm, \

/Erik




RFR: JDK-8042213: Freetype detection fails on Solaris sparcv9 when using devkit

2014-04-30 Thread Erik Joelsson
Please review this small fix to freetype detection in configure. On 
Solaris, libs are typically found in the "isadir" for 64-bit builds. This 
was only handled for amd64 for freetype. This patch makes it more universal.


Bug: https://bugs.openjdk.java.net/browse/JDK-8042213
Patch inline:
diff -r de3a6b2a6904 common/autoconf/libraries.m4
--- a/common/autoconf/libraries.m4
+++ b/common/autoconf/libraries.m4
@@ -286,9 +286,10 @@
          AC_MSG_NOTICE([Could not find $POTENTIAL_FREETYPE_LIB_PATH/freetype.lib. Ignoring location.])
          FOUND_FREETYPE=no
        fi
-      elif test "x$OPENJDK_TARGET_OS" = xsolaris && test "x$OPENJDK_TARGET_CPU" = xx86_64 && test -s "$POTENTIAL_FREETYPE_LIB_PATH/amd64/$FREETYPE_LIB_NAME"; then
-        # On solaris-x86_86, default is (normally) PATH/lib/amd64. Update our guess!
-        POTENTIAL_FREETYPE_LIB_PATH="$POTENTIAL_FREETYPE_LIB_PATH/amd64"
+      elif test "x$OPENJDK_TARGET_OS" = xsolaris \
+          && test -s "$POTENTIAL_FREETYPE_LIB_PATH$OPENJDK_TARGET_CPU_ISADIR/$FREETYPE_LIB_NAME"; then
+        # Found lib in isa dir, use that instead.
+        POTENTIAL_FREETYPE_LIB_PATH="$POTENTIAL_FREETYPE_LIB_PATH$OPENJDK_TARGET_CPU_ISADIR"
      fi
    fi
  fi

/Erik


Re: Proposal: jtreg tests with native components

2014-04-30 Thread Staffan Larsen

On 30 apr 2014, at 11:39, David Holmes  wrote:

> Hi Staffan,
> 
> On 25/04/2014 10:02 PM, Staffan Larsen wrote:
>> There are a couple of jtreg tests today that depend on native components 
>> (either JNI libraries or executables). These are handled in one of two ways:
>> 
>> 1) The binaries are pre-compiled and checked into the repository (often 
>> inside jar files).
>> 2) The test will try to invoke a compiler (gcc, cl, …) when the test is 
>> being run.
>> 
>> Neither of these is a very good solution. #1 makes it hard to set up 
>> the test for all platforms and requires binaries in the source control 
>> system. #2 is hit-and-miss: the correct compiler may or may not be installed 
>> on the test machine, and the approach requires platform-specific logic to be 
>> maintained.
> 
> #2 is far from perfect but ...
> 
>> I would like to propose that these native components are instead compiled 
>> when the product is built, using the same makefile logic as the product. At 
>> product build time we know we have access to the (correct) compilers and we 
>> have excellent support in the makefiles for building on all platforms.
>> 
>> If we build the native test components together with the product, we also 
>> have to take care of distributing the result together with the product when 
>> we do testing across a larger number of machines. We will also need a way to 
>> tell the jtreg tests where these pre-built binaries are located.
> 
> don't underestimate the complexity involved in building and then "distributing" 
> the test binaries.

I don’t. It will be complicated, but I’m sure we can do it.

> 
> You will still need to maintain platform specific logic as you won't 
> necessarily be able to use the CFLAGS etc that the main build process uses.

Can you explain more? Why can’t I use CFLAGS as it is?

> 
> Also talk to SQE as I'm pretty sure there is an existing project to look at 
> how to better handle this, at least for the internal test suites.

I have talked to SQE. I don’t know of any other projects to handle this.

/Staffan


> 
> David
> -
> 
>> I suggest that at the end of a distributed build run, the pre-built test 
>> binaries are packaged in a zip or tar file (just like the product bits) and 
>> stored next to the product bundles. When we run distributed tests, we need 
>> to pick up the product bundle and the test bundle before the testing is 
>> started.
>> 
>> To tell the tests where the native code is, I would like to add a flag to 
>> jtreg to point out the path to the binaries. This should cause jtreg to set 
>> java.library.path before invoking a test and also set a test.* property 
>> which can be used by the test to find its native components.
>> 
>> This kind of setup would make it easier to add and maintain tests that have 
>> a native component. I think this will be especially important as more tests 
>> are written using jtreg in the hotspot repository.
>> 
>> Thoughts on this? Is the general approach ok? There are lots of details to 
>> be figured out, but at this stage I would like to hear feedback on the idea 
>> as such.
>> 
>> Thanks,
>> /Staffan
>> 



Re: RFR: 8034094: SA agent can't compile when jni_x86.h is used

2014-04-30 Thread Erik Joelsson

I think option 2 seems best.

/Erik

On 2014-04-30 13:26, Erik Helin wrote:

Anyone interested in this patch? I ran into this issue again yesterday...

Thanks,
Erik

On 2014-02-10 16:21, Erik Helin wrote:

Sigh, I forgot the subject...

"RFR: 8034094: SA agent can't compile when jni_x86.h is used"

Thanks,
Erik

On 2014-02-10 16:08, Erik Helin wrote:

Hi all,

this patch fixes an issue with HotSpot's makefiles, IMPORT_JDK and
jni_md.h.

The bug manifests itself when using an IMPORT_JDK whose
include/linux/jni_md.h has a timestamp that is older than
hotspot/src/cpu/x86/jni_x86.h. When this happens, the Makefiles will
copy hotspot/src/cpu/x86/jni_x86.h to
hotspot/build/jdk-linux-amd64/fastdebug/include/linux/jni_md.h.

The issue is that hotspot/src/cpu/x86/jni_x86.h differs slightly from
jdk/include/jni.h, since it is used for all operating systems:

#if defined(SOLARIS) || defined(LINUX) || defined(_ALLBSD_SOURCE)
... // common stuff
#else
... // windows stuff
#endif

We compile the SA agent, see make/linux/makefiles/saproc.make, without
defining LINUX (LINUX is hotspot's own define, gcc uses __linux__).

In my opinion, there are two ways to solve this:
1. Add -DLINUX to make/linux/makefiles/saproc.make (and corresponding
defines for Solaris and BSD).
2. Rewrite the #if check in jni_x86.h to use platform specific "native"
defines.

I've created a patch for each alternative:
1: http://cr.openjdk.java.net/~ehelin/8034094/webrev.1/
2: http://cr.openjdk.java.net/~ehelin/8034094/webrev.2/

For the second patch, note that I've inverted the #if check so that it
checks whether _WIN32 is defined and treats all other operating systems as
"#else".

Bug:
https://bugs.openjdk.java.net/browse/JDK-8034094

Testing:
- Compiled both versions locally and made sure they worked
- JPRT

Thanks,
Erik




RFR: JDK-8042208: Build fails on Solaris using devkit when X isn't installed

2014-04-30 Thread Erik Joelsson

Hello,

Please review this small fix to the build when linking libfontmanager.so 
on Solaris. Further explanation in the bug.


Bug:  https://bugs.openjdk.java.net/browse/JDK-8042208
Patch inline:

diff -r 830cc367f41b make/lib/Awt2dLibraries.gmk
--- a/make/lib/Awt2dLibraries.gmk
+++ b/make/lib/Awt2dLibraries.gmk
@@ -798,6 +798,10 @@
   BUILD_LIBFONTMANAGER_ExtensionSubtables.cpp_CXXFLAGS := -fno-strict-aliasing
 endif

+# Libfontmanager doesn't actually need X_LIBS to link, but if building
+# on a Solaris machine without X installed, using a devkit, linking
+# to libawt_xawt will fail without the -L parameters from X_LIBS. Filter
+# out the -R parameters since they aren't needed.
 $(eval $(call SetupNativeCompilation,BUILD_LIBFONTMANAGER, \
     LIBRARY := fontmanager, \
     OUTPUT_DIR := $(INSTALL_LIBRARIES_HERE), \
@@ -816,7 +820,8 @@
     $(call SET_SHARED_LIBRARY_ORIGIN), \
     LDFLAGS_SUFFIX := $(BUILD_LIBFONTMANAGER_FONTLIB), \
     LDFLAGS_SUFFIX_linux := -lawt $(LIBM) $(LIBCXX) -ljava -ljvm -lc, \
-    LDFLAGS_SUFFIX_solaris := -lawt -lawt_xawt -lc $(LIBM) $(LIBCXX) -ljava -ljvm, \
+    LDFLAGS_SUFFIX_solaris := $(filter-out -R%, $(X_LIBS)) \
+        -lawt -lawt_xawt -lc $(LIBM) $(LIBCXX) -ljava -ljvm, \
     LDFLAGS_SUFFIX_aix := -lawt -lawt_xawt $(LIBM) $(LIBCXX) -ljava -ljvm,\
     LDFLAGS_SUFFIX_macosx := -lawt $(LIBM) $(LIBCXX) -undefined dynamic_lookup \
         -ljava -ljvm, \
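
For reference, GNU make's $(filter-out pattern,text) drops the words matching
the pattern, so with an illustrative X_LIBS value:

$(filter-out -R%, -L/usr/openwin/lib -R/usr/openwin/lib)  =>  -L/usr/openwin/lib

keeping only the -L search paths that the link actually needs.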

/Erik


Re: RFR: 8034094: SA agent can't compile when jni_x86.h is used

2014-04-30 Thread Erik Helin

Anyone interested in this patch? I ran into this issue again yesterday...

Thanks,
Erik

On 2014-02-10 16:21, Erik Helin wrote:

Sigh, I forgot the subject...

"RFR: 8034094: SA agent can't compile when jni_x86.h is used"

Thanks,
Erik

On 2014-02-10 16:08, Erik Helin wrote:

Hi all,

this patch fixes an issue with HotSpot's makefiles, IMPORT_JDK and
jni_md.h.

The bug manifests itself when using an IMPORT_JDK whose
include/linux/jni_md.h has a timestamp that is older than
hotspot/src/cpu/x86/jni_x86.h. When this happens, the Makefiles will
copy hotspot/src/cpu/x86/jni_x86.h to
hotspot/build/jdk-linux-amd64/fastdebug/include/linux/jni_md.h.

The issue is that hotspot/src/cpu/x86/jni_x86.h differs slightly from
jdk/include/jni.h, since it is used for all operating systems:

#if defined(SOLARIS) || defined(LINUX) || defined(_ALLBSD_SOURCE)
... // common stuff
#else
... // windows stuff
#endif

We compile the SA agent, see make/linux/makefiles/saproc.make, without
defining LINUX (LINUX is hotspot's own define, gcc uses __linux__).

In my opinion, there are two ways to solve this:
1. Add -DLINUX to make/linux/makefiles/saproc.make (and corresponding
defines for Solaris and BSD).
2. Rewrite the #if check in jni_x86.h to use platform specific "native"
defines.

I've created a patch for each alternative:
1: http://cr.openjdk.java.net/~ehelin/8034094/webrev.1/
2: http://cr.openjdk.java.net/~ehelin/8034094/webrev.2/

For the second patch, note that I've inverted the #if check so that it
checks whether _WIN32 is defined and treats all other operating systems as
"#else".

Bug:
https://bugs.openjdk.java.net/browse/JDK-8034094

Testing:
- Compiled both versions locally and made sure they worked
- JPRT

Thanks,
Erik


Re: Proposal: jtreg tests with native components

2014-04-30 Thread David Holmes

Hi Staffan,

On 25/04/2014 10:02 PM, Staffan Larsen wrote:

There are a couple of jtreg tests today that depend on native components 
(either JNI libraries or executables). These are handled in one of two ways:

1) The binaries are pre-compiled and checked into the repository (often inside 
jar files).
2) The test will try to invoke a compiler (gcc, cl, …) when the test is being 
run.

Neither of these is a very good solution. #1 makes it hard to set up the 
test for all platforms and requires binaries in the source control system. #2 
is hit-and-miss: the correct compiler may or may not be installed on the test 
machine, and the approach requires platform-specific logic to be maintained.


#2 is far from perfect but ...


I would like to propose that these native components are instead compiled when 
the product is built, using the same makefile logic as the product. At product 
build time we know we have access to the (correct) compilers and we have 
excellent support in the makefiles for building on all platforms.

If we build the native test components together with the product, we also have 
to take care of distributing the result together with the product when we do 
testing across a larger number of machines. We will also need a way to tell the 
jtreg tests where these pre-built binaries are located.


don't underestimate the complexity involved in building and then 
"distributing" the test binaries.


You will still need to maintain platform specific logic as you won't 
necessarily be able to use the CFLAGS etc that the main build process uses.


Also talk to SQE as I'm pretty sure there is an existing project to look 
at how to better handle this, at least for the internal test suites.


David
-


I suggest that at the end of a distributed build run, the pre-built test 
binaries are packaged in a zip or tar file (just like the product bits) and 
stored next to the product bundles. When we run distributed tests, we need to 
pick up the product bundle and the test bundle before the testing is started.

To tell the tests where the native code is, I would like to add a flag to jtreg 
to point out the path to the binaries. This should cause jtreg to set 
java.library.path before invoking a test and also set a test.* property which 
can be used by the test to find its native components.

This kind of setup would make it easier to add and maintain tests that have a 
native component. I think this will be especially important as more tests are 
written using jtreg in the hotspot repository.

Thoughts on this? Is the general approach ok? There are lots of details to be 
figured out, but at this stage I would like to hear feedback on the idea as 
such.

Thanks,
/Staffan



Re: Build OpenJDK 8 for armhf (Raspberry Pi)?

2014-04-30 Thread David Holmes

Jim,

I don't think zero builds are generally addressed on this list but ...

On 30/04/2014 2:34 AM, Jim Connors wrote:

Hello,

Trying to build OpenJDK 8 for armhf, ultimately to be hosted on a
Raspberry Pi.  I'm cross compiling from a Ubuntu 12.04 x86 Virtualbox
image and am using gcc-4.7-linaro-rpi-gnueabihf for a toolchain.

Configuration invocation looks as follows:

$ bash configure
--with-sys-root=/home/jimc/gcc-4.7-linaro-rpi-gnueabihf/
--target=arm-unknown-linux-gnueabihf --with-jvm-variants=zero
--with-num-cores=2

The make fails like this:

Compiling /home/jimc/openjdk8/hotspot/src/os/posix/vm/os_posix.cpp
/home/jimc/openjdk8/hotspot/src/os/linux/vm/os_linux.cpp: In static
member function 'static jint os::init_2()':
/home/jimc/openjdk8/hotspot/src/os/linux/vm/os_linux.cpp:4853:42: error:
'workaround_expand_exec_shield_cs_limit' was not declared in this scope


You need to set up configure to do cross-compilation. I'm assuming you 
are building on an x86 box, because that function is only defined on x86:


#if defined(IA32)
  workaround_expand_exec_shield_cs_limit();
#endif

I think

--target=arm-unknown-linux-gnueabihf

should be

--openjdk-target=arm-unknown-linux-gnueabihf

David


Might there be a set of patches that are required to get this going
further?  Anything else I'm missing?

Any pointers greatly appreciated,
-- Jim C



Re: OS X configure ignores --with-tools-dir

2014-04-30 Thread Erik Joelsson


On 2014-04-30 00:51, Dan Smith wrote:

Thanks Henry, that will force it to choose my referenced compiler.

Still not clear whether this is intended behavior or not: is the default 
toolchain-type (clang, apparently) supposed to trump an explicit tools-dir?  
I.e., is this a bug, or just surprising but intentional?

I think this is intentional, but it could certainly still be discussed. 
I'm surprised clang is already picked as the default, however. Perhaps there 
is something else that's not working as intended that causes this.


/Erik

—Dan

On Apr 25, 2014, at 1:43 PM, Henry Jen  wrote:


For JDK9, try specifying the toolchain using --with-toolchain-type=gcc.

Cheers,
Henry

On 04/25/2014 10:41 AM, Dan Smith wrote:

I'm using --with-tools-dir on OS X Mavericks to point to an old copy of Xcode 
4.  I configure jdk9 as follows:


make dist-clean
hg update -d "<2014-03-17"
sh configure --with-boot-jdk=$JAVA8_HOME \
    --with-tools-dir=/Applications/Xcode4.app/Contents/Developer/usr/bin

Running generated-configure.sh
...
Tools summary:
* Boot JDK:   java version "1.8.0" Java(TM) SE Runtime Environment (build 
1.8.0-b132) Java HotSpot(TM) 64-Bit Server VM (build 25.0-b70, mixed mode)  (at 
/Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home)
* Toolchain:  gcc (GNU Compiler Collection)
* C Compiler: Version 4.2.1 (at 
/Applications/Xcode4.app/Contents/Developer/usr/bin/gcc)
* C++ Compiler:   Version 4.2.1 (at 
/Applications/Xcode4.app/Contents/Developer/usr/bin/g++)
...

As of March 18, this no longer works.


make dist-clean
hg update -d "<2014-03-18"
sh configure --with-boot-jdk=$JAVA8_HOME \
    --with-tools-dir=/Applications/Xcode4.app/Contents/Developer/usr/bin

Running generated-configure.sh
...
Tools summary:
* Boot JDK:   java version "1.8.0" Java(TM) SE Runtime Environment (build 
1.8.0-b132) Java HotSpot(TM) 64-Bit Server VM (build 25.0-b70, mixed mode)  (at 
/Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home)
* Toolchain:  clang (clang/LLVM)
* C Compiler: Version Apple LLVM version 5.1 (clang-503.0.40) (based on 
LLVM 3.4svn) Target: x86_64-apple-darwin13.1.0 Thread model: posix (at 
/usr/bin/clang)
* C++ Compiler:   Version Apple LLVM version 5.1 (clang-503.0.40) (based on 
LLVM 3.4svn) Target: x86_64-apple-darwin13.1.0 Thread model: posix (at 
/usr/bin/clang++)
...

I appreciate the effort to get clang to work, but I should still be able to 
pick my compiler using --with-tools-dir.

Should I report a bug?

(Note on my motivation: I'm getting build errors due to -Wformat-nonliteral.  
I've heard this is a known issue, but I'd like to be able to work around it in 
the mean time.)

—Dan