sounds good to me!
On 20/12/2007, at 9:48 AM, Shane Isbell wrote:
I've been crawling around the toolchain implementation and have a
rough idea
of how this is going to work.
First, we have a dead-simple rule that chooses Microsoft on Windows and
Mono on all other platforms. A developer can override this within the
pom.xml under the compiler plugin configuration, in case they need to
compile with Mono on Windows. This requires that the user add the
appropriate path to the classpath; if they don't, unexpected behavior can
occur. For example, on Windows, if a developer places Mono on the
classpath but specifies Microsoft as the vendor, it will compile under
Mono 1.1. Initially, we will only support .NET 2.0 (unless the user hits
the situation outlined above). We can release NMaven against the 2.0.x
branch.
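A sketch of how that per-project override could look in the pom.xml; the
plugin coordinates and parameter names below are assumptions for
illustration, not the actual NMaven configuration:

```xml
<!-- Hypothetical sketch: forcing the Mono vendor on Windows.
     Plugin coordinates and parameter names are illustrative only. -->
<plugin>
  <groupId>org.apache.maven.dotnet.plugins</groupId>
  <artifactId>maven-compile-plugin</artifactId>
  <configuration>
    <vendor>MONO</vendor>
    <frameworkVersion>2.0.50727</frameworkVersion>
  </configuration>
</plugin>
```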
Second, when we are ready, we use the toolchain under shared. This
requires implementing some classes for .NET support, which is a minor
amount of effort. It may, however, require some changes to the toolchain
API. When the toolchain releases under Maven 2.0.9+, we will require that
version for NMaven. This switch means that the developer no longer
specifies the vendor (and framework versions) under the compiler plugin
config, but rather adds the toolchains plugin to the pom.xml and uses the
toolchains plugin configuration to specify the vendor information. This
will allow us to configure the exact path of the executables. At that
point, we can expand out support for multiple .NET versions, vendor
versions, and so on.
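As a rough sketch of what that pom-side selection could look like: the
maven-toolchains-plugin coordinates and goal are real, but the
"netframework" toolchain type and its child elements are assumptions
invented for illustration:

```xml
<!-- Hypothetical sketch: selecting a .NET toolchain from the pom.
     The "netframework" type and its elements are invented; the plugin
     itself is the standard maven-toolchains-plugin. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-toolchains-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>toolchain</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <toolchains>
      <netframework>
        <vendor>MICROSOFT</vendor>
        <version>2.0</version>
      </netframework>
    </toolchains>
  </configuration>
</plugin>
```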
In many ways, the toolchain is very similar to the nmaven-settings in
0.14. The toolchains, however, will be standardized across Maven, so
there will not be a divergence; toolchains support the notion of version
ranges; they only need to be initialized once, resulting in a performance
improvement; and finally, we won't be specifying a state machine to infer
the best match based on the platform, as this will all be in the hands of
the developer. Since the toolchain can handle passing in framework
locations, developers won't be modifying their paths. For IT, we would
likely specify different toolchains.xml files to use for testing under
multiple platforms.
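A per-platform toolchains.xml for IT might then look something like the
following. The outer structure is the standard Maven toolchains format;
the "netframework" type, the "frameworkHome" element, and the Mono
install path are assumptions for illustration (the Microsoft path is the
stock .NET 2.0 framework directory):

```xml
<!-- Hypothetical sketch of a toolchains.xml used for IT on Windows. -->
<toolchains>
  <toolchain>
    <type>netframework</type>
    <provides>
      <vendor>MICROSOFT</vendor>
      <version>2.0</version>
    </provides>
    <configuration>
      <frameworkHome>C:\Windows\Microsoft.NET\Framework\v2.0.50727</frameworkHome>
    </configuration>
  </toolchain>
  <toolchain>
    <type>netframework</type>
    <provides>
      <vendor>MONO</vendor>
      <version>2.0</version>
    </provides>
    <configuration>
      <frameworkHome>C:\Program Files\Mono-1.2.6\bin</frameworkHome>
    </configuration>
  </toolchain>
</toolchains>
```

Swapping in a different toolchains.xml per test platform would then leave
the project poms untouched.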
One other feature that I am interested in is a .NET app that reads the
Windows registry and displays toolchain options to the user. In this
way, we can have a GUI-based application for discovering platform
capabilities and for configuring the toolchain.
Regards,
Shane
On Dec 17, 2007 1:19 PM, Shane Isbell <[EMAIL PROTECTED]> wrote:
Hi Brett,
I'm going to start looking into the current toolchain work and see what
we can leverage. If the CompilerContext implementation can access the
toolchain, the context will be able to pass the path locations of the
various compilers off to the ClassCompiler implementation instances
during construction.
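A minimal Java sketch of that hand-off. Only the CompilerContext and
ClassCompiler names come from this thread; the constructor signatures and
everything else are invented for illustration:

```java
import java.io.File;

// Hypothetical sketch: a CompilerContext that receives the
// toolchain-resolved compiler home and hands it to a ClassCompiler
// instance at construction time. Signatures are invented.
interface ClassCompiler {
    String getExecutablePath();
}

class CscCompiler implements ClassCompiler {
    private final File compilerHome;

    CscCompiler(File compilerHome) {
        this.compilerHome = compilerHome;
    }

    public String getExecutablePath() {
        // The compiler only joins paths; it never guesses install locations.
        return new File(compilerHome, "csc.exe").getPath();
    }
}

class CompilerContext {
    private final File toolchainHome; // supplied by the Maven toolchain

    CompilerContext(File toolchainHome) {
        this.toolchainHome = toolchainHome;
    }

    ClassCompiler createClassCompiler() {
        // The context, not the compiler, knows where the vendor lives.
        return new CscCompiler(toolchainHome);
    }
}
```

The point of the sketch is the direction of dependency: path discovery
stays in the context, so compiler implementations need no platform logic.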
Regards,
Shane
On Dec 16, 2007 10:59 PM, Shane Isbell <[EMAIL PROTECTED]>
wrote:
On Dec 16, 2007 9:38 PM, Brett Porter <[EMAIL PROTECTED]> wrote:
Thanks for the thorough explanation.
Pardon my density, but I'm missing something fundamental - how will
NMaven work in any of the cases below at runtime, without a
settings
file and capability matching, given that you are saying the IT test
might need some special handling to specify the implementation?
I understand where your confusion comes from. If it works under an x, y,
z configuration for builds, it should just require configuring x, y, z
for testing. I'm not saying that this can't be done by creating scripts
that modify system paths and environment variables; what I am saying is
that there may be an easier way that requires configuring components that
we can plug in to the framework. Both of these are just concepts with no
grounding, because right now it doesn't work for the runtime except in
the simplest of cases: Microsoft/Mono .NET 2.0. It's just tough to say at
this point how we want to handle IT tests under multiple platforms until
we get the first cut at the implementation.
Shane
- Brett
On 17/12/2007, at 9:58 AM, Shane Isbell wrote:
Hi Brett,
The trunk integration tests are set up the same way as Maven's, and my
intention for the first release was just to test out the latest version
of Mono using .NET 2.0, as well as Microsoft 2.0. This would involve just
changing the environment on a limited number of configurations. Just to
note, simply changing out the path will not completely work on all
configurations, as Mono contains csc.exe (for .NET 1.1) and gmcs.exe
(for .NET 2.0). Microsoft, on the other hand, has the versions in
different directories, which makes swapping out the paths easier. But you
have to keep in mind that for Microsoft .NET 3.0, the framework uses the
same compiler as .NET 2.0, but a different one for 1.1 and 3.5.
Configurations can get a little funky.
If we were only dealing with Mono or only Microsoft, I would be much more
confident that we could pull off doing the IT setup exactly the same way
as Maven for all the needed configurations. I'm hoping that this is the
case.
However, it's a little premature for me to have a good idea of how the
integration tests will work under multiple platforms now that we don't
have capability matching. Right now we have the option of creating
scripts that set up the paths, with modifications for each permutation of
installations. Another approach is something like the nmaven-settings
file, which contains the parameters and lets some component handle the
paths. One thing that I do like about a settings approach is that, on
Windows, we can autogenerate the settings file by inspecting the
registry, meaning we only test what the platform is capable of testing.
We'll need to open that up for design and comments when we get to that
point.
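For concreteness, an auto-generated settings file of this kind might
carry entries like the following. This is purely an invented sketch; the
real nmaven-settings schema may differ:

```xml
<!-- Purely illustrative sketch of an auto-generated settings file;
     element names do not come from the actual nmaven-settings schema.
     On Windows the vendor entries would be derived from the registry. -->
<nmavenSettings>
  <vendors>
    <vendor>
      <vendorName>MICROSOFT</vendorName>
      <vendorVersion>2.0.50727</vendorVersion>
      <installRoot>C:\Windows\Microsoft.NET\Framework\v2.0.50727</installRoot>
    </vendor>
    <vendor>
      <vendorName>MONO</vendorName>
      <vendorVersion>1.2.6</vendorVersion>
      <installRoot>C:\Program Files\Mono-1.2.6\bin</installRoot>
    </vendor>
  </vendors>
</nmavenSettings>
```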
If anyone on the list wants to take up on how IT testing should be
done, go
for it. The issue: http://jira.codehaus.org/browse/NMAVEN-14 has
been out
there a long time.
As part of the IT testing, we will need to create a .NET assembly (or
find a way to do it through Java) that handles the inspection of metadata
within the project assembly. This is the only way to verify things like
resource generation, signing of assemblies, proper
dependencies/references, and so on. We may even be able to write this
under the verifier.
We haven't yet addressed, within the compiler interface implementation,
how to specify the framework version for Mono, so I have even less of an
idea of exactly how it will work with the IT tests.
Shane
On Dec 16, 2007 1:32 PM, Brett Porter <[EMAIL PROTECTED]> wrote:
On 17/12/2007, at 8:19 AM, Shane Isbell wrote:
I agree on all the points. Can you post a bug about what is
breaking?
Will do.
This was the original motivation for the nmaven-settings file. It allowed
changing the platform configuration to replace vendors, vendor versions,
and framework versions. I think that the general nmaven-settings file
concept is the right approach for integration testing; it should just be
used for integration tests and should be non-obtrusive. This will likely
require adding some component extensions that will allow modifying the
working directory of executables. This approach would avoid having to
bring in all the capability-matching components to support it.
At this stage I would be happy with the integration tests just running
under whatever the current execution environment is, like a normal
NMaven execution would do. This is basically what the Maven ones do for
now, based on the version of Maven in the path. You then switch your
execution environment and re-run the test suite. Some tests are excluded
on environments they are not suitable for. The integration test tools
that are used for Maven should be able to be re-used in NMaven.
Beyond that, I would just use whatever the toolchain capabilities
are
in Maven at the time to go towards the next step rather than
adding
anything specific for it in either NMaven or the integration
tests.
Is that in line with what you were thinking?
- Brett