Re: Changing the Proton build system to accommodate jni bindings
What are people's views on the relative priority of these requirements? Are there any I've missed? I think answering these questions is a prerequisite for agreeing the technical solution.

With the aim of stimulating discussion about our requirements and reaching a consensus, I've classified each of the proposed requirements according to whether I believe it is essential, neutral or detrimental to the smooth development of Proton. (Proposed requirement numbers are from https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements )

Essential
3. To change proton-api, all that is required is to edit a Java file. - Developer productivity
4. To switch to a particular SVN revision, simple SVN commands are run (e.g. svn switch or svn update) - Developer productivity
5. proton-c can be built, excluding its JNI binding, without requiring non-standard tools*
6. proton-c can be built, excluding its JNI binding, from a standalone checkout of the proton-c directory - Developer productivity / tool familiarity

Neutral
1. A tarball source release of proton-c can be built by a user without an external dependency on any other part of proton, e.g. proton-api.
2. The aforementioned proton-c tarball release can be produced by performing a simple svn export of proton-c. - If I were building proton-c for my platform from a tarball, I would also want to run the tests to be sure proton-c functions correctly. For this reason I question the usefulness of a proton-c-only tarball; I would want a tarball that included the whole tree, including the tests.
7. Proton-c can be built without requiring non-standard tools*
9. Proton-c can be tested without requiring non-standard tools* - If we can achieve this without introducing too much complexity or reinventing too many wheels, and the result is portable across all target platforms.

Detrimental
8. proton-c can be built from a standalone checkout of the proton-c directory - I think that all proton developers who are changing either the C or Java implementations should be running the system tests before each commit. If they are changing system tests then they need to run against both implementations before each commit.

On 22 January 2013 17:09, Rafael Schloming r...@alum.mit.edu wrote:
Thanks for posting this, I think it's a very useful step. I'd suggest adding another stakeholder: someone testing a release artifact. Rob makes a good point that the release manager is a distinct view, but I think the desire to minimize deltas between the svn tree and the release artifacts is most directly motivated by my experience *testing* release artifacts. I remember going through qpid releases in the old days and having the very unpleasant experience of trying to remember, from 8 or 10 months ago, how exactly stuff worked in the release artifact as compared to the build tree. I very much like the fact that with a simple export I can be highly confident that my experience of stuff working in my checkout translates well to the release artifacts, and that testing them is a very familiar, quick and easy process. Strictly speaking, I think the requirement from a release-management perspective is purely that we can produce releases at the rate we need, so it has to be quick, easy and robust to different environments, but I wouldn't say the export thing is a requirement of the release manager per se. As many have pointed out, we already use a script for this and it can remap things quite easily. I have more thoughts on the release process, especially as it is somewhat expanded now to produce java binaries and will need to expand more to include windows stuff, but I need to run an errand at the moment. I'll post and/or comment on the page later, though.
--Rafael

On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey p...@philharveyonline.com wrote:
It sounds like we're still a little way away from reaching a consensus. As a step towards this, I would like to clarify the relative priority of the various requirements that have come up. I've therefore created a page on the wiki that lists them, with a child page briefly describing the various proposals. https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements What are people's views on the relative priority of these requirements? Are there any I've missed? I think answering these questions is a prerequisite for agreeing the technical solution. Phil
Re: Is Proton a lightweight alternative to Qpid?
A couple of things... I believe cmake places an 'rpath' on the libraries when they are built. This is removed when you run 'make install' and the libraries are copied to their installed location. The .so and .so.1 files are soft links to the .so.1.0.0 file; depending on how you copy them, they may act in unexpected ways. You may need to set the environment variable LD_LIBRARY_PATH to your local directory so the loader can find the library file if it's not in a standard install location. I hope this helps, -Ted

On 01/22/2013 06:02 PM, Eagy, Taylor wrote:
Ted, This is a follow-up to our earlier discussion. After building Proton using cmake as described, I copied those files into another directory:

~/myproject/proton/proton.py
~/myproject/proton/cproton.py
~/myproject/proton/_cproton.so
~/myproject/proton/libqpid-proton.so.1.0.0
~/myproject/proton/libqpid-proton.so.1

I found that I also needed a link libqpid-proton.so.1 pointing to libqpid-proton.so.1.0.0. However, after I removed the qpid-proton-c-0.2/build directory, when I try to import proton it spits out:

>>> import proton
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "proton.py", line 33, in <module>
    from cproton import *
  File "cproton.py", line 26, in <module>
    _cproton = swig_import_helper()
  File "cproton.py", line 22, in swig_import_helper
    _mod = imp.load_module('_cproton', fp, pathname, description)
ImportError: libqpid-proton.so.1: cannot open shared object file: No such file or directory

Is there something in cmake that is adding absolute paths to the library files? When I built again, importing proton worked fine. Is there a cmake variable that I need to turn off to remove any references to the build directory? The reason I need to do this is because when I copy it over to a clean install of the same Linux version, it spits out that import error.
Thanks, Taylor

From: Eagy, Taylor [te...@blackbirdtech.com] Sent: Wednesday, January 16, 2013 12:13 PM To: proton@qpid.apache.org Subject: RE: Is Proton a lightweight alternative to Qpid?

Rafael and Ted, Thanks for your help on this. I'm excited to see that proton is getting a Windows port, since I wasn't able to build it in VS2012 successfully. Thanks, Taylor

From: Rafael Schloming [r...@alum.mit.edu] Sent: Tuesday, January 15, 2013 8:13 PM To: proton@qpid.apache.org Subject: Re: Is Proton a lightweight alternative to Qpid?

If you run cmake this way you can build the minimal code needed for just the proton library and its python bindings:

cmake -DCMAKE_INSTALL_PREFIX=/usr -DBUILD_PYTHON=ON -DBUILD_PHP=OFF -DBUILD_PERL=OFF -DBUILD_RUBY=OFF path_to_src_tree

A quick test on my system shows that a make install based on the above build works out to about 1.4MB. Stripping out header files and some package config stuff would get you down to about 1.2MB if you want to go super barebones. --Rafael

On Tue, Jan 15, 2013 at 5:38 PM, Ted Ross tr...@redhat.com wrote:
Taylor, You need the following files:

proton.py (from proton-c/bindings/python)
cproton.py (from $BUILD/bindings/python)
_cproton.so (from $BUILD/bindings/python)
libqpid-proton.so (from $BUILD)

-Ted

On 01/15/2013 03:35 PM, Eagy, Taylor wrote:
Ted, Proton is more lightweight and the systems that it runs on won't have Java installed. While I would prefer a more Pythonic portable solution, as long as Proton-c builds within 5MB, it should work. However, I'm getting a bunch of undefined reference messages from pythonPYTHON_wrap.c when trying to make install it. So if I just want to use the p2p messaging between Python processes, what is the minimum set of files that I need to create a Python queue server to handle the queues between processes? (i.e. proton.py, cproton.py, etc) Thanks, Taylor
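Ted's advice above, that the .so and .so.1 names should be soft links to the real .so.1.0.0 file and that LD_LIBRARY_PATH must point at a non-standard install location, can be sketched as the following shell fragment. This is a hedged sketch: the directory layout comes from the thread (~/myproject/proton), but here LIBDIR defaults to a scratch directory and a `touch` stands in for copying the real library file.

```shell
# Hedged sketch of the post-copy fixup. LIBDIR is a stand-in for wherever
# you copied the files (~/myproject/proton in the thread); the touch below
# stands in for the real copied libqpid-proton.so.1.0.0.
LIBDIR="${LIBDIR:-$(mktemp -d)}"
cd "$LIBDIR"
touch libqpid-proton.so.1.0.0            # stand-in for the real library file
# .so.1 and .so are normally just soft links to the real .so.1.0.0 file,
# so recreate them as links rather than copying them as regular files:
ln -sfn libqpid-proton.so.1.0.0 libqpid-proton.so.1
ln -sfn libqpid-proton.so.1 libqpid-proton.so
# Point the dynamic loader at this non-standard location:
export LD_LIBRARY_PATH="$LIBDIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```

With the real library in place, `ldd _cproton.so` should then resolve libqpid-proton.so.1 instead of reporting it as not found.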
RE: Is Proton a lightweight alternative to Qpid?
Thanks a lot Ted. It's working now. I appreciate all your help. Thanks, Taylor

From: Ted Ross [tr...@redhat.com] Sent: Wednesday, January 23, 2013 8:26 AM To: proton@qpid.apache.org Subject: Re: Is Proton a lightweight alternative to Qpid?

A couple of things... I believe cmake places an 'rpath' on the libraries when they are built. This is removed when you run 'make install' and the libraries are copied to their installed location. The .so and .so.1 files are soft links to the .so.1.0.0 file; depending on how you copy them, they may act in unexpected ways. You may need to set the environment variable LD_LIBRARY_PATH to your local directory so the loader can find the library file if it's not in a standard install location. I hope this helps, -Ted
Re: Question about proton-c message id API
Yeah, it appears to be a bug. I checked in a potential fix on trunk. Give it a shot and see if it's still an issue. --Rafael

On Wed, Jan 23, 2013 at 9:19 AM, Phil Harvey p...@philharveyonline.com wrote:
I'm working on the proton-c JNI binding and am having trouble with the Message.getMessageId() JNI implementation. I am currently calling pn_message_get_id, but this is causing a segfault (details below). I suspect that I might be using the proton-c message API incorrectly and would appreciate advice from someone more familiar with the proton-c implementation. Our call stack is:

AccessorsTest.testId
  ...
  Message._get_id
  JNIMessage.getMessageId()
  ...
  pn_message_get_id
  pn_data_get_atom   <- segfaults because node is null

This suggests that our problem is that we haven't previously called pn_message_set_id. message.h contains the following:

pn_data_t *pn_message_id(pn_message_t *msg);
pn_atom_t pn_message_get_id(pn_message_t *msg);
int pn_message_set_id(pn_message_t *msg, pn_atom_t id);

The JNI implementation of Message.getMessageId() calls pn_message_get_id. However, in revision 1400309 (committed by Rafi) the proton-c Python binding was modified to implement this using the pn_data returned from pn_message_id instead. Are the pn_message_get_id and pn_message_set_id functions deprecated? If so, the Java Message interface should possibly have its getMessageId and setMessageId methods removed, and replaced with calls to a new interface (called MessageId?). If not, can anyone explain the intended interplay between the three message id functions? Finally, it seems wrong to me that pn_message_get_id should ever cause a segfault. Is this a bug? Phil
Reducing the visibility of proton-j constructors
As part of the Proton JNI work, I would like to remove all calls to proton-j implementation constructors from client code. I intend that factories will be used instead [1], thereby abstracting away whether the implementation is pure Java or proton-c-via-JNI. I'd like to check that folks are happy with me making this change, and to mention a couple of problems I've had.

In this context, client code is anything outside the current sub-component, where our sub-components are Engine, Codec, Driver, Message and Messenger, plus each of the contrib modules, and of course third party code. To enforce this abstraction, I am planning to make the constructors of the affected classes package-private where possible. I believe that, although third party projects might already be calling these constructors, it is acceptable for us to change Proton's public API in this manner while it is such a young project. Please shout if you disagree with any of the above.

Now, onto my problem. I started off with the org.apache.qpid.proton.engine.impl package, and found that o.a.q.p.hawtdispatch.impl.AmqpTransport calls various methods on ConnectionImpl and TransportImpl, so simply using a Connection and Transport will not work. I don't know what to do about this, and would welcome people's opinions. Thanks, Phil

[1] for example, these work-in-progress classes:
https://svn.apache.org/repos/asf/qpid/proton/branches/jni-binding/proton-j/proton-api/src/main/java/org/apache/qpid/proton/ProtonFactoryLoader.java
https://svn.apache.org/repos/asf/qpid/proton/branches/jni-binding/proton-j/proton-api/src/main/java/org/apache/qpid/proton/engine/EngineFactory.java
https://svn.apache.org/repos/asf/qpid/proton/branches/jni-binding/proton-j/proton/src/main/java/org/apache/qpid/proton/engine/impl/EngineFactoryImpl.java
https://svn.apache.org/repos/asf/qpid/proton/branches/jni-binding/proton-c/bindings/java/jni/src/main/java/org/apache/qpid/proton/engine/jni/JNIEngineFactory.java
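The factory indirection described above can be sketched as follows. These class and method names are illustrative stand-ins, not the actual ProtonFactoryLoader / EngineFactory API in the linked branch; the point is that clients depend only on interfaces and a factory, so the implementation constructor can be package-private.

```java
// Hedged sketch of the proposed factory abstraction (stand-in names).
interface Connection {
    String implementationName();
}

interface EngineFactory {
    Connection createConnection();
}

// Pure-Java implementation: its constructor is package-private, so client
// code outside this package cannot instantiate it directly.
class JavaConnection implements Connection {
    JavaConnection() {}  // package-private on purpose
    public String implementationName() { return "proton-j"; }
}

class JavaEngineFactory implements EngineFactory {
    public Connection createConnection() { return new JavaConnection(); }
}

public class FactorySketch {
    public static void main(String[] args) {
        // Client code only ever touches the interfaces and the factory;
        // swapping in a JNI-backed factory would not change this code.
        EngineFactory factory = new JavaEngineFactory();
        Connection c = factory.createConnection();
        if (!"proton-j".equals(c.implementationName()))
            throw new AssertionError("unexpected implementation");
        System.out.println(c.implementationName());
    }
}
```

Swapping the pure-Java implementation for the JNI one then becomes a matter of selecting a different EngineFactory (e.g. via a system property or ServiceLoader) without client code ever naming an Impl class.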
Re: Changing the Proton build system to accommodate jni bindings
I've added another wiki page that documents the proton release steps as best I can remember. I'll update it more during the 0.4 release: https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps

I think it's important to understand the overall release and testing process, as it is a significant and perhaps underrepresented factor against which to measure any proposals. I believe the build system requirements documented below are inherently incomplete, as they don't recognize the fact that the C build system is not just a developer productivity tool; it is also the installer for our end users. And before anyone says our end users will just use yum or equivalents: all those packaging tools *also* depend on our build system, both directly and because we can't even supply a release for packagers to consume without a reasonable amount of direct install testing. To a good extent, a standard-looking C source tarball is pretty much the equivalent of a jar or jar + pom file in the Java world; it's really the only platform-independent means of distribution we have.

It's also probably worth noting that perhaps the biggest issue with system tests in Java is not so much imposing maven on proton-c developers, but the fact that Java may not be available on all the platforms that proton-c needs to be tested on. My primary concern here would be iOS. I'm not an expert, but my brief googling seems to suggest there would be significant issues. --Rafael

On Wed, Jan 23, 2013 at 12:45 PM, Phil Harvey p...@philharveyonline.com wrote:
In case anyone has missed it, note that Gordon has added some relevant comments directly on the wiki pages: https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+proposals Phil

On 23 January 2013 13:01, Keith W keith.w...@gmail.com wrote:
What are people's views on the relative priority of these requirements? Are there any I've missed?
Re: Changing the Proton build system to accommodate jni bindings
On Wed, Jan 23, 2013 at 8:01 AM, Keith W keith.w...@gmail.com wrote:
Essential
3. To change proton-api, all that is required is to edit a Java file. - Developer productivity

This seems to be kind of a leading requirement, so to speak, or at least it's phrased a little bit oddly. That said, I would never argue with it for most of the Java files; however, in the case of the API files I don't see how you're ever going to be able to stop after just editing the API. Because we have two implementations, we're fundamentally stuck with manually syncing the implementations themselves whenever a change to the interface occurs. By comparison, the highly automatable task of syncing the API files themselves seems quite small. I'm imagining most changes would go something like this: say we want to add a getter to the Message interface, we would need to:

1. edit the Message interface
2. write and/or possibly modify a test
3. edit the java Message implementation
4. run the tests against java; if they don't pass, go to step 2
5. now that the java impl passes the tests, run the tests against the C impl
6. if the sync check fails on the C build, run the sync script
7. edit the message.h file
8. edit the message.c implementation
9. edit the adapter layer between the C API and the Java interfaces
10. run the tests against the C; if they don't pass, go to step 8
11. run the tests against both, just to be sure
12. check in

Given the above workflow, it seems like even with a relatively small change like adding a getter, the scripted portion of the syncing effort is going to be vanishingly small compared to the manual process of syncing the implementations. Perhaps I'm just envisioning a different workflow than you, or maybe I'm missing some important scenarios. Could you describe what workflow(s) you envision and how the sync process would impact your productivity?

4. To switch to a particular SVN revision, simple SVN commands are run (e.g.
svn switch or svn update) - Developer productivity
5. proton-c can be built, excluding its JNI binding, without requiring non-standard tools*
6. proton-c can be built, excluding its JNI binding, from a standalone checkout of the proton-c directory - Developer productivity / tool familiarity

Neutral
1. A tarball source release of proton-c can be built by a user without an external dependency on any other part of proton, e.g. proton-api.
2. The aforementioned proton-c tarball release can be produced by performing a simple svn export of proton-c. - If I were building proton-c for my platform from a tarball, I would also want to run the tests to be sure proton-c functions correctly. For this reason I question the usefulness of a proton-c tarball. I would want a tarball that included the whole tree including the tests.

The proton-c tarball does include the tests directory. The tests directory is just pure python code, so once you've installed proton-c onto your system, you can run any of the proton tests just like you would run any normal python script. As I mentioned in another post, the inclusion of tests under both proton-c and proton-j is the one deviation in directory structure from a pure svn export, and even this much is kind of a pain, as there is no way for the README to actually describe things properly without being broken in either the svn tree or the release artifact.

7. Proton-c can be built without requiring non-standard tools*
9. Proton-c can be tested without requiring non-standard tools* - If we can achieve this without introducing too much complexity or reinventing too many wheels, and the result is portable across all target platforms.

Detrimental
8. proton-c can be built from a standalone checkout of the proton-c directory - I think that all proton developers who are changing either the C or Java implementations should be running the system tests before each commit.
If they are changing system tests then they need to run against both implementations before each commit.

Doesn't this conflict pretty directly with 6?
Re: Changing the Proton build system to accommodate jni bindings
On 23 January 2013 19:09, Rafael Schloming r...@alum.mit.edu wrote:
I've added another wiki page that documents the proton release steps as best I can remember. I'll update it more during the 0.4 release: https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps I think it's important to understand the overall release and testing process, as it is a significant and perhaps underrepresented factor against which to measure any proposals. I believe the build system requirements documented below are inherently incomplete, as they don't recognize the fact that the C build system is not just a developer productivity tool; it is also the installer for our end users. And before anyone says our end users will just use yum or equivalents: all those packaging tools *also* depend on our build system, both directly and because we can't even supply a release for packagers to consume without a reasonable amount of direct install testing. To a good extent, a standard-looking C source tarball is pretty much the equivalent of a jar or jar + pom file in the Java world; it's really the only platform-independent means of distribution we have.

It would be helpful if you could enumerate the requirements which you believe to be missing and add them to the existing wiki page. I don't think anyone is suggesting that the make install step should be broken in the source tarball, so it's a little unclear to me what problem you are trying to highlight above.

It's also probably worth noting that perhaps the biggest issue with system tests in Java is not so much imposing maven on proton-c developers, but the fact that Java may not be available on all the platforms that proton-c needs to be tested on. My primary concern here would be iOS. I'm not an expert, but my brief googling seems to suggest there would be significant issues.

So, I think we probably need to consider what sort of tests are required, and which languages it is appropriate to write any particular type of test in.
For me, tests in Java have some advantages over Python tests. Firstly, they allow interop tests between the two implementations within the same process; secondly, they will also be able to be used against any future pure JavaScript Proton implementation (something we have planned but not yet embarked upon). A third issue for me is that when we start to attempt more granular testing of things such as error handling, I will want to ensure that the user experience is identical between the pure Java and JNI binding implementations of the Java Proton API... if the tests are being run through a second translation into the Python API then this is not easily verifiable. As a final aside, on the standard development environment many of us have to work with, the installed version of Python is too old to support the current Python client (lack of UUID, etc). Personally I think the more tests we have the better, and it's more important to encourage people to write tests than to force the use of a particular language to write them in. I'd also suggest that we should be writing at least some tests for each of the idiomatic bindings. -- Rob

--Rafael On Wed, Jan 23, 2013 at 12:45 PM, Phil Harvey p...@philharveyonline.com wrote: In case anyone has missed it, note that Gordon has added some relevant comments directly on the wiki pages: https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+proposals Phil On 23 January 2013 13:01, Keith W keith.w...@gmail.com wrote: What are people's views on the relative priority of these requirements? Are there any I've missed? I think answering these questions is a prerequisite for agreeing the technical solution.
Re: Changing the Proton build system to accommodate jni bindings
Firstly, I think it would be helpful if you made clear the requirements you consider to be essential, nice to have, unimportant and/or detrimental.

On 23 January 2013 20:17, Rafael Schloming r...@alum.mit.edu wrote:
On Wed, Jan 23, 2013 at 8:01 AM, Keith W keith.w...@gmail.com wrote:
Essential
3. To change proton-api, all that is required is to edit a Java file. - Developer productivity

This seems to be kind of a leading requirement, so to speak, or at least it's phrased a little bit oddly. That said, I would never argue with it for most of the Java files; however, in the case of the API files I don't see how you're ever going to be able to stop after just editing the API. Because we have two implementations, we're fundamentally stuck with manually syncing the implementations themselves whenever a change to the interface occurs. By comparison, the highly automatable task of syncing the API files themselves seems quite small. I'm imagining most changes would go something like this: say we want to add a getter to the Message interface, we would need to:

I think it's worth considering two different cases:

1) The API change is purely on the Java side; there is no corresponding change to the C API. This may be to add some sort of convenience method, or simply a refactoring. In this case the developer making the change needs only to work in Java; there will be two implementations of the interface to change (in two different source locations), but it is all rather trivial.

2) The API change affects both C and Java. In this case either a single developer has to commit to making the change in both the C and the Java, or the API change has to have been discussed before work commences and Java and C developers will need to work together. If there is a single developer, or developers working very closely together, then I would suggest that the steps would in fact be:

1. edit the Message interface / edit the message.h file
2. write and/or modify a test (and Python binding if necessary)
3. edit the JNI binding to use the SWIG-generated API
4. edit the C / pure Java
5. run the tests against the C / Java
(6. modify other bindings if necessary)
repeat steps 4 and 5 until they pass.

In the case where the C and Java developers are separated by time/distance, the build/tests on one side will be broken until the implementation catches up. For the sake of politeness it is probably better to ensure that at all points the checked-in code compiles, even if the tests do not pass. For cases where the changes to the API are additions, it should be relatively easy to make the changes in such a way as to simply have any tests relating to the new API be skipped. For cases where the C leads the Java, the Java implementation can simply throw UnsupportedOperationException or some such. Where the Java leads the C, we can throw said exception from the JNI binding code and leave the .h file unchanged until the C developer is ready to do the work. Only for cases where there is modification to existing APIs does it seem that there may be occasions where we could not have a consistent build across components, and I would strongly recommend that any change where the Java and C are being worked on in such a fashion should take place on a branch, with a merge to trunk only occurring when all tests are passing against all implementations.

1. edit the Message interface
2. write and/or possibly modify a test
3. edit the java Message implementation
4. run the tests against java; if they don't pass, go to step 2
5. now that the java impl passes the tests, run the tests against the C impl
6. if the sync check fails on the C build, run the sync script
7. edit the message.h file
8. edit the message.c implementation
9. edit the adapter layer between the C API and the Java interfaces
10. run the tests against the C; if they don't pass, go to step 8
11. run the tests against both, just to be sure
12.
check in

Given the above workflow, it seems like even with a relatively small change like adding a getter, the scripted portion of the syncing effort is going to be vanishingly small compared to the manual process of syncing the implementations. Perhaps I'm just envisioning a different workflow than you, or maybe I'm missing some important scenarios. Could you describe what workflow(s) you envision and how the sync process would impact your productivity?

I differ strongly in my opinion here. Every time I need to drop out of my development environment to run some ad-hoc script there is overhead... Moreover, if we are using svn to do this, I presume we would have to check in any change before the sync could be made. This means that every edit to a file now has to be followed by a commit and sync (which would obviously be an insane process). Those of us behind corporate firewalls and proxies experience very degraded response times when updating from the