Christian,

Thanks for the informative writeup, and sorry to get back to you so 
late. As Paul mentioned, I am the project lead for Rifidi, an open 
source RFID IDE (http://www.rifidi.org). We have been working on the 
Java LLRP bindings and are also using the library to develop a Virtual 
LLRP Reader. We just released a beta version, which we will be expanding 
in the next couple of weeks.

We are currently using the Java code donated by Joe Hoag from the 
University of Arkansas. After following the discussions here and talking 
with Joe, we agree that it would be beneficial to have the Java library 
based on the same set of XSDs. The fact that the other projects are 
using them means more people are looking at them, and it would also 
make John Hogerhuis very, very happy :) (sorry John, couldn't resist).

I think that, given the completeness of the LLRP specification and the 
fact that there really aren't too many different ways of doing things, 
it would be a great idea to work together and release a Java toolkit 
that we can both agree on. I looked at your list and generally agree 
with your approach. There are some implementation specifics I have 
differing opinions on; they are noted below.

-----------------------------

CF - * it is not using LLRPdef.xml
PG - We already agree on this, and it will be the major focus of the updates.

CF - * the code generator is passive - it requires manual tweaks. IMHO this is a
big issue from a maintenance perspective - especially when the generated
code gets checked into cvs as it is currently.
PG - Agree with this completely.  I would like to take an XSLT approach to the 
tweaks and byte encoding, where the required logic is embedded in XSL 
stylesheets and all the Java does is apply them.  Any changes would then 
happen in the stylesheets, moving us toward a more document-oriented approach 
to the library.
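As a rough illustration of what "the Java just applies the stylesheets" could look like, here is a sketch using only the standard javax.xml.transform API. The class name, the message name, and the stylesheet itself are invented for the example; a real stylesheet would emit the binary field layout rather than plain text.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class LLRPTransformSketch {
    // Toy stylesheet: pulls out the MessageID attribute of a message
    // element as text. In the real toolkit the XSL would encode the
    // whole message; the Java side would stay exactly this generic.
    private static final String XSL =
        "<xsl:stylesheet version='1.0' "
      + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:output method='text'/>"
      + "<xsl:template match='/GET_READER_CONFIG'>"
      + "<xsl:value-of select='@MessageID'/>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    // Apply the stylesheet to an XML message; all the byte-layout
    // knowledge lives in the XSL, none in this method.
    public static String transform(String messageXml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(messageXml)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(transform("<GET_READER_CONFIG MessageID='42'/>"));
    }
}
```

The point of the sketch is that swapping or fixing an encoding only ever touches a stylesheet, never the Java.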


CF - * the code generator mixes parsing with code generation
PG - I don't see this as critical, more of a nice-to-have.  Ideally I don't 
want to write our own code generator, so this may be a moot point.  What is 
important, however, is that generating the LLRP-Toolkit API (jar file) happens 
entirely through an Ant build process, with the source used strictly for 
changes to the generation (if needed).  Also, as mentioned, generated code 
should not be in the source tree.
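A minimal sketch of the Ant flow meant here; every target, path, and class name is a placeholder, not an actual toolkit build file. Generated sources land under build/ and are never checked in.

```xml
<project name="llrp-toolkit-java" default="jar">
  <!-- Run the code generator over LLRPdef.xml; generated code goes
       to build/, never into the source tree. -->
  <target name="generate">
    <java classname="org.llrp.generator.Generator" fork="true">
      <arg value="LLRPdef.xml"/>
      <arg value="build/generated-src"/>
    </java>
  </target>

  <target name="compile" depends="generate">
    <mkdir dir="build/classes"/>
    <javac srcdir="build/generated-src" destdir="build/classes"/>
  </target>

  <!-- The only artifact developers consume is the jar. -->
  <target name="jar" depends="compile">
    <jar destfile="dist/llrp-toolkit.jar" basedir="build/classes"/>
  </target>
</project>
```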

CF - * not sure whether all of LLRP is actually supported
PG - We are not sure on this either, but from what we have checked against 
the spec, coverage looks pretty good right now.  Kyle is our resident LLRP 
spec expert and can comment on this further.

Now on the actual technology side:

CF - Generate the LLRP Java Library using Apache Velocity (if this does not
work, we'll build a java code gen)
PG - We were actually thinking of using XMLBeans to generate our structures, 
together with a custom LLRP BaseXML object that looks up the stylesheet for 
the byte transformations in a generic way.  This would be fairly modular and 
extensible, and would potentially require only one class that the XMLBeans 
generator uses as the base class for all generated objects.  However, I just 
started looking into Velocity and it does look promising, but I don't yet 
have enough information to make an informed decision.  This would be 
something to discuss.
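A toy sketch of the single-base-class idea. All names here are invented for illustration; the real base would be wired into the XMLBeans-generated hierarchy and actually run the byte-encoding transform rather than just resolving a stylesheet path.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical base class every generated message type would extend.
// The only per-type knowledge is the message name; the base resolves
// which stylesheet knows that message's binary layout.
public abstract class LLRPBaseXML {
    // Placeholder message-name -> stylesheet registry.
    private static final Map<String, String> STYLESHEETS = new HashMap<>();
    static {
        STYLESHEETS.put("GET_READER_CONFIG", "xslt/GET_READER_CONFIG.xsl");
    }

    // Each generated subclass reports its own LLRP message name.
    protected abstract String messageName();

    // Generic lookup shared by all generated objects; a real version
    // would go on to apply the stylesheet and return bytes.
    public String stylesheetFor() {
        String sheet = STYLESHEETS.get(messageName());
        if (sheet == null) {
            throw new IllegalArgumentException(
                "No stylesheet for " + messageName());
        }
        return sheet;
    }
}

// Stand-in for one generated message class.
class GetReaderConfig extends LLRPBaseXML {
    @Override
    protected String messageName() { return "GET_READER_CONFIG"; }
}
```

The attraction is that only this one hand-written class carries logic; everything generated stays declarative.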

CF - Use LLRPdef.xml
PG - Agreed

CF - Output a set of classes with public methods which will be identical/very
similar to the U of A code. I don't think the LLRP spec leaves a lot of room
there.
PG - Also agreed, and this would be great for our existing LLRP virtual reader.
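For illustration, the public surface of a generated message class might look something like the following. The class and field names are guesses based on the LLRP spec's SET_READER_CONFIG message, not the actual U of A code; the point is that the spec pins down the fields, so any generator should land close to this shape.

```java
// Hypothetical generated class: plain getters/setters over the
// fields the LLRP spec defines for SET_READER_CONFIG.
public class SetReaderConfigSketch {
    private long messageID;
    private boolean resetToFactoryDefault;

    public long getMessageID() { return messageID; }
    public void setMessageID(long id) { this.messageID = id; }

    public boolean getResetToFactoryDefault() { return resetToFactoryDefault; }
    public void setResetToFactoryDefault(boolean reset) {
        this.resetToFactoryDefault = reset;
    }
}
```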

-----------------------------

One other item that I consider to have less impact but is still 
important, and is more of a question for the overall larger group:

- Code should be packaged under org.llrp etc. It is not good to see 
many different company and organization names in the code. So far, most 
of the toolkits have been single-company efforts, so this has not been 
an issue, but if things go well we will be the first cross-organization 
effort, so it would be good to resolve this now. This is also something 
we will have to discuss with Joe Hoag from U of Ark, as I would still 
like to leverage as much of his work and knowledge as possible.

So it seems we are overall in agreement on design, approach, and 
philosophy. Perhaps we can look at each other's design recommendations 
and come to common ground on the technologies and implementation.

As I mentioned before, and others have alluded to, we are currently 
trying to finish the initial build of a Virtual LLRP Reader for Rifidi 
and for the LLRP community. Hopefully this will get tested by the 
community soon and we won't have too many major changes; we should have 
a fully realized release, with the kinks worked out, in a week or two. 
Once this piece is done, we have two excellent resources (Kyle Neumeir 
and Matt Dean), who are already familiar with LLRP and built the 
virtual reader, available to lend support in bringing the Java toolkit 
up to spec.

It would be great to hear any suggestions you have; let's try to think 
of next steps. We have bi-weekly conference calls to discuss items, 
which Joe also attends, and we can do working sessions over Skype, 
phone, etc. What time zone are you in, BTW? I have been thinking of 
setting up an IRC channel for LLRP, so if you have anyone interested, 
I will try to get that set up, as all of our developers use IRC 
regularly.

I hope this is a good start and that we can reach some consensus. As 
mentioned above, it would be great to put something together 
collaboratively, and it appears we have some very good resources to 
do so.

Thanks,

Prasith




Christian Floerkemeier wrote:
> Paul, thanks for the introduction. 
>
> Prasith, we started work on a Java LLRP library implementation a couple of
> weeks ago with Basil Gasser, a student at the Swiss Auto-ID Lab, doing most
> of the coding. Previously, we had already implemented EPCIS, TDT, RP, ALE,
> and RM and made them available as open-source at www.accada.org . 
>
> When the initial JAVA code from U of A got committed we noticed a couple of
> things that we considered not ideal:
> * it is not using LLRPdef.xml
> * the code generator is passive - it requires manual tweaks. IMHO this is a
> big issue from a maintenance perspective - especially when the generated
> code gets checked into cvs as it is currently.
> * the code generator mixes parsing with code generation
> * not sure whether all of LLRP is actually supported
>
> I tried to raise some of the points on the mailing list earlier, but except
> for John and Gordon (perl and c++), there was no reply from the Java
> developer(s). Just wondering whether you agree with my comments above?
>
> Here is our current thinking:
>
> - Generate the LLRP Java Library using Apache Velocity (if this does not
> work, we'll build a java code gen)
> - Use LLRPdef.xml
> - Output a set of classes with public methods which will be identical/very
> similar to the U of A code. I don't think the LLRP spec leaves a lot of room
> there.
>
> Does this make sense?
>
> - Christian
>
> --
>
> Christian Floerkemeier PhD
> [EMAIL PROTECTED] 
> Auto-ID Lab, Massachusetts Institute of Technology
> phone: +1-617-324-1984
>
>
> ________________________________
>
>       From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of Paul
> Dietrich
>       Sent: Donnerstag, 26. Juli 2007 12:02
>       To: [email protected]
>       Subject: [ltk-d] Java library (ies)
>       
>       
>
>       We’ve seen interest from multiple parties for Java library
> development. I want to ensure that our toolkit Java resources are being use
> to produce the best library we can.  
>
>        
>
>       I think it makes sense for Christian, Prasith, Kyle, and any other
> that are interested in *developing* and *supporting* Java to engage in
> discussion here on how to move forward with the Java toolkit(s). 
>
>        
>
>       I have seen discussion on the merits of code generation, but I don’t
> think there is any disagreement that if we are using code generation, we
> should use the Common machine descriptions.  Is there any other technical
> justification to have two separate java toolkits? 
>
>        
>
>       If there is not sufficient technical justification, what is involved
> in merging this new effort with the existing one?  
>
>        
>
>       Regards,
>
>        
>
>       Paul
>
>        
>
>        
>
>        
>
>
>
> -------------------------------------------------------------------------
> This SF.net email is sponsored by: Splunk Inc.
> Still grepping through log files to find problems?  Stop.
> Now Search log events and configuration files using AJAX and a browser.
> Download your FREE copy of Splunk now >>  http://get.splunk.com/
> _______________________________________________
> llrp-toolkit-devel mailing list
> [email protected]
> https://lists.sourceforge.net/lists/listinfo/llrp-toolkit-devel
>
>   

