Hi Musachy,

I've been using Struts 2 with the REST plugin in several projects
and it's been great (thanks for the work, Don!). I'd also like to
make the REST plugin more JSR-311-like. I've been studying the
Struts 2 source code for a few days, focusing on the REST and
Codebehind plugins, and I've created a new plugin based on the REST
plugin source code plus the ClasspathPackageProvider class from the
Codebehind plugin. I was thinking of the following path for an
initial implementation of a subset of the spec:

- Process the JSR-311 @Path annotation with ClasspathPackageProvider to
configure namespaces, packages, etc. so that requests are matched to
resource methods
- Create JSR-311 interceptors to process the @MatrixParam, @QueryParam,
@PathParam, etc. annotations (a sketch of the kind of resource these
would handle is below)
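To make that concrete, here is a minimal sketch of the kind of JSR-311
resource I have in mind (the class name, paths and parameters are only
illustrative): the class-level @Path would drive the package/namespace
configuration built by the ClasspathPackageProvider-based provider, and
the parameter annotations would be handled by the new interceptors.

import javax.ws.rs.GET;
import javax.ws.rs.MatrixParam;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.QueryParam;

@Path("/orders")
public class OrdersResource {

    // @Path is read at configuration time to map this resource into a
    // Struts 2 namespace/action; the method-level @Path adds the id part.
    @GET
    @Path("/{id}")
    public String view(@PathParam("id") String id,
                       @QueryParam("expand") String expand,
                       @MatrixParam("version") String version) {
        // The proposed interceptors would populate these parameters from
        // the request before the method is invoked.
        return "success";
    }
}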

Am I on the right path? Any advice from those with more experience?

Thanks in advance.

Musachy Barroso wrote:
What are the plans for the future of the REST plugin? Has anybody
tried to make it more JSR-311-like? I am not a REST user myself, but
I am kind of bored and could help if there were a clear understanding
of what needs to be done.

musachy

On Mon, Aug 25, 2008 at 12:30 AM, Jeromy Evans
<[EMAIL PROTECTED]> wrote:
Don Brown wrote:
On Mon, Aug 25, 2008 at 12:54 PM, Martin Cooper
<[EMAIL PROTECTED]> wrote:

Another option is a client-side component-based framework like Ext
or Flex running directly against web services, RESTful or otherwise.
No server-side web framework required. Of course, you could use
something server-side like DWR to facilitate working with web
services, or Jersey for RESTful services, but that would be a choice
rather than a requirement.

This is a nice design, when you can do it. GWT is also a good way to
build these types of apps. Unfortunately, they can easily break much
of what makes the web what it is - the back button, unique,
addressable URIs, accessibility, search engine crawling, etc.
Therefore, I think some sort of server-side web framework will
usually be necessary; however, I don't think it has to go so far as
JSF, where they try to push all the state to the server. I was
talking with a guy here at work, who is looking to start using GWT
more, about how and where a plain HTML view of the application fits.
He wants to do very dynamic, client-side heavy views, but still
needs to support search engines and REST clients. What if you used
Jersey for your REST API, GWT or straight jQuery for your
client-side UI, then had Jersey + something generate HTML views of
your REST API, which you could use for search engines and for
developers wanting to browse and interact with your application? If
you can have the HTML representation of your REST API
auto-generated, you wouldn't have to maintain two different
interfaces, and you could go fully nuts with your client-side heavy
app without having to worry about accessibility or search engine
issues.

Don
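
As an illustration of that suggestion, a minimal Jersey resource along
these lines (the resource, paths and markup are invented here, not taken
from any actual project) could serve JSON to the rich client and an HTML
representation to search engines and plain browsers from the same URI,
negotiated on the Accept header:

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;

@Path("/orders/{id}")
public class OrderResource {

    // One resource, two representations: the rich client asks for JSON,
    // search engines and plain browsers get HTML built from the same data.
    @GET
    @Produces("application/json")
    public String asJson(@PathParam("id") String id) {
        return "{\"id\": \"" + id + "\", \"status\": \"open\"}";
    }

    @GET
    @Produces("text/html")
    public String asHtml(@PathParam("id") String id) {
        // In practice this view would be generated from the same model or
        // template, so the HTML never has to be maintained by hand.
        return "<html><body><h1>Order " + id + "</h1>"
                + "<p>status: open</p></body></html>";
    }
}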



[rant] Personally I think search engines need to solve this problem. The
era of crawling sites needs to close. As a publisher of content I should
be able to connect to a Google API and publish my content and URIs to
them in a standard machine-friendly format ready for indexing.
Alternatively, I could implement a dedicated service for them to consume
instead of emulating pages of content in a non-page-oriented application.
Then my application can be what it needs to be, in any form suitable for
my users, instead of perpetuating the artificial SEO-optimization
industry. [/rant]

Anyway, despite that, I took this approach recently with a client-heavy
(single-page) application myself, with the exception of auto-generating
the HTML. Basically:
- mandated that the client include a custom header (X-RequestedBy) and a
signature in the request
- if the headers were present, the S2 REST plugin handled the request and
returned the resource in the requested content type; I just had to build
the HTML view myself
- if the header wasn't present and it was a GET, the REST plugin returned
the HTML view and SiteMesh decorated it as a full HTML page (a rough
sketch of the header check is below)
- if a resource was requested directly and the user had JavaScript, they
were redirected to the rich client with a best-guess initial state based
on the URI
- all flow control was managed on the client
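
A rough sketch of that header check, as a Struts 2 interceptor (the class
name and the request-attribute flag are illustrative only, not the code I
actually used):

import javax.servlet.http.HttpServletRequest;

import org.apache.struts2.ServletActionContext;

import com.opensymphony.xwork2.ActionInvocation;
import com.opensymphony.xwork2.interceptor.AbstractInterceptor;

public class RequestedByInterceptor extends AbstractInterceptor {

    // Requests carrying the custom X-RequestedBy header are treated as
    // rich-client calls and get the bare resource representation; plain
    // GETs fall through to the SiteMesh-decorated HTML page.
    @Override
    public String intercept(ActionInvocation invocation) throws Exception {
        HttpServletRequest request = ServletActionContext.getRequest();
        boolean richClient = request.getHeader("X-RequestedBy") != null;

        // Flag the request so the result layer knows whether to decorate
        // the HTML view as a full page or return the raw resource.
        request.setAttribute("richClient", richClient);

        return invocation.invoke();
    }
}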

That meant that one action could service requests for the resource from
rich clients and also support search engine requests for the same content.
Search engines could browse the site through the same content spread over
many little well-formed pages.
Users accessing the site via a search engine's sub-URI would see the rich
client with appropriate initial state derived from the URI.
On the client side, sensible URIs could still be used in links, and
listeners adjusted the content type when appropriate.

Users without JS could get by but were a low priority. Users with screen
readers are still a challenge, but not because of Struts.

This approach wasn't as simple as it should be, but it confirms that
Don's idea is feasible. The biggest problems were in fact IE6 memory
leaks and the poor performance of JavaScript in most browsers. A Flex
client could have used the same services without a problem. If an
auto-generated, bland HTML view with a sitemap were provided for users
without JavaScript/Flash, you'd eliminate the duplication of views for
search engines.

I definitely like the direction these discussions are going.



