Re: inputText and readonly
swk wrote:
> Hi all. Hoping that someone can help me with the following problem: I'm using the latest MyFaces 1.1.6, Tomahawk 1.1.6, Tomahawk sandbox 1.1.7. Basically I have:
>
>   s:form
>     t:panelTabbedPane
>       t:panelTab
>         h:panelGrid
>           ... multiple h:inputText to display data from backing beans, some with attribute readonly="true" and others without, and an h:commandButton to update data
>         /h:panelGrid
>       /t:panelTab
>       ... multiple t:panelTabs like the one above
>     /t:panelTabbedPane
>   /s:form
>
> My problem is that the h:inputText components that have readonly="true" specified show the data; they are rendered fine with data from the beans. The h:inputText components that don't have readonly defined show nothing. But if I put readonly="true" on these too, then it works fine. Can anyone advise what I am doing wrong?
>
> Another point: if I change this to have a form in each panelTab (as opposed to one form for all tabs), it also displays all the data, readonly and non-readonly.

I expect that you're getting a validation failure, but that you do not have an h:messages tag in your page, so do not see it. Try adding an h:messages tag to your page.

And in future, please send these sorts of questions to the user list, not the dev list.

Regards, Simon
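Simon's suggestion amounts to dropping an h:messages tag inside the form so that conversion/validation errors become visible. A minimal sketch (the attributes shown are standard h:messages attributes; the placement inside s:form is an assumption about the page above):

```xml
<%-- Hypothetical fragment: make validation failures visible on the page --%>
<s:form>
  <h:messages showSummary="true" showDetail="true" globalOnly="false"/>
  <t:panelTabbedPane>
    <!-- ... the t:panelTab / h:panelGrid content from the question ... -->
  </t:panelTabbedPane>
</s:form>
```

If the non-readonly fields are failing validation (e.g. a required field across another tab), the messages rendered here should say so.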
Re: Recreating Tomahawk 1.1.6
Jan Nielsen wrote:
> To get a bug fix into 1.1.6, I would like to recreate the Tomahawk 1.1.6 build. I grabbed the code at http://svn.apache.org/repos/asf/myfaces/tomahawk/tags/1_1_6 and tried to build it with
>
>   mvn -Djsf=12 -Dtomahawk=12 clean package
>
> but it fails to resolve the dependency on myfaces-shared-tomahawk 2.0.6: org.apache.myfaces.shared:myfaces-shared-tomahawk:jar:2.0.6. Reviewing a couple of repositories, it looks like the 2.0.x releases are there, /except/ for 2.0.6. I'm fine with building it myself, but it also looks like 2.0.6 is missing from subversion: http://svn.apache.org/repos/asf/myfaces/shared/tags
>
> So, can the Tomahawk 1.1.6 build be reproduced? Is there a recipe, a wiki, a script, or a spot in subversion from which the Tomahawk 1.1.6 release can be built?

Hmm, that is weird indeed. I'll check the email archives. Tomahawk 1.1.6 does indeed depend on shared-2.0.6, and as you say there appears to be neither a tag nor a released jar for shared-2.0.6.

The JSF 1.2 flavour of Tomahawk was only added *after* the 1.1.6 release, so the -Djsf=12 -Dtomahawk=12 options are not valid when building the 1.1.6 release. Note that normal Tomahawk 1.1.x runs fine on both JSF 1.1 and JSF 1.2; however, there are some optimisations that can be enabled by building against JSF 1.2, hence the new flavour. But that doesn't change the fact that shared-2.0.6 is missing.

Regards, Simon
Re: Recreating Tomahawk 1.1.6
[EMAIL PROTECTED] wrote:
> Jan Nielsen wrote:
>> To get a bug fix into 1.1.6, I would like to recreate the Tomahawk 1.1.6 build. [...] So, can the Tomahawk 1.1.6 build be reproduced? Is there a recipe, a wiki, a script, or a spot in subversion from which the Tomahawk 1.1.6 release can be built?
>
> Hmm, that is weird indeed. I'll check the email archives. Tomahawk 1.1.6 does indeed depend on shared-2.0.6, and as you say there appears to be neither a tag nor a released jar for shared-2.0.6. [...] But that doesn't change the fact that shared-2.0.6 is missing.

I see there is a 2.0.6 dir in http://svn.apache.org/repos/asf/myfaces/shared/branches and the pom there has the released version number. So I would guess that whoever did the release did a mvn install from that branch dir, then built the final release of Tomahawk, but forgot to:

* move the branch to the tags dir
* deploy the shared jar to the release repo

So to rebuild Tomahawk 1.1.6, you should be able to check out that branch dir, do mvn install locally, then build Tomahawk 1.1.6.
In some ways it *is* a little odd to deploy the shared jar to the release repo, as nobody will actually ever use it - except people like yourself who want to rebuild the official release. But for tidiness it probably *should* be done. And certainly an svn copy should have existed in the tags dir, not just branches (although there is technically no difference when using svn).

Regards, Simon
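Putting Simon's guess together, the rebuild recipe would look something like this. This is a sketch, not a verified procedure: the branch URL is taken from the message above, and the plain `mvn clean package` invocation assumes a default JSF 1.1 build (the -Djsf=12/-Dtomahawk=12 flags postdate the 1.1.6 release):

```shell
# 1. Check out and locally install the missing shared-2.0.6 artifact
svn checkout http://svn.apache.org/repos/asf/myfaces/shared/branches/2.0.6 shared-2.0.6
cd shared-2.0.6
mvn install       # puts myfaces-shared-tomahawk:2.0.6 into the local repo
cd ..

# 2. Now the Tomahawk 1.1.6 tag can resolve its dependency and build
svn checkout http://svn.apache.org/repos/asf/myfaces/tomahawk/tags/1_1_6 tomahawk-1.1.6
cd tomahawk-1.1.6
mvn clean package # no -Djsf=12 -Dtomahawk=12: that flavour did not exist in 1.1.6
```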
Re: [myfaces core] Is there any task left for release myfaces core 1.2.4?
Leonardo Uribe wrote:
> Hi. As planned for the Tomahawk release, it could be good (optional) to release MyFaces core 1.2.4. Is there any task left before doing this? If not, I'll start the release procedure.

Sounds good to me. There is always more to do, but I don't know of anything critical. So +1.
Re: [tomahawk] extensionsfilter refactoring
Matthias Wessendorf wrote:
> Simon,
>
> On Wed, Aug 13, 2008 at 10:53 AM, Simon Kitching [EMAIL PROTECTED] wrote:
>> Hi. Just a reminder of the email I posted a few weeks ago: I'm still very concerned about the recent ExtensionsFilter refactoring.
>>
>> The new code now parses the web.xml to extract some configuration information. This just feels wrong to me. The new code is also based around a custom FacesContext wrapper. I'm worried about possible interactions between this and other libraries that also wrap FacesContext, and I'm worried about the portability of this code to the upcoming JSF 2.0.
>>
>> I know this stuff is needed to make Tomahawk usable for portlets. But it's very important not to break existing users. What I would like to see is:
>>
>> * separate out the resource-serving stuff into a separate utilities module that can be called from either a filter or a FacesContext, and call that common code from both ExtensionsFilter and TomahawkFacesContextWrapper;
>> * have the TomahawkFacesContextFactory create a FacesContext wrapper only in the case where the filter is not configured, i.e. look in the request scope for a flag placed there by the filter - that means there is no chance of breaking existing setups where the filter is defined explicitly;
>> * ideally, also have some way of turning off the faces-context wrapping even when the filter is not present;
>> * don't parse the web.xml at all. I know you said earlier that it is needed, but I just don't see why. In any case, I think we *must* find some other solution; the current approach is really hard to maintain.
>>
>> Until this is fixed, I would definitely vote -1 on any Tomahawk release. It's a major issue.
>
> was there a jira issue for it? or can you point me to the svn rev?
The best thing to do is to look at these classes:

  org.apache.myfaces.webapp.filter.ExtensionsFilter
  org.apache.myfaces.webapp.filter.TomahawkFacesContextFactory
  org.apache.myfaces.webapp.filter.TomahawkFacesContextWrapper
  org.apache.myfaces.webapp.filter.ExtensionsFilterConfig [1]

The svn history of these files references MYFACES-434. There is no specific jira issue about my concerns; it is really an architecture/design disagreement rather than a specific bug.

[1] ExtensionsFilterConfig.getExtensionsFilterConfig is invoked from TomahawkFacesContextWrapper.

Regards, Simon
Re: [myfaces core] Is there any task left for release myfaces core 1.2.4?
[EMAIL PROTECTED] wrote:
> Leonardo Uribe wrote:
>> Hi. As planned for the Tomahawk release, it could be good (optional) to release MyFaces core 1.2.4. Is there any task left before doing this? If not, I'll start the release procedure.
>
> Sounds good to me. There is always more to do, but I don't know of anything critical. So +1.

Oh yeah, there is one thing. I suggested updating the org.apache.tomcat:catalina:6.0.10 dependency to org.apache.tomcat:catalina:6.0.13, so we don't depend on the special maven2 repo at tomcat.apache.org (6.0.13 is in the normal maven repos). If there are no objections, I'll do this in a few hours.
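The suggested change is just a version bump in impl/pom.xml. A hypothetical sketch of the resulting dependency element (the surrounding context and the `provided` scope are assumptions, not taken from the actual pom):

```xml
<!-- impl/pom.xml (sketch): bump catalina so it resolves from the central repo -->
<dependency>
  <groupId>org.apache.tomcat</groupId>
  <artifactId>catalina</artifactId>
  <version>6.0.13</version> <!-- was 6.0.10, only in tomcat.apache.org's own repo -->
  <scope>provided</scope>   <!-- assumed: container jars are typically provided -->
</dependency>
```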
Re: Problem rendering selectOneMenu
Antoni Reus wrote:
> Hi. Recently I came across a problem testing my app with Geronimo 2.1.2 (MyFaces 1.2.3). The app works well in JBoss 4.2 (JSF RI 1.2_04), and I'm trying it with Geronimo 2.1.2 (MyFaces 1.2.3).
>
> I have a managed bean called treeManager, with a selectedNode property that is null the first time. I have three input components: two inputText and a selectOneMenu. The JSP code is:
>
>   h:outputLabel for="nodeName" value="Nom"/
>   h:inputText id="nodeName" value="#{treeManager.selectedNode.name}"/
>   h:outputLabel for="nodeDescription" value="Descripció"/
>   h:inputText id="nodeDescription" value="#{treeManager.selectedNode.description}"/
>   h:outputLabel for="ambitType" value="Àmbit"/
>   h:selectOneMenu id="ambitType" value="#{treeManager.selectedNode.ambit}"
>     f:selectItem itemLabel="Global" itemValue="global"/
>     f:selectItem itemLabel="Organisme" itemValue="organisme"/
>     f:selectItem itemLabel="Procediment" itemValue="procediment"/
>   /h:selectOneMenu
>
> When I try the page I get this error:
>
>   javax.faces.FacesException: Exception while calling encodeEnd on component : {Component-Path : [Class: org.ajax4jsf.component.AjaxViewRoot,ViewId: /dissenyador/estructures.jsp][Class: javax.faces.component.html.HtmlPanelGrid,Id: j_id_jsp_305935947_1]}
>     at javax.faces.component.UIComponentBase.encodeEnd(UIComponentBase.java:610)
>   Caused by: ...
>   Caused by: org.apache.jasper.el.JspPropertyNotFoundException: /dissenyador/estructures.jsp(60,8) '#{treeManager.selectedNode.ambit}' Target Unreachable, 'selectedNode' returned null
>     at org.apache.jasper.el.JspValueExpression.getType(JspValueExpression.java:61)
>     at org.apache.myfaces.shared_impl.renderkit._SharedRendererUtils.findUIOutputConverter(_SharedRendererUtils.java:58)
>     at org.apache.myfaces.shared_impl.renderkit.RendererUtils.findUIOutputConverter(RendererUtils.java:391)
>     at org.apache.myfaces.shared_impl.renderkit.html.HtmlRendererUtils.findUIOutputConverterFailSafe(HtmlRendererUtils.java:393)
>     at org.apache.myfaces.shared_impl.renderkit.html.HtmlRendererUtils.internalRenderSelect(HtmlRendererUtils.java:316)
>     at org.apache.myfaces.shared_impl.renderkit.html.HtmlRendererUtils.renderMenu(HtmlRendererUtils.java:288)
>     at org.apache.myfaces.shared_impl.renderkit.html.HtmlMenuRendererBase.encodeEnd(HtmlMenuRendererBase.java:57)
>     at javax.faces.component.UIComponentBase.encodeEnd(UIComponentBase.java:607)
>
> I don't know if this is correct, because it's true that selectedNode evaluates to null, but no exception is thrown when rendering the two previous inputText components that also reference selectedNode. Is this correct? Should I file a bug report?

Interesting. From the stack trace it looks like the problem occurs when trying to determine the *type* that this expression returns. When actually asking for the value, null is simply returned if the intermediate object is not there. But when asking what static type of object would be returned from the bound property, of course there is a real problem if the intermediate node is not there.
Here's the code from SharedRendererUtils.findUIOutputConverter; the getType call is the problem:

  Class valueType = vb.getType(facesContext); // boom when an intermediate node in the EL is null
  if (valueType == null) return null;
  if (String.class.equals(valueType)) return null;  // no converter needed for String type
  if (Object.class.equals(valueType)) return null;  // there is no converter for Object class

HtmlRendererUtils.internalRenderSelect uses converter = findUIOutputConverterFailSafe -- which obviously is not quite so fail-safe :-)

I'm not quite sure what the converter is being used for during rendering of the select component, but I do know that the rules about converters and select components are quite complex. The HTML select element must always render strings for its options, but JSF requires typed objects to be passed between the select *component* and the backing beans, so conversions are required at various times.

I think a JIRA issue should definitely be filed for this. If a converter is optional here, then the code should catch this exception and not use a converter. Even if a converter is mandatory (i.e. an error should be reported if the value type cannot be determined), then at least the error reporting needs to be improved. And it is definitely a MyFaces issue, not a Geronimo one.

Just as a side-note, please report issues like this to the user list in future. The MyFaces developers are also subscribed to that...

Regards, Simon
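The "catch the exception and render without a converter" behaviour Simon proposes could be sketched as follows. This is a self-contained illustration of the pattern, not the actual MyFaces fix: the class and method names are hypothetical, and a Supplier stands in for the real vb.getType(facesContext) call that throws when an intermediate EL node is null.

```java
import java.util.function.Supplier;

public class FailSafeConverterLookup {

    // Hypothetical stand-in for the type lookup in findUIOutputConverter:
    // the supplier may throw, like getType() does when 'selectedNode' is null.
    static Class<?> findValueTypeFailSafe(Supplier<Class<?>> typeLookup) {
        try {
            Class<?> valueType = typeLookup.get();
            // Mirrors the existing checks: no converter for null/String/Object.
            if (valueType == null
                    || String.class.equals(valueType)
                    || Object.class.equals(valueType)) {
                return null;
            }
            return valueType;
        } catch (RuntimeException e) {
            // PropertyNotFoundException and friends: swallow the error and
            // render without a converter instead of aborting encodeEnd.
            return null;
        }
    }

    public static void main(String[] args) {
        // Lookup that blows up, like '#{treeManager.selectedNode.ambit}'
        // when selectedNode == null.
        Class<?> t1 = findValueTypeFailSafe(() -> {
            throw new RuntimeException("Target Unreachable, 'selectedNode' returned null");
        });
        System.out.println(t1);                 // prints: null

        Class<?> t2 = findValueTypeFailSafe(() -> Integer.class);
        System.out.println(t2.getSimpleName()); // prints: Integer
    }
}
```

With this shape, the two inputText components and the selectOneMenu would behave consistently: all three simply render empty when the intermediate bean is null.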
tomahawk core 1.2.x build problem
Hi All,

I just tried to build core 1.2.x trunk, and got a failure due to a missing dependency: org.apache.tomcat:catalina:6.0.10, which is in impl/pom.xml. The error is right: the catalina versions available in the main repos start from 6.0.13. But I have been building this code regularly and have not had this problem. Does anyone else see this?

For now, I've modified the pom locally to point to 6.0.18 and it builds again. But I'm reluctant to commit this when I don't know what's happened... very puzzling.

Cheers, Simon
disappearing apache snapshots (was Re: [myfaces site] skin)
Grant Smith wrote:
> I was idling in #asfinfra on freenode yesterday, and looking at the logs now I see they did clean out that repository...

Just to let people know, there is currently some discussion going on on the infrastructure list about this. It was indeed the infra team who deleted all files older than 30 days, as people.apache.org was running short of disk space.

The current opinion of the infra admins seems to be that they reserve the right to delete stuff from the snapshot repo at any time, and that any reliance on files staying in the snapshot repo for more than a couple of days is wrong. In particular, some people are concerned that allowing non-Apache people access to the snapshot repo for any purpose other than testing of artifacts is equivalent to bypassing the release process, i.e. any use of this repo except for internal Apache development purposes is wrong (my phrasing). A couple of contrary opinions have been expressed; I'll let you know what the final conclusion is.

Regards, Simon
Re: [myfaces site] skin
Matthias Wessendorf wrote:
> The myfaces-site-skin seems to have disappeared. I believe it used to be in http://people.apache.org/repo/m2-snapshot-repository/org/apache/myfaces/maven/myfaces-site-skin/1-SNAPSHOT but that's now empty. That directory was updated this morning. Does anyone know more?

Interesting. All the snapshots under org.apache.myfaces.maven have disappeared, and all dirs have a last-modified of 2008-08-04 17:28. For example: http://people.apache.org/repo/m2-snapshot-repository/org/apache/myfaces/maven/build-tools/

Looks to me like someone ran "find . -type f -exec rm {} \;" or similar, or maybe a find with an mtime test to remove files not modified since a specific date. Some other dirs are also empty, e.g. http://people.apache.org/repo/m2-snapshot-repository/org/apache/myfaces/myfaces/5-SNAPSHOT/ but not http://people.apache.org/repo/m2-snapshot-repository/org/apache/myfaces/myfaces/6-SNAPSHOT/

So maybe one of the Apache sysadmins decided to free up disk space by deleting old snapshots, and was a little too enthusiastic, not realising that we can have snapshots that are 3 months old but still used?

There isn't any Maven repository manager installed for the Apache snapshots repo, is there? I've just installed Nexus at work, and there is a feature there to clean up old snapshots automatically. But I don't think we have a repo manager installed...

Regards, Simon
Re: [myfaces site] skin
Matthias Wessendorf wrote:
> Nexus? I am not familiar with that. Lemme do a google search for that. ... Interesting tool with the wrong license ;-) Since it is GPL, I think that's why we don't have it installed on ASF servers.

I don't think there is any rule against installing GPL tools for our own use. For example, gcc is GPL. But there are half-a-dozen Maven repository manager tools around. Archiva (http://archiva.apache.org/) and Artifactory (jfrog.org) are Apache-licensed and also look good; it was simply a random choice which one I installed here.

My original point was just that *if* we have one of these installed, and it has the same snapshot cleanup feature that Maven has, then that might be the cause of the disappearing plugins. I'll ask on the Archiva list whether anyone has installed it for the Apache snapshots repo.

Cheers, Simon
Re: [myfaces site] skin
Thanks Grant. I've sent an email to infrastructure to ask about possible solutions.

Cheers, Simon

Grant Smith wrote:
> I was idling in #asfinfra on freenode yesterday, and looking at the logs now I see they did clean out that repository...
>
> On Tue, Aug 5, 2008 at 2:21 AM, Matthias Wessendorf [EMAIL PROTECTED] wrote:
>> On Tue, Aug 5, 2008 at 11:15 AM, [EMAIL PROTECTED] wrote:
>>> Matthias Wessendorf wrote:
>>>> Nexus? I am not familiar with that. Lemme do a google search for that. ... Interesting tool with the wrong license ;-) Since it is GPL, I think that's why we don't have it installed on ASF servers.
>>>
>>> I don't think there is any rule against installing GPL tools for our own use. For example, gcc is GPL. But there are half-a-dozen Maven repository manager tools around. Archiva (http://archiva.apache.org/) and Artifactory (jfrog.org) are Apache-licensed and also look good; it was simply a random choice which one I installed here.
>>
>> k, I was never checking those repo management systems. Archiva I know by name... thanks!
>>
>>> My original point was just that *if* we have one of these installed, and it has the same snapshot cleanup feature that Maven has, then that might be the cause of the disappearing plugins. I'll ask on the Archiva list whether anyone has installed it for the Apache snapshots repo.
>>
>> cool, thanks. BTW, yes, someone did a cleanup on files that are older than 30 days. -m
Re: [Tomahawk] Tomahawk 1.2.x
Matthias Wessendorf wrote:
> On Fri, Aug 1, 2008 at 9:47 AM, Gertjan van Oosten [EMAIL PROTECTED] wrote:
>> Hi Matthias. As quoted from Matthias Wessendorf: "Is there a chance that you can run the build on your machine?" If I run 'mvn clean package' from the svn checkout of http://svn.apache.org/repos/asf/myfaces/tomahawk/trunk I get build errors in core12 (see attached). That *may* account for the snapshots not getting updated on 'people'.
>
> there seems to be something wrong... :) I am not that familiar with Tomahawk12 + its build process. Simon and Leo are the men for that ;)

I've just rebuilt Tomahawk, and did not get any build failure. I have updated the poms to use the newly-released myfaces-builder-plugin version, but I don't think this would have affected the build.

I have seen this error message (duplicate class) before and never did figure out why; it went away by itself. Some weird Maven issue, I guess. Gertjan, perhaps you could do svn update and try again?

Regards, Simon
Re: [Tomahawk] Tomahawk 1.2.x
Leonardo Uribe wrote:
> On Thu, Jul 31, 2008 at 9:32 AM, Gertjan van Oosten [EMAIL PROTECTED] wrote:
>> As quoted from [EMAIL PROTECTED]: "There used to be just one Tomahawk version, but Leonardo has now created a separate tomahawk 1.2.x project that has increased JSF 1.2 compatibility." What's the current status of this? If I replace the tomahawk-1.1.7-SNAPSHOT.jar and tomahawk-sandbox-1.1.7-SNAPSHOT.jar dependencies with tomahawk12-1.1.7-SNAPSHOT.jar and tomahawk-sandbox12-1.1.7-SNAPSHOT.jar, my application no longer works. E.g. t:stylesheet and t:dataTable are not rendered but appear just like that (the JSF tags, that is) in the HTML output...
>
> I have tested the latest code and everything works fine (using the sample app but with tomahawk12).

I've also done some tests, and the latest code (built from svn) works for me. In tomahawk/examples/simple, I ran:

  mvn -Djsf=12 -Dtomahawk=12 clean jetty:run

The log output included:

  2008-08-01 13:14:04,597 [main] INFO org.apache.myfaces.config.FacesConfigurator - Starting up MyFaces-package : myfaces-api in version : 1.2.4-SNAPSHOT from path : file:/home/sk/.m2/repository/org/apache/myfaces/core/myfaces-api/1.2.4-SNAPSHOT/myfaces-api-1.2.4-SNAPSHOT.jar
  2008-08-01 13:14:04,598 [main] INFO org.apache.myfaces.config.FacesConfigurator - Starting up MyFaces-package : myfaces-impl in version : 1.2.4-SNAPSHOT from path : file:/home/sk/.m2/repository/org/apache/myfaces/core/myfaces-impl/1.2.4-SNAPSHOT/myfaces-impl-1.2.4-SNAPSHOT.jar
  2008-08-01 13:14:04,598 [main] INFO org.apache.myfaces.config.FacesConfigurator - Starting up MyFaces-package : tomahawk in version : 2-1.1.7-SNAPSHOT from path : file:/home/sk/.m2/repository/org/apache/myfaces/tomahawk/tomahawk12/1.1.7-SNAPSHOT/tomahawk12-1.1.7-SNAPSHOT.jar

This shows that things are selected as expected: myfaces 1.2.4-SNAPSHOT and tomahawk12 1.1.7-SNAPSHOT.
There is a slight error in that message: the tomahawk version is displayed as 2-1.1.7, but that's just a quirk in the diagnostics output code. As can be seen from the path, it is the tomahawk12 jarfile that is being used. And the examples seemed to work fine for me.

Regards, Simon
Re: [myfaces core] Is there any task left for release myfaces core 1.1.6?
Matthias Wessendorf wrote:
> On Fri, Aug 1, 2008 at 2:20 PM, Leonardo Uribe [EMAIL PROTECTED] wrote:
>> Hi. As planned for the Tomahawk release, it could be good (optional) to release MyFaces core 1.1.6. Is there any task left before doing this? If not, I'll start the release procedure.
>
> sounds good to me

Sounds good to me too. One thing I have meant to do is to check that all the info from the (long obsolete) .xml files that sit next to the component .java files has been migrated to an appropriate place, and then remove them. But that's not critical for a release.

I would suggest giving the release candidate for this a reasonable time (1 week) for the user community to test. There have been some radical changes since the last release, and our unit tests are not great, so getting real-world testing for this would be very useful. But I would also suggest stating in the RC announcement that only *regressions* from the 1.1.5 release will be looked at during the RC cycle. We need to get the release cycles going again, even with known issues - as long as they are not regressions.

Cheers, Simon
Re: [myfaces core] Is there any task left for release myfaces core 1.1.6?
[EMAIL PROTECTED] wrote:
> Leonardo Uribe wrote:
>> [...] Ok, sounds good taking into account the latest changes, but I have never seen how a release-candidate procedure works. I suppose it is the same as a normal release, but there is no vote, just an announcement about it and where to find the artifacts, and those artifacts are not published to the main maven repo, right?
>
> I think that passing around something that has the final version number in it is too dangerous. So instead, how about creating a tag dir, updating the version within that dir to 1.1.7-rc1, then just doing a build and putting the artifacts up on people.apache.org?
> Then if testing goes ok, we can either generate the final release from the rc tag dir, or just do a normal release again from trunk (presuming not too much has changed since the rc was tagged). Ideally we would also push the 1.1.7-rc1 artifacts to the Apache snapshot repo, so that Maven users can test this rc really easily. I'm not sure how to do that, but it shouldn't be difficult; we can get the manually-downloadable artifacts there first, and figure out how to push to the snapshot repo later... (sorry, please read 1.1.6 instead of 1.1.7 above; I got confused between core and tomahawk versions :-)

Hmm, actually, what if its version is named 1.1.6-rc1-SNAPSHOT? That's more appropriate for pushing to the snapshot repo (and mvn deploy will do so automatically). Does that version number come before or after 1.1.6-SNAPSHOT? It probably doesn't matter, as people will be pointing directly at one or the other.

We could presumably do a Tomahawk release candidate in the same way, and send it out for testing at the same time (i.e. tomahawk-1.1.7-rc1 can be sent out after core-1.1.6-rc1 but before core-1.1.6 has been released).

Cheers, Simon
Re: [Tomahawk] Any pending release tasks
Hazem Saleh wrote:
> Hi Team. I would like to know whether there are pending tasks for the next Tomahawk release, so that I can help.

I think that one thing that needs doing is restructuring the Tomahawk website. There used to be just one Tomahawk version, but Leonardo has now created a separate tomahawk 1.2.x project that has increased JSF 1.2 compatibility. I split the MyFaces core website into separate JSF 1.1 and JSF 1.2 parts a while ago; maybe the same needs to be done for Tomahawk, so that people can find the right mvn reports, javadoc, etc. for the different releases? At the minimum, the existing website would need to be updated to mention the existence of the two separate lines...

Warning: fighting Maven's site-building features can really eat a lot of time!

Regards, Simon
Re: [Tomahawk] Tomahawk 1.2.x
Leonardo Uribe wrote:
> On Thu, Jul 31, 2008 at 9:32 AM, Gertjan van Oosten [EMAIL PROTECTED] wrote:
>> [...] If I replace the tomahawk-1.1.7-SNAPSHOT.jar and tomahawk-sandbox-1.1.7-SNAPSHOT.jar dependencies with tomahawk12-1.1.7-SNAPSHOT.jar and tomahawk-sandbox12-1.1.7-SNAPSHOT.jar, my application no longer works. E.g. t:stylesheet and t:dataTable are not rendered but appear just like that (the JSF tags, that is) in the HTML output...
>
> I have tested the latest code and everything works fine (using the sample app but with tomahawk12).

Gertjan, are you using JSP? If so, have you deleted your Tomcat work directory? Changing taglib versions without recompiling JSPs can cause problems...
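For Tomcat, clearing the compiled-JSP cache means deleting the webapp's work directory. A hypothetical example for a default Tomcat layout - the path and the "myapp" context name are assumptions, adjust for the actual installation:

```shell
# Stop Tomcat first, then remove the compiled-JSP cache for the webapp;
# "myapp" is a placeholder for the real context name.
rm -rf "$CATALINA_HOME/work/Catalina/localhost/myapp"
# On restart, the JSPs are recompiled against the new taglib jars.
```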
Re: MYFACES-1900: any news?
Would we be happy with a patch that just reverts to the previous version? What was changed, and what new behaviour or other fixes would reverting this script remove?

Matthias Wessendorf wrote:
> any chance to provide a patch?
>
> On Mon, Jul 28, 2008 at 3:58 PM, Gertjan van Oosten [EMAIL PROTECTED] wrote:
>> Hi MyFaces devs. Is there any news about MYFACES-1900? The error prevents me from deploying myfaces-1.2.3, so I'm stuck with 1.2.2. The fix can't be that hard: just roll back the change in the oamSubmitForm() JavaScript from 1.2.2 to 1.2.3, right?
>>
>> Kind regards,
>> -- Gertjan van Oosten, [EMAIL PROTECTED], West Consulting B.V., +31 15 2191 600
Re: Odd generated component classes in the JSF1.2 trunk
Leonardo Uribe wrote:
> On Tue, Jul 15, 2008 at 1:55 PM, simon [EMAIL PROTECTED] wrote:
>> On Mon, 2008-07-14 at 16:36 +0200, Leonardo Uribe wrote:
>>> On Sun, Jul 13, 2008 at 12:48 PM, simon wrote:
>>>> Hi. I've just noticed that quite a few component classes are checked in to the JSF 1.2 trunk, but carry big "generated code, do not modify" warnings. And they look completely different from the versions in the JSF 1.1 branch. Examples: UICommand.java, UIData.java, UIGraphic.java, UIInput, UINamingContainer. I'm a little confused here. Can someone (Leonardo?) explain what is happening?
>>>
>>> These comments should be removed, since these classes are not generated.
>>
>> But the weird thing is that they *look* generated. The code is just all over the place: variables declared in the middle of the file, odd line wrapping. And all the documentation that was on the UIData class is completely gone. I don't know what is going on here, but it feels wrong.
>
> When myfaces core 1.2 started, all classes were generated using the myfaces-faces-plugin template approach, so the old code was simply replaced. That is why the myfaces 1.1 component classes look different from their 1.2 counterparts. Then myfaces-builder-plugin was applied, and some classes generated using myfaces-faces-plugin were just moved to src/main/java. I have removed the wrong warnings, since these core classes should not be generated (this makes it easier for users to check if something is wrong).

Ok, thanks for explaining that. I'm not really satisfied with just removing the "generated code" comments, though. This (formerly generated) code is really ugly, and it lacks all the comments of the versions from the 1.1 branch. So I think the right thing to do is to replace these files with the versions from the 1.1 branch... I'll try to find some time to look into this, but I have guests visiting at the moment, so it won't be in the next few days.
BTW, I know you're planning to get some releases out soon (after the myfaces-builder-plugin release). I would like to see myfaces-core-1.1.x released, then tomahawk. Is that what you are planning?

Regards, Simon
Re: Building Tomahawk
Gertjan van Oosten wrote:
> Hi Simon,
>
> As quoted from [EMAIL PROTECTED]: "But at the moment, as your error message shows, there is a snapshot-level *plugin* required to build tomahawk. And the root apache pom does not enable the snapshot repo for plugins, just for dependencies. So it is still necessary to add the apache snapshot repo to your ~/.m2/settings.xml, at least in this case."
>
> Done that, but now it gets stuck on this:
>
>   % mvn clean
>   [INFO] Scanning for projects...
>   [INFO] snapshot org.apache.myfaces.tomahawk:tomahawk-project:1.1.7-SNAPSHOT: checking for updates from apache.org
>   Downloading: http://people.apache.org/repo/m2-snapshot-repository/org/apache/myfaces/tomahawk/tomahawk-project/1.1.7-SNAPSHOT/tomahawk-project-1.1.7-SNAPSHOT.pom
>   5K downloaded
>   Downloading: http://maven2.mirrors.skynet.be/pub/maven2/org/apache/myfaces/myfaces/6/myfaces-6.pom
>   [INFO] [ERROR] FATAL ERROR
>   [INFO] Failed to resolve artifact.
>   GroupId: org.apache.myfaces
>   ArtifactId: myfaces
>   Version: 6
>   Reason: Unable to download the artifact from any repository
>     org.apache.myfaces:myfaces:pom:6
>   from the specified remote repositories:
>     central (http://repo1.maven.org/maven2),
>     myfaces-staging (http://people.apache.org/builds/myfaces/m2-staging-repository),
>     apache.org (http://people.apache.org/repo/m2-snapshot-repository)
>
> Sure enough, if I check the downloaded tomahawk-project-1.1.7-SNAPSHOT.pom in my M2 repo, it has:
>
>   <parent>
>     <groupId>org.apache.myfaces</groupId>
>     <artifactId>myfaces</artifactId>
>     <version>6</version>
>   </parent>
>
> which seems to be wrong...

Why do you think that parent is wrong? It looks ok to me.
The referenced parent pom is in the standard repository: http://repo1.maven.org/maven2/org/apache/myfaces/myfaces/6/ And you clearly have this repo enabled: from the specified remote repositories: central (http://repo1.maven.org/maven2), myfaces-staging (http://people.apache.org/builds/myfaces/m2-staging-repository), apache.org (http://people.apache.org/repo/m2-snapshot-repository) But there is also this message: Downloading: http://maven2.mirrors.skynet.be/pub/maven2/org/apache/myfaces/myfaces/6/myfaces-6.pom So what I think is happening is that you have (explicitly or implicitly) got maven2.mirrors.skynet.be configured as a mirror of the main repo1.maven.org repository. But that mirror hasn't got the latest release yet. The parent pom version 6 was released on the 11th of July (see the timestamps on the files in the repo1.maven.org directory), so I'm surprised, but that does appear to be the case. Regards, Simon
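To make the mirror diagnosis above concrete: a declaration like the following in ~/.m2/settings.xml (a sketch only; the id is illustrative) redirects all "central" traffic through the Belgian mirror, which is exactly how a not-yet-synced artifact fails to resolve even though central has it.

```xml
<!-- Illustrative ~/.m2/settings.xml fragment. With this in place, every
     request aimed at central is served by the skynet.be mirror, so a lag
     in mirror synchronisation looks like a missing artifact. -->
<settings>
  <mirrors>
    <mirror>
      <id>skynet-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>http://maven2.mirrors.skynet.be/pub/maven2</url>
    </mirror>
  </mirrors>
</settings>
```

Removing (or temporarily commenting out) such a mirror entry should make Maven fetch the pom directly from repo1.maven.org.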
Re: [myfaces-builder-plugin] can we do a release of it?
[EMAIL PROTECTED] schrieb: An alternative is to split things up so that the parent pom (the pom that other modules inherit from) is not the module pom (the one used to drive bulk builds of nested directories). A subdirectory can be created, the existing pom moved down into it, and a new trivial pom created in the base directory that just has <module> tags in it. I don't see any need to change the artifactId for the parent pom, so AFAIK simply moving it within svn should not break anything. For example, the main myfaces pom is in its own subdir, and has no <module> tags. The Orchestra code is also set up like this, with the parent pom in the myfaces/orchestra/maven dir, and the myfaces/orchestra/pom.xml file (which is never released) just contains <module> tags. Setting things up like this makes sense when the various subdirs have different release cycles, as is the case for Myfaces in general (core, core12, tomahawk, trinidad, etc have different release cycles) and Orchestra (core, core15, flow have different release cycles). Of course, things like (core/api, core/impl) do not have independent release cycles, so having the module pom also be the parent pom makes sense there. I think that the build-tools modules do (or should) have independent release cycles, so pushing the parent pom into a subdir makes sense there. And if it is pushed down into its own dir, then you could also use the maven-release-plugin to release it :-) Comments, anyone? Are there any objections if I move the parent pom for the maven2-plugins into its own subdir? If not, I'll try to find time to do that tonight...
Re: Building Tomahawk
Not true, Matthias. One way to build tomahawk is to check out the root of the tomahawk project, and build it all. But even that will probably not work because tomahawk trunk can depend on myfaces trunk etc. The right solution is to add the myfaces snapshot repository to your ~/.m2/settings.xml file. This is described on the wiki somewhere. Then you can build just one module without problems. The snapshot repo lives at: http://people.apache.org/repo/m2-snapshot-repository This snapshot repo cannot be defined in the pom.xml (well, at least it must never be defined in any actual release). I guess we could add it as a profile, though, and have that profile not enabled by default. Probably not a lot easier though... Regards, Simon Matthias Wessendorf schrieb: That was already discussed. You need to build everything... in order to build something from (tomahawk) trunk -M On Tue, Jul 22, 2008 at 5:29 PM, Gertjan van Oosten [EMAIL PROTECTED] wrote: Hi devs! I'm having some trouble getting Tomahawk to build (see also TOMAHAWK-1304). Since I was unable to find any documentation about how to set up a local Tomahawk development environment, I'm trying my luck... I've checked out myfaces/tomahawk/trunk/assembly now, and I am able to install that into my local repository using: mvn -N -Dmyfaces-shared.version=2.0.7 install Is that the correct command to use? If not, what is it (and is there some documentation somewhere where I can read about this and more)? Then, after checking out myfaces/tomahawk/trunk/core, I am unable to build that because of the following error: [INFO] A required plugin was not found: Plugin could not be found - check that the goal name is correct: Unable to download the artifact from any repository Try downloading the file manually from the project website. 
Then, install it using the command: mvn install:install-file -DgroupId=org.apache.myfaces.buildtools -DartifactId=myfaces-builder-plugin \ -Dversion=1.0.1-SNAPSHOT -Dpackaging=maven-plugin -Dfile=/path/to/file Alternatively, if you host your own repository you can deploy the file there: mvn deploy:deploy-file -DgroupId=org.apache.myfaces.buildtools -DartifactId=myfaces-builder-plugin \ -Dversion=1.0.1-SNAPSHOT -Dpackaging=maven-plugin -Dfile=/path/to/file \ -Durl=[url] -DrepositoryId=[id] org.apache.myfaces.buildtools:myfaces-builder-plugin:maven-plugin:1.0.1-SNAPSHOT from the specified remote repositories: central (http://repo1.maven.org/maven2) What now? Kind regards, -- -- Gertjan van Oosten, [EMAIL PROTECTED], West Consulting B.V., +31 15 2191 600
Re: Building Tomahawk
Hmm..actually, in most cases it's even easier than that. I hadn't noticed, but the root apache parent pom now defines and enables the snapshot repository by default *for dependencies*. So normally there is nothing to do (and I've tested this). But at the moment, as your error message shows, there is a snapshot-level *plugin* required to build tomahawk. And the root apache pom does not enable the snapshot repo for plugins, just for dependencies. So it is still necessary to add the apache snapshot repo to your ~/.m2/settings.xml, at least in this case. Stick the following in your ~/.m2/settings.xml file:

<profiles>
  <profile>
    <id>apache.snapshots.profile</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <repositories>
      <repository>
        <id>apache.org</id>
        <name>Maven Snapshots</name>
        <url>http://people.apache.org/repo/m2-snapshot-repository</url>
        <releases>
          <enabled>false</enabled>
        </releases>
        <snapshots>
          <enabled>true</enabled>
        </snapshots>
      </repository>
    </repositories>
    <pluginRepositories>
      <pluginRepository>
        <id>apache.org</id>
        <name>Maven Plugin Snapshots</name>
        <url>http://people.apache.org/repo/m2-snapshot-repository</url>
        <releases>
          <enabled>false</enabled>
        </releases>
        <snapshots>
          <enabled>true</enabled>
        </snapshots>
      </pluginRepository>
    </pluginRepositories>
  </profile>
</profiles>

Regards, Simon [EMAIL PROTECTED] schrieb: Not true, Matthias. One way to build tomahawk is to check out the root of the tomahawk project, and build it all. But even that will probably not work because tomahawk trunk can depend on myfaces trunk etc. The right solution is to add the myfaces snapshot repository to your ~/.m2/settings.xml file. This is described on the wiki somewhere. Then you can build just one module without problems. The snapshot repo lives at: http://people.apache.org/repo/m2-snapshot-repository This snapshot repo cannot be defined in the pom.xml (well, at least it must never be defined in any actual release). I guess we could add it as a profile, though, and have that profile not enabled by default. Probably not a lot easier though... 
Regards, Simon Matthias Wessendorf schrieb: That was already discussed. You need to build everything... in order to build something from (tomahawk) trunk -M On Tue, Jul 22, 2008 at 5:29 PM, Gertjan van Oosten [EMAIL PROTECTED] wrote: Hi devs! I'm having some trouble getting Tomahawk to build (see also TOMAHAWK-1304). Since I was unable to find any documentation about how to set up a local Tomahawk development environment, I'm trying my luck... I've checked out myfaces/tomahawk/trunk/assembly now, and I am able to install that into my local repository using: mvn -N -Dmyfaces-shared.version=2.0.7 install Is that the correct command to use? If not, what is it (and is there some documentation somewhere where I can read about this and more)? Then, after checking out myfaces/tomahawk/trunk/core, I am unable to build that because of the following error: [INFO] A required plugin was not found: Plugin could not be found - check that the goal name is correct: Unable to download the artifact from any repository Try downloading the file manually from the project website. Then, install it using the command: mvn install:install-file -DgroupId=org.apache.myfaces.buildtools -DartifactId=myfaces-builder-plugin \ -Dversion=1.0.1-SNAPSHOT -Dpackaging=maven-plugin -Dfile=/path/to/file Alternatively, if you host your own repository you can deploy the file there: mvn deploy:deploy-file -DgroupId=org.apache.myfaces.buildtools -DartifactId=myfaces-builder-plugin \ -Dversion=1.0.1-SNAPSHOT -Dpackaging=maven-plugin -Dfile=/path/to/file \ -Durl=[url] -DrepositoryId=[id] org.apache.myfaces.buildtools:myfaces-builder-plugin:maven-plugin:1.0.1-SNAPSHOT from the specified remote repositories: central (http://repo1.maven.org/maven2) What now? Kind regards, -- -- Gertjan van Oosten, [EMAIL PROTECTED], West Consulting B.V., +31 15 2191 600
Re: java.lang.NoClassDefFoundError: javax/servlet/jsp/jstl/core/Config
Stefan Neudorfer schrieb: Hi All, I'm trying to use tomcat 6.0, myfaces core 1.2.3 and tomahawk 1.1.6, and I could not find a way to get this (and maybe other) classes. The jstl-1.2.0.jar is not in the install path from maven and there is no way to find it at Sun. I found these classes in javaee.jar from the glassfish project, but I get the same error when I use it. The dependency on jstl is defined in the poms for myfaces core. By looking in the pom files, it can be seen that: Core 1.1.x depends on jstl 1.1.2, which can be found here: http://repo1.maven.org/maven2/jstl/jstl/1.1.2/ Core 1.2.x depends on jstl 1.2, which can be found here: http://download.java.net/maven/1/jstl/jars/ Regards, Simon
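For reference, the dependency Simon describes looks roughly like this in a project pom (a sketch only; the exact groupId and scope should be checked against the real myfaces core pom for the version in use):

```xml
<!-- Sketch of the jstl dependency as described above. The jstl 1.1.2
     artifact resolves from the standard central repository; the 1.2 jar
     historically lived in the java.net repository instead. -->
<dependency>
  <groupId>jstl</groupId>
  <artifactId>jstl</artifactId>
  <version>1.1.2</version>
</dependency>
```

If the jar cannot be resolved automatically, it can also be downloaded from the URLs above and installed into the local repository with `mvn install:install-file`.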
Re: commons logging again
Werner Punz schrieb: Hello everyone, we have been using commons-logging for the past years. I am not sure if that is a good idea; first of all, Java has a decent logging API, which would allow us to eliminate the logging dependency. Using a logging API built into the JDK does feel tempting. However before doing this, have a look at the number of alternative implementations of the API. * The one built into the JDK sucks. Badly. * There is one that is built into Tomcat (JULI) but that is not available as a standalone lib. * The SLF4J project provides jcl-over-slf4j. I haven't used this myself, but presume that it needs to be in the system classpath to work (one of the major issues with the java.util.logging design). So I'm not sure how that would interact with Tomcat's JULI logging. See: http://slf4j.org/api/org/slf4j/bridge/SLF4JBridgeHandler.html I'm not aware of any other implementations. And I'm not at all sure whether it is possible for different webapps running in the same container to have different logging configuration. It would be ugly to force a solution on users where all logging from all their webapps ends up mixed together in the same output file. So I would suggest analysing things carefully before moving to java.util.logging. I have the feeling that java.util.logging was designed by Sun JDK implementers to help debug JDK (java.*) code. For that purpose it works fine. Using it from application code seems far less useful.. Secondly, I have not looked into the code yet, but there are a load of reports that commons logging has problems due to messing around with the classloader. That's an ancient issue. There have been zero issues related to this reported since the 1.1 commons release. Projects like Tapestry have already moved towards SLF4J, which apparently is better. "Apparently" doesn't sound like a terribly convincing reason to me to move from one log wrapper to another. What's your opinion: should we keep the commons logging references? 
Despite being a commons-logging developer, I don't care what implementation is used. But AFAIK we do have a working solution now; I'd like to be sure that whatever we move towards (a) actually does work (java.util.logging), and (b) brings benefits that are worth the effort of converting over (slf4j) Regards, Simon
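To make the per-webapp configuration concern concrete, here is a minimal java.util.logging sketch (class and logger names are illustrative). The key point is that Logger.getLogger goes through the single JVM-wide LogManager, so every webapp in the same container shares one logger registry and one configuration unless the container does extra work, as Tomcat's JULI does.

```java
import java.util.logging.Logger;

// Minimal java.util.logging usage. Logger.getLogger consults the
// JVM-global LogManager registry: two callers asking for the same
// logger name get the very same instance, which is why configuration
// is effectively global to the JVM rather than per-webapp.
public class JulSketch {
    private static final Logger log = Logger.getLogger(JulSketch.class.getName());

    public static void main(String[] args) {
        log.info("MyFaces-style startup message");
        // Same name => same instance, demonstrating the global registry.
        System.out.println(Logger.getLogger("demo") == Logger.getLogger("demo"));
    }
}
```

This global-registry behaviour is the design property Simon's email warns about; a log wrapper like commons-logging or SLF4J does not by itself fix it, it only decouples the API from whichever backend is configured.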
Re: [myfaces-builder-plugin] can we do a release of it?
Leonardo Uribe schrieb: Hi I think that myfaces-builder-plugin has been very well tested so we can release a first version of this tool. Is there any task left? I think there is still a fair bit of work that could be done on the plugin. However I don't see any reason why we can't make a release now. Hmm..by the way, I just noticed that the pom does not specify a version tag. So it is inheriting that from its parent, and therefore existing snapshots are all 1.0.1-SNAPSHOT which is rather odd for a project that has never been released. What version number will be used for the first release? The parent pom does need to be released first, and that currently depends on the new myfaces-master-pom which depends on the new checkstyle module. The release process for the master pom and checkstyle is currently in progress; there are now enough +1s so I will finish the release of these tonight. That's a little short of the recommended 72 hours since the RC announcement (only 48 hours), but these modules are really only internal so I hope no-one will be annoyed by that. Regards, Simon
Recent tomahawk ExtensionsFilter change breaks orchestra
The following change has recently been made to TomahawkFacesContextWrapper. Old code:

    if (addResource.requiresBuffer())
    {
        extensionsResponseWrapper = new ExtensionsResponseWrapper(httpResponse);
        extendedResponse = extensionsResponseWrapper;
    }

New code:

    if (addResource.requiresBuffer())
    {
        // If the request requires buffering, the response was already
        // wrapped (in TomahawkFacesContextFactory.getFacesContext(...)),
        // but we need to save the wrapped response in a local variable
        // so we can reference it in the release() method and parse the
        // old response.
        extensionsResponseWrapper = (ExtensionsResponseWrapper) extendedResponse;
    }

Casting the extendedResponse object to a tomahawk-specific type doesn't work if something else has wrapped the ResponseWrapper too. And orchestra does exactly that. So some other solution will be needed here. PS: extendedResponse is no longer an appropriate name for this variable. The name was accurate in the filter approach, where the filter knows exactly what the object is because it just created it. But in the FacesContext approach, it can be any object that implements ServletResponse. Regards, Simon
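The failure mode is easy to reproduce in isolation. The sketch below (all class names are made up stand-ins, not the real tomahawk or orchestra types) shows why an unconditional downcast to a concrete wrapper type breaks as soon as another layer decorates the response again:

```java
// Stand-in types illustrating the wrapping problem described above.
public class WrapperCastDemo {
    interface Response {}

    // Plays the role of tomahawk's ExtensionsResponseWrapper.
    static class ExtensionsResponseWrapper implements Response {
        final Response delegate;
        ExtensionsResponseWrapper(Response d) { delegate = d; }
    }

    // Plays the role of orchestra's own response wrapper.
    static class OrchestraResponseWrapper implements Response {
        final Response delegate;
        OrchestraResponseWrapper(Response d) { delegate = d; }
    }

    public static void main(String[] args) {
        Response base = new Response() {};
        // Tomahawk wraps first; orchestra then wraps the result, so the
        // outermost object is no longer tomahawk's type.
        Response extended = new OrchestraResponseWrapper(
                new ExtensionsResponseWrapper(base));
        try {
            ExtensionsResponseWrapper w = (ExtensionsResponseWrapper) extended;
            System.out.println("cast ok: " + w);
        } catch (ClassCastException e) {
            // This is what the new TomahawkFacesContextWrapper code hits.
            System.out.println("ClassCastException: outermost wrapper is not tomahawk's");
        }
    }
}
```

The usual fix for this pattern is either to pass the originally created wrapper along explicitly (which is what the follow-up commit does) or to walk the wrapper chain looking for the wanted type instead of casting the outermost object.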
Re: Recent tomahawk ExtensionsFilter change breaks orchestra
Leonardo Uribe schrieb: Solved. Just pass the wrapped response on TomahawkFacesContextFactory as parameter. Fixed at revision 675608. Yep, that fixed it. Thanks.
Re: [jira] Updated: (TOMAHAWK-768) t:datascroller and sort headers don't work with ajax4jsf
Hi Hazem, AFAIK we don't usually bother to resolve issues as later. That's just a nuisance because then they need to be reopened after the release. Regards, Simon Hazem Saleh (JIRA) schrieb: [ https://issues.apache.org/jira/browse/TOMAHAWK-768?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hazem Saleh updated TOMAHAWK-768: - Resolution: Later Status: Resolved (was: Patch Available) t:datascroller and sort headers don't work with ajax4jsf Key: TOMAHAWK-768 URL: https://issues.apache.org/jira/browse/TOMAHAWK-768 Project: MyFaces Tomahawk Issue Type: Bug Components: Data Scroller, Sort Header Affects Versions: 1.1.3 Reporter: Popcorn Assignee: Hazem Saleh Attachments: HtmlDataScrollerRendererFixedForAjax.java The datascroller and sort headers stop working inside an ajax-rendered area with Ajax4jsf. The components need to be patched. For details, see this link: http://marc.theaimsgroup.com/?l=myfaces-userm=116237031123948w=2
Re: Clarifications requested regarding the usage of annotations within the myfaces-builder-plugin
Leonardo Uribe schrieb: On Mon, Jul 7, 2008 at 11:43 PM, Jihoon Kim [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Hi, I had some questions while refactoring the code using the annotations of myfaces-builder-plugin and was wondering if I was simply using it incorrectly or if it is yet to be implemented [since I am currently pointing to the snapshot repository of the plugin]. (1) The JSFProperty annotation seems to work intuitively as suggested, creating the get + set methods of the properties, but I was wondering about the JSFJspProperties + JSFJspProperty annotations. Since I wanted the components that didn't use the fields frequently to push them into the attributes Map [to make the code cleaner], I was hoping to use these annotations for the fields, but I do not see them being generated in the Tag classes of the components or in the tld, as I expected they would be. The following annotation is on a blank interface of an abstract class, to be created as concrete via the JSFComponent annotation: @JSFJspProperty(name = "allowMultipleSelection", returnType = "java.lang.String", longDesc = "A flag that indicates whether you can allow more than one item to be selected at the same time.") @JSFJspProperties and @JSFJspProperty are used only as a last resort: when you want to define a property that appears in the tld but is not defined on the component (like the binding property), or when a parent component property should not appear in the tld of the child component and you cannot change the api of the component, as in myfaces core (you cannot add public methods to components). If you are working with new components, use @JSFProperty instead and avoid @JSFJspProperty wherever possible. But JSFProperty does assume that there is a real field on the component, which is not what Jihoon Kim wants. 
I was looking at this a couple of weeks ago, and from memory one of the JSFJsp* annotations really does work fine when you want real properties declared in the tld (and present as methods on the tag class) but just want the data pushed into the component's attributes map rather than stored as a property on the component. So I was going to propose renaming that annotation to JSFAttribute. Unfortunately I cannot remember which annotation it was. I'll have a look on my home pc after work today. I guess the alternative is to put a flag on the @JSFProperty annotation to indicate that it is an attributes-map-only property. But that does require a real method on the component to attach the annotation to, which is not right for attribute-map-based properties. Maybe we could allow an @JSFProperty annotation to be attached to a public static final string field on the component itself; each component that supports an attribute really should have a constant defined which other classes can then use to fetch that attribute from the attributes map. When the plugin finds a @JSFProperty annotation attached to a constant, it could set the internal isAttribute flag on the corresponding PropertyMeta object so that we know to skip that PropertyMeta when generating the saveState/restoreState methods. Thoughts? Regards, Simon
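The constant-based proposal could look like the sketch below. Note that the @JSFProperty here is a stand-in annotation defined locally for illustration; the real annotation lives in the myfaces builder tooling and may have different attributes, and the class name is hypothetical:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Stand-in for the builder plugin's @JSFProperty (illustrative only):
// allowing FIELD as a target is the core of the proposal.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.FIELD})
@interface JSFProperty {
    String longDesc() default "";
}

// The idea: annotate the public static final attribute-name constant, so
// the plugin can mark the corresponding PropertyMeta as attribute-map-only
// and skip it when generating saveState/restoreState.
class TreeComponentSketch {
    @JSFProperty(longDesc = "Allow more than one item to be selected at once.")
    public static final String ALLOW_MULTIPLE_SELECTION = "allowMultipleSelection";

    public static void main(String[] args) throws Exception {
        JSFProperty p = TreeComponentSketch.class
                .getField("ALLOW_MULTIPLE_SELECTION")
                .getAnnotation(JSFProperty.class);
        System.out.println(p.longDesc());
    }
}
```

The reflective lookup in main mirrors what the plugin's scanner would do when it encounters the annotated constant.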
Re: Dojo discussion - opensourcing the jsf dojo components project
It's great that people are thinking carefully about the right way to handle this new code. But after some pondering, I'm happy for it to go directly into a sandbox here and not through the incubator. My reasons are: Incubation is necessary when a brand-new project is created, in order to be sure that a new non-apache development group learns to use apache-style collaboration. But that's not relevant in this case; Werner is familiar with all this and I'm confident he will make sure everything happens in the open. Incubation is also necessary when the code is for an existing project but that existing project doesn't have committers who will review/commit patches for the new code and doesn't want to grant new unknown people commit rights immediately. But again that's not relevant here; Werner will presumably be acting as reviewer for patches. So all we need to be concerned about here is that the code is legally unencumbered (a grant should do that), and that there is enough of a community to maintain it long term (which some time in the sandbox can test). And of course that we're all happy with the architecture etc. But for that we need to see the code :-) I can't see any other reasons for requiring incubation... Definitely worth asking the incubator group their opinion too, but hopefully they just push it back to us.. Regards, Simon Martin Marinschek schrieb: Yes, definitely the incubator should be kept in the loop. But I feel a grant should be enough, if it is part of the sandbox. regards, Martin On 7/7/08, Matthias Wessendorf [EMAIL PROTECTED] wrote: Well, best probably is to ask there, but I don't think there should be too much of a problem getting it in directly without having to go through the incubator, due to the nature of the code being developed 100% by me. I am fine with that. But I just want to make sure everything is fine and correct with the Apache guidelines. Since the scope of the contribution is a (to my understanding) separate project. 
Perhaps a software grant is fine. Perhaps even that is not needed. Don't get me wrong. I am not against this (I was pinged offline already asking why). So, again: I am not against it. I just want to make sure we follow the right way. -M So, IMO the best thing is to give a heads-up on the [EMAIL PROTECTED] list, to see what their feeling is about this. They deal with these types of things more frequently than any of us. Generally, I think it is a good project.
Re: cleaning up whitespace in source files
Ok, rather than running detab.sh before svn update, I suggest this instead. svn -q status | cut -c 8- | xargs -n 1 sed -i -e 's/\t//g' It replaces tabs *only* in local files that you already have modified versions of. The svn update therefore works normally on other files (no conflicts). Regards, Simon Andrew Robinson schrieb: SVN merge takes -x -w arguments to ignore whitespace. I am not sure about updating. -Andrew On Thu, Jul 3, 2008 at 4:39 PM, simon [EMAIL PROTECTED] wrote: Sorry, but I think conflicts are now being reported when updating a checkout dir for files where *all* of these were true: * contains tabs * did not have eol-style set to native * was not first checked in from your native platform. I'll try to think of a nice way to automatically clean up those conflicts.. Regards, Simon On Fri, 2008-07-04 at 00:04 +0200, simon wrote: By the way: * the detab.sh script is here: http://svn.apache.org/repos/asf/myfaces/myfaces-build-tools/trunk/other/scripts/detab.sh * I haven't touched tobago, trinidad or portlet-bridge. It's up to the developers of those projects to choose when/if they want to do this. I also fixed quite a few .java files that did not have eol-style set to native. People, could you please check that you have your ~/.subversion/config file set up correctly? Regards, Simon On Thu, 2008-07-03 at 23:11 +0200, simon wrote: Ok, as people seem happy to see tabs cleaned up done I'm doing it now. But I'm leaving trailing whitespace alone for now; there is less benefit and it does touch a whole lot of files. To anyone who currently has checked-out directories with uncommitted changes in them, I recommend running detab.sh *before* running svn update. This will avoid having conflict markers inserted into all your locally modified files. If you forget, do svn update, and end up with lots of conflicts then I recommend: * install svn 1.5.0 (if you don't have it already), then * svn resolve --recursive --accept mine-full . 
then * run detab.sh Regards, Simon On Wed, 2008-07-02 at 22:14 +0200, simon wrote: Interesting question, Manfred. Here are the answers: Count of java files is done via: find . -name .svn -prune -o -name target -prune \ -o -name *.java -print | wc -l Count of java files with tabs is done by running detab1.sh (which just fixes tabs) then: svn status | grep ^M | wc -l Count of java files with tabs or trailing whitespace is done by running detab.sh then svn status as above. shared/trunk: # of java files: 396 # of files with tabs: 25 # of files with tabs/trailing spaces: 51 shared/trunk12: # of java files: 390 # of files with tabs: 31 # of files with tabs/trailing spaces: 133 core/trunk: # of java files: 351 # of files with tabs: 78 # of files with tabs/trailing spaces: 216 core/trunk12: # of java files: 503 # of files with tabs: 120 # of files with tabs/trailing spaces: 385 It's interesting how many more classes there are in jsf1.2 than in jsf1.1. Some of this is due to more unit tests, but much appears to be real new classes needed to implement the extended spec. On Wed, 2008-07-02 at 20:12 +0200, Manfred Geiler wrote: Simon, Do you have a number? How many files do have tab characters? I think (b - fix them) would be the better solution. But only if that does not change every second file. --Manfred On Wed, Jul 2, 2008 at 7:28 PM, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote: Hi All, In the new checkstyle rules file I enabled checks for tab characters, as the myfaces convention is (AFAIK) to use 4 spaces, not tabs. However the checkstyle report points out a lot of files containing tabs. It's no big deal, but do we want to: (a) disable the checkstyle rule and ignore tabs or (b) fix them? Tabs are a minor nuisance when viewing the source as some tools render 4 spaces, some 8. I've written a simple shellscript that can clean this up very easily, and am happy to do so. The script also removes trailing whitespace from lines, of which we also appear to have quite a lot. 
But doing this will create some large commit messages and make comparing files with past versions noisier. It can also cause svn conflicts if people have modified files they have not yet committed, unless they run the cleanup script against their own working dir before doing svn update. So, option (a) or (b)? Regards, Simon
Re: EnumConverter in commons 1.1 branch
Volker Weber schrieb: Hi, Leonardo has just deleted the EnumConverter from the jsf1.1 branch of commons. This converter was the reason for me to use a snapshot version in our production application. Is it really necessary to have the commons jsf1.1 branch java 1.4 compatible? I know jsf1.1 is, but commons is an extension, so why should we restrict commons? We may provide a retroweaved (if this is possible with this Converter) release for 1.4 users, as we do for tobago. I think we really *should* have commons-1.1 compatible with java1.4. Setting up retroweaver, etc. is a pain in the butt. So if *you* (Volker) can provide a clean and simple patch to get this working, fine. But otherwise I'm happy with removing EnumConverter from commons1.1. You can always build the EnumConverter yourself, however you wish. Regards, Simon
Re: EnumConverter in commons 1.1 branch
Matthias Wessendorf schrieb: On Wed, Jul 2, 2008 at 10:19 AM, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote: Volker Weber schrieb: Hi, Leonardo has just deleted the EnumConverter from the jsf1.1 branch of commons. This converter was the reason for me to use a snapshot version in our production application. Is it really necessary to have the commons jsf1.1 branch java 1.4 compatible? I know jsf1.1 is, but commons is an extension, so why should we restrict commons? We may provide a retroweaved (if this is possible with this Converter) release for 1.4 users, as we do for tobago. I think we really *should* have commons-1.1 compatible with java1.4. I think that this is not necessary. Trinidad 1.0.x (which is the JSF 1.1 version) supports only Java5. Setting up retroweaver, etc. is a pain in the butt. So if *you* (Volker) can provide a clean and simple patch to get this working, fine. But otherwise I'm happy with removing EnumConverter from commons1.1. Just because JSF 1.1 uses an outdated Java version? I may have spoken too soon. For myfaces core 1.1.x I think we should definitely stay with the -source 1.4 -target 1.4 options. There won't be a whole lot of people running it on java1.4, but we currently support it so should stay with it. I guess that commons-1.1.x *could* be run by different rules. It is new, so we won't break any existing users if java15 is set as the minimum. Pros for java15 as minimum in commons-1.1: * can have EnumConverter * internal code can be cleaner * ??? Cons: * some users stuck on JSF1.1 + java14 might not be able to use the new lib. * ??? Anyone else got pros/cons? I can't think of anything particularly convincing either way.. Regards, Simon
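The reason EnumConverter is tied to Java 5 in the first place is that the conversion hinges on Enum.valueOf and generics, neither of which exists on a 1.4 VM. A minimal sketch (not the real tomahawk/commons converter; the class and method names are illustrative):

```java
// Why EnumConverter needs Java 5: enum types, generics, and Enum.valueOf
// are all 5.0 features, which is what makes a 1.4 build (or a clean
// retroweave) awkward.
public class EnumParseSketch {
    enum Color { RED, GREEN, BLUE }

    static <T extends Enum<T>> T parse(Class<T> type, String value) {
        // Throws IllegalArgumentException for unknown names; a real JSF
        // converter would translate that into a ConverterException.
        return Enum.valueOf(type, value);
    }

    public static void main(String[] args) {
        System.out.println(parse(Color.class, "GREEN")); // prints GREEN
    }
}
```

Retroweaver can rewrite the bytecode for 1.4 VMs, but Enum.valueOf still needs a runtime replacement, which is part of the setup pain mentioned above.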
Re: [VOTE] Upgrade s:limitRendered to tomahawk
Andrew Robinson schrieb: Andrew, can you give an example of when it could be used? Check the documentation and the demo http://example.irian.at/example-sandbox-20080702/limitRendered.jsf http://myfaces.apache.org/sandbox/tlddoc/s/limitRendered.html It states its purpose as a JSF substitute for c:choose/c:when/c:otherwise to avoid JSP tag logic in JSF pages. I did read the javadoc that says this is a substitute for c:choose. But that wasn't helpful to me. In what kind of real-world page would I want to choose to render the first 3 children? Or child 4 and 7? The example page doesn't help much. It is demonstrating the functionality, not showing a real use case. I can possibly see wanting to render one specific child from a set. But even then, would I really want to specify which child via an *integer offset*? That seems to very tightly couple the page layout to the backing bean. Adding a new child component or rearranging them shouldn't affect *which ones* are rendered, but that's what an index-based approach to child selection will do. The traditional approach of a rendered attribute on each child that queries whether *it* needs to be rendered might be a little more verbose, but seems to provide a much more stable division between page and backing bean than having a backing bean method that returns a list of indexes into the child list. You must have had a real use case that pushed you to write this component. Can you please describe it? Regards, Simon
Re: [VOTE] Upgrade s:limitRendered to tomahawk
Andrew Robinson schrieb: You must have had a real use case that pushed you to write this component. Can you please describe it? The same as all usages of c:choose. The index-based mode, and rendering more than one child, are just added benefits I threw in. I can provide examples, but I shouldn't have to. I certainly think all new components should have to provide proper use-cases. Having very rarely used components in Tomahawk: * makes it hard for users to find what they want (steeper learning curve) * increases the maintenance burden * increases the jarfile size So components should only go in if they are useful to a reasonable number of people. Just because someone can't think of when it would be needed doesn't mean it never would be useful, but I can appease your curiosity. It's not curiosity. There is a vast amount of crap in Tomahawk right now, to the point where Tomahawk is close to dying. There hasn't been a release for a year. The number of open bugs is vast. So everyone *should* be watching carefully to see that we don't increase the problems. That doesn't mean that good components cannot be added. But new stuff does need to be evaluated for usefulness. And the author of a component is often too close to the code to see whether it can be improved (that applies equally to me). Having other people look critically at the purpose and API is very useful. You are free to point out any issues with components I write (eg Orchestra stuff). I created the component so that people would stop using c:choose and c:if in JSF pages and complaining that they don't work in tables and such. 1) default c:choose functionality (render the first match):

<s:limitRendered>
  <h:outputText value="#{person.first} #{person.last}" rendered="#{prefs.firstThenLast}" />
  <h:outputText value="#{person.last}, #{person.first}" rendered="#{prefs.firstThenLast}" />
</s:limitRendered>

Yep, this is a good use case. 
As you demonstrate later in your email, writing mutually-exclusive rendered expressions for a set of components can get nasty. I'm not a JSTL user, so your reference to c:choose wasn't perhaps as clear to me as it might be to others. I think this way: if (cond1) render component 1, else if (cond2) render component 2, else if (cond3) render component 3, else render component 4. Expanding the javadoc for the component a bit would be good, mentioning that it simplifies rendered expressions for mutually-exclusive components. The current docs don't mention that the implicit condition associated with the choose clauses is the rendered expression; it makes sense once I know what the component is doing, but wasn't obvious at first. 2) render index based. This behaves much like the tr:switcher component, but instead of using facets and facet names, it uses:

<s:limitRendered value="#{wizard.currentStep}" type="index">
  <h:panelGroup> <h:outputText value="This is wizard step 1" /> </h:panelGroup>
  <h:panelGroup> <h:outputText value="This is wizard step 2" /> </h:panelGroup>
  <h:panelGroup> <h:outputText value="This is wizard step 3" /> </h:panelGroup>
</s:limitRendered>

I'm not so sure about this. The tr:switcher makes sense to me; it chooses a component to render by name, which will not be easily broken by page changes, and the link between what the backing bean EL expression returns and which facet is selected is clear (the name matches). Selecting by index has a far smaller set of use-cases, I think. And it can promote code fragility; coupling an index returned by the backing bean with an array defined in the page has potential for trouble. But the wizard use-case is an example of a valid use of this functionality. 3) render multiple children. 
Can be used much like -v vs. -vv for command-line verbosity:

<s:limitRendered value="#{verbosity}">
  <h:outputText value="#{title}" />
  <h:outputText value="#{usage}" />
  <h:outputText value="#{description}" />
</s:limitRendered>

Equivalent to this:

<h:outputText value="#{title}" rendered="#{verbosity >= 1}"/>
<h:outputText value="#{usage}" rendered="#{verbosity >= 2}"/>
<h:outputText value="#{description}" rendered="#{verbosity >= 3}"/>

Yes, the limitRendered approach is a little more efficient; only one EL expression is evaluated rather than three. But any JSF developer understands the latter, while no-one can understand the limitRendered approach without looking up the docs first. And a pretty rare use case, I would think. Worth including perhaps if it didn't have any other negatives, but I think it does: it forces the name of the component to be generic and the documentation to be complex.

Now I cannot tell you all the reasons they may be useful, but I can say that many times in Trinidad the authors chose to only provide functionality that they themselves could think of, making the component useless for every use case they could not think of. Perhaps I cannot think of great reasons to render more than one child at the moment, but who is to say no one will ever want that?
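The three modes described above (first match, index selection, and count-based rendering) can be reduced to a small child-selection function. This is an illustration of the semantics as described in the thread, not the actual Tomahawk source; all class and method names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of the selection logic s:limitRendered is described as having,
 * reduced to pure functions. Each child is represented by its index;
 * the "rendered" flags stand in for the evaluated EL expressions.
 * Not the actual Tomahawk implementation.
 */
public class LimitRenderedSketch {

    /** Default mode: render only the first child whose rendered flag is true (c:choose style). */
    static List<Integer> firstMatch(boolean[] rendered) {
        List<Integer> result = new ArrayList<>();
        for (int i = 0; i < rendered.length; i++) {
            if (rendered[i]) {
                result.add(i); // first match wins; remaining children are skipped
                break;
            }
        }
        return result;
    }

    /** type="index": render exactly the child at the given index (tr:switcher style). */
    static List<Integer> byIndex(int childCount, int index) {
        List<Integer> result = new ArrayList<>();
        if (index >= 0 && index < childCount) {
            result.add(index);
        }
        return result;
    }

    /** Numeric value: render the first 'count' children (the verbosity case). */
    static List<Integer> byCount(int childCount, int count) {
        List<Integer> result = new ArrayList<>();
        for (int i = 0; i < Math.min(count, childCount); i++) {
            result.add(i);
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(firstMatch(new boolean[]{false, true, true})); // only child 1
        System.out.println(byIndex(3, 1));                                // only child 1
        System.out.println(byCount(3, 2));                                // children 0 and 1
    }
}
```

The count mode matching the `verbosity >= n` expansion above is the point: one value lookup selects a prefix of the children instead of evaluating one EL expression per child.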
cleaning up whitespace in source files
Hi All, In the new checkstyle rules file I enabled checks for tab characters, as the myfaces convention is (AFAIK) to use 4 spaces, not tabs. However the checkstyle report points out a lot of files containing tabs. It's no big deal, but do we want to: (a) disable the checkstyle rule and ignore tabs, or (b) fix them? Tabs are a minor nuisance when viewing the source, as some tools render a tab as 4 spaces, some as 8. I've written a simple shellscript that can clean this up very easily, and am happy to do so. The script also removes trailing whitespace from lines, of which we also appear to have quite a lot. But doing this will create some large commit messages and make comparing files with past versions noisier. It can also cause svn conflicts if people have modified files they have not yet committed, unless they run the cleanup script against their own working dir before doing svn update. So, option (a) or (b)? Regards, Simon
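The shellscript itself isn't attached to the mail. A rough sketch of the cleanup it describes (tabs expanded to four spaces, trailing whitespace stripped), written here in Java and assuming a naive tab-to-spaces substitution rather than true tab-stop expansion:

```java
import java.util.List;

/**
 * Sketch of the whitespace cleanup described in the mail: expand tabs to
 * four spaces and strip trailing whitespace. Illustrative only -- the
 * actual script is not shown, and a real cleanup would honour tab stops
 * instead of blindly substituting four spaces per tab.
 */
public class WhitespaceCleanup {

    static String cleanLine(String line) {
        // Replace each tab with four spaces, then drop any trailing whitespace.
        String expanded = line.replace("\t", "    ");
        return expanded.replaceAll("\\s+$", "");
    }

    public static void main(String[] args) {
        List<String> dirty = List.of("\tint x = 1;   ", "ok");
        for (String line : dirty) {
            System.out.println("[" + cleanLine(line) + "]");
        }
    }
}
```

Run against a working copy before `svn update`, this kind of normalisation avoids the conflict problem mentioned above.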
Re: [myfaces-builder-plugin] generate site files describing component, like maven-tagdoc-plugin
Leonardo Uribe schrieb: I'm not sure that (1) is possible. The existing extended doc pages contain screenshots, html tables, etc. that just cannot be represented as javadoc AFAIK. So there would be no way to enhance the javadoc on components in a way that would generate anything like the existing extended doc or the trinidad report. I presume the trinidad report merges in hand-written html that can contain things like images?

I have not looked at this in depth (there are no screenshots in the doc). The plugin does not suggest any merge.

By hand-written and merge I meant that there needed to be something other than the javadocs. And there is: these -base.xml files get merged with the model data, and they contain screenshot tags etc. which provide info that just cannot be embedded in the javadoc. I was expecting real html in the templates rather than a custom format, but the current template format is fine. BTW, how similar is this to the way trinidad generates its docs? Identical, or somewhat modified? Is the template file format the same? (just curious...)

I have committed the myfaces-builder-plugin tagdoc-index and tagdoc-content goals and applied them to tomahawk core. Now the objective is to apply them to sandbox. For that reason, the files related to extended docs about components will be deleted, because this plugin generates more complete info.

+1. This all looks great. Just two minor comments:

/**
+ * Triggers a standard dojo baseScriptUri as defined by the
+ * <a href="http://dojotoolkit.org/">Dojo Toolkit</a>
+ * <br />
+ * <br />
+ * Allows the alteration of the dojo loading root path
+ * used by require.
+ *

I don't much like <br/> in html at all, and certainly not two of them together. It doesn't make any semantic sense, and creates really ugly output. IMO, a paragraph tag is the right thing to use rather than a linebreak.
I would suggest *not* wrapping the first sentence in a paragraph tag; it isn't needed and looks ugly, but this works:

/**
 * Triggers a standard dojo baseScriptUri etc etc.
 * <p>
 * Allows the alteration of the dojo ...
 * </p>
 */

And the first sentence of any javadoc block should be a stand-alone summary. The first sentence of this doesn't make sense as a summary:

/**
+ * The MyFacesDataTable extends the standard JSF DataTable by two
+ * important features:
+ * <br/>

Not worth fixing at the moment, but maybe worth keeping in mind when making future changes. Regards, Simon
Re: An error during build
Hi Hazem, I've just rebuilt tomahawk and had no problem. Well, after I edited the pom.xml and sandbox/pom.xml to comment out the core12 stuff which doesn't compile. Note that maven does not support multiple concurrent instances accessing the same local maven repository. So you cannot run mvn in two windows at the same time. If you were doing this when the error message was shown, try again with just one instance running. And maven does occasionally get a corrupted local repository (possibly due to accidentally running multiple concurrent instances, or using ctrl-c at the wrong moment). Try renaming ~/.m2/repository so it gets a fresh copy. If that doesn't work, then I'm out of ideas. As I said, it works for me (mvn 2.0.9 on linux). Cheers, Simon Matthias Wessendorf schrieb: I noticed this error a while ago, when building Trinidad. This is one of the typical random build issues, which you can't foresee when using maven ,-) I guess you run with a very recent maven, right? I think using 2.0.4 this will not show up, but this switch causes other issues, since some plugins require 2.0.6 or greater Another option is to patch the FileUtils in question on your computer. Find the one that doesn't have this method, and replace the JAR by one that actually has the right signature. Yes... this is a very ugly workaround... -Matthias On Tue, Jul 1, 2008 at 9:16 AM, Hazem Saleh [EMAIL PROTECTED] wrote: Hi Team, Is there anyone that faces this issue during building Tomahawk ? 
java.lang.NoSuchMethodError: org.codehaus.plexus.util.FileUtils.getDefaultExcludes()[Ljava/lang/String;
    at org.codehaus.plexus.components.io.fileselectors.IncludeExcludeFileSelector.setExcludes(IncludeExcludeFileSelector.java:131)
    at org.apache.myfaces.buildtools.maven2.plugin.builder.unpack.AbstractDependencyMojo.unpack(AbstractDependencyMojo.java:251)
    at org.apache.myfaces.buildtools.maven2.plugin.builder.unpack.UnpackMojo.unpackArtifact(UnpackMojo.java:207)
    at org.apache.myfaces.buildtools.maven2.plugin.builder.unpack.UnpackMojo.execute(UnpackMojo.java:180)
    at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:443)
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:539)
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalWithLifecycle(DefaultLifecycleExecutor.java:480)
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:459)
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:311)
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:278)
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:143)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:334)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:125)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:272)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
    at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
    at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
    at org.codehaus.classworlds.Launcher.main(Launcher.java:375)

Thanks! -- Hazem Ahmed Saleh Ahmed http://www.jroller.com/page/HazemBlog
Re: t:graphicImage does not generate XHTML compliant code
On Mon, Jun 30, 2008 at 12:15 AM, Hazem Saleh [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Hi Team, Simon and I had a discussion about making the t:graphicImage component XHTML compliant. Here is the thread discussion: https://issues.apache.org/jira/browse/TOMAHAWK-1143 We need your opinion on this: do we have to make the components XHTML compliant, or leave this to the user's usage with warnings?

Hazem Saleh schrieb: Sorry, the thread discussion is here: https://issues.apache.org/jira/browse/TOMAHAWK-1291

Manfred commented on the jira issue: [I moved this to the email thread, so we don't have half the discussion here and half on the issue] +1 for a strict (but sweet-tempered) behaviour, that means: - log a nag warning - render a non-empty alt attribute with a meaningful default text if the developer omits the attribute (or provides an empty one)

The thing is that for h:graphicImage and t:graphicImage we have **no idea** what a meaningful text would be. This is some arbitrary image that the user has chosen. For what purpose? We don't know - unless we embed AI software and do image recognition on the referenced file. So for h:graphicImage and t:graphicImage we have **only** these choices:
(a) don't output ALT. This screws all blind users, but in an obvious way, so that QA departments can easily detect it and tell their developers to add the needed alt attributes. And it is not our code that is at fault.
(b) output an empty ALT. This screws all blind users, but it cannot be detected by validation. And it is our code that is at fault as well as the user code.
(c) output ALT with ha ha no description. See (b).

For cases where myfaces components are generating the image references for their own purposes, they *know* what that purpose is. Always. So they are always capable of attaching a valid ALT description. Regards, Simon
Re: t:graphicImage does not generate XHTML compliant code
[EMAIL PROTECTED] schrieb: On Mon, Jun 30, 2008 at 12:15 AM, Hazem Saleh [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Hi Team, Simon and I had a discussion about making the t:graphicImage component XHTML compliant. Here is the thread discussion: https://issues.apache.org/jira/browse/TOMAHAWK-1143 We need your opinion on this: do we have to make the components XHTML compliant, or leave this to the user's usage with warnings?

Hazem Saleh schrieb: Sorry, the thread discussion is here: https://issues.apache.org/jira/browse/TOMAHAWK-1291

Manfred commented on the jira issue: [I moved this to the email thread, so we don't have half the discussion here and half on the issue] +1 for a strict (but sweet-tempered) behaviour, that means: - log a nag warning - render a non-empty alt attribute with a meaningful default text if the developer omits the attribute (or provides an empty one)

The thing is that for h:graphicImage and t:graphicImage we have **no idea** what a meaningful text would be. This is some arbitrary image that the user has chosen. For what purpose? We don't know - unless we embed AI software and do image recognition on the referenced file. So for h:graphicImage and t:graphicImage we have **only** these choices:
(a) don't output ALT. This screws all blind users, but in an obvious way, so that QA departments can easily detect it and tell their developers to add the needed alt attributes. And it is not our code that is at fault.
(b) output an empty ALT. This screws all blind users, but it cannot be detected by validation. And it is our code that is at fault as well as the user code.
(c) output ALT with ha ha no description. See (b).

For cases where myfaces components are generating the image references for their own purposes, they *know* what that purpose is. Always. So they are always capable of attaching a valid ALT description.
Mario has suggested to me that if there is a title attribute on the component, then that could be used as the alt text. That seems reasonable; an image title should be meaningful. But that doesn't solve the whole problem. Put the filename of the referenced image in the alt? It might sometimes work - but equally might not. And it isn't translated into the user's language. So no, that isn't really a good solution. Regards, Simon
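The fallback order being discussed (an explicit alt if the developer gave one, else the title attribute, else omit the attribute entirely so validation tools flag the page, per option (a)) can be sketched as a tiny helper. This is purely illustrative, not the actual Tomahawk renderer code, and the names are hypothetical:

```java
/**
 * Illustration of the alt-attribute policy discussed in the thread:
 * prefer an explicit alt, fall back to title if present, otherwise
 * omit alt so that validators and QA can detect the omission.
 * Not actual Tomahawk renderer code.
 */
public class AltTextPolicy {

    /** Returns the alt text to render, or null to omit the attribute. */
    static String chooseAlt(String alt, String title) {
        if (alt != null && !alt.isEmpty()) {
            return alt;   // developer supplied a real description
        }
        if (title != null && !title.isEmpty()) {
            return title; // Mario's suggestion: a title should be meaningful
        }
        // Option (a) from the mail: omit alt entirely; the fault then
        // lies visibly with the page author, not the renderer.
        return null;
    }

    public static void main(String[] args) {
        System.out.println(chooseAlt("company logo", null)); // company logo
        System.out.println(chooseAlt("", "Site logo"));      // Site logo
        System.out.println(chooseAlt(null, null));           // null
    }
}
```

A renderer could additionally log the nag warning Manfred asked for whenever this returns null.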
Re: CAPTCHA tag in Tomahawk
Michael Angelo schrieb: I can't figure out the exact scenario, but when I embed the t:captcha tag within a table cell the resulting HTML is injected in the middle of other HTML tags, therefore altering the view of the page. When I move the captcha tag out into its own table outside of the main table, it generates correctly. Are you using jsf1.1 with JSP, and not wrapping your html in f:verbatim? Regards, Simon
Re: [myfaces-builder-plugin] generate site files describing component, like maven-tagdoc-plugin
Do you mean pages like this? http://myfaces.apache.org/tomahawk/popup.html I'm not sure that the syntax section here is particularly useful. That information is available in the taglib docs; no point in repeating it. And the API section isn't terribly useful; maybe for people who want to customise a component it makes it easier to find the real component class, but that isn't often needed. The description field could be copied, I suppose; that would be mildly useful. But the rest is all hand-written. Or did you mean something else? Regards, Simon

Leonardo Uribe schrieb: The idea could be to use velocity templates, like everywhere else. On Thu, Jun 26, 2008 at 10:38 PM, Leonardo Uribe [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Hi It could be good if we could generate the site files describing each component using myfaces-builder-plugin (generated from myfaces-metadata.xml instead of from faces-config.xml). It's a waste of effort to have to maintain these files. If we do this, we make it easier to move components from sandbox to tomahawk. I'll try it and see what happens. Suggestions are welcome. regards Leonardo Uribe
Re: [myfaces-builder-plugin] generate site files describing component, like maven-tagdoc-plugin
Man, I really hate top-posting. My first reply was in top-posting style because the first reply to this thread was top-posted. But now the styles are mixed, and this thread is rather painful to follow. My reply is somewhere in the middle here. I really do think that consistently posting *under* the text that is being replied to leads to threads that are *much* easier to read.

Leonardo Uribe schrieb: On Fri, Jun 27, 2008 at 1:53 AM, [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Do you mean pages like this? http://myfaces.apache.org/tomahawk/popup.html Yes. Each time we want to upgrade some component, we have to write this part manually. I'm not sure that the syntax section here is particularly useful. That information is available in the taglib docs; no point in repeating it. And the API section isn't terribly useful; maybe for people who want to customise a component it makes it easier to find the real component class, but that isn't often. The description field could be copied I suppose; that would be mildly useful. But the rest is all hand-written. Or did you mean something else?

Trinidad has a generated tag library report (using maven-tagdoc-plugin) here: http://myfaces.apache.org/trinidad/trinidad-api/tagdoc.html When you navigate through this, you can find a lot more info than in its java taglibdoc. One question is why have an extended doc link on tomahawk (see http://myfaces.apache.org/tomahawk/extendedDocs.html) if all the necessary info is in its tld (or could be added there). We have 3 alternatives: 1. Remove the extended doc pages and merge the info into the description field of each component (so it is available in its taglibdoc). 2. Remove the extended doc pages and apply the Trinidad solution: a custom report goal that generates all the info. 3. Do nothing and live with maintaining 70 or more pages that say only a little more than the taglibdoc does. Suggestions?
It looks to me like the trinidad report is a complete replacement for the standard taglib report, containing all the info of the existing taglib reports plus more. And they look nicer too. So I'd be happy to see us use the same maven report plugin that trinidad does, instead of the current taglib plugin. And ditch the extended doc pages. I'm not sure that (1) is possible. The existing extended doc pages contain screenshots, html tables, etc. that just cannot be represented as javadoc AFAIK. So there would be no way to enhance the javadoc on components in a way that would generate anything like the existing extended doc or the trinidad report. I presume the trinidad report merges in hand-written html that can contain things like images? Regards, Simon

Leonardo Uribe schrieb: The idea could be to use velocity templates, like everywhere else. On Thu, Jun 26, 2008 at 10:38 PM, Leonardo Uribe [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] mailto:[EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Hi It could be good if we could generate the site files describing each component using myfaces-builder-plugin (generated from myfaces-metadata.xml instead of from faces-config.xml). It's a waste of effort to have to maintain these files. If we do this, we make it easier to move components from sandbox to tomahawk. I'll try it and see what happens. Suggestions are welcome. regards Leonardo Uribe
Re: Time for an Orchestra 1.2 release?
Kito D. Mann schrieb: Simon, How much work is required to support portlets in Orchestra? I don't think it's too much. Probably 8-12 hours of work (wild guess). Regards, Simon
Re: [VOTE] Upgrade sandbox converters and validators to tomahawk
Leonardo Uribe schrieb: Hi The following converters and validators are proposed to be moved from sandbox to tomahawk core: s:convertBoolean +1 s:convertDateTime Only if the comment on this class is updated. It should read something like: "Interprets dates as being in the local timezone of the server. This converter is therefore useful only for dinky little websites where every single user is in the same timezone. For real websites, use the standard h:convertDateTime." s:convertNumber We've had problems with the convertNumber converter here recently. I think its TLD API needs careful review before it is promoted. s:validateCompareTo The javadoc needs to be cleaned up at least. There is plenty of good info there, but it is not in valid javadoc format. And all that code about independently converting a component's submitted value makes me a little nervous. It looks ok, but has anyone properly reviewed it? s:validateCSV s:validateISBN s:validateUrl +1 In tomahawk core, the related files should be moved from sandbox/core to core. In tomahawk core12, a new dependency on myfaces-commons-converters 1.2.x and myfaces-commons-validators 1.2.x should be added, so the tomahawk core12 tld references validators and converters from these projects. This introduces a change, because the validatorId and converterId for these components change (because these converters are defined in myfaces-commons), but only on core12. If you want to see this change in tomahawk, please vote. This was discussed positively before, but times change and it is better to know what people think about it now. I don't like the idea of tomahawk 1.x having these components internally, while tomahawk 2.x uses the version from commons. It's ugly and confusing. While code duplication is never nice, I think it would be better for tomahawk 1.1.x and 1.2.x to continue to have these components internally, and for commons to have a separate version. It also means that commons can clean up the API without breaking tomahawk users.
Yes, it does mean having to apply fixes in two places (tomahawk, commons) but so does the alternative (tomahawk 1.x, commons). Regards, Simon
Time for an Orchestra 1.2 release?
Hi All, I'm about to start working on a new Orchestra feature (basic dialog support). It therefore seems a good idea to get an Orchestra release out before I start messing with the trunk. In particular, there are two bugs which are fixed in trunk and would be good to have in an official release: * ORCHESTRA-21 Issue with weird urls (esp. ones created by Trinidad) * ORCHESTRA-23 (locking bug, triggered by ajax or double-click) There are also a couple of minor enhancements, but not much has really happened since the 1.1 release. As always, there is more we *could* do; there are a couple of open bug issues. But none of them are simple to tackle, so I'd rather get 1.2 out now and tackle the other issues (esp. portlets) at some later time. I've created a release branch in svn, and taken out a few things that are not completely baked in trunk. The resulting code is binary compatible with 1.1 except in one minor case that I don't think we need to worry about. But see the RELEASE-NOTES.txt file for details. Unless there are any objections, I'll create an RC in the next few days. Cheers, Simon
Re: [myfaces commons] discussion about reorganization of this project is required!
Volker Weber schrieb: Hi, I think we should not use the version to distinguish between the 1.2 and 1.1 branches of commons (in the release artifact names), because the 1.2 (trunk) is not an improved 1.1 version. afaik there could be circumstances where maven prefers the higher version even in a jsf 1.1 application. True. That's always bothered me about the current core/shared/tomahawk versioning system. Shared 1.1.x is not just a different version from Shared 1.2.x; it's really a different artifact. The same could be argued for core and tomahawk too. Regards, Simon
Re: [Build] philosophy behind the myfaces build ?
Matthias Wessendorf schrieb: Hi, when doing a checkout of myfaces, pretty much everything is built. Fine. Except Trinidad and Tobago. No problem with that. But when just updating a single svn folder, like tomahawk, there is a very high chance that the build pretty much fails. Why? Because it depends on snapshots that are built via the master myfaces build. In this case I am referring to the myfaces-builder plugin. Isn't it kinda annoying that you always have to build it all, just b/c of a snapshot dependency? At least to me. Why not test the builder snapshot in a branch (like tomahawk-move-to-builder-branch), do a builder release once it is stable, and then update trunk? (I am only using the builder plugin as an example.) That's what we do for Trinidad. It doesn't depend on a snapshot plugin, so it is easy (and straightforward) to build it. Not sure why there is this "build the world first" philosophy :-) What do you think?

Add the following to ~/.m2/settings.xml. Then add -Papachesnap when building a project. This allows maven to download stuff published to the snapshot repository. Which is kind of useful when building snapshot projects :-)

<settings>
  <profiles>
    <profile>
      <id>apachesnap</id>
      <repositories>
        <repository>
          <id>apache.org</id>
          <name>Maven Snapshots</name>
          <url>http://people.apache.org/repo/m2-snapshot-repository</url>
          <releases>
            <enabled>false</enabled>
          </releases>
          <snapshots>
            <enabled>true</enabled>
          </snapshots>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>apache.org</id>
          <name>Maven Plugin Snapshots</name>
          <url>http://people.apache.org/repo/m2-snapshot-repository</url>
          <releases>
            <enabled>false</enabled>
          </releases>
          <snapshots>
            <enabled>true</enabled>
          </snapshots>
        </pluginRepository>
      </pluginRepositories>
    </profile>
  </profiles>
</settings>

Regards, Simon
Re: [m2 plugins] Maven2 Builder Annotations
Matthias Wessendorf schrieb: hi, is there a special reason why the mentioned plugin requires maven 2.0.6? I think it is b/c of a weird maven dependency. The kinda usual dance: maven-plugin-a -> maven-plugin-b -> maven-plugin-c -> maven-plugin-d ... Just curious if there is a real reason :-) Sorry, I don't understand what you mean. I build the maven2-plugins directory with maven 2.0.9 and it works fine. And I cannot see anything requiring maven 2.0.6. Regards, Simon
Re: [VOTE] promoting the selectOneRow component to Tomahawk
I think it would be useful to have an answer to my question first. *Is* it possible to do this?

<h:dataTable ...>
  <s:selectOneRow/>
</h:dataTable>

If so, then moving from the current approach to using an attribute will remove existing functionality. Ok, with a sandbox component that is allowed, but the above looks useful. Unfortunately I'm really busy on other things at the moment, and don't have time to look myself. Regards, Simon

Hazem Saleh schrieb: Any new votes? On Sun, Jun 15, 2008 at 4:13 PM, simon [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: I only suggested leaving it as it is *if* it is possible to use this component with tables other than t:dataTable. On Sun, 2008-06-15 at 15:57 +0300, Hazem Saleh wrote: OK Let's make this a vote; we now have: - 2 votes for supporting the feature inside the t:dataTable. - 1 vote for leaving the current component as is. On Sun, Jun 15, 2008 at 3:13 PM, simon [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: A separate component would be nice if it could be applied to any table, e.g. the h:dataTable. On Sun, 2008-06-15 at 13:40 +0300, Hazem Saleh wrote: +1 for Matzew's idea. I think that the old component syntax was suitable for the component when it was in the sandbox phase. I will implement this idea, it is good. Thanks Matzew! On Sun, Jun 15, 2008 at 11:16 AM, Matthias Wessendorf [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: On Sat, Jun 14, 2008 at 6:16 PM, Hazem Saleh [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Hi Team, After updating the selectOneRow component's documentation. just curious, what is this component about? Using it like this?

<t:table>
  ...
  <s:selectOneRow />
</t:table>

If so, how do you declare multiple row selection? Trinidad's table doesn't treat that as a special case. It just has an attribute for that, where you specify the selection type via an enum (single/multi). Greetings, Matthias I wish to promote it to the next Tomahawk release.
[+1] for agreeing with promoting the component to the next Tomahawk release. [-1] for disagreeing with promoting the component to the next Tomahawk release.
Re: [trinidad] Why input type=hidden name=javax.faces.ViewState ... does not render its id?
Leonardo Uribe schrieb: On Wed, Jun 11, 2008 at 11:24 PM, Matthias Wessendorf [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: I want to know if any developer has a reason to avoid rendering this id, and if there are no objections I'll commit this change in 72 hours (doing this, s:inputSuggestAjax and s:tableSuggestAjax could work with trinidad and/or facelets and solve TOMAHAWK-1157). so, I guess I am not 100% getting your mail. why is the id needed ? For detecting when a request is a postback, using the procedure in the documentation. If it is a postback, the state is restored and the full lifecycle occurs. If not, a new view is created. Who detects postback in this way? the jsf implementation used! I think Matthias meant that the html name attribute is all that is significant for html. That is what gets used in creating the http post data. The id attribute is only relevant for javascript and css. And css certainly shouldn't be accessing this field. Do you want the id attribute set so that javascript associated with AJAX-type controls can find this viewstate field? Regards, Simon
Re: [trinidad] Why input type=hidden name=javax.faces.ViewState ... does not render its id?
Matthias Wessendorf schrieb: On Thu, Jun 12, 2008 at 7:02 AM, Leonardo Uribe [EMAIL PROTECTED] wrote: On Thu, Jun 12, 2008 at 8:51 AM, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote: Leonardo Uribe schrieb: On Wed, Jun 11, 2008 at 11:24 PM, Matthias Wessendorf [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: I want to know if any developer has a reason to avoid rendering this id, and if there are no objections I'll commit this change in 72 hours (doing this, s:inputSuggestAjax and s:tableSuggestAjax could work with trinidad and/or facelets and solve TOMAHAWK-1157). so, I guess I am not 100% getting your mail. why is the id needed ? For detecting when a request is a postback, using the procedure in the documentation. If it is a postback, the state is restored and the full lifecycle occurs. If not, a new view is created. Who detects postback in this way? the jsf implementation used! I think Matthias meant that the html name attribute is all that is significant for html. That is what gets used in creating the http post data. The id attribute is only relevant for javascript and css. And css certainly shouldn't be accessing this field. yup Do you want the id attribute set so that javascript associated with AJAX-type controls can find this viewstate field? why not getElementsByName[0] ? Well, in general document.getElementById is both faster and more elegant. So in this case, rendering this component with id=clientId seems reasonable. Hmm... but it is possible that there are multiple view state fields in the page, one per form. So perhaps using getElementsByName is a good idea in this specific case. Question: in systems like portals, how does replacing the viewstate work? Presumably we would only want to replace the viewstate for those hidden inputs that really came from the same jsf view? Regards, Simon
Re: [trinidad] Why input type=hidden name=javax.faces.ViewState ... does not render its id?
Mario Ivankovits schrieb: Hi! why not getElementsByName[0] ? Well, in general document.getElementById is both faster and more elegant. So in this case, rendering this component with id=clientId seems reasonable. Hmm... but it is possible that there are multiple view state fields in the page, one per form. So perhaps using getElementsByName is a good idea in this specific case. And also ensure that every ViewState will be replaced, and not only the first [0] one. When you have multiple JSF forms (which is valid), each ViewState needs to be replaced. Yes, but only when they came from the same view! I don't know much about JSF portal bridge stuff. Can this result in a page containing a form that was rendered via JSF from host A, and a different form that was rendered via JSF from host B? If so, then a PPR response from host A should not update both forms. Regards, Simon
Re: [VOTE RESULT] myfaces-extensions as a new myfaces sub-project
Hi Gerhard, Gerhard Petracek schrieb: this vote passed with 100 % +1 The rule for any vote is that it requires at least 3 +1s from PMC members in order to pass. Note that votes from committers and users are an important *influence*. But in the end it is the PMC which has been appointed by the ASF board as responsible for overseeing the project [1] Of course not every issue is an official vote; discussions about architecture, bugfixes, etc are an issue of consensus and not a vote, even though people express themselves as +1/-1. But I think creating a new sub-project is definitely an official VOTE issue. So could you please list the actual votes, and whether the voter is a PMC member or not? [1] On a side note, if anyone knows a committer who has been involved on a regular basis for a year, and is not on the PMC then I suggest pointing this out in an email to [EMAIL PROTECTED] Thanks, Simon
Re: [VOTE] Apply myfaces builder plugin for myfaces core 1.2 and tomahawk 1.2
This isn't a vote to do a release, or to start a new project, so I think simple consensus is enough, i.e. this vote passes. Does the comment below ("not apply fully") mean that annotations will be added to the code, and the make-config task will be run to generate the myfaces-metadata.xml file, but that otherwise everything still uses myfaces-faces-plugin to build as before? If so, +1 from me too. Although I obviously look forward to having myfaces-builder-plugin used for core12, it would be better to prove it with core11 and tomahawk releases first. But adding annotations does not harm the current build process. Regards, Simon

Leonardo Uribe schrieb: In my opinion yes, so if there are no objections I'll commit the proposed code, but on the point: 1. Apply myfaces-builder-plugin on myfaces core 1.2 (necessary to build a correct myfaces-metadata.xml for tomahawk 1.2) I'll not apply the plugin fully, just for component class generation and metadata building, as suggested. regards Leonardo Uribe On Fri, Jun 6, 2008 at 7:19 AM, Leonardo Uribe [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: One question: we have some positive votes here. Are these votes enough? regards Leonardo Uribe On Wed, Jun 4, 2008 at 4:55 AM, Hazem Saleh [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: +1 On Wed, Jun 4, 2008 at 12:01 PM, Bruno Aranda [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: +1 (both points) 2008/6/4 Leonardo Uribe [EMAIL PROTECTED] mailto:[EMAIL PROTECTED]: +1 On Tue, Jun 3, 2008 at 10:55 PM, Leonardo Uribe [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Hi The code necessary for tomahawk 1.2 is ready for commit. To do this, it is necessary to apply the myfaces builder plugin on myfaces core 1.2.
The work on myfaces core 1.2 can be seen here: http://svn.apache.org/repos/asf/myfaces/myfaces-build-tools/branches/builder_plugin/bigtest/core_trunk_1.2.x/ There are still some details that I'm resolving (I want to remove myfaces-faces-plugin, and generate everything with myfaces-builder-plugin), but the code is ready for a vote. It uses source annotations (instead of the doclets used on 1.1; see the myfaces-builder-annotations submodule). To do this, myfaces-faces-plugin was first temporarily modified to generate myfaces-builder-annotations automatically, and then the upgrade was done (adding additional missing info). The Clirr report does not show any problems. The code for tomahawk 1.2 can be seen here http://svn.apache.org/repos/asf/myfaces/myfaces-build-tools/branches/builder_plugin/bigtest/tomahawk12_trunk/ It uses the new unpack mojo, which makes maintaining this project much easier. The idea of the unpack goal is to extract parts of the tomahawk 1.1 code and resources automatically, keeping this project relatively small. In conclusion, the vote is for these parts: 1. Apply myfaces-builder-plugin on myfaces core 1.2 (necessary to build a correct myfaces-metadata.xml for tomahawk 1.2) 2. Add two modules called core12 and sandbox/core12 to tomahawk, which contain 1.2-specific code for tomahawk. Suggestions are welcome. regards Leonardo Uribe -- Hazem Ahmed Saleh Ahmed http://www.jroller.com/page/HazemBlog
Re: [VOTE] To have MyFaces Alchemy as a subproject
Jihoon Kim schrieb: I was hoping to start a vote on the creation of MyFaces Alchemy as a subproject to hold bridging of various Web 2.0 and new technologies to JSF. Even if you have voted in a previous post, please vote here as well to get an accurate count and an accurate viewpoint. The initial batch of code that integrates Adobe Flex [open sourced through the MPL, which is approved within Apache] with JSF has been written and can be viewed within the following JIRA link [recently made some improvements within the code] = https://issues.apache.org/jira/browse/TOMAHAWK-1250 If the vote is affirmative then I hope that MyFaces will propose to the Incubator PMC to accept MyFaces Alchemy as a Podling within the Incubator. For ref, the proposal has been written up within the following Incubator wiki page = http://wiki.apache.org/incubator/MyFacesAlchemyProposal Thanks!!! I'm still voting -1 on having a flash-based project within MyFaces unless someone can show that this code works with a non-proprietary Flash player. I'd be happy to see this project succeed, but not here at Apache. This foundation exists to promote open source software, not provide wrappers for proprietary tools. Regards, Simon
Re: Adobe Flex components as MyFaces JSF components
I'd vote -1 on accepting this project into myfaces. I don't think the idea of a bridge between jsf and flex is a bad one, but I don't believe that a wrapper for a proprietary tool belongs here at Apache. I would suggest sourceforge instead. Myfaces members who are interested in flex (eg Hazem, Grant) can of course equally well work with a sourceforge project. We could certainly add a link from our pages pointing at related projects, such as this one, either on the wiki or on the main site itself. And I think it would be quite reasonable for release announcements to be posted on the myfaces user list; people subscribed to the myfaces list might be interested. BTW, until javafx is properly open-source, I would probably also vote -1 to a wrapper around that being hosted here. Regards, Simon Hazem Saleh schrieb: If the idea is accepted, I can give a part of my time to implement the standard converters, validators and other stuff. IMO, alchemy is a very nice name (I really like it :) ). +1 for Gerhard's idea. On Wed, May 14, 2008 at 1:20 AM, Jihoon Kim [EMAIL PROTECTED] wrote: Hi, thanks for the feedback! Yes, I was intending to support the other jsf artifacts such as converters and validators in the future. In the contribution, I did create the components for Flex's validators and converters. But since they belong in a swf file, they technically are actual jsf components [in order to keep to the design]. Yet since all components that take input and update the model extend from UIInput, I do not think there will be too many issues in supporting the regular jsf converters and validators. Of course when the contribution does get accepted, I do plan on investing my free time in creating support for standard converters and validators as well as other areas that I wish to improve upon. I think that's a good idea, since in the future I was hoping to check out javafx and other technology as well. Thanks!!!
On Tue, May 13, 2008 at 2:27 PM, Gerhard Petracek [EMAIL PROTECTED] wrote: hello, i had just a very quick look at it. do you plan to support existing jsf artifacts (e.g. converters, validators, ...)? @subproject and name: for each topic you have to start at least a vote. what about the following idea: let's start a subproject for component libs like alchemy. in the future it might be nice to also have a component lib for e.g. javafx, ... [new subproject] |_ [alchemy] |_ [...] I would suggest that the best place would be as a new subproject, like orchestra or the portlet bridge are. It would definitely belong in the MyFaces family (rather than anywhere else in Apache) but I don't think it fits as part of Tomahawk. As I noted earlier, people who use tomahawk won't always want Flex, and people who want Flex won't always want the other tomahawk components. It could also be a new myfaces commons module, but we haven't really figured out how to structure those anyway. And it isn't really a commons module, but a component library just like tomahawk is. So is there any documentation regarding how code becomes a subproject of MyFaces? I would like to look into that area, since if that is the path that might be taken, I would like to know of the process ahead of time. I did previously read through the process within the incubator, but wasn't sure if that was solely for a standalone project or applies to subprojects as well. In the case that the contribution does get accepted and does become a subproject, I even have a name that I would like to propose, which is alchemy. Thanks!!! http://www.jroller.com/page/HazemBlog
Re: [Orchestra] Disabling Persistence Conversation Feature
Mario Ivankovits schrieb: Hi! I tried deleting the persistence-related part from the spring configuration (application-context.xml) but had no success. I encountered exceptions thrown by the Orchestra Conversation Interceptor.

Not configuring the persistence-related advice (e.g. the persistentContextConversationInterceptor) on the scope should be sufficient. What exceptions do you encounter?

For conversations *with* persistence support, you'll have something like this in the spring config file:

<entry key="conversation.manual">
  <bean class="org.apache.myfaces.orchestra.conversation.spring.SpringConversationScope">
    <property name="timeout" value="30"/>
    <property name="advices">
      <list>
        <ref bean="persistentContextConversationInterceptor"/>
      </list>
    </property>
  </bean>
</entry>

If you don't need persistence, just leave out the advices property. It is the persistentContextConversationInterceptor that does all the work of binding a PersistenceContext to a conversation instance.

And in future, please ask questions on the user list. It's better for *you* because there are lots of very competent people who subscribe to the users list but not the dev list. And all developers subscribe to both.

Regards, Simon
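To illustrate "just leave out the advices property", here is a sketched variant of the same scope definition without persistence support (bean and class names copied from the message above; exact attributes may differ between Orchestra versions):

```xml
<!-- Conversation scope WITHOUT persistence: no advices property, so the
     persistentContextConversationInterceptor is never attached and no
     PersistenceContext is bound to the conversation. -->
<entry key="conversation.manual">
  <bean class="org.apache.myfaces.orchestra.conversation.spring.SpringConversationScope">
    <property name="timeout" value="30"/>
  </bean>
</entry>
```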
Re: svn commit: r653506 - in /myfaces/tobago/trunk: core/src/main/java/org/apache/myfaces/tobago/ajax/api/ core/src/main/java/org/apache/myfaces/tobago/application/ core/src/main/java/org/apache/myfac
[EMAIL PROTECTED] schrieb: Author: bommel Date: Mon May 5 08:40:29 2008 New Revision: 653506 URL: http://svn.apache.org/viewvc?rev=653506view=rev Log: (TOBAGO-662) Use LOG.isInfoEnabled() for LOG.info() Hi Bernd, Just wondered if you know about just4log.sourceforge.net. That's an alternative to manually wrapping all the logging statements... Regards, Simon
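The manual change in TOBAGO-662 follows the standard guard pattern, sketched here with java.util.logging (the Tobago code itself uses commons-logging's isInfoEnabled(); the timesBuilt counter exists only to demonstrate that the message is never constructed when the guard is off — exactly the work just4log saves you from writing by hand):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

/** Guarded logging: the message string is only built when INFO is enabled. */
public class GuardedLogging {
    static final Logger LOG = Logger.getLogger(GuardedLogging.class.getName());
    static int timesBuilt = 0; // counts how often the log message was constructed

    static String expensiveDescription() {
        timesBuilt++; // stands in for costly toString()/concatenation work
        return "component tree details ...";
    }

    static void logEntry() {
        if (LOG.isLoggable(Level.INFO)) {            // JDK analogue of LOG.isInfoEnabled()
            LOG.info("entering: " + expensiveDescription());
        }
    }

    public static void main(String[] args) {
        LOG.setLevel(Level.WARNING); // INFO disabled
        logEntry();
        System.out.println(timesBuilt); // 0: the argument was never evaluated
    }
}
```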
please disable continuum
Hi, Could someone please disable continuum for a day or two? The svn server is obviously a bit unstable at the moment, and the enormous streams of error messages that continuum is generating as a result are rather annoying. Let's give things a while to stabilise... Regards, Simon
Re: Problem with tomahawk sample executing from websphere Need Help
Nutulapati, Krishna schrieb: Hello All, I executed one of the tomahawk samples from tomcat, which is working fine, but I'm getting the following error in the console with websphere, and seeing just a blank screen in the browser: [4/24/08 17:35:59:135 CDT] 001e MyfacesConfig E org.apache.commons.logging.impl.Jdk14Logger error Both MyFaces and the RI are on your classpath. Please make sure to use only one of the two JSF-implementations. The following are the exact jar files I'm using. I didn't find where I'm using 2 implementations of JSF. Can I have any suggestions? I'd appreciate your early response. Does websphere not support tomahawk? I'm using jdk 5.0, websphere 6.1. I didn't find any java compilation errors in RAD though. Thanks Websphere (and other containers) have a directory of common jars that are visible to all webapps/j2ee apps. I don't know exactly what the directory is called for websphere, but in a plain Tomcat container the jars are in $base/lib. You will need to remove the jsf-api.jar and jsf-impl.jar files from that directory. Of course that will affect any other jsf applications that run in the same websphere installation. Ideally, containers would have a config option that could be used to select which of the common jars should be visible to apps running in the container, but AFAIK no container provides that option. Just selecting child-first classloading order is not sufficient, unfortunately. Regards, Simon
Re: Problem with tomahawk sample executing from websphere Need Help
Gerald Müllan schrieb: Hi, see also: http://wiki.apache.org/myfaces/Websphere_Installation From my experience it is not that easy to remove the RI, but wish you good luck. Apart from this, please ask such kind of questions on the user list, this list is for developer discussions only. Hmm..I'm quite surprised that the approach documented for Websphere6.1 works (just using parent last order). JSF looks for a faces-config.xml file in every jar in the classpath, and runs them all. So with this setup, it will run the one from the default JSF impl (Mojarra) *as well* as the one from MyFaces. That doesn't sound healthy to me. But the page says it works, so ...
Re: [VOTE] Add MyfacesBuilderPlugin to buildtools branch and use it on myfaces 1.1
Leonardo Uribe schrieb: MyfacesBuilderPlugin is now available for use on myfaces 1.1, so we can officially vote about which plugin to use on myfaces 1.1, 1.2 and tomahawk for code generation. +1 for myfaces-core 1.1.x and tomahawk now! (+1 for myfaces-core 1.2.x eventually, but there's no hurry..)
Re: [VOTE] Add MyfacesBuilderPlugin to buildtools branch and use it on myfaces 1.1
Werner Punz schrieb: +1 definitely for jsf 1.1 we need something working and well documented. Just to be clear: the options for the next core-1.1 and tomahawk releases are: (a) the new myfaces-builder-plugin (b) the myfaces-faces-plugin (formerly called trinidad-faces-plugin) as currently used in trinidad/core-1.2.x (c) stay with the current solution. Info about approach (a) can be found here: http://wiki.apache.org/myfaces/MyfacesBuilderPlugin Background on the whole issue (including a description of the current solution) can be found here: http://wiki.apache.org/myfaces/Code_Generation This vote is specifically about whether to choose (a) or not. Choosing (a) means updating source in core-1.1 and tomahawk-1.1 trunk to add javadoc annotations to the source code. Hence the vote, to make sure everyone is happy for that to be done. Werner, did you mean +1 for (a) or something else? Regards, Simon
Re: possible replacement for our kupu based html editor
Werner Punz schrieb: Mario Ivankovits schrieb: No way - http://www.cdolivet.net/editarea/editarea/docs/license.html - it's LGPL. Probably contacting the author and asking him to change the license or dual-license it might be worth a try. Urgs, sorry about that; I thought the website stated lgpl and apache license, I have to recheck that. The sourceforge website does state that it is dual-licensed, ASL and LGPL. I'm not sure this is appropriate, though. He is very clear in the description that this is a *code* editor, and not a general-purpose text editor. So for example, the syntax highlighting is good, but there are no bold/italic/etc controls that I can see. Which tomahawk component are you referring to as "our kupu based html editor"? I cannot see any t:htmlEditor or s:htmlEditor component.. Regards, Simon
Re: svn commit: r649387
Hi Leonardo, [EMAIL PROTECTED] schrieb: Author: lu4242 Date: Fri Apr 18 00:00:16 2008 New Revision: 649387 URL: http://svn.apache.org/viewvc?rev=649387&view=rev Log: Generation of .tld I just wanted to say I really appreciate all the work you're putting into this. I'm back from holiday but still don't have much spare time to hack on this until after next weekend. However I have been trying to keep up with all the commits. It's looking really good - and pretty close to done! I particularly like the velocity scripts; understanding and changing the generated output (if needed) should be quite easy. Regards, Simon
Re: MyfacesBuilderPlugin
Leonardo Uribe schrieb: On Wed, Apr 16, 2008 at 12:33 PM, Mike Kienenberger [EMAIL PROTECTED] wrote: Another alternative to option 1 is to #parse($fileName) or #include($fileName). You can specify the filename externally. This is probably the best solution, so long as the contents of the file can be included as-is.

The only problem with this solution is that you cannot do something like this in faces-config-base.xml:

<faces-config>
  <application>
    <!-- custom code -->
  </application>
</faces-config>

and include only the application part (which is what I want, because it is clearer for the developer). I have implemented option 1; anyway, using #parse or #include is also available.

Yes, the original plugin could take a full config file, and extract the children of the document root for merging into the final result. That was a little cleaner. I don't know enough about velocity to know if there is a way to achieve that. I guess a builder-plugin utility class could be written to take a filename as input, and return the content of the root element as a string. Then that could be called from the velocity template.

Right now, generation of faces-config, .tld (or any config file you want), component classes and tag classes works. Now I'm doing some big tests (using it in myfaces 1.1 to generate faces-config, .tld, and tag classes, then trying it on tomahawk to generate all the stuff). The one part of this plugin I still want to enhance: it is very common that component generation can be done fully (an example is the javax.faces.component.html classes, minus HtmlDataTable). The question is how to add the definition of such a component without creating a .java file with annotations or doclets (maybe a part that uses groovy). My first idea is to create a package-scope .java class like _HtmlCommandButton and add the needed stuff there, but this file is then compiled into the jar (not a big problem, since it has no side effects).
So I'm looking for a cleaner way to do this; suggestions are welcome. I'm ok with package-scoped classes for this case. I think it is clear to developers what is going on, which is more important than saving a few kb in the resulting jar. But of course the maven-jar-plugin could be configured to exclude these empty classes when building the jarfile if this is important. In addition, if javadoc is generated with package scope enabled, then we get some helpful documentation about what is going on. For java1.5, it is possible to write a package-info.java file. This is intended to replace package.html; it allows package docs to be written as javadoc, rather than an html fragment. It also allows package-scope annotations to be added, which can be introspected at runtime. This would be a possible place to put annotations that don't have a real class to be attached to. Of course this does result in another .class file in the jar, but that didn't seem to bother Java's architects. Does the meta-data for these auto-generated components really have *no* properties that can be derived from introspection of a java class? One reason for the annotation-based approach is that properties can be determined by analysing code rather than simply accepting string data entered by the user. That allows at least some of the config to be checked by the compiler, supported during refactoring, etc. If a project uses real annotations (rather than javadoc annotations) then it becomes even more useful to have this info expressed via .java files (rather than an .xml config file or similar). Regards, Simon
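The utility class suggested above could look something like this minimal sketch (class and method names are hypothetical, not part of the builder plugin): it parses an XML file and serializes just the children of the document root, so a velocity template could call it to inline only the <application> part of a faces-config-base.xml.

```java
import java.io.Reader;
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

/** Reads an XML document and returns the serialized children of its root
 *  element as a string, dropping the root element itself. */
public class RootContentExtractor {

    public static String extractRootContent(Reader xml) throws Exception {
        Node root = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(xml))
                .getDocumentElement();
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        StringWriter out = new StringWriter();
        NodeList children = root.getChildNodes();
        for (int i = 0; i < children.getLength(); i++) {
            // serialize each child (elements, comments, text) in document order
            t.transform(new DOMSource(children.item(i)), new StreamResult(out));
        }
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String base = "<faces-config><application><!-- custom code --></application></faces-config>";
        System.out.println(extractRootContent(new StringReader(base)));
    }
}
```

A velocity template could then use it as `$utils.extractRootContent(...)` style helper, keeping the merge logic out of the template itself.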
Re: MyfacesBuilderPlugin
Leonardo Uribe schrieb: Hi, I have a design question. I'm working on generating component tag classes using velocity. Here it is common to hit situations where you need utility methods. There are several ways to provide them:

1) Implement the methods in a java class, and load it via a macro file inside the template:

## [ Setting Utility Classes to use ]
#set($utils = $classes.forName("org.apache.commons.lang.StringUtils").newInstance())

In this case, we can copy org.apache.myfaces.buildtools.maven2.plugin.faces.util.Util from myfaces-faces-plugin and use it inside the templates, like this:

$utils.lowerCase($field.getAttributeValue("name"))

2) Use a file of velocity macros and implement the helpers there. Inside, we would still need to use StringUtils as in (1), but the template designer doesn't see this.

3) Create methods on each Model and XXXMeta class. Sometimes this is unavoidable (like getting the properties from a component) and it is cleaner. For example:

package ${component.tagPackage};

public class ${component.tagName} {
#foreach( $property in ${component.propertyList} )
    // getter and setter methods
#end
}

Here getTagPackage and getTagName are derived properties of the tag class. Which option would be better? If there are no suggestions, I will go for options 1 and 3.

+1
Re: nasty problem in server side state saving
Sochor Zdeněk schrieb: Hi, Mario Ivankovits napsal(a): ... with serialize in state disabled. I've created a small test case which shows that the attributes map is just copied over into the state. Which means that each and every component shares exactly the same map. Any change to this map will be reflected in ALL saved states.

It's because of the wrong constructor in the api's _ComponentAttributesMap class; it assigns the map directly. 1.1 trunk:

_ComponentAttributesMap(UIComponent component, Map attributes)
{
    _component = component;
    _attributes = attributes;
}

should be:

_ComponentAttributesMap(UIComponent component, Map attributes)
{
    _component = component;
    _attributes = new HashMap();
    _attributes.putAll(attributes);
}

The same in 1.2 trunk:

_ComponentAttributesMap(UIComponent component, Map<Object, Object> attributes)
{
    _component = component;
    _attributes = attributes;
}

should be:

_ComponentAttributesMap(UIComponent component, Map<Object, Object> attributes)
{
    _component = component;
    _attributes = new HashMap<Object, Object>();
    _attributes.putAll(attributes);
}

But as Mario says: correct would be to clone the map, but since that may not work, at least a shallow copy of the map should be saved (when serialize in state=false) to avoid this problem.

Your solution fixes one more level, but not every possibility. It's certainly better, but not complete. If the attributes map has a mutable object in it (eg a StringBuffer, or something more complex) then the problem remains; a change to the object state via one UIViewRoot will cause it to change in the other. Using java.io serialization forces a real deep clone which solves this issue. But it does then require that every object put into a component's attributes is serializable.

Regards, Simon
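The difference between the proposed shallow copy and the deep clone that serialize-in-state performs can be demonstrated in isolation (StateCopyDemo is illustrative only, not MyFaces code):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

/** Illustrates the aliasing problem: a shallow copy still shares mutable
 *  values, while a serialization round-trip (what "serialize in state" does)
 *  yields a fully independent deep copy. */
public class StateCopyDemo {

    @SuppressWarnings("unchecked")
    static Map<Object, Object> deepCopy(Map<Object, Object> src) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        out.writeObject(src);
        out.flush();
        return (Map<Object, Object>) new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray())).readObject();
    }

    public static void main(String[] args) throws Exception {
        Map<Object, Object> attrs = new HashMap<Object, Object>();
        attrs.put("label", new StringBuffer("v1"));

        Map<Object, Object> shallow = new HashMap<Object, Object>(attrs); // the proposed fix
        Map<Object, Object> deep = deepCopy(attrs);                       // serialize-in-state

        ((StringBuffer) attrs.get("label")).append("-changed");

        // The shallow copy still sees the mutation; the deep copy does not.
        System.out.println(shallow.get("label")); // v1-changed
        System.out.println(deep.get("label"));    // v1
    }
}
```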
Re: MyfacesBuilderPlugin
Leonardo Uribe schrieb: Hi, The problem with xml files to make the faces-plugin tests work is now fixed.

Great.

I have a question about how myfaces-metadata works. The idea is to have one file per jar, and when the model is read, it should scan every artifact and merge it with the generated myfaces-metadata.xml. Am I right or wrong? (This is what I'm doing right now, and in my opinion it is the preferred way.)

I'm not quite clear what your description above means. I think we are talking about the same thing, but just to be clear this is how I would see it working:

== for goal build-metadata:

  start with an empty model
  for each jarfile containing a META-INF/myfaces-metadata.xml file:
    read that myfaces-metadata.xml file
    add the resulting objects into the model [1]
  run the ModelBuilder for the current project, which adds more objects to the model
  save the model into META-INF/myfaces-metadata.xml in the current project

An alternative would be to do the merging just at the xml level, then build a model from the resulting merged xml file. That also seems reasonable.

== for other goals (eg generate faces.xml, generate tag classes):

  start with an empty model
  read META-INF/myfaces-metadata.xml for the current project only
  add the resulting objects to the model
  pass the model object to the appropriate generator class [2]

[1] Hmm.. might need to somehow detect and handle duplicate data. In particular, tomahawk will depend on both myfaces-api and myfaces-impl. But the myfaces-impl META-INF/myfaces-metadata.xml file will have a copy of all the data from the myfaces-metadata.xml contained in the myfaces-api jarfile. So if *all* jars in the classpath are processed, the data from myfaces-api.jar will be processed twice. Options I see are:
(a) don't worry; the data will just be identical
(b) check that if a model object is being overwritten, the new data is identical
(c) have the plugin configured with an explicit list of jars to process metadata from. Then in the pom it must be configured so that myfaces-impl is processed and myfaces-api is ignored. Then make it an error for the same model object to be defined twice.
(d) have a myfaces-metadata.xml file *not* include data inherited from parent projects. That's cleaner in a way, but means that when processing other goals we cannot just load the metadata file from the local project but need to merge in all the ancestor projects too. Ecch.
(e) in the myfaces-metadata.xml, somehow mark entries with the jarfile they came from.

Option (c) is probably the safest.. and not too complicated.

[2] eg something that executes a velocity template against the model

Regards, Simon
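Option (b) from the list above — tolerate identical duplicates, reject conflicts — could be sketched like this (ModelMerger and the string-valued metadata are stand-ins for the real builder-plugin model classes):

```java
import java.util.HashMap;
import java.util.Map;

/** Merge policy sketch: when metadata from several jars is combined, a model
 *  object may appear twice only if both definitions are identical (e.g. the
 *  myfaces-api data seen directly and again via myfaces-impl). */
public class ModelMerger {
    private final Map<String, String> componentsByClass = new HashMap<String, String>();

    /** metadata is a placeholder for the component's full parsed definition. */
    public void add(String className, String metadata) {
        String existing = componentsByClass.get(className);
        if (existing == null) {
            componentsByClass.put(className, metadata);   // first definition wins
        } else if (!existing.equals(metadata)) {
            // genuinely conflicting definitions: fail the build loudly
            throw new IllegalStateException("Conflicting metadata for " + className);
        }
        // else: identical duplicate from a second dependency path, ignore it
    }

    public int size() {
        return componentsByClass.size();
    }
}
```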
Re: JspStateManagerImpl into shared?
Mario Ivankovits schrieb: Ping! It seems that Orchestra has to implement a StateManager which holds the view state in the conversationContext instead of the session. At the moment it seems that large portions of JspStateManagerImpl can be reused, but requires to copy it over into Orchestra. With slight refactoring of JspStateManagerImpl it should be possible to just replace the actual storing/loading stuff. Does this qualify to move JspStateManagerImpl into shared? Or should I better copy the source over? There are some jira issues and email threads that might be relevant: http://issues.apache.org/jira/browse/MYFACES-1791 http://issues.apache.org/jira/browse/TRINIDAD-816 http://www.nabble.com/state-saving-%28StateManager%29-and-multiple-frames-%28HACKATHON-points-4-and-6%29-td13913845.html http://www.nabble.com/RE%3A-Proposal-to-Externalize-the-ViewCache.-td13516998.html Regards, Simon
Shared module (was Re: JspStateManagerImpl into shared?)
Manfred Geiler schrieb: Mario, you are not alone in hating the shared concept. ;-) This is exactly where the myfaces-commons-xxx library comes into play, I often mentioned before. What we need is a module, that 1) has a super stable API 2) is used (ie. shared) by the myfaces core(!) as well as other myfaces projects Personally, I think the shared concept works fine. One problem with a commons jar used by core is that Sun Mojarra will need two jars (jsf-api, jsf-impl) and myfaces will need three (jsf-api, jsf-impl, commons-whatever.jar). This is rather ugly. I'm also sceptical that it will be possible to create a super stable API. Just look at what is in shared at the moment! And in addition to the API, the *implementation* will also need to be super-stable. For example, if a Tomahawk issue (whether bug or new feature) needs changes in the commons module to resolve, then deploying that commons change will immediately impact myfaces-core too. Suddenly people will be running a version of myfaces-core together with a commons version that it was never tested against. I don't like that idea at all. It was precisely this issue that the shared repackaging idea was invented for. We can improve the build process by having a maven plugin that imports-then-renames, rather than messing with the shared project to generate the per-user variant. That's really quite simple, and could be a nice general-purpose plugin. The maven-shade-plugin does something similar, but at the binary level. Earlier releases of myfaces did not include the shared source in the -sources jar which was a major nuisance. But that is now fixed. So for most users, there is no need to ever deal with the original shared project. Regards, Simon
Re: JspStateManagerImpl into shared?
Mario Ivankovits schrieb: Hi! Does this qualify to move JspStateManagerImpl into shared? Or should I better copy the source over? There are some jira issues and email threads that might be relevant: Thanks for the links, for me it seems there is need for this for other experiments too. So, I'd move this class to shared and refactor it a little bit so that it can be easier subclassed and allow only the bits changed one is interested in. However, I don't want to go to solve the window-problem for MyFaces at all, but just for Orchestra which seems to be easy due to the already existent conversationContext parameter. Ah. The point you were making is that things other than myfaces-core might want a basic JspStateManagerImpl implementation for them to subclass. And that is what shared is for. Whether JspStateManagerImpl could be refactored to support alternative strategies is a different issue. Sorry about the red herring. Orchestra having its own JspStateManagerImpl sounds interesting though. Enabling this on Sun Mojarra for example will quite radically change the way that a JSF app on Mojarra performs. That's not really Orchestra's role. What is *really* needed is for the StateManager spec to have some mechanism to externalise the state, then have Orchestra override just that. Is it not possible to apply the decorator pattern to whatever StateManager implementation the current JSF implementation provides? Regards, Simon
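The decorator approach Simon asks about could be sketched like this; SimpleStateManager and the other classes are simplified stand-ins for the real javax.faces.application.StateManager API, just to show the shape of wrapping the implementation's manager and overriding only where the state is stored:

```java
import java.util.HashMap;
import java.util.Map;

/** Stand-in for the StateManager contract: capture state, store it, restore it. */
abstract class SimpleStateManager {
    /** Builds the serializable state for a view (expensive, impl-specific). */
    abstract Object captureState(String viewId);
    abstract void store(String viewId, Object state);
    abstract Object restore(Object token);

    /** Template method: capture then store, returning a retrieval token. */
    Object saveView(String viewId) {
        store(viewId, captureState(viewId));
        return viewId;
    }
}

/** Stand-in for the JSF implementation's default manager (session storage). */
class DefaultStateManager extends SimpleStateManager {
    private final Map<Object, Object> session = new HashMap<Object, Object>();
    Object captureState(String viewId) { return "state-of-" + viewId; } // pretend tree snapshot
    void store(String viewId, Object state) { session.put(viewId, state); }
    Object restore(Object token) { return session.get(token); }
}

/** Decorator: reuses the wrapped manager's state-capturing logic but redirects
 *  storage into a conversation-scoped map (what Orchestra would provide). */
class ConversationStateManager extends SimpleStateManager {
    private final SimpleStateManager delegate;
    private final Map<Object, Object> conversationContext = new HashMap<Object, Object>();

    ConversationStateManager(SimpleStateManager delegate) { this.delegate = delegate; }
    Object captureState(String viewId) { return delegate.captureState(viewId); }
    void store(String viewId, Object state) { conversationContext.put(viewId, state); }
    Object restore(Object token) { return conversationContext.get(token); }
}

public class StateManagerDecoratorDemo {
    public static void main(String[] args) {
        SimpleStateManager mgr = new ConversationStateManager(new DefaultStateManager());
        Object token = mgr.saveView("/page1.jsp");
        System.out.println(mgr.restore(token)); // state-of-/page1.jsp
    }
}
```

The real StateManager API has no clean "store" seam, which is exactly Simon's point: without a spec-level hook to externalise the state, a decorator can only wrap the coarse save/restore methods.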
Re: Shared module (was Re: JspStateManagerImpl into shared?)
Matthias Wessendorf schrieb: I'm also sceptical that it will be possible to create a super stable API. Just look at what is in shared at the moment! And in addition to the API, the *implementation* will also need to be super-stable. For example, if a Tomahawk issue (whether bug or new feature) needs changes in the commons module to resolve, then deploying that commons change will immediately impact myfaces-core too. Suddenly people will be +1 that's a side effect running a version of myfaces-core together with a commons version that it was never tested against. I don't like that idea at all. It was precisely this issue that the shared repackaging idea was invented for. yup, I guess I now move to your direction, or... base-tomahawk :-) but... that doesn't fly at all. I guess what we really want is for servlet engines to explicitly permit OSGi use. Then myfaces-core can depend on version X of the common lib, while tomahawk can run in the same webapp using version Y. That would solve my major objection. Package-renaming is just a hack to get a similar effect without having to mess with classloaders. But with this setup, we would still have people complaining that they do not know where to put their breakpoints. In fact, it's even worse. At least with the current stuff, you can clearly see o.a.m.shared_core.Foo, rather than dealing with two different concurrently-loaded versions of the o.a.m.Foo class. How do people interactively debug OSGi apps, I wonder.. Regards, Simon
Re: redeploy myfaces website
I've noticed some other inconsistent stuff that needs to be fixed. I think the most important is the issue I raised before: what should the top menu bar look like for sub-projects? If you look at the core modules at the moment, those links are all specific to myfaces as a whole, not to the subproject currently being viewed. I would expect "overview" to take me to the overview for the project I am currently viewing, not to the myfaces overview. Also, while we do *currently* have a shared downloads page for all projects, and a common set of mailing lists for all projects, I don't think this will always be the case. So I don't think the download or mailing list links are appropriate here. What I've done for tomahawk, for example, is just to have Apache and MyFaces links, and nothing else. I think this is nicer. Obviously, it would be even nicer to be consistent across all projects.

core11/core12:
* icons need updating

Trinidad:
* has the faces icon on the right, not on the left.
* has no title in the first block of the left-hand pane.
* has no list-of-myfaces-projects in the left-hand pane. This might actually be nicer than replicating the list-of-projects on each page (people can always go up to the main myfaces site) but we should be consistent.
* has no link to apache in the top menu bar, just MyFaces
* has the foundation section at the top, not the bottom
* date-published format is yyyy-mm-dd, not dd MMM

Regards, Simon
Re: svn commit: r640864 - in /myfaces/tomahawk/trunk/sandbox: core/src/main/java/org/apache/myfaces/custom/captcha/ core/src/main/java/org/apache/myfaces/custom/captcha/util/ core/src/main/java/org/ap
Hi Grant, Could you please avoid combining code changes and reformatting in the same patch? It is just impossible from this patch to determine what *real* changes you have made to ComponentUtils. And I believe the coding convention here is to use spaces, *not* tabs, but you have replaced all spaces *with* tabs in ComponentUtils. What were the actual ComponentUtils changes? Thanks, Simon [EMAIL PROTECTED] schrieb: Author: grantsmith Date: Tue Mar 25 08:40:50 2008 New Revision: 640864 URL: http://svn.apache.org/viewvc?rev=640864&view=rev Log: https://issues.apache.org/jira/browse/TOMAHAWK-1216 patch applied Added: myfaces/tomahawk/trunk/sandbox/core/src/main/java/org/apache/myfaces/custom/captcha/util/CAPTCHAConstants.java myfaces/tomahawk/trunk/sandbox/core/src/main/java/org/apache/myfaces/custom/captcha/util/CAPTCHAResponseStream.java Modified: myfaces/tomahawk/trunk/sandbox/core/src/main/java/org/apache/myfaces/custom/captcha/CAPTCHARenderer.java myfaces/tomahawk/trunk/sandbox/core/src/main/java/org/apache/myfaces/custom/captcha/util/CAPTCHAImageGenerator.java myfaces/tomahawk/trunk/sandbox/core/src/main/java/org/apache/myfaces/custom/util/ComponentUtils.java myfaces/tomahawk/trunk/sandbox/examples/src/main/webapp/WEB-INF/web.xml [diff hunks for CAPTCHARenderer.java omitted: garbled in the archive and truncated mid-hunk]
Re: [Tomahawk] Commit of component generation and 1.2 modules to trunk
Leonardo Uribe schrieb: On Tue, Mar 18, 2008 at 10:54 AM, [EMAIL PROTECTED] wrote: [1] I'm not sure what the later property examples on that page are meant to be; as Leonardo has written them they are attached to no function, which is not what I had in mind... Thanks Simon for making this clearer on the wiki page. I have also added comments about how the isSetFieldMethod and isGetLocalMethod attributes for @mfp.property should work. And thanks for your additions. Is the setLocalMethod stuff specifically something that trinidad needs? I have not seen this anywhere in core or tomahawk that I remember. If so, then perhaps the wiki could say (trinidad only) or similar next to those props. On Tue, Mar 18, 2008 at 12:44 PM, Andrew Robinson [EMAIL PROTECTED] wrote: On Tue, Mar 18, 2008 at 9:54 AM, [EMAIL PROTECTED] wrote: Hi Andrew, Andrew Robinson schrieb: One major drawback to the javadoc annotation approach has been left out, and that is attribute reuse. The maven-faces-plugin allows you to import XML fragments using XPath. So in Trinidad, for onclick, onmouseover, onmouseout, etc. you can import one XML file and not have to re-define all of these. But with the javadoc approach you have to either (one) try to hack the code to extend other classes, or (two) have some weird interface usage to import these. Either way, the object hierarchy has to be hacked to get it to work. Hmm..interesting. So trinidad has cases where a class X is not related to A by inheritance, but does want to provide the same properties as A? Java currently defines implements and extends; sounds like Trinidad has invented a new OO concept, emulates :-). No, it only imports certain attributes, not all of them. Take some time to look at the trinidad-build project and how it works. It is better to see than explain.
In tomahawk, there are interfaces like org.apache.myfaces.component.UserRoleAware that define getter and setter methods for a particular group of related properties. Maybe we can do something like this: /** * @mfp.interface //or propertiesinterface or setofproperties or anything related **/ public interface UserRoleAware { static final String ENABLED_ON_USER_ROLE_ATTR = "enabledOnUserRole"; static final String VISIBLE_ON_USER_ROLE_ATTR = "visibleOnUserRole"; /** * @mfp.property **/ String getEnabledOnUserRole(); void setEnabledOnUserRole(String userRole); /** * @mfp.property **/ String getVisibleOnUserRole(); void setVisibleOnUserRole(String userRole); } Then the abstract component class can implement this interface, and finally the generated class must implement the methods. In this way we make the API clearer and cleanly remove the main advantage of using xml files. I see. Yes, where there is a group of properties to be shared between two classes, the OO way would be to declare a common interface. I guess another example is the set of html passthrough attributes that is attached to many components. The myfaces-builder-plugin code already walks interfaces looking for property definitions. Currently the normal @mfp.component annotation is looked for even on interfaces (the plugin already knows that this is an interface), but a different annotation could also be introduced. An alternative might be to have @mfp.property group="userRole" and @mfp.component usePropertyGroups="userRole, htmlAttributes", but I prefer the interface approach. Does trinidad pull subsets of properties from the myfaces-api classes? If so, then the usePropertyGroups would be necessary, as we cannot factor out interfaces for the javax.faces classes. The point you made about overriding documentation appears to be the ugliest part of the doc-annotation based approach. In the wiki page I have a delegating method just to override the comment, which is really not nice.
Any suggestions for a better answer to this would be welcome... Regards, Simon
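For readability, here is the flattened UserRoleAware interface example from the discussion above, laid out as it would appear in source. This is a sketch: the quotes around the string constants (lost in the archive) are restored, and the doc-annotations follow the @mfp naming convention from the wiki page.

```java
/**
 * Sketch of the proposed annotated interface. The @mfp.interface and
 * @mfp.property doc-annotations mark the group of properties for the
 * builder plugin to pick up.
 *
 * @mfp.interface
 */
interface UserRoleAware {
    static final String ENABLED_ON_USER_ROLE_ATTR = "enabledOnUserRole";
    static final String VISIBLE_ON_USER_ROLE_ATTR = "visibleOnUserRole";

    /**
     * @mfp.property
     */
    String getEnabledOnUserRole();

    void setEnabledOnUserRole(String userRole);

    /**
     * @mfp.property
     */
    String getVisibleOnUserRole();

    void setVisibleOnUserRole(String userRole);
}
```

The abstract component class would then declare that it implements this interface, and the generated concrete class would supply the method bodies.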
Re: [Tomahawk] Commit of component generation and 1.2 modules to trunk
Hi Bruno, The plugin goals for config-file and code generation requires a myfaces-metadata.xml file as input which contains all the info about component classes, properties, etc. For the case where tomahawk extends myfaces-api and myfaces-impl, that file is just available directly from the jar's META-INF dir. This is the step that would have to change. I suppose the plugin could try to reconstruct the necessary metadata just from the info available by analysing the .tld and faces-config.xml files for the third-party lib, and possibly using introspection on the classes. I'm not sure if all the critical info would be recoverable [1]. But if not, then the plugin could certainly get *most* of the data, and the missing bits could then be added by hand. It doesn't sound *too* difficult to code...and once the metadata has been recovered, the other goals will run fine. [1] There is a JCP in progress at the moment regarding metadata for JSF components, because IDEs are having problems implementing some gui-design features. That seems to me to imply that there is some important data that cannot be deduced from just the .tld, faces-config and classfiles. But I haven't read the JCP docs yet. Regards, Simon Bruno Aranda schrieb: Hi, is this possible with the myfaces-builder-plugin to extend components that are part of another jar library (e.g. extending a tomahawk component from a third-party lib)?. Cheers! Bruno On 19/03/2008, [EMAIL PROTECTED] mailto:[EMAIL PROTECTED]* [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: Leonardo Uribe schrieb: On Tue, Mar 18, 2008 at 10:54 AM, [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] mailto:[EMAIL PROTECTED] mailto:[EMAIL PROTECTED] [EMAIL PROTECTED] mailto:[EMAIL PROTECTED] mailto:[EMAIL PROTECTED] mailto:[EMAIL PROTECTED] wrote: [1] I'm not sure what the later property examples on that page are meant to be; as Leonardo has written them they are attached to no function which is not what I had in mind... 
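The introspection idea Simon mentions (recovering metadata from a third-party jar's compiled classes) can be sketched with java.beans.Introspector. The ThirdPartyComponent class and its styleClass property here are invented purely for illustration; a real run would point at classes named in the lib's faces-config.xml.

```java
import java.beans.BeanInfo;
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical third-party component class standing in for a real one.
class ThirdPartyComponent {
    private String styleClass;
    public String getStyleClass() { return styleClass; }
    public void setStyleClass(String styleClass) { this.styleClass = styleClass; }
}

class MetadataRecovery {
    // Recover property name -> property type for every read/write
    // property on the class, stopping at Object so inherited noise
    // like getClass() is excluded.
    static Map<String, Class<?>> recoverProperties(Class<?> componentClass)
            throws IntrospectionException {
        Map<String, Class<?>> props = new LinkedHashMap<>();
        BeanInfo info = Introspector.getBeanInfo(componentClass, Object.class);
        for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
            if (pd.getReadMethod() != null && pd.getWriteMethod() != null) {
                props.put(pd.getName(), pd.getPropertyType());
            }
        }
        return props;
    }
}
```

As the email notes, introspection alone may not recover everything (default values, literal-only flags, descriptions); those bits would have to come from the .tld, faces-config.xml, or hand-written additions.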
Re: [Tomahawk] Commit of component generation and 1.2 modules to trunk
Leonardo Uribe schrieb: On Mon, Mar 17, 2008 at 6:05 PM, Andrew Robinson [EMAIL PROTECTED] wrote: http://wiki.apache.org/myfaces/MyfacesBuilderPlugin Is this supposed to be a solution to the code replacement? I don't recall any vote to achieve a resolution on how to go about the code generation. From what this looks like, it looks worse than the current trinidad plugin. Embedding data in javadoc is really awful IMO. Or am I mistaken and this is just the tomahawk code builder? This wiki is the idea that Simon proposed for code generation, in a more concrete presentation based on the comments on the mailing list and the starting code proposed at https://svn.apache.org/repos/asf/myfaces/myfaces-build-tools/branches/skitching Actually, I have worked a lot with myfaces-faces-plugin (older trinidad build), I have full component generation for tomahawk 1.1 and 1.2 ready for commit, and really I think that myfaces-faces-plugin is a great tool (clear wiki, easy to understand and make work, but sometimes a little bit complex for uncommon tasks). If it were my decision, I would continue using myfaces-faces-plugin, commit tomahawk, upgrade components from sandbox and release. It's clear that all the solutions proposed for this issue (code generation) suck more or less. You're right. There has not been a vote yet on what direction to take, but it's unclear how to manage this (at least I don't know what to do in this case, when some developers think very differently). Suggestions about what to do are most welcome. The myfaces-builder-plugin stuff is an experiment/proof-of-concept at the current time; that's why the code is in a branches directory. No vote has been taken on using this approach; it was necessary to show that this would work before calling a vote to make a choice between this and the myfaces-faces-plugin (formerly trinidad-faces-plugin) approach.
But there is enough concrete info there now (particularly thanks to Leonardo's new wiki pages) for us to have a discussion. It's clear that the approach will work; now the community needs to say which approach they think is the most productive for the future. I would personally like to see this used for all projects: core1.1, core1.2, tomahawk, commons and eventually trinidad too. I certainly see no reason why the myfaces-builder-plugin approach cannot be used for trinidad. It's just a matter of extending the model classes that hold the metadata. And I see no theoretical reason why the existing .xml files in the build project could not even be used as input to the myfaces-builder-plugin (ie as an alternative to doc comments). Note that the reverse is not true: there is no easy way to use doc comments to feed metadata into the myfaces-faces-plugin, because that is really seriously hard-wired to assume that xml files are its input. But the most important thing to me *is* getting away from the build project with lots of xml files approach. == * Use case 1: I want to extend the t:inputDate component to add a property to a component, and alter the renderer to use that property. With the xml-files approach (myfaces-faces-plugin), I open the build project and modify an xml file to add the info using the special xml tags. I then run the plugin. I then refresh my eclipse project. I then alter the renderer to access the new property on the generated component class. Note that just looking at the base component class gives *no* info about what properties that component has. With the doc-annotation approach (myfaces-builder-plugin), I open the normal tomahawk project, and add an abstract getter method (perfectly normal java coding). I then add the appropriate annotation as a doc-annotation. I then modify my renderer (no need to run the code generator; I can write calls to the abstract method fine). 
And with the doc-annotation approach, looking at the component class shows clearly that it has that property. In addition: (a) the amount of info I need in the annotation is much less than I needed to add to the xml file because the plugin can use introspection on the abstract method declaration (b) javadoc comments attached to the method are used automatically in the .tld etc. * Use case 2: I create a new tomahawk converter class. With the xml-files approach, I open the build project and create some new xml files. Then I write the converter class in the main code project. With the myfaces-builder-plugin approach, I open the main code project, add the class and add a couple of doc-annotations. * Use case 3: I create a new project (eg commons widgets) With the xml-files approach, I need to create two maven modules, one for the code and one for the builder stuff. Somehow I need to get access to all the metadata defined by the core build project (??). I then need to create metadata files for my project that (somehow) add info about my new components. With the doc-annotation approach, I just create a project
Re: [Tomahawk] Commit of component generation and 1.2 modules to trunk
Sochor Zdeněk schrieb: Generally, extending components by adding a whole bucket of local property/getter/setter code does not seem right to me, because it's meant to provide just another feature not used by many people. Wouldn't it be cleaner to just use that property as an attribute (in JSF api's speech): storing/using it from the getAttributes() map? That way it would only be necessary to generate TLD/Tag entries, with no customization of the component's code needed. Also, the renderer has full access to the attributes map by default ;) A nice side-effect of this approach is eventually speeding up processing by adding a cache to the getAttributes() MyFaces map: storing already resolved values in a transient map to avoid multiple resolutions during the lifecycle. The disadvantages of using attributes are: * all typesafety is lost. * the attributes are not self-documenting, like explicit setter/getter methods are. For code, you can see that a property exists on a class, but cannot see what attributes might affect its behaviour. And for tags, the property in the tld tells people that the option exists; hiding it in an attribute makes it harder to find. So in general I would currently be in favour of properties, not attributes. There's no reason why the setter/getter cannot store data into the attributes map behind the scenes if that turns out to be a useful optimisation. By having a setter method, the actual way the data is stored is hidden; we can then optimise in whatever way works best. * Use case 3: I create a new project (eg commons widgets) With the xml-files approach, I need to create two maven modules, one for the code and one for the builder stuff. Somehow I need to get access to all the metadata defined by the core build project (??). I then need to create metadata files for my project that (somehow) add info about my new components. With the doc-annotation approach, I just create a project for my code, add the builder plugin to my pom, and add annotations to any of my converters/validators/components. Done.
A question about property inheritance across components arises: I really think that having the same property handled by multiple components in a hierarchy is bad (quite common in Tomahawk). Also, the Tomahawk code should handle component and tag inheritance in a consistent way. There should be a way to filter already-defined properties in either approach. Sorry, I don't understand this comment at all. With what I am proposing, an ancestor class will define a property (eg getId/setId) and an annotation will be added on that method definition. Then the derived class does nothing; the plugin walks the ancestry tree and figures out that there is an inherited property of that name. * Use case 4: A JSF newbie browses the myfaces-impl code. With the xml-files approach, he needs to also browse the xml files in the build project, or alternatively browse the generated code, to get any idea of how things work. With the doc-annotation approach, the annotations are there for him to see on the checked-in classes. This use-case is not clear - it's quite different to look at code to see how components work and to look at renderers' stuff. IMO more people are interested in renderers, so both approaches are almost the same for them. Agreed, generated components that subclass a package-private base (core-api) are always going to be confusing. In either case, the renderer is casting to a type that does not exist until the plugin has been run. However in all other projects (core-impl, tomahawk, etc) we can have the component base class public. Then with the doc-annotation approach, the renderer can cast to that base (non-generated) type, and invoke the abstract methods. The result is code that will compile correctly after a direct checkout, even without the plugin having been run. That seems nice. The same *is* possible with the xml-files approach, but is duplicated work: the class and its properties have to be defined in the xml files too.
And if the code and xml are not kept correctly in sync, then problems will occur. Regards, Simon
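Simon's point that a typed setter/getter can store its data in the attributes map behind the scenes might look like the following. This is a self-contained sketch: ExampleComponent, the property name, and the plain HashMap standing in for a real component's getAttributes() map are all invented for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: a typed property whose storage happens to be the attributes map.
// Callers keep type safety and discoverability; the storage strategy can
// change later without touching any caller.
class ExampleComponent {
    private final Map<String, Object> attributes = new HashMap<>();

    public Map<String, Object> getAttributes() {
        return attributes;
    }

    // Typed accessors hide the fact that the value lives in the map.
    public void setEnabledOnUserRole(String role) {
        attributes.put("enabledOnUserRole", role);
    }

    public String getEnabledOnUserRole() {
        return (String) attributes.get("enabledOnUserRole");
    }
}
```

This keeps both camps happy: the renderer can still read the value from the attributes map, while page authors and component code get an explicit, documented, type-safe method.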
Re: [Tomahawk] Commit of component generation and 1.2 modules to trunk
Hi Andrew, Andrew Robinson schrieb: One major drawback to the javadoc annotation approach has been left out, and that is attribute reuse. The maven-faces-plugin allows you to import XML fragments using XPath. So in Trinidad, for onclick, onmouseover, onmouseout, etc. you can import one XML file and not have to re-define all of these. But with the javadoc approach you have to either (one) try to hack the code to extend other classes, or (two) have some weird interface usage to import these. Either way, the object hierarchy has to be hacked to get it to work. Hmm..interesting. So trinidad has cases where a class X is not related to A by inheritance, but does want to provide the same properties as A? Java currently defines implements and extends; sounds like Trinidad has invented a new OO concept, emulates :-). While I haven't analysed this carefully, I see no reason why there couldn't be an extra annotation attribute: /** * @mfp.component emulates="javax.faces.UIData" ... */ class TrinidadData .. {...} which would cause the plugin to look up the metadata for the javax.faces.UIData component, then simply copy all the property definitions for that class into the metadata for the TrinidadData class. Would that satisfy the trinidad requirement? Also, the java-annotation-javadoc approach is odd in that code must be filled in. The Java files would have some sort of attribute definitions, but no body (I don't want to have to waste time writing setters/getters when I should only have to write a definition for an attribute). Yes, there does need to be something to attach property annotations to. I would suggest an abstract getter method: /** * An identifier for this component which is unique within the enclosing NamingContainer. * * @mfp.property literalOnly="true" */ public String getId(); rather than annotating a private class member. After all, there might not *be* a real class member; the data might be stored in the attributes map for example.
This is shown in the first property example here [1]: http://wiki.apache.org/myfaces/MyfacesBuilderPlugin By defining the getter method as above, the plugin can inspect the return type to determine the property type automatically (so the annotation doesn't have to specify it). I guess it is possible to do away with the method completely and just have a free-standing annotation, but (a) that looks weird, and (b) the annotation has to be more complex, because we have to explicitly provide the property name and type rather than looking at the method definition. Note that because the method is defined as a normal java method, code in a renderer (for example) can be written to call that method and will still compile even before the code-generator has been run. Not critical, but nice. I don't see any need to define the corresponding setter method. Just in case it isn't clear, the javadoc above can be used as the property description field for the tld (first sentence - short description, and the remainder for long description). Is this more work than defining the equivalent xml? [1] I'm not sure what the later property examples on that page are meant to be; as Leonardo has written them they are attached to no function, which is not what I had in mind... Personally, I much prefer the Trinidad approach, and I have found it easy to use, except for the *Template files (there has to be a better way than that, as they are not IDE friendly). Regards, Simon
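The abstract-getter pattern described above can be sketched end to end. Both classes here are invented for illustration: in the real build, only the annotated abstract base would be hand-written, and the concrete subclass would be emitted by the code generator (along with saveState/restoreState).

```java
// Hand-written base class: the annotated abstract getter is all the
// developer writes. Renderers can compile against this type immediately,
// even before the generator has run.
abstract class AbstractExampleComponent {
    /**
     * An identifier for this component which is unique within the
     * enclosing NamingContainer.
     *
     * @mfp.property literalOnly="true"
     */
    public abstract String getExampleId();
}

// Stand-in for the class the code generator would emit: it supplies the
// property storage and the setter derived from the annotated getter.
class GeneratedExampleComponent extends AbstractExampleComponent {
    private String exampleId;

    public String getExampleId() { return exampleId; }
    public void setExampleId(String exampleId) { this.exampleId = exampleId; }
}
```

A renderer written against AbstractExampleComponent compiles after a direct checkout, which is the "not critical, but nice" property Simon mentions.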
Re: build plugin - an alternative annotation approach
Yes, long descriptions are supported; see the example output I showed in the original email. And I see no reason why every option currently offered in xml cannot be supported via either an annotation or even better automatically by introspecting the classes. I did want to build the annotation-handling functionality into the existing plugin so that the two sources of info (xml config files and doc-annotations) could be merged together. However the existing myfaces-faces-plugin code is not well structured for that purpose; it makes a lot of very xml-specific assumptions rather than having a neutral metadata representation. I see no reason why the two couldn't co-exist eventually, but intend to build a separate plugin first. There is no reason why both plugins cannot be run separately, ie the pom.xml be configured to run both. But that does lead to a lot of inconsistency and redundancy; for example, one component will use the annotations on a base class to determine settings for inherited properties while another will ignore this annotation info completely and get its data from an xml config file that contains (hopefully) the same data but in another format. Ecch. I would therefore suggest not having some components configured via xml and others via annotations in the same project. What I would like to see is core-1.1 and tomahawk use the approach I'm suggesting here. All that is needed is to add annotations to the existing code and move the comments that are currently in the tld-files into javadoc comments on the appropriate fields. Optionally the property methods can be reduced to abstract methods and the save/restore methods removed if the generate-component-class approach is wanted. With this approach it is possible to generate a concrete subclass with property implementations and saveState/restoreState methods as has been discussed earlier. I'm still not convinced about this, but can live with it. 
Using annotations instead of lots of xml config files is a different issue. Regards, Simon Martin Marinschek schrieb: Sounds interesting. Will you support everything the XML-syntax allows to supply now? E.g., long descriptions? Will it basically be an option which component-set wants to use which frontend? Then slowly every component set could decide if it wants to move over... How about restoreState/saveState? The getter? regards, Martin On Sun, Mar 9, 2008 at 10:20 PM, simon [EMAIL PROTECTED] wrote: Hi All, Currently, trinidad and core-1.2 use the myfaces-faces-plugin to generate tag classes, config files and component classes. There is some work going on to use this for core-1.1 and tomahawk too. As I mentioned earlier, I don't like the approach used by the current myfaces-faces-plugin; I think the large number of xml config files that are needed is not elegant or user-friendly. I proposed using some kind of annotation in the source to drive this process instead. I have been working on this recently, and have got the first steps running. It's still very rough so I won't post the code, but wanted to let you know what I'm working on. All feedback is welcome. Here's an instrumented UIData class from core-1.1: /** * Represents a component which has multiple rows of data. * <p> * The children of this component are expected to be ... * * @mfp.component * type = "javax.faces.Data" * family = "javax.faces.Data" * defaultRendererType = "javax.faces.Table" * desc = "tabular data" */ public class UIData extends UIComponentBase implements NamingContainer { ... /** * Set the name of the temporary variable that will be exposed to * child components of the table to tell them what the rowData * object for the current row is. This value must be a literal * string (EL expression not permitted). * * @mfp.property literalOnly="true" */ public void setVar(String var) { _var = var; } } The code I've got so far just scans the source code to build up a meta-data structure holding the relevant data.
This would then be used to drive the file-generation step, hopefully reusing the existing code from the myfaces-faces-plugin. In other words, I'm proposing replacing the front end of the plugin but not the back-end. Dumping the parsed data gives: --dumping artifacts-- == Component class:UIData type:javax.faces.Data prop:setVar class:java.lang.String isLiteral:true desc:Set the name of the temporary variable that will be exposed to child components of the table to tell them what the rowData object for the current row is. This value must be a literal string (EL expression not permitted). --dumped artifacts-- Note that the javadoc comments from the class are taken into the metadata. So comments in the component, the generated tag-class and the taglib file are automatically identical and are maintained in the *normal* manner
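The scanning step described above can be sketched as a small parser that pulls key=value pairs out of an @mfp doc-annotation. This is only an illustration of the matching idea, not the plugin's actual parser (which works on full source files, handles quoting, and collects the surrounding javadoc prose as the description):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: extract "key = value" pairs following an "@mfp.xxx" tag inside
// a javadoc comment string.
class DocAnnotationScanner {
    private static final Pattern PAIR = Pattern.compile("(\\w+)\\s*=\\s*(\\S+)");

    static Map<String, String> parse(String javadoc, String annotation) {
        Map<String, String> result = new LinkedHashMap<>();
        int at = javadoc.indexOf("@" + annotation);
        if (at < 0) {
            return result; // annotation not present
        }
        // Take everything up to the next @-tag (or end of comment) as the
        // annotation body, then scan it for key=value pairs.
        int end = javadoc.indexOf('@', at + 1);
        String body = javadoc.substring(at + annotation.length() + 1,
                end < 0 ? javadoc.length() : end);
        Matcher m = PAIR.matcher(body);
        while (m.find()) {
            result.put(m.group(1), m.group(2));
        }
        return result;
    }
}
```

Run against the UIData example's class comment, such a scanner would yield the type/family/defaultRendererType entries that feed the dumped metadata shown above.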
core1.2: handling of component ids in included files
Hi All (and particularly Bruno): In core 1.2, class UIComponentClassicTagBase messes with the id field of components if they are in an iterator, adding a (clumsy) unique suffix string. And for some reason it treats any kind of include or forward as being in an iterator. So doing something like this: == mainpage.jsp <f:view> <jsp:include page="subPage.jsp" /> </f:view> == subPage.jsp <h:form id="foo"/> causes the form component to have an id of fooj_id_2 or similar. This behaviour just looks weird to me. Why mess with ids just because the file being processed is included or forwarded-to? Does anyone have any idea what the purpose of this is? In particular, this breaks Tomahawk forceId on myfaces1.2 when the component with forceId is in an included file. But it looks really ugly in normal cases too. I understand that ids need to be messed with in order to support components being in an iterator: <c:forEach ..> <h:inputText id="name"/> </c:forEach> But why check for includedOrForwarded too? Is this include-handling perhaps to allow <jsp:include page="subPage.jsp" /> <jsp:include page="subPage.jsp" /> to work correctly? If so, then currently users are paying a very high price (unstable ids when refactoring into included files) for a feature that very few will need (multiple includes of the same file). And Sun Mojarra does *not* mess with the ids like this...testing shows that the ids of components are the same regardless of whether they are inline or in an included file. This code was committed by baranda in commit 462531, ie this class has behaved like this from very early times. Bruno, what was the reason you added this? Regards, Simon
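The suffixing behaviour described above can be illustrated with a toy version. The "j_id_" prefix and the starting counter match the observed output ("fooj_id_2"); the counter logic itself is a guess at the general shape for illustration, not MyFaces' actual code.

```java
// Toy illustration of the observed behaviour: a component id declared in
// an included (or forwarded-to) page comes back with a unique "j_id_N"
// suffix appended, while inline ids are left alone.
class IdSuffixer {
    private int counter = 2; // observed suffixes started at j_id_2

    String suffix(String declaredId, boolean includedOrForwarded) {
        if (!includedOrForwarded) {
            return declaredId; // inline ids are untouched
        }
        return declaredId + "j_id_" + counter++;
    }
}
```

The complaint in the email is exactly that the second branch fires for plain includes, so refactoring markup into an included file silently changes the rendered client ids.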
Re: core1.2: handling of component ids in included files
Ok, thanks Bruno. I'll create a jira issue for this, then try to find some time soon to look into the related specs. At the least we need to document this, but hopefully we can tidy up the id suffixes too. It's a little embarrassing that tomahawk forceId works on Sun but not on myfaces :-) See you wednesday.. Cheers, Simon Bruno Aranda schrieb: Honestly... I don't remember, but I think I got this idea from somewhere else (otherwise I would remember). Writing the UIComponentClassicTagBase class was the most complicated part for me and one of the first things I did for the 1.2 implementation. Probably the generated IDs ensure that the ids do not clash, but I don't remember what the spec says (if anything) about it. However, if mojarra does not use such a system we should probably be compatible with them in that aspect. See you this week! ;) Bruno On 10/03/2008, [EMAIL PROTECTED] wrote: Hi All (and particularly Bruno): In core 1.2, class UIComponentClassicTagBase messes with the id field of components if they are in an iterator, adding a (clumsy) unique suffix string. And for some reason it treats any kind of include or forward as being in an iterator. So doing something like this: == mainpage.jsp <f:view> <jsp:include page="subPage.jsp" /> </f:view> == subPage.jsp <h:form id="foo"/> causes the form component to have an id of fooj_id_2 or similar. This behaviour just looks weird to me. Why mess with ids just because the file being processed is included or forwarded-to? Does anyone have any idea what the purpose of this is? In particular, this breaks Tomahawk forceId on myfaces1.2 when the component with forceId is in an included file. But it looks really ugly in normal cases too. I understand that ids need to be messed with in order to support components being in an iterator: <c:forEach ..> <h:inputText id="name"/> </c:forEach> But why check for includedOrForwarded too?
Is this include-handling perhaps to allow jsp:include page=subPage.jsp / jsp:include page=subPage.jsp / to work correctly? If so, then currently users are paying a very high price (unstable ids when refactoring into included files) for a feature that very few will need (multiple includes of the same file). And Sun Mojarra does *not* mess with the ids like this...testing shows that the ids of components are the same regardless of whether they are inline or in an included file. This code was committed by baranda in commit 462531, ie this class has behaved like this from very early times. Bruno, what was the reason you added this? Regards, Simon
Re: [Tomahawk] List of components to be upgraded from sandbox to tomahawk 1.2
Leonardo Uribe wrote: On Mon, Mar 3, 2008 at 4:58 PM, simon [EMAIL PROTECTED] wrote: The list of components is fine. And I very much appreciate all your work on this and the tomahawk bugs you've been fixing recently. However at the risk of sounding like a broken record, I would like to point out that AFAIK there has still been no vote on whether to have a tomahawk 2.0 at all. And I would vote -1 on such a thing; abandoning tomahawk users on JSF1.1 would be bad, and the community is just not big enough to support two parallel tomahawk branches. The fact that you are about the only person to commit to tomahawk in the last month should make that obvious. The minor benefits of a JSF1.2-specific tomahawk branch are far outweighed by the pain. I have a different opinion about this. Sooner or later we have to upgrade this lib (I think better sooner than later). The component generator does the big part of the update (writing component and tag classes); only minor changes to the renderers were needed to make all examples work (the big part of the work is in the build module). Moving changes from one branch to another should be an easy task. Moving tomahawk to 1.2 lets us see more bugs in myfaces 1.2, and upgrading tomahawk 1.1 apps should be easy. JSF 1.2 has been out for more than a year and JSF 2.0 is coming. The only thing to be taken into account is to continue doing releases on 1.1, and that's all. It's a matter of subjective opinions (all valid of course). But in my humble opinion, better to move forward than stay quiet. Not one step back, nor to catch momentum. regards Leonardo Uribe Scott O'Bryan schrieb: I agree with Leonardo totally. Just because you have a 2.0 branch does not mean that you drop support for 1.1. It simply means that things which cannot be made 1.1 compatible continue to migrate, and that the stuff which is already in place embraces any emerging standards. Furthermore, it gives Tomahawk users a much clearer upgrade path into the new technologies.
Trinidad, for instance, has had a 1.2 branch for a few months and we are totally seeing enhancements going into both branches. Things go into 1.2 as the exception, not the rule. I fully support making sure that 1.1 continues to move ahead because 1.1 is MyFaces' largest community of users. But PREVENTING projects from moving to the new technology only hurts those renderkits and helps no-one. One of the key issues with open source is that developers work on what they want/need to work on. If people in the community continue to restrict developers from supporting the new standards for a renderkit, the renderkit will lose developers and support. Instead of having active development in a project with a little more emphasis on the later standards, you'll end up with no development at all. Argh.. top-posting in reply to a thread that already has bottom-posting established as a convention is REALLY ANNOYING. I've therefore moved the reply text to a sane position. The thing I am really concerned about here is ensuring that there are enough tomahawk developers to actually keep the project alive. Therefore in this case, I think that only the opinions of those who are actually active developers count. Scott, it's all very well you saying that both 1.1 and 1.2 should be supported, but someone has to actually do that, and I don't see your name in the commit list... Here are some actual stats on commits since Feb 2007 (ie for the last 12 months):
  baranda: 3
  bommel: 1 (website only)
  cagatay: 14
  dennisbyrne: 2
  gmuellan: 6
  grantsmith: 13
  imario: 10
  jlust: 1
  manolito: 10 (mostly buildsystem fixes)
  matzew: 4
  mkienenb: 14
  mmarinschek: 8
  pmahoney: 24
  skitching: 42
  tomsp: 2
  werpu: 4
Total patches: 158 in 12 months -- about 13 per month, or 3 per week. Hmm.. interestingly, lu4242 (Leonardo) does not appear on this list, except for one of matzew's patches that credits Leonardo. All Leonardo's patches must have been to sandbox or the tomahawk 2.0 branch.
I think we can count Leonardo among the active developer pool anyway. By the way, a lot of the above commits are checking in patches provided by other people; sorry I can't properly credit them here. The numbers for myself (skitching) are somewhat misleading; a big chunk of those are just on one component, the t:calendar. And likewise for pmahoney; most of the commits are just for the schedule component. This really does not look like a lot of people or traffic. Now let's look at JIRA: http://issues.apache.org/jira/browse/TOMAHAWK There are 390 open issues of severity Major or above. So at the current commit rate, that will take 2.5 years to clear them, assuming one patch fixes one issue. In this situation, we really do NOT need to increase the amount of work it takes to maintain Tomahawk. Leonardo, are you saying that moving to a code-generation approach will allow us to maintain a single
Re: [Tomahawk] List of components to be upgraded from sandbox to tomahawk 1.2
Martin Marinschek schrieb: Hi *, if Leonardo does as discussed, we can have both the 1.1 version and 1.2 from the same branch. (I don't see why this shouldn't be possible). Sorry, I must have missed this "as discussed" discussion. Do you mean that there will be: * no use of new for-loops or generics in any code? * no use of ELContext except in a couple of generic templates that are separated out by JSF version? If the code and templates are not going to have any Java1.5 or JSF1.2-specific code in them, then what exactly will be the difference between the tomahawk-for-jsf1.1 and tomahawk-for-jsf1.2 releases? If they do have Java1.5 or JSF1.2-specific code in them, then we have the same problem: two code lines to maintain, two files to patch etc. OK, a couple of template files with different content are fine, but more than that is going to be a problem. And again, what is the benefit of a jsf1.2-specific tomahawk? Can you show any Tomahawk code that will run faster or work better because we use JSF1.2 features? Regards, Simon
Re: [Tomahawk] List of components to be upgraded from sandbox to tomahawk 1.2
Martin Marinschek schrieb: Hi Simon, the three of us (Leonardo, you, me) discussed this in our component-generation discussion. @use of 1.2 constructs: yes, you are right, it should not use any 1.2 constructs (at a maximum - with reflection, so that we stay independent). Facelets does something similar. We need a 1.2 version however for the tags - they are just too different. But thankfully, those will be generated. There is one thing which I want to have: invokeOnComponent can be called, and it should be called for the AJAX-callback. @use of the 1.2 version: wouldn't you want to indicate to the community that this component library is now 1.2 compliant? For JSP 2.1 containers, you will indeed need the new tag-files, if you are not using Facelets (AFAIK)! So all code except tag classes will be JSF1.1-compliant? And then we generate different tag classes and tld files for two different flavours of Tomahawk (jsf11, jsf12) ? I'm not quite sure why you say that new tag-classes and tld-files are needed; I have run tomahawk on MyFaces 1.2 without difficulty, using JSP pages. Just see the Orchestra examples. However I'm also quite happy with what you describe above. As long as there remains one trunk for all the components that is fine. Having one template for jsf11 tag classes and a different one for jsf12 tag classes is no big deal to manage. Just to clarify: you intend to use the SAME template for component/renderer generation for both JSF11 and JSF12? And use the same hand-written component parent classes when generating JSF11 and JSF12? The invokeOnComponent thing sounds ok, as long as it can be reasonably isolated from the majority of the code; as you say, that really is useful to have. Cheers, Simon
[orchestra] new property for FrameworkAdapter interface
Hi, Currently the SpringViewControllerScope class has jsf-specific code in it to retrieve the current viewId, so that it can then look up the view controller bean for the current view. This is not desirable as this means that we have: (a) a reference from the spring-specific packages to the jsf-specific packages, and (b) the view-controller-scope is not usable in any environment other than jsf. It seems to me that the cleanest solution is to add a method to the FrameworkAdapter abstract class: String getCurrentViewId(); We could provide a default implementation that returns the constant string "defaultView" or similar. This avoids breaking existing FrameworkAdapter subclasses, and theoretically allows users of non-jsf frameworks to have a global view-controller instance by just adding a spring bean with that name. The JsfFrameworkAdapter would of course return FacesContext.getCurrentInstance().getViewRoot().getViewId() and the SpringViewControllerScope would then just call FrameworkAdapter.getInstance().getCurrentViewId(). Are there any objections to this? Regards, Simon
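The proposal above can be sketched as follows. This is a much-simplified stand-in, not the real Orchestra API: the actual FrameworkAdapter has many more methods, and the JSF flavour would call FacesContext rather than take a constructor argument as the illustrative subclass here does.

```java
// Sketch of the proposed FrameworkAdapter change (simplified).
public class FrameworkAdapterSketch {

    // Stand-in for org.apache.myfaces.orchestra's FrameworkAdapter.
    abstract static class FrameworkAdapter {
        // Default implementation: existing subclasses keep compiling,
        // and non-JSF environments get one global "view" id.
        public String getCurrentViewId() {
            return "defaultView";
        }
    }

    // Stand-in for JsfFrameworkAdapter. The real override would return
    // FacesContext.getCurrentInstance().getViewRoot().getViewId();
    // here the viewId is injected so the sketch is self-contained.
    static class JsfFrameworkAdapter extends FrameworkAdapter {
        private final String viewId;

        JsfFrameworkAdapter(String viewId) {
            this.viewId = viewId;
        }

        @Override
        public String getCurrentViewId() {
            return viewId;
        }
    }

    public static void main(String[] args) {
        FrameworkAdapter plain = new FrameworkAdapter() {};
        System.out.println(plain.getCurrentViewId()); // defaultView
        System.out.println(new JsfFrameworkAdapter("/accounts/list.jsp").getCurrentViewId());
    }
}
```

The key design point is that the new method is concrete, not abstract, so adding it is backwards-compatible with any FrameworkAdapter subclasses users have already written.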
Re: MyFaces API and IMPL docs 404.
Hi Manfred, No, that page http://myfaces.apache.org/javadoc.html is just a leftover. It should not exist, and can be deleted. The maven website deploy stuff doesn't delete old obsolete stuff when a new site is deployed. I guess google indexed it when it was once valid, and now people are finding it :-( The thread that Matthias refers to is about something a bit different. There should be no javadoc.html page at the top level of the site, as this just does not make sense with the large number of myfaces projects that now exist. But there should be (and are) javadoc index pages under core11 and core12 directories - and possibly tomahawk directory (TODO). Regards, Simon Matthias Wessendorf schrieb: Hi On Feb 20, 2008 10:39 AM, Manfred Geiler [EMAIL PROTECTED] wrote: Simon, is http://myfaces.apache.org/javadoc.html an official page? Google search says that there is no external link to that page. it was... and we talked about that already in this thread ([1]). This page was also linked to tagdocs. -Matthias [1] http://www.mail-archive.com/dev@myfaces.apache.org/msg29331.html Should we get rid of it or is there a quick way to fix this? Thanks, Manfred -- Forwarded message -- From: [EMAIL PROTECTED] Date: Feb 20, 2008 7:25 AM Subject: MyFaces API and IMPL docs 404. To: [EMAIL PROTECTED] Hi, Wasn't sure whom to inform. http://myfaces.apache.org/javadoc.html getting 404s for the API and IMPL docs. Thanks Rohit.
Re: MyFaces API and IMPL docs 404.
Perhaps there is a process running on the myfaces.zones.apache.org machine that is syncing from the continuum server output dir to people.apache.org. Note that the site is *not* being automatically built by continuum, but maybe the sync is still running anyway. I had problems for a while with the new site I deployed getting stomped on. I rummaged around in continuum and then it seemed to stop happening, but I was never quite sure what was going on. Regards, Simon Manfred Geiler schrieb: Deleting it seems not so easy. There is still some update cron job running with user schof. Right? These are the properties of the file in /www/myfaces.apache.org 85847169 8 -rw-rw-r-- 1 schof myfaces 7397 Feb 17 21:34 javadoc.html So it seems that it gets updated/copied on a regular basis. Where is this script and from where does it update/copy the files? Thanks, Manfred On Feb 20, 2008 12:12 PM, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote: Hi Manfred, No, that page http://myfaces.apache.org/javadoc.html is just a leftover. It should not exist, and can be deleted. The maven website deploy stuff doesn't delete old obsolete stuff when a new site is deployed. I guess google indexed it when it was once valid, and now people are finding it :-( The thread that Matthias refers to is about something a bit different. There should be no javadoc.html page at the top level of the site, as this just does not make sense with the large number of myfaces projects that now exist. But there should be (and are) javadoc index pages under core11 and core12 directories - and possibly tomahawk directory (TODO). Regards, Simon Matthias Wessendorf schrieb: Hi On Feb 20, 2008 10:39 AM, Manfred Geiler [EMAIL PROTECTED] wrote: Simon, is http://myfaces.apache.org/javadoc.html an official page? Google search says that there is no external link to that page. it was... and we talked about that already in this thread ([1]). This page was also linked to tagdocs. 
-Matthias [1] http://www.mail-archive.com/dev@myfaces.apache.org/msg29331.html Should we get rid of it or is there a quick way to fix this? Thanks, Manfred -- Forwarded message -- From: [EMAIL PROTECTED] Date: Feb 20, 2008 7:25 AM Subject: MyFaces API and IMPL docs 404. To: [EMAIL PROTECTED] Hi, Wasn't sure whom to inform. http://myfaces.apache.org/javadoc.html getting 404s for the API and IMPL docs. Thanks Rohit.
Re: svn commit: r627917 - /myfaces/core/trunk_1.2.x/api/src/main/java/javax/faces/component/_SharedRendererUtils.java
[EMAIL PROTECTED] schrieb:
-valueType = expression.getType(facesContext.getELContext());
+//By some strange reason vb.getType(facesContext.getELContext());
+//does not return the same as vb.getValue(facesContext.getELContext()).getClass(),
+//so we need to use this instead.
+Object value = expression.getValue(facesContext.getELContext());
+valueType = (value != null) ? value.getClass() :
+    expression.getType(facesContext.getELContext());

 if (valueType != null && valueType.isArray())
 {
     arrayComponentType = valueType.getComponentType();

The behaviour of vb.getType doesn't seem so strange. The ValueExpression.getType method returns the *declared* type of a property. Specifically, the javadocs say that it is the type you would pass to the property setter. If a method is public Foo getFoo(), then getType is Foo. This result is always the same over a run of the application, so can be cached. Checking the concrete type of the object returned from the getter is quite different; it may be null, or a subtype of Foo. And for code that doesn't use generics, foo['key'] only statically implies a type of Object, which is the problem here, yes? Presumably declaring the map on the backing bean as Map<String, List<Boolean>> would also resolve the issue, as ValueExpression.getType would then return List<Boolean> as the type, rather than Object. I agree that in this case we want to check the runtime type rather than the static type. Maybe we should check the code for other cases where ValueExpression.getType is being called... Regards, Simon
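The declared-vs-runtime distinction Simon describes can be shown with plain reflection instead of the EL API. The Bean class and its getter here are made up for illustration; the point is the same one ValueExpression.getType makes: the getter's declared return type can differ from the class of the value it actually returns.

```java
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

// getReturnType() answers "what type is declared" (analogous to
// ValueExpression.getType); value.getClass() answers "what is actually
// there right now" (analogous to getValue(...).getClass()).
public class DeclaredVsRuntimeType {

    public static class Bean {
        // Declared type is the interface List; the runtime value is an
        // ArrayList, a subtype.
        public List<String> getNames() {
            return new ArrayList<String>();
        }
    }

    public static void main(String[] args) throws Exception {
        Bean bean = new Bean();
        Method getter = Bean.class.getMethod("getNames");

        Class<?> declared = getter.getReturnType();     // stable, cacheable
        Class<?> runtime = bean.getNames().getClass();  // may vary per call

        System.out.println(declared.getSimpleName()); // List
        System.out.println(runtime.getSimpleName());  // ArrayList
    }
}
```

Note also that the runtime check cannot be cached and must handle null, which is exactly why the committed code falls back to getType when getValue returns null.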
Re: Good News:Trinidad Tomahawk
Hi, are there any prebuilt libraries somewhere I can download? (I looked at the incubator site but couldn't find any) thanks /anders Thomas Spiegl wrote: Good news: MyFaces Trinidad and Tomahawk components can now be used together. Major issues regarding tomahawk links have now been fixed. You will be able to mix Trinidad and Tomahawk components in the same page, putting them into an ADF form. Version note: Get the current MyFaces version from SVN, or use a nightly build >= 8/26/2006 http://people.apache.org/builds/myfaces/nightly/ Thomas
Re: preserveDataModel
well, it's a rather ajax-specific problem... the real problem was that we skipped the render response phase, which calls the encodeBegin method (which we skip, since we only render the content, that is we only call the encodeInnerHTML method). this was a problem, because that method nullifies the preservedDataModel... we solved it by nullifying it ourselves... I'm not yet sure whether it is allowed at all (especially since for now we had to add some setter and getter methods in the HtmlDataTable class ;) but so far there is no other way for us... thanks a lot again greetings jörg Martin Marinschek wrote: Great! You want to share with us how? regards, Martin On 8/18/05, [EMAIL PROTECTED] wrote: never mind, we fixed it ;) greetings jörg. Martin Marinschek wrote: No, not as far as I know. You can easily try out the differences between preserveDataModel true and false by having a request-scoped bean on the backend which loses its backing model upon reload and by showing links in the datatable - without preserveDataModel = true the links will not be executed! regards, Martin On 8/18/05, [EMAIL PROTECTED] wrote: hi everybody! i'm having trouble understanding the function of the dataTable when preserveDataModel = true. Is there an example designed only for testing this feature, something which clearly shows the difference between the two possibilities? happy for any suggestions... greetings jörg
preserveDataModel
hi everybody! i'm having trouble understanding the function of the dataTable when preserveDataModel = true. Is there an example designed only for testing this feature, something which clearly shows the difference between the two possibilities? happy for any suggestions... greetings jörg
Re: preserveDataModel
hi Martin. what do you mean by "loses its backing model"? greetings jörg Martin Marinschek wrote: No, not as far as I know. You can easily try out the differences between preserveDataModel true and false by having a request-scoped bean on the backend which loses its backing model upon reload and by showing links in the datatable - without preserveDataModel = true the links will not be executed! regards, Martin On 8/18/05, [EMAIL PROTECTED] wrote: hi everybody! i'm having trouble understanding the function of the dataTable when preserveDataModel = true. Is there an example designed only for testing this feature, something which clearly shows the difference between the two possibilities? happy for any suggestions... greetings jörg
Re: preserveDataModel
never mind, we fixed it ;) greetings jörg. Martin Marinschek wrote: No, not as far as I know. You can easily try out the differences between preserveDataModel true and false by having a request-scoped bean on the backend which loses its backing model upon reload and by showing links in the datatable - without preserveDataModel = true the links will not be executed! regards, Martin On 8/18/05, [EMAIL PROTECTED] wrote: hi everybody! i'm having trouble understanding the function of the dataTable when preserveDataModel = true. Is there an example designed only for testing this feature, something which clearly shows the difference between the two possibilities? happy for any suggestions... greetings jörg
Re: Can I contribute to project ?
this all sounds very interesting. we're also working on an ajax component right now (datatable) and we were also thinking of making this a global attribute for all components. maybe have a look at the custom examples in the sandbox folder. we used a special javascript library (i think from ruby on rails) called prototype.js, and it is really very neat; it does a lot of the work for you and seems to have no compatibility problems. also, we already defined an ajaxPhaseListener, which checks whether an ajax component is to be rendered and then jumps out of the life-cycle. have a look, i'm sure it is of interest to you best wishes joerg I have worked (as alpha-alpha-alpha ..) on a set of custom components for creating AJAX-like user interfaces for Java Server Faces applications. Main idea: in the faces lifecycle, we have the view tree on the server side and the DOM tree in the client browser (at present, saving the tree on the client side is not used). For an ajax request, the client sends all user input of the current form (as a normal submit request). The request phases work as usual, but in the render response phase the server sends ONLY CHANGED parts of the view (I don't do a full compare, but send only pre-defined parts, or the submitted form). For this I created a special AjaxContainer component, which renders its child components (or part of them). The list of clientIds for the rendered components is included in the response for the client-side part of the framework. For other cases (for example, this is how an in-page Jabber client would work) a component can render updated xml in a custom Listener and set responseComplete on the FacesContext. On the client side, JavaScript gets the list of rendered parts and updates the page DOM tree. As a result, we have two-way communication between the client browser and the JSF view, and the trees on the client and server side stay in sync. It works like a common desktop application. 
Such an architecture doesn't require creating a special RenderKit or changing most components - for example, at present I have custom tags/renderers only for the UICommand (link/Button) components, and all standard and most custom components (tree2 and any other MyFaces components) work as AJAX... On the other hand, the custom renderers are based on the usual Html renderers, and also work in a non-JavaScript environment as simple html... The client-side script uses the XMLHttpRequest object and, if that doesn't exist, a special JavaScriptHttpRequest. It works the same as XMLHttpRequest (designed with the same properties and methods), but uses a different idea. To perform a request, a script tag is appended to the page. The browser loads the script (the url is made from the form action url with a query string, as a normal GET request), but the script is produced by a special Filter on the server from the Html code of the JSF response - as a pseudo-DOM tree of objects. The loaded script calls a handler function to update the page - same as with XMLHttpRequest. No iframe or other incompatible technologies are needed for non-XMLHttpRequest browsers. It can work in a wide range of browsers, even in IE/Netscape 4+ ??? (in theory. At present it is tested with Mozilla Firefox and IE 5.5+ browsers. I need help to make the JavaScript compatible with others.) At present, XMLHttpRequest works only in Mozilla. The code for the MS ActiveX version was disabled since I couldn't get any functionality out of the XML part of the Microsoft object! The xml is parsed without errors. (For the xml request, I use the NekoHTML parser in a servlet Filter, and produce valid xml, with all declarations and mime-type headers.) It builds the responseXML tree, but none of the methods to navigate/manipulate it work! Also, elements from responseXML can't be inserted into the page, don't produce event handlers, etc. Current view of the code repository - http://svn.demi.spb.ru/repository/myfaces-ajax/ The project was created in Eclipse. Sorry for possible mistakes - my native language is Russian. Alexander J. Smirnov
dummyForm
hi everyone. i would like to render the dummyForm right at the beginning of an html page. the reason is the following: i have a javascript function which needs the current state as a parameter. the problem however is that it needs this parameter right at the beginning, even before the whole dom tree is set up (this function is not from me so i can't change it). so far it only works because i render a separate form myself, where i include the state. but this is obviously not a good solution, since it gets a different state than all the rest... any ideas? is it at all possible to render the dummyForm at the beginning? best wishes joerg