RE: Help in writing my own transformer.
You are missing one argument in new AttributesImpl(); it should be new AttributesImpl(attr). See my comments in the code below. AS

Hi all. I need to implement my own transformer that does link rewriting. The goal is to change the href value of an <a> tag in an HTML file (a well-formed HTML file, checked by JTidy, as you can verify in the file jtidy.xml). For now I have written this code:

package it.transformers;

import java.io.IOException;
import java.util.Map;
import java.util.Stack;

import org.apache.avalon.framework.parameters.Parameters;
import org.apache.cocoon.ProcessingException;
import org.apache.cocoon.environment.SourceResolver;
import org.apache.cocoon.transformation.AbstractSAXTransformer;
import org.apache.cocoon.xml.AttributesImpl;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;

public class HTMLLinkTransformer extends AbstractSAXTransformer {

    protected Stack elementStack = new Stack();

    public void recycle() {
        super.recycle();
        this.elementStack.clear();
    }

    public void setup(SourceResolver resolver, Map objectModel, String src, Parameters par)
            throws ProcessingException, SAXException, IOException {
        super.setup(resolver, objectModel, src, par);
    }

    public void startElement(String uri, String name, String raw, Attributes attr)
            throws SAXException {
        if ("a".equalsIgnoreCase(name) || "xhtml:a".equalsIgnoreCase(name)) {
            AttributesImpl newAttr = new AttributesImpl();
            newAttr.setValue(attr.getIndex("href"), "ciao");
            // Here is the mistake: newAttr is empty, so attr.getIndex("href")
            // returns an index that newAttr does not have.
            // AttributesImpl newAttr = new AttributesImpl(attr); does the job. AS
            attr = newAttr;
        }
        super.startElement(uri, name, raw, attr);
    }

    public void endElement(String uri, String name, String raw) throws SAXException {
        super.endElement(uri, name, raw);
    }
}

When I use it, I get this error:

ERROR (2005-10-26) 23:35.13:739 [sitemap.handled-errors] (/pmm/portal/coplets/S062/execute.service) http-8080-Processor8/ErrorHandlerHelper: Error executing pipeline.
org.apache.cocoon.ProcessingException: Error executing pipeline.: org.w3c.dom.DOMException: HIERARCHY_REQUEST_ERR: An attempt was made to insert a node where it is not permitted.
    at org.apache.cocoon.components.pipeline.AbstractProcessingPipeline.handleException(AbstractProcessingPipeline.java:940)
    at org.apache.cocoon.components.pipeline.impl.AbstractCachingProcessingPipeline.processXMLPipeline(AbstractCachingProcessingPipeline.java:281)
    at org.apache.cocoon.components.pipeline.AbstractProcessingPipeline.process(AbstractProcessingPipeline.java:783)
    at org.apache.cocoon.components.source.impl.SitemapSource.toSAX(SitemapSource.java:413)
    at org.apache.cocoon.components.source.SourceUtil.toSAX(SourceUtil.java:142)
    at org.apache.cocoon.components.source.SourceUtil.toSAX(SourceUtil.java:100)
    at org.apache.cocoon.components.source.SourceUtil.toDOM(SourceUtil.java:332)
    at org.apache.cocoon.components.flow.util.PipelineUtil.processToDOM(PipelineUtil.java:173)

Can anybody help me? Thanks to all. - To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
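To see the difference outside Cocoon, here is a self-contained demo using the standard SAX helper class org.xml.sax.helpers.AttributesImpl, which has the same copy constructor (and which, as far as I recall, Cocoon's org.apache.cocoon.xml.AttributesImpl extends). The attribute values are made up for illustration:

```java
import org.xml.sax.helpers.AttributesImpl;

public class AttrCopyDemo {
    public static void main(String[] args) {
        // Build an attribute set as the parser would deliver it for <a href="old.html">
        AttributesImpl attr = new AttributesImpl();
        attr.addAttribute("", "href", "href", "CDATA", "old.html");

        // Wrong: an empty copy has no "href", so getIndex returns -1
        AttributesImpl empty = new AttributesImpl();
        System.out.println(empty.getIndex("href"));   // -1

        // Right: the copy constructor clones the existing attributes first,
        // so the index from the original is valid in the copy
        AttributesImpl copy = new AttributesImpl(attr);
        copy.setValue(copy.getIndex("href"), "new.html");
        System.out.println(copy.getValue("href"));    // new.html
    }
}
```

Calling setValue with index -1 on the empty instance is exactly what throws at runtime; copying first makes the index valid.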
Re: [CForms/flow/ajax] Weird problem with ajax=true
Thorsten Scherler wrote: On Fri, 14-10-2005 at 16:25 +0200, Felix Röthenbacher wrote: Hi Thorsten. Thorsten Scherler wrote: Hey all, Lenya just added the dynamic repeater example to the default pub. The problem with it is that if we use ajax=true and submit the form, we lose the focus on the flow script. It shows the form (form.show()) and then, after submitting the form, it does not return to the flow script (usecase.js). The browserupdater.js used with Ajax checks for the header X-Cocoon-Ajax = continue. I changed this recently, as working with headers is not reliable. What triggers the full page reload is now the <bu:continue/> instruction. If true, a request is sent back to the server so it exits from Form.showForm(). Unfortunately, that request is not compatible with Lenya's usecase management (it needs a parameter lenya.usecase to know which usecase it is in). Additionally, Ajax expects the continuation id in continuation-id, whereas Lenya uses lenya.continuation for that. Maybe browserupdater.js can be made more generic so that it also fits Lenya? Actually, Felix is right. Sylvain, do you have an idea how we can make browserupdater.js more generic to allow application-specific triggers? We need a way to instruct the JS about the name of the hidden input that holds the continuation id. Actually, I've been thinking for some time about removing the need to write ft:continuation-id and having it produced directly by ft:form-template. The parameter name could then be specified either in the flowscript or in the form definition. Now the question is: why don't you use continuation-id? Sylvain -- Sylvain Wallez, Anyware Technologies http://people.apache.org/~sylvain http://www.anyware-tech.com Apache Software Foundation Member, Research & Technology Director
Re: [CForms/flow/ajax] Weird problem with ajax=true
Thorsten Scherler wrote: On Thu, 27-10-2005 at 12:58 +0200, Thorsten Scherler wrote: Thx, it seems recent commits have solved this issue. I was wrong; only the error is gone, but it is still not working. :( Er... I don't follow you here: what is not working? Do you still have the IllegalStateException? If yes, please post the stacktrace. Sylvain -- Sylvain Wallez, Anyware Technologies http://people.apache.org/~sylvain http://www.anyware-tech.com Apache Software Foundation Member, Research & Technology Director
RE: sitemap.xmap does not reload on slide repository
What you want is quite complex. When the file is stored on the filesystem and used in a cached pipeline, a change of the file will invalidate the cached pipeline (I do not know if this happens instantly or on requesting the pipeline again; I suppose the latter). But when you are using some protocol, like slide: or an http: request, this is not possible. I think checking over http: or slide: for just the cache key takes too long or is not possible; I am not certain about this. Anyway, the way we solve this is smart caching: only invalidating the cache keys which should be invalidated. This happens through JMS messages. For example, it is used as follows: we have in Slide 10,000 news articles under nl/abroad/news. On the website/intranet there is a pipeline that calls, for example, nl/abroad/news, which fetches with a DASL query (translated into a Lucene search) the latest 10 articles. In the cache key, the initial part nl/abroad/news is taken into account. Now, when an article is added to nl/abroad/news/2005/12, Slide sends a JMS message to the right Cocoon instance. The Cocoon instance invalidates all the cache keys depending on nl/abroad/news/2005/12, nl/abroad/news/2005, nl/abroad/news, nl/abroad, and nl. But, for example, the news archive of en/abroad/news/ or nl/abroad/agenda will not be invalidated. Well, the bottom line is that this is of course quite a complex infrastructure, and probably a bit too much for just a subsitemap. Anyway, this is how we do it. AS

Hi, I'm trying to use a Slide repository to store a Cocoon subdirectory. In the root sitemap.xmap, I've added these lines:

<map:match pattern="web/**">
  <map:mount check-reload="yes" src="slide://cocoon/site/sitemap.xmap" uri-prefix="web/"/>
</map:match>

Unfortunately, when I change slide://cocoon/site/sitemap.xmap, Cocoon does not take the changes into account, despite the check-reload option. I need to modify the root sitemap.xmap for the changes to be taken into account. Is it a problem in the Cocoon caching system or in the Slide API?

Thanks in advance, Benoit.
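A minimal sketch of the prefix-based invalidation Ard describes (hypothetical names; the real setup uses JMS listeners and Cocoon's cache store, which are omitted here, and a plain Map stands in for the cache):

```java
import java.util.HashMap;
import java.util.Map;

public class PrefixInvalidate {
    // Cache keyed by repository path. Invalidating "nl/abroad/news/2005/12"
    // also drops every ancestor key (nl/abroad/news, nl/abroad, nl, ...),
    // but leaves sibling subtrees such as nl/abroad/agenda untouched.
    public static void invalidate(Map<String, Object> cache, String path) {
        while (!path.isEmpty()) {
            cache.remove(path);
            int slash = path.lastIndexOf('/');
            path = slash < 0 ? "" : path.substring(0, slash);
        }
    }

    public static void main(String[] args) {
        Map<String, Object> cache = new HashMap<>();
        cache.put("nl/abroad/news", "ten latest articles");
        cache.put("nl/abroad/agenda", "agenda page");
        invalidate(cache, "nl/abroad/news/2005/12");
        System.out.println(cache.keySet()); // only nl/abroad/agenda remains
    }
}
```

In the real infrastructure, the invalidate call would run in the JMS message listener on the Cocoon instance when Slide reports a change.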
RE: Betr.: Parameter passing and the authentication framework
I think this should be <map:generate src="cocoon:/get-event/{../id}"/> Doh. Yep; that works a charm. Thanks for that :). Gary * The information contained in this message may be confidential or legally privileged and is intended for the addressee only. If you have received this message in error or there are any problems, please notify the originator immediately. The unauthorised use, disclosure, copying or alteration of this message is strictly forbidden. *
RE: simple directory listing
Something like

<map:match pattern="**">
  <map:select type="resource-exists">
    <map:when test="{1}/sitemap.xmap">
      <map:mount src="{1}" uri-prefix="whatever"/>
    </map:when>
    <map:otherwise>
      <map:generate type="directory" src="{1}"/>
      <map:transform src="directory2html.xslt"/>
      <map:serialize type="xhtml"/>
    </map:otherwise>
  </map:select>
</map:match>

should work. AS

What's the right way to implement the following:

if the request is a directory
  if the directory has no sitemap
    create a simple directory listing like so:
      <map:generate type="directory" src="{theDirectory}"/>
      <map:transform src="directory2html.xslt"/>
      <map:serialize type="xhtml"/>
  otherwise
    call the request's sitemap.xmap
  end
end

As you can see, I need help with three things: defining the source, defining the match, and putting the match in the right place relative to all the other matches. Any solutions or examples for this kind of thing? -- _jason
RE: sitemap.xmap does not reload on slide repository
I understand that protocols like http: or webdav: cannot check cache-key validity over the network. But my understanding is that the slide: protocol is just direct access to the Slide API, which in my case accesses the file locally on the server. So I guess the overhead is small compared to direct file access, but maybe I'm wrong. Is there a way to force Cocoon to check the cache-key validity for the slide: protocol like it does for local files? Thanks and Regards, Benoit. -----Original Message----- From: Ard Schrijvers [mailto:[EMAIL PROTECTED]] Sent: Friday, October 28, 2005 10:33 AM To: users@cocoon.apache.org Subject: RE: sitemap.xmap does not reload on slide repository
Re: sitemap.xmap does not reload on slide repository
Cousson, Benoit wrote: Is there a way to force Cocoon to check the cache-key validity for the slide: protocol like it does for local files? Looking at the implementation of the slide: protocol, it does provide validity information _if_ the underlying repository provides it. You should investigate in this direction. Sylvain -- Sylvain Wallez, Anyware Technologies http://people.apache.org/~sylvain http://www.anyware-tech.com Apache Software Foundation Member, Research & Technology Director
Re: simple directory listing
Ard Schrijvers wrote: Something like

<map:match pattern="**">
  <map:select type="resource-exists">
    <map:when test="{1}/sitemap.xmap">
      <map:mount src="{1}" uri-prefix="whatever"/>
    </map:when>
    <map:otherwise>
      <map:generate type="directory" src="{1}"/>
      <map:transform src="directory2html.xslt"/>
      <map:serialize type="xhtml"/>
    </map:otherwise>
  </map:select>
</map:match>

Note that <map:match pattern="**"> is totally useless, as it matches everything! A sitemap does not need to have matchers as top-level instructions. Sylvain -- Sylvain Wallez, Anyware Technologies http://people.apache.org/~sylvain http://www.anyware-tech.com Apache Software Foundation Member, Research & Technology Director
Re: simple directory listing
Note that <map:match pattern="**"> is totally useless, as it matches everything! Ard probably meant <map:match pattern="*/**"> .. :-) G.
Re: simple directory listing
Geert Josten wrote: Note that <map:match pattern="**"> is totally useless, as it matches everything! Ard probably meant <map:match pattern="*/**"> .. :-) Doh, you're right! I have a special brain matcher for match pattern="**" that automatically triggers this answer without reading much further ;-) Sylvain -- Sylvain Wallez, Anyware Technologies http://people.apache.org/~sylvain http://www.anyware-tech.com Apache Software Foundation Member, Research & Technology Director
RE: simple directory listing
It was not meant as a top-level matcher, of course. I often use this kind of matcher at the end of a sitemap to match everything that was not matched before, which seems pretty normal to me. Of course, as Geert says, if you have directories you would use */**, unless dirs are 2 levels deep, etc. Anyway, I did not pay that much attention to the matcher at all; pointing out the map:select possibility was my only intention. AS
Re: [Maven-plugins-user] Maven WebSphere 5.0/5.1/6? Plugin
Hello Dion, thanks for the info. I really would like to test the new version, but, to be honest, I still have some serious problems getting 1.2 running, using the ejbDeploy goal:

[wasEjbDeploy] Failure invoking BootLoader.startup method
[wasEjbDeploy] java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: Fatal Error: Unable to locate matching org.eclipse.core.runtime plug-in.
[wasEjbDeploy] at org.eclipse.core.internal.boot.PlatformConfiguration.locateDefaultPlugins(PlatformConfiguration.java:2264)
[wasEjbDeploy] at org.eclipse.core.internal.boot.PlatformConfiguration.init(PlatformConfiguration.java:903)
[wasEjbDeploy] at org.eclipse.core.internal.boot.PlatformConfiguration.startup(PlatformConfiguration.java:1368)
[wasEjbDeploy] at org.eclipse.core.internal.boot.InternalBootLoader.initialize(InternalBootLoader.java:582)
[wasEjbDeploy] at org.eclipse.core.internal.boot.InternalBootLoader.startup(InternalBootLoader.java:1035)
[wasEjbDeploy] at org.eclipse.core.boot.BootLoader.startup(BootLoader.java:516)
[wasEjbDeploy] at java.lang.reflect.Method.invoke(Native Method)
[wasEjbDeploy] at com.ibm.etools.ejbdeploy.batch.impl.BootLoaderLoader.startup(BootLoaderLoader.java:315)
[wasEjbDeploy] at com.ibm.etools.ejbdeploy.batch.impl.BatchDeploy.startup(BatchDeploy.java:207)
[wasEjbDeploy] at com.ibm.etools.ejbdeploy.EJBDeploy.startup(EJBDeploy.java:384)
[wasEjbDeploy] at com.ibm.etools.ejbdeploy.EJBDeploy.execute(EJBDeploy.java:77)
[wasEjbDeploy] at com.ibm.etools.ejbdeploy.EJBDeploy.main(EJBDeploy.java:309)
[wasEjbDeploy] EJBDeploy level: 20040425_1935-WB213-AD-V512D-W5
[wasEjbDeploy] [ERROR] Java Result: 1

Besides, is there any plugin version for Maven 2? If there is a newer version than 1.2, where can I download it? Thanks. Ole.

Dion Gillard [EMAIL PROTECTED] wrote in news:[EMAIL PROTECTED]: If you're using the current WebSphere plugin, or are interested in testing the new version, please contact me. The new version has been rewritten so that it uses the Ant tasks in an IBM-supported way. -- http://www.multitask.com.au/people/dion/ "You are going to let the fear of poverty govern your life and your reward will be that you will eat, but you will not live." - George Bernard Shaw
processing large files
Hi all. I need to process a large XML file, and as I tested with increasingly larger files, the time to process suddenly increased a lot. For instance, a 200 KB file took 0.8 seconds, a 400 KB file 2.5 seconds, and when I get near 1 MB it jumps to 30 seconds (nearly 10 times, for twice the size). I played with the pipeline caching, outputBufferSize, etc., and even boosted CATALINA_OPTS to 512 MB; nothing helped. I guess this is related to the fact that at some point the incoming document cannot be loaded entirely in memory. Does anyone have an idea how to fix this? Cheers and thanks,

Eric Boisvert
Spécialiste TI-GI / IT-IM specialist
[EMAIL PROTECTED], 418-654-3705, facsimile/télécopieur 418-654-2615
490, rue de la Couronne, Québec (Québec), G1K 9A9
490, rue de la Couronne, Quebec, Quebec, G1K 9A9
Laboratoire de cartographie numérique et de photogrammétrie (LCNP)
Digital Cartography and Photogrammetry Laboratory (DCPL)
Commission géologique du Canada (Québec) / Geological Survey of Canada (Quebec)
Ressources naturelles Canada / Natural Resources Canada
Gouvernement du Canada / Government of Canada
http://www.cgcq.rncan.gc.ca/lcnp
http://www.nrcan.gc.ca/gsc
Re: processing large files
Boisvert, Éric wrote: This was the subject of one of the presentations at the Cocoon GetTogether (http://www.cocoongt.org). Here is a link to the presentation: http://cocoongt.hippo12.castaserver.com/cocoongt/nico-verwer-performance.pdf Ralph
Re: processing large files
Boisvert, Éric wrote: You will probably also want to listen to it: http://cocoongt.hippo12.castaserver.com/cocoongt/audio/gt-11-nico.mp3 Ralph
Re: fd:validation / fd:javascript
Finally got around to installing it; looks good! On 10/27/05, Sylvain Wallez [EMAIL PROTECTED] wrote: Bruno Dumon wrote: In the JavaScript validation (or in any validation, for that matter), if you return false, you should also set a validation error on the widget to which the validator belongs, or a child/descendant widget of that widget (e.g. in case the validator belongs to a form or repeater). Have a look at the source of the samples for how to do this. Yep. Note that in 2.1.8 (real soon now!), the JavaScript validator has been extended so that it can now, along with booleans, return a String, ValidationError or I18nMessage. In that case, the error is set on the current widget and the validation fails. Sylvain -- Sylvain Wallez, Anyware Technologies http://people.apache.org/~sylvain http://www.anyware-tech.com Apache Software Foundation Member, Research & Technology Director
RE: processing large files
Thanks, I saw that. I wondered if there was some obvious thing I could check before starting to rewrite the XSLT (I know, I'm lazy). Eric

-----Original Message----- From: Ralph Goers [mailto:[EMAIL PROTECTED]] Sent: 28 October 2005 11:51 To: users@cocoon.apache.org Subject: Re: processing large files
Re: processing large files
You may not have to rewrite your XSLT. Part of the idea is to reduce the size of the document by eliminating unnecessary stuff in a transformer before your XSLT is invoked. Ralph
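As a rough sketch of that idea using plain JDK SAX (not a Cocoon transformer; the element name "metadata" is a made-up example of "unnecessary stuff"), a filter that drops a subtree before it ever reaches the stylesheet could look like this:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.XMLFilterImpl;

public class PruneFilter extends XMLFilterImpl {
    private int depth = 0;  // > 0 while inside an element being dropped

    public PruneFilter(XMLReader parent) { super(parent); }

    @Override
    public void startElement(String uri, String local, String qName, Attributes atts)
            throws SAXException {
        if (depth > 0 || "metadata".equals(local)) { depth++; return; }
        super.startElement(uri, local, qName, atts);
    }

    @Override
    public void endElement(String uri, String local, String qName) throws SAXException {
        if (depth > 0) { depth--; return; }
        super.endElement(uri, local, qName);
    }

    @Override
    public void characters(char[] ch, int start, int len) throws SAXException {
        if (depth == 0) super.characters(ch, start, len);
    }

    public static void main(String[] args) throws Exception {
        SAXParserFactory f = SAXParserFactory.newInstance();
        f.setNamespaceAware(true);
        PruneFilter filter = new PruneFilter(f.newSAXParser().getXMLReader());
        StringWriter out = new StringWriter();
        // Identity transform through the filter: the <metadata> subtree vanishes
        TransformerFactory.newInstance().newTransformer().transform(
            new SAXSource(filter, new InputSource(new StringReader(
                "<doc><metadata><x/></metadata><body>text</body></doc>"))),
            new StreamResult(out));
        System.out.println(out);  // <doc> with only <body>text</body> left
    }
}
```

The point is that the pruning happens event by event, so the stylesheet later sees a much smaller document.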
RE: processing large files
Not much to remove, I'm afraid. The XSLT in question mostly reformats a document into another format, but keeps all the same information. I'm investigating Vermer's options right now. Keep suggestions coming; they are appreciated. Eric

-----Original Message----- From: Ralph Goers [mailto:[EMAIL PROTECTED]] Sent: 28 October 2005 13:55 To: users@cocoon.apache.org Subject: Re: processing large files
Re: processing large files
Hi, have you seen Nico Verwer's presentation "Performance on big documents"? You can download it as mp3/pdf at http://www.cocoongt.org/Slides-and-recordings.html. They used a nice two-step approach for transforming big documents. Perhaps it helps, Christoph
RE: processing large files
yeah. Verwer (I mispelled his name - my finger took control over my brain again), this is what I'm reading. Now I cannot make xsltc work (returns an empty document).. anything I should know ? Cheers Eric -Message d'origine- De : Christoph Gaffga (triplemind.com) [mailto:[EMAIL PROTECTED] Envoyé : 28 octobre, 2005 14:43 À : users@cocoon.apache.org Objet : Re: processing large files hi, have you seen Nico Verwer's presentation Performance on big documents. You can download it as mp3/pdf on http://www.cocoongt.org/Slides-and-recordings.html They used a nice two-step appoach for transforming bid documents. perhaps it helps, Christoph Boisvert, Éric wrote: not much to remove, I'm afraid.. The xslt in question mostly reformat a document into another format, but keeps all the same information. I'm investigating Vermer's options right now.. keeps suggestions coming.. they are appreciated. Eric -Message d'origine- De : Ralph Goers [mailto:[EMAIL PROTECTED] Envoyé : 28 octobre, 2005 13:55 À : users@cocoon.apache.org Objet : Re: processing large files You may not have to rewrite your xslt. Part of the idea is to reduce the size of the document by eliminating unnecessary stuff in a transformer before your XSLT is invoked. Ralph Boisvert, Éric wrote: thanks, I saw that. I wondered if there was some obvious thing I could check before starting rewriting the xslt (I know, I'm lazy) Eric -Message d'origine- De : Ralph Goers [mailto:[EMAIL PROTECTED] Envoyé : 28 octobre, 2005 11:51 À : users@cocoon.apache.org Objet : Re: processing large files Boisvert, Éric wrote: Hi all I need to process large xml file and as I tested with increasingly larger file, the time to process suddently increased a lot. For instance, 200 K files took 0.8 seconds, 400 K file 2.5 sec and when I get near 1 Meg, it jumps to 30 seconds (nearly 10 times, for twice the size).. I played with the pipeline caching, outputBufferSize, etc.. even boosted CATALINA_OPTS to 512 Megs, nothing helped. 
I guess this is related to the fact that at some point the incoming document cannot be loaded entirely in memory. Does anyone have an idea how to fix this? Cheers and thanks

This was the subject of one of the presentations at the Cocoon GetTogether (http://www.cocoongt.org). Here is a link to the presentation: http://cocoongt.hippo12.castaserver.com/cocoongt/nico-verwer-performance.pdf Ralph

- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
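Ralph's suggestion (cut the document down with a cheap streaming transformer before the expensive XSLT step runs) could look roughly like this in a sitemap. This is only a sketch: the matcher pattern, file names, and the "strip-unused" transformer type are hypothetical, not something shipped with Cocoon:

```xml
<map:match pattern="report.html">
  <map:generate src="data/big-document.xml"/>
  <!-- step 1: a cheap streaming transformer drops the parts the
       stylesheet never looks at, so far fewer SAX events reach XSLT -->
  <map:transform type="strip-unused"/>
  <!-- step 2: the real XSLT now sees a much smaller document -->
  <map:transform type="xslt" src="stylesheets/report.xsl"/>
  <map:serialize type="html"/>
</map:match>
```

The point of the two-step approach is that step 1 works in constant memory on the SAX stream, so the tree the XSLT processor has to build in step 2 stays small.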
Re: processing large files
> keep suggestions coming.. they are appreciated.

XSLT transforms are the most obvious cause of the memory usage and speed behaviour you observe. It is really a matter of limiting memory usage; that will influence performance the most. Is it really necessary to use XSLT? Could STX be used in your case? Do you really need access to the full document, or are there parts of the document that are merely copied? And perhaps your XSLT contains inefficient algorithms; have you done any optimisation on it? Kind regards, Geert
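For context, STX (Streaming Transformations for XML, implemented in Java by Joost) processes the SAX stream directly instead of building an in-memory tree, so memory use stays flat regardless of document size. A minimal sketch; the element names here are made up, and `pass-through="all"` copies everything that no template matches:

```xml
<stx:transform xmlns:stx="http://stx.sourceforge.net/2002/ns"
               version="1.0" pass-through="all">
  <!-- rewrite only what needs rewriting; the rest streams through -->
  <stx:template match="price">
    <cost><stx:process-children/></cost>
  </stx:template>
</stx:transform>
```

This is exactly the "parts that are merely copied" case: copied content never costs more than the SAX events it consists of.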
Re: processing large files
Hello Éric,

If this file is not very complex and you don't need XPath to be happy, then you should consider writing a custom Transformer or Generator. The SAX parser interface is easy to use.

Friday, October 28, 2005, 5:30:50 PM, you wrote:

BÉ Hi all
BÉ I need to process a large XML file, and as I tested with increasingly larger
BÉ files, the processing time suddenly increased a lot. For instance, a 200 K
BÉ file took 0.8 seconds, a 400 K file 2.5 sec, and when I get near 1 MB it
BÉ jumps to 30 seconds (nearly 10 times the time, for twice the size).. I played with
BÉ the pipeline caching, outputBufferSize, etc., and even boosted CATALINA_OPTS to
BÉ 512 MB; nothing helped. I guess this is related to the fact that at some
BÉ point the incoming document cannot be loaded entirely in memory.
BÉ Does anyone have an idea how to fix this?
BÉ Cheers and thanks
BÉ
BÉ Eric Boisvert
BÉ Spécialiste TI-GI / IT-IM specialist
BÉ [EMAIL PROTECTED], 418-654-3705, facsimile/télécopieur 418-654-2615
BÉ 490, rue de la Couronne, Québec (Québec), G1K 9A9
BÉ 490, rue de la Couronne, Quebec, Quebec, G1K 9A9
BÉ Laboratoire de cartographie numérique et de photogrammétrie (LCNP)
BÉ Digital Cartography and Photogrammetry Laboratory (DCPL)
BÉ Commission géologique du Canada (Québec) / Geological Survey of Canada (Quebec)
BÉ Ressources naturelles Canada / Natural Resources Canada
BÉ Gouvernement du Canada / Government of Canada
BÉ http://www.cgcq.rncan.gc.ca/lcnp
BÉ http://www.nrcan.gc.ca/gsc

--
Best regards,
Grzegorz    mailto:[EMAIL PROTECTED]
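To illustrate the streaming idea outside of Cocoon: a plain JAXP SAX filter rewrites events as they pass through, so the document is never held in memory. This is a minimal, self-contained sketch, not a Cocoon Transformer; the class name, the `price`/`cost` element names, and the inline input document are all hypothetical:

```java
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import java.io.StringReader;
import java.io.StringWriter;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.XMLFilterImpl;

public class StreamRename {

    /** Renames <price> to <cost> on the fly; constant memory, no DOM. */
    static class RenameFilter extends XMLFilterImpl {
        RenameFilter(XMLReader parent) { super(parent); }

        @Override
        public void startElement(String uri, String local, String qName,
                                 Attributes atts) throws SAXException {
            if ("price".equals(local)) { local = "cost"; qName = "cost"; }
            super.startElement(uri, local, qName, atts);
        }

        @Override
        public void endElement(String uri, String local, String qName)
                throws SAXException {
            if ("price".equals(local)) { local = "cost"; qName = "cost"; }
            super.endElement(uri, local, qName);
        }
    }

    static String run() throws Exception {
        SAXParserFactory f = SAXParserFactory.newInstance();
        f.setNamespaceAware(true);
        XMLReader reader = f.newSAXParser().getXMLReader();

        // An identity Transformer pulls events through the filter
        // and serializes them; nothing is ever buffered as a tree.
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");

        StringWriter out = new StringWriter();
        t.transform(
            new SAXSource(new RenameFilter(reader),
                new InputSource(new StringReader(
                    "<order><price>42</price></order>"))),
            new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run()); // prints <order><cost>42</cost></order>
    }
}
```

A Cocoon Transformer follows the same pattern: override `startElement`/`endElement`, rewrite what you need, and forward everything else to `super`, which is why it scales so much better than an XSLT that tree-builds the whole input.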
Re: Authentication redirect-to uri
Merico Raffaele wrote: Dear community, I hope somebody can explain to me whether the following situation is a feature or a bug. The URL http://.../appl/auth/Index/TODO/ve2000 is auth-protected and matches pattern="auth**" in appl/sitemap.xmap. The <redirect-to uri="cocoon:/showLoginForm.cflow"/> is then translated to appl/auth/Index/TODO/showLoginForm.cflow (auth/Index/TODO is added!). I expected a redirect to appl/showLoginForm.cflow.

Is this obvious to you? Try: <redirect-to uri="cocoon://showLoginForm.cflow"/> or <redirect-to uri="cocoon:///.../appl/showLoginForm.cflow"/> Best Regards, Antonio Gallardo.

Thanks in advance ... Raffaele

- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
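The distinction Antonio is pointing at, as far as I understand the cocoon: protocol, is where the pipeline URI is resolved from. A hedged sketch (the pipeline name is hypothetical):

```xml
<!-- one slash: resolved relative to the CURRENT sitemap, so the
     prefix consumed by the current match (auth/Index/TODO) is kept -->
<redirect-to uri="cocoon:/showLoginForm.cflow"/>

<!-- two slashes: resolved from the ROOT sitemap, ignoring the
     current match prefix entirely -->
<redirect-to uri="cocoon://appl/showLoginForm.cflow"/>
```

So the behaviour Raffaele saw is the documented relative-resolution feature, not a bug.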