Hi,
I have seen the following statements in
http://www.marklogic.com/resources/component-content-reuse-with-microsoft-office/resource_download/solution-sheets/ :
- "Content tracking, custom tagging, and automatic updates"
- "Document consistency by identifying out-of-date components, and
optional
Unfortunately, the only output I get from xdmp:document-filter() when I
pass in my DLL is this :
http://www.w3.org/1999/xhtml";>
No version information, unlike the one returned by my VBScript.
On Wed, Sep 2, 2015 at 4:46 PM, Danny Sinang wrote:
> Ah, document-get() ...
Ah, document-get() ... perfect ... thanks !
- Danny
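For anyone landing on this thread later, the resolution above can be sketched in a few lines. This is a minimal sketch, assuming the DLL lives on the MarkLogic host's filesystem; the path is hypothetical. xdmp:document-get returns a document-node wrapping a binary(), which xdmp:document-filter accepts directly, so no ingestion is required:

```xquery
(: Hypothetical path; xdmp:document-get reads straight from the
   host filesystem and returns a document-node over a binary(). :)
xquery version "1.0-ml";

let $dll := xdmp:document-filter(
  xdmp:document-get("C:/libs/example.dll"))
return $dll
```

Note that how much metadata the filter extracts depends on the format support in the underlying filter library; as observed earlier in the thread, for a DLL it may be little more than an empty XHTML wrapper, with no version information.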
On Wed, Sep 2, 2015 at 4:44 PM, Joe Bryan wrote:
> Hi Danny,
>
> You don't need to have the DLL stored in the database in order to use
> xdmp:document-filter(). You merely need to have its contents as a node()
> (specifically, a document-node
Hi Danny,
You don't need to have the DLL stored in the database in order to use
xdmp:document-filter(). You merely need to have its contents as a node()
(specifically, a document-node() or binary() node). That node could come in as
the response to a network request (to another system), it coul
Hi David,
Thank you very much for pointing out that article.
xdmp:document-filter() accepts a node parameter. Since the DLL is on the
filesystem, does this mean I need to ingest the DLL into ML first before I
can pass it to xdmp:document-filter() ?
Regards,
Danny
On Wed, Sep 2, 2015 at 3:29 PM,
Hi,
[Other than some hooks for custom search extensions], MarkLogic does not
have a way to directly run code external to itself. You always need to
bridge to that other code, as you have seen in the MLJAM example.
However.. Looking at your use case, consider the fact that MarkLogic can
extract int
Has anyone here tried running Windows executable files from within XQuery
scripts ?
I've got a VBScript that can read the version info of any given DLL, and
I'd like to invoke it from an ML scheduled task and store the version info
inside MarkLogic.
The list of DLLs to read is stored in an XML fi
Hi Sumathi
I guess you are trying to use a map and read it via a global map variable
shared between modules, where you can pass the map by reference. The variable
and its value can be passed as a key-value pair across the modules.
Please see the link below
https://developer.marklogic.com/blog/
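The pass-by-reference pattern above can be sketched as follows. This is a minimal sketch, assuming a hypothetical library module path; map:map() values are mutable and handed to an invoked module by reference via an external variable, so updates made in the invoked module are visible to the caller:

```xquery
(: The module path and key names below are hypothetical. :)
xquery version "1.0-ml";

let $shared := map:map()
let $_ := map:put($shared, "status", "pending")

(: /modules/update-status.xqy would contain something like:
     declare variable $shared as map:map external;
     map:put($shared, "status", "done")                :)
let $_ := xdmp:invoke("/modules/update-status.xqy",
                      (xs:QName("shared"), $shared))

(: Because the map is passed by reference, the caller now sees
   whatever value the invoked module stored under "status". :)
return map:get($shared, "status")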
>> Just wanted to point out that you should not have the start up overhead if
>> you wrap FOP in a servlet
Yes, that's exactly what I was recommending (and did): using a servlet or some
equivalent technology that is running in an 'always on' JVM, or at least one that
hangs around for a while (
Thank you for the quick response.
I started to think that this is part of the problem, because custom forests that
were created on the old server have an additional "-1" after import, and there are
no forests with the ID that appears in the error.
If this is not the best way to backup/restore databases (schemas,
MLJAM shows PDF generation via FOP as one of the included sample use cases.
https://developer.marklogic.com/code/mljam
It does all the serializing for you.
-jh-
On Sep 2, 2015, at 8:51 PM, Florent Georges wrote:
> Hi David,
>
> Just wanted to point out that you should not have the start
Hi David,
Just wanted to point out that you should not have the start up overhead
if you wrap FOP in a servlet, as the JVM is not loading the classes over
and over again, it's done only once.
From experience, the critical part is to connect the various components
properly (that is, the serv
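The "wrap FOP in a servlet" approach discussed above is invoked from the MarkLogic side over HTTP. This is a minimal sketch, assuming a hypothetical servlet URL and input document; xdmp:http-post returns the response metadata followed by the response body, so the second item is the PDF payload:

```xquery
(: Hypothetical servlet endpoint and XSL-FO source document. :)
xquery version "1.0-ml";

let $fo := doc("/fo/report.fo.xml")
let $response := xdmp:http-post(
  "http://localhost:8080/fop/render",
  <options xmlns="xdmp:http">
    <headers><content-type>application/xml</content-type></headers>
  </options>,
  $fo)
(: $response[1] is the <response> metadata element;
   $response[2] is the binary() PDF returned by the servlet. :)
return $response[2]
```

Because the servlet's JVM stays warm, the FOP classes are loaded once and each render call avoids the per-invocation start-up cost discussed above.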
I've done the same thing for PDFs. For a previous $mployer I used a locally
running embedded Tomcat server with a simple servlet that ran a simple xmlsh
script (could be any language) that ran the FOP processing and then returned
the PDF as a binary.
If you need high throughput, it's critical