The database has more than 120 entries. Daily, around 20-plus new records are added and around the same number deleted.
posting_id is a unique ID for every record.
Could someone please help write a delta-import script so that the index is updated as per the records in the MySQL database (news) every day? If a posting_id is not found in the database (news), then the same is deleted from the Solr index, and records with new posting_ids are indexed.
Time for import: 5-6 minutes
Warmup time: 40 seconds
autoCommit and autoSoftCommit are both disabled, and we fire a commit only after the import is completed.
I have some more doubts:
1. In the case of master-slave, is an autowarm strategy available for slaves?
2. Also, should I have a limit
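For reference, the commit setup described above (both automatic commits off, commit fired explicitly after the import) might look roughly like this in solrconfig.xml. This is a sketch, not the poster's actual config; a negative maxTime disables the timer:

```xml
<!-- sketch: disable time-based hard and soft commits -->
<autoCommit>
  <maxTime>-1</maxTime>
  <openSearcher>false</openSearcher>
</autoCommit>
<autoSoftCommit>
  <maxTime>-1</maxTime>
</autoSoftCommit>
```

The explicit commit can then be sent with the import request itself, e.g. /dataimport?command=delta-import&commit=true.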
On 11/27/2018 10:32 AM, ~$alpha` wrote:
<http://lucene.472066.n3.nabble.com/file/t482390/1.png>
Solr version: 6.0.0
As seen in the image, there is a spike which can be observed every 2 hours, i.e. whenever the delta import runs:
1. Response time doubles.
2. CPU load average doubles, and if it runs near to peak hour then it goes
On 7/8/2018 9:44 AM, shruti suri wrote:
I am using Solr 6.1.0. This is the response I am getting, but every time I run delta import it fetches the same number of records and doesn't commit them.
0:0:42.255
2
10208
0
0
2018-07-08 15:37:31
2018-07-08 15:37:31
2018-07-08 15:38:13
2018-
Agreed. DIH is not an industrial-grade ETL tool, so you may want to consider other options. Look into Kafka Connect as an alternative: it has connectors for JDBC into Kafka, and from Kafka into Solr.
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
On Jul 9, 2018, 6:14 AM -0500, Alex
I think you are moving so fast that it is hard to understand where you need help.
Can you set up one clean, smallest issue (maybe as a test) and try our original suggestions?
Otherwise, nobody has enough attention energy to figure out what is happening.
And even then, this list is voluntary help, we are
Still not working; same issue, documents are not getting pushed to the index.
-
Regards
Shruti
--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
I have also faced this problem when we have a composite primary key in the table, so below is the workaround I went with:
have deltaQuery retrieve the concatenated value with the time criteria (so that it retrieves only modified rows), and use it in deltaImportQuery with a where clause.
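A sketch of that workaround, using the MyTable/col1/name names that appear later in this thread (the last_modified column is an assumption; the concatenated value acts as the pk):

```xml
<entity name="item" pk="ID"
        query="SELECT CONCAT(CAST(col1 AS varchar(10)), name) AS ID, col1, name
               FROM MyTable"
        deltaQuery="SELECT CONCAT(CAST(col1 AS varchar(10)), name) AS ID
                    FROM MyTable
                    WHERE last_modified &gt; '${dih.last_index_time}'"
        deltaImportQuery="SELECT CONCAT(CAST(col1 AS varchar(10)), name) AS ID, col1, name
                          FROM MyTable
                          WHERE CONCAT(CAST(col1 AS varchar(10)), name) = '${dih.delta.ID}'"/>
```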
On Sun, Jul 8, 2
Dataconfig I am using now
*managed-schema*
data_id
My Oracle table doesn't have any primary key, and delta import requires a primary key; that's why I am creating one by concatenating 2 columns. For now, just for testing, I am using only one column.
deltaImportQuery="select CONCAT(cast(col1 as varchar(10)), name) as ID, * from MyTable where CONCAT(cast(col1 as varchar(10)), name) = '${dih.delta.ID}'"
pk="ID"
query="select CONCAT(cast(col1 as varchar(10)), name) as ID, * from MyTable"
On Sat, Jul 7, 2018 at 1:41 PM, shruti suri wrote:
Hi,
Please help me with delta import from one Oracle table into Solr. I don't have any primary key in the table. We need to use a composite key using (LOCAL_MASTER_ID, LOCAL_ID).
On 7/6/2018 5:53 AM, shruti suri wrote:
Which version of Solr is it, and in which way is it not working?
And shouldn't the deltaQuery and deltaImportQuery both have the "as ID" part?
Regards,
Alex.
On 6 July 2018 at 07:53, shruti suri wrote:
A delta import in Solr's dataimport handler can use anything you give it. The only information that Solr actually records from a previous import is the time of the import, so if you haven't
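That recorded time ends up in conf/dataimport.properties, which DIH rewrites after each import. A typical file looks something like this (timestamps are illustrative; the entity-scoped key uses the entity name, here assumed to be "item"):

```properties
#Wed Jul 08 15:38:13 UTC 2018
last_index_time=2018-07-08 15\:37\:31
item.last_index_time=2018-07-08 15\:37\:31
```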
Hi all,
Should the delta-import use the timestamp in the SQL table?
Hi,
A couple of weeks ago, I ran into an unusual problem with Solr on which I could find no previous discussion.
I have a 4-node Solr cluster with 2 collections, 'A' and 'B'. Each of the collections has 1 shard and 3 replicas. Both collections are updated with a delta-import that pulls from a
On 12/8/2017 2:40 AM, Sabeer Hussain wrote:
Solr version: 6.6.1
I am using Solr to index PDF files and it is working fine as expected. Now I have a requirement to perform a delta-import on the PDF files.
I am not able to locate an example of implementing delta-import with FileListEntityProcessor.
Please suggest.
I am using Solr 7.1 and have deployed it in standalone mode. I have created a scheduler in my application itself to perform the delta-import operation based on a pre-configured frequency. I have used the following lines of code (in Java) to invoke the delta-import operation:
URL url
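The snippet is cut off above; presumably it builds a URL against the /dataimport handler and fires a GET. A minimal sketch of that idea follows. Host, port, and the core name "news" are assumptions, not the poster's actual values:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class DeltaImportTrigger {

    // Build the DIH request URL; "news" is a placeholder core name.
    static String deltaImportUrl(String host, int port, String core) {
        return "http://" + host + ":" + port + "/solr/" + core
                + "/dataimport?command=delta-import&clean=false&commit=true";
    }

    public static void main(String[] args) throws Exception {
        URL url = new URL(deltaImportUrl("localhost", 8983, "news"));
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        // DIH starts the import asynchronously; the response returns
        // immediately with a status payload while the import keeps running.
        System.out.println("HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}
```

Because the handler returns immediately, a scheduler like this should poll /dataimport?command=status rather than assume completion.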
Yes, it does work in that case. And one more thing: my database is Cassandra, not SQL. Probably that must be the problem.
--
View this message in context:
http://lucene.472066.n3.nabble.com/Issue-with-delta-import-tp4347680p4350901.html
Sent from the Solr - User mailing list archive at Nabble.com.
It says here
https://cwiki.apache.org/confluence/display/solr/Parallel+SQL+Interface#ParallelSQLInterface-SolrSQLSyntax
that you need to escape the FROM. Did you try it without the
'false' != 'false' OR
On 2017-08-16 04:07 AM, bhargava ravali koganti wrote:
I'm getting an error like this:
Exce
Does this way even handle the deletes?
Yes.
Refer to this:
http://lucene.472066.n3.nabble.com/Number-of-requests-spike-up-when-i-do-the-delta-Import-td4338162.html#a4339168
Tried it; it had no impact.
Can you please try ${dih.last_index_time} instead of ${dataimporter.last_index_time}?
On Wed, Jul 26, 2017 at 2:33 PM, bhargava ravali koganti <
ravali@gmail.com> wrote:
Hi,
I'm trying to integrate Solr and Cassandra. I'm facing a problem with delta import. Every 10 minutes I'm running the delta query using a cron job. If there are any changes in the data based on the last index time, it has to fetch the data (as far as I know); however, it keeps fetching
Hi,
I did not encounter this issue with Solr 6.x. But delta import with cache executes the nested query for every element encountered in the parent query. Since this select does not have a where clause (because we are using the cache), it takes a long time. So delta import with cache is very slow. My observation is
Hello,
I am running Solr 3.5 and using the Data Import Handler. I am using the following query -
Although the full import is running fine, the delta import is having trouble. Here is what I am experiencing:
1. Delta imports are working in a cumulative fashion - any increment (delta) is
I found this article helpful.
https://wiki.apache.org/solr/DataImportHandlerDeltaQueryViaFullImport
On June 2, 2017 4:49:20 AM EDT, vrindavda wrote:
Thanks Erick,
Could you please suggest some alternative to go with SolrNET?
@jlman, I tried your way; that does reduce the number of requests, but delta-import still takes longer than full-import. There is no improvement in performance.
Thanks for the pointer. Getting astray from what Vrinda is looking for (sorry about that): what if there are no sub-entities, and no deltaImportQuery is passed either? I looked into the code and determined that it calculates the deltaImportQuery itself (SQLEntityProcessor:getDeltaImportQuery(..)::126).
Ideally then, a full-import or a delta-import should take similar time to build the docs (fetch next row). I may very well be going entirely wrong here.
Amrit Sarkar
Search Engineer
Lucidworks, Inc.
415-589-9269
www.lucidworks.com
Twitter http://twitter.com/lucidworks
LinkedIn: https
Thanks Erick,
But how do I solve this? I tried creating a stored proc instead of a plain query, but there was no change in performance.
For delta import it is processing more documents than the total number of documents. In this case delta import is not helping at all; I cannot switch to full import each time. This
This is often the delta query configuration, where sub-entities may
execute a DB request for each row. Is that possible?
Best,
Erick
On Wed, May 31, 2017 at 2:58 AM, vrindavda wrote:
Exactly, delta import is taking more than full import.
Here are the details required.
When I do the delta import for 600 (of 291,633 total) documents I get this:
Indexing completed. Added/Updated: 360,000 documents. Deleted 0 documents. (Duration: 6m 58s)
For full import:
Indexing completed
I am facing a kind of similar issue lately, where full-import takes seconds while delta-import takes hours.
Can you share some more metrics/numbers related to full-import and delta-import: requests made, rows fetched, and time?
Amrit Sarkar
Search Engineer
Lucidworks, Inc.
415-589-9269
Hello,
The number of requests spikes up whenever I do the delta import in Solr. Please help me understand this.
<http://lucene.472066.n3.nabble.com/file/n4338162/solr.jpg>
5.3.1 and 5.4, and then 6.4 if the problem is still there. If it is still there in 6.4, then we may have a new bug.
Regards,
Alex.
http://www.solr-start.com/ - Resources for Solr users, new and experienced
On 16 March 2017 at 09:17, Sujay Bawaskar wrote:
This behaviour is for delta import only. One document gets the field values of all documents. These fields are child entities which map columns to multi-valued fields.
On Thu, Mar 16, 2017 at 6:35 PM, Alexandre Rafalovitch wrote:
> Could you give a bit more detail?
Hi,
We are using DIH with cache (SortedMapBackedCache) with Solr 5.3.1. We have around 2.8 million documents in Solr and the total index size is 4 GB. DIH delta import is dumping all values of the mapped columns into their respective multi-valued fields. This is causing the size of one Solr document to grow up to 2 GB. Is this a known issue with Solr 5.3.1?
Thanks,
Sujay
On 3/1/2017 8:48 AM, Liu, Daphne wrote:
Hello Solr experts,
Is there a place in Solr (Delta Import datasource?) where I can adjust the JDBC connection frame size to 256 MB? I have adjusted the settings in Cassandra but I'm still getting this error:
NonTransientConnectionExce
Did you solve the problem? I'm stuck with exactly the same problem now. Let me know if you already have a solution, please.
Thanks. We did implement the delete-by-query on another core and thought of giving the delta import a try here. Looks like a differential via full index, with deletes using delete by id/query, is the way to go.
-Original Message-
From: Erick Erickson [mailto:erickerick...@gmail.com]
Sent
...switched to delta imports. The problem with using deletedPkQuery with the full import is that dataimporter.last_index_time is no longer accurate.
Below is an example of my deletedPkQuery. If I run the full-import for a differential index, that would update the last index time; running the delta import to remove the deleted records then wouldn't do anything, since nothing changed since the last index time.
deletedPkQuery="SELECT id
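The example is cut off above. For context, the general shape of an entity using deletedPkQuery is roughly the following; the tombstone table deleted_items and its columns are assumptions for illustration, not the poster's actual schema:

```xml
<entity name="item" pk="id"
        query="SELECT * FROM item"
        deltaQuery="SELECT id FROM item
                    WHERE last_modified &gt; '${dih.last_index_time}'"
        deltaImportQuery="SELECT * FROM item WHERE id = '${dih.delta.id}'"
        deletedPkQuery="SELECT id FROM deleted_items
                        WHERE deleted_at &gt; '${dih.last_index_time}'"/>
```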
Sowmya,
My memory is that the cache feature does not work with delta imports. In fact, I believe that nearly all DIH features except straight JDBC imports do not work with delta imports. My advice is not to use the delta import feature at all, as the same result can (often more efficiently
Good morning,
Can CachedSqlEntityProcessor be used with delta-import? In my setup, when running a delta-import with CachedSqlEntityProcessor, the child entity values are not correctly updated for the parent record. I am on Solr 4.3. Has anyone experienced this, and if so, how did you resolve it
Hello guys,
I met a problem when using SolrCloud mode. When the Solr instance runs delta-import, it may take some time to finish (my data source is a MySQL database). So during this time, the newly added documents will be lost. For the deltaQuery, I use SUBDATE(${dih.last_index_time
...-softcommit-and-commit-in-sorlcloud/
Best,
Erick
On Sun, Aug 21, 2016 at 7:43 AM, Or Gerson wrote:
Hello,
I have Solr version 4.3.0.
I have encountered a problem where a document is not returned from queries after delta import, although the delta import does not report that a document has been deleted.
I have a document that is composed of several fields; the delta import looks for a field
...I believe Solr currently has no capability of doing this. Can someone please confirm based on your experience?
Also, does delta import work for this datasource? It doesn't seem
...we're going to move to that eventually so we can leverage our models instead of maintaining a separate data configuration.
Thank you for sharing the link.
Dyer, James-2 wrote:
The DIH Cache feature does not work with delta import. Actually, much of DIH
does not work with delta import. The workaround you describe is similar to the
approach described here:
https://wiki.apache.org/solr/DataImportHandlerDeltaQueryViaFullImport , which
in my opinion is the best way to
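The pattern from that wiki page, for reference: a single query run as a full-import, where the clean request parameter decides whether it behaves as a full reindex or a differential run. Table and column names below are assumed for illustration:

```xml
<!-- run with command=full-import&clean=false for a "delta";
     clean=true (the default) reindexes everything -->
<entity name="item" pk="ID"
        query="SELECT * FROM item
               WHERE '${dataimporter.request.clean}' != 'false'
                  OR last_modified &gt; '${dataimporter.last_index_time}'"/>
```

This also explains the `'false' != 'false' OR` fragment quoted earlier in this thread.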
Thank you.
On 9/9/2015 4:27 PM, Scott Derrick wrote:
> I can't seem to get delta-imports to work with a FileDataSource DIH
The information I have says delta-import won't work with that kind of
entity.
http://wiki.apache.org/solr/DataImportHandler#Using_delta-import_command-1
Also, please make
I can't seem to get delta-imports to work with a FileDataSource DIH; full-import works fine.
delta-import always imports nothing, with no error. I can add a new file or change an existing one, no joy.
My requestHandler declaration:
class="org.apache.solr.handler.dataimport.DataImp
I don't use SQL now; I'm adding documents manually.
db_id_s
--
Bill Bell
billnb...@gmail.com
cell 720-256-8076
Now I have set the db id as the unique field along with a uuid field, which should be generated automatically as required. But when I add a document I get an error that my required uuid field is missing.
As far as I understand I can't use 2 unique fields. I need the db id and the uuid because I am moving data from the database to the Solr index entirely. And temporarily I need it to be compatible with delta-import, but in the future I will use only the new uuid.
OK, can I use 2 unique fields, one with a uuid and one with a db id? What will happen then?
On 8/20/2015 4:27 PM, CrazyDiamond wrote:
I have a DIH delta-import query based on last_index_time, and it works perfectly. But sometimes I add documents to Solr manually and I want DIH not to add them again. I have a UUID unique field and also an "id" from the database which is marked as pk in the DIH schema. My question is: will
...Id from Solr?
Thanks!
On 7/27/2015 1:37 PM, Bade, Vidya (Sagar) wrote:
Hi,
I am currently using Solr 4.10.2 and having issues with delta-imports. For some reason the delta seems to be inconsistent when using query caching. I am using SqlEntityProcessor. To overcome the issue I want to try having two root entities, one each for full and delta. Can someone help with a
I have the following data-config:
Now, when the object in the [locations] table is updated, my delta import
(/dataimport
I don't think you can, since you can't query RSS normally. You just do a full import and override on ids.
Regards,
Alex
On 10 Mar 2015 7:16 pm, "Ednardo" wrote:
Hi,
How do I create a DataImportHandler using delta-import for RSS feeds?
Thanks!!
On Tue, Feb 17, 2015 at 8:21 PM, Aniket Bhoi wrote:
Hi Folks,
I am running Solr 3.4 and using DIH for importing data from a SQL Server backend.
The query for the full import and the delta import is the same, i.e. both pull the same data.
Full and delta import query:
SELECT KB_ENTRY.ADDITIONAL_INFO, KB_ENTRY.KNOWLEDGE_REF ID, SU_ENTITY_TYPE.REF
I'm deleting records in Solr using deletedPkQuery. While executing a delta import in the Solr admin, I can see 1 document deleted, but when I query, I still get that document. I have the Commit option checked in the Solr admin. My deletedPkQuery is as simple as follows. Am I missing anything here?
...RDBMS you are using, but you probably don't need to work around the column names at all.
On Thu, Feb 5, 2015 at 5:18 PM, willbrindle wrote:
Hi,
I am very new to Solr, but I have been playing around with it a bit and my imports are all working fine. However, now I wish to perform a delta import on my query and I'm just getting nothing.
I have the entity:
I am not too sure if ${dih.delta.id} is supposed to be id or id2
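On the ${dih.delta.id} question: the name after dih.delta. has to match the column name (or alias) returned by the deltaQuery. A sketch with made-up table and column names, where deltaQuery returns id2 and so deltaImportQuery references ${dih.delta.id2}:

```xml
<entity name="item" pk="id2"
        query="SELECT * FROM item"
        deltaQuery="SELECT id2 FROM item
                    WHERE updated &gt; '${dih.last_index_time}'"
        deltaImportQuery="SELECT * FROM item WHERE id2 = '${dih.delta.id2}'"/>
```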
: I have solr installed on Debian and every time delta import takes place a
: file gets created in my root directory. The files that get created look
: like this
:
: dataimport?command=delta-import.1
That is exactly the output you would expect to see if you have a cron somewhere, running
Is 'dataimport?command=delta-import.1' actually a file name? If that is the case, are you running the trigger from a cron job or similar? If I am still on the right track, check your cron job/script and see if you have a misplaced newline or quote (e.g. an MS Word quote instead of a normal one) or
On 9/3/2014 3:19 AM, madhav bahuguna wrote:
I figure there's one of two possibilities:
1) You've got a misc
I have solr installed on Debian and every time delta import takes place a
file gets created in my root directory. The files that get created look
like this
dataimport?command=delta-import.1
dataimport?command=delta-import.2
.
.
.
dataimport?command=delta-import.30
Every time
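Files named after the request URL with .1, .2, ... suffixes are what wget produces when it is not told where to write the response, which fits the cron theory discussed above. A sketch of the likely crontab entry and a fix; schedule, paths, and core name are assumptions:

```shell
# likely current entry: wget saves each response as "dataimport?command=delta-import.N"
# */30 * * * * wget http://localhost:8983/solr/core/dataimport?command=delta-import

# fix: discard the response body instead of saving it to a file
*/30 * * * * wget -q -O /dev/null "http://localhost:8983/solr/core/dataimport?command=delta-import"
```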