Re: Scheduling Cache Refresh

2020-05-25 Thread nithin91
Thanks for the inputs. It is really helpful.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Deploying Ignite Code

2020-05-25 Thread nithin91
Thanks for the inputs. It is really helpful.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh

2020-05-24 Thread nithin91
Hi 

Can anyone please help me with your inputs?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Deploying Ignite Code

2020-05-24 Thread nithin91
Can anyone please help me with your inputs?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Deploying Ignite Code

2020-05-19 Thread nithin91
Hi

Thanks for the inputs.

I have a few queries.

For example, I have a few caches whose custom key is a Java POJO class and
whose custom value is also a Java POJO class. Currently I am unable to run
Cache.invoke and Cache.invokeAll operations with peer class loading (I get a
ClassNotFoundException), so I decided to deploy the JAR files on each of the
server nodes.

In this case, do I need to create a JAR file for the POJO classes alone or
for the entire Eclipse project, and should I also include the Maven
dependencies in the JAR file (i.e. is an uber JAR needed)?






--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh

2020-05-19 Thread nithin91
Hi 

Thanks for the inputs.

By the question "is this way of refreshing the cache efficient", I mean
refreshing the cache using the following process.

We have created a REST API using Spring Boot which refreshes a particular
cache when a GET request is triggered.

*Sample REST API URL:*

http://domainname.com/api/v1/refresh?cachename=MyCache

To refresh the cache, the Spring Boot application starts a client node which
stays connected to the cluster for as long as the application runs.

Also, can you please let me know whether the DataStreamer API provided by
Ignite can be accessed through any of the thin clients (Java, Python, JS)?
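
For reference, a minimal sketch of the kind of refresh endpoint described
above (class names and the URL mapping are placeholders); it assumes the
Ignite client node is started once at application startup and exposed as a
Spring bean:

import org.apache.ignite.Ignite;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CacheRefreshController {

    private final Ignite ignite;

    // The Ignite client node is created once at startup and reused for every request.
    public CacheRefreshController(Ignite ignite) {
        this.ignite = ignite;
    }

    // GET /api/v1/refresh?cachename=MyCache
    @GetMapping("/api/v1/refresh")
    public String refresh(@RequestParam("cachename") String cacheName) {
        // Reload the cache through its configured CacheStore (CacheJdbcPojoStore here).
        ignite.cache(cacheName).loadCache(null);
        return "Refreshed " + cacheName;
    }
}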




--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh

2020-05-18 Thread nithin91
Hi 

Can anyone help me by providing inputs to the questions posted in my
previous message.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Ignite.cache.loadcache.Does this method do Incremental Load?

2020-05-18 Thread nithin91
Thanks for the inputs. It is really helpful.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Deploying Ignite Code

2020-05-18 Thread nithin91
Hi 

Can anyone let me know whether I should deploy my Ignite project, developed
on UNIX, on each and every node? If yes, should it be deployed into the bin
folder shipped with Ignite, or can it be kept in any folder?


Currently we are following these steps:

1. We created a bean file which has all the cache configuration details and
data region configuration details.

2. The content of this XML file is copied into the default-config.xml file
shipped with Ignite on each Linux server node, and then each server node is
started using nohup ./ignite.sh

3. From my local system I use the same XML file, but add an additional
property clientMode=true, and then run a standalone Java program to load the
cache.

With peerClassLoadingEnabled=true, I am able to execute my Java program
without deploying the classes on each server node.

But this method of execution is not working if I use the cache.invoke or
cache.invokeAll methods, as I get a ClassNotFoundException even though the
class is present on my local machine.

Can you please let me know how to overcome this error?
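
For reference, a minimal sketch of the kind of cache.invoke call being
described, with hypothetical key/value classes; in this setup the
EntryProcessor runs on the server node that owns the key, so the processor
class and the POJOs it touches need to be resolvable on that node (for
example from a JAR deployed on the server's classpath):

import javax.cache.processor.EntryProcessor;
import javax.cache.processor.EntryProcessorException;
import javax.cache.processor.MutableEntry;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;

public class InvokeExample {

    // Hypothetical value class standing in for the custom POJO value.
    public static class MyValue {
        public String name;
        public MyValue(String name) { this.name = name; }
    }

    // Executed on the primary node for the key, so this class (and MyValue)
    // must be loadable there.
    public static class UpperCaseProcessor implements EntryProcessor<Long, MyValue, Void> {
        @Override
        public Void process(MutableEntry<Long, MyValue> entry, Object... args)
            throws EntryProcessorException {
            MyValue v = entry.getValue();
            if (v != null) {
                v.name = v.name.toUpperCase();
                entry.setValue(v);
            }
            return null;
        }
    }

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start("Ignite-Client.xml")) {
            IgniteCache<Long, MyValue> cache = ignite.cache("MyCache");
            cache.invoke(1L, new UpperCaseProcessor());
        }
    }
}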



Following is the log generated by the program

[20:08:21]__   
[20:08:21]   /  _/ ___/ |/ /  _/_  __/ __/ 
[20:08:21]  _/ // (7 7// /  / / / _/   
[20:08:21] /___/\___/_/|_/___/ /_/ /___/  
[20:08:21] 
[20:08:21] ver. 2.7.6#20190911-sha1:21f7ca41
[20:08:21] 2019 Copyright(C) Apache Software Foundation
[20:08:21] 
[20:08:21] Ignite documentation: http://ignite.apache.org
[20:08:21] 
[20:08:21] Quiet mode.
[20:08:21]   ^-- Logging by 'JavaLogger [quiet=true, config=null]'
[20:08:21]   ^-- To see **FULL** console log here add -DIGNITE_QUIET=false
or "-v" to ignite.{sh|bat}
[20:08:21] 
[20:08:21] OS: Windows 10 10.0 amd64
[20:08:21] VM information: Java(TM) SE Runtime Environment 1.8.0_131-b11
Oracle Corporation Java HotSpot(TM) 64-Bit Server VM 25.131-b11
[20:08:21] Please set system property '-Djava.net.preferIPv4Stack=true' to
avoid possible problems in mixed environments.
[20:08:21] Initial heap size is 254MB (should be no less than 512MB, use
-Xms512m -Xmx512m).
[20:08:21] Configured plugins:
[20:08:21]   ^-- None
[20:08:21] 
[20:08:21] Configured failure handler: [hnd=StopNodeOrHaltFailureHandler
[tryStop=false, timeout=0, super=AbstractFailureHandler
[ignoredFailureTypes=[SYSTEM_WORKER_BLOCKED,
SYSTEM_CRITICAL_OPERATION_TIMEOUT
[20:08:27] Message queue limit is set to 0 which may lead to potential OOMEs
when running cache operations in FULL_ASYNC or PRIMARY_SYNC modes due to
message queues growth on sender and receiver sides.
[20:08:27] Security status [authentication=off, tls/ssl=off]
[20:08:28] REST protocols do not start on client node. To start the
protocols on client node set '-DIGNITE_REST_START_ON_CLIENT=true' system
property.
[20:09:00] Performance suggestions for grid  (fix if possible)
[20:09:00] To disable, set -DIGNITE_PERFORMANCE_SUGGESTIONS_DISABLED=true
[20:09:00]   ^-- Enable G1 Garbage Collector (add '-XX:+UseG1GC' to JVM
options)
[20:09:00]   ^-- Specify JVM heap max size (add '-Xmx[g|G|m|M|k|K]' to
JVM options)
[20:09:00]   ^-- Set max direct memory size if getting 'OOME: Direct buffer
memory' (add '-XX:MaxDirectMemorySize=[g|G|m|M|k|K]' to JVM options)
[20:09:00]   ^-- Disable processing of calls to System.gc() (add
'-XX:+DisableExplicitGC' to JVM options)
[20:09:00] Refer to this page for more performance suggestions:
https://apacheignite.readme.io/docs/jvm-and-system-tuning
[20:09:00] 
[20:09:00] To start Console Management & Monitoring run
ignitevisorcmd.{sh|bat}
[20:09:00] 
[20:09:00] Ignite node started OK (id=6601bb78)
[20:09:00] Topology snapshot [ver=217, locNode=6601bb78, servers=2,
clients=1, state=ACTIVE, CPUs=16, offheap=9.0GB, heap=9.6GB]
[20:09:00]   ^-- Baseline [id=0, size=2, online=2, offline=0]
[20:09:03] Ignite node stopped OK [uptime=00:00:03.075]
Exception in thread "main" javax.cache.processor.EntryProcessorException:
class org.apache.ignite.binary.BinaryInvalidTypeException:
ignite.example.IgniteUnixImplementation.NumberandDateFormat
at
org.apache.ignite.internal.processors.cache.CacheInvokeResult.get(CacheInvokeResult.java:108)
at
org.apache.ignite.internal.processors.cache.IgniteCacheProxyImpl.invoke(IgniteCacheProxyImpl.java:1440)
at
org.apache.ignite.internal.processors.cache.IgniteCacheProxyImpl.invoke(IgniteCacheProxyImpl.java:1482)
at
org.apache.ignite.internal.processors.cache.GatewayProtectedCacheProxy.invoke(GatewayProtectedCacheProxy.java:1228)
at Load.Computejob.main(Computejob.java:22)
Caused by: class org.apache.ignite.binary.BinaryInvalidTypeException:
ignite.example.IgniteUnixImplementation.NumberandDateFormat
at
org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:707)
at
org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1758)
at

Re: Data streamer has been cancelled

2020-05-18 Thread nithin91
Got it. Thanks a lot. This is very useful



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data streamer has been cancelled

2020-05-18 Thread nithin91
Hi 

I implemented the code as you suggested. Please find the code below; let me
know whether this is the right way of implementing what you suggested.

Also, can you please explain the use of the stmr.autoFlushFrequency(2000)
method? If I pass a higher number to this method, will that improve
performance?

// Contains all the 0.1 million key/value pairs that have to be loaded
// (key/value generic types omitted here).
Map<Object, Object> PieCountryAllocationobjs = new HashMap<>();

// Temp map that holds only 2000 records at a time; these are pushed to the
// cache with the data streamer and then cleared.
Map<Object, Object> tempobjs = new HashMap<>();

int j = 0;
for (Map.Entry<Object, Object> entry : PieCountryAllocationobjs.entrySet()) {
    tempobjs.put(entry.getKey(), entry.getValue());

    // For every 2000 rows: stmr.addData(tempobjs), then stmr.flush() and stmr.close(false).
    if ((j % 2000 == 0 && j != 0)
        || (PieCountryAllocationobjs.keySet().size() < 2000
            && j == PieCountryAllocationobjs.keySet().size())
        || j == PieCountryAllocationobjs.keySet().size()) {

        System.out.println(j);

        IgniteDataStreamer<Object, Object> stmr = ignite.dataStreamer("PieCountryAllocationCache");
        stmr.allowOverwrite(true);
        stmr.addData(tempobjs);
        stmr.flush();
        stmr.close(false);

        tempobjs.clear();

        System.out.println("Stream Ended");
        System.out.println(j);
    }

    j++;
}
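
For comparison, a minimal sketch of the same load with a single streamer
kept open for the whole loop (key/value types are placeholders).
autoFlushFrequency(2000) is a time interval in milliseconds, not a batch
size: it makes the streamer flush buffered entries automatically every 2
seconds, and close() still flushes whatever remains:

import java.util.Map;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;

public class StreamerLoad {

    // 'entries' stands in for the map of 0.1 million key/value pairs built earlier.
    static void load(Ignite ignite, Map<Object, Object> entries) {
        try (IgniteDataStreamer<Object, Object> stmr = ignite.dataStreamer("PieCountryAllocationCache")) {
            stmr.allowOverwrite(true);
            // Flush buffered entries every 2 seconds in the background.
            stmr.autoFlushFrequency(2000);

            for (Map.Entry<Object, Object> e : entries.entrySet())
                stmr.addData(e.getKey(), e.getValue());

            // close() (implicit at the end of this block) flushes anything still buffered.
        }
    }
}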



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Deleting multiple entries from cache at once

2020-05-18 Thread nithin91
Thanks for sharing this. It's really helpful.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Service Node vs Data Node

2020-05-18 Thread nithin91
Hi 

With respect to client mode I am clear, but what is the use of starting the
compute job on a server node? When I start the job on a server node, the
following happens:

It creates a new server node every time, and the node disconnects once the
job is done. The problem with this behavior is that, since a new node is
created as part of executing the job, data is rebalanced so that it is
evenly distributed across the nodes; when the job completes, this node goes
down, which means the data on this node is lost.





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Deleting multiple entries from cache at once

2020-05-18 Thread nithin91
The method mentioned in the API is cache.removeAll(), which removes all the
elements in the cache, but I want to remove certain entries from the cache
at once, something like cache.removeAll(listOfKeys). Is there any such
method, or an efficient way to remove the entries corresponding to a list of
keys at once?

*Similarly, is there any such method or efficient way to update the entries
corresponding to a list of keys at once?*
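
For what it's worth, IgniteCache exposes the JCache-style bulk operations
for exactly this; a minimal sketch with placeholder key/value types:

import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import org.apache.ignite.IgniteCache;

public class BulkOps {

    static void example(IgniteCache<Long, String> cache) {
        // Remove only the given keys (unlike removeAll() with no arguments,
        // which removes every entry in the cache).
        Set<Long> keysToRemove = new HashSet<>();
        keysToRemove.add(1L);
        keysToRemove.add(2L);
        cache.removeAll(keysToRemove);

        // Update several entries in one call.
        Map<Long, String> updates = new HashMap<>();
        updates.put(3L, "newValue3");
        updates.put(4L, "newValue4");
        cache.putAll(updates);
    }
}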



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Service Node vs Data Node

2020-05-18 Thread nithin91
Hi

I initially have two nodes.
But when I initiate the compute job (IgniteCompute compute =
ignite.compute()) from a server node, i.e. Ignite ignite =
Ignition.start("Server.xml"), it creates a new server node every time, and
the node disconnects once the job is done. The problem with this behaviour
is that when the node joins, data is rebalanced so that it is evenly
distributed across the nodes, but when the job completes this node goes
down, which means the data on this node is lost.

This is the log I get when I start a compute job using
ignite = Ignition.start("Server.xml"):

 Topology snapshot [ver=3, locNode=92d08291, servers=3, clients=0,
state=ACTIVE, CPUs=8, offheap=13.0GB, heap=11.0GB]
[13:10:21]   ^-- Baseline [id=0, size=2, online=2, offline=0]

When the job completes execution, I get:

Ignite node stopped OK

Please let me know whether my understanding is correct.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Service Node vs Data Node

2020-05-18 Thread nithin91
Thanks. This information is very helpful.




--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Deleting multiple entries from cache at once

2020-05-17 Thread nithin91
Hi

Is there an API to delete multiple entries from the cache efficiently?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Data streamer has been cancelled

2020-05-17 Thread nithin91
Hi 

Currently I am trying to load data from an Oracle DB into an Ignite cache
using a data streamer.

I have two server nodes deployed on two Linux servers, and I am executing
this as a standalone Java program from my local machine.

To achieve this I followed the steps below.

1. Start in client mode by setting clientMode=true in the bean file.
2. Fetch the data from the Oracle DB using a JDBC ResultSet, set fetchSize=10
on the prepared statement, and load this data into a temporary Map object.
3. Iterate through the map object and load the data into the cache using the
stmr.addData API of the corresponding data streamer.

Out of 1 lakh (100,000) rows, only 35K rows are getting loaded, and the
client node is stopped all of a sudden. Can anyone please help me resolve
this issue?


Following is the log generated by this program. Attached are the program
code that I am executing and the Client.xml file for your reference.
Client.xml
  
Java_Program.txt
  

My understanding after looking at the log is that stmr.addData returns an
IgniteFuture, which means it is an async operation, and the program ends
after the iteration completes with some data yet to be loaded into the
cache.
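
If that is what is happening, keeping the streamer inside a
try-with-resources block makes close() flush all buffered entries before the
Ignite client node is stopped; a minimal sketch with placeholder connection
details, SQL and column types:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamFromOracle {

    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start("Client.xml");
             Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//host:1521/service", "user", "pwd");
             PreparedStatement stmt = conn.prepareStatement(
                 "select KEY_COL, VAL_COL from SOME_TABLE");
             // Closing the streamer (end of this block) flushes remaining buffered entries.
             IgniteDataStreamer<String, String> stmr = ignite.dataStreamer("PieCountryAllocationCache")) {

            stmt.setFetchSize(1000);

            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next())
                    stmr.addData(rs.getString(1), rs.getString(2));
            }
        } // streamer closed (and flushed) before the Ignite client node stops
    }
}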


May 17, 2020 1:35:29 PM org.apache.ignite.logger.java.JavaLogger error
SEVERE: DataStreamer operation failed.
class org.apache.ignite.IgniteCheckedException: Data streamer has been
cancelled: DataStreamerImpl [bufLdrSzPerThread=4096,
rcvr=org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$IsolatedUpdater@3b0ee03a,
ioPlcRslvr=null, cacheName=PieCountryAllocationCache, bufSize=512,
parallelOps=0, timeout=-1, autoFlushFreq=0,
bufMappings={be10cd31-aed7-448a-8fec-60fd72a62313=Buffer
[node=TcpDiscoveryNode [id=be10cd31-aed7-448a-8fec-60fd72a62313,
addrs=[127.0.0.1, 172.30.197.5], sockAddrs=[/127.0.0.1:47500,
azuswvlnx00687.corp.frk.com/172.30.197.5:47500], discPort=47500, order=1,
intOrder=1, lastExchangeTime=1589702634687, loc=false,
ver=2.7.6#20190911-sha1:21f7ca41, isClient=false], isLocNode=false,
idGen=85, sem=java.util.concurrent.Semaphore@5b12012e[Permits = 15],
perNodeParallelOps=64, entriesCnt=2944, locFutsSize=0, reqsSize=49],
92cbed29-93d6-428d-a3da-a30e4264aa20=Buffer [node=TcpDiscoveryNode
[id=92cbed29-93d6-428d-a3da-a30e4264aa20, addrs=[127.0.0.1, 172.30.197.6],
sockAddrs=[azuswvlnx00688.corp.frk.com/172.30.197.6:47500,
/127.0.0.1:47500], discPort=47500, order=2, intOrder=2,
lastExchangeTime=1589702635145, loc=false, ver=2.7.6#20190911-sha1:21f7ca41,
isClient=false], isLocNode=false, idGen=99,
sem=java.util.concurrent.Semaphore@2f7dcef2[Permits = 0],
perNodeParallelOps=64, entriesCnt=1152, locFutsSize=0, reqsSize=64]},
cacheObjProc=GridProcessorAdapter [],
cacheObjCtx=org.apache.ignite.internal.processors.cache.binary.CacheObjectBinaryContext@4a3be6a5,
cancelled=true, cancellationReason=null, failCntr=0,
activeFuts=GridConcurrentHashSet [elements=[GridFutureAdapter
[ignoreInterrupts=false, state=INIT, res=null, hash=1579584742],
GridFutureAdapter [ignoreInterrupts=false, state=INIT, res=null,
hash=2059282367], GridFutureAdapter [ignoreInterrupts=false, state=INIT,
res=null, hash=1027006452], GridFutureAdapter [ignoreInterrupts=false,
state=INIT, res=null, hash=950125603], GridFutureAdapter
[ignoreInterrupts=false, state=INIT, res=null, hash=227100877],
GridFutureAdapter [ignoreInterrupts=false, state=INIT, res=null,
hash=741370455], GridFutureAdapter [ignoreInterrupts=false, state=INIT,
res=null, hash=1536478396], GridFutureAdapter [ignoreInterrupts=false,
state=INIT, res=null, hash=1081344572], GridFutureAdapter
[ignoreInterrupts=false, state=INIT, res=null, hash=1538745405],
GridFutureAdapter [ignoreInterrupts=false, state=INIT, res=null,
hash=2000563893], GridFutureAdapter [ignoreInterrupts=false, state=INIT,
res=null, hash=997918120], GridFutureAdapter [ignoreInterrupts=false,
state=INIT, res=null, hash=985679444], GridFutureAdapter
[ignoreInterrupts=false, state=INIT, res=null, hash=1164436797],
GridFutureAdapter [ignoreInterrupts=false, state=INIT, res=null,
hash=954937264], GridFutureAdapter [ignoreInterrupts=false, state=INIT,
res=null, hash=339126187], GridFutureAdapter [ignoreInterrupts=false,
state=INIT, res=null, hash=1053856141], GridFutureAdapter
[ignoreInterrupts=false, state=INIT, res=null, hash=862152124],
GridFutureAdapter [ignoreInterrupts=false, state=INIT, res=null,
hash=1934729582]]],
jobPda=org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$DataStreamerPda@3721177d,
depCls=null, fut=DataStreamerFuture [super=GridFutureAdapter
[ignoreInterrupts=false, state=INIT, res=null, hash=1986676021]],
publicFut=IgniteFuture [orig=DataStreamerFuture [super=GridFutureAdapter
[ignoreInterrupts=false, state=INIT, res=null, hash=1986676021]]],
disconnectErr=null, closed=true, lastFlushTime=1589702649336,
skipStore=false, keepBinary=false, 

Re: Service Node vs Data Node

2020-05-17 Thread nithin91
Thanks for the detailed explanation. It is very helpful.

I have one query.

To execute a compute job on the server nodes, should I start as a client
node and use the command below? Please correct me if I am wrong.
IgniteCompute compute = ignite.compute(cluster.forRemotes());


Also, can you please confirm that a compute job can be initiated only from a
client node (i.e. by setting clientMode=true in the bean file)?
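
For reference, a minimal sketch of that pattern (file name is a
placeholder): start in client mode and restrict the compute to the server
nodes:

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCompute;
import org.apache.ignite.Ignition;

public class ComputeFromClient {

    public static void main(String[] args) {
        // Client mode can also be set with clientMode=true in the XML configuration.
        Ignition.setClientMode(true);

        try (Ignite ignite = Ignition.start("Client.xml")) {
            // Restrict execution to server (non-client) nodes only.
            IgniteCompute compute = ignite.compute(ignite.cluster().forServers());

            // The closure class still has to be available on the servers
            // (or peer class loading must be enabled).
            compute.broadcast(() -> System.out.println("Running on a server node"));
        }
    }
}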







--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Service Node vs Data Node

2020-05-16 Thread nithin91
Hi 

We are exploring the Ignite Service Grid.

Can anyone explain the difference between a service node and a data node?

Currently I have 2 data nodes.

If I need a service deployed, do I need a new node which doesn't store data
(based on the explanation given in the video
https://www.youtube.com/watch?v=nZ57o330yD0 ), or can I have the services
deployed on any of the data nodes directly?

Also, if we want to use the compute grid, can we connect in client mode or
do we need to start as a server node?
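
For reference, a minimal sketch of deploying a service directly onto the
existing server (data) nodes; the service class and names are placeholders:

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.services.Service;
import org.apache.ignite.services.ServiceConfiguration;
import org.apache.ignite.services.ServiceContext;

public class ServiceDeployExample {

    // Placeholder service implementation.
    public static class MyService implements Service {
        @Override public void cancel(ServiceContext ctx) { }
        @Override public void init(ServiceContext ctx) { }
        @Override public void execute(ServiceContext ctx) { }
    }

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start("Client.xml")) {
            // Deploy one instance somewhere on the existing server (data) nodes.
            ignite.services().deployClusterSingleton("mySingleton", new MyService());

            // Or deploy with an explicit configuration, e.g. one instance per node.
            ServiceConfiguration cfg = new ServiceConfiguration();
            cfg.setName("myNodeSingleton");
            cfg.setService(new MyService());
            cfg.setMaxPerNodeCount(1);
            ignite.services().deploy(cfg);
        }
    }
}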



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh

2020-05-15 Thread nithin91
Hi 

The issue got resolved after making the following changes.
Attached are the updated code, client.xml (in the src/resources folder) and
the pom.xml file.
srcandpomupdated.zip

  

1. Removed the following dependency from pom.xml:

   <dependency>
       <groupId>org.apache.ignite</groupId>
       <artifactId>ignite-spring-boot-autoconfigure-ext</artifactId>
       <version>1.0.0</version>
   </dependency>
2. Modified IgniteConfig.class so that the igniteInstance method returns an
Ignite instance by calling Ignition.start("Ignite-Client.xml"). Earlier it
used to return an IgniteConfiguration built with the following piece of
code; also, earlier I wasn't using IgniteClient.xml.

IgniteConfiguration cfg = new IgniteConfiguration();

// The node will be started as a client node.
cfg.setClientMode(true);

// Classes of custom Java logic will be transferred over the wire from this app.
cfg.setPeerClassLoadingEnabled(true);

// Setting up an IP finder to ensure the client can locate the servers.
TcpDiscoveryMulticastIpFinder ipFinder = new TcpDiscoveryMulticastIpFinder();
ipFinder.setAddresses(Collections.singletonList("Ipaddress"));
cfg.setDiscoverySpi(new TcpDiscoverySpi().setIpFinder(ipFinder));

// Starting the node
cfg.setSqlSchemas("PIE");
return cfg;


Although this seems to be working, I have the following queries with this
approach. Can you please provide your inputs/suggestions?

We are going to expose this Spring Boot app as a RESTful API, and we will
refresh the Ignite cache by sending a request to this application whenever
the data in Oracle changes. Since the app is exposed as an API, the Ignite
client node instance launched from the Spring Boot application will be
active forever. My queries are:

1. Is it OK to have a client node which is continuously connected forever?
If so, how many such client nodes can exist?
2. Can this application be hosted on one of the server nodes, on a different
port?
3. Will the client node reconnect automatically to the cluster if it has
lost communication with the servers? If the client node is down and there is
an API request, the application will throw an error.
4. Is this way of refreshing the cache efficient? If not, can you please
suggest a better way?
5. Please also provide an example of having the REST API call a compute job
which would execute on a separate server
(https://apacheignite.readme.io/docs/compute-grid).

I also have a few more queries related to Ignite behaviour. It would be
really helpful if you can provide some inputs.

1. Currently we are using the ignite.cache("CacheName").loadCache(null)
method to load the cache initially and also incrementally, as this method
loads data very fast (almost 0.1 million rows in 1 minute). For the
incremental load we use the same method, passing an additional argument
(custom SQL) to load/update only the records that have been created or
changed. What happens is that if the key doesn't exist the data is inserted,
but if the key is present its value is not updated. Why is the value not
updated if the key is present?

2. To overcome this, I would have to build my own custom cache store as
mentioned in the following link:
https://apacheignite.readme.io/v1.9/docs/data-loading.
But the problem with this approach is that I need to use a JDBC result set
to load the data, which is very slow as we have to insert row by row.
Is there a better way to update the keys?


3. How can data be loaded faster with a custom cache store, as loading the
data this way is very slow?

Re: Restarting the Main node when it goes down

2020-05-15 Thread nithin91
Thanks this information is helpful



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Ignite.cache.loadcache.Does this method do Incremental Load?

2020-05-15 Thread nithin91
Then what is the best way to perform an incremental load?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Critical System error Detected

2020-05-15 Thread nithin91
Yes, I am able to ping the IP address from my local machine, and yes, I am
specifying *only the IP address corresponding to a particular server node in
the client node bean file*. Is it better to have all the server node IP
addresses in the client node's bean file under the following section?










--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Restarting the Main node when it goes down

2020-05-15 Thread nithin91
Also, what should the "addresses" property contain in the bean file for the
client node? Should it have the list of all server nodes' IP addresses?
Also, can you please explain what happens if the list of all server nodes'
IP addresses is specified.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Restarting the Main node when it goes down

2020-05-15 Thread nithin91
Actually, in node A's bean file no IP address is specified for the
"addresses" property of the class
"org.apache.ignite.spi.discovery.tcp.ipfinder.multicast.TcpDiscoveryMulticastIpFinder",
but in node B's bean file the IP address of A is specified.

Is it because node A's bean file does not have any IP address under the
"addresses" field that it is not able to recognize node B, even though node
B's bean file has the IP address of node A?

Is it required to specify the "addresses" property of
TcpDiscoveryMulticastIpFinder for the main node (i.e. the node initially
started when launching the Ignite cluster)?

Also, if I have 3 nodes, what value should the "addresses" property of
TcpDiscoveryMulticastIpFinder have in each of these bean files? Since this
property accepts a list, do we need to specify all 3 nodes' IP addresses in
each bean file deployed on each node?
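
As an illustration only (host names are placeholders), the equivalent Java
configuration usually lists the same full set of addresses on every node, so
any node can find the others regardless of which one starts first:

import java.util.Arrays;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi;
import org.apache.ignite.spi.discovery.tcp.ipfinder.multicast.TcpDiscoveryMulticastIpFinder;

public class DiscoveryConfigExample {

    static IgniteConfiguration configuration() {
        // Every node (A, B and C) lists the same set of addresses.
        TcpDiscoveryMulticastIpFinder ipFinder = new TcpDiscoveryMulticastIpFinder();
        ipFinder.setAddresses(Arrays.asList(
            "nodeA.example.com:47500..47509",
            "nodeB.example.com:47500..47509",
            "nodeC.example.com:47500..47509"));

        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setDiscoverySpi(new TcpDiscoverySpi().setIpFinder(ipFinder));
        return cfg;
    }
}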





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh

2020-05-15 Thread nithin91
Please find the log info below.


  .     ___ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  || .__|_| |_|_| |_\__, | / / / /
 =|_|==|___/=/_/_/_/
 :: Spring Boot ::(v2.2.5.RELEASE)

2020-05-15 10:35:32.005  INFO 14684 --- [   main]
org.example.springbootapi.App: Starting App on HYDNBK155967 with
PID 14684 (C:\Users\ngovind\eclipse-workspace\springbootapi\target\classes
started by ngovind in C:\Users\ngovind\eclipse-workspace\springbootapi)
2020-05-15 10:35:32.008  INFO 14684 --- [   main]
org.example.springbootapi.App: No active profile set, falling
back to default profiles: default
2020-05-15 10:35:32.564  INFO 14684 --- [   main]
.s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data JPA
repositories in DEFAULT mode.
2020-05-15 10:35:32.586  INFO 14684 --- [   main]
.s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository
scanning in 13ms. Found 0 JPA repository interfaces.
2020-05-15 10:35:33.450  INFO 14684 --- [   main]
o.s.b.w.embedded.tomcat.TomcatWebServer  : Tomcat initialized with port(s):
3000 (http)
2020-05-15 10:35:33.458  INFO 14684 --- [   main]
o.apache.catalina.core.StandardService   : Starting service [Tomcat]
2020-05-15 10:35:33.459  INFO 14684 --- [   main]
org.apache.catalina.core.StandardEngine  : Starting Servlet engine: [Apache
Tomcat/9.0.31]
2020-05-15 10:35:33.696  INFO 14684 --- [   main]
org.apache.jasper.servlet.TldScanner : At least one JAR was scanned for
TLDs yet contained no TLDs. Enable debug logging for this logger for a
complete list of JARs that were scanned but no TLDs were found in them.
Skipping unneeded JARs during scanning can improve startup time and JSP
compilation time.
2020-05-15 10:35:33.699  INFO 14684 --- [   main]
o.a.c.c.C.[Tomcat].[localhost].[/]   : Initializing Spring embedded
WebApplicationContext
2020-05-15 10:35:33.699  INFO 14684 --- [   main]
o.s.web.context.ContextLoader: Root WebApplicationContext:
initialization completed in 1645 ms
2020-05-15 10:35:33.955  INFO 14684 --- [   main]
com.zaxxer.hikari.HikariDataSource   : HikariPool-1 - Starting...
2020-05-15 10:35:34.073  INFO 14684 --- [   main]
com.zaxxer.hikari.HikariDataSource   : HikariPool-1 - Start completed.
2020-05-15 10:35:34.120  INFO 14684 --- [   main]
o.hibernate.jpa.internal.util.LogHelper  : HHH000204: Processing
PersistenceUnitInfo [name: default]
2020-05-15 10:35:34.184  INFO 14684 --- [   main]
org.hibernate.Version: HHH000412: Hibernate ORM core
version 5.4.12.Final
2020-05-15 10:35:34.305  INFO 14684 --- [   main]
o.hibernate.annotations.common.Version   : HCANN01: Hibernate Commons
Annotations {5.1.0.Final}
2020-05-15 10:35:34.406  INFO 14684 --- [   main]
org.hibernate.dialect.Dialect: HHH000400: Using dialect:
org.hibernate.dialect.H2Dialect
2020-05-15 10:35:34.619  INFO 14684 --- [   main]
o.h.e.t.j.p.i.JtaPlatformInitiator   : HHH000490: Using JtaPlatform
implementation:
[org.hibernate.engine.transaction.jta.platform.internal.NoJtaPlatform]
2020-05-15 10:35:34.625  INFO 14684 --- [   main]
j.LocalContainerEntityManagerFactoryBean : Initialized JPA
EntityManagerFactory for persistence unit 'default'
2020-05-15 10:35:34.687  WARN 14684 --- [   main]   
  
: Failed to resolve default logging config file:
config/java.util.logging.properties
Console logging handler is not configured.
2020-05-15 10:35:34.690  WARN 14684 --- [   main]
o.apache.ignite.internal.util.typedef.G  : Ignite work directory is not
provided, automatically resolved to:
C:\Users\ngovind\eclipse-workspace\springbootapi\ignite\work
2020-05-15 10:35:34.783  INFO 14684 --- [   main]
org.apache.ignite.internal.IgniteKernal  : 

>>>__    
>>>   /  _/ ___/ |/ /  _/_  __/ __/  
>>>  _/ // (7 7// /  / / / _/
>>> /___/\___/_/|_/___/ /_/ /___/   
>>> 
>>> ver. 2.7.6#20190911-sha1:21f7ca41
>>> 2019 Copyright(C) Apache Software Foundation
>>> 
>>> Ignite documentation: http://ignite.apache.org

2020-05-15 10:35:34.783  INFO 14684 --- [   main]
org.apache.ignite.internal.IgniteKernal  : Config URL: n/a
2020-05-15 10:35:34.798  INFO 14684 --- [   main]
org.apache.ignite.internal.IgniteKernal  : IgniteConfiguration
[igniteInstanceName=null, pubPoolSize=8, svcPoolSize=8, callbackPoolSize=8,
stripedPoolSize=8, sysPoolSize=8, mgmtPoolSize=4, igfsPoolSize=8,
dataStreamerPoolSize=8, utilityCachePoolSize=8,
utilityCacheKeepAliveTime=6, p2pPoolSize=2, qryPoolSize=8,
igniteHome=null,
igniteWorkDir=C:\Users\ngovind\eclipse-workspace\springbootapi\ignite\work,

Re: Critical System error Detected

2020-05-15 Thread nithin91
Also, I am not getting this error when I connect with the Node.js thin
client.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Critical System error Detected

2020-05-15 Thread nithin91
Hi 

I have two Ignite nodes running on Linux, and the bean file that contains
the cache configuration is deployed on both nodes. I am trying to run a
simple standalone Java program and I am getting the following error.

Can anyone help me understand why I am getting this error?

Attaching the Java program I am running and the Client.xml file.
 
Java_Program.txt
  
Client-XML.xml
  

May 15, 2020 3:38:27 PM org.apache.ignite.logger.java.JavaLogger error
SEVERE: Blocked system-critical thread has been detected. This can lead to
cluster-wide undefined behaviour [threadName=tcp-client-disco-msg-worker,
blockedFor=17s]
May 15, 2020 3:38:27 PM java.util.logging.LogManager$RootLogger log
SEVERE: Critical system error detected. Will be handled accordingly to
configured handler [hnd=StopNodeOrHaltFailureHandler [tryStop=false,
timeout=0, super=AbstractFailureHandler
[ignoredFailureTypes=[SYSTEM_WORKER_BLOCKED,
SYSTEM_CRITICAL_OPERATION_TIMEOUT]]], failureCtx=FailureContext
[type=SYSTEM_WORKER_BLOCKED, err=class o.a.i.IgniteException: GridWorker
[name=tcp-client-disco-msg-worker, igniteInstanceName=null, finished=false,
heartbeatTs=1589537289382]]]
class org.apache.ignite.IgniteException: GridWorker
[name=tcp-client-disco-msg-worker, igniteInstanceName=null, finished=false,
heartbeatTs=1589537289382]
at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance$2.apply(IgnitionEx.java:1831)
at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance$2.apply(IgnitionEx.java:1826)
at
org.apache.ignite.internal.worker.WorkersRegistry.onIdle(WorkersRegistry.java:233)
at
org.apache.ignite.internal.util.worker.GridWorker.onIdle(GridWorker.java:297)
at
org.apache.ignite.internal.processors.timeout.GridTimeoutProcessor$TimeoutWorker.body(GridTimeoutProcessor.java:221)
at
org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:120)
at java.lang.Thread.run(Thread.java:748)

May 15, 2020 3:38:39 PM org.apache.ignite.logger.java.JavaLogger error
SEVERE: Blocked system-critical thread has been detected. This can lead to
cluster-wide undefined behaviour [threadName=tcp-client-disco-msg-worker,
blockedFor=30s]
May 15, 2020 3:38:39 PM java.util.logging.LogManager$RootLogger log
SEVERE: Critical system error detected. Will be handled accordingly to
configured handler [hnd=StopNodeOrHaltFailureHandler [tryStop=false,
timeout=0, super=AbstractFailureHandler
[ignoredFailureTypes=[SYSTEM_WORKER_BLOCKED,
SYSTEM_CRITICAL_OPERATION_TIMEOUT]]], failureCtx=FailureContext
[type=SYSTEM_WORKER_BLOCKED, err=class o.a.i.IgniteException: GridWorker
[name=tcp-client-disco-msg-worker, igniteInstanceName=null, finished=false,
heartbeatTs=1589537289382]]]
class org.apache.ignite.IgniteException: GridWorker
[name=tcp-client-disco-msg-worker, igniteInstanceName=null, finished=false,
heartbeatTs=1589537289382]
at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance$2.apply(IgnitionEx.java:1831)
at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance$2.apply(IgnitionEx.java:1826)
at
org.apache.ignite.internal.worker.WorkersRegistry.onIdle(WorkersRegistry.java:233)
at
org.apache.ignite.internal.util.worker.GridWorker.onIdle(GridWorker.java:297)
at
org.apache.ignite.internal.processors.timeout.GridTimeoutProcessor$TimeoutWorker.body(GridTimeoutProcessor.java:221)
at
org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:120)
at java.lang.Thread.run(Thread.java:748)

May 15, 2020 3:39:00 PM org.apache.ignite.logger.java.JavaLogger error
SEVERE: Blocked system-critical thread has been detected. This can lead to
cluster-wide undefined behaviour [threadName=grid-nio-worker-tcp-comm-0,
blockedFor=12s]
May 15, 2020 3:39:00 PM java.util.logging.LogManager$RootLogger log
SEVERE: Critical system error detected. Will be handled accordingly to
configured handler [hnd=StopNodeOrHaltFailureHandler [tryStop=false,
timeout=0, super=AbstractFailureHandler
[ignoredFailureTypes=[SYSTEM_WORKER_BLOCKED,
SYSTEM_CRITICAL_OPERATION_TIMEOUT]]], failureCtx=FailureContext
[type=SYSTEM_WORKER_BLOCKED, err=class o.a.i.IgniteException: GridWorker
[name=grid-nio-worker-tcp-comm-0, igniteInstanceName=null, finished=false,
heartbeatTs=1589537327904]]]
class org.apache.ignite.IgniteException: GridWorker
[name=grid-nio-worker-tcp-comm-0, igniteInstanceName=null, finished=false,
heartbeatTs=1589537327904]
at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance$2.apply(IgnitionEx.java:1831)
at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance$2.apply(IgnitionEx.java:1826)
at
org.apache.ignite.internal.worker.WorkersRegistry.onIdle(WorkersRegistry.java:233)
at
org.apache.ignite.internal.util.worker.GridWorker.onIdle(GridWorker.java:297)
at

Re: Scheduling Cache Refresh

2020-05-14 Thread nithin91
Hi 

Actually, the bean file which contains the cache store factory details and
the cache configuration details is deployed on the Linux server on which the
Ignite instance is running. I am trying to connect to the Ignite instance
running on the Linux server from my local machine.

PFA the bean file that is deployed. Ignite-Server.xml
  





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh

2020-05-14 Thread nithin91
Hi 

The attachment link is at the end of the post; however, I am attaching it again here.
srcandpom.zip
  

Please let me know in case of any difficulties while opening the attachment.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Restarting the Main node when it goes down

2020-05-14 Thread nithin91
Hi 

I have two nodes, node A (main node) and node B, with persistence enabled.
When node B goes down and is restarted, it is able to recognize node A and
joins the baseline topology. But when node A alone goes down and is then
restarted, it is not able to recognize node B and is not included in the
baseline topology. The issue is fixed only if I stop node B and restart it
again.

This is the baseline topology information displayed on the console whenever
I connect to the cluster as a client:

*-- Baseline [id=0, size=2, online=1, offline=1]*

Is there a way to fix this issue by just restarting node A, instead of
stopping and starting node B?

Is this happening because persistence is enabled? Also, can anyone please
let me know what needs to be done if the same thing happens without
persistence enabled.
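
If persistence is enabled, this usually comes down to the baseline topology;
as a sketch only (not a definitive fix), the baseline can be reset to the
server nodes that are currently online once both nodes are up again, either
from code as below or with the control.sh baseline commands:

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;

public class ResetBaseline {

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start("Client.xml")) {
            // Make sure the cluster is active before changing the baseline.
            ignite.cluster().active(true);

            // Reset the baseline topology to the server nodes that are online right now.
            ignite.cluster().setBaselineTopology(ignite.cluster().forServers().nodes());
        }
    }
}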




--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh

2020-05-14 Thread nithin91
Hi 

Attached is the sample Spring Boot application that I am running on my local
machine; it connects to the Ignite cluster deployed on Unix as a client node
and refreshes the cache whenever a GET request is made.

I am facing the following error. Can you please help me resolve it?

*org.apache.ignite.IgniteException: Spring application context resource is
not injected.*

I am facing this error while trying to refresh the cache using the following
method:

ignite.cache("NumberandDateFormatCache").loadCache(null,"java.lang.Long",
"select a.*,row_number() over(order by COUNTRY_CODE) AS ID FROM
\"vwNumberAndDateFormat\" a where rownum<=10 ");

But the application executes fine when I get the list of caches using the
following method:

ignite.cacheNames();

Attached is the entire project for your reference. Can you please help me
resolve this issue?

Also, can you please let me know whether this is the right way to
instantiate an Ignite instance when building a REST API with Spring Boot.




--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Deploying the Ignite Maven Project in LINUX

2020-05-08 Thread nithin91
Thanks for sharing the link. It is really very helpful.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Deploying the Ignite Maven Project in LINUX

2020-05-07 Thread nithin91
Hi 

I am new to Java. If possible, can you please share a link on how to create
a fat JAR (i.e. a JAR with all dependencies) using Eclipse.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Deploying the Ignite Maven Project in LINUX

2020-05-05 Thread nithin91
Hi 

We have a couple of Ignite nodes running on Linux; the cache configuration
details are specified in the bean file and deployed on both nodes. Currently
I connect to one of the nodes as a client and load the cache from my local
system.

Is there a way I can deploy the Java code with its Maven dependencies on
Linux and run the program there?

I am using Eclipse as my IDE.








--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Scheduling Cache Refresh

2020-05-05 Thread nithin91
Hi ,

We are using Ignite as a caching layer and loading the data from Oracle to
Ignite using the JDBC POJO method
ignite.cache("CacheName").loadCache(null,"java.lang.String","CustomSql").

Now we want to schedule the refresh in such a way that whenever the job that
updates the Oracle table completes, the corresponding cache refresh job is
triggered.

We thought about going with the following approach:

Build a REST API using Spring Boot that accepts a cache name as a query
parameter, after which the corresponding cache would be refreshed by calling
the JDBC POJO method
   ignite.cache("CacheName").loadCache(null,"java.lang.String","CustomSql").

Once the job that updates the Oracle table has completed, an HTTP GET
request would be triggered with the cache name as the query parameter.

 * But this approach isn't working because the JDBC POJO method
   ignite.cache("CacheName").loadCache(null,"java.lang.String","CustomSql")
   is not available from Spring Boot.*

Can anyone help me with the best possible approach to schedule the cache
refresh?







--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Ignite Persistence

2020-04-14 Thread nithin91
Thanks for sharing this useful info



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Ignite Persistence

2020-04-10 Thread nithin91
Hi 

Currently we are using Ignite as an in-memory cache. We are now trying to
explore Ignite persistence, to avoid reloading the caches whenever the
cluster restarts.

I walked through the documentation and have one question. Can anyone help me
answer it?

For example, I have 2 caches and I want to persist the data present in one
cache but not the other. For this I created two data regions in the bean
file: one with persistence enabled that uses the default data region
configuration, and the other without persistence, with a max size of 500 MB,
and I referenced this data region name in the corresponding cache
configuration.

Now my question is: the default data region configuration always uses about
20 percent of RAM, so does this mean that the other data region, configured
with a size of 500 MB, uses 500 MB of RAM in addition to the 20 percent used
by the default configuration, or does it use the 500 MB out of the memory
allocated to the default region?
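
For reference, a minimal sketch of the configuration being described (the
region name is a placeholder); each data region has its own independent
off-heap memory budget, so the 500 MB region is allocated separately from
the default region's memory:

import org.apache.ignite.configuration.DataRegionConfiguration;
import org.apache.ignite.configuration.DataStorageConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class TwoRegionsConfig {

    static IgniteConfiguration configuration() {
        DataStorageConfiguration storageCfg = new DataStorageConfiguration();

        // Default region: persistence enabled, default sizing.
        storageCfg.getDefaultDataRegionConfiguration().setPersistenceEnabled(true);

        // Second, in-memory-only region capped at 500 MB of off-heap memory.
        DataRegionConfiguration inMemoryRegion = new DataRegionConfiguration();
        inMemoryRegion.setName("inMemoryRegion");
        inMemoryRegion.setPersistenceEnabled(false);
        inMemoryRegion.setMaxSize(500L * 1024 * 1024);
        storageCfg.setDataRegionConfigurations(inMemoryRegion);

        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setDataStorageConfiguration(storageCfg);
        // Caches then reference "inMemoryRegion" via CacheConfiguration.setDataRegionName(...).
        return cfg;
    }
}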




--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Loading Data from RDBMS to ignite using Data Streamer

2020-03-26 Thread nithin91
Hi

Currently I am able to load 5.5 million rows from an Oracle table into
Ignite in 1 hour using the JDBC POJO configuration. So I was curious how
long it would take to load the data using data streamers, and when I tried
to load 10,000 records with the data streamer it took almost half an hour.
So I wanted to know whether the technique I implemented for streaming data
from Oracle to Ignite is correct or wrong. PFB the pseudo code I implemented
for loading data from Oracle to Ignite with the data streamer.
Using this method it takes a lot of time, which is expected, as we loop
through each and every record.

try (IgniteDataStreamer<Key, Value> stmr = ignite.dataStreamer("CacheName")) {
    // Stream entries.
    while (rs.next()) { // rs is a JDBC result set
        Key keyobj = new Key(rs.getString(1));
        Value valueobj = new Value(rs.getString(2), rs.getString(3), rs.getString(4));
        stmr.addData(keyobj, valueobj);
    }
}

Can you please share sample code for streaming data from any RDBMS to
Ignite? Also, can you please let me know if an external streamer is needed
to stream data from an RDBMS to Ignite.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Loading Data from RDBMS to ignite using Data Streamer

2020-03-23 Thread nithin91
Hi ,

I am new to Ignite. I am able to load data from Oracle into Ignite using the
cache POJO store method ignite.cache("cachename").loadCache(null). But the
documentation mentions that data streamers provide better performance when
loading data into an Ignite cache.

*Documentation Reference:*

https://apacheignite.readme.io/docs/data-streamers

Following is the example provided for Data Streamers in the documentation

// Get the data streamer reference and stream data.
try (IgniteDataStreamer<Integer, String> stmr = ignite.dataStreamer("myStreamCache")) {
    // Stream entries.
    for (int i = 0; i < 10; i++)
        stmr.addData(i, Integer.toString(i));
}

I have the following questions, taking this example as a reference for
streaming data from an Oracle table into an Ignite cache.

1. Is an external streamer needed to stream data from Oracle to Ignite in
order to use the data streaming technique shown in the example above?

2. If an external streamer is not required, do we need to loop through the
JDBC result set and add the data to the cache? Please find below the pseudo
code implementation of the same.

try (IgniteDataStreamer<Key, Value> stmr = ignite.dataStreamer("CacheName")) {
    // Stream entries.
    while (rs.next()) { // rs is a JDBC result set
        Key keyobj = new Key(rs.getString(1));
        Value valueobj = new Value(rs.getString(2), rs.getString(3), rs.getString(4));
        stmr.addData(keyobj, valueobj);
    }
}

When I tried to load data into the Ignite cache from Oracle using this
method, it took a lot of time, which is expected as we loop through each and
every record. Is this the right way of using data streamers to load data
from Oracle to Ignite? If it is not, can anyone share sample code to stream
data from any RDBMS table into an Ignite cache.
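
As a sketch only (connection string, SQL, and the hash-based partitioning
below are placeholders), one way to speed this up is to keep a single
streamer open, raise the JDBC fetch size, and feed the streamer from several
reader threads, since the data streamer can be used from multiple threads:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class ParallelJdbcStreaming {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//host:1521/service"; // placeholder

        try (Ignite ignite = Ignition.start("Ignite-Client.xml");
             IgniteDataStreamer<String, String> stmr = ignite.dataStreamer("CacheName")) {

            ExecutorService pool = Executors.newFixedThreadPool(4);

            // Placeholder partitioning: each thread loads one hash bucket of the table.
            for (int p = 0; p < 4; p++) {
                final int part = p;
                pool.submit(() -> {
                    try (Connection conn = DriverManager.getConnection(url, "user", "pwd");
                         PreparedStatement stmt = conn.prepareStatement(
                             "select KEY_COL, VAL_COL from SOME_TABLE where mod(ora_hash(KEY_COL), 4) = ?")) {
                        stmt.setInt(1, part);
                        stmt.setFetchSize(1000); // avoid one round trip per row
                        try (ResultSet rs = stmt.executeQuery()) {
                            while (rs.next())
                                stmr.addData(rs.getString(1), rs.getString(2));
                        }
                    }
                    catch (Exception e) {
                        e.printStackTrace();
                    }
                });
            }

            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.HOURS);
        } // the streamer is closed here, flushing any remaining buffered entries
    }
}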





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Ignite.cache.loadcache.Does this method do Incremental Load?

2020-03-23 Thread nithin91
Hi 

I am trying to load data into an Ignite cache using the JDBC POJO store
method ignite.cache("cacheName").loadCache(null). I have used this method
and got the following results in the following scenarios.

*Scenario 1: Loading the same key which is already available in the cache*

In this case, the value corresponding to the key is not updated in the cache
based on the latest available record.

*Scenario 2: Loading a key which is not present in the cache*

In this case, it appends the new key/value pair to the cache and preserves
the old data.


My doubt is why, in scenario 1, the value corresponding to the key is not
updated when I try to load the same key again.

Does this method do an incremental load?
Is this the expected behavior, or do I need to set an additional property in
the bean file? Attaching the bean configuration of the cache.

cache.xml
  









--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Isolation during Ignite Cache load

2020-03-23 Thread nithin91
Hi ,

But it is not behaving that way. I have tested it using the following
scenario.

Scenario:

Initially I loaded 1000 rows from the Oracle table into the Ignite cache
with the filter where rownum<=1000.
During the cache load process, I checked the count using a SQL fields query
and got back the number of records that had been loaded into the cache up to
that point, i.e. 100, 200 and finally 1000.

I then started the load into the Ignite cache from the Oracle table again
with the same filter where rownum<=1000.
Now, during the cache load, the count shows 1000 until 1000 rows are loaded,
but once the count crosses 1000 it shows 1001, 1002 and finally 2000.

Can you please help me fix this issue? That is, while loading the Ignite
cache using the JDBC POJO Ignite.cache.loadCache method, Ignite should
return the data as it was prior to the start of the data load, instead of
the data loaded up to that point, when a query is submitted during the load
process. As we use the Ignite cache for front-end websites, there might be
scenarios where the user queries the cache for a specific value and gets an
empty response because the requested data has not been loaded yet while the
cache load process is running. The problem would be resolved if Ignite
always showed the data as it was prior to the start of the data load when
the cache is queried during the load process.









--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Isolation during Ignite Cache load

2020-03-18 Thread nithin91
Hi ,

By "filter" I mean that a SQL fields query with a where clause is executed.
I hope the following example helps you understand my query.

*Scenario:*

For example, there is a cache named Products which has 1 million records,
with two distinct product categories (Furniture and Electronics) among those
1 million records.

Now, once the data load is started, if I execute a SQL fields query like the
one below, will I get the count as 1 million, or only the count of the
records that have been loaded so far?

select count(1) from Products where category='Electronics'



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Data Isolation during Ignite Cache load

2020-03-17 Thread nithin91
Hi ,

We have created a cache in Ignite that gets loaded with data from Oracle.
The cache is loaded using the JDBC POJO store method
Ignite.cache("Cache Name").loadCache(null). Can anyone explain how Ignite
handles the scenario below when data is loaded using this method?

*Scenario:*
 
We have built a REST service on top of this cache which sends the data in
the cache as the response after applying some filters. During the data load,
if the user sends a GET request and we try to fetch the data from the Ignite
cache, will the filters be applied only to the data that has been loaded so
far, or to the data that was available prior to the start of the data load?
*Please note we have set the atomicity mode of the cache to ATOMIC.*


If the filters are applied only to the data that has been loaded so far, is
there a property I need to set so that, until the cache load completes, any
query on the cache fetches only the data that was available prior to the
start of the data load? Currently, as per my observation, the filters are
applied only to the data that has been loaded so far.

Also, can anyone suggest possible ways to schedule the Ignite cache refresh
such that the refresh job is triggered on the success of another Unix shell
script job.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Load from Oracle to Ignite is very slow

2020-02-20 Thread nithin91
Hi 

When I executed it on the server, the data loaded very fast. Thanks for the
inputs.

With respect to the data streamer, it would be really helpful if you could
share any sample code other than the one provided in the documentation.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Getting error while extracting Date Fields from ignite cache using node express

2020-02-20 Thread nithin91
Hi,

We have an Oracle table for which a corresponding cache/table is created in
Ignite and loaded using the cache JDBC POJO store. On top of this cache a
REST API is built using Node Express.

We fetch the data from the cache using a SQL fields query, and the
programming language used is Node.js. We are able to fetch all the fields
except the date fields; when we try to fetch the date fields we get the
following *error* after executing the statement const data = await
cursorProductDetails.getAll();


*Error: Type type code -2 is not supported*

The Java data type for this field is java.sql.Date.


If I cast the date field as varchar in the SQL fields query, the error is
not encountered. I also did not encounter this error when fetching the data
using Java and Python. Can anyone please let me know how to resolve this
issue without casting to varchar, and also the reason why I get this error
only when using Node Express?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Load from Oracle to Ignite is very slow

2020-02-18 Thread nithin91
Hi 

We are doing a POC, as a result of which we are running it in local mode.

Currently it is taking 25 min to load 2 records with the cache JDBC POJO
store.

I am even applying an initial filter to reduce unnecessary records:

ignite.cache("PieProductRiskCache").loadCache(null,"ignite.example.IgniteUnixImplementation.PieProductRiskKey",
"select * from Table where as_of_Date_Std='31-Dec-2019'");

Regarding the data streamer code I have shared, is that the way data
streamers are implemented, or is there another way? If the approach is
correct, then it will not work well, right, as we are looping through the
result set?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Load from Oracle to Ignite is very slow

2020-02-18 Thread nithin91
I have two nodes running on my local machine.

Following is the logic I implemented to load data using the data streamer.
Can you please check whether the implementation is correct, and can you also
share sample code on how to push entries to the data streamer from multiple
threads?

public class PerformanceCacheStore {

    public static void main(String[] args) throws Exception {
        String url = "..";

        try (Ignite ignite = Ignition.start("Ignite-Client.xml");
             Connection conn = DriverManager.getConnection(url, "..", "..");
             PreparedStatement stmt = conn.prepareStatement(
                 "select .. where COUNTRY_CODE=? and as_of_Date_Std=?")) {

            // Note: the streamer is never closed or flushed below.
            IgniteDataStreamer<PieProductPerformanceKey, PieProductPerformance> stmr =
                ignite.dataStreamer("PieProductPerformanceCache");

            stmt.setString(1, "USA");
            stmt.setString(2, "31-Dec-2019");

            ResultSet rs = stmt.executeQuery();

            while (rs.next()) {
                PieProductPerformance perf = new PieProductPerformance();
                /*
                 perf setter methods
                 */

                PieProductPerformanceKey perfkey = new PieProductPerformanceKey();
                /*
                 perfKey setter methods
                 */

                stmr.addData(perfkey, perf);
            }

            System.out.println("Completed");
        }
    }
}




--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Data Load from Oracle to Ignite is very slow

2020-02-18 Thread nithin91
Hi,

I have multiple Oracle tables with more than 10 million rows. I want to load
these tables into an Ignite cache. To load the cache I am using the cache
JDBC POJO store, with the required project structure generated from Web
Console.

But loading the data using the cache JDBC POJO store (i.e.
ignite.cache("CacheName").loadCache(null)) is taking a lot of time. *Is
there any alternative approach to load the data from Oracle DB into the
Ignite cache?*

I tried using a data streamer as well, but I am not clear on how to use it.
It would be helpful if someone could share sample code for loading data into
an Ignite cache using a data streamer.

I have one doubt regarding the usage of the data streamer.

Following is the process mentioned in the documentation for implementing a
data streamer:

 // Stream words into the streamer cache.
 for (String word : text)
     stmr.addData(word, 1L);
}

In my case I would be looping through the result set generated by executing
a prepared statement over a JDBC connection and adding each row to the data
streamer. Will this be efficient, given that I have to loop through
10 million rows? Please correct me if this is not the right way of
implementing a data streamer.

 






--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Loading cache from Oracle Table

2020-02-18 Thread nithin91
Hi Prasad,

Is there an improvement in performance? If so, can you please share the
sample code, as I am also looking for a similar solution.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Dynamic Cache Change not allowed

2020-02-13 Thread nithin91
Also, one important observation is that we do not get this error
("dynamic cache change is not allowed") when the Ignite server node and
client node are running on the local machine. We get this error only when
the server node is running on Unix and we try to connect to it from the
local system.
Does the entire project have to be deployed on Unix?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh using Ignite

2020-02-13 Thread nithin91
Following is the java code that loads the cache.

package Load;

import java.sql.Types;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheAtomicityMode;
import org.apache.ignite.cache.store.jdbc.CacheJdbcPojoStore;
import org.apache.ignite.cache.store.jdbc.CacheJdbcPojoStoreFactory;
import org.apache.ignite.cache.store.jdbc.JdbcType;
import org.apache.ignite.cache.store.jdbc.JdbcTypeField;
import org.apache.ignite.cache.store.jdbc.dialect.OracleDialect;
import org.apache.ignite.configuration.CacheConfiguration;
import ignite.example.IgniteUnixImplementation.OrderDetails;
import ignite.example.IgniteUnixImplementation.OrderKey;

public class OrdersLoad {

    private static final class CacheJdbcPojoStoreExampleFactory
        extends CacheJdbcPojoStoreFactory<OrderKey, OrderDetails> {

        private static final long serialVersionUID = 1L;

        /** {@inheritDoc} */
        @Override public CacheJdbcPojoStore<OrderKey, OrderDetails> create() {
            setDataSourceBean("dataSource");
            return super.create();
        }
    }

    private static CacheConfiguration<OrderKey, OrderDetails> cacheConfiguration() {
        CacheConfiguration<OrderKey, OrderDetails> cfg = new CacheConfiguration<>("OrdersCache");

        CacheJdbcPojoStoreExampleFactory storefactory = new CacheJdbcPojoStoreExampleFactory();

        storefactory.setDialect(new OracleDialect());
        storefactory.setDataSourceBean("dataSource");

        JdbcType jdbcType = new JdbcType();

        jdbcType.setCacheName("OrdersCache");
        jdbcType.setDatabaseSchema("PDS_CACHE");
        jdbcType.setDatabaseTable("ORDERS2");

        jdbcType.setKeyType("ignite.example.IgniteUnixImplementation.OrderKey");
        jdbcType.setKeyFields(
            new JdbcTypeField(Types.INTEGER, "ORDERID", Long.class, "OrderID"),
            new JdbcTypeField(Types.INTEGER, "CITYID", Long.class, "CityID"));

        jdbcType.setValueType("ignite.example.IgniteUnixImplementation.OrderDetails");
        jdbcType.setValueFields(
            new JdbcTypeField(Types.VARCHAR, "PRODUCTNAME", String.class, "Productname"),
            new JdbcTypeField(Types.VARCHAR, "CUSTOMERNAME", String.class, "CustomerName"),
            new JdbcTypeField(Types.VARCHAR, "STORENAME", String.class, "StoreName"));

        storefactory.setTypes(jdbcType);

        cfg.setCacheStoreFactory(storefactory);
        cfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);
        cfg.setReadThrough(true);
        cfg.setWriteThrough(true);
        cfg.setSqlSchema("PIE");

        return cfg;
    }

    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start("Ignite-Client.xml")) {
            System.out.println(">>> Loading cache OrderDetails");

            IgniteCache<OrderKey, OrderDetails> cache = ignite.getOrCreateCache(cacheConfiguration());

            cache.clear();

            ignite.cache("OrdersCache").loadCache(null);

            System.out.println(">>> Loaded cache: OrdersCache Size=" + cache.size());
        }
    }
}





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh using Ignite

2020-02-13 Thread nithin91
Attached the bean file used



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Scheduling Cache Refresh using Ignite

2020-02-13 Thread nithin91
Thanks aealexsandrov. This information is very useful.

Also, I have one more query.

Currently, as part of the POC, Ignite is installed on UNIX and we are trying
to load data from Oracle DB into an Ignite cache using the Cache JDBC POJO
Store.

As part of this process, a custom-configured bean file (attached) is used to
start the Ignite node on UNIX. This bean file contains both the cache
configuration and the Ignite configuration.

Once the node is running, we are trying the following:

1. Connecting to the Ignite node running on UNIX from Eclipse, using a
replica of the attached bean file on the local system with the additional
property Client Mode = true, and then loading the caches defined in the bean
file deployed on UNIX from the local system using Java:

ignite.cache("CacheName").loadCache(null);

We are able to do this successfully (see the sketch at the end of this
message).

2. Connecting to the Ignite node running on UNIX using the same replica bean
file with Client Mode = true, and then trying to create and configure a new
cache and finally load it using the attached Java code.

When we try this approach, we get an error saying that a dynamic cache
change is not allowed. We do not get this error when the Ignite server node
and client node both run on the local machine; we only get it when the
server node runs on UNIX and we connect to it from the local system.

It would be really helpful if you can help me resolve this issue.

If this is not the right approach, is configuring all the caches in the bean
file the only available option? If so, what should the approach be for
building additional caches in Ignite and loading them using the Cache JDBC
POJO Store while the node is running?
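
For reference, a minimal sketch of the client-side load described in step 1.
It assumes the cache name "OrdersCache" and the configuration file
"Ignite-Client.xml" used elsewhere in this thread; the class name is made up
for illustration and this is not the attached code:

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;

public class ClientLoadSketch {
    public static void main(String[] args) {
        // The replica XML already sets clientMode=true; setClientMode(true) is the
        // programmatic way to request the same and is harmless here.
        Ignition.setClientMode(true);

        try (Ignite ignite = Ignition.start("Ignite-Client.xml")) {
            // Loads the cache declared (with its JDBC POJO store) in the server-side
            // bean file; null means "use the store's default load logic".
            ignite.cache("OrdersCache").loadCache(null);

            System.out.println(">>> Loaded entries: " + ignite.cache("OrdersCache").size());
        }
    }
}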





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Loading and Fetching the Data using Node js.

2020-02-13 Thread nithin91
Hi 

Pasted below are the code and the error I got. I am trying to query an
existing cache from Node.js that was loaded using the Cache JDBC POJO Store
in Java. It would be really helpful if you could share any sample code you
have.

PS C:\Users\ngovind\NodeApp> node NodeIgnite.js
ERROR: Binary type has different field types [typeName=OrderId,
fieldName=OrderID, fieldTypeName1=long, fieldTypeName2=double]
(node:13596) UnhandledPromiseRejectionWarning: ReferenceError: igniteClient
is not defined
at start (C:\Users\ngovind\NodeApp\NodeIgnite.js:44:5)
at processTicksAndRejections (internal/process/task_queues.js:93:5)
(node:13596) UnhandledPromiseRejectionWarning: Unhandled promise rejection.
This error originated either by throwing inside of an async function without
a catch block, or by rejecting a promise which was not handled with
.catch(). (rejection id: 1)
(node:13596) [DEP0018] DeprecationWarning: Unhandled promise rejections are
deprecated. In the future, promise rejections
that are not handled will terminate the Node.js process with a non-zero exit
code.




const IgniteClient = require('apache-ignite-client');
const IgniteClientConfiguration = IgniteClient.IgniteClientConfiguration;
const ObjectType = IgniteClient.ObjectType;
const CacheEntry = IgniteClient.CacheEntry;
const ComplexObjectType = IgniteClient.ComplexObjectType;

class OrderKey {
constructor(OrderID = null, CityID= null) {
this.OrderID = OrderID;
this.CityID = CityID;
}  
}

class OrderDetails {
constructor(Productname = null, CustomerName= null,StoreName=null) {
this.Productname = Productname;
this.CustomerName = CustomerName;
this.StoreName = StoreName;
}  
}

async function start(){
try {
const igniteClient = new IgniteClient();
await igniteClient.connect(new
IgniteClientConfiguration('127.0.0.1:10800'));
const cache = await igniteClient.getCache('OrdersCache');
const OrderKeyComplexObjectType =
    new ComplexObjectType(new OrderKey(0, 0), 'OrderId');
const OrderComplexObjectType =
    new ComplexObjectType(new OrderDetails('', '', ''), 'Orders');

cache.setKeyType(OrderKeyComplexObjectType)
    .setValueType(OrderComplexObjectType);

const data = await cache.get(new OrderKey(1, 1));
console.log(data.Productname);

}
catch (err) {
console.log('ERROR: ' + err.message);
}
finally {
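// NOTE: igniteClient is declared with const inside the try block above, so it is
// not in scope here; this is the "igniteClient is not defined" ReferenceError
// shown in the output at the top of this message.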

igniteClient.disconnect();
 
console.log(" Data Fetch Completed");

}
}

start();



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Dynamic Cache Change not allowed

2020-02-12 Thread nithin91
Hi 

No, I am creating and configuring a new cache from Java in client mode and
then trying to load it, also from Java in client mode.

Following is the error I get.

Feb 13, 2020 11:34:40 AM org.apache.ignite.logger.java.JavaLogger error
SEVERE: Failed to send message: null
java.io.IOException: Failed to get acknowledge for message:
TcpDiscoveryClientMetricsUpdateMessage [super=TcpDiscoveryAbstractMessage
[sndNodeId=null, id=b9bb52d3071-613fd9b8-0c00-4dde-ba8f-8f5341734a3c,
verifierNodeId=null, topVer=0, pendingIdx=0, failedNodes=null,
isClient=true]]
at
org.apache.ignite.spi.discovery.tcp.ClientImpl$SocketWriter.body(ClientImpl.java:1398)
at org.apache.ignite.spi.IgniteSpiThread.run(IgniteSpiThread.java:62)

Feb 13, 2020 11:34:47 AM org.apache.ignite.logger.java.JavaLogger error
SEVERE: Failed to reconnect to cluster (consider increasing 'networkTimeout'
configuration property) [networkTimeout=5000]
[11:34:52] Ignite node stopped OK [uptime=00:00:24.772]
Exception in thread "main" javax.cache.CacheException: class
org.apache.ignite.IgniteClientDisconnectedException: Failed to execute
dynamic cache change request, client node disconnected.
at
org.apache.ignite.internal.processors.cache.GridCacheUtils.convertToCacheException(GridCacheUtils.java:1337)
at
org.apache.ignite.internal.IgniteKernal.getOrCreateCache0(IgniteKernal.java:3023)
at
org.apache.ignite.internal.IgniteKernal.getOrCreateCache(IgniteKernal.java:2992)
at Load.OrdersLoad.main(OrdersLoad.java:82)
Caused by: class org.apache.ignite.IgniteClientDisconnectedException: Failed
to execute dynamic cache change request, client node disconnected.
at
org.apache.ignite.internal.util.IgniteUtils$15.apply(IgniteUtils.java:952)
at
org.apache.ignite.internal.util.IgniteUtils$15.apply(IgniteUtils.java:948)
... 4 more
Caused by: class
org.apache.ignite.internal.IgniteClientDisconnectedCheckedException: Failed
to execute dynamic cache change request, client node disconnected.
at
org.apache.ignite.internal.processors.cache.GridCacheProcessor.onDisconnected(GridCacheProcessor.java:1180)
at
org.apache.ignite.internal.IgniteKernal.onDisconnected(IgniteKernal.java:3949)
at
org.apache.ignite.internal.managers.discovery.GridDiscoveryManager$4.onDiscovery0(GridDiscoveryManager.java:821)
at
org.apache.ignite.internal.managers.discovery.GridDiscoveryManager$4.lambda$onDiscovery$0(GridDiscoveryManager.java:604)
at
org.apache.ignite.internal.managers.discovery.GridDiscoveryManager$DiscoveryMessageNotifierWorker.body0(GridDiscoveryManager.java:2667)
at
org.apache.ignite.internal.managers.discovery.GridDiscoveryManager$DiscoveryMessageNotifierWorker.body(GridDiscoveryManager.java:2705)
at
org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:120)
at java.lang.Thread.run(Thread.java:748)




--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Scheduling Cache Refresh using Ignite

2020-02-12 Thread nithin91
Hi 

We are doing a POC exploring Ignite's in-memory capabilities and building a
REST API on top of it using Node Express.

Currently, as part of the POC, Ignite is installed on UNIX and we are trying
to load data from Oracle DB into an Ignite cache using the Cache JDBC POJO
Store.

Can someone help me with whether the following scenarios can be handled
using Ignite? I couldn't find this in the official documentation.

1. If we want to add/drop/modify a column in the cache, can we update the
bean file directly while the node is running, or do we need to stop the node
and restart it? A sample code or documentation link would be really helpful.

2. How can the Ignite cache be refreshed automatically, or the refresh
scheduled (see the sketch after this list)? A sample code or documentation
link would be really helpful.

3. Is incremental refresh allowed? A sample code or documentation link would
be really helpful.

4. Is there any other way to load the caches faster than the Cache JDBC POJO
Store? A sample code or documentation link would be really helpful.
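
On point 2, a minimal sketch of one common way to schedule a periodic reload
from the client side. The scheduler is the plain JDK one, not an Ignite API;
the cache name "OrdersCache" and the file "Ignite-Client.xml" are taken from
elsewhere in this digest as placeholders, and the class name is made up:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;

public class ScheduledRefreshSketch {
    public static void main(String[] args) {
        // Plain JDK scheduler; the Ignite client node stays connected in the background.
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        Ignition.setClientMode(true);
        Ignite ignite = Ignition.start("Ignite-Client.xml");

        // Re-run the cache store's load every hour. Note that loadCache(null)
        // re-reads the whole table; it is not an incremental refresh.
        scheduler.scheduleAtFixedRate(
            () -> ignite.cache("OrdersCache").loadCache(null),
            0, 1, TimeUnit.HOURS);
    }
}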



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Dynamic Cache Change not allowed

2020-02-12 Thread nithin91
Forgot to attach the bean file. Attached the bean file now.  config.txt
  



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Dynamic Cache Change not allowed

2020-02-12 Thread nithin91
Hi 

We are doing a POC exploring Ignite's in-memory capabilities and building a
REST API on top of it using Node Express.

Currently, as part of the POC, Ignite is installed on UNIX and we are trying
to load data from Oracle DB into an Ignite cache using the Cache JDBC POJO
Store.

As part of this process, a custom-configured bean file (attached) is used to
start the Ignite node on UNIX. This bean file contains both the cache
configuration and the Ignite configuration.

Once the node is running, we are trying the following:

1. Connecting to the Ignite node running on UNIX using a replica of the
attached bean file on the local system with the additional property
Client Mode = true, and then loading the caches defined in the bean file
deployed on UNIX from the local system using Java:

ignite.cache("CacheName").loadCache(null);

We are able to do this successfully.

2. Connecting to the Ignite node running on UNIX using the same replica bean
file with Client Mode = true, and then trying to create and configure a new
cache and finally load it using the attached Java code.

When we try this approach, we get an error saying that a dynamic cache
change is not allowed.

It would be really helpful if anyone can help me resolve this issue.

If this is not the right approach, is configuring all the caches in the bean
file the only available option? If so, what should the approach be for
building additional caches in Ignite and loading them using the Cache JDBC
POJO Store while the node is running?

Also, can you please help me with ways to handle these issues:

1. If we want to add/drop/modify a column in the cache, can we update the
bean file directly while the node is running, or do we need to stop the node
and restart it? A sample code or documentation link would be really helpful.

2. How can the Ignite cache be refreshed automatically, or the refresh
scheduled? A sample code or documentation link would be really helpful.

3. Is incremental refresh allowed? A sample code or documentation link would
be really helpful.

4. Is there any other way to load the caches faster than the Cache JDBC POJO
Store? A sample code or documentation link would be really helpful.

 JavaCode.txt
  



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Loading and Fetching the Data using Node js.

2020-02-11 Thread nithin91
Hi 

I am new to Apache Ignite. Can someone help me with how to fetch and load
data from an Ignite cache using Node.js without using the SQL fields query
option? The cache is loaded using the Cache JDBC POJO Store, and the key and
value types are custom types defined in Java. Since these classes are
defined in Java, I am not sure how to fetch the data from Node.

I hope the following example explains the issue better.

We have an Ignite cache with a custom key type (Person Key, with attributes
Person First Name and Person Last Name) and a custom value type (Person
Info, with attributes Person Address, Person Age, etc.). These classes are
defined in Java, and the caches are configured in the bean file and loaded
using the Cache JDBC POJO Store.

As these classes are not available in Node.js, how can we load/fetch the
data from Node.js using cache.put/cache.get? I tried creating similar
classes in Node and passing objects of these classes to cache.put/cache.get,
but it isn't working.





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


REST API on top of ignite using node express

2020-02-10 Thread nithin91
Hi,

We are trying to build a REST API on top of an Ignite cache using Node
Express.

Following is the way we are fetching data from Ignite:

await igniteClient.connect(new IgniteClientConfiguration(ENDPOINT));
const cache = igniteClient.getCache(CacheNAME);

const querysql = new SqlFieldsQuery("SqL");
const cursor = await cache.query(querysql);
const row = await cursor.getValue();

We are facing the following issues while fetching the data in the cursor:

1. The cursor._values property always has only 1024 rows even though the
table has 100k rows.
2. The cursor._fieldnames property does not expose the field names, so we
have created an array with the list of fields and are building a list of
JSON objects by traversing each row of cursor._values with a map function.

Please see below for sample code:

var dataProductDetails = cursor._values;

var res_data_prddetails = [];

var fields = [field1, field2];

await dataProductDetails.map(function(arr) {
    var prdobj = {};
    fields.forEach((k, v) => prdobj[k] = arr[v]);
    res_data_prddetails.push(prdobj);
});

Also, can you please let me know whether there is a way to directly convert
the SQL fields query output to JSON using Node Express.

















--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Table not found error while executing sql fields query API

2020-02-02 Thread nithin91
Hi,

We are doing a POC on using the Ignite in-memory capabilities for our
application, and as part of this we are trying to load data from one of the
Oracle tables into an Ignite cache using CacheJdbcPojoStoreFactory.

I specified properties like the cache configuration, cache name, key type,
value type, cache store factory, data source bean, Query Entities, and JDBC
Types in the bean file.

For example, the cache name specified in the bean file is
"ProductDetailsCache", the key is of the custom class type "ProductKey", and
the value is of the custom POJO class type "ProductDetails".

I am able to load the data into the cache and also retrieve it using the get
method of the key-value API. But when I try to retrieve the data using the
SQL fields query API, I get a "Table not found" error. Following are the two
ways in which the SQL query string is passed to the SQL fields query API:

"select * from ProductDetails limit 10"

or

"select * from ProductDetailsCache.\"ProductDetails\" limit 10"

Can you please help me resolve this issue by letting me know where exactly I
am going wrong?
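
For what it's worth, a minimal sketch of how such a query is often written
when no explicit sqlSchema is configured, assuming the default behaviour
where the SQL table is named after the value type and the schema after the
cache (quoted, case-sensitive). The cache and class names are the ones
mentioned above; the class name and configuration file are illustrative
assumptions, not a confirmed fix:

import java.util.List;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.FieldsQueryCursor;
import org.apache.ignite.cache.query.SqlFieldsQuery;

public class ProductQuerySketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start("Ignite-Client.xml")) {
            IgniteCache<?, ?> cache = ignite.cache("ProductDetailsCache");

            // By default the table is named after the value type (PRODUCTDETAILS here)
            // and lives in a schema named after the cache, which has to be quoted exactly.
            SqlFieldsQuery qry = new SqlFieldsQuery(
                "select * from \"ProductDetailsCache\".PRODUCTDETAILS limit 10");

            try (FieldsQueryCursor<List<?>> cursor = cache.query(qry)) {
                for (List<?> row : cursor)
                    System.out.println(row);
            }
        }
    }
}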
   



  










--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Failed to load bean in application context [beanName=dataSource] in ignite.Spring bean doesn't exist

2020-02-02 Thread nithin91
Hi ,
 
We are doing a POC on using the Ignite in-memory capabilities for our
application, and as part of this we are trying to load data from one of the
Oracle tables into an Ignite cache using CacheJdbcPojoStoreFactory.

Steps followed to load the data from Oracle to the Ignite cache:

1. We installed Ignite (version 2.7.6) on a UNIX server and started the
Ignite node by executing the ignite.sh file.

2. I then wrote Java code (using Eclipse as my IDE on my local Windows
system) that connects to the Ignite node running on UNIX by specifying the
UNIX server IP details in the "discoverySpi" property of the
IgniteConfiguration in the bean file.

3. In addition to the server details, the bean file contains the cache
configuration, cache store factory, data source bean, Query Entities, and
JDBC Types, and has client mode set to "true".

4. Now we try to load the cache by executing the code below:

public class ProductLoadCache {

    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start("Ignite-Client.xml")) {
            System.out.println(">>> Loading caches...");

            System.out.println(">>> Loading cache: ProductLoadCache");
            ignite.cache("ProductLoadCache").loadCache(null);

            System.out.println(">>> ProductLoadCache is loaded");
        }
    }
}

5. Upon execution of the above code I get an error like "Failed to load bean
in application context [beanName=dataSource] in ignite. Spring bean doesn't
exist".

6. I encountered a similar error while running the same code in local mode
(i.e. by specifying the local IP address details in the "discoverySpi"
property of the IgniteConfiguration in the bean file), but later identified
that the Oracle JDBC driver was missing and added the Oracle JDBC jar
available on my local system as an "external jar" to my project in Eclipse.
After doing this the issue got resolved, and I am able to load the data into
the Ignite cache when running Ignite in local mode.

Now, after starting the node on UNIX, I am again facing the same error.

Can you please let me know whether anything additional is needed to run the
code on UNIX, or whether it is because Ignite is not able to find the Oracle
JDBC jar on UNIX? If this is the issue, please let me know the available
approaches to fix it.







  










--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Load to Ignite cache is very slow from Oracle Table

2020-01-28 Thread nithin91
Hi Ilya,

Thank you so much. Issue is resolved after following the steps mentioned in
the link you have shared.






--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Error Connecting to Web Console

2020-01-28 Thread nithin91
Hi ,

We are doing a POC on using the Ignite in-memory capabilities for our
application, and as part of this we are trying to connect to the Ignite Web
Console. However, we end up facing the error provided in the attachment
after downloading the web agent and starting it by executing the ignite.bat
command. Can anyone please help me resolve this issue?

Error.xlsx
  



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Load to Ignite cache is very slow from Oracle Table

2020-01-27 Thread nithin91
Hi Belyakov,

Thank you so much. This is very helpful.

I am facing the following error when using this approach:

Failed to start component: class org.apache.ignite.IgniteException: Failed
to initialize cache store (data source is not provided).

Below is the code used for the implementation. I have configured the data
source property correctly, but I am not sure why this error pops up. Can you
please help me with this?

package ignite.example.ignite_read;

import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.Set;

//import javax.activation.DataSource;
import javax.cache.configuration.Factory;
import javax.cache.integration.CacheLoaderException;
import javax.sql.DataSource;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.IgniteException;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheAtomicityMode;
import org.apache.ignite.cache.CacheMode;
import org.apache.ignite.cache.QueryEntity;
import org.apache.ignite.cache.store.jdbc.CacheJdbcPojoStoreFactory;
import org.apache.ignite.cache.store.jdbc.JdbcType;
import org.apache.ignite.cache.store.jdbc.JdbcTypeField;
import org.apache.ignite.cache.store.jdbc.dialect.OracleDialect;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi;
import
org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder;

import oracle.jdbc.pool.OracleDataSource;

public class IgniteCacheload {

    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws IgniteException, SQLException {

        IgniteConfiguration config = new IgniteConfiguration();
        /* config code */

        try (Ignite ignite = Ignition.start(config)) {

            CacheConfiguration<ProductKey, Products> prdCacheCfg = new CacheConfiguration<>();

            prdCacheCfg.setName("ProdCache");
            prdCacheCfg.setCacheMode(CacheMode.PARTITIONED);
            prdCacheCfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);

            //personCacheCfg.setReadThrough(true);
            //personCacheCfg.setWriteThrough(true);

            CacheJdbcPojoStoreFactory<ProductKey, Products> factory = new CacheJdbcPojoStoreFactory<>();

            factory.setDialect(new OracleDialect());
            //factory.setDataSource(dsdetails());
            factory.setDataSourceFactory(dsdetails());

            JdbcType productType = new JdbcType();
            productType.setCacheName("ProdCache");
            productType.setKeyType(ProductKey.class);
            productType.setValueType(Products.class);
            // Specify the schema if applicable

            productType.setDatabaseTable("table");

            productType.setKeyFields(
                new JdbcTypeField(java.sql.Types.VARCHAR, "fid", ProductKey.class, "FID"));
            productType.setValueFields(
                new JdbcTypeField(java.sql.Types.VARCHAR, "scode", Products.class, "scode"));

            factory.setTypes(productType);

            prdCacheCfg.setCacheStoreFactory(factory);

            config.setCacheConfiguration(prdCacheCfg);

            IgniteCache<ProductKey, Products> cache = ignite.getOrCreateCache(prdCacheCfg);

            cache.clear();

            // Load cache on all data nodes with default SQL statement.
            System.out.println(">>> Load ALL data to cache from DB...");
            cache.loadCache(null);

            System.out.println(">>> Loaded cache entries: " + cache.size());
        }
        catch (Exception e) {
            throw new CacheLoaderException("Failed to load the cache: " + e.getMessage());
        }
    }

    public static Factory dsdetails() throws SQLException {
        //public static DataSource dsdetails() throws SQLException {
        OracleDataSource oraDataSrc = new OracleDataSource();
        oraDataSrc.setURL("url");
        oraDataSrc.setUser("username");
        oraDataSrc.setPassword("pswd");
        //return oraDataSrc;
        return (Factory)oraDataSrc;
    }
}
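
For comparison, a minimal sketch (not necessarily the fix that eventually
resolved this thread) of one way the data source is commonly supplied:
returning an actual Factory<DataSource> instead of casting the
OracleDataSource object itself to Factory, which OracleDataSource does not
implement. It reuses the imports and the placeholder connection details from
the code above and would replace the dsdetails() method there:

    // Drop-in replacement for dsdetails(): a serializable factory that creates
    // the Oracle data source on each node. URL, user and password are placeholders.
    public static Factory<DataSource> dsdetails() {
        return () -> {
            try {
                OracleDataSource oraDataSrc = new OracleDataSource();
                oraDataSrc.setURL("url");
                oraDataSrc.setUser("username");
                oraDataSrc.setPassword("pswd");
                return oraDataSrc;
            }
            catch (SQLException e) {
                throw new IllegalStateException("Failed to create Oracle data source", e);
            }
        };
    }

    // Used as before:
    // factory.setDataSourceFactory(dsdetails());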






--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Load to Ignite cache is very slow from Oracle Table

2020-01-27 Thread nithin91
Hi Mikael,

Thanks for your quick response.

I have gone through the documentation regarding usage of the
IgniteCache.loadCache method.

Documentation Link:
https://apacheignite.readme.io/docs/3rd-party-store#section-loadcache-

In the documentation it is mentioned that the JDBC POJO store must be
enabled manually in the Ignite XML configuration file (or via code).

Can you please provide a reference link on how to enable the JDBC POJO store
via code?








--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Data Load to Ignite cache is very slow from Oracle Table

2020-01-27 Thread nithin91
It's taking almost 1 hour to load 0.1 million records using the result set cursor.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Data Load to Ignite cache is very slow from Oracle Table

2020-01-27 Thread nithin91
Hi 

I am trying to load data from an Oracle table into an Ignite cache using the
cache store's loadCache method.

Following is the logic implemented in the loadCache method to load the data
from the Oracle table:

1. A JDBC connection is used to connect to the Oracle table, and the data is
available in a ResultSet cursor.
2. A while loop iterates over the ResultSet and inserts the data into the
cache.

Is there any other way to insert the data from the Oracle table into the
Ignite cache? If possible, please share sample code.
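
As a rough illustration of one alternative (not taken from this thread), a
hedged sketch that streams rows from a JDBC ResultSet into an
already-configured cache with IgniteDataStreamer. The class name, connection
string, table and column names are placeholders, and the cache name
"ProdCache" and file "Ignite-Client.xml" are borrowed from elsewhere in this
digest:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamerLoadSketch {
    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start("Ignite-Client.xml");
             Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//host:1521/service", "user", "pwd");
             IgniteDataStreamer<Long, String> streamer = ignite.dataStreamer("ProdCache")) {

            // Entries are buffered and sent to the data nodes in batches,
            // which is usually much faster than per-row cache.put calls.
            streamer.perNodeBufferSize(1024);

            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("select id, name from some_table")) {
                while (rs.next())
                    streamer.addData(rs.getLong("id"), rs.getString("name"));
            }
        } // closing the streamer flushes any remaining buffered entries
    }
}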





   











--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/