Hi,
I encountered a scenario where my process was blocked at a call to
deployClusterSingleton.
IgniteServices svcs = ignite.services();
svcs.deployClusterSingleton(..., ...);
Too bad that I didn't get the stack trace.
I didn't specify any node filter. I was able to deploy the service upon
Here you go...
I added this ticket because we hit a similar problem and I was able to find
some quite suspect code: https://issues.apache.org/jira/browse/IGNITE-9026
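For context, the blocked call above as a runnable sketch; the service name and the no-op implementation are hypothetical, and deployment requires a running cluster:

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteServices;
import org.apache.ignite.Ignition;
import org.apache.ignite.services.Service;
import org.apache.ignite.services.ServiceContext;

public class SingletonDeployExample {
    // Hypothetical service; any Service implementation works here.
    static class MyService implements Service {
        @Override public void cancel(ServiceContext ctx) {}
        @Override public void init(ServiceContext ctx) {}
        @Override public void execute(ServiceContext ctx) {}
    }

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            IgniteServices svcs = ignite.services();
            // Deploys exactly one instance of the service cluster-wide.
            // This call blocks until deployment completes, which is where
            // the hang described above was observed.
            svcs.deployClusterSingleton("mySingleton", new MyService());
        }
    }
}
```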
--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/
Hi Monstereo,
monstereo wrote
> When I want to add a new element to the cache, it will also update the
> database.
> When I want to update any element in the cache, it will also update the
> database.
> When I want to delete any element from the cache, it will also delete the
> element from the database.
>
> How I
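What the quote describes is Ignite's read/write-through CacheStore mechanism. A minimal configuration sketch, assuming a hypothetical store class that maps cache operations to SQL:

```java
import javax.cache.Cache;
import javax.cache.configuration.FactoryBuilder;
import org.apache.ignite.cache.store.CacheStoreAdapter;
import org.apache.ignite.configuration.CacheConfiguration;

public class WriteThroughConfig {
    // Hypothetical store: in a real setup these methods would issue
    // SELECT / INSERT-or-UPDATE / DELETE against the database.
    public static class MyDbStore extends CacheStoreAdapter<Long, String> {
        @Override public String load(Long key) { return null; }
        @Override public void write(Cache.Entry<? extends Long, ? extends String> e) {}
        @Override public void delete(Object key) {}
    }

    public static CacheConfiguration<Long, String> cacheCfg() {
        CacheConfiguration<Long, String> cfg = new CacheConfiguration<>("users");
        cfg.setCacheStoreFactory(FactoryBuilder.factoryOf(MyDbStore.class));
        cfg.setReadThrough(true);   // cache misses call MyDbStore.load
        cfg.setWriteThrough(true);  // put/remove call MyDbStore.write/delete
        return cfg;
    }
}
```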
Hello!
You could also add the ignite-rest-http module and call some REST endpoint
as a healthcheck, such as version or a cache read:
https://apacheignite.readme.io/docs/rest-api#version
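As a sketch of such a healthcheck (the host and port are assumptions; 8080 is the default ignite-rest-http port), one could GET the version endpoint and treat an HTTP 200 as healthy:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class RestHealthcheck {
    // Builds the URL for the REST "version" command.
    static String versionUrl(String host, int port) {
        return "http://" + host + ":" + port + "/ignite?cmd=version";
    }

    // Returns true if the node answers the version command with HTTP 200.
    static boolean isHealthy(String host, int port) {
        try {
            HttpURLConnection conn =
                (HttpURLConnection) new URL(versionUrl(host, port)).openConnection();
            conn.setConnectTimeout(2000);
            conn.setReadTimeout(2000);
            return conn.getResponseCode() == 200;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isHealthy("localhost", 8080) ? "healthy" : "unhealthy");
    }
}
```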
Regards,
--
Ilya Kasnacheev
2018-07-17 18:37 GMT+03:00 Dave Harvey :
> Any suggestions on an appropriate
Hello!
As your message states, IgniteConfiguration isn't serializable. So you
will need to create the IgniteConfiguration from inside the () => igniteConf
lambda instead of passing it in from outside. Pass the parameters needed
to create that configuration along with the lambda instead.
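A sketch of that pattern in plain Java (the Scala () => igniteConf closure works the same way): only a serializable String crosses the wire, and the non-serializable IgniteConfiguration is built on the receiving side. The discovery address parameter is illustrative.

```java
import java.io.Serializable;
import java.util.Collections;
import java.util.function.Supplier;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi;
import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder;

public class ConfigFactory {
    // Serializable factory: captures only the address string, then builds
    // the IgniteConfiguration inside the closure body.
    static Supplier<IgniteConfiguration> cfgSupplier(String discoveryAddress) {
        return (Supplier<IgniteConfiguration> & Serializable) () -> {
            TcpDiscoveryVmIpFinder ipFinder = new TcpDiscoveryVmIpFinder();
            ipFinder.setAddresses(Collections.singletonList(discoveryAddress));
            return new IgniteConfiguration()
                .setDiscoverySpi(new TcpDiscoverySpi().setIpFinder(ipFinder));
        };
    }
}
```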
Regards,
--
Ilya
The Apache Ignite Community is pleased to announce the release of
Apache Ignite 2.6.0.
Apache Ignite [1] is a memory-centric distributed database, caching,
and processing platform for transactional, analytical, and streaming
workloads delivering in-memory speeds at petabyte scale.
This release
Hi ,
The reason is that I am trying to do it through the load method, which
supports read-through. The requirement is to pass a set of params, say a
Map which will contain a SQL statement and some query param values,
and the return will be a list of BinaryObject. That is, against one key I
want to hold all the
I’d look into calling control.sh or ignitevisorcmd.sh and parsing their output.
E.g. check that control.sh --cache can connect to the local node and return one
of your caches.
However, this check is not purely for the local node, as the command will
connect to the cluster as a whole.
A more
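One way to sketch that check is via the exit code of control.sh; the installation path is an assumption, and `--cache list .` lists caches matching the regex ".":

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class ControlShCheck {
    // Builds the command line for listing caches matching ".".
    static List<String> command(String igniteHome) {
        return Arrays.asList(igniteHome + "/bin/control.sh", "--cache", "list", ".");
    }

    static boolean clusterReachable(String igniteHome)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command(igniteHome)).inheritIO().start();
        // control.sh exits non-zero when it cannot connect to a node.
        return p.waitFor() == 0;
    }
}
```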
Hello!
Why do you try to store a List in the cache? It should work if you
just put plain BinaryObjects in it, without the List<>.
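A sketch of the suggested approach, putting individual BinaryObjects rather than a List (the type and field names are made up, and a running node is assumed):

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.binary.BinaryObject;
import org.apache.ignite.binary.BinaryObjectBuilder;

public class PlainBinaryPut {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            IgniteCache<Long, BinaryObject> cache =
                ignite.getOrCreateCache("people").withKeepBinary();

            // One BinaryObject per key; SQL over a QueryEntity can then see
            // each row, which it cannot do for entries wrapped in a List.
            BinaryObjectBuilder b = ignite.binary().builder("Person");
            b.setField("name", "Alice");
            b.setField("age", 30);
            cache.put(1L, b.build());
        }
    }
}
```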
Regards,
--
Ilya Kasnacheev
2018-07-18 17:42 GMT+03:00 debashissinha :
> Hi ,
>
> If I add a List to cache and also in the cache configuration
> I
> set QueryEntity
Hello again!
I have just noticed the following stack trace:
"flusher-0-#588%AppCluster%" #633 prio=5 os_prio=0
tid=0x7f18d424f800 nid=0xe1bb runnable [0x7f197c1cd000]
java.lang.Thread.State: RUNNABLE
at java.net.SocketInputStream.socketRead0(Native Method)
at
Hello!
Can you please share the configuration of your Apache Ignite nodes,
especially the cache stores of your caches? I have just noticed that you're
actually waiting on a cache store lock.
Regards,
--
Ilya Kasnacheev
2018-07-17 19:11 GMT+03:00 Shailendrasinh Gohil <
Hi ,
If I add a List to the cache, and in the cache configuration I
set a QueryEntity with fields, then how can I query using
cache.query(new SqlFieldsQuery("Some sql"));
Sample I am trying to use is
CacheConfiguration> cfg = new
CacheConfiguration();
cfg.setQueryEntities(new ArrayList(){{
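For comparison, a self-contained sketch of a QueryEntity configuration queried with SqlFieldsQuery; the type, table, and field names are illustrative, and a running node is assumed:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.QueryEntity;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.configuration.CacheConfiguration;

public class QueryEntityExample {
    public static void main(String[] args) {
        // Declare key/value types and the queryable fields.
        QueryEntity entity = new QueryEntity(Long.class.getName(), "Person");
        LinkedHashMap<String, String> fields = new LinkedHashMap<>();
        fields.put("name", String.class.getName());
        fields.put("age", Integer.class.getName());
        entity.setFields(fields);

        CacheConfiguration<Long, Object> cfg =
            new CacheConfiguration<Long, Object>("people")
                .setQueryEntities(Arrays.asList(entity));

        try (Ignite ignite = Ignition.start()) {
            IgniteCache<Long, Object> cache = ignite.getOrCreateCache(cfg);
            // Field queries work against the declared QueryEntity fields.
            List<List<?>> rows =
                cache.query(new SqlFieldsQuery("select name, age from Person")).getAll();
            System.out.println(rows.size() + " rows");
        }
    }
}
```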
I can connect to the Ignite RDBMS-integrated table from Spark but can't
query it. The Ignite server is running in IntelliJ with the RDBMS integration
(1 table) and the cache loaded. In Spark I have the following code:
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi
import
Hi ,
Thanks a lot for your help
Hi,
I don't have the context of what you do in your code. However, I know that
several page corruption issues were fixed in the 2.6 release.
So there is no specific suggestion from my side.
BR,
Andrei
Hi Andrei,
Yes, I have stopped all the servers, removed the corrupted nodes' data,
upgraded to 2.6 and restarted the servers.
Any more suggestions or configuration changes to prevent this issue?
Thanks & Regards,
Venkat
here is my RDBMS integration server config
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:util="http://www.springframework.org/schema/util"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
Could it be that Spark is not compatible with the Ignite RDBMS integration?
If I run an example without it, everything works (writing to and reading
from caches and tables).
So you are saying that if I try to start the provided example, I will see
the same error? I mean that I can try to investigate the problem if I am
able to reproduce the same behavior.
Give me some time to take a look at this example.
Hi,
First of all, you shouldn't use Spark 2.1 with Ignite, because you could
have conflicts between Spark versions.
From your log, when you used ignite-spark (which uses Spark 2.2), I see that
you have a problem with the Spring configuration:
class org.apache.ignite.IgniteException: Spring application
If I run the same code on Spark 2.1 I get the following error (same jars in
the classpath):
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server
Thanks, but that doesn't work.
I have the folowing jars in spark classpath for spark 2.2
cache-api-1.0.0.jar
spring-expression-4.3.7.RELEASE.jar
spring-context-4.3.7.RELEASE.jar
spring-core-4.3.7.RELEASE.jar
spring-beans-4.3.7.RELEASE.jar
ignite-spring-2.5.0.jar
ignite-core-2.5.0.jar
I can run the example on GitHub for the Hibernate L2 cache with Ignite. I am
also adding new data to my database with code like this:
User user = new User("jedi", "Luke", "Skywalker");
user.getPosts().add(new Post(user, "Let the Force be with you."));
ses.save(user);
When I want to add
Hi,
If you see "page content is corrupted" and after that upgrade your
version to 2.6, then it's possible that your persistence is still broken.
The simplest way here is cleaning your PDS (work/db) directory before
upgrading to 2.6.
If the data is important, then you also can
Sorry, I have a typo. Ignite contains "spark-core_2.11" inside
. Not spark-core_2.10.
Hi,
To connect to a 3rd party store, you will need to implement your own
CacheStore to interact with the 3rd party file system. Here is good
documentation with examples:
https://apacheignite.readme.io/docs/3rd-party-store
I think you can implement it using the Hive JDBC driver, or you can directly
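A sketch of the read-through (load) side over JDBC, as suggested; the URL, table, and column names are assumptions, and for Hive the hive-jdbc driver would need to be on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.cache.Cache;
import javax.cache.integration.CacheLoaderException;
import org.apache.ignite.cache.store.CacheStoreAdapter;

public class JdbcBackedStore extends CacheStoreAdapter<Long, String> {
    // Hypothetical connection URL; for Hive this is a jdbc:hive2:// URL.
    private static final String URL = "jdbc:hive2://localhost:10000/default";

    @Override public String load(Long key) {
        try (Connection c = DriverManager.getConnection(URL);
             PreparedStatement ps =
                 c.prepareStatement("select val from my_table where id = ?")) {
            ps.setLong(1, key);
            try (ResultSet rs = ps.executeQuery()) {
                // Null means "no such key" to the read-through machinery.
                return rs.next() ? rs.getString(1) : null;
            }
        } catch (Exception e) {
            throw new CacheLoaderException(e);
        }
    }

    @Override public void write(Cache.Entry<? extends Long, ? extends String> e) {
        // Write-through would mirror load with INSERT/UPDATE statements.
        throw new UnsupportedOperationException("read-only store");
    }

    @Override public void delete(Object key) {
        throw new UnsupportedOperationException("read-only store");
    }
}
```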
Hi,
What did you mean by "remote logging"?
In case you are asking how to configure the standard Apache Ignite logging:
the Ignite Java node will store its own log in a log file located in the
work dir. The C++ Ignite node is started as a wrapper around a Java node.
When you
I can't get it to work with Spark 2.2; I hit the context error that I asked
about last week, with no answers. The documentation is lacking here: it only
describes the features and not the version compatibility.