Hi there,
I'm getting an OutOfMemoryError on ActiveMQ 5.3.0. I'm using the default
activemq.xml config file; the only change I have made is to set
producerFlowControl to false for all queues. I have only one queue, created
dynamically by my message producer, which places ObjectMessages with a body
size of 250K onto it. I hit the following error once the queue reaches
around 20,000 unconsumed messages:
Exception in thread "ActiveMQ Scheduler" java.lang.OutOfMemoryError: Java heap space
    at org.apache.activemq.protobuf.BaseMessage.mergeFramed(BaseMessage.java:228)
    at org.apache.activemq.store.kahadb.MessageDatabase.load(MessageDatabase.java:664)
    at org.apache.activemq.store.kahadb.KahaDBStore.loadMessage(KahaDBStore.java:534)
    at org.apache.activemq.store.kahadb.KahaDBStore$KahaDBMessageStore$4.execute(KahaDBStore.java:229)
    at org.apache.kahadb.page.Transaction.execute(Transaction.java:728)
    at org.apache.activemq.store.kahadb.KahaDBStore$KahaDBMessageStore.recoverNextMessages(KahaDBStore.java:222)
    at org.apache.activemq.broker.region.cursors.QueueStorePrefetch.doFillBatch(QueueStorePrefetch.java:81)
    at org.apache.activemq.broker.region.cursors.AbstractStoreCursor.fillBatch(AbstractStoreCursor.java:227)
    at org.apache.activemq.broker.region.cursors.AbstractStoreCursor.hasNext(AbstractStoreCursor.java:134)
    at org.apache.activemq.broker.region.cursors.StoreQueueCursor.hasNext(StoreQueueCursor.java:131)
    at org.apache.activemq.broker.region.Queue.doPageIn(Queue.java:1364)
    at org.apache.activemq.broker.region.Queue.pageInMessages(Queue.java:1503)
    at org.apache.activemq.broker.region.Queue.doBrowse(Queue.java:759)
    at org.apache.activemq.broker.region.Queue.expireMessages(Queue.java:588)
    at org.apache.activemq.broker.region.Queue.access$000(Queue.java:85)
    at org.apache.activemq.broker.region.Queue$2.run(Queue.java:116)
    at org.apache.activemq.thread.SchedulerTimerTask.run(SchedulerTimerTask.java:33)
    at java.util.TimerThread.mainLoop(Timer.java:512)
    at java.util.TimerThread.run(Timer.java:462)
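For reference, my producerFlowControl change is roughly the following destination-policy entry in activemq.xml (a sketch of my edit; ">" is the standard wildcard matching all queues, and everything else in the file is stock):

```xml
<destinationPolicy>
  <policyMap>
    <policyEntries>
      <!-- disable producer flow control for every queue -->
      <policyEntry queue=">" producerFlowControl="false"/>
    </policyEntries>
  </policyMap>
</destinationPolicy>
```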
My test producer is as follows:
import java.util.Random;
import javax.jms.Connection;
import javax.jms.DeliveryMode;
import javax.jms.JMSException;
import javax.jms.MessageProducer;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class ProducerActiveMQ {
    public static void main(String[] args) throws Exception {
        ActiveMQConnectionFactory cf = new ActiveMQConnectionFactory();
        cf.setBrokerURL("tcp://localhost:61616");
        Connection conn = cf.createConnection();
        conn.start();
        Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageProducer producer = session.createProducer(session.createQueue("TEST"));
        producer.setDeliveryMode(DeliveryMode.PERSISTENT);
        producer.setDisableMessageID(true);
        producer.setDisableMessageTimestamp(false);
        Random random = new Random();
        byte[] body = new byte[1024 * 250]; // 250K payload
        // Send persistent ObjectMessages as fast as possible, no consumer.
        while (true) {
            random.nextBytes(body);
            producer.send(session.createObjectMessage(body));
        }
    }
}
Can anyone suggest what I am doing wrong? We need to be able to support up
to 10 million unconsumed messages on the queue, so using BlobMessage isn't
really an option for us, as it creates a single file on disk for each
unconsumed message.
I've read the documentation and experimented with various configurations,
but nothing seems to help.
Has anyone else had this problem and been able to solve it?
--
View this message in context:
http://old.nabble.com/ActiveMQ-5.3.0-OutOfMemoryError-tp26340339p26340339.html
Sent from the ActiveMQ - User mailing list archive at Nabble.com.