
High memory usage when sending large amounts of data


We are experiencing problems with Openfire when sending large amounts of data.
Our typical setup has a server that, once a client has connected, initially sends a large
amount of data to that client through Openfire, in messages of around 200 - 500 KB each.

 

In a test setup we have one server to which 5 clients connect through Openfire. We want to
transfer 1 GB of data to each client. (The 1 GB may be a little excessive, but it is a test.)

 

During the test we noticed that Openfire's memory usage increased rapidly and that the entire
heap was soon in use (see the attached VisualVM screenshot). A heap dump showed that the memory
was allocated to byte arrays (see the attached VisualVM heap dump). After killing the sending
and receiving clients, the memory usage stayed the same.

To us it seems that the data entering Openfire causes new byte buffers to be allocated, which
are afterwards added to a pool for reuse. But because the volume of data is large and writing
to the receiving clients is much slower than the rate at which data enters Openfire, many new
byte buffers are allocated. This seems to exhaust the heap, after which the GC is busy trying
to free up memory, without any result.
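
To make the suspected behaviour concrete, here is a minimal standalone Java sketch of what we
think is going on. The BufferPool class, the acquire/release methods and the rates are our own
invention for illustration only, not Openfire's actual code; buffer sizes are scaled down so
the demo itself does not run out of memory.

    import java.util.ArrayDeque;
    import java.util.Queue;

    // Toy model of the behaviour we think we are seeing: an unbounded pool of
    // reusable byte buffers fed by a fast producer and drained by a slow writer.
    public class UnboundedPoolDemo {

        static class BufferPool {
            private final Queue<byte[]> free = new ArrayDeque<>();

            // Reuse a pooled buffer when possible, otherwise allocate a new one.
            byte[] acquire(int size) {
                byte[] b = free.poll();
                return (b != null && b.length >= size) ? b : new byte[size];
            }

            // Returned buffers are kept around forever; the pool never shrinks.
            void release(byte[] b) {
                free.add(b);
            }

            long pooledBytes() {
                return free.stream().mapToLong(b -> b.length).sum();
            }
        }

        public static void main(String[] args) {
            BufferPool pool = new BufferPool();
            Queue<byte[]> pendingWrites = new ArrayDeque<>();

            // Data arrives ten times faster than it is written out.
            for (int tick = 0; tick < 1_000; tick++) {
                for (int in = 0; in < 10; in++) {
                    pendingWrites.add(pool.acquire(16 * 1024)); // incoming message
                }
                byte[] written = pendingWrites.poll();          // slow consumer
                if (written != null) {
                    pool.release(written);                      // recycle buffer
                }
            }

            // "Kill the clients": every pending buffer goes back into the pool,
            // so the memory stays allocated even though no data is in flight.
            while (!pendingWrites.isEmpty()) {
                pool.release(pendingWrites.poll());
            }
            System.out.println("bytes held by the pool: " + pool.pooledBytes());
        }
    }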

 

This makes Openfire pretty much unresponsive and reduces the data throughput to an absolute
minimum.

 

Is there anything we can do to keep the byte buffer pool from growing endlessly (up to the maximum heap size)?
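
To illustrate the kind of behaviour we are hoping for, the toy pool from the sketch above
could be capped like this. Again, this is only our own illustration, not a suggestion that
Openfire already has such a setting; we would be glad to hear about any existing property or
configuration option that achieves the same effect.

    import java.util.ArrayDeque;
    import java.util.Queue;

    // Variant of the toy BufferPool above with a hard cap on how much memory
    // it may retain: buffers returned while the pool is "full" are dropped
    // and left to the garbage collector instead of being kept for reuse.
    public class BoundedBufferPool {
        private final Queue<byte[]> free = new ArrayDeque<>();
        private final long maxPooledBytes;
        private long pooledBytes;

        public BoundedBufferPool(long maxPooledBytes) {
            this.maxPooledBytes = maxPooledBytes;
        }

        // Reuse a pooled buffer when one is available and large enough.
        public byte[] acquire(int size) {
            byte[] b = free.poll();
            if (b != null) {
                pooledBytes -= b.length;
                if (b.length >= size) {
                    return b;
                }
            }
            return new byte[size];
        }

        // Only keep the buffer if the pool stays under its configured cap.
        public void release(byte[] b) {
            if (pooledBytes + b.length <= maxPooledBytes) {
                free.add(b);
                pooledBytes += b.length;
            }
        }
    }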

 

For the test we used Openfire 3.9.3.

