Knowledgebase: Product > memoQ Server
Memory use and performance: information and recommendations for high workload memoQ servers
Posted by Péter Botta on 23 September 2014 09:10 AM


Description:   If a computer running memoQ server repeatedly or constantly runs into low-memory situations and related performance problems, the most likely cause is the sheer number of files in the file system, which in most cases results from accumulated project data.

In short, Kilgray recommends archiving old, unused projects externally (for example, to exported MQXLIFF files) and then deleting them, and/or increasing the amount of RAM in the system to a level where the problems disappear. It is difficult to recommend an exact amount of RAM, because the right figure depends on the amount of project data and on external factors such as other software running on the same server machine. This is explained in more detail below.

Details and how to rectify:   

1. The file system using up RAM. If you move large quantities of translation through your memoQ server and do not routinely delete old projects, a very large number of files (millions) can accumulate in the system. This is an acknowledged design issue with memoQ server: it creates a very large number of small files, especially for the storage of project and document data. The problem with having millions of files is that the NTFS file system itself will start using up gigabytes of memory. This is a good article on the topic:

You can use the RAMMap tool, as described in the above article, to see how much RAM the NTFS Metafile is using. This can be a significant amount, and it can explain why RAM seems to be "disappearing" even though you cannot identify a specific process using it up.
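Alongside RAMMap, you can gauge the scale of the problem directly by counting the files that have accumulated under the memoQ server data folder. A minimal sketch; the folder path is an assumption, so substitute the actual data location of your installation:

```python
# Sketch: count files (and their total size) under an assumed memoQ
# server data directory, to gauge how large the file accumulation is.
import os

def count_files(root):
    """Walk the tree under `root` and return (file_count, total_bytes)."""
    count, total = 0, 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            count += 1
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # file removed or inaccessible while scanning
    return count, total

if __name__ == "__main__":
    root = r"D:\memoQ Server\Projects"  # assumed location, adjust as needed
    files, size = count_files(root)
    print(f"{files} files, {size / 2**30:.2f} GiB under {root}")
```

If the count runs into the millions, the NTFS Metafile is a likely contributor to the memory pressure described above.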

One of the things that can help is getting rid of old projects. This means deleting projects completely; just moving them to the recycle bin will not help. In the long run, this aspect of memoQ server may improve: a future version is likely to consolidate file management so that projects consist of far fewer files (although we do not have a timeframe for this development). In the short and medium term, customers affected by this problem should keep only the projects that are actually in use, and fully delete the rest (not just put them into the recycle bin).
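To identify deletion candidates, one simple heuristic is to list project folders that have not been modified for a long time. A sketch under stated assumptions: the root path is hypothetical, the check only looks at each folder's own modification time (not that of files inside it), and projects should always be archived (for example, to MQXLIFF) before being deleted:

```python
# Sketch: list subdirectories of an assumed project root that have not
# been modified for more than `max_age_days` days. These are candidates
# for external archiving followed by full deletion.
import os
import time

def stale_projects(root, max_age_days=365):
    """Return subdirectories of `root` untouched for more than max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    stale = []
    for entry in os.scandir(root):
        if entry.is_dir() and entry.stat().st_mtime < cutoff:
            stale.append(entry.path)
    return sorted(stale)

if __name__ == "__main__":
    root = r"D:\memoQ Server\Projects"  # assumed location, adjust as needed
    if os.path.isdir(root):
        for path in stale_projects(root):
            print("candidate for archiving and deletion:", path)
```

Treat the output as a starting point for review, not as a deletion list: a folder's modification time is only a rough proxy for whether the project is still in use.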

2. A large main memoQ server database causing SQL Server to use large amounts of RAM. This, too, is related to the number of projects, especially unused projects with many and/or large documents. The document content of all the online projects handled by memoQ server is stored in the main database (SQL Server). SQL Server will use as much RAM as it deems "reasonable", and its performance depends on the RAM available to it; it dynamically adjusts its behaviour to the amount of free memory. This is where the issue becomes interrelated with the NTFS Metafile's RAM usage explained above: if you manage to reduce the Metafile's RAM use, SQL Server may perform much better.

You can artificially limit the amount of RAM SQL Server is allowed to use, but that may force it to fall back on the disk, at a huge performance cost.
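Should you nevertheless need to cap SQL Server's memory (for example, to leave headroom for memoQ server itself and the operating system), the standard mechanism is SQL Server's `max server memory` setting. A configuration sketch; the 8192 MB figure is only a placeholder, not a recommendation:

```sql
-- Cap SQL Server's memory use at 8192 MB (placeholder value -- choose
-- a limit that leaves enough RAM for memoQ server and the OS).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;
```

Bear in mind the caveat above: setting this limit too low will push SQL Server to the disk and can make performance considerably worse.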

Currently, Kilgray does not recommend attacking the problem by "optimizing" the SQL Server configuration, because this is a complex topic that requires very specific expertise to get right. Instead, our recommendation is to limit the amount of data you keep (delete old projects) and to increase system RAM. If you still think there is room for improvement in the SQL Server configuration, we recommend consulting an expert SQL Server database administrator.
