out-of-memory

Guidelines to set MetaspaceSize - Java 8

Submitted by |▌冷眼眸甩不掉的悲伤 on 2020-05-14 17:54:08
Question: What is the default value of MetaspaceSize for 64-bit servers? I couldn't find it in the official documentation. I'm observing that in a server JVM process the GC frequency at times becomes high and keeps growing. If I restart the service a few times, it returns to stable. I think it's due to the JRE upgrade. The JVM max heap size is set to 6 GB, but when this problem occurs we see only 3 GB of heap being used. Metaspace grows very little and is almost always full. I tried increasing the
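The default is not listed in the flag documentation, but the JVM will report it: `java -XX:+PrintFlagsFinal -version` shows `MetaspaceSize` (commonly about 21 MB on 64-bit HotSpot, where it is the first high-water mark that triggers a Metaspace GC, with `MaxMetaspaceSize` unlimited by default). A minimal sketch for watching Metaspace pressure from inside the process, assuming a HotSpot JVM where the pool is named "Metaspace":

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class MetaspaceProbe {
    public static void main(String[] args) {
        // On HotSpot, class metadata lives in the "Metaspace" memory pool.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getUsage() == null) continue;   // skip invalid pools
            System.out.printf("%s: used=%d committed=%d%n",
                    pool.getName(),
                    pool.getUsage().getUsed(),
                    pool.getUsage().getCommitted());
        }
    }
}
```

If the Metaspace pool is nearly full at every GC, the usual first step is raising `-XX:MetaspaceSize` so the first high-water-mark GC happens later.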

Json.Net deserialize out of memory issue

Submitted by 好久不见. on 2020-05-13 01:13:49
Question: I have a JSON payload which contains, among other things, a data field that stores a base64-encoded string. This JSON is serialized and sent to a client. On the client side, the Newtonsoft Json.NET deserializer is used to read it back. However, if the data field becomes large (~400 MB), the deserializer throws an out-of-memory exception: Array Dimensions exceeded supported Range . I also see in Task Manager that memory consumption grows very fast. Any ideas why this is? Is there a maximum size for

Memory exception while filtering large CSV files

Submitted by ◇◆丶佛笑我妖孽 on 2020-04-06 23:25:47
Question: I am getting a memory exception while running this code. Is there a way to filter one file at a time, write the output, and append after processing each file? The code below seems to load everything into memory. $inputFolder = "C:\Change\2019\October" $outputFile = "C:\Change\2019\output.csv" Get-ChildItem $inputFolder -File -Filter '*.csv' | ForEach-Object { Import-Csv $_.FullName } | Where-Object { $_.machine_type -eq 'workstations' } | Export-Csv $outputFile -NoType Answer 1: Maybe you can export and
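The answer's idea, processing one file at a time and appending, generalizes beyond PowerShell. A sketch of the same streaming pattern in Java (the file names, column index, and sample rows are invented for illustration): read each input line by line, keep only matching rows, and append to the output, so only one row is ever in memory.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.*;
import java.util.Arrays;

public class CsvFilter {
    // Append rows whose machine_type column equals the wanted value,
    // streaming one line at a time instead of loading whole files.
    static void filterFile(Path in, Path out, int colIdx, String wanted) throws IOException {
        try (BufferedReader r = Files.newBufferedReader(in);
             BufferedWriter w = Files.newBufferedWriter(out,
                     StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            r.readLine();                         // skip this file's header row
            String line;
            while ((line = r.readLine()) != null) {
                String[] cols = line.split(",");
                if (cols.length > colIdx && cols[colIdx].equals(wanted)) {
                    w.write(line);
                    w.newLine();
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("csv");
        Path out = dir.resolve("output.csv");
        // Hypothetical sample inputs with a machine,machine_type layout.
        Files.write(dir.resolve("a.csv"),
                Arrays.asList("machine,machine_type", "pc1,workstations", "srv1,servers"));
        Files.write(dir.resolve("b.csv"),
                Arrays.asList("machine,machine_type", "pc2,workstations"));
        try (DirectoryStream<Path> files = Files.newDirectoryStream(dir, "*.csv")) {
            for (Path f : files) {
                if (!f.equals(out)) filterFile(f, out, 1, "workstations");
            }
        }
        System.out.println(Files.readAllLines(out));
    }
}
```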

java.lang.OutOfMemoryError: unable to create new native thread error using ChromeDriver and Chrome through Selenium in Spring boot

Submitted by *爱你&永不变心* on 2020-03-23 08:49:18
Question: We have a Selenium-based web application developed using Spring Boot. The server runs as a VM instance on Google Cloud. We have a thread-based mechanism using Executor . Using Selenium we open a Chrome browser (headless) to perform an operation, and for each operation we create a new thread. After facing the OutOfMemory error, if we restart the cloud instance it breaks again within an hour with the same error. Please find below the snippet we used to create a new instance of executor
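A sketch of the usual fix, not the poster's actual code: bound the pool instead of spawning a thread per operation, since every extra native thread (each with its own headless Chrome) consumes stack and process memory until "unable to create new native thread" appears. Quitting each WebDriver in a finally block matters just as much; the tasks here are trivial placeholders for the Selenium work.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class BoundedExecutorDemo {
    public static void main(String[] args) throws InterruptedException {
        // A fixed pool caps native-thread creation; 100 queued tasks
        // share 4 threads instead of creating 100 threads.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        AtomicInteger done = new AtomicInteger();
        for (int i = 0; i < 100; i++) {
            pool.submit(() -> {
                done.incrementAndGet();   // placeholder for driver work + driver.quit()
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println(done.get());
    }
}
```

For stricter back-pressure, a `ThreadPoolExecutor` with a bounded queue and `CallerRunsPolicy` keeps submissions from piling up faster than Chrome instances can be torn down.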

How to fix OutOfMemoryError thrown while trying to throw OutOfMemoryError with retrofit

Submitted by 旧巷老猫 on 2020-03-21 18:03:25
Question: I am using Retrofit to download media files such as video, mp3, jpg, pdf, ... in my application. There is a problem when I want to download a large file, 55 MB in mp4 format. When I download this file I get an error like this: OutOfMemoryError threw while trying to throw OutOfMemoryError; no stack trace available This is my code: private void downloadFile() { ArrayList<FileModel> filesInDB = G.bootFileFromFileDB(); for (final FileModel fm : filesInDB) { APIService
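The usual cause is Retrofit buffering the whole response body in memory; annotating the service method with Retrofit's @Streaming and copying responseBody.byteStream() to disk in small chunks avoids that. A minimal sketch of the chunked copy (the fake one-million-byte body stands in for the real response):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamingCopy {
    // Copy in 8 KB chunks so a 55 MB body never has to fit in memory at once.
    // With Retrofit, read responseBody.byteStream() this way after marking
    // the call @Streaming.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] fake = new byte[1_000_000];           // stand-in for a media response body
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        System.out.println(copy(new ByteArrayInputStream(fake), sink));
    }
}
```

In a real download, the sink would be a `FileOutputStream`, and the copy must run off the main thread.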

How large matrix can be fit into Eigen library? [closed]

Submitted by 拈花ヽ惹草 on 2020-03-21 05:59:17
Question: [Closed as off-topic for Stack Overflow 6 years ago.] I am working with large-scale data, currently a 300000 x 300000 matrix. It is really hard to process in MATLAB due to an "Out of memory" error, so I decided to use Eigen. Is there any restriction on matrix size in Eigen? Answer 1: Dense matrices in Eigen are stored in a contiguous block of memory,
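The contiguous-storage point in the answer makes the feasibility easy to check by arithmetic: a dense n x n matrix of doubles needs n² × 8 bytes. A quick runnable check (in Java, just for the arithmetic):

```java
import java.util.Locale;

public class DenseMatrixMemory {
    public static void main(String[] args) {
        long n = 300_000L;                         // rows = cols from the question
        long bytesPerDouble = 8L;
        long totalBytes = n * n * bytesPerDouble;  // contiguous dense storage
        System.out.println(totalBytes);
        System.out.printf(Locale.ROOT, "~%.1f GiB%n",
                totalBytes / (1024.0 * 1024 * 1024));
    }
}
```

At roughly 670 GiB for the dense representation, the matrix cannot fit in ordinary RAM regardless of library; a sparse format (Eigen::SparseMatrix, if the data is mostly zeros) or out-of-core processing is the practical route.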

Exception of type 'System.OutOfMemoryException' was thrown.

Submitted by 僤鯓⒐⒋嵵緔 on 2020-03-17 06:43:25
Question: I ran into the following problem. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown. Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception

numpy loadtxt function seems to be consuming too much memory

Submitted by 天大地大妈咪最大 on 2020-03-13 06:27:28
Question: When I load an array using numpy.loadtxt, it seems to take too much memory. E.g. a = numpy.zeros(int(1e6)) causes an increase of about 8 MB in memory (using htop; 8 bytes × 1 million ≈ 8 MB). On the other hand, if I save and then load this array numpy.savetxt('a.csv', a) b = numpy.loadtxt('a.csv') my memory usage increases by about 100 MB! Again I observed this with htop, both in the IPython shell and while stepping through code using Pdb++. Any idea what's

C# OutOfMemoryException creating ZipOutputStream using SharpZipLib

Submitted by 僤鯓⒐⒋嵵緔 on 2020-03-05 05:54:28
Question: I keep getting a very annoying OutOfMemory exception on the following code. I'm zipping a lot of small files (PDFs, roughly 1.5 MB each). At first I was getting the exception after 25 files zipped, which doesn't seem like a massive archive. Setting the size of the ZipEntry somehow helped, since now I manage to get up to 110 files zipped (I'm debugging under Visual Studio). Here's my code; maybe there's something wrong with it. Any help would be greatly appreciated. Thanks. public static
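The common cause of this OOM is reading each whole file into a byte array before writing it to the archive. SharpZipLib's API closely mirrors java.util.zip, so a hedged sketch of the entry-by-entry streaming pattern is shown here in Java with stand-in files (the poster's C# is truncated above): each file is copied through an 8 KB buffer, so memory use stays flat no matter how many files are zipped.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.*;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipManyFiles {
    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("pdfs");
        for (int i = 0; i < 5; i++) {
            // Stand-in 100 KB files in place of the real PDFs.
            Files.write(dir.resolve("doc" + i + ".pdf"), new byte[100_000]);
        }
        Path zip = dir.resolve("bundle.zip");
        byte[] buf = new byte[8192];
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zip));
             DirectoryStream<Path> files = Files.newDirectoryStream(dir, "*.pdf")) {
            for (Path f : files) {
                zos.putNextEntry(new ZipEntry(f.getFileName().toString()));
                try (InputStream in = Files.newInputStream(f)) {
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        zos.write(buf, 0, n);   // stream; never hold a whole file
                    }
                }
                zos.closeEntry();
            }
        }
        System.out.println(Files.size(zip) > 0);
    }
}
```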