how to fix memory error while using np.r_

Submitted by 三世轮回 on 2019-12-24 23:52:25

Question


I have a list with 482000 entries. Its structure looks like this:

    X_docs = [array([0., 0., 0., ..., 0., 0., 0.]),
              array([0.60205999, 0.60205999, 0.47712125, ..., 0., 0., 0.])]

Each array has 5000 entries, so in the end the full matrix is 482000 × 5000.
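
For scale, a dense float64 matrix of that shape already needs roughly 19 GB on its own (a quick back-of-the-envelope check, assuming 8-byte float64 entries):

    n_docs, n_vocab = 482000, 5000
    bytes_needed = n_docs * n_vocab * 8      # float64 takes 8 bytes per value
    print(bytes_needed / 1024**3)            # ≈ 17.95 GiB for a single dense copy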

Then I need to apply np.r_ over it like this:

    np.r_[X_docs]

When it reaches this line it raises this error:

MemoryError

I don't know how to fix this. Is there some limitation in NumPy? I have 32 GB of RAM. I even tried running it on AWS SageMaker (free tier), and it still raises the error there.
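
For context on why even 32 GB is not enough: given a list of equal-length 1-D arrays, np.r_ copies everything into one new dense 2-D array, so while it runs the original list and the new copy are both alive, roughly doubling the ~19 GB footprint. A minimal illustration of the stacking behavior:

    import numpy as np

    rows = [np.zeros(5000), np.ones(5000)]   # same structure as X_docs, just 2 rows
    stacked = np.r_[rows]                    # copies the rows into a fresh 2-D array
    print(stacked.shape)                     # (2, 5000)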

Update 1

This is the whole code before it reaches the NumPy part:

    import numpy as np

    corpus = load_corpus(args.input)
    n_vocab, docs = len(corpus['vocab']), corpus['docs']
    corpus.clear()  # save memory
    doc_keys = list(docs.keys())  # materialize keys so entries can be deleted during the loop
    X_docs = []
    for k in doc_keys:
        X_docs.append(vecnorm(doc2vec(docs[k], n_vocab), 'logmax1', 0))
        del docs[k]
    X_docs = np.r_[X_docs]
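
One common way around the doubled allocation, sketched here reusing the question's own doc2vec/vecnorm helpers and assuming float32 precision is acceptable for the normalized vectors, is to preallocate the output matrix once and write each row in place, instead of collecting a list and copying it with np.r_:

    import numpy as np

    n_docs = len(docs)
    X_docs = np.empty((n_docs, n_vocab), dtype=np.float32)  # ~9.6 GB instead of ~19 GB
    for i, k in enumerate(list(docs.keys())):
        X_docs[i] = vecnorm(doc2vec(docs[k], n_vocab), 'logmax1', 0)
        del docs[k]  # free each document as soon as its row is written

If even that does not fit in RAM, np.memmap with the same shape and dtype lets the rows spill to disk instead.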

Source: https://stackoverflow.com/questions/57814074/how-to-fix-memory-error-while-using-np-r
