parallel-processing

mclapply: all scheduled cores encountered errors in user code

Submitted on 2021-02-09 09:22:17
Question: The following is my code. I am trying to get the list of all the files (~20,000) that end with .idat and read each file using the function illuminaio::readIDAT.

    library(illuminaio)
    library(parallel)
    library(data.table)

    # number of cores to use
    ncores = 8

    # this gets all the files with the .idat extension, ~20,000 files
    files <- list.files(path = './', pattern = "*.idat", full.names = TRUE)

    # function to read the idat file and create a data.table of filename and two more columns,
    # then write it out as csv
    …
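A minimal sketch of how the per-file work is often wrapped so that mclapply reports which file actually failed instead of only the blanket "all scheduled cores encountered errors in user code" warning. The helper read_idat, the columns it builds, and the CSV naming are assumptions for illustration, not the poster's actual function:

    library(illuminaio)
    library(parallel)
    library(data.table)

    ncores <- 8
    files  <- list.files(path = "./", pattern = "\\.idat$", full.names = TRUE)

    # Hypothetical helper: read one IDAT file, build a small data.table,
    # write it out, and catch any error so one bad file cannot kill a worker.
    read_idat <- function(f) {
      tryCatch({
        idat <- readIDAT(f)
        dt <- data.table(filename = basename(f),
                         barcode  = idat$Barcode,
                         n_probes = nrow(idat$Quants))
        fwrite(dt, file = sub("\\.idat$", ".csv", f))
        dt
      }, error = function(e) data.table(filename = basename(f),
                                        error    = conditionMessage(e)))
    }

    results <- mclapply(files, read_idat, mc.cores = ncores)

Inspecting the returned rows that carry an error column then shows which files triggered the original message.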

Parallel for loop over range of array indices in C++17

Submitted on 2021-02-09 07:23:25
Question: I need to update a 100M-element array and would like to do it in parallel. std::for_each(std::execution::par, ...) seems great for this, except that the update needs to access elements of other arrays depending on the index that I am updating. A minimal serial working example of the kind of thing I'm trying to parallelize might look like this:

    for (size_t i = 0; i < 100'000'000; i++)
        d[i] = combine(d[i], s[2*i], s[2*i+1]);

I could of course manually spawn threads, but that is a lot more code …
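A sketch of one way this is commonly expressed with the C++17 parallel algorithms: iterate over the destination array itself and recover the index from the element's address, so no separate index container is needed. The element type, the vector storage, and the combine function below are assumptions for illustration:

    #include <algorithm>
    #include <cstddef>
    #include <execution>
    #include <vector>

    // Placeholder for the real combining operation.
    double combine(double a, double b, double c) { return a + b * c; }

    // Updates d[i] from s[2*i] and s[2*i+1]; assumes s.size() >= 2 * d.size().
    void update(std::vector<double>& d, const std::vector<double>& s) {
        std::for_each(std::execution::par, d.begin(), d.end(),
                      [&](double& x) {
                          // The element's address gives back its index.
                          const std::size_t i = static_cast<std::size_t>(&x - d.data());
                          x = combine(x, s[2 * i], s[2 * i + 1]);
                      });
    }

    int main() {
        std::vector<double> d(8, 1.0);
        std::vector<double> s(16, 2.0);
        update(d, s);
    }

Because each invocation writes only its own element of d and reads only from s, there are no data races, and the address trick avoids materializing a 100M-entry index array just to drive the loop.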

Why does my SQL Server UPSERT code sometimes not block?

Submitted on 2021-02-08 20:46:07
Question: I have a table ImportSourceMetadata which I use to control an import batch process. It contains a PK column SourceId and a data column LastCheckpoint. The import batch process reads the LastCheckpoint for a given SourceId, performs some logic (on other tables), then updates the LastCheckpoint for that SourceId, or inserts it if it doesn't exist yet. Multiple instances of the process run at the same time, usually with disjoint SourceIds, and I need high parallelism for those cases. However, …
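For reference, the UPSERT shape that is usually recommended so that two concurrent callers for the same SourceId block each other even when the row does not exist yet is a single transaction with UPDLOCK and HOLDLOCK hints, which take and hold a key-range lock. The parameter names and column types below are assumptions:

    DECLARE @SourceId   int       = 42;                -- assumed type and value
    DECLARE @Checkpoint datetime2 = SYSUTCDATETIME();  -- assumed type and value

    BEGIN TRANSACTION;

    -- UPDLOCK + HOLDLOCK lock the key range even when no matching row exists yet,
    -- so a concurrent UPSERT for the same SourceId blocks here.
    UPDATE ImportSourceMetadata WITH (UPDLOCK, HOLDLOCK)
    SET    LastCheckpoint = @Checkpoint
    WHERE  SourceId = @SourceId;

    IF @@ROWCOUNT = 0
        INSERT INTO ImportSourceMetadata (SourceId, LastCheckpoint)
        VALUES (@SourceId, @Checkpoint);

    COMMIT TRANSACTION;

Without the HOLDLOCK (SERIALIZABLE) hint, two sessions that both find no row can each proceed to the INSERT, which is one common way this pattern intermittently fails to block.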

BlockingCollection with Parallel.For hangs?

Submitted on 2021-02-08 19:43:56
Question: I'm playing around with BlockingCollection to try to understand it better, but I'm struggling to understand why my code hangs after it finishes processing all my items when I use a Parallel.For. I'm just adding numbers to it (producer?):

    var blockingCollection = new BlockingCollection<long>();
    Task.Factory.StartNew(() =>
    {
        while (count <= 10000)
        {
            blockingCollection.Add(count);
            count++;
        }
    });

Then I'm trying to process them (consumer?):

    Parallel.For(0, 5, x =>
    {
        foreach (long value in …
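For context, the usual reason this producer/consumer pattern hangs is that a consuming loop over a BlockingCollection waits indefinitely for more items unless the producer calls CompleteAdding(). A minimal sketch with that call in place; the variable names follow the excerpt, the rest is assumed:

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    class Program
    {
        static void Main()
        {
            var blockingCollection = new BlockingCollection<long>();

            // Producer: add the items, then signal that no more will ever arrive.
            var producer = Task.Factory.StartNew(() =>
            {
                for (long count = 0; count <= 10000; count++)
                    blockingCollection.Add(count);
                blockingCollection.CompleteAdding();  // without this, consumers block forever
            });

            // Consumers: GetConsumingEnumerable ends once the collection is marked
            // complete and drained, so each loop (and the Parallel.For) can finish.
            Parallel.For(0, 5, x =>
            {
                foreach (long value in blockingCollection.GetConsumingEnumerable())
                    Console.WriteLine(value);
            });

            producer.Wait();
        }
    }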
