I'd like to make a program that reads each line of a .txt file and does something with each line.
I would like the user to be able to choose the number of threads, so that he can decide, for example, how many lines are processed concurrently.
Parallel.ForEach will run the loop iterations concurrently on thread-pool threads, and it also lets you set MaxDegreeOfParallelism (the maximum number of concurrent operations) to whatever you want.
Parallel.ForEach(lines, new ParallelOptions{ MaxDegreeOfParallelism = 10 }, line =>
{
// do stuff with line
});
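Putting that together with reading the file, a minimal self-contained sketch might look like the following (the file path and the per-line work are placeholders; here the example writes its own temp file so it runs as-is):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // Create a sample file so the example is self-contained.
        string path = Path.Combine(Path.GetTempPath(), "lines.txt");
        File.WriteAllLines(path, Enumerable.Range(1, 20).Select(i => "line " + i));

        string[] lines = File.ReadAllLines(path);

        // Process up to 10 lines concurrently; iterations run on
        // thread-pool threads, not one dedicated thread per line.
        Parallel.ForEach(lines, new ParallelOptions { MaxDegreeOfParallelism = 10 }, line =>
        {
            // do stuff with line
            Console.WriteLine(line.ToUpper());
        });
    }
}
```

The `MaxDegreeOfParallelism` value (10 here) is the knob you would expose to the user.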
You can't absolutely control the number of threads. The number used depends on the number available and other factors. But you can limit the number of threads by specifying the maximum number to use.
Using Parallel.ForEach:
var options = new ParallelOptions {MaxDegreeOfParallelism = 5};
Parallel.ForEach(lines, options, line =>
{
// do something with each line
});
MaxDegreeOfParallelism doesn't tell the whole story, though. It merely limits the number of concurrent operations. The threads used come from the managed thread pool, and the number of threads the pool keeps available for starting new tasks can be set using ThreadPool.SetMinThreads and ThreadPool.SetMaxThreads.
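To make that concrete, here is a sketch of inspecting and adjusting the pool limits (the specific numbers are purely illustrative; SetMinThreads returns false if the request is rejected):

```csharp
using System;
using System.Threading;

class ThreadPoolConfig
{
    static void Main()
    {
        // Inspect the current thread-pool limits.
        ThreadPool.GetMinThreads(out int minWorker, out int minIo);
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);
        Console.WriteLine($"min: {minWorker}/{minIo}, max: {maxWorker}/{maxIo}");

        // Raise the floor so the pool starts more threads immediately
        // instead of ramping up gradually. 8 is an arbitrary example.
        bool ok = ThreadPool.SetMinThreads(8, minIo);
        Console.WriteLine($"SetMinThreads succeeded: {ok}");
    }
}
```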
But that still doesn't tell the whole story. The number of concurrent operations that can be performed is limited by your number of CPU cores. It also depends on what those operations do. For example, no matter how many threads you have running they can't all write to the disk at the same time. Increasing the number of threads beyond a certain point may decrease performance.
So while it's interesting to experiment with, in most scenarios it's very unlikely that the user of a program would know how many threads should run concurrently.
And this is only one way to accomplish multithreading. Parallel.ForEach is a convenience for when you have an IEnumerable (like the array of lines read from the file) and want to process its items in parallel.
One more factor to take into account is that when you perform operations in parallel, you can't guarantee the sequence in which they execute. They may appear to run FIFO on one run, and then on the next run they won't.
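If the order of the results matters, one option (an alternative to Parallel.ForEach, not something the answer above prescribes) is PLINQ with AsOrdered, which still does the work in parallel but emits results in source order:

```csharp
using System;
using System.Linq;

class OrderingDemo
{
    static void Main()
    {
        string[] lines = { "first", "second", "third", "fourth" };

        // AsOrdered preserves the source order in the results,
        // even though the Select work itself runs in parallel.
        var processed = lines.AsParallel()
                             .AsOrdered()
                             .WithDegreeOfParallelism(4)
                             .Select(line => line.ToUpper())
                             .ToArray();

        Console.WriteLine(string.Join(", ", processed));
        // FIRST, SECOND, THIRD, FOURTH — order matches the input.
    }
}
```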