You can take advantage of multi-core processors by executing foreach loops in parallel. This works especially well when fetching multiple chunks of data from external sources such as a database, web service, or RESTful API. I modeled this after the parallel features coming in .NET 4.0, but those are still in beta and I've had trouble using them, so this method is a simple stand-in.
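For comparison, here is a sketch of what the same loop would look like with the built-in Parallel.ForEach once .NET 4.0 ships (based on the beta API; the EachParallel method below needs only .NET 3.5):

```csharp
using System;
using System.Threading.Tasks;

class ParallelForEachExample
{
    static void Main()
    {
        string[] names = { "cartman", "stan", "kenny", "kyle" };

        // .NET 4.0's built-in equivalent of EachParallel:
        // each iteration may run on a different thread-pool thread,
        // so the output order is nondeterministic
        Parallel.ForEach(names, name =>
        {
            Console.WriteLine(name);
        });
    }
}
```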
Usage
(not that this example makes any sense…)
```csharp
string[] names = { "cartman", "stan", "kenny", "kyle" };
names.EachParallel(name =>
{
    Console.WriteLine(name);
});
```
Of course, you'll probably want some exception handling inside the lambda to keep the threads from aborting if an exception occurs.
```csharp
string[] names = { "cartman", "stan", "kenny", "kyle" };
names.EachParallel(name =>
{
    try
    {
        Console.WriteLine(name);
    }
    catch
    {
        /* handle exception */
    }
});
```
Source Code
- Source hosted at my Helpers.Net GitHub Project
Update: Added special handling for the case where the enumerable contains only one element. The method is executed directly (in serial) to avoid the overhead of creating a thread and a WaitHandle, since that would effectively run in serial anyway.
Update: Windows caps the number of handles you can wait on with WaitHandle.WaitAll at 64. An anonymous developer posted a solution in the comments; I integrated his changes and made some modifications. The method now breaks the enumerable into 64-item chunks and processes each chunk in parallel.
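You can see the limitation for yourself with a throw-away snippet like this: passing more than 64 handles to WaitHandle.WaitAll throws a NotSupportedException.

```csharp
using System;
using System.Threading;

class WaitAllLimitDemo
{
    static void Main()
    {
        // 65 handles: one more than WaitAll supports
        var handles = new ManualResetEvent[65];
        for (int i = 0; i < handles.Length; i++)
        {
            handles[i] = new ManualResetEvent(true);
        }

        try
        {
            WaitHandle.WaitAll(handles);
        }
        catch (NotSupportedException ex)
        {
            // the runtime rejects more than 64 wait handles at once
            Console.WriteLine(ex.Message);
        }
    }
}
```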
```csharp
/// <summary>
/// Enumerates through each item in a list in parallel
/// </summary>
public static void EachParallel<T>(this IEnumerable<T> list, Action<T> action)
{
    // materialize the list so it can't change during execution
    var items = list.ToArray();
    var count = items.Length;

    if (count == 0)
    {
        return;
    }

    if (count == 1)
    {
        // if there's only one element, just execute it directly (in serial)
        action(items[0]);
        return;
    }

    // WaitHandle.WaitAll accepts at most 64 handles, so break the
    // list into 64-item chunks and process each chunk in parallel
    const int MaxHandles = 64;
    var chunkCount = (count + MaxHandles - 1) / MaxHandles;

    for (var offset = 0; offset < chunkCount; offset++)
    {
        var chunk = items.Skip(offset * MaxHandles).Take(MaxHandles).ToArray();

        // initialize the reset events to keep track of completed threads
        var resetEvents = new ManualResetEvent[chunk.Length];

        // spawn a thread-pool work item for each item in the chunk
        for (var i = 0; i < chunk.Length; i++)
        {
            resetEvents[i] = new ManualResetEvent(false);
            ThreadPool.QueueUserWorkItem(data =>
            {
                var methodIndex = (int)((object[])data)[0];
                try
                {
                    // execute the method and pass in the enumerated item
                    action((T)((object[])data)[1]);
                }
                finally
                {
                    // tell the calling thread that we're done,
                    // even if the action threw an exception
                    resetEvents[methodIndex].Set();
                }
            }, new object[] { i, chunk[i] });
        }

        // wait for every thread in this chunk to finish
        WaitHandle.WaitAll(resetEvents);
    }
}
```
The post Parallel ForEach Loop in C# 3.5 appeared first on RobVolk.com.