Tuesday, 22 July 2014

Parallel programming in depth

Hi Friends,

In this article I will explain what parallel programming is, how to use it, and what advantages it has over working with threads directly. Parallel programming was introduced in the .NET 4.0 Framework.

1.      What is Parallel programming

o   Parallel programming is a technique where we use multiple threads to execute a task faster. On modern multi-core architectures this lets us use more of the available hardware to perform a task.
o   A great example of this is sorting a list using quicksort.
o   In parallel programming, performance is usually the goal, and all the threads work towards a common result.
o   Parallel programming means executing operations at the same time, using multiple threads, processes, CPUs and/or cores.
o   You perform parallel programming by running the same operation multiple times; each invocation can be given a different "identifier" so that you can tell the instances apart, but that is not required.
o   The namespace used for parallel programming is System.Threading.Tasks (using System.Threading.Tasks;).

2.      How to use Parallel programming

The Parallel.For() and Parallel.ForEach() methods make use of a Partitioner. It would be very inefficient to execute a loop over 10,000 elements as 10,000 individual Tasks. The partitioner splits the data into segments, so that ideally ForEach() executes as 4 Tasks (threads) of 2,500 elements each on a 4-core CPU. This sometimes requires some heuristics, and you can write your own custom partitioner.
When using the 'normal' (simple) overloads of ForEach() this is fully transparent, but the <TLocal> overloads surface the partitioning.
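As a sketch of how partitioning helps, we can create range partitions ourselves with Partitioner.Create, so the loop schedules a handful of chunks instead of thousands of tiny iterations (the element count and the "work" per element below are purely illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class PartitionerDemo
{
    // Counts the elements of a range in parallel, one chunk per Task.
    public static long CountInParallel(int length)
    {
        long total = 0;
        // Partitioner.Create splits [0, length) into ranges; each Task
        // receives a (from, to) tuple rather than a single index.
        Parallel.ForEach(Partitioner.Create(0, length), range =>
        {
            long subtotal = 0;
            for (int i = range.Item1; i < range.Item2; i++)
                subtotal++;                       // dummy per-element work
            Interlocked.Add(ref total, subtotal); // thread-safe merge
        });
        return total;
    }

    public static void Main()
    {
        Console.WriteLine(CountInParallel(10000)); // prints 10000
    }
}
```

Accumulating into a per-chunk `subtotal` and merging once per chunk also avoids contending on the shared counter in every iteration.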

There are three main methods for using parallel programming:

o   Parallel.Invoke

Parallel.Invoke accepts an array of Action delegates (you can construct the array yourself, or just pass a number of delegates to Invoke), which it then executes in parallel.
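A minimal sketch (the work inside each delegate is made up for illustration):

```csharp
using System;
using System.Threading.Tasks;

public class InvokeDemo
{
    public static int a, b;

    public static void Main()
    {
        // Each delegate may run on a different thread; Invoke returns
        // only after all of them have completed.
        Parallel.Invoke(
            () => a = 1 + 1,
            () => b = 2 + 2,
            () => Console.WriteLine("third action")
        );
        Console.WriteLine(a + b); // prints 6 - both assignments are done by now
    }
}
```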
o   Parallel.For.

The Parallel.For() construct is useful if you have a set of data that is to be processed independently. The construct splits the work over multiple processors.
A normal for loop:
for (int row = 0; row < ds.Tables[0].Rows.Count; ++row)


The equivalent Parallel.For loop:
Parallel.For(0, ds.Tables[0].Rows.Count, row =>
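Putting it together as a runnable sketch (using a plain numeric range instead of the DataSet above, since the rest of that code isn't shown):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class ParallelForDemo
{
    // Sums i*i for i in [0, n); the iteration range is split across threads.
    public static long SumSquares(int n)
    {
        long sum = 0;
        Parallel.For(0, n, i =>
        {
            // Interlocked.Add makes the shared accumulation thread-safe.
            Interlocked.Add(ref sum, (long)i * i);
        });
        return sum;
    }

    public static void Main()
    {
        Console.WriteLine(SumSquares(10)); // 0+1+4+...+81 = 285
    }
}
```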
o   Parallel.ForEach.

The Parallel class’s ForEach method is a multithreaded implementation of a common loop construct in C#, the foreach loop


Parallel.ForEach<TSource, TLocal>(Partitioner<TSource>, ParallelOptions, Func<TLocal>, Func<TSource, ParallelLoopState, TLocal, TLocal>, Action<TLocal>)
Example: We have a collection of customers and we want to iterate through each customer. The normal foreach loop would be
foreach (var customer in customers)
The Parallel.ForEach loop would be
Parallel.ForEach(customers, customer =>
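A runnable sketch (the Customer class and the per-customer work here are stand-ins, since the original types aren't shown):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

public class Customer { public string Name; }

public class ForEachDemo
{
    // Processes every customer in parallel and returns how many were handled.
    public static int ProcessAll(List<Customer> customers)
    {
        // A thread-safe collection: plain List<T>.Add is not safe here.
        var processed = new ConcurrentBag<string>();
        Parallel.ForEach(customers, customer =>
        {
            processed.Add(customer.Name.ToUpper()); // placeholder per-customer work
        });
        return processed.Count;
    }

    public static void Main()
    {
        var customers = new List<Customer>
        {
            new Customer { Name = "alice" },
            new Customer { Name = "bob" }
        };
        Console.WriteLine(ProcessAll(customers)); // prints 2
    }
}
```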

3.       Some New concepts in Parallel programming

  •          How to handle exceptions in parallel loops

We can handle all exceptions thrown inside a parallel loop by catching System.AggregateException, which bundles them together.
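A sketch of the pattern (the failing condition is invented for illustration):

```csharp
using System;
using System.Threading.Tasks;

public class ExceptionDemo
{
    // Runs a loop in which one iteration throws, and returns how many
    // inner exceptions were collected from the AggregateException.
    public static int CountFailures()
    {
        int caught = 0;
        try
        {
            Parallel.For(0, 10, i =>
            {
                if (i == 3) throw new InvalidOperationException("bad item " + i);
            });
        }
        catch (AggregateException ae)
        {
            // Exceptions from all iterations arrive bundled in one place.
            foreach (var inner in ae.InnerExceptions)
            {
                Console.WriteLine(inner.Message);
                caught++;
            }
        }
        return caught;
    }

    public static void Main()
    {
        Console.WriteLine(CountFailures() >= 1); // True: at least one exception surfaced
    }
}
```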

  •          Break statement in parallel loops

We can use the ParallelLoopState.Break method to break out of a loop; the loop body receives the ParallelLoopState as a second parameter:
Example:
    (i, state) =>
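A runnable sketch of Break (the range and the break condition are illustrative). Note that Break() does not stop the loop instantly: it requests that no iterations beyond the current index start, while lower indices are still guaranteed to run.

```csharp
using System;
using System.Threading.Tasks;

public class BreakDemo
{
    public static ParallelLoopResult Run()
    {
        return Parallel.For(0, 1000, (i, state) =>
        {
            // Request an orderly break once we pass index 99.
            if (i >= 100) state.Break();
        });
    }

    public static void Main()
    {
        ParallelLoopResult result = Run();
        Console.WriteLine(result.IsCompleted);          // False: the loop was broken
        Console.WriteLine(result.LowestBreakIteration); // lowest index at which Break() ran
    }
}
```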
  •          Stop statement in parallel loops

We can use the ParallelLoopState.Stop method to stop the loop as soon as possible:

Parallel.ForEach(integers, (item, state) =>
{
    if (item > 5) { Console.WriteLine("Higher than 5: {0}, exiting loop.", item); state.Stop(); }
    else Console.WriteLine("Less than 5: {0}", item);
});

  •          How to exit from parallel loops early

We can exit from a parallel loop early by passing a CancellationToken explicitly. Cancellation is supported in parallel loops through the System.Threading.CancellationToken type introduced in .NET 4.0.
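A sketch of the pattern. For simplicity the token is cancelled from inside the loop body here; in a real application Cancel() would usually be called from another thread (for example, a Cancel button):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class CancelDemo
{
    // Returns true if the loop observed the cancellation.
    public static bool RunAndCancel()
    {
        var cts = new CancellationTokenSource();
        var options = new ParallelOptions { CancellationToken = cts.Token };
        try
        {
            Parallel.For(0, 100000, options, i =>
            {
                if (i == 10) cts.Cancel(); // illustrative trigger
            });
        }
        catch (OperationCanceledException)
        {
            // The loop checks the token between iterations and throws
            // once cancellation is observed; remaining iterations never start.
            return true;
        }
        return false;
    }

    public static void Main()
    {
        Console.WriteLine(RunAndCancel());
    }
}
```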

  •          ParallelOptions Class in Parallel Programming

There are 3 properties in the ParallelOptions class:
o   CancellationToken: gets or sets the CancellationToken associated with this ParallelOptions instance.
o   MaxDegreeOfParallelism: gets or sets the maximum number of concurrent tasks enabled by this ParallelOptions instance.
o   TaskScheduler: gets or sets the TaskScheduler associated with this ParallelOptions instance. Setting this property to null indicates that the current scheduler should be used.

  •         Limit the number of parallel threads in C#

We can limit the number of parallel threads using the MaxDegreeOfParallelism property:
ParallelOptions options = new ParallelOptions();
options.MaxDegreeOfParallelism = 4;
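Used with a loop, the option caps how many loop bodies run concurrently. The sketch below (iteration count and sleep time are illustrative) measures the peak concurrency to show the cap in action:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class DegreeDemo
{
    // Runs a loop with the given parallelism limit and returns the
    // highest number of loop bodies that were ever running at once.
    public static int MaxObserved(int limit)
    {
        int running = 0, peak = 0;
        object gate = new object();
        var options = new ParallelOptions { MaxDegreeOfParallelism = limit };

        Parallel.For(0, 200, options, i =>
        {
            int now = Interlocked.Increment(ref running);
            lock (gate) { if (now > peak) peak = now; } // record peak concurrency
            Thread.Sleep(1);                            // simulate work
            Interlocked.Decrement(ref running);
        });
        return peak;
    }

    public static void Main()
    {
        Console.WriteLine(MaxObserved(4) <= 4); // True: never more than 4 at once
    }
}
```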

4.      Advantages of Parallel programming over threads

    •   Most applications can benefit from implementing some form of data parallelism. Iterating through collections and performing "work" is a very common pattern in nearly every application.
    •   If you are running a long method, you can hand it off to a parallel task and you don't need to wait until that method has finished executing.
    •   Application performance improves because the work is spread across multiple cores.

5.      Disadvantages of Parallel programming

    •      It's a fire-and-forget kind of mechanism.
    •       We don't know when we will get the result back.
    •    In my experience it is best suited to independent work such as filling dropdown lists, for example on a long sign-up form.

Happy Programming!!

Don’t forget to leave your feedback and comments below!

If you have any query mail me to Sujeet.bhujbal@gmail.com     

Sujeet Bhujbal

Friday, 23 May 2014

WCF: The maximum message size quota for incoming messages (65536) has been exceeded

Hi Friends,


Recently, while working on a WCF project, we hit an issue when reading a PDF file from the server via WCF. The error was:

System.ServiceModel.CommunicationException: The maximum message size quota for incoming messages (65536) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element.


To resolve this problem we increased MaxReceivedMessageSize in app.config.
We need to increase the value of two limiting parameters: maxBufferSize and maxReceivedMessageSize. In my example these values are set to 2147483647.
For the maximum file size in WCF, we need to set the following parameters on the binding.

MaxBufferPoolSize: gets or sets the maximum size of any buffer pools used by the transport. The default is 524,288 bytes.

MaxReceivedMessageSize: gets or sets the maximum message size that can be received. The default is 65,536 bytes.

ReaderQuotas: defines the constraints on the complexity of SOAP messages that can be processed by an endpoint configured with a binding. You may need to set maxBytesPerRead, maxDepth, maxNameTableCharCount and maxStringContentLength.

<binding name="MyService"
         maxBufferSize="2147483647"
         maxReceivedMessageSize="2147483647"
         transferMode="Streamed">
  <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                maxArrayLength="2147483647" maxBytesPerRead="2147483647" />
  <!-- other binding info here, such as security -->
</binding>

Max possible size: set MaxReceivedMessageSize to 2147483647 (i.e. Int32.MaxValue), which is 2 GB.

Please Note:

1.       It is recommended to keep the file size small: since the data is transferred over the network, large messages can suffer from slow network performance.
2.       We can set it to the maximum value, which is 2 GB (2147483647) - or, if we're using streamed transfers, we can go up to 2^63 bytes (almost 10,000,000 terabytes). But remember that by setting it to a value larger than necessary we may be opening our service (or client) to attacks.
3.       Increase the quota only if the service is expected to receive messages larger than 65,536 bytes - and in that case, only on the endpoints where we expect to receive such big messages.
4.       If the messages that our clients/servers expect to receive are larger than the default value for MaxReceivedMessageSize, then we need to increase this quota.

Happy Programming!!


Sujeet Bhujbal
