Thursday, March 22, 2012

Asynchronous data flow tasks: how to run more than 4 at a time

Hi guys,

I have a For Each Loop container with about 20 data flow tasks inside it (simple data extractions). I've noticed that when I run the package, only 4 data flow tasks run at a time; the others have to wait until one of the first 4 finishes.

I was wondering if there's a way to change the limit on how many data flow tasks can run at once. Is there a property somewhere?

I know this will put more stress on the server, but the server is well equipped with CPU power and memory, so performance will not be an issue.

any thoughts?

Package.MaxConcurrentExecutables Property

Valid values are 1 and higher, or -1. A value of -1 allows the maximum number of concurrently running executables to equal the number of processors plus two. Setting this property to zero or any other negative value fails with an error indicating an invalid argument.

This property is used when parallelism exists in the workflow. If the workflow is a series of sequential precedence constraints, then this property has no effect.
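
For example (the file path and value below are just placeholders), you could raise the limit at run time with dtexec's /Set option, or simply edit MaxConcurrentExecutables in the package's Properties window in BIDS:

    dtexec /File "C:\Packages\MyExtract.dtsx" /Set \Package.Properties[MaxConcurrentExecutables];10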

I don't know if you can get more than CPU count + 2 by forcing a higher value. If this is a 32-bit server, I would also be concerned about memory: despite having 10 GB in the box, a single process (i.e. an SSIS package) can only address 2 GB, or 3 GB with the /3GB boot.ini switch. So you may want to break the work out into multiple packages, or call the same package multiple times. The Execute Package Task can be used to get multiple processes via its ExecuteOutOfProcess property, but this carries a higher overhead for loading and starting the packages.
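
As a rough sketch of the multiple-process approach (the package names are made up for illustration), you could also launch separate dtexec processes from a batch file, so that each package gets its own address space:

    start dtexec /File "C:\Packages\Extract_Part1.dtsx"
    start dtexec /File "C:\Packages\Extract_Part2.dtsx"

Each start call returns immediately, so the packages run concurrently in separate dtexec.exe processes.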
