Hi,
I have an SSIS package that is scheduled to run via SQL Agent. The package contains a data flow task with a single source adapter, a single destination adapter, and lookup transformations.
The number of engine threads for the task is 5 (the default), and MaxConcurrentExecutables on the package is set to 20. The package also contains a whole lot of Execute SQL tasks and the like.
After a few (3-4) successful runs, the package started to fail with the following error for the data flow:
"The Data Flow task engine failed at startup because it cannot create one or more required threads"
This could be seen from the log files.
OR
"Data Flow task failed to create a required thread and cannot begin running. The usually occurs when there is an out-of-memory state."
Even if MaxConcurrentExecutables is set to -1, the result is the same.
Now the package keeps failing. What is the cause of this problem? I have not tried it yet, but I am fairly sure that if I ran it through BIDS it would work fine.
Any hints on this? It has put us in a tough spot.
Server config:
4-way, 16 GB RAM, enough disk!
Thanks,
Gaurav
Hi Gaurav,
I am also facing this same problem; after a few successful runs, the package is now returning the error above.
That is:
[DTS.Pipeline] Error: The Data Flow task failed to create a required thread and cannot begin running. This usually occurs when there is an out-of-memory state.
Have you found a solution? Please share it with me.
Thanks a lot.
Swarna,
|||
What service pack level are you running?
|||Hi,
Thanks for your reply. I am running SP2. The problem here is that I cannot run the package at all; even in BIDS I am getting the above error.
Thanks.
Regards,
Swarna.
|||Has anybody found a solution for this?
I have a package that was running perfectly, and suddenly it blew up with this memory error.
Can anybody please share the solution?
Thanks,
Prash.
|||Well, my package has too many lookups.
When I increased the cache memory from 5 MB (the default) to 30 MB, it started running again.
So I think that can be one of the solutions.
Prash
|||Which cache are you referring to? Is it a package setting?
Swarna.
|||I believe he is referring to the properties on the Advanced tab of the lookup, under Enable Memory Restriction.
|||Yes, you are right!
|||For my case, I am not using so many lookups, but I am still facing the same problem. Any idea?
Thanks,
Swarna.
|||In fact, my package was running well when the lookup fetched about a million records, but as the records increased to 2.5 million, the package suddenly stopped and I was kicked out. I watched the progress of the package: the debugger was showing that all stages were fetching records, yet my data flow stayed yellow with the message that the package had finished execution. I was really confused. But somehow I increased the cache memory of every lookup in the data flow and it started executing normally. A rough way to size that cache is sketched below.
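As a minimal sketch (dbo.ReferenceTable, LookupKey, and LookupValue are placeholder names, not from the actual package), you can estimate the memory a fully cached lookup needs by summing the sizes of only the columns the lookup references:

    -- Rough cache-size estimate for one lookup; all names are hypothetical.
    SELECT COUNT(*) AS lookup_rows,
           SUM(CAST(DATALENGTH(LookupKey) AS BIGINT)
             + CAST(DATALENGTH(LookupValue) AS BIGINT)) AS approx_cache_bytes
    FROM dbo.ReferenceTable;

If approx_cache_bytes comes out well above the cache size configured on the lookup's Advanced tab, failures like the ones described above are what you would expect, and raising the cache (or trimming the columns in the query) is the fix.
|||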
Swarna wrote:
For my case, I am not using so many lookups, but I am still facing the same problem. Any idea?
Thanks,
Swarna.
For the lookups that you do have, is memory restriction enabled?
|||Swarna, can you check the debugger and see whether it is showing any info about records being fetched at the various stages?
|||Hope I'm not too late in giving my feedback here. The issue we have faced is one where there are not too many lookup objects, but the package keeps giving the same error message.
What I tested was optimizing the usage of the Lookup object by not choosing the "Use a table or a view:" option. Instead, I chose the "Use results of an SQL query:" option. I had only 3 lookup objects, 2 of which look up against tables with fewer than 10,000 records and fewer than 8 columns on average. But the other table we did the lookup against has about 2 million records, with about 10-12 columns.
I made the change for all of the lookup objects, and somehow everything was working again. I think it is the same memory-restriction issue as well; a query along the lines of the sketch below is what I mean.
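As a rough illustration (dbo.Customer, CustomerKey, and CustomerName are made-up names, not the actual tables involved), the point is to have the lookup cache only the join key and the columns it actually outputs, instead of every column in the reference table:

    -- Lookup source query that returns only what the transformation needs;
    -- all object names here are hypothetical.
    SELECT CustomerKey,  -- join key used by the lookup
           CustomerName  -- the only column the lookup outputs
    FROM dbo.Customer;   -- the ~2-million-row reference table

With about 2 million rows, dropping the unneeded columns shrinks the cached lookup set dramatically, which is why this helps in much the same way as tuning the memory-restriction settings.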