I think an auxiliary testing group is a good idea: someone who tests all the components as a cohesive whole, ensuring the system behaves as expected in terms of business expectations.
My experience, though, has been counterproductive. I expect a bit of lead time to introduce them to the code base, but I ended up in daily meetings covering the code base and what it was delivering, then several weeks of meetings mapping out test cases into a spreadsheet with the various results.
There is a fine balance, but when executed properly they are worth it.
I like having a tester watch my back.
I have been helped by different kinds of testers. I like the guys who keep careful notes, have automation, and can hand the automation code to me so I can reproduce the bug in a split second.
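That kind of handoff doesn't have to be elaborate. As a rough sketch (the file name, function, and bug scenario here are hypothetical, not from the original comment), the tester hands over a single pytest file that pins the failing input down as data instead of a prose "steps to reproduce" list:

```python
# repro_issue_1234.py -- hypothetical example of the handoff described above.
# Run with: pytest repro_issue_1234.py
import pytest

# Toy stand-in so this sketch runs on its own; in a real handoff you would
# import the production code instead, e.g. `from billing import calculate_discount`.
def calculate_discount(order_total: float, coupon: str) -> float:
    if coupon == "SAVE10":
        return order_total * 0.10
    return 0.0  # expired or unknown coupons give no discount


def test_expired_coupon_gives_no_discount():
    # The exact scenario the tester saw misbehaving, captured so the
    # developer can reproduce it in a split second.
    assert calculate_discount(100.0, "EXPIRED10") == pytest.approx(0.0)
```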
There was also a guy we hired for customer-facing work who could and would screw anything up, so we made him a tester; if he couldn't screw a system up, nobody could!
I think domain knowledge is the most important quality of a good manual tester.
We've had great luck with testers who started out as users. We had a nurse testing our clinical app and she was invaluable; we had a geologist testing our seismic applications and she was also invaluable. They focused on which issues were important in an app and were great at prioritization.
We've also had cross-domain testers, and they were not as valuable. They tended to be more nit-picky because they didn't have the domain knowledge to judge what was truly important and what wasn't.
What makes a good automation engineer is pretty similar to what makes a good software engineer.