The problem of parallelization is one of co-ordination.
Things are more easily parallelizable the more independent they are, that is, the less co-ordinating information flow they require between the parallel nodes.
We need a symmetry-breaking theory to argue that the test phase requires less co-ordination than the generate phase.
Is there one?
Or perhaps we should just try to categorize things on a 2D grid:
|| || Generate || Test ||
|| Less Co-ordination Cost || || ||
|| More Co-ordination Cost || || ||
See also TopicsDiscussedHere