It does give more space in the sense that the model generates more tokens of intermediate steps that it can then base its actual answer on, rather than being forced to generate the answer right away.
> It does give more space in the sense that the model generates more tokens of steps/etc that it can then base its actual answer on,
It doesn't give more space in the sense of increasing the upper bound on space used; it may bias the space used higher than a single naive prompt aiming to answer the same question, but it doesn't alter the underlying constraint (the fixed context window).
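To make the distinction concrete, here's a minimal sketch (the token counts and the `CONTEXT_WINDOW` value are made up for illustration): chain-of-thought raises the number of tokens actually consumed before the answer, but both prompting styles sit under the same fixed ceiling.

```python
# Hypothetical fixed context window; the actual value depends on the model.
CONTEXT_WINDOW = 8192

def tokens_used(prompt_tokens: int, reasoning_tokens: int, answer_tokens: int) -> int:
    """Total tokens consumed by one exchange; must fit in the context window."""
    total = prompt_tokens + reasoning_tokens + answer_tokens
    assert total <= CONTEXT_WINDOW, "the upper bound on space is unchanged"
    return total

# Naive prompt: answer immediately, no intermediate steps.
direct = tokens_used(prompt_tokens=50, reasoning_tokens=0, answer_tokens=20)

# Chain-of-thought: spend extra tokens on steps before the same answer.
cot = tokens_used(prompt_tokens=50, reasoning_tokens=400, answer_tokens=20)

print(direct, cot)  # CoT uses more space, but the bound is identical.
```

Both calls pass the same `CONTEXT_WINDOW` check: the reasoning tokens shift usage higher within the budget; they don't enlarge it.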