The way we teach people to be software developers doesn't make a lot of sense.
It's one of those things that sounds reasonable so long as it is described in vague-enough terms. The idea is that you build up someone's skills slowly, incrementally.
For instance, some people believe that you should start by teaching scripting. Then promote students to procedural coding. The traditional version of object-oriented design follows. Finally, once students have enough understanding of software development, you start teaching design patterns.
The same is true of test-driven development. Start with test-never. Then move on to test-after. Upgrade to test-first. Promote to TDD and then, ultimately, to organization-level BDD.
Refactoring gets a similar treatment if it gets any treatment at all.
What's really being proposed is ordering alternative approaches by how difficult they are to master completely.
This doesn't make any sense because it sets up a pattern of wasted effort. You start by teaching someone something the wrong way. Then, after they've attained a modicum of mastery in that way, you tell them it's bullshit and they should do something else.
How do we expect people to react?
I'm experimenting with a different pattern: start with a skill that has no dependencies and don't worry about mastery. Just help people learn enough to start using it, which really isn't that much. As their skill in that area grows, it enables them to start learning the skills that depend on it.
In software development, given developers who already have basic coding skills, refactoring is the skill with no dependencies. Refactoring enables design because it's way easier to imagine a new design if you have a chance of getting there safely. Design, in turn, enables TDD because it's easier to imagine using tests to drive your development process when you know you can create and refactor to designs that will enable the effort.
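To make that chain concrete, here is a minimal sketch (the function names and shipping rules are hypothetical, not from any particular codebase): a small, safe extract-function refactoring exposes a seam, the seam suggests a design element, and that design element makes a test easy to imagine and write.

```python
# Before: shipping rules tangled inside a checkout routine; hard to see, hard to test.
def checkout(items, country):
    total = sum(price for _, price in items)
    if country == "US":
        shipping = 0 if total > 100 else 5
    else:
        shipping = 15
    return total + shipping


# After a small extract-function refactoring, a seam appears.
def shipping_cost(total, country):
    """The extracted rule is now a visible design element and an obvious test target."""
    if country == "US":
        return 0 if total > 100 else 5
    return 15


def checkout_refactored(items, country):
    total = sum(price for _, price in items)
    return total + shipping_cost(total, country)


# With the seam in place, a test-first step becomes natural.
def test_free_shipping_over_100_in_us():
    assert shipping_cost(150, "US") == 0


if __name__ == "__main__":
    test_free_shipping_over_100_in_us()
    print(checkout_refactored([("book", 120)], "US"))  # 120: order qualifies for free shipping
```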
It seems like it's working really well.