It strikes me as something that can be incredibly useful or do great harm, depending on dosage. A selection of conversation partners at your fingertips, and you can freely test reactions without risking harm to a relationship. At worst you reset it. Maybe you can even just roll back the last couple messages and try a different angle. Sounds like a great way to enhance social skills. Yet as you point out, healthy development also requires that you deal with actual humans, with all the stakes and issues that come with that.
People who are used to working with an undo stack (or with savegame states) are usually terrified when they suddenly have to make do in an environment where mistakes have consequences. They (we) either freeze or go full nihilistic, completely incapable of finding a productive balance between diligence and risk-taking.
If by social skills you mean high-performance manipulators, then yes, you would get some of those. But for everybody else, it would be a substitute for social interaction, not a preparation for it.
Only from a very narrow perspective. Opening yourself up and being real with people is how relationships form. If you test every conversation you are going to have with someone before having it, then the other person basically has a relationship with an AI, not with you.
Now, testing every conversation is an extreme case, but there is harm any time a human reaches out to a computer for social interaction instead of other humans.
That "instead of other humans" part is doing a lot of heavy lifting here. What if it's "instead of total isolation" or "instead of parasocial interactions" or "instead of exploitative interactions"? There are many cases that are worse than a person chatting with a robot.
It's very rare that you would say something so damaging that it couldn't be resolved by a genuine apology. Actually going through an awkward moment and resolving it is a real skill, and it shouldn't be substituted with deleting the chatbot and spawning in a new one.