It's easy to calibrate torque wrenches to within +/-5% of each other for _torque_. Calibrating a torque value to the _tension_ of a fastener depends not only on the accuracy of the tool, but on materials, lubrication, temperature, process, etc. Even +/-30% assumes a good process, and that only accounts for variations in tolerance, surface finish, etc. If you add lubricant under the fastener head when you are not supposed to, you can easily end up with +50% or +100% tension.
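The usual back-of-the-envelope model for this is the short-form torque equation T = K * D * F, where K is the "nut factor" that lumps together thread and under-head friction. Here is a minimal sketch of how lubrication alone shifts the tension you get at the same wrench setting; the K values and bolt size are illustrative assumptions, not measured data:

```python
# Short-form torque equation: T = K * D * F
#   T = applied torque (N*m), K = nut factor (dimensionless),
#   D = nominal fastener diameter (m), F = resulting clamp tension (N)
# The K values below are typical textbook-style assumptions, not measurements.

def tension_from_torque(torque_nm: float, nut_factor: float, diameter_m: float) -> float:
    """Estimate the clamp tension produced by a given applied torque."""
    return torque_nm / (nut_factor * diameter_m)

torque = 50.0      # N*m, the torque wrench setting
diameter = 0.010   # 10 mm bolt

dry = tension_from_torque(torque, nut_factor=0.20, diameter_m=diameter)    # dry, as intended
lubed = tension_from_torque(torque, nut_factor=0.12, diameter_m=diameter)  # oil under the head

print(f"dry:   {dry / 1000:.1f} kN")
print(f"lubed: {lubed / 1000:.1f} kN (+{(lubed / dry - 1) * 100:.0f}% tension at the same torque)")
```

Same torque setting, but the lubricated case lands well over half again as much tension, which is where those +50% to +100% numbers come from.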
Ah, I see your point now; I thought you meant the torque wrenches themselves being off was the direct cause. That makes good sense and aligns with my experience.
After manufacturing, tension tends to drop over time, so starting off with a '+' may not be entirely bad, assuming it isn't extreme and doesn't cause the materials to deform more than permitted. The way I understand it: you apply a certain torque to a fastener in order to reach minimum levels of tension and friction (which still include an engineering reserve) on the fastener itself, to guarantee a seal and to stop the fastener from coming loose. So under-tension is far worse than over-tension, as long as the over-tension does not damage the fastener or the materials, and the allowed tolerances for over-tension are quite large (up to +150% or so, normally, before any permanent deformation would occur).
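A rough numeric sketch of that asymmetry, with made-up numbers for the minimum required preload and the onset of permanent deformation (placed at roughly the +150% margin mentioned above):

```python
# Rough sketch of the asymmetric tolerance band around the target preload.
# All numbers are made-up illustrations, not real fastener specifications.

required_preload_kn = 20.0  # minimum tension to keep the joint sealed and the fastener tight
yield_preload_kn = 50.0     # tension where permanent deformation starts (~+150% over the minimum)

def check(preload_kn: float) -> str:
    if preload_kn < required_preload_kn:
        return "UNDER-TENSIONED: joint may leak or work loose"
    if preload_kn > yield_preload_kn:
        return "OVER-TENSIONED: permanent deformation / fastener damage"
    return "OK: inside the (wide) acceptable band"

for preload in (14.0, 20.0, 35.0, 48.0, 55.0):  # kN
    print(f"{preload:5.1f} kN -> {check(preload)}")
```

Falling even slightly short of the minimum is a failure, while there is a lot of headroom on the high side before anything yields.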
Unless you are using 'stretch' bolts, which tend to elongate to accommodate any over-tension and end up with something quite close to the intended value. This stretching tends to be non-elastic, so you have to replace a stretch bolt every time you unfasten it. Otherwise there is a pretty good chance that it will break, and/or that the threads under the final position of the nut will have deformed and end up stripped if you refasten them, because the nut travels a bit further on every refastening.