I feel like most of the strife at work is caused by differing value judgements. So I've been trying to tease out truths vs value judgements.
1 + 1 will always equal 2, no one debates that. AWS offers both Postgres and MySQL, no one debates that. But which one is better? That will be informed by possibly irrational value judgements ("Postgres bit me once, never again!")
I think people are irrational in their inability to admit which of their beliefs are biases ("Postgres bad") and which are largely factual ("we can choose between these vendors").
I've seen this very often. People are absolutely consumed by avoiding their previous mistakes. Sometimes those aren't even real mistakes but searched-for, almost-made-up ones, as in postmortems for successful projects.
And then BIG changes get implemented, usually in the worst possible place: the planning of the process in the first place. If a project succeeded and finished, your adjustments for future projects should of course be tiny and minimal. In practice people make sweeping changes ("level 3 testing needs to be planned in before we even make the software design," boomed the senior architect).
And yet one lesson you quickly learn in machine learning is that making adjustments that are too small ... does pretty well. You can be off by a factor of 10 and, while sub-optimal, it still works (and often, the further you get, the smaller the adjustments should be).
By contrast, if your adjustments are 1.1x what they should be, you are stuck. You will never get where you're going. Never. That's a complete failure, and you should expect to do worse and worse over time with such a strategy. So a smart person should almost always err on the side of not adjusting a behavior. Certainly from one project to the next, making more than a 2% adjustment is absolute lunacy. This never works in learning algorithms. Never (0.1% is the largest generally used value). A 2% adjustment would mean, for every software engineer on your team, ONE day of difference EVERY 2 MONTHS from one project to the next. Even that is an absurdly large change, far greater than wise.
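The asymmetry can be sketched with a toy gradient descent run (a hypothetical illustration, not from the discussion above): minimizing f(x) = x², a step size 10x too small still converges steadily, while a step size only 10% past the stability threshold diverges.

```python
def descend(lr, steps=50, x=10.0):
    """Gradient descent on f(x) = x^2, whose gradient is 2x.

    Each step does x -= lr * 2x, i.e. x *= (1 - 2*lr), so any
    lr below 1.0 shrinks |x| toward 0, and any lr above 1.0
    grows it without bound.
    """
    for _ in range(steps):
        x -= lr * 2 * x
    return abs(x)

# 10x smaller than the one-shot-optimal step (0.5): slow but converging.
print(descend(0.05))
# 10% past the divergence threshold (1.0): blows up.
print(descend(1.1))
```

Undershooting by an order of magnitude costs only speed; overshooting the threshold by a sliver costs everything, which is the asymmetry the argument leans on.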
But I've never once seen a software team that doesn't start by making 20% adjustments (one day a week different) between projects. And I've seen people make much bigger changes than that, generally not with better results. And the thing they're proud of is that they avoided a previous mistake ... a mistake they had already solved ... often far more quickly than the delays the new adjustments cause ...
You have good points, but ML needs thousands of generations; we'll have at most dozens of multi-person projects under our belts. 2% doesn't even beat inflation.
Opinions from people who have used only one aren't useful ("Best [or worst] I've ever used"). Someone who has been bitten or saved by each of them differently, across multiple projects, can make a good choice.
If you collect the pros/cons from everyone and try to agree on which ones matter more in the current context, then you're not tied to a previous commitment.