If you have a few long data loading and preprocessing steps, it's a pain to wait for them to run again, so people try to avoid it.
When something odd begins to happen, they don't immediately consider the possibility that it isn't their bug, and they waste time trying to 'debug' the problem instead of just rerunning the notebook.
Would it be a solution to store intermediate computations in an in-memory or on-disk database like Redis or SQLite? It's a matter of a few minutes to spin up a Docker instance and write simple read/write + serialize Python util functions.
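Something like this is what I mean (just a sketch; `StepCache` and the key names are made up, and you could back it with redis-py instead of the stdlib sqlite3 shown here):

```python
import pickle
import sqlite3

class StepCache:
    """Minimal disk-backed cache for intermediate notebook results."""

    def __init__(self, path="steps.db"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, value BLOB)"
        )

    def put(self, key, obj):
        # Serialize with pickle and upsert into the cache table.
        blob = pickle.dumps(obj)
        self.conn.execute(
            "INSERT OR REPLACE INTO cache (key, value) VALUES (?, ?)", (key, blob)
        )
        self.conn.commit()

    def get(self, key, compute=None):
        # Return the cached object, or run `compute` only on a miss.
        row = self.conn.execute(
            "SELECT value FROM cache WHERE key = ?", (key,)
        ).fetchone()
        if row is not None:
            return pickle.loads(row[0])
        if compute is None:
            raise KeyError(key)
        obj = compute()
        self.put(key, obj)
        return obj

# Usage in a notebook cell (load_and_clean is a hypothetical expensive step):
# cache = StepCache()
# df = cache.get("cleaned_frame", compute=lambda: load_and_clean(raw_path))
```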
You don't reload every time you write a line of code; nobody's insane like that. You reload every two hours or so. That's good enough for all but the most extreme data sets.