You are not the OP, but I think I was trying to point out this example case in relation to their description of Errors/Warnings.
This scenario may or may not result in data/state loss, and it may also be something that you yourself can't immediately fix. And if it's temporary, what is the point of creating an issue and prioritizing it?
I guess my point is that for any such categorization of errors or warnings there are way too many counterexamples to describe them like that.
So I'd usually think of Errors as something I heuristically want to react to and investigate quickly (e.g. being paged), while Warnings are something I check in on periodically (e.g. weekly).
Like so many things in this industry, the point is establishing a shared meaning for all the humans involved, regardless of what uninvolved people think.
That being said, I find tying the level to expected action a more useful way to classify them.
But what I also see frequently is people trying to do the impossible, idealistic thing because they read somewhere that a level should mean X, when things are never so clear-cut. So either it is not such a simple issue and should be understood as such, or there is a better, more practical definition for it. We should first start from what we are using logs for: are we using them for debugging, for alerting, or both?
If for debugging, the levels seem relevant in terms of how quickly we can use that information to understand what is going wrong. Out of a potential sea of logs we want to see the most likely culprits first, so the higher the log level, the higher the likelihood that the event caused something to go wrong.
If for alerting, they should reflect how bad this particular thing is for the business, and they would help us set a threshold for when we page or otherwise have to react.
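A minimal sketch of what tying levels to expected action could look like, using Python's standard logging module; page_oncall and queue_for_weekly_review are hypothetical stand-ins for a real pager and a review queue:

    import logging

    # Route log records by the action they should trigger, rather than by an
    # abstract notion of severity. page_oncall / queue_for_weekly_review are
    # hypothetical hooks for a pager and a "look at this later" queue.

    def page_oncall(message):
        print("[PAGE NOW] " + message)

    def queue_for_weekly_review(message):
        print("[REVIEW WEEKLY] " + message)

    class ActionRoutingHandler(logging.Handler):
        def emit(self, record):
            msg = self.format(record)
            if record.levelno >= logging.ERROR:
                page_oncall(msg)              # react to and investigate quickly
            elif record.levelno >= logging.WARNING:
                queue_for_weekly_review(msg)  # periodic check-in
            # DEBUG/INFO are kept only for debugging; no action expected.

    logger = logging.getLogger("orders")
    logger.setLevel(logging.DEBUG)
    logger.addHandler(ActionRoutingHandler())

    logger.warning("payment provider slow, retrying")
    logger.error("payment failed after all retries")

The point isn't the code; it's that each level is defined by the human response you expect, which is the shared meaning that actually matters.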
It would arguably be more practical if GHC could now target wasm, but this announcement is actually about being able to run the compiler itself in the browser.
I have burned git into my brain, so it's no longer hard for me. OTOH, I only pull out jq once every six months or so, and I just barely scrape by every time.
I also do this sort of restless refactoring. I find interacting with a new codebase is a kind of brain hack that (subjectively) helps me get up to speed faster.
It appears to have something to do with CGS units.
1 Jy = 10^−23 erg s^−1 cm^−2 Hz^−1 (cgs)
only their figure: L_9.9GHz < 2.1 × 10^25 erg s^−1 Hz^−1
leaves out the cm^−2. (So not a flux density, like Jy. Perhaps 'L' is luminosity? ... As in: "The solar luminosity unit is a measure of the Sun's radiant energy and is equal to 3.828×10^26 Watts." -(NRAO))
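For what it's worth, the two are related by the usual inverse-square step: a spectral luminosity L_nu (erg s^−1 Hz^−1) spread over a sphere of radius d gives a flux density S_nu = L_nu / (4π d^2), which is what picks up the cm^−2 and, after the 10^−23 factor, comes out in Jy. A rough sketch, assuming isotropic emission and a made-up example distance:

    import math

    JY_IN_CGS = 1e-23       # 1 Jy = 1e-23 erg s^-1 cm^-2 Hz^-1
    CM_PER_PC = 3.0857e18   # 1 parsec in cm

    def spectral_luminosity(flux_density_jy, distance_pc):
        # L_nu = 4 * pi * d^2 * S_nu, assuming isotropic emission
        d_cm = distance_pc * CM_PER_PC
        s_nu = flux_density_jy * JY_IN_CGS     # erg s^-1 cm^-2 Hz^-1
        return 4.0 * math.pi * d_cm**2 * s_nu  # erg s^-1 Hz^-1

    # Hypothetical example: a 1 mJy source at 100 Mpc
    print("%.2e erg s^-1 Hz^-1" % spectral_luminosity(1e-3, 100e6))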
I was thinking maybe joules/year, i.e. energy/time, might be some brightness indicator for some astronomical definition of brightness, especially at non-optical wavelengths.
But that's division, not multiplication. Another thread in my brain thought maybe the product of the two could be useful to people in that field, sort of like how Isp is useful to people in rocketry, while the rest of us need to multiply it by g (Earth's standard gravity) to get something intuitive.
I don't think it's obscure in that field or for the target audience. You might want to read the soon-to-be-published distilled and transposed article in Popular Mechanics ...