
While analog watches aren't as precise, they are faster to read. They take most people less time to process, as humans (apparently)[1] can interpret an image faster than numbers.

For example, 189/304 vs [####__]; the latter is probably faster to process.

Also, not related to clocks, but analog gauges in particular: a number lacks an important vector of information: rate of change.

[1] https://books.google.nl/books?id=nrtUgKzFhJ4C&pg=SA7-PA16&lp...



Honestly, my personal experience contradicts this. It takes me an order of magnitude longer to process an analog watchface into a numerical reading than to just glance at a digital watchface directly (over a second versus a few hundred milliseconds).

For progress bars (or, say, tank level gauges) it totally makes sense - I like them the same way you've presented: it's easier to ingest [####__] (or, better, [####__] 62%, so it can also be spelled out if someone asks) than 189/304.
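As a small aside, rendering that hybrid "bar plus exact number" display is trivial - here's a minimal Python sketch (the function name `progress_bar` and the width of 6 cells are my own choices for illustration):

  def progress_bar(done: int, total: int, width: int = 6) -> str:
      """Render a count as a textual gauge plus an exact percentage,
      e.g. 189/304 -> '[####__] 62%'."""
      filled = round(width * done / total)  # 189/304 of 6 cells rounds to 4
      return f"[{'#' * filled}{'_' * (width - filled)}] {done / total:.0%}"

  print(progress_bar(189, 304))  # [####__] 62%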

The linked book doesn't open for me, but I agree this also applies to various instruments, especially in aviation, where IFR is of extreme importance. For a car, though, I picked mine specifically for its clear, large digital speedometer (because unlike on an aircraft, cars 101% rely on seeing outside). I'm always interested in the speedometer reading to answer "how fast am I going, exactly?" (the speed limit comparison is a no-brainer, and with a precise reading I can hold speed within +/-1 mph of the desired target, which is impossible with an analog speedometer) rather than "am I slowing down or speeding up?" (I can see that already).

But watches - when used for telling time - aren't really gauges, are they? Analog watchfaces can be (and are, by some) used as a gauge when measuring time intervals ("how long has passed?", "how much time is left?"), but again, like I wrote, I suspect that most people read the time and then do the arithmetic rather than imagining a pie chart.


> I suspect that most people read the time then do the arithmetic rather than imagining a pie chart.

Depends on how the person was trained, my friend.


Of course, yes. And I merely "suspect" rather than have some concrete statistics to back this up.

I have this suspicion based on a few anecdotal data points, plus the fact that digital watchfaces are becoming more and more common (on phones and computers specifically), and people love to do things in a uniform manner.

But I'm very biased here (and thus could be wrong), because I strongly dislike classic analog watchfaces. I'm certainly open to novel approaches - I do like Apple Watch's "Solar" watchface (the digital version), as it has a digital reading centered in what is essentially a gauge showing the Sun's motion, letting me see the time relative to the daylight. Similarly, I enjoy calendar and weather widgets that show the day as a timeline, so I can see events or weather conditions approaching. But I'm always asking myself "what time is it now?" first (and "ok, how long until...?" only comes after, even if that was the original intent) - it has been my mental model for as long as I can remember - so I strongly prefer digital 24-hour displays.


> For example, 189/304 vs [####__]; the latter is probably faster to process.

That’s an excellent analogy. If digital were as quick to process, we would simply have data tables everywhere and no need for data charts.



