(a) Does this study sound plausible?
(b) Would staring at the correct red color on a monitor work? Or would that not work, since monitor colors are generated by mixing R, G, and B pixels and are therefore not a pure wavelength?
(c) Would it be a reasonable methodology to test one eye to see if it improves relative to the other eye?
Note that it's fairly common to see chromaticity diagrams labelled with wavelengths around the edges, so you can check multiple sources if you want.
I didn't read the original article in sufficient depth to tell you if the researchers even know what 615nm would do, but it's not 670nm anyhow ;-).
Incidentally, the linked article also includes measured spectral responses for a few wide-gamut screens, some of which have at least some output near 670nm. The most extreme on that page was the Dell 3007WFP-HC LED, which peaked at 653nm and has a broad peak extending well above 700nm (and below 650nm as well). A 2019-model Samsung Q80R QLED TV (perhaps more common) had its peak at 632nm, and at 670nm it is about 20% as bright as at its peak.
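To get a feel for how sensitive that fall-off is to the shape of the red primary, here is a toy sketch of my own (not from the article): it models the primary as a Gaussian spectral power distribution, which real panels are not, so the peak wavelength and FWHM values below are assumptions to be replaced with datasheet or measured numbers.

```python
import math

def relative_output(target_nm, peak_nm, fwhm_nm):
    """Relative spectral output at target_nm for a primary modeled as a
    Gaussian centred on peak_nm with the given full width at half maximum.
    A crude stand-in for a measured spectral power distribution."""
    sigma = fwhm_nm / (2 * math.sqrt(2 * math.log(2)))  # convert FWHM to sigma
    return math.exp(-((target_nm - peak_nm) ** 2) / (2 * sigma ** 2))

# Assumed example: a red primary peaking at 632 nm, evaluated at 670 nm,
# for a few plausible emission widths.
for fwhm in (20, 30, 40, 50):
    print(f"FWHM {fwhm:2d} nm -> {relative_output(670, 632, fwhm):.1%} of peak at 670 nm")
```

The answer swings from essentially nothing to roughly 20% of peak depending on the assumed width, which is exactly why the measured curves in the article are worth more than any back-of-the-envelope estimate.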
(d) How much does the precise wavelength matter? The study used 670nm but also mentions the range 650-1000nm. Lots of red-light gadgets are available on Amazon but how do you verify the wavelength? I was thinking of an optical spectrum analyzer[1] but at $28,000 they are shockingly expensive. Even with raw LEDs bought from a reliable electronics distributor like Digi-Key, it would be nice to be able to measure it somehow.
You can get a 1nm-resolution "science grade" spectrometer for under £1k - I used to design them [1]. You can also buy used Ocean Optics kit for very reasonable prices on eBay (a few hundred for a USB2000 if you wait).
You might find this of interest - it seems to be a cottage-industry spectrometer selling for £67: "The i-Phos can see wavelengths from approximately 420 - 980nm and their relative (though not their absolute) intensities." http://chriswesley.org/spectrometer.htm
I suspect you do not need to use exactly 670nm; however, I suspect you do need to be above 650nm. Note that many red LEDs are at a shorter wavelength (~630nm), so you need to search for "deep red" or by wavelength.
You can use an OSA, but yes, they are expensive. Spectrometers are typically cheaper, but still expensive (you could go to a university optics group and ask if they could measure it for you). That said, if you buy from a reputable source you should get the right wavelength.
If you are trying this out, be careful with the brightness.
Shouldn't it be quite cheap to determine the frequency using a prism, from the angle of refraction? I think you should be able to use known missing frequencies in sunlight (the Fraunhofer lines) to calibrate your setup. Back in a minute, off to DuckDuckGo.
So it looks like you need a bit more than a prism to see Fraunhofer lines, but there seem to be online descriptions of how to do it with a prism and a CD. Maybe you don't even need a prism: there are 'make your own spectroscope' tutorials that mention building one from a cereal box and maybe a lens of some kind.
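If you do go the DIY route, the underlying arithmetic is just the diffraction grating equation, d·sin(θ) = m·λ. Here is a rough sketch of my own (not from any of the tutorials above); the groove spacing of a pressed CD (~1600nm) and the measurement geometry are assumed placeholder numbers, and you'd want to calibrate against a source of known wavelength first.

```python
import math

# Groove spacing of the improvised grating, in nanometres.
# A pressed CD has a track pitch of roughly 1600 nm; a DVD roughly 740 nm.
GROOVE_SPACING_NM = 1600.0

def angle_from_geometry(offset_mm, distance_mm):
    """Diffraction angle from the spot offset on a screen a known distance away."""
    return math.degrees(math.atan2(offset_mm, distance_mm))

def wavelength_from_angle(theta_deg, order=1, d_nm=GROOVE_SPACING_NM):
    """Grating equation d * sin(theta) = m * lambda, solved for lambda."""
    return d_nm * math.sin(math.radians(theta_deg)) / order

# Assumed example measurement: the first-order spot of an unknown red LED
# lands 180 mm from the central (zeroth-order) spot on a screen 400 mm away.
theta = angle_from_geometry(180, 400)
print(f"diffraction angle: {theta:.1f} degrees")
print(f"estimated wavelength: {wavelength_from_angle(theta):.0f} nm")
```

The main error sources are measuring the two distances and knowing the true groove spacing, which is why calibrating against something known (a green laser pointer near 532nm, say) is worth doing before trusting the LED number.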
(b) Unlikely; monitors work by essentially filtering out undesired colors from white light, and I don't believe that >650nm light is within the color gamut of typical monitors. That's quite a long wavelength, close to infrared. To give you an indication, the helium-neon lasers that used to be very common in schools etc. for laser demonstrations have a wavelength of 632nm.
Even if the monitor could display this, the output would likely not be bright enough.
(c) Could be, but it might be quite annoying to do.
There are several different approaches to monitors; on OLED panels the RGB sub-pixels individually emit light: https://en.wikipedia.org/wiki/OLED. That said, your wider point stands.
Might be a good idea to stare at the light reflected from a white sheet of paper. The wavelength won’t change, but the intensity (and FOV coverage) certainly will.
This only makes sense on a precisely calibrated display. On everything else, there's no way to tell what wavelength you actually end up with when you fill your screen with a color.
A perfectly calibrated display might still not be able to produce pure wavelengths. At the very least, if it's calibrated properly it won't normally do so at #ff0000, since pure 670nm is well outside the sRGB colour space.
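To put a number on that, here is a rough sketch of my own using approximate CIE 1931 colour-matching values at 670nm (rounded from memory, so check them against a proper table or a library such as colour-science) and the standard XYZ-to-linear-sRGB matrix; a negative channel value means no mix of the sRGB primaries can match that colour.

```python
# Approximate CIE 1931 2-degree colour-matching function values at 670 nm
# (rounded; look them up in a real table for anything serious).
X, Y, Z = 0.0874, 0.0320, 0.0000

# Standard XYZ -> linear sRGB (D65) conversion matrix.
M = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

rgb_linear = [sum(m * c for m, c in zip(row, (X, Y, Z))) for row in M]
print("linear sRGB:", [round(v, 4) for v in rgb_linear])
# Negative green/blue components mean monochromatic 670 nm lies outside the
# sRGB gamut: the primaries can only approximate it, never match it exactly.
print("inside sRGB gamut:", all(v >= 0 for v in rgb_linear))
```

Filling the screen with #ff0000 just drives the red primary to full power, and what you actually get is that primary's spectrum, which is why the measured panel curves mentioned earlier in the thread tell you far more than any RGB value.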