The reason you can’t buy a good webcam is the same reason you can’t buy a high-quality monitor outside of LG’s unreliable Apple collab.
The lazy conglomerates who sell these peripherals often don’t actually produce the parts in them. They simply rebrand commodity cameras and IPS panels in a crap plastic housing and slap their logo on it.
Then they give the product a hilariously user-hostile product name, like “PQS GRT46782-WT” as an extra f-you to the user.
They don’t care about you because they have no ongoing relationship with you, and their executives mistakenly see their own products as commodities.
Combine this with the fact that most home users don’t care about good quality or even know what it is, and you have the current situation.
A friend once described the peripheral market as “Assholes selling crap to idiots.”
> Then they give the product a hilariously user-hostile product name, like “PQS GRT46782-WT” as an extra f-you to the user.
That is also a strategy to prevent product comparisons and unbiased reviews. They quickly cycle through product names and sell a given product number only in a limited geographical area.
Doesn't matter if a consumer org/magazine/someone on reddit/your friend/etc. does a review. The product will be out of market by the time you read it, or will not be sold in your country. The similar looking product you find on the shelf might be the same, or it might have something completely different inside.
That's exactly what I have been telling folks about Apple vs. other laptops. Apple only has a handful of laptop models for sale and they don't even change that much across generations. Furthermore they seem to exhibit hardly any manufacturing variations within each generation. That means that if there is a problem (and yes, there were a few big ones), everyone is affected the same, everyone is screaming about it, the majority of customers are not corporate customers, and eventually a class action lawsuit is set up and Apple will often (grudgingly) offer to fix/replace broken units for free, like what happened with Staingate or the Butterfly switches.
Now compare that to buying a model from Dell or Lenovo, where the current product lineup is already 2-3x the size, the models are sometimes discontinued, sometimes changed significantly between refreshes, often refreshed annually, oftentimes configurable in a meaningful way (1080p non-glossy vs. 1080p privacy screen vs. 4k glossy vs. 4k touch screen), sometimes just available in certain geographical locations and they exhibit more intra-generation manufacturing differences. The chances of finding other folks with your exact same permutation (and same day of the week it was manufactured) of these options are much smaller, so you stand less of a chance of getting something recognized as a fundamental manufacturing issue which should be covered for free by the vendor. Plus, even if you can get repair/replacement for free, you still fear that your specific model has a flaw, so you might only get lucky after having it replaced 2-3 times.
I've seen it happen with Dell and Lenovo where folks sent back brand-new units repeatedly because the first one had overheating issues with the SSD, the second one had really noisy capacitors and the third one had a display cable that wasn't seated correctly. At least with Apple I know that if I'm getting screwed, I'm in the same boat with everyone else ;)
That last paragraph is indicative of terrible quality and quality control.
Fortunately that hasn't been my experience with the recent Dell, Asus, Lenovo and HP laptops I've purchased. Each has been without any issues at all.
But my point here is that... it sounds like you're arguing against consumer choice. You can have your Model T in any color as long as it is black. And this is Apple's model: you can have your product in any configuration as long as it's the one configuration Apple offers. Apple tried this, actually, for a long time. The iPhone started out with extremely limited configurations and only more recently branched out beyond two. In a way, I agree with the confusion, because I know now I can't just say "MacBook", since there is at least one MacBook without a suffix, at least one Air, and the MacBook Pro has a myriad of configurations - different sizes, with or without Touch Bar, etc.
Why did Apple start offering more options? Because that's what consumers want. Dell exists exactly because they were the first big PC manufacturer to accept configuration orders, and then (relatively) quickly manufacture and deliver those custom configurations to users. Consumers want this. As Apple's share of the market grows, they will have to meet the consumers where they are - or their market share will be limited by the limitations they place upon themselves.
Now, I agree that, for example, the variety of models between different geographic locations is - if nothing else - annoying. Especially when nicer options aren't offered in your location! But I don't agree with the offered example of getting bad replacements. Maybe buying one laptop a year isn't enough to experience these issues.
Actually, from what I've heard from Linux distro maintainers (mainly from Fedora), it seems like Apple does quite substantial hardware revisions internally without telling anyone: users report that their MacBook 20XX does not work with a Linux distro, while other users report that, on paper, the same device of the same model year works fine.
IIRC the estimate was about 4 hardware revisions per year.
This really complicates the already limited attempts to run Linux distros on the Mac hardware as users simply can't reliably tell if their hardware will be compatible beforehand.
On the macOS side they likely paper over the differences, quirks and bugs in firmware and drivers, so the user does not notice anything and they can change the hardware underneath as needed.
Apple has its fair share of issues, and it's often a pain to diagnose them remotely without the user looking up the specific model code, which isn't all too easy to identify. Revisions often don't change the visual appearance, and Apple certainly doesn't distinguish between different models in their marketing. It's easy for me to look up what common issues Lenovo has had with a specific T series model, but if I'm buying a used MacBook, it's hard to know what I should be searching for until the seller has told me the exact model number. And even still, there's variance between machines of the same model, because sometimes different panels and SSDs are used for the same model number.
> without the user looking up the specific model code, which isn't all too easy to identify
My 7-year-old MacBook has "A1465" written in perfectly legible text on the bottom. "About This Mac" has the serial number two clicks away, which is convertible online to exact specifications.
My old laptop from 2006 had nc8430 written close to the screen hinges. The new one from 2014 has ZBook at the bottom left of the keyboard. Maybe it's one of the differences between $300 laptops and $1000+ ones.
Except what you said is not at all true. Dell, HP, and Lenovo have a whole bunch of different laptop models. They're also very customizable, so the model number may be slightly different if you want different hardware. But you're arguing for less consumer choice. Dell, HP, and Lenovo do not swap models around to try to fool consumers. Quite the contrary; you can search a very old model number and find exactly what you're looking for on their site.
They all cater to different needs. I donated a few laptops to a school in India; these were some of the cheapest models Lenovo had to offer. Lenovo, Dell, etc. cater to a much wider range of the market, which Apple does not even bother to target. You are basically comparing a Tesla with a Camry and upset that the Camry is not as good.
On the other hand, the Lenovo X1 Carbon is a pretty solid high-end laptop, along with the LG Gram. The former specifically is far more customizable, repairable and upgradable, and also around 30% cheaper while being more powerful.
That raises a very interesting point: the highway system. Who owns it? Who can “attach” to it? What you said made me think about line access and Comcast using its mass to keep out Google Fiber etc. I really, REALLY do not want that battle to take place over roads.
The geo-locking of model numbers is one of the vilest practices I've seen. I don't see it going away for any reason, it'll only be possible to combat by intl. legislation... but how do you even legislate that?
I don't see it going away as there are valid reasons for doing so for small but important market differences.
If you're making some device (for example, washing machine) which has a power cord and a knob for some mode selection with writing on it, then for the exact same internals you need different models where the power cords are different (USA, UK, Germany and Japan each require different plugs) and the writing on the device is printed in different language and with customizations such as Celsius vs Fahrenheit. You can't sell the exact same laptop model in every market because the keyboard layouts are different. Etc.
Just append the region identifier to the model number. Internally they clearly need to track the products separately, and you don’t want people getting the wrong product accidentally.
> then for the exact same internals you need different models where the power cords are different
The IEC 60320 connectors were specified for exactly that reason. Honestly, I don't get why these were not made mandatory for all kinds of appliances. There are even locking variants available if vibration is of concern.
> The IEC 60320 connectors were specified for exactly that reason. Honestly, I don't get why these were not made mandatory for all kinds of appliances. There are even locking variants available if vibration is of concern.
I'm not sure what you mean by the second sentence but you can't use most appliances made for Europe in America and vice versa. Most electronic appliances depend on the input voltage and supplying 240V can easily cause a fire. That is true for almost all electronic appliances (water heater, fan, washing machine, etc.) but not true for most "computer related devices" such as a monitor, PSU, or charger. Since those devices already operate on a much lower DC voltage, they often have transformers (not sure if that's the right word) that can step down the voltage from either 120 or 240. [0]
That being said, a mandatory IEC connector (and its variants) would help a lot to cut down unnecessary e-waste. Instead of throwing away a device because the cable is damaged, you can easily order a replacement that costs around $2 and is high quality, instead of relying on third-party cords that might have bad wiring from a non-reputable brand. The reason they are not mandatory, though, is that most companies like to have their own connectors so that you either overpay for them or just buy a new device.
[0]: You should still always read the specs on the input voltage for the device though. It is dangerous to rely on the fact that similar devices can operate at 120V/240V, because yours might not. You can usually see the specs on the website/packaging or near the input plug.
> I'm not sure what you mean by the second sentence but you can't use most appliances made for Europe in America and vice versa. Most electronic appliances depend on the input voltage and supplying 240V can easily cause a fire.
Here in America "electronic appliances" would imply the tech/gadget category like TVs or computers where "electrical appliances" would be the big household equipment. Just to clarify in case that confuses anyone else like it did me, it kind of reverses the meaning of what you're trying to say.
Anyways, at least with relatively modern gear you can generally assume that anything with batteries or USB ports runs off a switch-mode power supply, and all but the cheapest of those will happily accept pretty much anything resembling residential power.
Anything with a large motor or any kind of resistive element (lighting, heating) on the other hand is almost certainly built for a specific variety of electrical service and will likely require modification to accept anything else without releasing the magic smoke.
The stuff in between those categories, well, RTFLabel. Outside of audio and ham radio gear I'd imagine most DC stuff runs on switch mode power supplies these days.
I wonder if there are regulatory reasons that prevent IEC connectors being used in e.g. washing machines. I guess getting a water ingress protection IP rating might be harder if you have an IEC connector. The lack of an IP rating might prevent you from then installing the appliance in, say, a bathroom (depends presumably on country-specific regulation). This in turn might limit your sales.
Even if that's the case, the appliances should be easy to repair for a competent person and if necessary allow the cable be replaced.
My cheap hair drier has a switch to select the input voltage (you need to turn the dial with a screwdriver). For many devices it shouldn't be too hard to make it possible to use them both with 110V and 230V, even more so for already complex and expensive machines like a washing machine.
The biggest problem might be the amount of power a device can draw. Half the voltage gives you half the power, which is the reason why e.g. kettles are much less useful in the US.
For resistive loads (like the heating element in a kettle), half the voltage gives you a quarter of the power. (Electric kettles work just fine on 110/120; they just haven't been a thing in the US. They've been ubiquitous in Canada, although they've been pushed aside somewhat by drip coffee makers. You just need a lower-resistance heating element than would be practical with 220/240.)
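To spell out the arithmetic for a purely resistive element (the numbers below are just illustrative, not from any specific kettle):

```python
# Power dissipated by a purely resistive element: P = V^2 / R.
# Illustrative numbers only: a nominal 2400 W element designed for 240 V.
V_design, P_design = 240.0, 2400.0
R = V_design ** 2 / P_design     # element resistance: 24 ohms

V_half = 120.0
print(V_half ** 2 / R)           # 600.0 W -> a quarter of the power, not half
```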
Electric kettles in the US typically are 1000W-1500W, while in Europe any kettle is 2000W-2800W. This is simply because houses are typically wired with outlets for 10A-16A everywhere, regardless of grid voltage.
That ofc makes electric kettles much less useful in countries with a 110V grid. It also keeps stovetop kettles relevant in those countries, since stoves don't suffer the same power limitations.
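Rough back-of-the-envelope numbers behind that (the circuit and plug ratings below are common examples, not universal rules):

```python
# Maximum continuous power from a wall outlet is roughly volts * amps.
print(120 * 15)   # ~1800 W on a typical North American 15 A circuit
print(230 * 13)   # ~2990 W through a UK 13 A fused plug
print(230 * 16)   # ~3680 W on a typical continental European 16 A circuit
```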
Houses in the US are typically wired with 240v split phase, so nothing is stopping a mad lad from installing a 240v outlet in their kitchen, and running a kettle from it.
This just seems so crazy, to think of 120V. I haven't seen a house here with 120V in my entire lifetime. It sounds like a relic from the past, like the kind of thing people used back when they rode horses.
Hmm, I'm having trouble understanding what you mean. 120V is the standard household plug in North America. If you're in, say Europe, I don't imagine you'd see a house with 120V as IIRC, 220V is the standard, and you generally have to connect to your local grid. I think the parent's post about 120V was about international devices that may have to work in North America, for instance.
> you need different models where the power cords are different
That should only affect the power brick, and hence the overall SKU, not the notebook itself.
> the writing on the device is printed in different language and with customizations such as Celsius vs Fahrenheit.
What writing is there on notebooks except for keyboard labeling and the product name? The certification label on the back already shows all labels for all countries.
> You can't sell the exact same laptop model in every market because the keyboard layouts are different.
Keyboard layouts are somewhat orthogonal to regions. I'm German, but use US-layout keyboards.
Legislation can mandate the conspicuous publication of a clear indication of the difference between models.
Now, companies can of course lie about this, in theory, but that's a bit like car manufacturers lying w.r.t. emission tests - possible, but you tend to get caught (cf. the recent Volkswagen case) so it's probably not worth it.
Here’s my lazy take: the best governing body to start petitioning about this is probably the EU. If I remember correctly, they already have some of the most consumer-friendly laws on the planet, e.g. w.r.t. planned obsolescence.
Even Apple sells different iPhone versions in different countries. It's mostly frequency bands, but Chinese iPhones have a second SIM slot instead of an eSIM, as far as I remember.
It also makes it possible for big electronic stores to give price guarantees as the webshop undercutting them has monitors with a different product number.
It becomes even more unawesome when the same product identifier actually has different parts in it.
When looking for a nice display a couple of years ago, I clearly remember reading about some that were promising yet getting confoundingly variable reports, until someone tore the hardware apart and revealed that the internals were different.
Tip: try to find out what panel the monitor you are buying has. Then look the panel up in a database like panelook.com. This way you can get the specifications without any marketing bullshit.
It also works the other way round. Find a panel that is good enough for your eyes, then see if there's a mass marketed display with that panel. If you are adventurous, you can grab "DIY" or "assembled" monitors with the panels on Chinese e-Commerce sites.
Isn't there a chance that the assembled Chinese monitors actually use second grade panels that the big makers wouldn't accept?
I remember getting a 27-inch 1440p display from a Chinese manufacturer for really cheap back in high school. It should've been the exact same panel as was in Apple's iMacs. However, there were some quality issues with it long term, and it's definitely suffering from burn-in that I don't think the iMacs suffer from.
FWIW Planar makes 27" monitors that use the same 5K and 2K panels used in the iMacs, down to the bonded glass surface. The Planar IX2790 and PXL2790MW respectively. I have the PXL2790MW and if you look closely you can see the glass peephole for the nonexistent iSight Camera. Not sure if it's B grade panels that Apple rejected but it's flawless, maybe I just got lucky.
It is hit or miss. I own two: one is flawless (apart from the image retention, which is apparently the norm in these LG panels). The other I have exchanged 2 times and it still has many, many dead pixels. So in my case 3/4 were bad apples.
I have a couple of these from 2014-15, and they are very, very nice (and as a plus, they matched the dpi of some macbook models at the time, at least). One surprise: they were very heavy compared to other, similarly-specced monitors.
The only issue I had was coil whine coming from a choke on the power supply inverter board. I resolved this by cracking open the monitor and encasing the choke in two part epoxy.
That was the case with noname Korean/Chinese monitors a decade ago that used high quality IPS panels found in Apple and other professional displays - they used rejected panels, which had various issues (mostly dead pixels afaik).
Yep! Still using the "Auria" monitor that I purchased at Microcenter back around 2012. Cost maybe $300-350 for a 2560x1440 IPS monitor at a time when you were easily looking at $500-800+ for a similar panel from a name brand.
Now, if you were a professional, that quality control and warranty (not to mention better ergonomics, etc.) were easily worth the added cost, but for just "some dude who liked playing video games and doing some photo/video editing", it was a great bang for the buck.
I still use this as my main monitor and haven't noticed any dead pixels (if there are any, they're so hard to see that they may as well not be there). It's not the best monitor out there and you can probably get a better 2560x1440 display for less now, but at the time it was a big improvement over the cheap 1920x1080 display that quickly got demoted to secondary (and has now been loaned indefinitely to a teacher friend who needed a second monitor to plug into her laptop for online classes).
Heyo another Auria user here! I actually recently upgraded to a Dell 4k screen, but that Auria served me great for several years and is my secondary setup screen. Got it used for $150, amazing value there!
That brings back memories :). I ended up buying one of these and apart from some weird quirks (only wanted to work over DVI and not with an HDMI-DVI dongle), the image quality was great and so cheap (for the time).
I'm reading this on an IPS panel I bought a decade ago from South Korea. Works great, but with a bit of light bleed in the top left hand corner. I paid extra to have one without dead pixels.
Bought mine on Amazon for $400 six years ago. It stopped showing a picture but I get a white flicker at the base of the screen every few seconds.
I could (and probably should) investigate fixing it but it was easier to buy a 2160p Philips for $240. Only issue with the Philips is it doesn't have a VESA mount and it would be difficult to make some sort of jury-rigging work.
I run them attached to a Mac Mini and use the DisplayPort on the monitor. At one point I believe HDMI (or maybe just the Mac) wouldn't do 1440p. I'm copying stuff from an Intel Mac mini to an M1 and I'm able to toggle back and forth using HDMI for the Intel just fine.
Yeah, a cheap 1440p as a student was certainly a great thing when they were still rather expensive, even though the base was wobbly as all hell and it later developed some issues.
Yes, so often I ask if the seller can provide a "perfect" display, that is, without any artifacts on the display. This adds 100-200 CNY to the price.
There's a Chinese panel manufacturer called BOE that makes products competitive with some of the lower-end Samsung / LG panels.
I got one 15.6" 2160p external display with a BOE panel that offers 100% sRGB coverage. I can see a huge difference compared to my Dell Latitude laptop display.
Now if anyone can find a source of 55" 4K OLED panels, that would be the one ultimate display. Combine it with a VBO driver board and it becomes better than any smart TVs.
>I can see a huge difference compared to my Dell Latitude laptop display.
And outside of a few occupations that might actually require pixel-perfect colour, what does this matter? Is this like the audiophile world, where people argue about seemingly subjective things that no one else cares about?
The customer interprets colours differently than you, the customer sees colours differently than you, and the customer is using a monitor that almost assuredly displays the colours differently than yours. And the world continues to turn.
I don't think the audiophile comparison really makes sense here (and I like to mock audiophiles more than most), simply because display technology still has a long way to go before it reaches the level of audio when it comes to "bang for your buck".
CD quality audio is less than 1 megabit per second per channel, uncompressed. HDR (10 bits per component) 4K60fps 4:2:2 video is around 10Gbit per second of data.
Of course data bandwidth is only a small part of the problem of correctly reproducing an analog signal, but it gives you the orders of magnitude we're dealing with.
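A rough sketch of that back-of-the-envelope math (raw, uncompressed rates; the exact chroma packing varies):

```python
# Back-of-the-envelope raw data rates (no compression).

# CD audio: 44,100 samples/s at 16 bits, per channel.
cd_per_channel = 44_100 * 16                 # bits per second
print(cd_per_channel / 1e6, "Mbit/s")        # ~0.71

# 4K 60 fps, 10-bit 4:2:2: full-resolution luma plus two half-width chroma
# planes -> 20 bits per pixel on average.
video = 3840 * 2160 * 60 * 20                # bits per second
print(video / 1e9, "Gbit/s")                 # ~9.95
```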
I currently use a cheap ASUS 4K display. It's more than good enough for coding, but I wouldn't trust it for any sort of graphical work. The viewing angle is pretty bad, so depending on what part of the screen I'm looking at I see colors differently, and some gradients become more or less visible depending on which part of the screen they're on. Contrast is pretty bad too, making even some video games display poorly: depending on the location and time of day, the contrast always seems too high or too low.
You can buy a good sub-$100 pair of earphones and a sub-$50 DAC and they'll be good enough for 99% of any audiophile work you could ever want to do reliably. If you want to do serious graphics work without having to constantly adjust for your display, you'll have to go for something a lot more expensive than an entry-level monitor.
These differences are very clearly noticeable. I upgraded many years ago from a 72% sRGB to a 99% sRGB Dell IPS and everything looked much better. I just got the LG 27GN950 which is 95% DCI P3... I was mainly getting it for the 4k/144 with the P3 as a nice bonus (I already had 4k/60 on the Dell). Looking at the Dell, I was thinking that P3 might be nice to have but it wouldn't really matter much aside from photo editing - the colors on the Dell already looked great.
I just unboxed the new monitor 2 days ago. The richer color was immediately noticeable, and when I looked at some random photos I took with my phone recently I was blown away by just how red and green and yellow/blue things were. Like a completely new realm of color.
It's one of those things that you can't appreciate until you experience it (same going from the original 72% to 99% sRGB).
The Dell was $450 for 4k, 2.5 years ago. The new LG was $800, but you can find 60fps P3 4k monitors for around $500 these days iirc. If you're on Hacker News you probably use your computer a lot. Unless you're running low on cash, upgrading to a great monitor is worth it.
Seconded. I have 2 LG 27GN950-B's on my desk, and love the 27" 4K HDR @ 144Hz experience (at least on Catalina. Big Sur has completely broken DSC and will only do HDR @ 60, non-HDR @ 95).
Monitor image quality is quite a bit more objective than what audiophiles look for in high-end audio equipment. sRGB defines a specific physical color that ought to be displayed for each RGB value. If you can get a very accurate display for <$1000 just by doing a bit of research, why wouldn't you?
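For what it's worth, here's a tiny sketch of the standard sRGB decoding step, just to illustrate that the mapping from code values to light is precisely pinned down (the full spec also defines the primaries and white point, omitted here):

```python
def srgb_to_linear(c: float) -> float:
    """Decode a normalized sRGB value (0..1) to linear light.

    This is the standard sRGB transfer function; the point is simply that
    the mapping from code values to light is exactly specified, so display
    accuracy is a measurable quantity rather than a matter of taste.
    """
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Example: the 8-bit value 128 corresponds to roughly 21.6% of full luminance.
print(round(srgb_to_linear(128 / 255), 3))
```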
One similarity to the audiophile world is that better quality doesn't always improve quality of life for an ordinary consumer (as opposed to a designer or similar). Sometimes I think I'd be happier if I could just be satisfied with $100 headphones or a cheap laptop display.
> And outside of a few occupations that might actually require pixel-perfect colour, what does this matter? Is this like the audiophile world, where people argue about seemingly subjective things that no one else cares about?
I'm a color blind person and even I can see a color difference between cheap displays that I have at work and an old EIZO one that I bought years ago at home.
I can more accurately differentiate between different colors/shades on my EIZO panel.
I enjoy having a high quality display for all kinds of reasons. Better comfort while programming, accurate colour representation while looking at photos, having a good sense of what things might look like for others (accurate colour means you might be the middle ground of your users experiences, inaccurate colour means you can’t be sure at all), and otherwise, if I’m going to spend a lot on something I’ll own for half a decade I would prefer to get something accurate. The price difference isn’t big enough to justify saving a little bit only to get a poor colour experience.
> accurate colour means you might be the middle ground of your users experiences, inaccurate colour means you can’t be sure at all
Having a setup with multiple cheap monitors is imho really underrated for design and development. Moving something between screens and seeing clear contrast disappear, or see pleasing color choices turn ugly can be eye opening.
Agreed! Back when I was in music school, they brought in Tony Bongiovi[0], a well-known record producer at the time. He talked about how the ultimate test of any recording was to copy it to a cassette, take it out to the engineer's Camaro with one broken speaker, and see how it sounded there. If it sounded good there, it would sound great anywhere else.
Actually iMacs and their display counterparts in the LG ultrafine series are known to suffer from burn-in.
Google iMac or LG ultrafine “image retention” or “ghosting.” I have no idea what percent of displays are affected, but there’s enough threads about it on Reddit and macrumors to make me think it’s pretty common.
For monitors it is more than that. For a pretty expensive 144hz/1440p/gsync category that I researched a couple of years ago there were three options: acer, asus and viewsonic (and unavailable aoc). It turned out that asus, despite being a “better, much more money” brand, did a worse job of mounting the panel, so it had statistically worse backlight bleeding at one edge.
What's wrong with Asus installing alarms in their monitors? I learned the hard way, being woken up in the middle of the night by a loud siren whose source I couldn't locate, as I wouldn't have expected it to come from a frikin' monitor! There is no way to turn that off apart from physically powering the monitor off, and it happens totally at random.
Oh, that’s amazing. I got curious and found this comment on YouTube (from a user called “Battle Angel”), sharing in case it may help diagnose:
> Sorry, not sure why I didn't share previously. So, I believe this is caused by using a non-HDMI cable with the audio out turned on. Either turn the audio in the display off completely in the settings, or use a new HDMI cable. The alarm is a result of the display trying to send an audio signal through a Displayport cable. Are those of you getting this alarm using Displayport cables? They do not pair audio with video, as HDMI does. I hope this fixes your problem.
It took me a while to find something matching those specs too. I got a couple of these when the price dropped below $300 and they're great. I had to watch a few PC-building deal subreddits to catch the deal.
Yeah, if you look into it, you'll find most monitors using the same panels from LG, AUO, Samsung or ChiMei, with some outliers.
When it comes to assembled monitors, the highest failure rate is in the power supply. The components used and the cooling/ventilation play a big part in that.
Yes. Considering most people only care about the resolution, manufacturers sometimes substitute a lower-cost panel that is inferior in, say, gamut or response time.
...and if you're really adventurous, you can buy just the bare panel, get the backlight inverter and "scaler board" elsewhere, and build your own custom monitor. The "3663" seems to be a common model of scaler.
> The lazy conglomerates who sell these peripherals often don’t actually produce the parts in them
> Combine this with the fact that most home users don’t care about good quality or even know what it is, and you have the current situation.
It sounds apt. But... There is an absolutely thriving market for keyboards and mice.
For both, conglomerates like Logitech and Microsoft are selling both what you describe as 'crap', as well as higher-end stuff that tries to care about quality. Possibly not in the way you think is most important, but certainly a Logitech MX Master 3 keyboard retailing at >$100 is not a cheap piece of crap. The letters aren't inked on, for example, thus ensuring they don't rub off particularly easily. Not a feature that is advertised or is likely to show up in a review. The kind of quality move that doesn't make sense if the market is just 'assholes selling crap to idiots'.
Keyboards are even more interesting: there's a lively indie market for custom-built, usually mechanical, keyboards, supported by parts manufacturers, where price isn't particularly important.
I agree, though - the webcam market is quite a mess. So, what's the explanation for that? Why do keyboards and mice not fall under your 'assholes selling crap to idiots' rule?
Keyboards are MUCH easier to produce than LCD displays or cameras.
The problem is that there's only so many companies that truly design and manufacture photographic sensors, lenses and LCD panels. And all the downstream "brands" that assemble this technology into cheap plastic cases to sell to consumers can only do so much to differentiate. Add neon lights for gamers. Make it look dull for business users. Etc. They also sell TVs and have thousands of other SKUs, so they really don't care about any individual product.
Apple is both incentivized (due to their ongoing customer relationships) and able to break out of this mold because they:
a) Produce a smaller number of SKUs
and
b) Do enough volume to control and change what the original equipment manufacturers are producing
I think it’s more of a “the market won’t pay for quality” problem. People won’t pay thousands of euros for a good monitor, so manufacturers have to slap together the parts available in bulk in order to reach price points people will pay. The LG 5K is a good example, because it is clearly compromised to reach a somewhat reasonable price point. From what I can tell the monitor market mostly exists to cater to the generic business monitor and pc gamer markets anyway, as those are the only parts still selling in volume.
Although I have to admit that I was equally frustrated when I wanted a good retina screen with 200-ish dpi to pair up with the mac mini I wanted to buy, only to conclude getting the 5K iMac instead was the most sensible option.
Apple is the most valuable company on earth right now, entirely due to their thesis that people will pay for quality hardware.
The “creator” market is much more profitable than the gamer market where kids only have as much as mom will allow them to spend (vs The tech workers, coders, designers, youtubers, etc that need high quality displays to make a living).
It’s why Apple is able to get insane 50% margins in many products. It’s crazy to me that the big Asian manufacturers don’t see the market opportunity in catering to this crowd.
In their minds you’re either an office drone using excel or a gamer who wants neon lights. Both of which are market segments with terrible margins.
> The “creator” market is much more profitable than the gamer market where kids only have as much as mom will allow them to spend (vs The tech workers, coders, designers, youtubers, etc that need high quality displays to make a living).
This is a common misunderstanding of the gaming market due to stereotyping. The biggest age category in gaming is 18-34 by far. They generally also slant strongly to people with both more than average disposable income and higher likelihood to spend that same on gaming and related electronic toys. This makes gaming a 100 billion dollar market atm which is still growing rapidly.
> Both of which are market segments with terrible margins.
Not even close to accurate. Gaming-related hardware is generally quite high margin. There's a reason ASUS et al. use their gaming imprints as the place to introduce new high-end parts. It's also a highly concentrated and networked market, making it very efficient to advertise to.
There are tons of expensive monitors available that cater to the professional market. The whole premise is not based in reality. Apple simply has the highest mindshare among average products.
What's extremely frustrating is that Apple makes an excellent 27" 5K monitor in a nice housing for $1800. The only problem is that it comes with an iMac...
So clearly, Apple could sell a monitor for about $1600, that would be perfectly compatible with Mac Pros, mac mini and as a secondary monitor for all the various MacBooks.
Maybe it's related to the reason why you can't use the 5K iMac as an external monitor. I can't remember where I found the sources, but the problem was that they essentially needed two video cards to drive the thing, and making sure that both halves of the output looked identical was tricky and not something that an external source would be able to do.
IIRC it was an issue with the video connection. Nothing at the time could provide enough bandwidth to support 5K, so Apple had to cobble something together. With USB-C and TB3, that's no longer an issue.
What does "proper resolution for MacOS" mean in this case? There are tons of 27" 4k that work fine in macOS in Retina mode and matches the medium tier iMac (their low end today is still 21.5" 1080p). Unless you declare everything below 5k subpar, I don't see where you're coming from.
Basically, the ideal PPI of mac displays is a multiple of 110 PPI. So, for retina quality you need a display of roughly 220 PPI, which is what you get from 5K at 27 inch. A 27 inch 4K display is around 160 PPI. If you use that in 2x mode, things will appear too large. If you set it to scaled mode to make things appear the proper size, there are display artifacts (like shimmering when scrolling). In fairness, it's not super obvious unless you know what to look for. But if you're already spending money on a high end screen, why should you have to compromise?
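For reference, the usual PPI arithmetic behind those numbers (rounded; the ~110/220 ppi targets are as described above):

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch from the native resolution and diagonal size."""
    return hypot(width_px, height_px) / diagonal_inches

print(round(ppi(5120, 2880, 27)))   # ~218 ppi -> roughly 2x the ~110 ppi target
print(round(ppi(3840, 2160, 27)))   # ~163 ppi -> stuck between 1x and 2x
print(round(ppi(2560, 1440, 27)))   # ~109 ppi -> the classic non-retina density
```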
That seems like pretty outdated information, given that the out-of-the-box default Retina scaling on MacBooks has been non-2x fractional since 2016 (1440x900 for the 13-inch's 2560×1600, 1680×1050 for the 15-inch's 2880x1800).
I don't have the exact numbers in front of me at the moment, but a 27" 4K monitor will not match the pixel pitch of every other Mac -- screen elements appear larger when both are set at the same scaled resolution.
Yeah, to match the dot pitch apple is designing for, the 4K monitor would have to be more like 22 inches instead of 27. We know this since the 4K iMac is a 21.5 inch screen.
I think "gamer with neon lights" is certainly a segment with amazing margins, especially compared to standard office equipment. Most "gaming" mice/keyboards/chairs/computers/whatever are just decorated and brightly coloured versions of other products with insane markups. Alienware PC's are a great example here - The parts in one of those PC's can cost around half the cost of what the company actually sells it for. Other peripherals have similar price increases once they're branded as a device for gaming rather than office use as they know consumers are willing to pay more. I realize that there are lots of kids who want "gaming" gear (that their parents will pay for) but the PC gaming market is certainly geared towards mid 20's/30's who have the scratch to be able to afford this stuff. Not that these people are stupid or misinformed for doing so, some simply appreciate the aesthetic (even if it's something you or me might not particularly like).
Another thing I find really annoying is when I browse a website and first have to choose a product line when I don't even know what the difference between the product lines are.
Right? Like "is this for home use, office use, or gaming?"
I guess I understand where they're coming from, when most potential customers would likely glaze over and click away if presented with a long list of specs and product numbers.
Still, it's always nice when they at least have a "show all products" link that takes me to exactly that. I want a full list that can be narrowed down with filters.
Why do you think the LG monitors outside of the Apple collaboration are not good?
Recently, I bought two 27GN950, which is a 27" 4k@144Hz gaming monitor with good colors. So far the worst part is the fan and in the long run the absence of HDMI 2.1 might be disappointing, but overall I have the impression of a good product.
Yes, it doesn't have the same PPI as smartphones, but I am not sure if we are going to see that happen ever.
I think it's in part related to fractional scaling, which hasn't been sorted out everywhere. Text at 4K on a 27 inch is too small to run at 100%, and too much of a waste to run at 200% (equivalent to 1920x1080). So you're running 150% or 175%, and that can be an issue if you're running something that doesn't like fractional scaling.
27 inch is perfect for 1440p 100% or 200% (i.e., 5k), but it seems like no one other than Apple has that figured out.
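To make the trade-off concrete, here's a quick sketch of the effective workspace ("looks like" resolution) at those scale factors - plain arithmetic, nothing platform-specific:

```python
# Effective workspace ("looks like" resolution) of a 3840x2160 panel at the
# scale factors mentioned above.
native_w, native_h = 3840, 2160
for scale in (1.0, 1.5, 1.75, 2.0):
    w, h = round(native_w / scale), round(native_h / scale)
    print(f"{int(scale * 100)}%: looks like {w}x{h}")
# 100%: 3840x2160 (tiny UI), 150%: 2560x1440, 175%: 2194x1234, 200%: 1920x1080
```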
Yes, fractional scaling is an issue, but I don't think it is as much of an issue as it was a few years ago. In fact, I think the only application that doesn't scale for me is steam currently. Everything else seems to be handled by setting the correct DPI in the xorg.conf and the scaling factor of KDE (150%).
But as this is clearly a software issue, I wouldn't blame the hardware for it ;-)
But Apple's fans are made from recycled SR-71 Blackbirds to ensure ultimate noise suppression, and each fan blade is assembled by a Swiss watchmaker to ensure quality.
Just kidding, it comes from the same Chinese factory as every other fan, but Apple's fans (pun intended) like to believe in magic to justify the price tag.
I've noticed over decades that high quality stuff comes out of the checks-and-balances of experts specifying and purchasing stuff.
Examples I remember are Sun monitors based on Sony Trinitron tubes, and Sun/SGI hard drives that were always checked - and sometimes returned by the container - so were actually enterprise grade, not consumer grade. Lots and lots of OEM stuff like that.
Early in my career I learned an important lesson: there is no point buying displays from brands other than NEC or EIZO, preferably their upper-tier products. The exceptions to this rule were the Apple Cinema Display and some Dell models. EIZO FlexScans are reliable and rarely have any issues. https://www.eizoglobal.com/products/flexscan/index.html
Unfortunately EIZO doesn't produce a single display with the ideal resolution for MacOS.
27" 5k or 22" 4k (4069 x 2304, like the first LG ultrafine was) are the unicorns I am seeking.
Unfortunately the LG ultrafine suffers from image retention/ghosting. So there's ultimately no great displays for Mac outside of the wildly expensive Pro Display XDR.
For reference, the 16” MacBook Pro has a 3072‑by‑1920 display.
This means the 27” 4K monitors that are the industry standard now for some reason, at 3,840 x 2,160, are almost twice the size, yet have barely more resolution than the MacBook's true Retina screen.
MacOS can do scaling to adjust for this, but it uses the least amount of resources in native or pixel doubled mode. Any display that is between the resolutions I mentioned (like 27 4K) requires fractional scaling.
This is more resource intensive, and doesn’t look as good as pure retina.
Sorry, but this makes no sense at all. For 20 years now I have worked only with Apple-based desktops/laptops, and I am a pixel peeper. Scaling is not a problem even on a MacBook Pro from 2013. If you want to rationalise the purchase of a new Pro Display XDR, there are more factual reasons for it. Don't get me wrong - the new displays are very competitive for the middle of the grading market, but most professionals use separate proofing displays for testing.
I upgraded from a MacBook Pro with a good quality LG 4K 27” display using non-integer scaling to a 5K 27” iMac with 2x scaling. Both provide the same visible screen area and give you the same size icons and text. But the iMac with integer scaling is a better, sharper picture. The difference isn’t huge but it’s noticeable.
That seems a bit drastic, but I agree with the sentiment and would also include laptops. Maybe I'm wrong, since I didn't check all hardware, but Apple still seems to be the only one with rigid quality control: making sure parallel parts are parallel and that mechanical parts withstand a reasonable number of movements. (At least as long as they are not testing a new "innovation" like butterfly keyboards... ;)) That said, I buy stuff used if I cannot buy the high quality version. It falls apart anyway; this way there's no reason to be upset, and it's better for the environment.
If you want to go bargain basement yes. If not, get an XPS or a Thinkpad (of the 'pro' series) and you'll have great hardware. Of course there will always be someone who finds something to complain about, but overall, these are fantastic machines.
Thing is, people complain about there not being high quality gear, but when someone then makes it, they balk at the price. Yes people, a great laptop will cost you $3000.
My organization has been slowly rolling out the use of Cisco Desk Pros which are hardware endpoints that connect to Webex but are in a practical sense essentially monitors with very nice camera modules and microphone arrays built into the bezel. Laptops connected to the monitor with USB C can use the camera/mic. These cost like $4000 though.
I find the video flattering (camera angle/focal length?)
You certainly can buy good high end webcams.
But their market profile here is clearly still B2B and telepresence, not prosumers or individual professionals yet.
Yeah, no. I have a 25" UltraSharp from them and it has a "fun" bug where it advertises to the system that it refreshes at 59.95Hz but in fact refreshes at exactly 60Hz, which leads to the monitor (!!!) freezing for a frame every 20 seconds or so. It's absolutely infuriating in games and movies, and I only found out how to fix it by modifying Windows drivers and forcing it to refresh at a solid 60Hz despite what the monitor advertises. But of course you can't do that with something like a PS4 connected to it. Would never buy another Dell, thanks.
Unfortunately, you must select carefully any monitor that you purchase, and the cheapest models are unlikely to be good choices.
I am using 2 good Dell 4K monitors. One is 1 year old (U2720Q), but the other (UP2414Q) is more than 5 years old and it works as well as it did on the first day.
I don't buy a lot of computers, so this can change at any time without me noticing, but: I think there's basically two Dells. There's the Dell that sells the cheapest equipment you can buy. This Dell sucks as much as anyone else. Don't expect miracles. Then there's the Dell that sells upscale gear. This is usually pretty good, or at least has the ability to be pretty good. I have appreciated the ready access to service manuals and such, too.
I say this because it's unwise to hear that Dell has pretty good gear, then go to their site and buy the cheap stuff. It isn't necessarily any worse cheap stuff than anybody else, but it's not what people mean when they say Dell can have pretty good gear.
Purchased U2720Q after reading through a number of rave reviews and can definitely vouch for it. Excellent 4K image quality and works well with MacBook. It's only the monitor though, so no webcam or audio speakers.
Me too but with Samsung. I bought a cheapish 34-inch curved monitor that had great reviews, but it has almost as much light coming through gaps in the rear housing as the screen. Text looks like crap on its "not quite retina" resolution, especially when it's next to a retina macbook pro, although videos look very nice. I really do wish Apple would make a monitor more reasonably priced than the Pro Display XDR.
You can check the refresh rate as follows. Open System Preferences, then click on Displays. There's a radio button, titled "Scaled". Option-click that radio button, and you'll see a pull-down menu, titled "Refresh Rate". It should be 60 Hz or higher.
Dell has very nice adjustable stands, but the panels are a mixed bag, e.g. the P2416D (1440p) is fine, but the P2415Q (4K) has quite bad ghosting. So annoying I had to disable browser smooth scrolling.
I was surprised to find out the Ultrasharp series includes monitors with 6-bit IPS panels. Rather noticeable, even in desktop use. Previous to that I often told people to "just buy an Ultrasharp of a size you like".
Dell used to have good offerings, but all they seem to push now is the same 27” not-quite-4K 3,840 x 2,160 panels everybody else does. Now even the 22" LG UltraFine that used to be 4096 x 2304 is bigger at 24" and a worse 3,840 x 2,160. The only good option for a Mac is the 27" UltraFine 5K.
27" 4K is a bad size & resolution combination for the current computer market. Windows scaling looks like crap, and macOS has to do more resource-intensive 1.5x scaling (as opposed to native or pixel-doubled mode) to look okay on these.
M1 might make this a moot point going forward, but the fact is that at 27 inches, 5K is the only resolution that will look as good as the screen on your laptop while actually giving you more real estate.
But...that is 4K. It's what 4K is defined as, exactly 2x 1080p resolution in each dimension.
> Windows scaling looks like crap
I don't understand. 2x each dimension (so 1 pixel in the old resolution is 4 in the new) is, like, the easiest possible scenario when it comes to scaling in software.
While DCI 4K is a standard with 4,096 pixels of width, you’re correct that the HD standard (and therefore what is relevant to the discussion here) has always been UHD 4K and 3840 pixels wide.
DCI is relevant for movie industry professionals only, as these are the dimensions used for projection devices and (potentially) their content.
Two groups have competing definitions. One isn't inherently correct.
I say this as someone who was "that guy" when it came to HD Radio: "It's not High Definition, it's Hybrid Digital!" even though that's exactly the confusion they were trying to encourage.
Arguing that this is misleading is a fool's errand, and only plays into things if you assume that the primary purpose of a "4K" screen somehow is inherently "to play back cinematic content", which... it's not.
200% (2 times in each direction) scaling on 4K is the equivalent of 1080p. A 1080p 27 inch monitor has huuuge pixels for the normal viewing distance of a desktop monitor; 1080p is common on 23-24 inch displays. Therefore you are forced to use fractional scaling, which is less than perfect.
>But...that is 4K. It's what 4K is defined as, exactly 2x 1080p resolution in each dimension.
That's irrelevant though, except if we're talking about consuming movies fullscreen.
For a monitor I don't want 4K, I want invisible pixels at viewing distance, so hi-dpi.
I would also prefer no scaling for assets that are bitmap in nature. This ideally means pixel doubling (less cpu/gpu demanding and less fuzzy than fractional scaling).
This, for 27" and above, means a higher resolution than 4K. I don't want to restrict myself to pixel-doubled 1920x1080 on my 27" or 32" monitor.
You do get nice DPI, but needlessly large buttons and other assets (compared to something closer to 5K).
I just completely don't understand your point. There's no misleading advertising here - the resolution is exactly as promised, at the size promised... what's the problem? If the resolution isn't high enough for you... then buy one where it is? There are 5K monitors out there, maybe even 8K? Or just get a 4K one but in a smaller size?
>I just completely don't understand your point. There's no misleading advertising here - the resolution is exactly as promised, at the size promised... what's the problem?
That would be relevant if my problem was false promises or misleading advertising.
But my problem is not
(a) "Monitors say they are 4K and they are not"
but:
(b) "Most monitors out there are BS-4K, but for the best quality/viewing comfort at their 27" and above diagonal they should rather be 5K, but most manufactures like Dell aren't bothered to produce at such a resolution and the few that do have prices to the skies".
>There are 5K monitors out there, maybe even 8K? Or just get a 4K one but in a smaller size?
Perhaps you've skipped through the thread?
My comment responds to (and agrees with) the sub-thread started by a parent commenter writing:
"Dell used to have good offerings, but all they seem to push now is the same 27” not-quite-4K 3,840 x 2,160 panels everybody else does.".
For me it's hard to believe that 4k on 27" is not enough, I use 1440p 27" 144Hz display as daily driver and barely see any pixels (usually only with badly hinted fonts, and even then not pixels but uneven letterforms), because I sit around one meter away from it, and sitting closer makes me turn my head around too much, except when watching movies.
>For me it's hard to believe that 4k on 27" is not enough, I use 1440p 27" 144Hz display as daily driver and barely see any pixels
It's not just about "not seeing any pixels", and "barely see any pixels" is not the same as enjoying hi-res typography and fine detail.
A 27-inch 1440p monitor is about 108 ppi. That's hardly better than what we used in the 90s and 00s, dpi-wise. Sure, if you aren't used to hi-dpi it looks OK. But try using a 5K/27-inch monitor for a while and then go back to 1440p/27-inch to see the difference you've been missing.
Now, 4K hi-dpi (pixel doubled) on 27" is 1920x1080.
This makes the pixels just fine and the detail great, but everything is too large and you lose screen space, as 1440p (which, I presume, you don't use pixel-doubled) gives you 33% more workspace in each dimension.
The solution is either 5K/27" (which gives you back the 1440p kind of screen space and UI control size PLUS hi-dpi), or a non-doubled, fractional resolution (which is not optimal: it looks fuzzier and wastes CPU/GPU).
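And the arithmetic behind that, for the two 27" panels being compared (pixel-doubled, i.e. 2x, mode):

```python
# "Looks like" workspace of the two 27" panels when rendered at exactly 2x.
for name, (w, h) in {"5K (5120x2880)": (5120, 2880),
                     "4K (3840x2160)": (3840, 2160)}.items():
    print(f"{name}: looks like {w // 2}x{h // 2}")
# 5K -> 2560x1440 (the classic 27" workspace); 4K -> only 1920x1080
```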
What matters for perception is angular resolution, not DPI. A 27" display covers more of your visual field than a 17" from the 90s, so you can and should sit further away from it. Once the angle subtended by a pixel is smaller than the angular resolution of your eye, reducing pixel size only adds to the resolution of shades you can show the user in that area (closer to a bpp increase than a dpi increase, because you can't see individual pixels anymore but can still perceive irregularities of brightness on edges).
Yeah, same - 27" 1440p as a daily monitor for work and I have no issues with it. I have had a 27" 4K monitor for a while but it was just too small at 100% scaling, and at 150% scaling some things looked naff. Prefer the 1440p at that resolution.
>I have had a 27" 4K monitor for a while but it was just too small at 100% scaling, and at 150% scaling some things looked naff. Prefer the 1440p at that resolution.
That's what we say too. A 27" 4K monitor is too small at 100% scaling (UI elements are tiny), while at 50% scaling (pixel-doubled hi-dpi mode) the workspace is too small.
That's why the idea is to have a 5K at 50% scaling (so everything is pixel-doubled on each axis, and a pixel becomes 4 pixels, doubling the detail you see).
Wasn't the problem that 5K displays (or maybe it's just this specific one?) are notoriously difficult to make work on Windows? Last time I looked into getting one I found out that it just wouldn't work without getting a Thunderbolt card for my AMD-based system, or a DP 1.4 compatible GPU.
On the other hand, HDMI 2.1 can now support 8K@60hz, so maybe this is not an issue anymore.
Not to the upthread specific claim that the resolution was “not quite 4K”, which is what the comment you are responding to addressed.
On the bigger issue, I don't really see the complaint. I have pretty good vision (corrected—to 20/15 or so—uncorrected is crap but I'm not coding without glasses/contacts) and honestly my 34” ultrawide at 3440x1440 is excellent for coding, and pretty much any other use. Now, would I prefer whatever resolution a 5K 16:9 would be when extended to 21:9? Or better a 4320p at the same aspect ratio? Sure, more pixels are always better. But does the sub-4K display look like crap or force bad sizes for controls? No.
>Sure, more pixels are always better. But does the sub-4K display look like crap or force bad sizes for controls? No.
Sure, I can work with a 3440x1440 34". Heck, I've worked with CGA monitors back in the day, and black and white (!) Sun SPARCstation monitors.
But, as you said, it's about looking better. "Doesn't look like crap" is a pretty low bar, no? For 2020, and after 10 years of hi-dpi phones and laptops, I expected better from monitor companies...
I have an LG 28" 4K and while definitely isn't as nice as my iMac 27" 5K, it works well enough for coding (I'm primarily concerned about text rendering without visible pixels).
I'm using a Dell P2715Q (also 27 inch 4k); it looks fine. But... scaling? The point of having a gigantic 27 inch monitor is that you don't need to scale it. The only problem I do have with the monitor is that it makes me disable scaling on my 15" laptop screen, since there are annoying interactions when you have one screen with scaling active and one without.
> The point of having a gigantic 27 inch monitor is that you don't need to scale it
The point of using a High-DPI display is that you can use scaling without losing the screen real-estate. With 5K @ 27" you can get what looks like 1440p in physical UI element size, but with an increase in clarity, readability, and quality.
27 inch is a small monitor. But I agree, usable real estate comes first, then clarity (by way of scaling). So you want a genuinely large monitor (at least 40 inches) at 8K+.
In my experience, the High-DPI support of Windows 10 is excellent. I am using a 27" 3840x2160 screen set to 150% next to an old 24" 1920x1200 screen at 100%. Pretty much all modern applications seamlessly adapt to the pixel density of the screen they are currently running on without any interpolation.
Didn't know about this company but looking at their history here[1], they used to sell under the brand name Nanao in the US. Nanao made great monitors with consistent high reviews.
I used to have a two-monitor setup with a Dell (? not sure) and an Eizo, both using the exact same panel. I started with one monitor and then got the Eizo. The difference in picture quality and eye comfort was absolutely jaw-dropping. Dell looked and felt like a complete junk in comparison.
Make what you will of this, but it's not just a panel per se, it's also how it's integrated and used in the whole product.
It seems like some company there (besides Apple), should seize the opportunity to differentiate themselves on quality, and deliver supply-chain controlled "boutique" hardware, which I'm certain many would shell out for.
I'm thinking the op was imagining something in the middle ground. Maybe +10–40% for a guarantee of quality. The case I can think of is Anker for cables although in that case they also don't charge a premium either.
There’s no money in it and the XDR is a vanity project. Too expensive for the person who isn’t grading marvel movies but not quite hitting the actual specs needed to grade a marvel movie.
Meaning the only people buying it are top tier youtubers
Let me chime in: I just bought a T14s. It could be a dream machine, but the shite 1080p panel (Windows recommends a hilariously crappy 1.5 scaling. Kidding me?), the pesky trackpad and the awfully glitchy Windows 10 (yeah, probably the drivers, but the platform enables this horror) destroy the value of this otherwise pretty solid device.
> Windows recommends a hilariously crappy 1.5 scaling.
Windows has multiple GPU-accelerated vector graphics GUI frameworks. Well-written Windows apps look fine with non-integer scaling.
> awfully glitchy Windows 10
Here's what you should do with new computers.
https://www.microsoft.com/en-us/software-download/windows10 Make an installation USB drive, boot from that drive, remove all partitions from the laptop's SSD, perform clean install. You don't need product keys to reinstall Windows as long as the SKU matches (i.e. if you have Win10 home edition, reinstall the same edition).
Don't just blindly click through the wizard, read messages and you'll get better UX (you don't want cortana, personalized ads, geolocation, etc).
Connect to internet, run windows update.
Open device manager. If some devices are left in "unknown device" state, you might need to manually find their drivers. Make sure to only install drivers and not user-mode utilities.
It glitches, together with the keyboard. It’s like events start piling up as the UI loop locks up (for up to 1 sec.) then suddenly they rush through and the pointer wanders around and keystrokes fall through at a constant rate. Other times they’re barely lagging enough to feel it.
Awful, but I remember similar issues on Dell and an HP. It’s an issue with the device driver or some “value add” driver control software. :(
Went looking for a monitor recently & was so sad to see that there are fewer than a dozen monitors with full-array dimming, & most of those have 16 zones or fewer.
I ended up going with a budget option, no local-dimming. It's frustrating how behind, how stagnant, computer displays are. I don't want to sit in front of a 48 inch OLED tv, too big, not high enough dpi, but I feel like I'm throwing money at bad products trying to buy a computer monitor. At least there are some fair budget options (Pixio PX275h 95Hz 4k, $250, doing ok).
I spent 6 months learning about monitors before buying one for graphic design and the tldr version is: you buy from NEC or EIZO. The panel is one thing, but what sets them apart from the other ones are the electronics inside that drive the panel. And QC, the commodity brands tend to be very hit or miss.
Ahh the LG monitor from the Apple collaboration. It’s so great. So different than other monitors. Stable, reliable, solid. A great product. You can literally see how Apple forced their product tenets on LG.
Sadly, it’s not available anymore. I have one in my office but needed another one for home. Ended up buying an LG with the funny name of something 27UKi6716263 that looked similar on amazon. It’s so different... what a shame
I thought it was the exact same as the LG 24UD58-B except with Thunderbolt connectors instead of DisplayPort and HDMI? That (the non-Thunderbolt model) is the monitor I have; it's the smallest (24") 4K monitor I could find - I wanted high DPI, and I got it.
Unfortunately at 3840 x 2160, it's not ideal since in pixel doubled mode (retina), you're only getting the equivalent of a 1080p display.
The 22" LG Ultrafine used to have a 4069 x 2304 resolution. So in pixel doubling mode you actually got more screen real estate than newer 24" 4k models (which are only 3840 × 2160)!
Hmmm. I actually quite like 2160p as a simple upgrade over 1080p; usually I just solve the screen real estate problem by buying more monitors. You can get the 24UD58-B for about $200 used on eBay, so this is not a large cost.
Eh, do consumers really want them? Like, not pros who do streaming for a living etc., but people doing conferencing or the occasional capture?
They will hit local, ISP, or teleconferencing bandwidth caps well before sending all the bits a webcam captures, with subsequent recompression to crap quality.