Major Linux Problems on the Desktop, 2020 edition (itvision.altervista.org)
115 points by spekcular on May 31, 2020 | hide | past | favorite | 177 comments


This list exaggerates many issues in an attempt to inflate its length.

It's totally bloated, full of baseless personal opinions.

A few examples:

- "Nvidia open-source driver nouveau.." - why include it if not to bloat the list?

- "Under Linux, setting multi-monitor configurations especially using multiple GPUs running binary NVIDIA drivers can be a major PITA" - Bollox. Gnome/Mate Display-Manager does an excellent job. If you're still using X11 manual configs then what do you expect..

- "Linux drivers are usually much worse (they require a lot of tinkering..)" - Broad statements are usually wrong. Especially this one, based on an opinion from the '90s.

- "Resume after suspend in Linux is unstable and oftentimes doesn't work." - Bollox. Resume from Hibernate - maybe. But suspend works perfectly most of the time.

- "X.org is largely outdated" - duh... the whole list on this topic is dated issues resolved by WM.

- "Adding custom monitor modelines in Linux is a major PITA." - Is it? A one-line command? And how do you do it on Windows? Oh, you can't...

- "Applications development is a major PITA" - this is ridiculous.


> - "Nvidia open-source driver nouveau.." - why include it if not to bloat the list?

Because the proprietary Linux driver doesn't support Wayland. So you have to choose either the proprietary driver that only supports Xorg (with all its limitations; for example, if you have multiple monitors with different DPI there is no way to scale them correctly) or the open-source nouveau driver with the problems associated with it. To stop the nouveau driver from crashing my PC after 10 minutes I had to download a firmware file from a bug report on the internet; simple, isn't it? And it's still barely usable: watch a 60 fps video on YouTube and it lags.

> Bollox. Gnome/Mate Display-Manager does an excellent job. If you're still using X11 manual configs then what do you expect..

If you have monitors with different DPI there is no way to make them work correctly on X11. This is a problem with how X is designed, in an era when you didn't have HiDPI screens. Also, if you have a laptop and you have to connect to a projector, you don't want to spend 20 minutes tinkering with the Xorg config while the other people at the meeting with their MacBooks laugh at you.


> > - "Nvidia open-source driver nouveau.." - why include it if not to bloat the list?

> Because the proprietary Linux driver doesn't support Wayland.

Do you have a source for that? All the sources I've seen say it works, if you set nvidia-drm.modeset=1
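For anyone wanting to check this themselves, a minimal sketch of setting that option persistently; paths assume a Debian/Ubuntu-style system, adjust for your distro:

```shell
# Enable DRM kernel modesetting on the proprietary NVIDIA driver,
# which Wayland compositors require.
echo 'options nvidia-drm modeset=1' | sudo tee /etc/modprobe.d/nvidia-drm.conf
sudo update-initramfs -u    # Debian/Ubuntu: rebuild initramfs so the option applies at boot
# After a reboot, verify the module picked it up ("Y" means enabled):
cat /sys/module/nvidia_drm/parameters/modeset
```

Alternatively the same `nvidia-drm.modeset=1` can go on the kernel command line via GRUB.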

> > Bollox. Gnome/Mate Display-Manager does an excellent job. If you're still using X11 manual configs then what do you expect..

> If you have monitors with different DPI there is no way to make them work correctly on X11. This is a problem with how X is designed, in an era when you didn't have HiDPI screens. Also, if you have a laptop and you have to connect to a projector, you don't want to spend 20 minutes tinkering with the Xorg config while the other people at the meeting with their MacBooks laugh at you.

I'm not sure what you're doing; I guess you're just repeating something you read 15 years ago. I haven't had to edit an xorg file in at least that long, and I'm using Linux as my daily driver. Plugging in a projector also works (both Wayland and Xorg); what was causing problems was the scaling of some programs if you dragged them from a HiDPI screen to the projector, but just restarting the app would fix this (and this has been fixed for quite a while). As a side note, I've seen lots of issues from Mac users; funnily, if that happens, Mac users almost always blame it on the projector, it's never the fault of the Mac.


> So you have to either choose to use the proprietary driver that only supports Xorg ... or use the open source nouveau driver with the problems associated with it.

You also have a third option: not using Nvidia hardware at all. In the end, it's you who is voting with your money; you gave Nvidia your money, so the attitude you are getting from Nvidia is between you two.

> there is no way to make them work correctly on X11.

Technically, there is, but it is user-unfriendly (set the framebuffer sizes of all displays to match the largest-DPI one and downsample on scan-out, via xrandr) and a non-systematic hack.
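For the curious, the hack looks roughly like this; the output names (eDP-1, HDMI-1) and resolutions are assumptions, check `xrandr --query` for yours:

```shell
# Mixed-DPI hack on X11: render everything at the HiDPI scale and let the
# low-DPI monitor downsample on scan-out. eDP-1 is assumed to be a 4K laptop
# panel, HDMI-1 a 1080p external display.
xrandr --output eDP-1  --mode 3840x2160 --pos 0x0 \
       --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0
# HDMI-1 now occupies a 3840x2160 region of the framebuffer, so windows keep
# a consistent size when moved between screens, at the cost of some blur.
```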

> This is a problem about how X is designed, in an era where you didn't have HiDPI screens.

That's why you should switch to Wayland. X11 is dead; in the coming years it will be just a few libraries so legacy applications have something to link against.

> Also, if you have a laptop and you have to connect to a projector, you don't want to spend 20 minutes tinkering with the Xorg config while the other people at the meeting with their MacBooks laugh at you.

Never had that problem. In fact, I haven't edited an xorg config since the early 2000s; I think it hasn't existed on any of my machines for a decade-plus already.


What's the issue with the projector? I plug mine in and... it appears. (Scaling issues aside)


Eh, you have a 4K laptop, you connect the projector and everything is huge.


this is about scaling issues :facepalm:


Ok, so what's the issue? It's a bit inconvenient when your laptop screen changes DPI on connection, but in the projector scenario you're not working between those two screens. One is a full-screen presentation / video, which gets scaled anyway, and the other has... full-screen speaker notes?

I get why it's a serious problem in a laptop-and-monitor side-by-side setup, but here?


> Because the proprietary Linux driver doesn't support Wayland.

Sure, so don't get NVIDIA hardware for Linux. People don't compile long lists about how the Mac doesn't really have an NVIDIA option; it's understood that your hardware is restricted in that way on the Mac, so you're going into it knowing that. Well, Linux is the same way: just exclude NVIDIA from your list unless you're on a desktop, fine running X and fine with their proprietary driver.

This is not something we can reasonably fix without NVIDIA's direct involvement, so I see no point in including it. It's like complaining that the Raspberry Pi can't run your favorite x86 programs.


You can avoid computers with NVIDIA hardware, but imagine the average person who wants to switch to Linux from Windows: he already has a Windows computer with an NVIDIA GPU, then tries Linux on it, nothing works, and he goes back to Windows.

But the problem is not only NVIDIA; the problem is Xorg in general, which is too old to support new hardware like HiDPI screens, and Wayland is not yet stable enough for me.


> he already has a Windows computer with an NVIDIA GPU, then tries Linux on it and nothing works

He doesn't try installing macOS on his Windows PC, so perhaps we need to educate people to buy Linux hardware if they want to try Linux.

Also, "nothing works" is hardly true. The proprietary driver works for the most common use cases for such a user, i.e. desktop, running X11 and mostly gaming, watching Netflix, etc.

You run into problems on laptops with switchable graphics or if you are into the latest and greatest and want Wayland, but in that case one should seek Linux-specific hardware.

Regarding Wayland, I have various machines running amdgpu or Intel's i915 no problem, so tailoring your choice of hardware to what's well supported would help here.


"Linux hardware" is meaningless? I don't think there's any such thing as "Windows hardware". The drivers are supposed to abstract that away.

If Linux doesn't work on hardware, it's for the usual reason: Linux receives little love in the market, so drivers lag behind Windows drivers in coverage and features.


> I don't think there's any such thing as "Windows hardware". The drivers are supposed to abstract that away.

Microsoft disagrees with you[1][2], as do most online retailers of electronic goods who list supported OSs on their product pages and still include Windows in that list.

[1] https://docs.microsoft.com/en-us/windows-hardware/drivers/da...

[2] http://i1-news.softpedia-static.com/images/news2/Certified-f...


> I don't think there's any such thing as "Windows hardware".

The majority of laptops you can pick up at BestBuy are "Windows hardware" - they're indeed specifically manufactured to run Windows and hardly anything else. If you want to test this, try installing Windows on your ISP's router or on a TALOS II workstation and see how well Windows "abstracts that away".

> the usual reason that Linux receives little love in the market so drivers lag Windows drivers

Which is directly related to most PCs coming with Windows preinstalled, so most people are either unaware of Linux or see little reason to switch to it, since Windows already runs their Chrome anyway.

What you're missing is that the majority of PCs are made with Windows in mind. If the majority of PCs came with Linux or BeOS or OS/2, Windows would be in a similar position, maybe even worse, since companies would not get the benefit of open code and would therefore have even less reason to contribute to a minority OS.

If Linux were the standard, many people would be familiar enough with it, and happy enough, to see little reason to switch to Windows; that is, if they ever learned an alternative exists in the first place.


So? This is the world we live in. Linux gets no love. It's less performant and less desirable on the desktop/laptop for this reason.


Performance is entirely dependent on use case; there are many where Linux outperforms Windows, especially if you're a developer. But why it is less "desirable" has mainly to do with what comes pre-installed on 99% of Walmart PCs.

Pre-installed is king; if there's an extra step involved for the competition, it's not a game one can easily win.


The main reason stuff is mostly Windows-compatible is that driver management is a mess (different distros with different drivers and/or Linux kernel versions). The ABI/API issues that people complain about are not helping this.

Even if there are drivers, most of the time they are developed by external people who only wanted to get things working _to a certain level_. Manufacturers don't care about Linux, most of the time, because it's hard to develop an installer that is compatible with _Linux_.

I have a story about a friend of mine who tried to write an LED driver for the BeagleBone Black. He looked for the documentation, and there was some stuff about an old, deprecated way of toggling GPIOs (pass a pin ID) and a new interface (gpiod, using pinctrl and an identifier). The documentation also talked about ACPI, which was totally unrelated[1]. He tried to do it "the right way" (using the latest and greatest API), but it was not clear how to use this API; he was experienced with switching registers manually, the embedded way, and had little experience with the Device Tree.

Eventually he wanted to use ioremap to just do the GPIO toggle _manually_, because it was better documented (read the chip documentation and done). He got this idea from his time as an exchange student at a Korean school, where it was _common practice_ among teachers!

A small example of the GPIO madness; google for "linux gpio kernel": 1. A generated page of the API, but also some implementation documentation. [2] 2. A readme file from the kernel, telling you that this is not the documentation you are looking for (old API). [3] 3. An actual article that explains the old and new interfaces, but still all userspace. [4] 4. LWN docs from 2013 using the old API. [5]

[1] https://www.kernel.org/doc/html/v4.17/driver-api/gpio/board....

[2] https://www.kernel.org/doc/html/v4.17/driver-api/gpio/index....

[3] https://www.kernel.org/doc/Documentation/gpio/sysfs.txt

[4] https://embeddedbits.org/new-linux-kernel-gpio-user-space-in...

[5] https://lwn.net/Articles/532714/
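To illustrate the old-vs-new confusion from userspace, a sketch of both interfaces; the pin number 60, chip name and line offset are made-up examples (the mapping is board-specific), and the gpioset invocation uses libgpiod v1 tool syntax:

```shell
# Old, deprecated sysfs interface: address a GPIO by a global pin number.
echo 60  > /sys/class/gpio/export
echo out > /sys/class/gpio/gpio60/direction
echo 1   > /sys/class/gpio/gpio60/value

# New character-device interface via the libgpiod tools: address a line by
# chip and offset. List what your board exposes with gpiodetect / gpioinfo.
gpioset gpiochip1 28=1
```

Two parallel interfaces doing the same thing, with the documentation split between them, is exactly the kind of thing that trips up newcomers.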


Your argument is rather muddled, to say the least. You start off talking about consumer hardware, then bring up an anecdote about a developer writing userland software against a developer board (i.e. non-consumer hardware), and use that as an argument that the Linux kernel is hard to develop against, despite there being nothing in your anecdote discussing kernel development.

"Anecdata" is always going to be pretty sketchy in any technical debate, but here it seems especially so, since this isn't even your own first-hand experience (let alone relevant to the point you were trying to prove).


The main reason is, as I stated, because Windows is where the market is.

You're categorically wrong about hardware vendors not supporting Linux; in fact, many Linux developers are employed by these companies, be it Intel, AMD, Linaro, Qualcomm, etc.

The problem usually is that one needs the right combination of components for things to work well "out of the box", so, for example, avoiding certain GPUs or WiFi chipsets (others work fine).

This is not up to the individual component manufacturers, who are already contributing upstream, but up to the OEMs assembling the complete machines to avoid the ones who don't. Many simply do not consider Linux when doing so, because customers are accustomed to Windows and they get a good deal on it from Microsoft anyway, so why bother.

In recent years, Dell has stepped up on this front, Lenovo is coming around, and you even have dedicated vendors like System76 selling good hardware with Linux pre-installed.

Your "story" is a problem with a specific vendor (BeagleBone) and with ARM in general, which is a bit of a wild west compared to x86; but that's the nature of ARM, which Linux tries to accommodate, not a Linux problem specifically.

I do wonder why your friend did not install Windows on the BeagleBone and interact with the hardware that way? It seems like the experience there is so much better.

What is that? The Beaglebone does not support Windows at all? Well, I guess Windows hardware support is rather poor then.


It's the other way around for BeagleBone support. TI has apparently used their own interrupt logic, and Windows IoT Core doesn't seem to support that. ( https://blogs.msmvps.com/dvescovi/2018/01/20/beaglebone-and-... ).

I was mainly talking about how development on the Linux kernel can be tedious; the adoption of the drivers is usually limited and thus the gains are too low. How come Realtek is one of the biggest manufacturers of network chips, but their drivers on Linux are still crap? Everybody knows that, and if they don't yet know, they usually get bitten by it one day, use Google, and _accept their fate_. Users readily accept how things are and don't bother to improve them because it's too hard/complicated/not rewarding.

The people in control of these issues are the developers. The only thing I hear about Intel/Nvidia from Linux people is complaints about how shitty they are, and why they don't open up their stuff. One key thing might be that debugging kernel things is not easy on Linux, as listed in the original post's URL? Crash a Windows box, hook a debugger up to it and off you go. If you don't know enough, I'm sure Microsoft will help you get it running when you pay them. Oh? Linux doesn't work that way? Bummer.

Maybe the abstraction of Linux's core is at the same time its main weak point. There's nobody governing a stable API, and there's not enough telemetry to detect regressions. Yes, I said it: 99% of people don't care if there's telemetry reported if it can improve the product. As a developer myself, I would _love_ to have _real-world issues_ reported to me, automatically.

Anyhow, nothing is perfect. Linux could be better if there were a team behind it that wanted to get it "production ready". Like, maybe tell the user why something doesn't work, instead of shouting on the internet that they should buy "Linux-compatible hardware", which is like telling the regular user to buy blinker fluid.


> I was mainly talking about how development on the Linux kernel can be tedious; the adoption of the drivers is usually limited

You literally provided an example that does not work in your favor to prove your point.

> Realtek is one of the biggest manufacturers of network chips, but their drivers on Linux are still crap?

That hasn't been the case for years; rehashing the same arguments you had a decade ago, or simply being this clueless, does not help your case.

> The only thing I hear about Intel/Nvidia from Linux people is complaints about how shitty they are, and why they don't open up their stuff.

You don't really know what you're talking about, and it shows. Intel is fine; they have dedicated developers supporting Linux with open-source code. The problem is the Intel+NVIDIA laptop combo, and this is due to NVIDIA, not Intel.

NVIDIA support is crap because NVIDIA wants it to be crap; they sell enough GeForce GPUs to gamers not to care, and there's nothing Linux can really do about that. It seems that Intel and AMD have no problem shipping a kernel driver. It's a culture thing, more than anything.

And yes, we want the driver to be open, caring about free software is one of the major reasons to go with Linux.

> One key thing might be that debugging kernel things is not easy on Linux, as listed in the original post's URL?

Much, much easier than on Windows; you even get DTrace, get to inspect the code, etc. That's not the issue. Anyone who wants to provide a driver has no problem providing one.

ARM is its own beast, but as I said earlier, this is mostly due to there not being a standard way to e.g. probe hardware like there is on x86, and that's ARM, not Linux, at fault here.

Does Linux have problems? Sure. But as someone who has had family members on Windows 10 call me because they lost their files after an update, and with the internet full of complaints about the latest macOS being the new Vista, I wouldn't be too sure the competition fares better.

> Yes, I said it: 99% of people don't care if there's telemetry reported if it can improve the product.

Maybe they don't care, but I'd argue they seldom even know about it or understand the implications.

> As a developer myself, I would _love_ to have _real world issues_ reported to me, automatically.

Part of the free software movement is about equality between users and developers, because developers have unquestionably more power. This is more of an ideological thing than anything. Nobody's saying telemetry isn't useful, just that it should be up-front and opt-in, with full disclosure.

A large part of us care about that, about privacy and the future of unrestricted, general-purpose computing.

I am sure many don't, and for them Windows and macOS are there. If Linux doesn't work well for you, that's fine; what I find annoying is pretending that Windows/macOS are somehow way less annoying to use, when every time I use them I come across crappy behavior.


>Sure, so don't get NVIDIA hardware for Linux.

This is one of the excuses people make, and it's condescending. Most of the time you already have the hardware before you want to try Linux. Users don't care who's at fault here (kernel/distro/X11/NVIDIA); the end result is that it doesn't work, as opposed to Windows.


I have said this many times, but it's worth repeating; people don't seem to have a problem with not being able to run macOS well on their Windows PC.

It's condescending to the Linux community to always have to deal with users who don't put in even the minimal amount of effort to get Linux-compatible hardware and only install it on crap machines Windows doesn't work well on anymore.

I find that extremely disrespectful of the community and of my time, and do not wish to support such users. A large part of us do this in our free time out of passion; asking the user to put in the minimum amount of effort necessary really isn't much.

Proprietary companies who don't care one bit about their customers get the benefit of the doubt and people buy only supported hardware, but Linux is always supposed to care about the second-rate hardware of users who don't care about anything but "free as in beer".

This may not be the most popular thing to say, but I honestly don't care about such users anymore.


>I have said this many times, but it's worth repeating; people don't seem to have a problem with not being able to run macOS well on their Windows PC.

Well, Apple does not provide an official release for those PCs, so the analogy is not apt.

>This may not be the most popular thing to say, but I honestly don't care about such users anymore.

And as a result Linux still only has 0.x% market share. "Fuck you Nvidia" and public shaming might help more.


"Linux" does not provide an "official" release for your crappy PC either. If it's x86, it happens to be compatible, but guess what? macOS is x86 as well, hence the whole Hackintosh community.

The difference being, Hackintoshers seem to know they need specifically supported hardware if they want to have a good experience.

> Linux still only has 0.x% market share

Linux will have that market share regardless, because 99% of Walmart PCs that Karen buys simply come with Windows. If the reverse were true, guess what market share Windows would have?

> "Fuck you Nvidia"

It's not like NVIDIA wasn't endlessly approached nicely first, offered assistance by the community etc. So at this point, fuck NVIDIA indeed.

I've been running Linux full time for close to two decades and it has worked sufficiently well for me all that time. For anyone for whom that's not the case, I am willing to help to an extent, provided there is some good-faith effort on their part as well, since I am donating my free time.

If not, be my guest with your bug-free Windows 10[1] and the well-liked macOS Catalina[2].

1 - https://www.tomsguide.com/uk/news/what-the-hell-windows-10-u...

2 - https://www.macworld.co.uk/news/mac-software/problems-macos-...


You can try to compare it to macOS all you want; the comparison is still wrong. For starters, I can't just go to Apple's website to download an image that will boot on my PC.

"Linux" (as in the kernel), no. Distributions, yes. Just look at their websites when downloading.

>Linux will have that marketshare regardless, because 99% of Walmart PCs Karin buys simply come with Windows. If the reverse was true, guess what marketshare Windows would have?

Still Windows. But this is a purely hypothetical question; nobody knows if the software world would have developed differently had Windows not been preinstalled.


> For starters, I can't just go to Apples website to download an image that will boot on my PC.

Are you arguing about licensing here? macOS is obviously not open source, so you cannot just download it off their website for free, but you can absolutely boot it on your PC; there's an entire Hackintosh community around it.

If what you're trying to say is that a Hackintosh is not a supported configuration, there are entire databases of supported hardware for Linux. If you have an incompatible component, you're running an unsupported configuration.

But moving on to Windows: the hardware there needs to be compatible too; it's just that going into Walmart you're likely to run into PCs designed with Windows in mind, due to the whole IBM/MS pact of the '80s and the resulting IBM PC clone industry preinstalling Microsoft software ever since. Linux came after MS was already preinstalled on most PCs, so you have to specifically seek it out.

Pretending that this isn't the case, that the playing field was fair and "Windows" just won, is dishonest at best.

> But this is a purely hypothetical question, nobody knows if the software world would have developed differently if Windows would not be preinstalled.

If you're telling me that in a world where 99% of PCs came with Linux preinstalled and kids were familiar with Linux from school, people would nonetheless go out of their way to manually install Windows on them, I think you're selling me a bridge.

I don't see people demanding the ability to install Windows on a Raspberry Pi, for example (yes, there's Windows for IoT; it's not the same).


>Are you arguing about licensing here? macOS is obviously not open source, so you cannot just download it off their website for free, but you can absolutely boot it on your PC; there's an entire Hackintosh community around it.

Yes. But that's not from Apple directly, and not really the point here. macOS on PCs is for the hardcore, the 1%, the people who like to tinker with their system; not an alternative to Windows. There is no guarantee that the system will survive the next update.

>If you're telling me that in a world where 99% of PCs are Linux preinstalled, kids are familiar with Linux from school and nonetheless people are going out of their way to manually install Windows on them, I think you're selling me a bridge.

Ideally there would have been no OS preinstalled and you'd have had to choose yourself. But again, a hypothetical example. Would Windows still be dominant in that case? My guess is yes, based purely on how GUI-friendly the OS is.


It is possible to run NVIDIA cards on a Mac Pro (or with eGPU hardware). There is no alternative to NVIDIA for many software engineers working on HPC and/or simulation / deep learning tasks; many applications and libraries require CUDA, and there is no alternative to CUDA (OpenCL and ROCm are much lower-level).

I personally use Linux with NVIDIA cards, it works reasonably well, and some desktop environments now even work with Wayland (as of 2020).


> some desktop environments now even work with Wayland (as of 2020).

Wayland compositors themselves work. The problem is sharing GPU resources between different applications using different APIs.

E.g., you have your compositor using OpenGL and your application using Vulkan. The application renders itself into a buffer and sends it to the compositor to render the desktop.

If the compositor and application cannot share buffers, the application has to read the buffer back from the GPU, put it into system RAM and share it with the compositor via SHM; the compositor then writes it back to VRAM. (Obviously, when the application renders itself in software only, this is not an issue.)

When the compositor and application can share buffers, the application just sends the buffer handle to the compositor; done.

This sharing on Linux is done via dma-buf. AMD and Intel drivers use dma-buf, so it is not a problem. The Nvidia proprietary driver doesn't; it uses its own EGLStreams, so it is incompatible with compositors. It is as if Nvidia's drivers for Windows started ignoring WDDM and doing their own thing, complete with their own display server; they would likewise become incompatible with the rest of the system. For some reason, Nvidia thinks it is fine to do in Linux something that they would not dare to do in Windows.


Regardless, this is an NVIDIA problem and not a Linux one.


> It is possible to run NVIDIA cards on Mac Pro (or with eGPU hardware).

Only very select cards. I used to have a GTX 690 and could absolutely never get a Hackintosh to do hardware acceleration through it.


> Bollox. Gnome/Mate Display-Manager does an excellent job. If you're still using X11 manual configs then what do you expect..

Have you actually ever tried to set up a PC with two NVIDIA GPUs running simultaneously and monitors connected to each of them? I believe you haven't.

> "Resume after suspend in Linux is unstable and oftentimes doesn't work." - Bollox.

And it's bollox because you say so? So, a sample size of one?

> - "X.org is largely outdated" - duh... the whole list on this topic is dated issues resolved by WM.

None of the issues listed there can even be solved by a WM.

> "Adding custom monitor modelines in Linux is a major PITA." - Is it? A one-line command? And how do you do it on Windows? Oh, you can't...

In Windows there's a GUI for doing that for AMD, NVIDIA and Intel drivers. In Linux you have to create an X.org configuration file with monitor modelines and run several non-trivial commands from the console. It's not done "using one command", so you're basically lying.
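For reference, the usual X11 sequence looks something like this; the output name HDMI-1 is an assumption, and the timing numbers are whatever `cvt` prints for your target mode:

```shell
# 1. Generate a modeline for the target resolution and refresh rate:
cvt 1920 1080 60
# 2. Register it with the running X server, copying cvt's numbers verbatim:
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
# 3. Attach the mode to an output and switch to it:
xrandr --addmode HDMI-1 "1920x1080_60.00"
xrandr --output HDMI-1 --mode "1920x1080_60.00"
# To persist across restarts, the same Modeline goes into a Monitor
# section of an xorg.conf snippet.
```

So: several commands, but a fairly mechanical recipe once you know it.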

> "Applications development is a major PITA" - this is ridiculous.

Because why?


Totally agree. On all the non-Mac laptops I've used, nouveau has never been a problem. And Ubuntu's default setup is pretty good nowadays. I'm surprised that those points made it onto a "2020 Edition" list.


Have you tried it with Pascal cards and later? In my experience, it simply doesn't work at all.


NVIDIA has taken an indifferent / hostile approach for a few decades, and only now is it changing. So, the easiest fix is to ditch NVIDIA because it doesn't play well with Linux.

If you don't want to ditch NVIDIA then ditch Linux, but don't blame Linux because NVIDIA doesn't want to play ball.

AMD, on the other hand, supports open source (not just Linux) and has made a fair amount of its hardware documentation available (unlike NVIDIA). On my workstation with an AMD card, the same games run about 5% quicker under Linux than under Windows (and other benchmarks back that up).


I don't have any experience with Pascal cards. Mine are usually consumer laptops from Dell (Inspiron, Vostro, XPS, Precision), Lenovo ThinkPads, HP...


Pascal works [1], later cards indeed do not work well, but this is due to NVIDIA's hostility and not something the community can easily fix. Not a Linux problem, but an NVIDIA problem.

1 - https://www.phoronix.com/scan.php?page=article&item=nouveau-...


> - "Resume after suspend in Linux is unstable and oftentimes doesn't work." - Bollox. Resume from Hibernate - maybe. But suspend works perfectly most of the time.

Seems like you are agreeing. "Most of the time" means it's unstable and oftentimes it does not work!

On my desktop I particularly noticed this with the latest Ubuntu. Suspend used to work in Ubuntu 18, I think, but now I'm on Ubuntu 20.04 and it does not. I think it stopped working in Ubuntu 19.

This confirms the lack of QA too.

I didn't have the patience to go through the whole list but it seems to hit a few pain points rather well.


Most of these issues vary between distros but they are mostly valid, especially on laptops.

I've only had two laptops, but on both, suspend doesn't work properly because of the btusb driver. And to get battery life on par with Windows, you need to tinker with:

1) graphics drivers (or just disable the discrete GPU)

2) CPU governors (they mostly work OK on Linux today, but they're very different from Windows and may confuse people)

3) the touchpad. Just disable it, because moving the cursor uses like 100% of a single CPU core and drains the battery. No such problem with a mouse.


I use Linux every day. I love it. But I don't think your list reflects reality.

> Bollox. Gnome/Mate Display-Manager does an excellent job. If you're still using X11 manual configs then what do you expect..

I have to go from one multi-screen configuration to another because of my job. I happen to try a lot of combinations of connectors, dongles, and screen brands/resolutions...

And no, this is really, really not solved. I have problems 1 time out of 3.

> "Linux drivers are usually much worse (they require a lot of tinkering..)" - Broad statements are usually wrong. Especially this one, based on an opinion from the '90s.

They are, here. For BT, WiFi, graphics cards and printers at least, I can attest that most of the time I have fewer features and/or less stability using libre drivers.

Does it work well enough for day-to-day use? Certainly. Can I use the "scan" button on my Epson laser? Absolutely not.

> "Resume after suspend in Linux is unstable and oftentimes doesn't work." - Bollox. Resume from Hibernate - maybe. But suspend works perfectly most of the time.

Most of the time is not enough for something like suspend. The whole idea is that you _trust_ the computer to restore your workspace the way it used to be. If you can't, if sometimes it may not boot or it disables the network card (a common problem), then it has failed.

Can you imagine if VSCode would restore your file tabs, but only, most of the time?

In fact, one of the first things I do on a fresh install is to go to the "close lid" settings and disable suspend, because I'm afraid of losing state by mistake.

> "X.org is largely outdated" - duh... the whole list on this topic is dated issues resolved by WM.

X is largely outdated. That's why we still have flickers when booting a Linux desktop, which means you will never ever make a Mac lover use Linux, because they will not trust it.

Yes, appearance matters.

> "Applications development is a major PITA" - this is ridiculous.

If you talk about web or cli apps, yes. But desktop apps, no.

It IS harder to dev a desktop app for Linux, because you have to support a HUGE diversity of configurations: distros, versions, desktop managers. Want to add a system tray icon? Well, are you targeting Gnome 2, Gnome Shell or Unity? Want to make a package? rpm, deb, snap, flatpak or appimage? Want to use QT4? Not installable from the official repos in the last Ubuntu LTS.

Meanwhile, apps that worked in Windows XP often still work on Windows 10.

The alternative is to limit yourself to a small % of users of something that is already a niche.

I love Linux, but being in denial doesn't help getting it in better shape.

Not to mention this list is tiny compared to the actual article.


> Most of the time is not enough for something like suspend. The whole idea is that you _trust_ the computer to restore your workspace the way it used to be. If it can't, if sometimes it doesn't boot, or disables the network card (a common problem), then it has failed.

I do not understand what you guys are talking about. It works flawlessly 100% of the time on my Librem 15 and Lenovo T400, both in Debian and Qubes OS. Just find proper hardware to run GNU/Linux.


> Just find proper hardware to run GNU/Linux.

I suspect this is indeed the root of the problem; people tend to just pick any hardware first and then try to run Linux on it, whereas in reality they should pick hardware to run Linux on.

They seem to get this when it comes to Mac, but not Linux for some reason.


> They seem to get this when it comes to Mac, but not Linux for some reason

The reason seems pretty obvious.


Care to elaborate?


Sorry, I thought your post was sarcasm.

Well, you have literally no choice for macOS-supported hardware. Only Apple devices.

Whereas for Linux, you can in theory buy any laptop (there is no official list of supported laptops).

So it's not that they get it for Mac, it's just the lack of choice.


But that's precisely my point.

You don't really have a choice of just any random hardware anywhere, not on Linux, not on macOS, not even Windows.

The only difference is that most BestBuy laptops come preinstalled with Windows, so of course they're going to support Windows. The support is often not excellent either, but in Windows world that's somehow just dismissed as expected or something.

Also, try picking a TALOS II POWER9 workstation and installing Windows on it. Would not go well.

My point being, people should buy Linux-supported hardware, preferably from a reputable Linux-friendly vendor, which would spare them a lot of trouble.

They're already doing this with macOS and they're doing it with Windows too, just not realizing it because that's what comes with the majority of PCs.


My point is that no one is doing it on purpose for Mac or Windows. They either have no choice (for Mac) or it comes by default (for Windows).

What you're asking is for people to do it on purpose, which is harder, and requires a lot of thought: reading about hardware vendors, etc.


I agree it's harder, but my opinion is that at this point we should market Linux on the desktop as being a very specific thing requiring hardware explicitly for that purpose and should try to work with vendors to make it an option for more hardware, rather than trying to support everything and it not being the best experience as a result.


Yeah, I agree with that.


but it's very hard to do. if you have some arbitrary requirement (for me it's hidpi[0] and enough ram), it may be very hard to find a provider who meets your requirement, or a review for a product that is on the market. (today, it seems i could get a purism machine, but i couldn't last time i was in the market in 2017.)

unfortunately, for linux users, hit-or-miss may be your best option.

[0] i guess my eyesight started degrading, and now i find low res screens confusing. glasses help but hidpi is important too.


It used to be somewhat hard, agreed, but this is a "2020 edition", where now you have plenty of options, from Purism to System76 to Dell and soon Lenovo - all of these would meet your requirements as listed.

What frustrates me is that these lists get posted every year and never seem to get updated. It's like it's not from daily experience, but still the same list compiled when they tried Mandrake Linux 15 years ago.


Oh, but I do research my machines a lot beforehand.

There are many problems with that:

- a casual user will just not do it. They don't have the know-how, and even if they did, they wouldn't care enough to do so. They just want something that works. For now, Linux is and remains a tech-savvy user's system, which is and will always remain a niche. Which of course means fewer resources will be put into it.

- it's hard, and time-consuming. I take no joy in hunting down "that one laptop" that will work every time I want to buy a new machine. Then being stressed out when I actually install the system, because of course I will never be 100% right. There will always be something.

- you get twice the constraints. Search for a quality machine AND search for a machine that will work on Linux.

- it's not permanent. An upgrade may just break support.

- it's not total support. I love my 2-in-1 XPS. I chose it knowing I couldn't use the webcam and that I would have to forgo tablet mode. It's a compromise I had to make. One a Windows user doesn't have to make.

- it means I have 300 great laptops I could potentially buy, but in reality probably only 10 among the most recent ones.

- it means you never get to enjoy modernity. You always have to wait a year or two after a tech is popular to get decent support.

None of this is a deal breaker for _me_. They are mostly first-world problems, meaning problems I'm happy to have :)

But hand waving is not how we should approach this.


It all depends on your experience, doesn't it. The last time I switched back from Windows to Linux, it was because my laptop's Intel wifi card would constantly stop working under Windows, and because that laptop had no reliable sleep mechanism, because Windows was trying to push something called 'connected standby' that basically prevented the laptop from properly entering S3 sleep.


> And no, this is really, really not solved. I have problems 1 time out of 3.

This is surprising, since I do the same for work and don't ever have issues. I don't do anything fancy DPI-wise, though, so that might have to do with it.

> Can you imagine if VSCode would restore your file tabs, but only, most of the time?

I can, since Azure Data Studio (which is VSCode-based, IIRC) has this exact issue. So do browsers, on that note.

My laptop, however, does not have this issue, nor have any other laptops on which I've installed Linux in the last 5 years or so (including a lot of random consumer-grade Dells and HPs and Lenovos, let alone the business-grade ones I prefer to buy for myself or for corporate deployments). I close the lid, it sleeps, I open the lid, it wakes up. Every single time.

> It IS harder to dev a desktop app for Linux

I do all my desktop application development on Linux (internal corporate applications, nothing public yet). Developing on Linux with PyQt5 (or PyQt4, or Tkinter) is an absolute breeze. It's porting to Windows and macOS that's a royal pain in the behind (and has got me looking at C# and Avalonia, which is similarly painless on Linux and hopefully will be less painful for Windows and macOS than my current PyQt5+fbs flow).

> Want to make a package? rpm, deb, snap, flatpak or appimage?

AppImage. It works everywhere, and effectively makes cross-distro compatibility a non-issue. There's literally no reason not to use it.

If distros want to build and maintain RPM or APT or AUR or Nix or whatever packages, then that's on them.

> Want to use QT4?

Why would I want to use QT4? QT5 is much nicer all around.


I moved to Linux on the desktop after the Windows 8 blasphemy (but I have been a developer of Linux applications for a decade).

> "Applications development is a major PITA"

This is so true. But not (or mostly not) due to the kernel. The whole problem lies in distributions and libraries; they break the ABI on all levels, even libc is not stable. This is the largest showstopper for Linux on the desktop - it is a moving target, from system file locations to ABI. It is just horrible.

Too many cooks spoil the broth. And the answer is to just add another layer: Docker. No, it is not an answer, it is just another patch on the anarchy. In theory there shouldn't be any need for it.


Playing devil's advocate, the anarchy is a major reason for me choosing Linux. Yes, the choice and "disorganization" of the whole thing can be overwhelming and inconvenient, but once you set things up just the way you want, it hits a sweet spot for me that no other OS can (I've tried them all).

Granted, I mostly work on server-side and CLI software, but both Qt and GTK were reasonably pleasant for the odd job I needed them to do here and there.

I am on Arch, but I've found that targeting the latest Ubuntu LTS gives you a popular enough denominator of libraries that it isn't too hard to make it work on other distros (the Arch community will usually even do a good job for you via the AUR).


Then you'll have the Rust/Go devs that statically link everything for this reason. And the flatpak/snap/appimage advocates who ship virtual containers. But of course you forgo the trust you get from the repository maintainers' curation, and break the security update mechanism.


> I use Linux every day. I love it. But I don't think your list reflects reality.

> > Bollox. Gnome/Mate Display-Manager does an excellent job. If you're still using X11 manual configs than what do you expect..

> I have to go from one multi-screen configuration to another because of my job. I happen to try a lot of combinations of connectors, dongles, screen brands/resolutions...

> And no, this is really, really not solved. I have problems 1 time out of 3.

> > "Linux drivers are usually much worse (they require a lot of tinkering..)" - Broad statements are usually wrong. Especially this one based on an opinion from the 90's.

> They are, here. For BT, Wi-Fi, graphics cards and printers at least, I can attest that most of the time I have fewer features and/or less stability using libre drivers.

> Does it work well enough for day-to-day use? Certainly. Can I use my "scan" button on my Epson laser? Absolutely not.

> > "Resume after suspend in Linux is unstable and oftentimes doesn't work." - Bollox. Resume from Hybernate - maybe. But suspend works perfectly most of the time.

> Most of the time is not enough for something like suspend. The whole idea is that you _trust_ the computer to restore your workspace the way it used to be. If it can't, if sometimes it doesn't boot, or disables the network card (a common problem), then it has failed.

> Can you imagine if VSCode would restore your file tabs, but only, most of the time?

> In fact, one of the first things I do on a fresh install is to go to the "close lid" settings and disable suspend, because I'm afraid of losing state by mistake.

> > "X.org is largely outdated" - duh... the whole list on this topic is dated issues resolved by WM.

> X is largely outdated. That's why we still have flickers when booting a Linux desktop, which means you will never ever make a Mac lover use Linux, because they will not trust it.

> Yes, appearance matters.

> > "Applications development is a major PITA" - this is ridiculous.

> If you talk about web or cli apps, yes. But desktop apps, no.

> It IS harder to dev a desktop app for Linux, because you have to support a HUGE diversity of configurations: distros, versions, desktop managers. Want to add a system tray icon? Well, are you targeting Gnome 2, Gnome Shell or Unity? Want to make a package? rpm, deb, snap, flatpak or appimage? Want to use QT4? Not installable from the official repos in the last Ubuntu LTS.

Have you ever tried teaching someone how to set up a dev environment on Windows? The things you need to install are a huge PITA, especially knowing what to install. Here's a challenge for you: try to compile a Cython extension using clang. On Linux I just set CC=clang.
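
To illustrate the CC=clang point, a tiny sketch (the `setup.py` invocation is for a hypothetical Cython project; the mechanism is just an environment variable that Linux build tools honor):

```shell
# The "just set CC" workflow: on Linux, make, autoconf and Python's
# distutils/setuptools (which compile Cython-generated C) all read the
# compiler from the CC environment variable.
CC=clang
export CC
echo "extensions would now be compiled with: $CC"
# prints: extensions would now be compiled with: clang

# e.g. for a hypothetical Cython project (not run here):
#   CC=clang python setup.py build_ext --inplace
```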

> Meanwhile, apps that worked in Windows XP often still work on Windows 10.

Except when they don't, which is all too common (let's not even talk about OS X).


> They are, here. For BT, Wi-Fi, graphics cards and printers at least, I can attest that most of the time I have fewer features and/or less stability using libre drivers.

I wonder what BT, Wi-Fi, graphics cards and printers you are using. Realtek and Intel are fine, so Broadcom? So are AMD and Intel, so Nvidia?

Yes, you have to pick your hardware; you cannot throw Linux on random garbage and expect it to be perfect. You would not expect it from macOS either.

If your vendor didn't do the integration needed to run the system you want, it means you will have to do it. Yes, it takes time; I prefer the vendor doing it too.

> Does it work well enough for day-to-day use? Certainly. Can I use my "scan" button on my Epson laser? Absolutely not.

So Epson it is?

I have a Samsung MFP in my office that cannot scan in color with a Mac (the result is a corrupted file). It scans fine with Linux (needs a binary-only driver though, uld) and Windows (which cannot find it on the network half of the time I need it). So you can always find some wrinkle, on all systems.

> Most of the time is not enough for something like suspend.

I always wondered, what is the use case for hibernate. You have to dump the entire RAM on your SSD, and then read it back on resume. Incredibly wasteful and slow, if your RAM is 16GB+.

Suspend is something else; but I haven't seen it fail in years. Thinking about it, the last time I've seen it fail, it was a Mac.

Anyway, both suspend and hibernate are what the firmware is responsible for; the host OS just calls the firmware and it does the job. So contact your system vendor if it doesn't work for you.

> X is largely outdated.

Yes it is. But have you tried discussing with some people that X11 is dead and Wayland is the future, and the sooner the switch is done, the better for all? If they cannot hack in their spacebar room heater (ref: https://xkcd.com/1172/), it is not ready.

> That's why we still have flickers when booting a Linux desktop, which means you will never ever make a Mac lover use Linux, because they will not trust it.

X11 has nothing to do with flickering at boot; it is not even running yet. Anyway, the user-friendly distros do use BGRT nowadays (Ubuntu starting with 20.04, Fedora for some two or three release cycles). So unless you configure Plymouth to use something other than BGRT, or you are not using UEFI, you should have a flickerless boot on a recent distro.

But while we are talking anecdotes, I do have a machine that flickers at boot with Windows 10. Do I blame Windows 10 for that? Does that make Windows 10 unsuitable for a normal user? Of course not; it is the UEFI GOP on the graphics card that sets up the 1024x768 mode, despite the native resolution being 4K. No OS is going to do anything about it; it can only read the EDID and set a different mode for itself, which means it will flicker (btw, thanks, Asus, for never updating the firmware for your card that AMD did release).

> It IS harder to dev a desktop app for Linux, because you have to support a HUGE diversity of configurations: distros, versions, desktop managers.

No, you don't. Package it in flatpak, done. If, for some reason, you don't want flatpak (why? do you like going against the grain?), pick the distros that most of your customers are using and do QA on them (most pick the duo of Ubuntu and RHEL); if it runs on anything else, it is a nice bonus, but unsupported. If someone is trying to make it run on a riced-out Gentoo with i3, let them; it is their time and effort.
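
For a rough idea of what "package it in flatpak" involves, here is a minimal flatpak-builder manifest sketch; the app id, module name and build command are made up, and the freedesktop runtime version shown was roughly current around 2020:

```json
{
  "app-id": "org.example.MyApp",
  "runtime": "org.freedesktop.Platform",
  "runtime-version": "19.08",
  "sdk": "org.freedesktop.Sdk",
  "command": "myapp",
  "modules": [
    {
      "name": "myapp",
      "buildsystem": "simple",
      "build-commands": [
        "install -D myapp /app/bin/myapp"
      ]
    }
  ]
}
```

One file like this, fed to flatpak-builder, produces a bundle that runs against the same runtime on any distro that has flatpak installed, which is what makes the distro matrix mostly disappear.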

Do you do QA of your app on all possible Windows releases combined with all possible Citrix releases? We don't either.

> Want to add a system tray icon?

Don't. You are not in Windows 95 anymore. Do exactly the same thing you would do on Android: background service + notifications + frontend. The frontend runs only when it has a window open. When the user closes its windows, the frontend quits, no remnants running in the background. When the user stops the service, your app is down, nothing keeps running. There's nothing more annoying than trying to close an app and it thinking it knows better, and keeps running and hiding in the systray.

> Want to use QT4? Not installable from the official repos in the last Ubuntu LTS.

Being obsolete has something to do with it. If it was available, would you wonder why it doesn't support HiDPI? You don't target PowerPC Mac either, do you?

> Meanwhile, apps that worked in Windows XP often still work on Windows 10.

It might work on Windows 10. There's a chance that it will, or it might not. I've seen a few of them...

> I love Linux, but being in denial doesn't help getting it in better shape.

It is not denial; it is using the same standard for all systems. I run all three desktop systems: Windows, Mac, Linux, and all of them have their strengths and warts. Each of them is more suitable for some purposes than the other two. If Linux users are in denial, so are Windows and Mac ones.


> I always wondered, what is the use case for hibernate. You have to dump the entire RAM on your SSD, and then read it back on resume. Incredibly wasteful and slow, if your RAM is 16GB+.

> Suspend is something else; but I haven't seen it fail in years. Thinking about it, the last time I've seen it fail, it was a Mac.

> Anyway, both suspend and hibernate are what the firmware is responsible for; the host OS just calls the firmware and it does the job. So contact your system vendor if it doesn't work for you.

i rely on hibernate since my battery is dead and non-replaceable. effectively, hibernate has added several years of life to my laptop.

i have found that in linux, hibernate and suspend take a round robin approach in the first few years of a device. now, suspend works and hibernate doesn't. next, hibernate works and suspend doesn't. sometimes there is back-and-forth. and then they both work. definitely seems to require coordination between os and firmware.


I've read the same sort of arguments for 20 years. They haven't changed or evolved.

Either a minority of lucid fans have been right all along about the complaints of the masses, or this is denial.


Containers, Flatpak, AppImage, AMD's new drivers, libinput, Glamor, Plymouth, bluez and the whole Bluetooth stack, PulseAudio... none of these are 20 years old. It's disingenuous to suggest nothing has changed or improved in 20 years.


Anecdotally, I do have to regularly support Windows systems that are a horrible mess, the difference being that Windows users are expecting this, and while they may curse Windows, it's just another case of 'it acting up again', and this is so common that it's not even a major point against it anymore.

As for macOS, just search for 'catalina' here on HN and you'd find endless complaints.

Not saying Linux doesn't have problems, but it does seem like its proprietary competitors are given a lot more slack.


It is not masses that complain; it is just few loud mouths.


> "Under Linux, setting multi-monitor configurations especially using multiple GPUs running binary NVIDIA drivers can be a major PITA" - Bollox.

Not Bollox. Multi-monitor setup and HiDPI scaling are in a sad state on Linux.

> "Applications development is a major PITA" - this is ridiculous.

Yeah? Then which framework should I use for a desktop app that will still be supported in 10 years? Compared to Windows compatibility, it is a joke.

I have used Linux for more than 10 years, but let's not pretend that it is in the same league as Win/Mac as a desktop OS. It is great if you use it for dev work, but for general desktop use, the usability, security and compatibility are terrible.


With all the different variants of Linux and customization, I have yet to find a desktop environment that I'm actually happy with. Windows has a lot of features I like, and they build on them and make them better. For Linux it usually seems there's no direction; they are all spread thin, or not optimized and slow. Even things like file explorers: since they all need to make their own, they are lacking in comparison to Windows. It is incredibly frustrating to be able to do something in Windows file explorer, and to not be able to do the same thing on Linux, and instead have to open up a terminal to do something that just takes 2-3 button presses with a proper GUI.


Dolphin is better than Windows Explorer IMO. What can't it do? I would say Dolphin is just above Explorer in terms of the 'top' graphical file managers across operating systems


I don't really like KDE, so of the environments I've used, they don't use Dolphin. You can install it otherwise, but it tends to be buggy. Part of it too is that when you open/save a file in Windows, it effectively gives you a file explorer. There are some apps that roll their own, but on Linux basically all of them roll their own. A lot of the time they lack features like simply being able to create a new folder when you are saving something.


Ah yep, that’s all true. KDE apps are particularly fiddly to get running nicely outside of KDE as well.


Dolphin is better than any other Linux file manager, but Windows Explorer is the best IMO. In a directory that contains many files, Windows Explorer runs quickly but Dolphin isn't as quick (and the alternatives are much worse). Also, I like the ribbon UI in Explorer.


Ranger ftw


Even without Gnome, just use the xrandr command when you need multiple monitors. IMO it's very straightforward. Some distros maybe generate xorg configs that don't include your second output port? This isn't an issue I've had with the last 4 computers I've owned.
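
For the record, a sketch of that manual route (a sketch only: output names such as eDP-1 and HDMI-1 vary per machine, so list yours with `xrandr --query` first):

```shell
# Hypothetical two-monitor layout: laptop panel as primary,
# an HDMI screen placed to its right.
xrandr --output eDP-1 --primary --mode 1920x1080 \
       --output HDMI-1 --mode 1920x1080 --right-of eDP-1

# Turn the external output off again:
xrandr --output HDMI-1 --off
```

The same two commands cover the usual dock/undock dance, which is why some people never bother with the GUI tools.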

I don’t get the driver thing? IME especially with USB drivers Linux needs less tinkering than windows. Maybe they’re complaining about closed drivers or firmware? That’s the manufacture being dickish and not a Linux problem.

I’ve never had the resume issue but people have the same problems on windows.

Application development is easier on GNU/Linux than any other platform. I guess if you’re used to the one particular language/framework another platform supports and you move to Linux the support for that isn’t going to be as good. Developing closed source apps sucks but no one wants to run closed source apps on Linux.


> Numerous people report that Broadcom and Realtek network adapters are barely usable or outright unusable under Linux.

This links to a 5-year-old thread on reddit.

I have personally used both Broadcom and Realtek wifi cards in Linux and while they used to suck 10 years ago (i.e. go to a different computer, find the firmware blob you need, stick it on a USB stick and then come back to the computer you're trying to use and try to work out how to load the firmware blob), I don't recall recently having to do anything to get them to work. They just work straight away in the Ubuntu installer and you can select your network and connect, and this has been the case for several years. I'd say more than 5, but the reddit thread suggests otherwise.

> X.org is largely outdated, unsuitable and even very much insecure for modern PCs and applications.

I don't have any comment to add to this, other than that the word "modern" links to a slashdot thread from 2012. Are we sure this article is 2020 edition?

> Web fonts under Linux often look horrible in old distros.

2 of the linked pages here have dates of 2013 and 2014, the 3rd link is to a Google+ page which no longer exists because Google+ no longer exists. And how are problems in "old distros" a 2020 problem anyway?

> As much as Ubuntu might be commended they still distribute their downloaded ISO images via HTTP in 2020

Aha! Something about 2020. I went to ubuntu.com and tried to download the ISO and it came over https. I tried 3 times and got a different mirror each time, but each one was with https, so I do not understand the complaint. Maybe some of the mirrors aren't https?


Inherent X.org issues have never been solved. Why does it matter what the date of the linked article is?


> All native Linux filesystems (except ext4) are case sensitive about filenames which utterly confuses most users. This wonderful peculiarity doesn't have any sensible rationale.

There is in fact a solid rationale behind this.

> No polish, no consistency and no HIG adherence (even KDE developers admit it).

See also: every operating system. It's unfair to pick on Linux for this, since pretty much every Windows and Mac program seems to want to invent its own widgets & unique behaviours, and this has been the case since at least the late 1990s when I got into computers.


I think the problem with case sensitivity is that EVERYTHING has to be either case-sensitive OR case-insensitive.

It would be very hard to change such a deeply-rooted assumption.

I have worked with collisions between case-sensitive and case-insensitive filesystems, and let me tell you it's not pretty. Some common problems:

  #include <Foo.h>    // not found

  cp FOO.BAR FOO.bar foo.BAR /case/insensitive/file/system
(not all of the files can survive.)

I think some problems might be very hard to change.
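
The collision is easy to reproduce; a small shell sketch on any native (case-sensitive) Linux filesystem:

```shell
# On a case-sensitive filesystem these are two distinct files:
dir=$(mktemp -d)
touch "$dir/FOO.BAR" "$dir/foo.bar"
ls "$dir" | wc -l    # -> 2
# Copying both onto a case-insensitive target (FAT, default macOS
# semantics) collapses them into one -- the "cannot survive" case above.
rm -r "$dir"
```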


Lol, I missed that. Is the article claiming that ext4 is not case-sensitive?


The author may be thinking of this work from last year: https://lwn.net/Articles/784041/ I do not know if this has since been merged.


The article says that native Linux filesystems _are_ case-sensitive and that could be an issue for users.

I've been working with computers for over 30 years and I haven't had a single case when I needed hello and HELLO in the same directory.


> There is in fact a solid rationale behind this.

Which is what? ;-)


I would recommend Torvalds on this: https://lwn.net/ml/linux-fsdevel/CAHk-=wg2JvjXfdZ8K5Tv3vm6+b...

As for my own thoughts on top...

In US-ASCII, case insensitivity is easy. Outside of that, your operating system kernel has to have opinions on "Ö" being the upper-case "ö" and so on. But why stop there? Should "Ö" be considered the same thing as "O"? How about Ø? Or O with diaeresis[1], which looks exactly the same as "Ö" but is in fact a different character? How about Cyrillic O vs Latin O?

But that also means that your kernel now has to have opinions about filename encoding. Maybe someone will come up with an idea so obviously superior to UTF-8 that it'll supersede it just as UTF-8 ate everything else, or maybe nobody will. What I am glad about, is that filename encoding wasn't baked into Unix in the 1970s, or indeed at any time prior to about 1993.

So, your operating system needs knowledge of all of Unicode and to care about filename encoding for...what gain? It's only weird if your previous experiences with operating systems tell you that it is weird, and calling it weird is not the same thing as being an actual usability problem.

In fact, as witnessed by the fact we're capitalising the first letters of our sentences in our comments, I would argue that case sensitivity is the default in normal writing (the word "it" being a fine example; if I had said "IT" you would have read the sentence differently), and so if anything's weird and counter-intuitive it's treating case as irrelevant.

[1] https://en.wikipedia.org/wiki/O_with_diaeresis_(Cyrillic)
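
To make those "opinions" concrete, a small Python sketch of the decisions any case-folding kernel would be forced to take (nothing here is filesystem-specific):

```python
# Case insensitivity outside ASCII forces opinions onto the implementer.

# German sharp s: uppercasing changes the string length.
assert "straße".upper() == "STRASSE"

# Latin "ö" vs the identical-looking Cyrillic "о": different code
# points, so "is this the same letter?" has no obvious answer.
latin_o_diaeresis = "\u00f6"   # ö (Latin small o with diaeresis)
cyrillic_o = "\u043e"          # о (Cyrillic small o)
assert latin_o_diaeresis != cyrillic_o

# Folding is also locale-dependent: in Turkish, lowercase "I" is the
# dotless "ı", which Python's locale-unaware lower() will not produce.
assert "I".lower() == "i"
```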


Such a long comment and again what's your use case exactly?

When do you need files e.g. named O.txt and o.txt in the same directory? Won't you yourself be confused by them? It's the simplest example and it already shows that case sensitive file systems bring nothing but confusion.


I don't have a use case. I am saying that it requires zero effort to treat filenames as a big bag of bytes containing whatever, encoded however anyone wants them to be.

Anyone arguing for case-insensitivity has to both a) demonstrate that this is a source of confusion in the real world, by actual people using computers in UIs that exist today, rather than "when I think about it that's kind of weird because it doesn't behave like Windows" or hypothetical oh-dot-texts and b) come up with a better plan that handles the world outside of US ASCII.


There are so many legit problems to be sad about, but he ruins the list by bloating it with like 20-30% crap/bullshit entries. Why? Would be so much better and more productive if he kept it tight.

He actually doesn't even have the issue that got me the most when I switched back from OSX (I hadn't been using Linux on laptops).

libinput is developed by one guy, so instead of real usability testing, he just sort of plays around locally and asks a few friends to record inputs so he can analyse them. This results in really terrible defaults for trackpad acceleration and clicking, which I could only override with xinput commands that referenced the device IDs, but the device IDs would often change on reboot.


You might have already solved this since then, but you can also add a config file like /usr/share/X11/xorg.conf.d/50-mytrackpad.conf and that will set the properties you want for libinput devices at startup. It works on my Xubuntu, so I can disable the trackpad and use only the trackpoint with custom acceleration and sensitivity values.
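
For illustration, a sketch of such a snippet (the Identifier and values are made up; Tapping, NaturalScrolling and AccelSpeed are options of the xf86-input-libinput driver, so check its man page for your version):

```
Section "InputClass"
    Identifier "my trackpad tweaks"
    MatchIsTouchpad "on"
    Driver "libinput"
    # Illustrative values only -- tune to taste:
    Option "Tapping" "on"
    Option "NaturalScrolling" "true"
    Option "AccelSpeed" "0.4"
EndSection
```

Unlike per-session xinput calls, an InputClass section matches the device by property rather than by ID, so it survives the ID reshuffling on reboot.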


I'm not the parent, but when I go through the trouble of creating a custom config, I go one step further and switch to the synaptics driver instead of libinput.


> by bloating it with like 20-30% crap/bullshit entries.

There's a comments section below the article and the author actually listens to the input and updates the article when something is no longer true.


Or bans you :-) How funny so many people point out the subjectivity of your article huh ? Facts, facts, facts... Pedrito :-)


For the past four weeks a single person has been banned because he was obsessed with the Linux kernel and dismissed the entire article because of that.

Nice you've discovered that while overlooking the other 3000 comments.

That's called selective attention and confirmation bias, sir.


A single one :) Sure about that ? ^^ But nice to hear that dismissing your article leads to ban. Indeed this is what I experienced myself...


If you're dismissing something with prejudice (yep, the last banned commentator just didn't have anything valuable to add and was simply obsessed with the Linux kernel and terminology, though the article explicitly states it's about Linux for/on the desktop), it's not even clear why you chose to comment on it in the first place. Dismiss the whole website then! Don't visit it ever ;-) Show your total indifference. :-)

This pesky horrible Web 1.0 website filled with FUD from an unknown Joe doesn't deserve your time. 2 million visitors over 10 years? That's what google.com sees in a few minutes. Cited and quoted all over the web? Still doesn't make it an authority. Almost everything on it is either wrong or exaggerated.

Aren't you smiling and laughing yet? You must be ;-)


I am not the guy who was obsessed with the Linux kernel. Just check your Disqus dashboard and you will get the list of users you banned these last days, if you cannot remember by yourself ^^

But anyway, why would I dismiss the whole website, which I do not know? I am just talking about this lazy article you wrote, full of perceptions but a bit short when it comes to facts... And so outdated... You wanted to promote yourself on HN, but given the amount of negative feedback, I hope you got the message that before thinking about promotion, you should think about working harder and better.


> You wanted to promote yourselves

I have never promoted the article on HN. You may recheck who posted the link here.

> full of perceptions but a bit short when it comes to facts

I will dismiss this utterance with prejudice. In the absolute majority of cases the people who use it actually have no counterarguments or valid criticism.

Edit: actually there's nothing from you in this discussion aside from "it's all wrong".


OTOH he does have some things coded in green.

But yeah, it looks overly critical.

It might seem less biased if there was a subsystem-by-subsystem scorecard, showing positives and "needs work" items side-by-side.


> It might seem less biased if there was a subsystem-by-subsystem scorecard, showing positives and "needs work" items side-by-side.

And then we'll have wars on what's serious and what's not. Yeah, right. :-)


Like 80% of the issues go away if you buy a machine with Linux preinstalled. All the hardware issues are solved; everything just works. And there's choice too: Dell, Tuxedo, System76, and others.

I have bought like 4 Dell XPS 13 Developer Edition machines so far, never had any problems.

Why do people expect software to work on random crap Windows-only machines?! These are problems that can only be solved by vendor goodwill.

At least do your homework and check if your favourite distro supports the hardware you buy. Arch has a wonderful hardware-related wiki, Ubuntu certifies machines, etc...


It kind of helps, until you get a Linux update and a perfectly working driver gets replaced by some WIP alternative in the name of FOSS.

Had that experience with an Asus laptop sold with Ubuntu LTS.


That's the risk when the perfectly working driver is binary only and the original vendor has just thrown it over the wall, with no intent of maintenance.

Speaking of ASUS, I've yet to buy something from them that doesn't have some quirk. The last thing I bought from them, a GPU, had never updated its firmware, despite AMD releasing it to OEMs and other OEMs releasing it to their customers. Unlike your laptop, my GPU works with the FOSS driver; it's just limited while UEFI is running.


> No high level, stable, sane (truly forward and backward compatible) and standardized API for developing GUI applications (like core Win32 API - most Windows 95 applications still run fine in Windows 10 - that's 24 years of binary compatibility). Both GTK and Qt (incompatible GTK versions 1, 2, 3, 4 and incompatible Qt versions 4, 5, 6 just for the last decade) don't strive to be backwards compatible.

So just install the previous version of those toolkits with the application (or use your package manager which will do that for you).

Windows gets around this by installing basically everything in the base install, which grossly inflates the OS footprint. Whereas Apple basically stick their middle finger up at backwards compatibility and tell devs they have to update their applications.

Personally I think Linux has the right approach here: bring new features but allow old libraries to be installed when required.
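Where the repos still carry the legacy runtime, this really is a one-liner. A sketch for Debian/Ubuntu-family systems (libqtcore4/libqtgui4 are the historical Qt 4 package names there; recent releases have dropped them, and other distros use different names):

```shell
# Check whether the legacy Qt 4 runtime is still packaged in the
# configured repos, then install it alongside the modern toolkits.
apt-cache policy libqtcore4 libqtgui4
sudo apt install libqtcore4 libqtgui4
```

If `apt-cache policy` shows no candidate version, the distro has dropped the runtime and you are in the "build it yourself" scenario discussed below.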


Note that even with Windows you often had to install some standard libraries, like MSVCR.DLL or the DirectX version du jour (and I think some VB runtimes, too). Most often redistributed with the software, so done by the installer.

Although, to be fair, it might very well be the case that for a given old Linux binary, your distribution won't have the old runtimes available. Good luck if you've got a commercial pre-press helper application that requires Qt 1.x.

Other than availability, what's the big backwards compatibility killer for Linux? a.out vs. ELF? (I remember someone creating their own distribution due to the ELF/glibc shift, back in neolithic times)


Try and run an app relying on Qt4 on the most recent Ubuntu. It's a total PITA. You'd need to compile Qt4 yourself.

(Not joking.)


Ubuntu isn't the only desktop Linux out there. There are others, and some of them have a more extensive array of software available and/or a larger collection of legacy libraries in their official repositories.


Last time I used Ubuntu there were PPAs for this, but Qt 4 is outdated and does not even support HiDPI. Coming from a Mac user this complaint is rich, because macOS software tends to support at most 2 releases of macOS at any given time.


Distros like to deprecate the previous version of these toolkits and not include them at all. Installing GTK 1 or Qt <= 4 is beyond the capacity of ordinary users (not of the HN crowd, of course).


If an application in the repos depends on it, then that version of Qt or GTK will also be available in the repos. What's more likely to happen is:

- either that application is dropped from the repos for being unmaintained (if it's using a library that old then there are likely other issues with it)

- or some other helpful soul forks the project and updates it to run on a newer version of the framework

This is also very distribution specific. For example ArchLinux still supports Qt 3 applications and libraries in the community repos however I wouldn't expect the same from CentOS (though admittedly I've also not checked).


Well, our scenario is that the app is not updated, so that's kinda assumed. What usually happens is that distros drop apps due to old library versions, and if the user needs one, then the user has to build all the deps, and most users can't do that.

I didn't know Arch had older versions - most distros are very eager to drop, since supporting (or even compiling) these huge frameworks past EOL becomes harder and harder.


That’s the problem though: your scenario is making a boatload of assumptions to stand up a theoretical point. Assumptions that the user is running a specific distro, that they then want a specific application, and that the application has been unmaintained for more than 10 years. This isn’t a common scenario regular Linux folk are going to find themselves in.

Plus the situation wouldn’t be any better on Windows (good luck finding the installer for software that was abandoned 10 years ago from safe places online) nor macOS (Apple does breaking changes more frequently than any other platform) either. And that’s without factoring in any security concerns that running software that old might introduce.

So regardless of the platform, if it’s consumer software (ie not industrial applications or other such specialist edge case), you’d recommend the user switches to a newer and maintained alternative.


"your scenario is making a boat load of assumptions to stand up a theoretical point"

Well, that's the scenario the original post made, not mine. From my experience, this can happen albeit rarely - especially with closed source software.

"specific distro" == The most popular Linux distros don't include old frameworks.

"they then want a specific application and that application has been unmaintained for more than 10 years"

This (for example) eventually describes pretty much 99% of games released natively for Linux, unless one goes with Steam which tries to keep stuff working (I dunno whether GoG or itch also do it).

For really old software, Windows is the best bet. Even if Windows itself doesn't work (it usually does), wine might.


Tk is pretty stable and high-level and allows for scripting on top of that.

Plus it runs on nearly anything not just GNU/Linux.


I wonder if a semi-centralised -- and hopefully curated -- list of Linux-approved hardware can be maintained somewhere?

Even though I am generally sceptical towards desktop Linux due to a multitude of really weird issues some people report, I also know that it works quite rock-solidly for many others. So I'd say nowadays "vet your hardware before investing in a Linux station" is fair and balanced advice -- especially in a world where most hardware vendors just make a half-done Windows driver and peace out.

Furthermore, I wonder if something can be built on top of such a curated list of approved Linux hardware like, say, troubleshooting tips for multi-monitor setups, or multi-sound-card setups, or general weirdnesses that occasionally happen only to some?

At this point I believe it's really fair for the Linux community to give up on most of the hardware. But IMO if Linux is to have some mindshare of the desktop segment it should have a curated list of approved hardware and tidbits for addressing rare issues and improving the general quality of life.

(And let's not forget X11's scandalous ability to allow every program to capture keystrokes. This has to be addressed in an ergonomic manner one day. Qubes is quite nice and all, but it's a pain to set up. Somebody has to wrap this stuff in the usual brainless Next->Next->Next GUI wizards sometime.)


There is

For Ubuntu see: https://certification.ubuntu.com/


For RHEL: https://access.redhat.com/ecosystem/search/#/ecosystem/Red%2...

I don't think we'll ever see a "Linux" hardware compatibility list because what you really need is a "this particular version of this particular distribution" compatibility list.


Just check the arch/Debian/gentoo wiki before you buy something heh.

If you want a truly centralized list all the drivers are kept in one repo on kernel.org


Oh, this paragraph is good, especially in light of the Corona impact on OSS too:

> Money, enthusiasm, motivation and responsibility: I predicted years ago that FOSS developers would start drifting away from the platform as FOSS is no longer a playground, it requires substantial effort and time, i.e. the fun is over, developers want real money to get the really hard work done. FOSS development, which lacks financial backing, shows its fatigue and disillusionment. The FOSS platform after all requires financially motivated developers as underfunded projects start to wane and critical bugs stay open for years. One could say "Good riddance", but the problem is that oftentimes those dying projects have no alternatives or similarly-featured successors.


You can compile such a list for any operating system in 2020.

Recently I evaluated three randomly selected HDMI capture devices; Ubuntu supported each of them out of the box, while Windows and macOS struggled. With the printer in our office it's the other way around. The printer at home is old, so it's only supported on Linux. And so on..


The only problem with desktop Linux these days is that more companies still aren't throwing some effort in.

Adobe and Microsoft could make serious money here. Their tools aren't just standards; they're, in many ways, the best available tools, and Linux has no viable alternatives for them. Clawing our way along with alternatives, Wine and VMs isn't sustainable. Not forever. We just need a couple of big names to start things going. But it perpetually never happens. Maybe 2021. I'm bizarrely optimistic about Microsoft.

Once software support comes, and people can carry on doing their work, the weight of demand forces hardware support for oddball wifi chips, RGB mice, whatever. I can buy another mouse today, I need Illustrator and Photoshop for professional interop.

As for TFA, everybody derives their annoyances from different places. Desktop Linux isn't perfect, but it's the best I've got, by some distance, for my workloads.


Tangent: What are people using for thin clients / terminals? E.g. the stuff deployed in airport displays for showing connecting flights, or self-service cash registers?

Linux seems like the obvious choice as a base OS. However, running X + Firefox on top of it seemed too clunky (slow) and unreliable to me.

- How do you recover from crashes reliably?

- How do you secure the inputs, to avoid escapes from the application?


Considering the times I've seen airport connecting flight displays replaced with bluescreen "Your Windows 7 PC is out of support", shopping mall digital ad displays showing that a QuickTime 7 & iTunes patch is available, or the self-service grocery shopping terminals displaying a Windows XP style "Ethernet cable is unplugged" notification, I'd guess not as much thought is going into those kinds of concerns as we might like.

I think I have heard of some retail chain stores using FreeDOS with a dedicated keyboard-driven accounting package to run the business, though. (In this case I'm thinking of one of the larger Luxottica owned optometry chains.)


That’s part of the "embedded computing" problem space, and people have been doing stuff like developing a Qt application running as a systemd service and directly hitting DRM (the kernel graphics API), so no X, said application being mostly just a webview tailored to your use case. In addition to the webview you could imagine adding an HTTP endpoint to your application for monitoring and remote control, that kind of stuff.

I say Qt because it’s popular in the embedded space, and indeed makes doing that kind of stuff easy (for example by having a DRM backend).
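A minimal sketch of that setup, with placeholder names throughout (the unit name, the hypothetical /opt/kiosk/kiosk-ui binary): `QT_QPA_PLATFORM=eglfs` tells Qt to render straight through DRM/KMS with no display server, and `Restart=always` handles crash recovery.

```shell
# Write a hypothetical systemd unit for a Qt kiosk app, then enable it.
cat > /etc/systemd/system/kiosk.service <<'EOF'
[Unit]
Description=Fullscreen kiosk UI

[Service]
Environment=QT_QPA_PLATFORM=eglfs   # Qt's DRM/KMS backend, no X/Wayland
ExecStart=/opt/kiosk/kiosk-ui
Restart=always                      # restart automatically on any crash
RestartSec=2

[Install]
WantedBy=multi-user.target
EOF
systemctl enable --now kiosk.service
```

This also answers the crash-recovery question upthread: systemd supervises the process, so there is nothing else to escape into.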


I feel like most of the corporate world is gravitating toward web interfaces.

Works great for me, since then a "thin client" can be as simple as a stripped-down Linux running a Qt5 or Electron app (or a browser in kiosk mode) that does nothing but access that web interface (or perhaps has a couple different tabs for different systems).

For actual thin-client needs, it seems like Citrix XenDesktop is the go-to nowadays. Linux as a base OS works okay for this (there are Linux ICA clients). Linux also has RDP clients for Windows terminal servers (or hell, even Linux terminal servers with an RDP server; I typically do this for openSUSE servers in corporate environments where the people doing day-to-day maintenance are used to Windows and insist on a GUI).

> How do I make sure you recover from crashes reliably?

Just restart the app.

> How do you secure the inputs, to avoid escapes from the application?

With a stripped down environment the only such escapes are ones you configure yourself. X11 and the session thereof can be configured to have zero options besides starting up your chosen application. Even if you leave Ctrl-Alt-Backspace enabled, it'd just boot right back into that app. Ctrl-Alt-F(whatever) (if still enabled) should present a login prompt, so unless the password's compromised you're good to go there, too.

Then it's just a matter of locking down the app itself, which again is straightforward if the session is configured to just restart the app if it closes/crashes. A custom application (say, a Qt5 app doing nothing except presenting a WebEngine widget and perhaps some support for hardware interfacing) offers fewer escape opportunities than, say, a generic browser in "kiosk mode".
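For the X11 variant, a sketch of such a locked-down session, e.g. as the tail of ~/.xinitrc ("kiosk-app" is a placeholder for whatever the terminal runs):

```shell
#!/bin/sh
# Single-app X11 kiosk session: nothing else is launched, so the only
# escape routes are the ones you leave configured.
xset s off -dpms          # disable screensaver and display power-saving

# Restart the app whenever it exits or crashes; the sleep keeps a
# crash loop from pinning the CPU.
while true; do
    kiosk-app --fullscreen
    sleep 1
done
```

Since the session script never exits, killing the app just brings it straight back, matching the "just restart the app" approach above.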


Some of those points seem taken a bit too far. For example:

> When people purchase a Windows PC do they research anything? No, they rightly assume everything will work out of the box right from the get-go.

That's how I ended up with an Epson printer which works just fine with macOS and Linux, but not with Windows 10.

I also really enjoy the fact that printer drivers in Linux have no extra features. They normally support printing from a chosen tray with chosen colours and dpi. That's all I want from a driver. I hate every single value-add option stuffed into windows drivers.


There is a flavour here that, although the individual points are subsystem-specific, the overall view is very much of the Linux desktop as a whole.

The Wayland commentary is especially interesting and solidifies my view that Wayland is never going to replace X, and quite possibly this is intentional on the part of the Wayland devs. Reading through the X Server and Wayland headings, some of those issues sound so structural that the most viable long-term path might be containerised mini-displays under some as-yet-unpublicised meta-manager that rethinks how applications play together to manage keyboard and mouse.

The Wayland experiment remains a source of great entertainment. The devs are doing amazing work lightening the dependence of the Linux ecosystem on X11.


I installed Fedora on my Macbook Pro 2015 a few weeks ago to try it out, mainly because of one killer app - the i3 window manager.

It's been a blast, albeit not without some frustrations.

The good

- WiFi worked out of the box

- i3 is awesome

- battery life seems fine

The bad/annoying

- took me a while to find a reliable set of xrandr settings to extend my desktop onto a 2nd monitor.

- took me a while to find out the right commands to use to switch keyboard layouts when I plug in an external keyboard.

- I've seen suspend not work properly a few times when closing the laptop lid

- have to get used to new keyboard shortcuts for copy&paste and other things, this is more of a problem with switching from OSX -> Linux rather than a fault of Linux

- Firefox performance is really, really bad: horribly slow tab switching, and all the items on a website seem to load in really slowly, especially on resource-hungry websites (e.g. YouTube). I've tried installing Nightly, enabling WebRender, etc., but it's just been dogshit slow. In comparison, Chrome feels lightning fast so sadly, I've had to switch.

- hardware video decoding in web browsers doesn't seem to be supported? I've been reading around; apparently there's a patched version of Chromium out there that supposedly supports it - but it doesn't look like that's ever going to go mainstream. I was shocked loading a video and seeing my CPU fan maxing out!

I'd imagine the xrandr/keyboard issues are mostly because of the manual nature of i3wm, over say Gnome where it might be more fluid
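FWIW, both the xrandr and the layout incantations can be scripted once and bound to an i3 key. A sketch, where the output names (eDP-1/HDMI-1) and the gb layout are assumptions for this particular hardware:

```shell
#!/bin/sh
# Extend the desktop onto an external monitor placed to the right of
# the laptop panel. Output names vary per machine: list yours with
# `xrandr -q` and substitute them below.
xrandr --output eDP-1 --auto \
       --output HDMI-1 --auto --right-of eDP-1

# Switch to a UK layout for the external keyboard (pick your own).
setxkbmap -layout gb
```

Under i3 this can be wired to a hotkey with something like `bindsym $mod+p exec ~/bin/docked.sh` in the i3 config.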

The depressing

At work I have a 2019 MBP which won't support linux out of the box (well - I'd imagine it boots, but it seems like there are horrible audio/wifi/input issues and problems with the T2 chip) and I really want to use i3wm at work. I'm tempted to get a VM going as a substitute...


Have you tried using SwayWM? It's compatible with your existing i3 configuration but uses Wayland as the display protocol so some of your issues could be solved by that switch (Firefox performance, hardware video decoding, multiple monitors...).

See https://fedoramagazine.org/setting-up-the-sway-window-manage...

And https://fedoramagazine.org/how-to-setup-multiple-monitors-in...


Yabai [1] turned out to be a pretty sufficient replacement for i3 on macOS for me

[1] https://github.com/koekeishiya/yabai


I just want to say that after upgrading to 20.04 from 19.10 something went wrong, so I clean-installed it. Then my Intel WiFi stopped working (even though Bluetooth did work) and my multi-monitor setup kept getting screwed up on reboot, where a mouse click on the left screen would register on the right screen. I tried to file bug reports with Canonical but the process seemed so convoluted I just gave up and am back to Windows and macOS again. Maybe I'll try again in a few months, maybe not. WSL2 is working pretty well for me


Ubuntu 20.04 seems to get a lot of heat compared to previous LTS releases. IIRC the 18.04 release was relatively uneventful (in a good way).


Get yourself a Tuxedo and be happy for the rest of your days: https://www.tuxedocomputers.com/en


I really wish more "Linux-friendly" laptop vendors would ship with AMD GPUs instead of NVIDIA. I get that there are probably constraints from upstream whitelabel manufacturers, but still. Even the high-end Ryzen based laptops on that site seem to only offer NVIDIA GPUs, which is lame.

It's 2020. Ain't nobody got time for NVIDIA driver issues.


Until you need support, according to reviews: https://www.trustpilot.com/review/tuxedocomputers.com


Support was excellent in my case. They responded quickly, and solved my issue competently.

I agree that not all of their laptops are top quality, but mine is just perfect for me, couldn't be more satisfied.


I used to develop on a MacBook. Then I changed employer and consequently changed laptop too. I won't name the brand and model, but it was running Ubuntu out of the box with the following problems:

- couldn't suspend/resume properly when the lid was closed

- very low battery life

- overheat, a lot, and of course noisy fans then

- couldn't make use of all the ports of the dock

I actually liked the OS (eventually I switched to Debian/XFCE though), but should I be the one to pay for this, I would be pretty pissed off. We're talking about a 2500€ machine here.


This list is too sprawling and too detailed. If the author used the same format he used for the summary or for his summary of Windows issues, and then put the detailed list under the summary, it would be far more useful.

Most Linux problems on the Desktop aren't the result of individual bugs/issues, but of structural problems, and highlighting these problems first would be much more accessible and useful than the bug reports.


> If the author used the same format he used for the summary or for his summary of Windows issues

But there is one:

https://itvision.altervista.org/why.linux.is.not.ready.for.t...


I noticed that (that's the "Summary" I mentioned).

But reading it after the very long text about individual bugs is much less useful. Most position/research papers put the executive summary/abstract first with good reason - that helps the paper be better received. Maybe even organizing the lists as collapsible items could help the presentation be better*.

As for the summary list, it too can benefit from a more probing look which tries to look at the issue more deeply, much more than listing individual issues. For example, ABIs and APIs are often unstable on purpose, especially on the kernel. Linus has long arguments supporting the unstable model and some Free software proponents like that this encourages GPL code.

Do we want the Windows model? Maybe a shim could supply that? Or maybe we shouldn't have even that, for Free software reasons? (RMS tried that with gcc and it was one of the reasons that led to LLVM gaining traction). Or development speed?

* That doesn't require JavaScript. One could use HTML5's simple summary/detail elements.

https://developer.mozilla.org/en-US/docs/Web/HTML/Element/de...

There are css hacks which can achieve the same result on HTML4, but they are much more complicated and I suspect a list on Linux issues shouldn't care about supporting Internet Explorer.
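For reference, a collapsible section in plain HTML5 is just:

```html
<details>
  <summary>X.org: detailed issues</summary>
  <p>The long list of individual bugs goes here, collapsed by default,
     with no JavaScript required.</p>
</details>
```

The browser renders the summary line with a disclosure triangle and toggles the body on click, which is exactly the executive-summary-first layout suggested above.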


The article was meant to be read in full.

Yeah, the modern generation of people is mostly capable of consuming 140/280-char tweets and has severe issues with the attention span, but if you wanna get deep into it, you'll read it all. Or just dismiss everything because you didn't like how the article is organized - that'll work as well, as it has for the past 30 years, which means Linux on the desktop will remain a hobby OS for geeks.


Your tone is unwarranted.

I did not dismiss anything. Nor did I ask you to remove any information. I merely wished you'd organize it in a better way.

The better way IMHO is to emphasize the structural economical/sociological/technical reasons. Why? Because even fixing all the small technical bugs in that long list would help only for a little while.

For example, if the Linux desktop somehow magically got all items fixed, it is still way underfunded. So it will either stagnate in place or try to innovate and introduce new bugs... On the other hand, find a workable revenue model, and then devs would be paid to fix bugs, and that would be sustainable.

Another example: fixing current NVidia incompatibilities would help, but the unstable Linux ABI/API would remain, and that would cause issues in the future with NVidia or any other manufacturer that balks at putting their code under an OSS license. This issue needs a decision which balances all the competing interests.


> Why? Because even fixing all the small technical bugs in that long list would help only for a little while.

Totally agree with that.

> For example, if the Linux desktop somehow magically got all items fixed, it is still way underfunded.

Kinda disagree with the "underfunded" part. The biggest issue with Linux is that it's not a platform like Windows, Android, iOS, etc. All of them feature:

1) Rich, stable APIs/ABIs, including for device drivers

2) Strong backward and forward compatibility

3) Strong cohesion between system components (e.g. kernel and userspace)

4) Excellent support for hardware

5) A universal packaging mechanism (the same 'exe' can be installed on any supported version of Windows/MS-DOS/OS/2; the same APK can be installed on any supported version of Android)

(See the Solving Linux part of the article).

There's just one distro which ticks almost all the checkboxes and that's RHEL, and it's not exactly meant to be used on the desktop. Besides, its hardware support leaves a lot to be desired, as Red Hat sticks to the same kernel release for the whole support period (10 years or something) and regularly backports only certain drivers from mainline.

If someone created a Linux platform, even with the same number of bugs/missing features that we already have, it would still be a hundred times more successful than Linux has ever been.


I think linux works reasonably well on the desktop as long as you stick to the standard settings, i.e., running only linux, accepting the way things were set, not worrying too much about loud fans, etc.

The big issue I have is the mixed quality of documentation for "normal users" (those between just accepting everything on the one side and going through all man pages on the other), which always gives me an uneasy feeling that I didn't quite set something up the right way and that it could lead to issues somewhere along the way.

While it would be nice to have, I don't necessarily need a graphical/terminal configuration tool (though it would be nice when I'm setting up things that I only set up once and don't want to go through the trouble understanding how it works, e.g. wifi). What I really need is a good wiki that is up-to-date and useful posts on stack exchange.

Now this is where I run into problems: Arch for instance has a great wiki, but many posts, articles, etc. are for Ubuntu, which doesn't have a very useful wiki.


The man pages are the documentation for users?

I would say stay away from GNOME if your fans are running all the time; it’s a terrible DE and you don’t need it. Just install e.g. fvwm or Xfce (or LXDE or whatever you like).

>I only set up once and don't want to go through the trouble understanding how it works, e.g. wifi)

That’s going to cause issues no matter which OS you run. Just read the wpa_supplicant man page (or the appropriate documentation if your distro uses something else and you don’t want to change it). You can skim for just the things you need and grab an example config; it’s not a hard tool to use and it only takes a couple of minutes. After you do this you won’t be surprised when it breaks in a way you don’t understand.
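A minimal example of that couple-of-minutes setup, run as root ("HomeNetwork", the passphrase, and wlan0 are placeholders; `wpa_passphrase` ships alongside wpa_supplicant):

```shell
# Generate a config stanza containing the hashed PSK for one network,
# then start wpa_supplicant in the background on that interface.
wpa_passphrase 'HomeNetwork' 'the-passphrase' \
    > /etc/wpa_supplicant/wlan0.conf
wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant/wlan0.conf

# Finally, obtain an IP address.
dhclient wlan0    # or whatever DHCP client your distro uses
```

Find your actual interface name with `ip link` before substituting it in.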


Money, corporate influence, and culture are a big cause of most of those problems.

If Linus Torvalds had been born American (I just discovered he has been a naturalized American since 2010: https://lwn.net/Articles/404729/ ), I'm pretty certain Linux would be quite a different beast today.

Generally, computing works best when the software dictates the homogeneity of hardware, something Microsoft managed to do. Microsoft is able to influence the hardware industry, which is why Windows is a success.

I also suspect Microsoft has some kind of relationship with hardware vendors, to help them write drivers etc.

If you look at android, it's exactly the same scenario: the hardware follows the software, not the other way around.

Hardware will always be the problem, especially as long as hardware tools are also proprietary, and generally all you need to do is follow how the US maintains its grip on the industry.


The biggest most obvious slap you in the face problem with the Ubuntu desktop is cut and paste.

This is the key thing a desktop should get right but it’s so wrong.

It uses different keys to both OS X and windows, whereas it really should seamlessly offer both key combinations. And it’s inconsistent between applications.

We’re so far down the Linux desktop path for cut and paste to be this broken.


> It uses different keys to both OS X and Windows

Why should this be a problem? You are on a different system, learn to use it the way it was intended.

Ctrl-C has a different meaning in Unix. It has been so since before Windows and Macs existed.


This is so precisely a representation of Unix think, and why Linux has such a problem competing as a desktop.


This is an odd complaint. Cut/paste on Linux has been broken for a long time (possibly it's a past problem now), but not for the key bindings (d'oh), rather, for programs using incompatible APIs/frameworks, causing copy/paste not to work all the time.

Even this, I think has been fixed, though, as I haven't been using a clipboard manager for a long time.


I sometimes have issues with pasting screenshots into Rambox on my work laptop (it works most of the time, but sometimes it'll just inexplicably break, forcing me to paste the screenshot into a paint program first and then copy/paste into Rambox).

Other than that, yeah, copy/paste ain't much of an issue. It's also really nice to be able to select text with my mouse and middle-click to paste (and it works across pretty much every non-game app I've used).


> Under some circumstances the system or X.org's GUI may become very slow and unresponsive due to various problems with video acceleration or lack of it and also due to notorious bug 12309

Anyone know what he's talking about? It sounds a lot like the issue I was experiencing, but the ticket is not accessible, even when logging in. Would love to read the details. The constant, sometimes multi-minute stutters where the system became unresponsive are one of the main reasons I stopped using it last time. Every few years, I give it a shot, love it, and then something like this (or external displays no longer working 90% of the time) just makes me give up. I've never had an install run for more than about six months without something catastrophic happening that requires a complete reinstall / restore from backup, usually due to some desktop environment issue preventing it from loading/displaying.


I take the chance to report that, embarrassingly, Bluetooth is broken by default on Ubuntu, and has been since 18.04, due to wrong settings in the Pulseaudio configuration file.

This has been reported against 18.04, and has been ignored.

What are users going to think when they try such a basic functionality?


I have lots of bad experiences with Bluetooth. On Moto G1, Nexus 5x, Pixel 1, Aquaris X, Dell laptop with recent Windows 10, ARM Chromebook, x86 Chromebook, Ubuntu. While connecting between those devices, connecting to a Jabra headset, Lenco soundbar, noname mobile speaker, OneAudio headset, Renault Laguna audio system. Issues varying between: can't see any device at all; can't see the device I'm trying to connect to; can't connect; can connect, but the device freezes and needs a restart; can connect, but the sound still comes out of an internal speaker; can connect everything works, but then it stops working and above apply; everything works, but have to pair every time.

It's dreadful everywhere. I think I had somewhat better experience with a Macbook, but at this stage I use bluetooth as little as possible, so can't really say.

Sorry, I guess I'm just venting...


This is a generalization, but these devices are very likely only tested until they barely work on Windows and Mac OS. And if they have to choose between a quick fix that makes it work on those two, and implementing the spec properly, they'll pick the first option. So it's on Linux to discover and work around all the brokenness (in the bluetooth host device as well as the speakers, phones, peripherals, etc...)


Bluetooth is just massively (and possibly needlessly) complex. The combinations of supported profiles and options are likely more numerous than IPsec ciphers. I've not seen a stack that doesn't spectacularly fail with at least one of my devices. It's not vendor/product specific.


Woah!!! So that's why my Dell XPS 13 couldn't use my Android phone's internet via Bluetooth?! I thought I was doing something wrong (I got internet working using the wifi hotspot instead). I am on Ubuntu 20.04, so this problem seems to still be there.


Do you mean Bluetooth audio, or actually pulseaudio impacting the whole stack?

What's the launchpad bug number?


I don't have the Launchpad bug at hand, but I have the Stackoverflow question/answer: https://askubuntu.com/q/1037370


Which settings?


I have to agree with the post that explains areas of improvement.

I have been an Arch Linux/GNOME user for 1.5 years on my home laptop, but since WSL2 support arrived I am pondering switching back to Windows. I am currently writing this message from my Windows 10 partition. Windows 10 has its own pain points, like telemetry, default apps that I don't want, not following the Unix philosophy (configuration with simple text files), and limited customizability, but it is improving too.

For me, the remaining pain points for Linux on the desktop, even though it has improved a lot over the years, are:

* bluetooth/pulseaudio not snappy and reliable. Sometimes my bluetooth headset connects instantly, sometimes it takes 30 seconds to a minute, and sometimes it never connects automatically and I have to do it manually. I did not have that issue on Mac OS X/Windows 10. Also, I have yet to figure out why an audio profile is added with GNOME Bluetooth but not when I use Sway as my desktop environment

* resume from suspend reliably. Sometimes my network connectivity is broken after resume and cannot be recovered easily. I need to reboot.

* Graphics driver support, even if not running Nvidia simplifies things a little. Intel iGPUs are OK, but still seem less performant than on Windows/Mac

* lack of full-featured drivers and software for some hardware, especially printers, scanners, and fingerprint readers; the available ones lack proprietary features, though that is obviously not Linux's fault.

* lack of video acceleration in browsers by default. It is now possible with Firefox on Wayland with some configuration, and in Chromium, but only with community-maintained VAAPI support

* Multiple screens, and especially switching between HiDPI and low DPI. It is a real mess due to the different GUI toolkits in use and the lack of a standard. GTK has its own way, and you can also do fractional (but blurry) scaling with X or Wayland. For instance, what I do now on GNOME Wayland is no fractional (blurry) scaling, but text scaling of 1.22 by default, with a systemd udev rule to revert to 1.0 when I plug in my external display...

As for GNOME, I like it; I find it to be a macOS GUI clone whose lack of customization can be worked around with extensions, except for performance (still laggy, even if that too has improved a lot).


> bluetooth/pulseaudio not snappy and reliable

Without patching PulseAudio, the microphone on a bluetooth headset doesn't work anyway. I'd take not snappy but working any day, really. Maybe this will get resolved in PulseAudio 14; there are a couple of proposals/drafts already.


It doesn't have to be Windows or Linux; it can be Windows and Linux. I just use Arch Linux on my Windows machine via WSL 2. Arch is my favourite Linux distro, thanks to its really brilliant package manager and its bleeding-edge packages. By running it inside WSL 2, I don't have to worry much about stability, since it's running inside Windows, but at the same time I get a full-blown, proper Linux distro for my development work.


And similarly, Windows runs incredibly well under QEMU if you want to do it that way around, though that kind of setup really requires a second GPU to be performant so is better suited to desktops.


> Linux has a 255 bytes limitation for file names (this translates to just 63 four-byte characters in UTF-8) - not a great deal but copying or using files or directories with long names from your Windows PC can become a serious challenge.

Most file systems on Linux like ext4 have a 255 byte file name limit. There’s no limit in the kernel.

I think Windows’ effective 260-char total path limit is worse.
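The 255-byte limit is per path component, and it's easy to see from a shell; on ext4, tmpfs, xfs, etc. it's the filesystem's NAME_MAX:

```shell
# A 255-byte file name is accepted; one more byte fails with
# "File name too long" (ENAMETOOLONG).
long255=$(printf 'a%.0s' $(seq 1 255))
touch "/tmp/$long255" && echo "255 bytes: ok"
touch "/tmp/${long255}a" 2>/dev/null || echo "256 bytes: too long"
rm -f "/tmp/$long255"
```

Note that the total *path* can still be up to 4096 bytes (PATH_MAX); only each individual component is capped at 255.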


> Windows’ effective 260-char total path limit

was lifted a few years ago.


It still hurt me a bit to say it, but I have a really smooth experience with Linux on Windows 10 thanks to WSL2, the new Windows terminal, and the integration with VS Code. I use Linux a lot on desktop through windows and I have almost none of the issues listed there. Windows has other issues but not as annoying or time consuming to fix for me.


It's 2020, and I still have to manually enable Wifi Mesh Networking (802.11s) and recompile in Debian, even Buster. But that is a minor inconvenience, and takes 5 minutes. I do that for some other package too, for some niche options, but it's very easy to do in Debian.


I don’t recommend Linux as a desktop anymore (free99 sold me for a while though).

Still beats Windows on older hardware. When it comes to minimal browser boxes, Linux is still the best.


I can't really agree here. I'm very much the definition of a power user and I love the easily accessible configuration (text files FTW) and the overall customizability of it. Working on Windows with hard-to-find GUIs, no tiling WM, and an overall lack of system control feels like a grind to me.

Sure, some of that is habit, but I've never felt like a Windows machine was truly "mine".


That's a bit like throwing the baby out with the bathwater. There is a significant number of people who use their machines in extremely simple ways, i.e. web browsing. Desktop Linux is perfectly fine for that.


Unfortunately I have to disagree with even that. Firefox (the default browser in most Linux distros) can't do hardware-accelerated video, so you tend to get tearing/stuttering, or at the very least a higher CPU load when watching YouTube for example, compared to say Windows, resulting in increased battery consumption on a laptop, or at the very least increased fan speeds.

I know acceleration is being worked on in Firefox under Wayland, but right now, if even the simplest and most common use cases have major downsides compared to Windows/OS X, it's hard to justify moving to a worse experience. Windows rules with vendor support; that's just how it is. I've grown tired of having to compromise.


> Firefox (the default browser in most Linux distros) can't do hardware accelerated video

https://wiki.archlinux.org/index.php/Firefox#Hardware_video_...

In short: Firefox supports it under Wayland since version 75 (and for more formats since 76).

Not that I've really noticed this being an issue under X11.


AFAIK OS X hides from userland the support for VP9 decode acceleration present in the MacBook's hardware (YouTube's primary codec these days), so now that Firefox supports VAAPI under Wayland it is already ahead of OS X/MacBooks in this regard.


I need to test this (I didn't know about the hardware acceleration), however, first: tearing and stuttering are different things.

For a very long time, avoiding tearing required manual Xorg configuration. However, on 20.04 that's no longer needed (I don't know why; I've just verified it on Nvidia and Intel GPUs).


My desktop is a Raspberry Pi, you insensitive clod!


Most of the problems are valid, but some of them are... very much not, and haven't been for years (and others are "problems" but absolutely not "major" by any stretch).

> NVIDIA Optimus technology which is used in most laptops often doesn't work well in Linux.

Then stop buying them. My laptop (Intel + AMD) works fine w/ Mesa, regardless of which GPU I'm using (Intel iGPU by default, and AMD if I set DRI_PRIME=1 for a given application, e.g. in a game's launch options in Steam).

NVIDIA's terrible. This whole section of the article seems to only describe issues with NVIDIA. If you buy hardware that's not shit, then lo and behold, you'll have a better experience.

> Open source drivers have certain, sometimes very serious problems

If "sometimes" means "basically never", then sure. I've "basically never" had issues with Mesa on Intel or AMD hardware (I'm indeed running AMD hardware w/ Mesa on the very desktop on which I'm typing this comment, and it's been absolutely painless out-of-the-box).

> PulseAudio is unsuitable for multiuser mode

    sudo killall pulseaudio
Wow, so hard. Plus, the hyperlink for "multiuser mode" describes exactly how to launch PulseAudio system-wide to address this very concern (even if they do nonsensically advise against it, because of course they do).

PulseAudio's shitty, but this ain't a reason.

> There are still many printers which are not supported at all or only barely supported

And there are many more which work just about perfectly out-of-the-box, namely almost everything by HP or Brother.

> Resume after suspend in Linux is unstable and oftentimes doesn't work.

On a wide variety of old and new laptops on which I've installed Linux this has simply not been an issue for years.

> I have personally reported two serious audio playback regressions, which have been consequently resolved, however most users don't know how to file bugs, how to bisect regressions, how to identify faulty components.

Most users will report bugs to their distros' maintainers, who hopefully know how to file these bugs against the kernel.

> X.org allows applications to exclusively grab keyboard and mouse input. If such applications misbehave you are left with a system you cannot manage, you cannot even switch to text terminals.

Alt+SysRq+R, then Ctrl+Alt+F1

Wow, so hard. Like, this works for even the absolute worst lockups (it's the first step of the "REISUB" maneuver to cleanly reboot a locked-up Linux system).
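One caveat: many distros ship with the SysRq functions partially or fully disabled, so it's worth checking the `kernel.sysrq` mask before relying on this (the example values in the comments are just common defaults):

```shell
# kernel.sysrq is a bitmask: 1 enables everything, 0 disables everything,
# and distros often ship something in between (e.g. 176 or 438).
if [ -r /proc/sys/kernel/sysrq ]; then
    cat /proc/sys/kernel/sysrq
fi
# To enable all SysRq functions until the next reboot (needs root):
#   sysctl kernel.sysrq=1
```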

> X.org has no means of providing a tear-free experience, it's only available if you're running a compositing window manager in the OpenGL mode with vsync-to-blank enabled.

Sounds like a "means of providing a tear-free experience" to me.

And, like, does this really warrant the double-exclamation-marks? God forbid you see a graphical artifact every once in a while.

> Applications (or GUI toolkits) must implement their own font antialiasing

Applications might have different anti-aliasing needs. I see no problem with this. And again, this absolutely does not warrant double-exclamation-marks.

> Wayland doesn't allow XWayland applications to change display resolution

Good.

> Wayland doesn't allow applications to exclusively grab mouse/keyboard which is required for games and applications like VMs.

Were you not just complaining about X11 allowing this in the previous section?

> Traditional Linux/Unix (ext4/ reiser/ xfs/ jfs/ btrfs/etc.) filesystems can be problematic when being used on mass media storage.

Okay, so don't use them. JFFS2 has been in the kernel since 2001. F2FS has been in the kernel since 2013. Both are commonly used for flash media. You're also free to use FAT like you would on Windows; that's been in the kernel since forever.

Not that this seems like a real issue anyway, given that I've used ext2/3/4 and XFS on thumb drives with no problems at all. ext2 is particularly useful if you're concerned about journaling shortening the drive's lifespan.

> For the same reason you cannot modify your partitions table and resize/move the root partition on the fly.

I've resized my root partition on-the-fly multiple times with zero issues whatsoever. The fact that I use LVM on all my desktops/laptops might help with that.

> No unified installer/package manager/universal packaging format/dependency tracking across all distros

The whole point of a Linux distro is to be able to differentiate on things like package management or system configuration. No package manager is perfect for all use cases.

And besides, AppImage works on pretty much every distro that uses a reasonably-modern glibc, and literally doesn't care about the distro itself. If your concern is "well I don't want to have to build packages for each and every packaging system out there", then AppImage is your tool, so use it.

> It should be possible to install any software by downloading a package and double clicking it

Which is totally possible, thanks to things like AppImage (well, you download it somewhere and mark it executable, but wow, so hard). Hell, most distros support this even for .deb and/or .rpm packages.
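To be concrete, the whole flow is two steps. The snippet below uses a stand-in script so it's self-contained (the file name is made up), but a real downloaded AppImage behaves exactly the same way:

```shell
# Simulate "download, mark executable, run" with a stand-in file:
printf '#!/bin/sh\necho "hello from the app"\n' > Some-App-x86_64.AppImage
chmod +x Some-App-x86_64.AppImage
./Some-App-x86_64.AppImage   # prints "hello from the app"
rm -f Some-App-x86_64.AppImage
```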

openSUSE also has 1-click installers, but it seems like they've fallen out of fashion (which is a shame, since they address this issue pretty dang elegantly).

> Applications development is a major PITA.

Compared to Windows? Linux (and Unix in general) is a breath of fresh air in comparison. A PyQt5 app (for example) takes me a few hours (total) to design and develop in Linux... and then days of nitpicky troubleshooting to get it working on Windows in a way that's reasonable for end-user distribution.

> Packaging all dependent libraries is not a solution, because in this case your application may depend on older versions of libraries which contain serious remotely exploitable vulnerabilities.

And yet this is exactly what Windows and macOS encourage, specifically to avoid the need for users to manage libraries. Like, you can't have your cake and eat it too; either you're gonna be dealing with dependency hell or you're gonna need to package libraries with applications.

----

Continued below...


Continuing from above:

----

> No plug-and-play support for a lot of input devices like joysticks and steering wheels. Many require editing of cryptic configuration files.

My Logitech F310 and my Saitek X52 both work out-of-the-box without any such "editing of cryptic configuration files". At most, I might need to set bindings in the game itself. This applies even with Wine/Proton (I'm able to play Ace Combat 7 with a controller - like God intended - with exactly zero configuration on my part).

> Many anti-cheat protections fail to work under Linux.

Tell that to Valve Anti-Cheat.

> There's no concept of drivers in Linux aside from proprietary drivers for NVIDIA/AMD GPUs which are separate packages: almost all drivers are already either in the kernel or various complementary packages (like foomatic/sane/etc).

Good. Like, how is this a bad thing? It's the very reason (well, one of many) why I prefer Linux over Windows for my desktops and laptops.

> There's no guarantee whatsoever that your system will (re)boot successfully after GRUB (bootloader) or kernel updates

That's much less of an issue with UEFI (where you're not mucking with boot sectors and such). And kernel updates haven't been an issue for me for years.

> Samba is not native

How is it not native? Just because it's "reverse-engineered" doesn't mean it doesn't ship with most/all modern distros and can be used for filesharing.

> Steep learning curve (even today you oftentimes need to use a CLI to complete some trivial or non-trivial tasks, e.g. when installing third [party software]).

The link is to a question that specifically asks how to do it through the command-line. The user didn't have to do it that way (Google has provided .deb and .rpm packages on chrome.google.com for as long as I can remember, and nearly all distros using either package format are able to graphically install such packages with a double-click).

> Most well written GUI applications for Windows 95 will work in Windows 10 (24 years of binary level compatibility)

And most "well written" GUI applications for Linux 1.2 (which is when Linux switched from a.out to ELF) will work in Linux 5.4.31 (24 years of binary level compatibility). The problem is that applications are rarely "well written"; there are plenty of poorly-written Windows applications that don't run on more modern versions, and Linux is no different.

If your (statically linked) Linux 1.2 compiled application doesn't run on the latest-and-greatest Linux, then that's a bug of the "Linus will verbally abuse someone on the LKML" variety.

> No standard way of software deployment (pushing software via SSH is indeed an option, but it's in no way standard, easy to use or obvious - you can use a sledgehammer to crack nuts the same way).

So? There are plenty of options here, ranging from "pushing software via SSH" to widely-used and industry-standard tools like Puppet and Chef.

> No CIFS/AD level replacement/ equivalent (SAMBA doesn't count for many reasons)

Why doesn't SAMBA count? I've literally used SAMBA 4 as an AD domain controller in production (with Windows, macOS, and Linux domain members). Works fine.

There's also FreeIPA, but I don't have as much experience with it (besides trying it out when it was first announced, during which point it was pretty buggy, though Fedora was probably just as much to blame).

> No filesystems at all to support per-file encryption

    gpg --encrypt --sign --armor -r my@email.address wow-so-hard.txt

    openssl enc -in wow-so-hard.txt -out wow-so-hard.txt.enc -e -aes256 -pbkdf2 -pass 'pass:WOW SO HARD'
Like yeah, I guess it's cool if the FS itself supports encryption, but this seems like it's best left to the application (and obviously end-users shouldn't be expected to run these commands themselves, but applications can certainly do so or - better yet - use the actual underlying libraries).
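For what it's worth, the openssl approach round-trips cleanly (file names and passphrase here are just for illustration; `-pbkdf2` needs OpenSSL 1.1.1 or newer):

```shell
# Encrypt, decrypt, and verify the contents survive the round trip.
echo 'wow so hard' > /tmp/demo.txt
openssl enc -aes-256-cbc -pbkdf2 -pass 'pass:WOW SO HARD' \
    -in /tmp/demo.txt -out /tmp/demo.txt.enc
openssl enc -d -aes-256-cbc -pbkdf2 -pass 'pass:WOW SO HARD' \
    -in /tmp/demo.txt.enc -out /tmp/demo.txt.dec
cmp /tmp/demo.txt /tmp/demo.txt.dec && echo 'round-trip ok'
rm -f /tmp/demo.txt /tmp/demo.txt.enc /tmp/demo.txt.dec
```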


Can your grandma do everything that you've just written? Is she even able to find all these solutions in the first place?

Yes, many Linux issues (though not all: compatibility, regressions, and the lack of proprietary software remain) can be solved or worked around. This doesn't make the OS any more suitable for the average tech-illiterate Joe.


Considering Microsoft's recent investments in OSS, it would be awesome to see a Linux distro developed and/or funded by Microsoft that makes the adoption of Linux an easier process.


The meme from way back when was that openSUSE was basically "Microsoft Linux", but I don't know what triggered that (maybe YaST making AD domain membership relatively painless?).



