
> the Android security model has taken into account sideloaded apps since the very beginning

Counterpoint: tech websites have literally warned users that they need to be wary of installing apps from inside Google's walled garden.

> With malicious apps infiltrating Play on a regular, often weekly, basis, there’s currently little indication the malicious Android app scourge will be abated. That means it’s up to individual end users to steer clear of apps like Joker. The best advice is to be extremely conservative in the apps that get installed in the first place. A good guiding principle is to choose apps that serve a true purpose and, when possible, choose developers who are known entities. Installed apps that haven’t been used in the past month should be removed unless there’s a good reason to keep them around

https://arstechnica.com/information-technology/2020/09/joker...

"You should not trust apps from inside the walled garden" is not a sign of a superior security model.



> Counterpoint: tech websites have literally warned users that they need to be wary of installing apps from inside Google's walled garden.

This is not a counterpoint to what I was saying. I'm talking about sideloaded apps, not apps from Google Play. I agree that Google should work to improve their app vetting process, but that's a separate issue entirely, and one I'm not personally interested in.


If your security model is so weak that you can't keep malware out of the inside of your walled garden, the situation certainly isn't going to improve after you remove the Play store's app vetting process as a factor.


I avoided making a claim regarding the relative "security level" of Android vs. iOS because it's not easy to precisely define what that means. All I was saying was that Android's security model explicitly accommodates openness. If your standard for a "strong" security model excludes openness entirely, that's fair, I suppose, but I personally find it unacceptable. Supposing we keep openness as a factor for its own sake, I'm not sure how you can improve much on Android's model.

This discussion seems to be headed in an ideological direction rather than a technical one, and I'm not very interested in that.


If your point of view is that you value the ability to execute code from random places on the internet more than security, perhaps that is the point you should have been making from the start.

However, iOS makes the security trade-off in the other direction.

All of an app's executable code must go through the vetting process, and additional executable code cannot be added without the app being vetted all over again.

In contrast, Google has been unable to quash malware like Joker from inside the Play store because the malware gets downloaded and installed after the app makes it through the app vetting process and lands on a user's device.

> Known as Joker, this family of malicious apps has been attacking Android users since late 2016 and more recently has become one of the most common Android threats...

> One of the keys to Joker’s success is its roundabout attack. The apps are knockoffs of legitimate apps and, when downloaded from Play or a different market, contain no malicious code other than a “dropper.” After a delay of hours or even days, the dropper, which is heavily obfuscated and contains just a few lines of code, downloads a malicious component and drops it into the app.

https://arstechnica.com/information-technology/2020/09/joker...
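The "dropper" pattern the article describes, i.e. shipping innocuous code that fetches and executes a payload only after review, can be sketched generically. This is a hypothetical illustration in Python for brevity; real Android droppers load DEX bytecode with APIs like DexClassLoader rather than Python source, and the payload here is a harmless stand-in:

```python
def fetch_payload() -> str:
    # Stand-in for an HTTP download from an attacker-controlled server.
    # At review time this code doesn't exist anywhere in the package.
    return "result = sum(range(10))"

def run_dropper() -> int:
    payload = fetch_payload()  # code that no reviewer ever saw
    namespace = {}
    # Compile and execute the downloaded code at runtime.
    exec(compile(payload, "<payload>", "exec"), namespace)
    return namespace["result"]

if __name__ == "__main__":
    print(run_dropper())  # prints 45
```

The point is structural: nothing malicious exists in the package a reviewer inspects, and the payload arrives only later on the user's device, which is why upfront vetting alone cannot catch it.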

iOS not having constant issues with malware like Joker inside its App Store has nothing to do with "security through obscurity" and everything to do with making a different set of trade-offs when setting up the security model.


all of this malicious code still requires the user to grant it permissions, or it has to exploit bugs in the operating system. same as ios, and infinitely better than mac, windows, and linux.

apple might pretend they are secure because they usually manage to catch such things during review. this doesn't actually mean they are secure.

end of the day, it's up to the user to choose what software they install, and what permissions they grant.

if your security model includes taking all choice away from users, forbidding them from running software they wish to run, and essentially treating them like unsophisticated toddlers who need your guidance because you know best, then sure, you might view this as a problem. but at that point, you are the problem.


> apple might pretend they are secure because they usually manage to catch such things during review. this doesn't actually mean they are secure.

The easiest way to see that Apple's security model is more robust is that tech websites don't have to warn users to be wary of apps from inside the App Store.


There's little evidence that this isn't simply because Apple is better at policing their store. It probably also helps that an Apple developer license costs $99/year, whereas Google Play has a one-time $25 fee. Keep in mind that the Play Store is just one of Google's many endeavors, whereas the iPhone is Apple's premier product, and as such, one of their top priorities.

Regarding dynamic native code execution, please see saagarjha's comment and my reply.

> Apple's security model

It would be more accurate to say "the iOS security model", because as beeboobaa3 mentioned, macOS fully allows apps from outside the App Store, dynamic native code execution, and most other "insecure" things that are blocked on iOS.


Downloading executable code is irrelevant; it’s easy to alter app behavior dynamically on either platform.


I agree. But it's at least worth noting that Google has taken steps toward blocking this as well, and it will likely be fully blocked in a future release of Android. From what I understand, it's partially meant to protect apps from themselves rather than to protect the OS from apps. It also breaks legitimate apps like Termux, which many Android users see as a major regression. I personally think it's just another example of Apple-ish security theater, but Google has been known to copy some unfortunate things from Apple in an attempt to mirror its success (see also: headphone jack, SafetyNet). Regardless, it goes to show that Android security is still evolving, and the referenced 2020 article likely doesn't reflect the current state of things.



