Sunday, January 28, 2018

How can peer group analysis address malicious apps?

Google has had issues in the past with malicious Android apps found in the Google Play Store. The company has since turned to machine learning, peer group analysis and Google Play Protect to improve the security and privacy of these apps. By using these techniques, Google is taking a proactive approach to stop attackers from publishing apps that could take advantage of users once installed on their mobile devices. This article explains how these measures can increase security, while raising a few questions about Google's vetting process.

By using machine learning and peer grouping, Google aims to discover malicious apps by comparing an app's functionality to that of similar apps, and then flagging anything that falls outside the norm for its category. Machine learning helps review each app's functions and privacy settings against those used by other apps in the Google Play Store.

Peer grouping effectively builds a category for these apps and searches for anomalies in new apps entering the store. It establishes a baseline of what is considered normal activity for each group and compares each new app's behavior against that standard. In theory, apps in the same peer group should behave similarly, and abnormalities are flagged for review by Google.

An example of this would be a flashlight app that requests access to your contacts, GPS and camera. There is essentially no need for such an app to have permission to access these functions, so it would be flagged by peer group analysis as something outside the norm.
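To make that intuition concrete, here is a minimal sketch of a frequency-based permission check, assuming peer groups and declared permissions are the only signals. The app names, permission strings and threshold are illustrative; Google Play Protect's actual system works on far richer signals than this.

```python
from collections import Counter

# Illustrative peer group: permissions requested by other flashlight apps.
# These apps, permissions and the threshold below are made up for this sketch.
PEER_GROUP = {
    "flashlight_a": {"android.permission.CAMERA", "android.permission.FLASHLIGHT"},
    "flashlight_b": {"android.permission.FLASHLIGHT"},
    "flashlight_c": {"android.permission.CAMERA", "android.permission.FLASHLIGHT"},
}

def permission_frequencies(peer_group):
    """Fraction of peer apps that request each permission (the baseline)."""
    counts = Counter(p for perms in peer_group.values() for p in perms)
    return {perm: n / len(peer_group) for perm, n in counts.items()}

def flag_anomalous_permissions(app_permissions, peer_group, threshold=0.2):
    """Return permissions that are rare among the app's peers."""
    freqs = permission_frequencies(peer_group)
    return sorted(p for p in app_permissions if freqs.get(p, 0.0) < threshold)

# A new "flashlight" app that also wants contacts, location and microphone access.
new_app = {
    "android.permission.FLASHLIGHT",
    "android.permission.READ_CONTACTS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
}

print(flag_anomalous_permissions(new_app, PEER_GROUP))
# ['android.permission.ACCESS_FINE_LOCATION', 'android.permission.READ_CONTACTS',
#  'android.permission.RECORD_AUDIO']
```

In a real system the baseline would be learned from thousands of apps and many behavioral signals, but the idea is the same: a flashlight app asking for contacts stands out from its peers.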

Personally, I'm a big fan of using machine learning to assist with detection and to guide engineers toward better decisions, but I also believe it is neither a standard nor a framework.

We're also seeing this machine learning functionality used to improve security and privacy within the Google ecosystem of apps. This is a fantastic way to determine potential issues within the app store, but I think requiring particular standards to be in place before apps are allowed to be published may be a better first step in achieving enhanced privacy.

Such standards could include enforcing NIST and OWASP Mobile standards, or validating that all EU apps meet the General Data Protection Regulation -- or, if there's health-related information in the app, that it passes HIPAA-related standards. This would be difficult to enforce, since there might be multiple categories and frameworks the app has to adhere to, but this would take a security-first approach when putting an app through the store for vetting.
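As a thought experiment, here is a minimal sketch of what such a pre-publication policy gate might look like. The app attributes, framework names and rules are hypothetical and only illustrate routing a submission through the standards that apply to it; they are not an actual Play Store process.

```python
from dataclasses import dataclass, field

@dataclass
class AppSubmission:
    """Hypothetical metadata a store could collect at submission time."""
    name: str
    distributed_in_eu: bool = False
    handles_health_data: bool = False
    attestations: set = field(default_factory=set)  # checks the developer claims to have passed

def required_frameworks(app: AppSubmission) -> set:
    """Hypothetical rule set: which attestations an app must carry before review."""
    required = {"OWASP-MASVS", "NIST-800-163"}  # illustrative baseline for every app
    if app.distributed_in_eu:
        required.add("GDPR")
    if app.handles_health_data:
        required.add("HIPAA")
    return required

def vet(app: AppSubmission) -> list:
    """Return the frameworks the app has not yet attested to."""
    return sorted(required_frameworks(app) - app.attestations)

health_app = AppSubmission(
    name="GlucoseTracker",
    distributed_in_eu=True,
    handles_health_data=True,
    attestations={"OWASP-MASVS", "GDPR"},
)
print(vet(health_app))  # ['HIPAA', 'NIST-800-163']
```

The caveat above still applies: an app can fall under several categories and frameworks at once, which is what makes a hard gate like this difficult to enforce in practice.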

Read the rest of the article here: http://searchsecurity.techtarget.com/answer/How-can-peer-group-analysis-address-malicious-apps
