
The Secure Developer | Ep 81

Exposing the SourMint Scandal

with Danny Grander

About this episode:

In episode 81 of The Secure Developer, Guy Podjarny is joined by Danny Grander, Co-founder and Chief Security Officer at Snyk, to discuss SourMint, a malicious SDK that was integrated into popular apps seeing a combined 1.2 billion downloads per month before the Snyk research team exposed it. Here, we summarize the scandal and unpack exactly what SourMint is, with details on how it tracks Android and iOS user behaviour while allowing for remote command execution. Guy and Danny also reflect on the challenge of protecting people who are using old versions of apps that still have the malicious SDK integrated into them.

Tags:

Application Security
AppSec
Open Source
Secure Development
Security Transformation

Episode Transcript

[00:00:16] ANNOUNCER: Hi, you’re listening to The Secure Developer. It’s part of the DevSecCon community, a platform for developers, operators, and security people to share their views and practices on DevSecOps, Dev and Sec collaboration, cloud security, and more. Check out devseccon.com to join the community and find other great resources.

This podcast is sponsored by Snyk. Snyk is a dev-first security company helping companies fix vulnerabilities in open source components and containers without slowing down development. To learn more, visit snyk.io, that's s-n-y-k.io. On today's episode, Guy Podjarny, President and Co-founder of Snyk, is joined once again by fellow co-founder and veteran security researcher Danny Grander. On this episode, they will be chatting about SourMint, a malicious SDK that breached device security and user privacy for billions of users in both the iOS and Android ecosystems. The vulnerability was exposed by the Snyk security team, led by Danny. The team manages and curates the Snyk Intel Vulnerability Database, the most advanced and accurate open-source vulnerability database in the industry.

[INTERVIEW]

[00:01:32] Guy Podjarny: Hello, everyone. Thanks for tuning back in to The Secure Developer. Today we have an unusual episode. Recently, Snyk's security research team unveiled a very significant malicious SDK that we called SourMint. And to talk a little bit about this, how we discovered it, what it means, and what you can do about it, we have Danny Grander, who is Snyk's co-founder and Chief Security Officer. Danny, thanks for coming back on the show; you know, last time we talked about Capture the Flag.

[00:01:58] Danny Grander: Thank you. I’m excited to be here, and especially to talk about this research project that we just recently concluded.

[00:02:07] Guy Podjarny: Cool. So let's dig in. We'll uncover a little bit of what SourMint is, but, just to set expectations, we're not going to dig too much into the technical details; we'll refer you to some places where you can find those. It's more about the implications: what is this malicious SDK, what does it do, how prevalent is it. And we'll talk a little bit about the entities involved, and what you can do to protect yourself or preempt this. So to start us off, just tell us: what is SourMint? What is this discovery?

[00:01:32] Danny Grander: Right. So SourMint is an SDK, a software component, a library from a commercial company named Mintegral, that we identified as containing malicious code, code that was not known to exist in it. There are several components to that maliciousness, but to summarize, we found a backdoor within the SDK that allows remote command execution. And just to say, we're talking about the mobile space, so this is a mobile app SDK. The backdoor practically allowed executing code in any app that the SDK was integrated into. We also found excessive tracking that was performed in a very hidden and sneaky way, without the knowledge of the developers that were integrating the SDK into their apps.

And also, we found similar tracking on Android devices, where any downloads the user performs on their mobile device were tracked and reported back to the company. So this started with one finding, but when we dug deeper, we found more and more stuff. To summarize, this is basically an SDK, plus its server component, that had quite a lot of malicious functionality in it.

[00:04:03] Guy Podjarny: Got it. And this backdoor was introduced through the SDK, but it became a backdoor in the apps then that contained the SDK? Is that correct?

[00:04:12] Danny Grander: True. So first, let's maybe talk about what this SDK is and what the company does. Mintegral is a company spun off from a public company named Mobvista, and their business is mobile advertising. The SDK is something that app developers, the publishers, can integrate into their app to monetize their apps through advertisements. So this is basically an SDK that an app developer can just place into their app and let it know when and how to display different ads. And from then on, when the app is downloaded and used, the advertisement network delivers advertisements into the app. And if there are impressions, clicks, or actions, then the app developer gets paid for those.

[00:05:05] Guy Podjarny: So these are like, my kids have all these free apps where they bounce a ball from one place to the other, some of them are more elaborate, and they keep prompting them with advertisements for other apps. The provider of the SDK helps them show those ads, right?

[00:05:22] Danny Grander: Absolutely. Absolutely. So when we talk about the remote code execution backdoor: now that the SDK is integrated into all those apps, and we're talking about all the popular games you can think of, mostly games but not only, there are photo editing apps, there are dating apps, there are chat apps. All those apps that have the SDK integrated basically have a remote command execution open to Mintegral. But even more severely than that, to any advertiser. You know, anyone can go and deliver an ad, because as an advertiser, that's what you do, you get to choose what to show. And today, ads are what's called playable ads; they contain JavaScript. And that's exactly where the backdoor was within the SDK, which allowed executing native code from within the JavaScript code. So that's practically the backdoor.

[00:06:32] Guy Podjarny: Yeah, are these things sandboxed? I know in browsers, ads get presented and they're contained. You know, they can annoy you, but for the damage they can do around stealing your cookies and all that, there are all these controls, right? Are these things typically sandboxed, and were they sandboxed in this case?

[00:06:48] Danny Grander: So that's a great question. And the answer is, yes, they are typically sandboxed. This is exactly why there is this separation, this clear boundary. Even if I get to render an HTML page and some JavaScript within my app, I'm not necessarily opening execution capabilities to the native code. But that's exactly what happened in this case. The SDK had a JavaScript-to-native bridge that was, again, hidden. You needed to be Mintegral to know how to trigger it. So it was hidden, obfuscated, probably also to fly under the radar of Apple's review. And that's why we consider it a backdoor, because this is a capability that basically only Mintegral could know about, and then they could trigger the correct JavaScript to trigger this native code. But to your question, yes, there was a specific component, a typical kind of JavaScript-to-native bridge, that was responsible for that.
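To make the bridge pattern concrete, here is a minimal, hypothetical Kotlin sketch of the generic WebView bridge Danny describes. The names `AdBridge`, `runNative`, and `adBridge` are invented for illustration; this is not Mintegral's actual code.

```kotlin
import android.annotation.SuppressLint
import android.webkit.JavascriptInterface
import android.webkit.WebView

// Hypothetical illustration of a JavaScript-to-native bridge.
// Not Mintegral's actual code; names are invented for the sketch.
class AdBridge {
    // Any JavaScript rendered in the WebView can call
    // window.adBridge.runNative("...").
    @JavascriptInterface
    fun runNative(command: String) {
        // A benign bridge would allow only a small, fixed set of actions.
        // A backdoored one could route `command` to arbitrary native logic.
    }
}

@SuppressLint("SetJavaScriptEnabled")
fun attachBridge(webView: WebView) {
    webView.settings.javaScriptEnabled = true
    // Exposes the bridge to all JS in this WebView, including
    // third-party ad creatives ("playable ads").
    webView.addJavascriptInterface(AdBridge(), "adBridge")
}
```

Once such a bridge is attached, the sandbox boundary between ad JavaScript and native code depends entirely on what the bridge object chooses to allow.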

[00:07:50] Guy Podjarny: Yeah. So that's pretty scary stuff, right? It means, basically, anybody installing one of these games gets a playable ad, and presumably Mintegral, maybe a bit more easily, and other advertisers, if they're in the loop, or if they're savvy enough, and malicious enough, I guess, to have reverse engineered the Mintegral SDK, are able to just run code on those devices. Is this SDK used substantially? Is it a popular SDK?

[00:08:16] Danny Grander: Yes, absolutely. So just to share some numbers. This SDK is available for both the Android and iOS ecosystems, and it is integrated, in total, into three and a half thousand apps across both. I think Android has the slightly bigger share of the three and a half thousand. But in total, we are talking about three and a half thousand apps. What's interesting, though, is that not every app is equally popular.

Some apps are really popular, and that is what brings us to 1.2 billion downloads per month; this is a monthly number of downloads. And when you look at how long this functionality, this code, existed, we're talking about one year from when the code was introduced until the point we published our findings. So during this one year, there were many billions of downloads of those apps. Obviously, not every month was the same. I assume the company was growing as well, and the popularity was rising. But it is very clear that this is very, very popular. And just to give a few more examples, we're talking about, I think, close to 100 out of the top 500 apps on the App Store. So these are very, very popular games.

[00:09:44] Guy Podjarny: Yeah, yeah. It sounds like it. So we're talking big numbers here, right? This has been around for a year. It gets over a billion downloads a month. It's pretty harsh in what it can do on your device, right? And I believe remote command execution is the end game, the bottom. But even before that, it was doing all sorts of other things, as you called it, excessive tracking and more. How did we find out about it? How did the Snyk research team learn it existed?

[00:10:15] Danny Grander: So the team, the research team, that's what we do. Since we started, we basically go and look into software components. For the most part, these are open-source components available on GitHub and all the source code management systems that are public. Our goal is to find components that have vulnerabilities and make them known; they are not yet known when we hunt for them. But also malicious components.

You know, I remember when we just started with Snyk, we made this presentation together, 'Stranger Danger', and we talked about the vulnerabilities in open-source components. And I remember you added a slide talking about malicious code. We're talking about five years back. And back then, it seemed like, "No, this can't happen. I can't imagine open-source components turning malicious." But looking back at the last two years, this pretty much became a thing.

I mean, if you look at our Vulnerability Database and all the cases that were published around malicious components, we just see a big rise in them, and in all variations of them: code that is inserted as a backdoor to steal Bitcoins, code that is intended to steal information. There are really all the variations, all the cases, and all the methods. Sometimes it's taking over a developer account and inserting malicious code. Sometimes it's being malicious from the get-go and just using techniques like typosquatting, basically introducing a malicious component that has a similar name to the original, legitimate component. We've seen it all.
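As a quick illustration of the typosquatting pattern (every coordinate below is hypothetical), the whole attack can hinge on a dependency declaration that is one typo away from the real one:

```kotlin
// build.gradle.kts -- hypothetical coordinates, for illustration only.
dependencies {
    // The legitimate library the developer meant to pull in:
    implementation("com.example:http-client:2.1.0")

    // A typosquatted package: one transposed letter, published by an
    // attacker, mimicking the real API but carrying a malicious payload.
    // implementation("com.example:http-clinet:2.1.0")
}
```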

So today, when we look at all the components, we search not only for vulnerabilities, we also search for malicious code. And we typically do this at a vast scale; we're not targeting one specific library or one vulnerability type, we basically look at all the code that is available. The analogy I use is that we cast a huge net into the ocean of components. So these are open-source components, but also all components that are available for download in all the major package managers.

In this case, we're talking about CocoaPods. This component, the Mintegral SDK, as it is named in the package manager itself, is closed-source. Well, it was closed-source; they turned it open-source after our publication. And what we looked for, the net we threw, looked for all the different kinds of anomalies, behaviors that differed from other components. In this case, we identified several; you can follow the research write-up and the blog post to read through the details. But this was our starting point. Throughout the five years of doing research at Snyk, that's how we did our research in most cases, basically starting by looking at all the components and then identifying the weird ones. And when we find something that is off, we dig in and do a full analysis, in this case, reverse engineering. This is the first time at Snyk that we actually needed to reverse engineer a binary, because there was no open-source code available.

[00:13:47] Guy Podjarny: Yeah, yeah. Well, I think, first of all, it makes sense that attackers would clue in that, with the large volume of open-source components, or SDKs, closed or open-source, that applications consume today, it's become easier to hide a malicious piece inside of those, right? Because if an attacker makes an effort, they can slip one in. But it's a good thing, I guess, if you can build a sense for what's suspicious and what's not suspicious in those components. As part of that research, there must be recurring patterns.

[00:14:20] Danny Grander: Also, what I find interesting in this case, and this is what made it go undetected for more than a year, is that they actually made efforts to hide it, not only by obfuscating the code and the data, but also by dynamically identifying whether the application is being debugged, whether the device is jailbroken, or whether there is a proxy operating that intercepts the traffic. If they see any of this, they just turn the malicious functionality off, which made it harder for all the dynamic analysis tools, or any type of research, basically, because when you try to look into it, it would go benign. That's why, we believe, it was not identified for all this time.
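To sketch what such evasion checks can look like on Android (a minimal, assumption-laden illustration, not the actual SourMint code), the checks Danny lists map to a handful of standard APIs and heuristics:

```kotlin
import android.os.Debug
import java.io.File

// Hypothetical sketch of runtime anti-analysis checks; not SourMint's
// actual implementation. Each check gates malicious behaviour off when
// analysis tooling is likely present.
fun analysisLikely(): Boolean {
    // 1. Is a debugger attached to this process?
    if (Debug.isDebuggerConnected()) return true

    // 2. Is traffic routed through an intercepting proxy?
    val proxyHost = System.getProperty("http.proxyHost")
    if (!proxyHost.isNullOrEmpty()) return true

    // 3. Is the device rooted? (Crude heuristic: look for an su binary.)
    val suPaths = listOf("/system/bin/su", "/system/xbin/su", "/sbin/su")
    if (suPaths.any { File(it).exists() }) return true

    return false
}

// The SDK would then branch: behave benignly under analysis,
// enable the tracking and backdoor paths otherwise.
```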

[00:15:12] Guy Podjarny: So let's talk a little bit about the players around it. We understand the vulnerabilities; maybe we'll talk victims first, and then we'll talk about the players that can do something about it, and what you can do. Let's start with the victims. You have this big malicious component. Who's the victim here in this whole story?

[00:15:33] Danny Grander: Right, so we see two victims here. First would be the developers and the publishers in the mobile space. These are vendors or individual developers that used an SDK believing it would help them monetize their app. They integrated it into their app but got something else; the deal was slightly different from what they expected. The publisher, the developer, has a trust relationship with their customers, their users. And when they integrated this SDK, they eventually broke this trust, because the app turned out to do more than it promised to do: excessive data collection, exposure to RCE, and all that. So that would be one clear victim.

And the other one would be the end user, the consumer, the one that gets the app. For them, the cost is both the privacy and the security aspect. The privacy cost: their data was collected without their consent, and without the publisher's consent in this case, collected and shared. And there was the possible exposure to remote command execution on their own devices; this time it's their own devices, because the game or the app is installed on them. So if I, as an attacker, get to just steal the clipboard from your device while you're playing an app, this is an exposure to you, the user, the player, the gamer, who is simply a victim in this case.

[00:17:20] Guy Podjarny: Yeah, yeah, I think this is very reminiscent of some previous attacks, like XcodeGhost and a variety of ones before that. Where, as we've talked about in a couple of presentations, developers are a malware distribution vehicle, right? At the end of the day, the developers are the ones being tricked into installing an SDK with legitimate functionality that includes more than what they bargained for. But eventually, it's not about stealing information from, or harming, the developers themselves, but rather using the developers to get to the consumers, to get to those apps. And because developers have such wide reach in what we build today, that damage, that multiplier, can be very substantial.

[00:18:09] Danny Grander: Absolutely. And you see this in the numbers. Basically, you have one SDK that is integrated into thousands of apps. Three and a half thousand, I mean, it's a big number, but it's nothing that scares you when you hear it. But then you look at how many times those apps are downloaded; that's the big multiplier. And yeah, in this case, they were absolutely the distribution vehicle for that code. This is a really good example of that. I absolutely encourage the listeners to watch the talk that, at this point, both of us have given at different conferences, named 'Developers as a Malware Distribution Vehicle.' It covers XcodeGhost, but also older cases, going back to Ken Thompson and Trusting the Trust…

[00:19:00] Guy Podjarny: 'Reflections on Trusting Trust.' Yeah. And just to say, because I guess sometimes the biggest evil hides the lesser evils: we focused a little bit on the remote command execution. But if the SDK was, for instance, installed on my Android device through an app, it basically is siphoning off all the URLs on this device, right? My private Google Docs, things that are private, or secrets that I might have in headers, a lot of that type of information. Right?

[00:19:33] Danny Grander: Right. So that's an interesting nuance that differs between the iOS and Android operating systems. On iOS, when we discussed the RCE backdoor and the data collection, it was in the context of the actual app that the SDK was integrated into. On Android, on the other hand, we have Intents, and Intents are sent globally. So any app can listen, tap into the Intents, assuming they have the permission, and act on them.

So in the Android version of the SDK, what they did is basically track all download events, content URLs. That would include application installations from the Google Play Store, but it would also include downloads you do from your Google Drive, or, just like you mentioned, a Google spreadsheet or Google Doc. These URLs were intercepted and sent to Mintegral, regardless of where you opened them. So now you don't have to be playing a game for me to steal your data. You can just open WhatsApp, or your preferred chat app, click on a Google Doc link, and this event is captured by the SDK of some game that you installed a long time ago, that you actually installed for your kid to play. And now that you open the URL, it gets tracked and reported to Mintegral's servers.
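For readers who want to picture the mechanism, here is a minimal Kotlin sketch of one way a library embedded in any app can observe download events system-wide via a broadcast Intent. This is an illustrative assumption about the general Android pattern, not the confirmed SourMint implementation:

```kotlin
import android.app.DownloadManager
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter

// Hypothetical sketch: an SDK inside any app observing system-wide
// download-complete broadcasts. Not SourMint's actual code.
class DownloadObserver : BroadcastReceiver() {
    override fun onReceive(context: Context?, intent: Intent?) {
        if (intent == null || intent.action != DownloadManager.ACTION_DOWNLOAD_COMPLETE) return
        val downloadId = intent.getLongExtra(DownloadManager.EXTRA_DOWNLOAD_ID, -1L)
        // A malicious SDK could resolve this download's URI here and report
        // it to a remote server, regardless of which app on the device
        // actually triggered the download.
    }
}

fun register(context: Context) {
    context.registerReceiver(
        DownloadObserver(),
        IntentFilter(DownloadManager.ACTION_DOWNLOAD_COMPLETE)
    )
}
```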

[00:21:15] Guy Podjarny: So let’s talk about — what can we do about this? So who could have prevented this problem? How do we defend ourselves?

[00:21:25] Danny Grander: I think there are two personas here, two answers to this question. One would be the actual marketplace vendors. We're talking about Google with Google Play, and Apple with the App Store. They already have all their code reviews, all the checks; you can report an app. So they already have this process. Sometimes it misses things; sometimes they change their processes. They are responsible for keeping the ecosystem, the marketplace, clean of all types and varieties of malicious and, you know, bad apps.

We initially disclosed to Apple, and actually, in the beginning, they didn't take hard action. We thought that they could at least notify the developers about the issue. But when we found the RCE backdoor, that's when Apple acted swiftly. They sent out an email to all the affected publishers about the code that allowed for RCE, asking them to remove it and submit the app for expedited review. And the same goes for Google. They took this matter really seriously and did their own investigation; they are absolutely working hard to keep their ecosystem and their marketplace clean of such things. So that's one.

The second group that can improve this is the developers, the publishers, who are integrating components, all code components, but SDKs as well. For them, the first thing is understanding what you bring into your app. That's one big challenge: you pull in all those components, and those components pull in other libraries or other components. The same goes for SDKs. If you're in the mobile space and you integrate a mediation platform SDK, you end up with it pulling in advertisement network SDKs. So first, understand what you bring into your app, and of course take ownership, understanding that this is now part of the app that you deliver to your end users, to your customers.
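On Android, for example, one minimal way to see what an SDK actually drags in is to inspect the resolved dependency tree; the coordinates below are hypothetical:

```kotlin
// build.gradle.kts -- hypothetical coordinates, for illustration only.
dependencies {
    // Integrating one "mediation platform" SDK...
    implementation("com.example.mediation:mediation-sdk:3.0.0")
    // ...can transitively pull in several ad-network SDKs you never
    // declared yourself. List the full resolved tree with:
    //   ./gradlew :app:dependencies --configuration releaseRuntimeClasspath
}
```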

And this is exactly what Snyk, the SCA, the open-source product, helps developers do: basically shed light on all the components that are integrated into the app, and make it actionable and easy to fix and get rid of vulnerable components, but of course malicious ones as well.

So those are the two players I see that can do more to stop these things. And just to add about developers and publishers, what was really great to see is that we talked with all the major publishers in the mobile gaming space, and all of them took this really seriously and acted immediately. We've seen vendors rotating tokens, removing the component, really understanding the impact of what just happened, and acting on it. So seeing all this awareness of such cases rising in the mobile ecosystem was really great.

[00:24:47] Guy Podjarny: Yeah, I agree, that was very uplifting to see. I guess, to an extent, we thought Apple could have acted more harshly when we had the original findings. But on the flip side, it was actually really reassuring to see the developers of those apps address this with proper seriousness, including trying to understand what type of data might have been leaked, but also quickly removing the SDK from their apps, or taking actions on it. Both of those make sense to me.

But there's this straggler effect, right? Apple goes off and sends a message to everybody that they're going to remove apps with the malicious code from their App Store. Some developers will quickly publish a new version. But some developers might not, and their app might be removed from the App Store. Not all of these businesses are doing massively well, right? Maybe they haven't gotten to it. Similarly, some users might not update to the newer version right away. Isn't that still a liability? Isn't it likely that there are many millions, if not billions, of devices out there that have apps with the malicious code and are not getting an updated version, so they're just sitting there to be hacked by anybody exploiting this playable ad vulnerability? Who can do something about that?

[00:26:14] Danny Grander: So it's interesting. I think what Apple did, at least that's how I read their notification, is basically say, remove the component or we'll remove the app from the App Store. So you're asking about all the apps that are already installed.

[00:26:30] Guy Podjarny: Some of those are already installed, right? There's a whole story at the moment about Epic and Apple, this big battle going on. And one of the claims from Apple and Google, to justify the fees that they take from developers, is that they maintain the security of those apps; providing that security assurance is one of the services they offer as part of the App Store. And I think, in this case, you see that happening with Apple to an extent, because, again, you can argue about the exact urgency, but they did send this note out; they are protecting future users from downloading this malicious SDK. But there's this missing gap around everything that has previously been installed. So that's what I'm asking: who can help those users? If I'm concerned that on my iPad there are all sorts of games installed for my kids, and one of those is using that SDK, and it just has no new version, what then?

[00:27:29] Danny Grander: Yeah, sadly, this is something that has no easy solution. It lies with the vendors, with Apple and Google, to handle. I think Apple has the capability, I'm not sure about it, but I think they do have the capability, to remove apps remotely. It does sound aggressive in a way, right? This is an app you installed; perhaps you even paid for it. And now it can get deleted? For a good reason, for your own sake, but I'm not sure they're going to use this in this case. So we'll see. Actually, it would be interesting to see, through usage stats, how many games that still have an older version of the SDK are still actively used on devices. They did protect future users, but it is an open question for the ones that already have the game, where the vendor, the publisher, didn't bother to update it and publish a new version.

[00:28:36] Guy Podjarny: It's an interesting ownership challenge as well, right? At Snyk, we oftentimes talk about how, when you use an open-source component, you need to own it, and you need to run it. If you're downloading an app from the App Store, Apple, and Google as well, will make sure, to the best they can, that the app getting downloaded is legit. But if a significant vulnerability is in it, or if a piece of one of those apps was even malicious, who's responsible? Who's the one that should notify the user, integrate some update, or deprecate it? I think there's a whole world of additional security scrutiny, or security maintenance, that is overlooked right now in the mobile space. We can probably spend another hour talking about the research. I think this is an amazing job by the security research team, but also, I think, a demonstration, a bit of a spotlight, on a bunch of these complexities of malicious SDKs and malicious components, especially in the mobile space, but also broader. If somebody does want to learn more, where can they find some of those details or dig deeper?

[00:29:46] Danny Grander: Right, so first, we have several blog posts covering all the findings in this research. You can just go to snyk.io/blog and see the recent post talking about the RCE, the backdoor, in SourMint; that's the code name for the project. In addition to the blog posts, we have published research write-ups, basically a page that has all the technical information about all the findings from this process. This you can find on snyk.io/research; there will be a page with our SourMint technical research write-up. Also, just last week, I gave a talk at SnykCon, our very first SnykCon. It's a short, half-hour talk covering the process, the timeline, and our findings, so that's another place where you can find more information. And then finally, like we mentioned before, I would really encourage listeners to check out the 'Developers as a Malware Distribution Vehicle' talk, which covers other cases of malicious components that are very similar in a way to what we've seen here.

[00:31:06] Guy Podjarny: Yeah, thanks. A lot of deeper dives for those who want them, and we'll make sure we publish all of those links in the podcast show notes. Danny, thanks for coming back on the show and reviewing this great research project.

[00:31:48] Danny Grander: Yeah, absolutely. That was really fun. And thank you.

[00:31:21] Guy Podjarny: And thanks, everybody, for tuning back in. And I hope you’ll join us for the next one.

[END OF INTERVIEW]

[00:31:29] ANNOUNCER: Thanks for listening to The Secure Developer. That's all we have time for today. To find additional episodes and full transcriptions, visit thesecuredeveloper.com. If you'd like to be a guest on the show, or get involved in the community, find us on Twitter at @DevSecCon or visit devseccon.com. Don't forget to leave us a review on iTunes if you enjoyed today's episode. Bye for now.

[END]

Danny Grander

Co-founder at Snyk

About Danny Grander

Danny Grander is a veteran security researcher and the co-founder of Snyk.io, where he works on open source security and leads Snyk's security research. Previously, Danny was the CTO of Gita Technologies and a lead researcher and developer for a few startups. Danny is a frequent Capture the Flag participant; his team, pasten, won both the Chaos Computer Club and Google Security CTFs.

The Secure Developer podcast with Guy Podjarny

About The Secure Developer

In early 2016, the team at Snyk founded The Secure Developer podcast to arm developers and AppSec teams with better ways to upgrade their security posture. Four years on, the podcast continues to share a wealth of information. Our aim is to grow this resource into a thriving ecosystem of knowledge.

Hosted by Guy Podjarny

Guy is Snyk's Founder and President, focused on helping developers use open source and stay secure. Guy was previously CTO at Akamai following their acquisition of his startup, Blaze.io, and worked on the first web app firewall and security code analyzer. Guy is a frequent conference speaker and the author of the O'Reilly books "Securing Open Source Libraries", "Responsive & Fast", and "High Performance Images".
