AppForce1: news and info for iOS app developers

Jeroen Willemsen, principal security consultant at Xebia

March 11, 2021 Jeroen Leenarts

Jeroen Willemsen and I worked together on a security-related mobile networking SDK. Based on that working history and Jeroen's work on the OWASP MSTG, I wanted to get Jeroen on my podcast to talk about how to get started with mobile security testing on iOS.

Topics mentioned in the episode:

Runway
Put your mobile releases on autopilot and keep the whole team in sync throughout. More info on runway.team

Lead Software Developer 
Learn best practices for being a great lead software developer.

How to Start a Podcast Guide: The Complete Guide
Learn how to plan, record, and launch your podcast with this illustrated guide.

Support the Show.

Rate me on Apple Podcasts.

Send feedback on SpeakPipe
Or contact me on Mastodon: https://hachyderm.io/@appforce1

Support my podcast with a monthly subscription, it really helps.

My book: Being a Lead Software Developer

Jeroen Leenarts:

So welcome to another special episode of my podcast. I'm sitting here with Jeroen Willemsen. He's an old colleague of mine, and we did a lot of work together on a mobile security related SDK. Really cool stuff. He's still into security, and he has worked a lot on the Mobile Security Testing Guide (MSTG) from OWASP. He has since left that project, but he's done a lot of good work on it. And I wanted to dig into security with him today. So, Jeroen, hi, how are you doing?

Jeroen Willemsen:

Oh, thank you very much for the beautiful introduction. It's an honor to be here. Yeah, well, given the lockdown and two young children, we've been homeschooling, you know? And are you pretty much in the same situation?

Jeroen Leenarts:

Pretty much the same, yes. Maybe it's a bit confusing for our listeners, but Jeroen and I have the same first name. So if you hear a lot of "Jeroen" in this podcast, sorry about that, but Jeroen is quite a common name in the Netherlands. I'm also dealing with a lockdown with two young kids at home, but we're managing, getting through things. So just to get things started: at Xebia, your current employer, what is your role there?

Jeroen Willemsen:

So I'm a principal security architect there, which means I do consulting across a broad range of security topics. One of the things I focus on currently is setting up and training security teams, or helping existing security teams. We have a lot of different chores in that sense to raise the security bar at the customer, which involves a lot of things: from infrastructure to processes to web applications, but luckily also mobile applications, which we hope to zoom into today. All those sorts of things, basically.

Jeroen Leenarts:

And when we worked together on mobile security, developing that SDK, you were involved with the Mobile Security Testing Guide from the OWASP organization, correct?

Jeroen Willemsen:

Yeah, yeah, that was actually a very, very old version, when we still had the Google Doc back then. And while we were working on the SDK, I believe we had the migration to the new project setup on GitHub, which back then was led by Bernhard. It was fun. And later on, I stepped up to help Sven and share leadership of the project, because I mostly joined the project to help out Sven. I really saw what he was trying to accomplish, and it was a great idea, so I really wanted to chip in and push it to the next level, basically.

Jeroen Leenarts:

So how long were you in a leadership position on the Mobile Security Testing Guide?

Jeroen Willemsen:

If you hear me typing, it means I'm checking LinkedIn, because I'm not sure, apologies. I think it was about two years. I've been leading the project for two years, yes. Because what I really envisioned was: either you're an active leader, or you actively step down. The last thing I wanted was to be one of those slow leaders that people start communicating with and then get no response at all, so they wonder, is this project dead or something? Because the project is anything but that. I just didn't have the time in my personal life to contribute as much as I would like, to actually keep it the living standard it should be.

Jeroen Leenarts:

So you looked for a successor, transferred all the ownership, stepped down, and now support where you can if you ever get a question on the MSTG?

Jeroen Willemsen:

Well, partially, actually. So in that sense, Sven was still a leader. I just said, okay, I can't help at the level I want to, but I don't want to be that slumbering leader while you do the work. And then we had a conversation about, hey, who could step up to what might already be a leadership role and move ahead. And I think during that process, Carlos got more involved over time, Carlos Holguera, I hope I'm pronouncing his Spanish surname correctly. And basically, that led to him becoming a new project lead. And then Jeroen Beckers stepped up to become more active again. Jeroen Beckers had been active in the project while I was leader as well, then did a lot of other things, and when I stepped off, he basically became more involved again.

Jeroen Leenarts:

So you could say that you left leadership on the MSTG in good standing and made sure that it was well taken care of.

Jeroen Willemsen:

As far as I could, yes. But still, a lot of the credit should go to Sven Schleier, who is always trying to make sure that stuff keeps on working.

Jeroen Leenarts:

Okay. So we've mentioned this testing guide quite a lot already. I'm an iOS software developer, and I reckon that most of my listeners are also iOS software developers. Why should I listen to what the Mobile Security Testing Guide has to say?

Jeroen Willemsen:

So the Mobile Security Testing Guide basically helps you to assess the security of your application. Of course, compared to the Android ecosystem, you're already in a much safer place, because there are manual reviews if you have an App Store based app and not an enterprise-licensed, separately distributed application. But still, using the MSTG can help you to further strengthen the security of your application, which becomes particularly important if you have to deal with personally identifiable information or financial transactions or whatever. And it can actually help you a bit when it comes to creating your own game, making it less copyable, and stuff like that. So, protecting your IP a bit.

Jeroen Leenarts:

Okay, and how do I use the MSTG?

Jeroen Willemsen:

So the last thing you should do is, right now, open up the MSTG and read the whole book, because it's large. I actually think I have a physical copy over here; I'm not sure if you can see it on video. Okay. So the MSTG is like a big Bible right now, and going through all of it would be a pain. So what I recommend is this: start by doing a threat modeling exercise, and then see what you're afraid of. Based on that, you open up the Mobile Application Security Verification Standard, or MASVS. You select the controls you believe are important, and then you open up the MSTG to understand: how do I implement them, or how do I validate them?

Jeroen Leenarts:

So the MSTG is basically reference material that contains guides and principles that you can use within your app to cover your bases on specific topics related to app security.

Jeroen Willemsen:

Exactly, exactly. So we have, well, OWASP has, I used to be part of that "we", a bunch of things in the MSTG, varying from a description of, hey, these are hacking tools you can use to verify whether you did it correctly, to a description of, hey, if you want to implement security control X, Y, or Z, this is how you do it. Or these are the references, for instance for iOS developers, to the Apple documentation on where to find the controls and how to implement them. And if you want to validate this, look for the following things, or look at the following tool, which describes exactly what you should look for and what it will check for you. So in that sense, it gives you, at various granularity levels, some pointers on how to move forward, basically.

Jeroen Leenarts:

Okay. Let's switch gears a little bit, because I will make sure to reference these materials from my show notes. As a security consultant, you do a lot of reviews of codebases, including a lot of mobile applications, both Android and iOS. What are the biggest mistakes that you come across?

Jeroen Willemsen:

So one funny mistake we see a lot, unfortunately on Android, not necessarily on iOS, is that people try to aggressively minify the application to make it harder to read. And then they destroy it, so the functionality no longer works. That's basically the worst QA issue you see often. A main challenge that is also seen very often is that they forget to encrypt some data while they store it somewhere more publicly available, like the file system, without setting the proper ACLs. For iOS, you can set proper ACLs, right? But then they don't set ACLs, and they don't apply encryption, so basically it's pretty much shared. We see a lot of custom jailbreak detection work, which is not effective and will just delay the testing process. We see people talk about public key pinning, because they think it's a good idea, but then they implement certificate pinning on a certificate that lives for three months, which eventually means the whole app comes to a grinding halt within three months. We see all those types of small things, basically. And then, of course, wrongly applied encryption. Or worse: having the authorization and authentication checks in the app and not baked into the APIs, which means once you got into the app, you got into other people's data. Those types of things still happen a lot.

Jeroen Leenarts:

So just to unpack a bit what you're saying there: you mentioned ACLs on the file level. I think that's called file protection on iOS, correct? Yeah, yeah.

Jeroen Willemsen:

So, similar to the keychain, the file system also has a few ACLs you can set: hey, when is this file allowed to be accessed, and what does the system have to do in terms of locking and synchronization? And if you don't set any of that, it becomes a bit more open, basically.
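As a concrete illustration of those file-level ACLs, here is a minimal sketch of opting a file into iOS Data Protection. It assumes an iOS target for the protection attribute itself (the option only exists on Apple mobile platforms, so the sketch falls back to a plain write elsewhere), and the file name is just an example:

```swift
import Foundation

// Sketch: write data with a Data Protection class attached. With
// .completeFileProtection the file is encrypted on disk and unreadable
// while the device is locked; without any class it is "a bit more open".
func writeProtected(_ data: Data, to url: URL) throws {
    #if os(iOS)
    try data.write(to: url, options: [.atomic, .completeFileProtection])
    #else
    // Data Protection classes do not exist on this platform.
    try data.write(to: url, options: [.atomic])
    #endif
}

let url = FileManager.default.temporaryDirectory
    .appendingPathComponent("session-token.bin")
do {
    try writeProtected(Data("secret".utf8), to: url)
    print("wrote \(try Data(contentsOf: url).count) bytes")
} catch {
    print("write failed: \(error)")
}
```

On iOS you can also set a default protection class for everything the app writes, and then tighten or loosen it per file like this where a file needs a different class.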

Jeroen Leenarts:

Okay, yeah, because you configure a default in your Info.plist, and if a file already exists because of a previous version of the same app, stuff is still not protected to the level that you might expect at that time.

Jeroen Willemsen:

Yeah, it's actually good to mention Info.plist, because that's the worst: we still see a lot of data just stored in plists, which is not really protected that well.
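To make that point concrete: property lists are a plain serialization format, so whatever you put in one ships readably inside the .ipa. A small sketch (the token value is a made-up example):

```swift
import Foundation

// Sketch: a "secret" written to any plist is stored as readable XML.
// Unzip the .ipa (it is just a zip archive) and it is right there.
let settings: [String: Any] = ["apiToken": "hunter2-example"]  // made-up secret
let plistData = try! PropertyListSerialization.data(
    fromPropertyList: settings, format: .xml, options: 0)
let xml = String(data: plistData, encoding: .utf8)!
print(xml.contains("hunter2-example"))  // the secret is plainly visible
```

The same holds for NSUserDefaults, which is backed by a plist as well.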

Jeroen Leenarts:

Anything in the Info.plist is part of the manifest of the app, so it's publicly visible without any issue, really. Another thing that you mentioned was pinning that is implemented either incorrectly or on very short-lived certificates. I can imagine with short-lived certificates of three months that it's an issue if the pin expires and you haven't updated your app yet. But what are good practices when you're dealing with certificate pinning?

Jeroen Willemsen:

So a good thing in general is to not use certificate pinning, where you basically compare the full certificate to what you have. If you really want to make sure that you're working with the right key, use public key pinning, so you pin to the public key. Of course, the real challenge then is to ask yourself: can the backend, or whatever infrastructure is out there, protect the private key well enough to keep it living for a long period of time? And is it worth it then, as a defense in depth mechanism? But again, yeah. So the best trick is: always stick to public key pinning, ask yourself how long a public key should live, and make sure you can prepare in time, with a backup key already in your application that you would also pin to. So the moment you do have to rotate, you already have your next key pair ready.
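The rotation-friendly setup described above can be sketched as a pin set that ships a backup pin alongside the current one. In a real app, the pins would be SHA-256 digests of each certificate's SubjectPublicKeyInfo, checked from a `URLSessionDelegate` challenge handler; the byte values here are placeholders:

```swift
import Foundation

// Sketch: public key pinning with a pre-shipped backup pin, so rotating
// to the spare key pair does not brick app versions already in the field.
struct PinSet {
    let pins: Set<Data>  // hash of the current key plus the backup key

    // Accept the connection only if the server's key matches a known pin.
    func allows(spkiHash: Data) -> Bool {
        pins.contains(spkiHash)
    }
}

// Placeholder values standing in for real SHA-256 SPKI hashes.
let currentPin = Data([0x11, 0x11, 0x11])
let backupPin  = Data([0x22, 0x22, 0x22])
let pinSet = PinSet(pins: [currentPin, backupPin])

print(pinSet.allows(spkiHash: currentPin))   // current server key: accept
print(pinSet.allows(spkiHash: backupPin))    // after key rotation: accept
print(pinSet.allows(spkiHash: Data([0x99, 0x99, 0x99])))  // unknown: reject
```

Because the backup pin is only a hash of a public key, shipping it leaks nothing; the spare private key can stay offline until rotation day.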

Jeroen Leenarts:

And that's a good one, actually, because at my current day job, we're looking into transitioning to a three-month certificate. And we're having some challenges there with making sure that it is properly supported for end users. From a security perspective, I can imagine why they want to do it, but we need to have policies and practices and mechanisms in place to make sure that we don't lock our end users out of our app. You also mentioned defense in depth. It's a term that you hear quite often if you do anything security related. I myself listen to a lot of different podcasts, among them, for instance, Risky Business, which is a very good podcast with a lot of information on what's happening in security land. And they also mention defense in depth a lot. But what is defense in depth, for any layman out there?

Jeroen Willemsen:

So defense in depth: if you just dissect the words, it already tells you a bit. You have your first line of defense, but then you dig deeper, as in depth, and you add new lines of defense. For instance, let's look at the network connection you set up. I won't say you should always implement extra data encryption yourself, of course, because you can make so many mistakes with that. But your first line of defense is making sure you are talking to the right server. So you're using TLS: you have your basic TLS-level encryption underneath, and you have your basic handshake from your iOS client towards the server. That is already your first line of defense; you already have some form of encryption, because the system normally checks certificates and so on. Now, of course, this is only the first line of defense, because there may be other threats or risks. Like the risk of being in a country where the network is controlled by somebody else, and there's a certificate authority in the device's trusted list that is controlled by the government, and the government might issue something. Of course, that hardly ever happens. A much bigger problem is, for instance: you share your device, or somebody wanted to play a free game, and to run that game they had to install a profile. With that profile installed, they could run the app, beautiful. The only thing you, or the person you shared your device with, didn't understand is that sideloading that application meant adding a certificate to your trust store from somebody who is trying to man-in-the-middle you. And while being man-in-the-middled through that new CA, your TLS first line of defense no longer works.
Of course, the simplest control is educating your users never to do this, but unfortunately, these shady app stores never die. So you need an added line of defense next to your TLS to make sure that you're still safe against a man in the middle on a public Wi-Fi while such a profile is installed. You see, we already went through a lot of steps for this fairly small corner case of risk, where the network is controlled by somebody else and you installed their profile. If your app is really relevant in terms of the data exposed, it becomes important to have some other layer, and that, of course, is pinning. So now we add an additional control at the network level: let's have certificate pinning, or public key pinning, in there, so that the moment we get a man in the middle, or something that tries to offer us another TLS certificate, we can negate that. As you can see, we have TLS as the basic line of defense, and we add public key pinning as a defense in depth mechanism. That's how defense in depth basically works. The same thing holds, of course, for encrypting the stuff on your phone. If you have your data stored in the keychain with proper ACLs, you already have a lot of defense, because the key management is basically done for you; you instruct the keychain how it should work. On top of that, the ACLs determine which app can access your keychain items, and you can add extra protection, like biometrics. And that's where we start talking about additional protection. Another additional protection assumes the keychain fails anyway: somebody is able to see your passcode, or someone runs a keychain dumper on a jailbroken device.
Say they already have your passcode because they saw you enter it, or because they asked you to unlock your phone for, I don't know, an inspection at the airport, whatever. Then an additional control could be, and as you can see, we have already narrowed the case a lot, so it is in depth for the specific scenario you're afraid of, to have, for instance, your own encryption on the materials within the keychain. The general base level of making sure only your app can access that keychain item, and that it is protected by a passcode or biometrics, already provides the first layers of protection. Defense in depth is then the next level: adding more protection because you're afraid of a specific scenario. At least, that's what you're supposed to do. Otherwise you end up developing a lot of stuff that you might never need. For instance, a marketing app with encrypted keychain items for the content of the marketing campaign: I'm not sure that's worth your developers' time.
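The keychain layering described above, an item only readable while unlocked, on this device, behind user presence, looks roughly like this. It is a sketch against the Security framework (Apple platforms only, hence the compile-time guard), and the account name and status strings are just examples:

```swift
import Foundation
#if canImport(Security)
import Security
#endif

// Sketch: store a secret in the keychain with an ACL that requires the
// device to be unlocked, forbids migration to other devices, and demands
// user presence (passcode or biometrics) when the item is read back.
func storeSecret(_ secret: Data, account: String) -> String {
    #if canImport(Security)
    guard let access = SecAccessControlCreateWithFlags(
        nil,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        .userPresence,
        nil
    ) else { return "acl-failed" }
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrAccount as String: account,
        kSecValueData as String: secret,
        kSecAttrAccessControl as String: access,
    ]
    return SecItemAdd(query as CFDictionary, nil) == errSecSuccess
        ? "stored" : "add-failed"
    #else
    return "unsupported-platform"  // keychain APIs are Apple-only
    #endif
}

let result = storeSecret(Data("refresh-token".utf8), account: "example.account")
print(result)
```

The defense in depth step on top of this would then be encrypting the value itself before it goes into `kSecValueData`, for the narrowed scenario where the passcode is already compromised.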

Jeroen Leenarts:

Yeah, it is of course a lot of complexity that you add, and you should only add this complexity if there's a meaningful reason to actually do so. Exactly. One of the things that you see a lot is that third-party dependencies are being used. For instance, you have an API gateway provider, or you're just grabbing some random dependency from the internet, which you hopefully have checked. What are the drawbacks of using an SDK like that within your app? Are there any risks that you're taking there?

Jeroen Willemsen:

So I think a lot is said about what we call supply chain risk. Another party gets compromised, or has an internal attacker, or somebody takes over an open source library and starts injecting funny things. And that funny stuff, call it malware, call it a backdoor, call it a crypto miner, whatever it is, then piggybacks into the SDK, and then piggybacks into your application. And that can be a risk anywhere, of course. You have your App Store review, so some of that stuff might be picked up there, whereas on Android we have the Bouncer that might pick up something as well. But eventually, you're still responsible for validating any SDK you include. It would be wise to use something like OWASP Dependency-Check, which can check CocoaPods, though I'm not sure if it can check Swift Package Manager or Carthage. But luckily, you can also just search for possible CVEs on that SDK if it's widely used. And I believe GitHub also introduced a system for better transparency in these matters. And of course, firing up the tools and seeing what the SDK actually does might be very welcome, not only from a security perspective, by the way, because sometimes some of those SDKs are programmed in a way that slows you down or eats a lot of resources for no reason, which might impact your usability on, let's say, older devices, if you want to support those.

Jeroen Leenarts:

Yeah, I know what you're talking about with what GitHub is doing. They have a dependency checking tool that's already rolled out for a lot of languages and frameworks, but I think they're still working on adding support for iOS and Mac related technologies. I think it is on their roadmap, so it's definitely something to keep an eye on. And the Dependency-Check tool by OWASP actually has a sort of beta plugin for, I think, CocoaPods-related things. But I don't think the backing data of that tool is maintained to such a level that you can really rely on it; it's still very experimental. But let's hope that something you can just embed in your CI becomes available, so that you don't have to think anymore about running checks, and you only have to respond to signals thrown out by your continuous integration.

Jeroen Willemsen:

And, by the way, GitHub has a flow for something like that: you can report security issues, basically. Of course, you can also just report them as bugs if you want to watch over the SDK, if it's an open source one. With a closed source one, it becomes much harder; then you basically have to do a lot of the work yourself.

Jeroen Leenarts:

Let's suppose I am an iOS developer, and I'm working with an SDK. Let's for this case assume an open source one. You're working with it, and at some point, you notice that there's a glaring security hole in the SDK. What should I do as a developer at that point in time?

Jeroen Willemsen:

Well, the first thing is, of course, to assess what that glaring security hole means. What are the risks involved, and how does it impact your application? If it has a big impact, as in credentials might get loose, or there are backdoors, or stuff you thought was stored securely is not stored securely, try to rotate whatever was compromised, of course. That's one thing: try to contain your current situation, so you can be sure you can move ahead with a secure application. And of course, depending on the data, you have GDPR requirements, yada yada. But let's focus more on what you can do with the SDK, as you asked. Often the authors of the SDK, especially in the open source world, are very willing to listen to you when you file a bug and try to contact them. But sometimes it might be just as easy to file a merge request. Let's say it's caching some data somewhere, and it's just caching it as a plainly written file; it might already be worthwhile to provide some keychain-based backing for the blob that has to be secured, those types of things. Some of my customers have become active contributors to one or two of those SDKs that way, from removing an arbitrary-loads configuration for ATS, all the way to, hey, maybe you should put this in the keychain and not just leave it publicly available throughout the app, basically.

Jeroen Leenarts:

Okay. So are there any benefits to a closed source SDK? Is it more secure, or not?

Jeroen Willemsen:

Well, when it comes to security, it's definitely not more secure. There's a pattern that we often see. Closed source actually means you only get the compiled framework, basically, to include. And then, sadly, once we create a very small sample app, load the SDK inside, and run it through Ghidra, we can basically see what it does. Yes, Ghidra is by the NSA, blah blah blah, but if you firewall it, what would it leak? I mean, seriously, not really a problem as far as I could see. At least you can tell what the application, or the SDK, is doing. And then often you find things that are quite disturbing, that you would rather have seen in code. That doesn't mean that every closed source SDK is a problem, of course. It just means you have similar risks, because the people developing it might make the same mistakes as the open source authors. The only difference is that now you can't see it in code; you have to find it through reverse engineering tools, or you create a sample app and try to call the different methods, or you patch it with Objection, so with the Frida gadget inside, and you can see dynamically what the SDK is doing. And still, you'll find the same disturbing problems. What's worse is closed source developers that don't want to share the source code for review when your application has serious security risks. Because that means you almost always have to go through the reverse engineering exercise, which isn't that hard, actually. I mean, I'm not saying it's easy. I'm just saying, if you're a developer, you can learn this. It's not that hard to understand the basics and to find some pointers to what's going wrong. And the moment you do that, you'll have to contact your closed source supplier anyway, but you just wasted far more time on reverse engineering.
So what I'm really hoping is that eventually even closed source suppliers will, at least when you request it for security reasons, do a session with you to go through their code. They don't need to hand the code over to you, but at least give more transparency, because you'll find out eventually, no matter what they do.

Jeroen Leenarts:

Okay. So that's a lot to take in on the technical level of security already. I think we've exhausted our listeners enough with technical detail, and I will make sure that all the reference materials are in the show notes. But I wanted to dig into you as a person a bit as well. How did you actually become one of the principal security architects within Xebia?

Jeroen Willemsen:

That's a very good question. I think it grew naturally at Xebia. And of course, I don't want to commercialize anything or say, hey, come work for us. But what basically happens at Xebia is that we're stimulated to grow, we're stimulated to do stuff. We get educational budgets for that, we get lots of opportunities to be on stage, and we're also trained a bit, or coached, on that. You can tell, given my first talk and my later talks, that I have been trained a bit; at least I see a large difference. And I'm sorry that the first talks are still online on YouTube, but hey, if you want to have fun watching them, you'll see somebody clowning. But that really helps to grow, and eventually, if you stick long enough with such a company that helps you grow, you also grow internally to that level. And of course, you have to be a bit ambitious to grow. The moment I became a husband, and later a father, I got very motivated to make sure that, you know, you can provide properly and stuff like that.

Jeroen Leenarts:

Okay, and how long have you been working with Xebia now?

Jeroen Willemsen:

Eight years, so the longest time ever.

Jeroen Leenarts:

I was already working there when you started, or did you come in like one or two years later? I can't remember anymore. Such a long time ago.

Jeroen Willemsen:

Yeah, my first years at Xebia were shared with you; you had already been there for a long time.

Jeroen Leenarts:

And what did you do before joining this consultancy company?

Jeroen Willemsen:

I worked at another consultancy firm called Domus Technica, and I was in training, working towards becoming a security consultant, and more or less moving towards enterprise architect.

Jeroen Leenarts:

So you already had a security focus before Xebia, a long time ago. And what did you do before Domus Technica?

Jeroen Willemsen:

I worked at Capgemini, and I worked at various other places, mostly as an intern or whatever. And, well, already somewhere in, let's call it high school for fun's sake, I was working together with somebody else writing weird viruses and extending existing ones. So I already wanted to know how that works, security-wise. At the same time, I come from a, well, not very wealthy family, so I was very motivated to tune my low-end devices to be able to play newer games, as I'm a gamer. That made me understand a lot more about computers and smartphones and architecture than strictly required for those games, of course. But it also gave me a lot of background later on, when I look at things with added knowledge from a security perspective, to understand what could possibly go wrong.

Jeroen Leenarts:

Nothing, right?

Jeroen Willemsen:

Yeah, then I wouldn't have a job, right?

Jeroen Leenarts:

Okay, and what was the first computer that you got your hands on?

Jeroen Willemsen:

I think it was a 286 at 40 megahertz with floppy drives, the five-and-a-quarter-inch ones, you know, the big ones, and a turbo button. Well, actually, this was a cheap one without a turbo button. And I got it because it was broken. And it was mostly broken because, when somebody opened the case to check it for repair, a cable got disconnected, and they never found out that when you closed the box, you would rip out the cable again. So you had to close the box in a certain way to keep it working. And then I had my first computer, once we found that out. Yeah, then you start learning how to type, start learning what those instructions mean, see if you can actually run something from a floppy. Interesting days.

Jeroen Leenarts:

It's a story that you hear from a lot of software developers, really: at some point in their life they got their hands on a computer and had some fun with it. Then mostly they started doing other things while they were growing up, and then they got back to computers again at the age of like 16, 17, maybe 18. And then it took off from there in various ways, sometimes through formal education, sometimes through late-night hacking, sometimes through meetup communities. That's one of the nicest things about software technology: even if you're late to the table, you can still get into the business and become a good software developer, and even a good security consultant. Because the rate of change with software is so high that what we learn today is most likely outdated in a few months, and definitely in the next few years. So you have to keep on learning. And this matters to me because a lot of my listeners are actually looking at getting into a professional iOS developer role, and there's always opportunity out there. Is there anything that you still wanted to share, Jeroen?

Jeroen Willemsen:

Not necessarily, no. Well, regarding your last remark: I believe, for security's sake, you need to be willing to put your time in to learn the basics first. You don't need to first learn how to break whatever mechanism; first understand how the mechanisms work, and study that. Then you can quite easily start building a basis. Of course, it's easy for me to say that after so many years of experience, but those first steps will be daunting. They will throw you off guard many times, because all of a sudden, with security, mathematics becomes important. I'm not saying I'm a mathematician; I sometimes suck at it, staring at the numbers for hours until I finally realize what's going on. And still, with many types of things, it's hard. But eventually, if you want to do it right, go into the depth of things and try to understand how the mechanisms really work. Then security becomes a lot easier, because otherwise you have to continuously keep up with learning every new tool and its interface. Whereas if you know the principles of how stuff works, that new tool is just a new interface towards finding the same bugs we always do.

Jeroen Leenarts:

Because that's the fun thing with software as well: it changes all the time, but the essence of it never changes. For people getting into software development or security, what are some recommendations for finding like-minded peers and resources to help take away the roadblocks that you are bound to run into when you are trying to learn completely new things?

Jeroen Willemsen:

What can help is understanding some basics through, say, Coursera courses. But that just gives you the theory. It's also good to have references, and for that, OWASP has a lot of resources to get you started, especially on mobile, of course, starting off with the Mobile Security Testing Guide. Get access to the OWASP Slack, which hopefully you can add as a reference with the invite link, and just start asking stuff in the security channel over there; the mobile security channel can help you a lot. Of course, you will first be asked to read the part of the MSTG that's relevant. But eventually, you'll find people that did the same journey you did, and they will understand what you're running into. So doing that in parallel, Coursera courses and stuff like that, together with the MSTG material, can really help a lot.

Jeroen Leenarts:

And of course, what's also very fun to do, if you've created your own little application, for instance an app you wrote to learn, or an app you're already working on professionally, is to sometimes just take that app and look at it from the attacker's perspective. Take the MSTG and just go at it for a day, and see what stuff you can break and what stuff you can find, just based on what is actually available in those guides, I think.

Jeroen Willemsen:

Definitely, definitely, it will help a lot. And if you do it with the app you work on professionally, the results can actually help not only you, but also your team, your product owner, and possibly your security team, which might not be educated to the level that you all of a sudden are through the MSTG, if you have such a team at your company.

Jeroen Leenarts:

Okay, I think that's a lot of good stuff that we can point people to from the show notes; I will make sure to link up everything. And with that, Jeroen, I want to thank you for your time. It was great talking to you again, and let's hope that maybe this year we get the opportunity to see each other again at some conference here in the Netherlands, once we're all vaccinated and everything. I hope to see you then. I enjoyed talking to you.

Jeroen Willemsen:

Thank you so much. The feeling is completely mutual. I love this podcast, loved this session, and also loved previous sessions. Let's hope for the best after vaccination.
