We think we know podcast

We think we know hacking is a tool for deeper change

Publisher
Pentest-Tools.com

If you have burning questions about penetration testing, Jayson is the person to learn from.


In the fourth episode of our We think we know podcast, we delve into the world of ethical hacking with the legendary Jayson E. Street.


As an icon in the penetration testing community, Jayson brings a unique blend of wit, wisdom, empathy, and a true understanding of the hacker mindset.


He has a unique talent for breaking down penetration testing into fundamental ideas, using memorable stories you'll want to tell others (and actually remember!).

Don’t miss this episode that packs so many practical, real-world examples you can learn from and apply in your life and work. 


Jayson E. Street bio


FOX25 Boston called Jayson "a notorious hacker", National Geographic Breakthrough Series described him as "the world class hacker", while Rolling Stone Magazine said he’s a "paunchy hacker." But he likes it if people refer to him simply as a Hacker, Helper & Human.

He's a Simulated Adversary for hire and the author of the "Dissecting the Hack" book series. He's also the DEF CON Groups Global Ambassador, speaking at DEF CON, DEF CON China, GRRCon, DerbyCon, and several other 'CONs, as well as colleges, on various information security topics. He was also a guest lecturer at the Beijing Institute of Technology for 10 years.

He loves to explore the world and networks as much as he can. He has successfully robbed banks, hotels, government facilities, and biochemical companies across five continents (he only robbed the wrong bank once, in Lebanon; all the others were the ones he was supposed to rob)!

Dive into this special episode with Jayson to learn:

  • Why every hacker needs to define and focus on their vision of changing the world [04:28]

  • How (and why) his unique, creative approach helps him tackle security issues [16:20]

  • Why automation can’t work as a defensive model for solving puzzles [25:05]

  • How to mitigate risks and explain them to clients in terms they care about [43:58]

  • Why AI won't replace pentesters, but will enhance their work [49:40]


Episode transcript:


Andra Zaharia: Only the most curious and persistent people thrive in offensive security. How do I become a better hacker? How can I build and maintain my advantage over adversaries? And what's limiting my ability to think creatively?


This podcast is for you. If you're the kind who's always digging deeper for answers, join me as I talk to some of the world's best offensive security pros about their work ethic, thinking, and real-world experiences.


This is We think we know - a podcast from Pentest-Tools.com.


Andra Zaharia: Welcome to an insightful episode of our We think we know podcast, where we delve deep into the world of ethical hacking with the legendary Jayson E. Street. As an icon in the penetration testing community, Jayson brings a unique blend of wit, wisdom, and a profound understanding of the hacker mindset.


Join us as we explore the essence of hacking, not as a mere technical pursuit, but as an art form, a craft that transcends traditional boundaries.


Andra Zaharia: Jayson, known for his one-man revolution in cybersecurity, shares his remarkable journey and the philosophy that drives his approach to offensive security.


[01:28] Andra Zaharia: We discuss the human element in hacking, how it's about seeing the world differently and using that perspective to creatively challenge and improve security systems.


From his insights on AI and its role in our future to his surprising attack vectors, Jayson's stories are not only compelling but eye-opening.


Come along as we navigate the intriguing, often misunderstood world of ethical hacking through the experience and wisdom of one of its most charismatic figures. Let's dive in.


Andra Zaharia: I can't think of a better person to talk to about this particular topic: penetration testing being a craft, and offensive security work being a craft.


When we thought about this podcast, and when we did the "I hack because I care" T-shirts, which I'm also wearing right now, I really thought of you. You were my first example of an ethical hacker who does so much more than hack.


You used this phrase, and I loved it: you're a one-man revolution. When I saw you first on stage, I was blown away. That energy, those insights that are so particular, so specific, so unique. We know your stories are just incredible, and we've learned so much from you: from your talks, from seeing you live, from seeing you on YouTube, from seeing you on WIRED, from seeing you everywhere, which we love. So, about this particular way you approach things: how would you describe this idea of craft that sits within the offensive security space?


Jayson E. Street: Well, I think one of the key things that people overlook sometimes is the very core of a hacker has nothing to do with computers. It's not based on computing. It's literally, to be a hacker is to be able to look at things and say, this is how it's supposed to be done, but I want to do something else with it.


I want to make it do this. It's like, this is what I see. No one has this, I'm going to create it.

Leonardo da Vinci? Hacker. Nikola Tesla? Hacker. Ada Lovelace? Hacker. Hedy Lamarr was horribly characterized as an actress first, when she's the reason we have Bluetooth and Wi-Fi, right? She was a hacker. She was a hacker first, before she was an actress known because she could memorize lines and looked pretty.


[04:28] Jayson E. Street: And all those hackers, the one thing they all have in common throughout the centuries and everything else is they saw things differently than everybody else did, and they wanted to make things different than everybody else said it was supposed to be.

So that's what a hacker brings when you're doing pentesting. Anybody can do an audit. Anybody can do a pentest with an automated tool or something like that. But until you get a hacker involved who's actually thinking, like, how would I try to break into this place? What are the controls? What's a way that I could do it? And also, you've got to be honest: half of the time, I know when I'm doing it, it's like, how can I make it cool?


Jayson E. Street: How can I make it something different that they weren't expecting? They're thinking things like that. It's not just about how do I get paid; it's how do I do it in style. If I'm going to go to a bank and rob them, I'm showing up in a stolen Rolls Royce that I stole three minutes before as my getaway car. That's how I'm rolling. So that's how hackers approach it. Anybody can have a tool. Anybody can have a chisel. Everybody can have a hammer, okay?


Not everybody can make a David. I can't. Give me a hammer and chisel and I can create swollen thumbs from trying to hammer stuff, but I wouldn't know about art. And that's the key thing. We try to focus and say that the tool is the answer, when it's always going to be the people wielding it and thinking of different ways to do something with it. That's what makes the tool effective.


Jayson E. Street: I use tools a lot differently than a lot of other pentesters out there because I try to make it more creative. I do stuff more on the ground in real life, dealing with real people.

So I'm like constantly having to find different ways to engage.

And also, I make it more difficult on myself, because I like it when I get caught. I like it when they get a win, because I'm there to help educate. I'm not there to try to pwn them or test them, I'm there to teach them. So I like it that way. I like being creative, and I like having that license to do that. So I will use tools like O.MG's different cables; they've got several different kinds of cables.

They've got a USB data blocker that looks like the things that are supposed to protect you from getting compromised, but it's actually the device that compromises you. That's such a hacker move: oh, this is a device they sell to protect you? I'm going to weaponize it and make them think that it's safe.


Jayson E. Street: And then you've got the Bash Bunny, or the Bash Bunny Mark II, which is an amazing device that Darren Kitchen created with Hak5. And then you've got the USB Rubber Ducky. You've got all these different devices, and those are my main toolsets, but it's how I use them that makes it creative. Think right now how many small devices you know of, like USB headphones, USB clocks, little gadgets, little display units you can connect to your computer, that charge by a USB-C cable.


Jayson E. Street: Now imagine taking the cable out of the original box, replacing it with an O.MG cable with a specific payload and a trigger mechanism that you control, then putting it back in the box, wrapping it up, and shipping it to the CEO or CIO or CFO of a company for them to plug in. They see this really cool, nice gift and think, oh yeah, I want to plug that in. Or sending a light-up gaming keyboard that requires a USB-C cable to the IT department. How many of them are going to plug that in? And how many of those people have domain admin control? So it's the same tool, but it's how it's implemented that counts. It's how it's used that counts.


[09:26] Jayson E. Street: I've created a technique I've never heard anybody talk about, because I kept hearing people talk about hacking with drones, using drones to hack. I created a whole different kind of threat model. What I did was, I created a payload. Okay, I didn't create the payload, I created the mechanism. The whole scenario is this: I take a drone. It's not an expensive drone; it's the cheapest one I can find that takes a microSD card. And I fly it at around 8:30 at night, and I deliberately crash it into the guard booth area or the front lobby door, where security will find it. They take that drone and they're like, "Uh oh, someone was doing something nefarious. Why is this drone here? What were they doing?"


Jayson E. Street: And security found it. Not IT, not IT Security. So what is security going to do?

They're going to take that drone back inside and they're going to be like, oh, it's got a microSD card. Well, it's just a drone, it isn't something dangerous. Let's plug the microSD card into the computer and check what's on there. There are two files: a Word doc or Excel file with macros enabled, of course, something nice and nefarious, but the title of the file just says "Pilot and drone registration information."

And then there's one video file on the device, named DJI-something with a long string of letters, and then .mov. Well, you click on that file, and, because my friend Titan Philippe created the payload for me, when you click on that .mov file, it plays a YouTube video of a Rick Roll, and when you close that, there's a pop-up box saying you've been pwned, because you thought it was a .mov file. It was the only one, so you assume it's the video of the person flying the drone there. And that's a whole payload and a whole attack vector that relies 100% on just one important thing that cannot be defended against: human curiosity.


Andra Zaharia: And perception. Expectations, rather. Expectations that we…


Jayson E. Street: Expectations, yes, again, it's a drone.


Andra Zaharia: Yes. Expectations that we have of technology, expectations that we have of everything around us. And again, all of these things are so commonplace. Curiosity is one of those tools that hackers are so good at using, let's call it, on other people, not against them, but to explore and dig deeper and find all sorts of things in these unsuspecting ways. And criminals use it too.


Jayson E. Street: Yeah, criminals use it for that reason. And I tell people that is one of the biggest things. There's a quote by Arthur C. Clarke that states… something… Oh, man!


Andra Zaharia: Take your time.


Jayson E. Street: Hold on. I'm going to open up my phone. I'm literally using it in a presentation I'm working on for next year, where I talk about AI and advanced technologies and stuff. The Arthur C. Clarke quote says: any sufficiently advanced technology is indistinguishable from magic. Well, I read that and I decided to change it to fit a more realistic approach for us as pentesters and people, and also for the cybercriminals that are out there doing it.


And my version of the quote is: "Any sufficiently common technology is indistinguishable from an attack tool." Everybody is so worried about AI and advanced attacks and all these advanced tools and technologies that are coming out, and I'm like, I can own you with your light bulb. Do you know how many IoT devices are out on people's networks? There was a university in the US that was literally taken offline because of its vending machines and light bulbs. For some reason, they started doing random hits to a website, a sushi restaurant, like they were getting commands or talking back, and it DDoSed the whole network.


Jayson E. Street: They're light bulbs and vending machines. And you're worried about blockchain, kill chain, Bitchain, whatever, and your zero trust and machine learning models and AI. All those things in the future that seem magical, that you don't really have to deal with because they're futuristic. You need to start looking more realistically at the things that are commonplace, that are actually going to do you harm.


Andra Zaharia: It's a bit like the boiling frog situation, isn't it? All of these things that have become so pervasive that we don't find them novel or interesting in any way, that are just part of our day-to-day, keep decaying and eating away, creating increasingly more risk and expanding our attack surface, until the boiling frog realizes the water is too hot, when it's too late, obviously. So how can pentesters help their companies, their customers, their colleagues stay grounded? Have this reality check that pulls them back into the reality of, let's call it, ordinary work that's so necessary, simply because those are the attack vectors that, just like you mentioned, can take an entire university offline, or worse?


[16:20] Jayson E. Street: Yeah, I think most companies should have a program in place where they allow at least a half day every week for their pentesters to go crazy. Just say, hey, you've got these 4 hours: do some weird research, go look at a weird zero-day or bug, go try some bug bounty, or just try to use this tool differently, or come up with a different kind of attack vector. And maybe have contests among the pentesters: what's the most creative use of this tool? Here's your tool, you've got a month to figure out the most creative attack that is successful. Then at the end of the month, they get judged, and the winner gets a gift card to go get dinner or something.


Jayson E. Street: Because once again, it's like we want AI. Actually, I'm talking about this as well. I get so frustrated, because we are spending so much money, and you don't realize how much money is being spent, on training AI. We are literally underpaying people in Kenya and other countries, in horrible work conditions, to train AI, instead of spending a fraction of that money to train humans to be better at defending technologies against attacks. We spend all this money on training the AI, but no one wants to train the people. It's just so ridiculous to me.


Jayson E. Street: You're never going to get a situation where AI... okay, I can't say never, but not in my lifetime, because I'm old, so I can say that with safety. Never in my lifetime are you going to get AI to be as creative, or to respond to sudden and unexpected situations, the way a human would, given the same tools.


So if you go onto a pentest and you're just using a tool, well, guess what? That tool is not going to give you everything that you need. You're going to have to be creative about it. You're going to have to ascertain: what else can I find? Why is it reacting this way? It's always going to be the person behind it. I just can't stress that enough: we have to understand this.


Jayson E. Street: It's like when you're building a house, it's pretty easy to lay bricks, and you're using the same materials. And all these wonderful castles, all these unusual houses, they were all built with the same tools. They may have hired the same crews to build them, because those guys were just building something; they were given instructions to do it. But the architects were probably totally different. One had a vision to do something with these materials, and someone else had a better, more creative vision. Using the same tools, the same methods, and the same people, they created two different pieces: one a standard, nice little house and shed, and the other the Taj Mahal. Same materials, same workers, different results.


Andra Zaharia: And this is kind of the… Well, this is the spirit of this particular space. This is the spirit of ethical hacking. Well, of hacking in general, because I think hacking is ethical.


Jayson E. Street: Yeah. I never go to my dentist and ask if he's an ethical dentist, or my doctor if he's an ethical doctor, or an ethical banker. People ask me if I'm a black hat or white hat hacker. Well, first, no, my head's too big. Have you seen the size of it? I can't wear hats. Second of all, I'm not going to identify myself by a racist stereotype that was started in the 50s during western cowboy movies. Sorry, not me. And I don't go to my banker and say, are you a black hat banker or a white hat banker? Because I heard about some bankers getting arrested for embezzlement and money laundering.


Jayson E. Street: So I need to know! No, we don't do that. You're a hacker or a criminal, and if you're a criminal, you deserve to go to jail. And when I got mugged, and I've been mugged at gunpoint before, I never once thought that guy was a gunsmith or an NRA member or a Second Amendment rights activist, you know? Or a gun nut. No, he was a criminal utilizing a tool to commit a crime. He didn't effing know how to make that tool.


He probably wouldn't know how to clean it properly or take care of it. It was just a tool that he committed a crime with. Computer crime is the same way. Half of those idiots... I'm sorry, that's unfair. 98% of those idiots that commit the crime don't even understand the tool they're using. They just watched a YouTube video, had someone walk them through it, and then started committing the crime with it.


Then there's that 2%: criminals who also have that hacking ability, and they deserve to go to jail as well. But the majority of hackers are the ones trying to fix things. They're the ones exploring. Computers are the new way to explore, to change things and do things with them. That's the reason hacking is so equated with computers right now: that has been the new medium.


You saw it in California in the 1970s and 80s: all the lowriders, all the car hacking that was going on. They were using hydraulics and suspensions and the wiring and stuff. That's car hacking. There wasn't a computer in those systems, but you can't tell me that wasn't car hacking. And they turned those cars into works of art. Because that's what hacking is. And that's what we do with computers now.


Andra Zaharia: And this is precisely why cultivating and supporting this uniqueness is so important. Because it's about rebelling against standardization, against uniformity, and all of the decay in quality that they bring. And when I see, for instance, attempts to automate away and replace penetration testers, replace people in offensive security, that's so unrealistic. It's not just unrealistic, it's downright disrespectful of people's work, of their unique traits and abilities, and the unique passion and curiosity that they bring to the space, which inherently bring innovation.


Because if we built companies the same way, and a lot of companies have tried using formulas and failed, we would just get more of the same thing. What we need is actually a different way of doing things. There's that quote misattributed to Einstein, that stupidity is doing the same thing over and over again and expecting a different result.


[25:05] Jayson E. Street: Oh, yeah, exactly. I totally agree. And one of the things when it comes to automated testing, I will tell you this. If I am automating the testing of wind turbines or engines, no problem. You know why? Because with those technologies, you have controlled circumstances for how they're going to react and how things are going to happen. Nine times out of ten, those can be automated. When you're trying to automate the testing of a defense, you're saying that you can predict and understand the conditions of every single human attacker that will be coming at you, and that is impossible. You cannot automate something in an environment that can never be guaranteed, duplicated, or static. So automated testing can never work as a defensive model.


For a system to verify you: I'm going to go through these standard checks. That's where you get port scanning from. I'm going to make sure all your ports are closed. Okay, you checked, you passed. It's like trying to go for PCI compliance. And I tell people, I think PCI compliance should not be called PCI compliance. I call it Schrödinger's compliance. And the reason is that if you look at every breach in the last five years, every company that was popped, you will see a blurb in the news article saying they were PCI compliant until they weren't. Which is it? Are they PCI compliant or not? We never know. And it's the same thing with a scripted attack tool that just goes over a routine and says, oh, we did these things, so therefore you're secure. And then a human comes by who doesn't have the benefit of those toolsets, who decides to use a web browser in a weird way that owns your company. And then that's a problem.


Andra Zaharia: It is. It really is. And just like you mentioned around IoT devices, this conversation has been around for so many years because it's a necessary conversation that we still need to have. Because offensive security is still a new field, because it's still stepping into the limelight, because people still don't know how to differentiate it from other types of activities. And because, beyond our echo chamber, people are interested in learning new things; they're interested in expanding their abilities.

Even business owners, who are sometimes seen in a less than friendly way because they don't understand. But it's our job to explain.


It's our job to make this kind of very…


Andra Zaharia: To build these examples that really show them, just like you mentioned, that technology is now a tool for change. And we see it at a societal level, we see it across, well, every aspect of our lives, literally. And it's our job to explain that this offensive security is a driver for positive change. When you look at the big picture and each individual contributes to that change through the work that they do and especially through how they present it in the end.


Because if you do it in your corner, that's great. But if you talk about it, the impact is going to be much bigger. Which brings me to scalability, which is what every company out there wants. They want scalable models and things they can do 100x, 1000x, anything. How does that work for penetration testing? Should penetration testing be scalable? Does it bring any value to make it infinitely scalable?


Jayson E. Street: I think a couple of things. One, I think it's already scalable, to a degree. Because once again, it comes back to the brick and the bricklayer. It's not the materials, it's how they're being used. A lot of criminal organizations that I know of, that I've seen reports on, literally have a warehouse. This was a while back, so I can talk about it without, hopefully, dying.


But it was in a very offensive country. And they literally had a warehouse with cubicles, like little office cubicles, of low-level IT people who had a smidgen of IT knowledge. They did the low-level scans, they did all this, and they would find a live target: oh, I was able to get in with this, or something like that. They would then take that to another person, a mid-level engineer criminal, who would see if they could plant a trojan, put some stuff in there. And if that worked, they would hand it off to their elite guys, who would actually go in and execute most of the tools.


So criminals are always going to be able to scale. Why shouldn't we be able to do that? The automation is not the problem. Saying that you don't need humans to control it and operate it: that's the problem. I will totally agree that we can use automated tools and don't need people, 100%, I will totally agree with that statement, as soon as you show me we have found a way to secure every single light bulb, printer, keyboard, mouse, coffee maker, refrigerator, and oven. I'm trying to think of all the devices I know of personally that have been compromised and used to take over a network. When you secure all those, you let me know, and I'll sign off on that whole statement, okay?


Jayson E. Street: But until then, I don't think we're ready yet. When you can still get popped by your toaster oven, you have a problem. We talk about wanting to build all these pie-in-the-sky things, and that's the problem. We don't want to think about it. It's just human nature: we don't like thinking about the real-world stuff that's difficult. Having people go in and use the tools manually is tedious. It takes time. And I tell people, you want to chase all this zero-day and all this high technology because you want to make the house pretty. But the foundation of the house is what makes it secure. And the foundation, let's be honest, sucks. It's ugly. It's got rebar. It's concrete. When you compliment someone on their new house, you never say, oh, can I go to the basement and see the foundation? It must be beautiful. No one says, oh, you should see our foundation, we got the rebar all laid straight. No, it's a mess. You don't want to see that. But that's what makes it possible for the house to stand.


So if you want a network to be secure, your foundation is not concrete. It's asset management. It's patch management. It's difficult. Those suck. Have you ever tried to do asset management and see what's on your network? Have you ever tried network access control? That's why everybody talks about it and hardly anybody does it: because it's difficult. So it's the same thing. They want all these other answers because the truth is staring them in the face. You can look for those other answers, the automated tools and all the things that are going to save you, when in fact, the truth is that people, and the way they operate these tools and approach these attacks, are what's going to help.


Andra Zaharia: Your perspectives have always been gold, because you know how to capture and express the essence of these things. This is why I believe you have such a great impact. And, of course, it's not just me who believes it. It's the millions of people who listen to your talks, who resonate with what you think, who talk about the topics that you bring up.


And I was wondering what were, let's say, the elements or the experiences that led you to develop your mindset and the way you approach things. Because there's a level of creativity there that's insane, something so fascinating and bewildering. How did that happen? How did you end up refining your ideas, which we see shine every year, every hacking season, in new presentations and new topics that you bring on? How did that happen? What influenced that?


[35:39] Jayson E. Street: It's one of those funny things. To be honest, it's a weird question, one of those that throws people for a loop, because a lot of people ask me questions and don't realize what they really asked until the answer comes. I think the basis of it is childhood trauma and being on the spectrum. I just never really understood normal people. Humans terrify me. The way neurotypical people operate has always been foreign to me, since I was young, even before I was diagnosed, because I was born in the 60s. I'm old. They didn't know about autism back then. You were just hyperactive, and you were a bad child, and you were doing things to be wrong, and you were weird. And back then it was beaten out of you, including in school. So you develop some issues. And so I learned to study.

I mean, I studied humans more intensely than David Attenborough and Steve Irwin combined, okay? I studied humans way more intensely than they studied animals, because I needed to understand what made them tick. Why were they acting that way? Why would they respond one way or a different way? What were they expecting me to look like or act like to pass off as normal? So most of my social engineering, most of the ways I break into places, people find so unique, and I'm like, no, it's human nature. I've always looked at things from that perspective, looking at the bigger picture. It's literally just the way my brain works. I see things so differently.


Jayson E. Street: And it's one of the things I love when I go to different countries and people say, yeah, but you would have a difficult time trying to break into us in this country. And I'm like, no, I wouldn't. You're acting like being foreign makes you different. The biggest myth society sells us is that the foreigner is someone different. I've been to Beijing, I've been to Sao Paulo, I've been to Bucharest, I've been to Bangkok, I've been to Baltimore, okay? And I will tell you, compared to my own culture and ethnicity, every single one of those was different, including Baltimore. It's like, oh, they're American. No, they may be American, but Baltimore is a whole different thing.


One of the biggest things, and I don't mean to go on this tangent, and I don't mean to sound woke or anything, but one of the biggest things that really gets me is when we try to talk about people... I'm good enough to see the tangent starting, and I'm not going to do that. I'm not going to focus on that. But to answer your question: I'm weird and I see things differently. The way I approach things has always been different, since I was young, before there were names for it, before there were labels, before there was science. I have always looked at things in a different way. And because of that, I've always been able to express those things and relate, because my life depended on being able to communicate in ways that normal people understood, trying to relate to them in a way that was foreign to me. I had to find a way to make it relatable, so they would understand where I was coming from or what I was trying to think. And so I just became very good at translating and communicating those things.


Andra Zaharia: And that takes a lot of effort, especially as a child, especially when we're still developing, figuring out what the world around us looks like and how it works. And I feel like as adults, especially in this field, or as developing adults in this field, because we're ever evolving, it's kind of the same thing. We're constantly trying to figure out how things around us work and how we're reacting to them, how they're changing us, how we're changing them, and how we're teaching, and how we're changing each other in the process.


And this is something that I find really special about the space, something that I find really special about hackers in their inherent attraction towards principles and their idealism, their will to improve things, and to go through the painstaking process of actually doing that. It's not easy to go and tell someone like, hey, there are some issues that we need to talk about. It's not easy to be the person that breaks the bad news. It never is, right? It never really is.


Jayson E. Street: Exactly. Once again, going back: automated tools can give you numbers, and the tools can give you data. It is always going to depend on a human to interpret that data and then convey that message in a constructive way. Management does not understand security, and that's not because management is stupid. I hate the whole myth of the pointy-haired boss. They're intelligent, they know exactly what they're doing. They're making millions of dollars. They know their job. And part of their job is to hire people who know their job and explain things in a way that they can understand, so they can make business decisions. That's their job. Your job is not to communicate in your technical jargon and then say that they're stupid because they don't understand it. Your job is to communicate in a way that they understand. That's why they hired you. A tool just spits out numbers and data. It is up to you, the technician, the engineer, the security professional, to take that data and put it in a way that shows the most impact for management, to make them understand the risk so that they will invest in getting that risk mitigated.


Or they know fully how to accept that risk, and they know what the risk is that they have to accept. Our job in security has never been to eliminate risk. That's not our job. People get all depressed because they think that's our job, and they say, oh, we're never going to be secure. No, you're not. Your job in security is to mitigate as much risk as you can. Then you show management how much risk they can offset. And then you say, this much risk you're going to accept. If you spend this much more money, we can narrow that risk to this.


[43:58] Jayson E. Street: But if you've got an Internet connection, you're accepting risk. Risk is involved. It will always be involved. There will always be a risk of a compromise or a breach. If you're not communicating with a device, then maybe not. But if you're doing anything that requires electrons going over some kind of distance, you're opening yourself to risk. And I know that for a fact, because I know people who have been able to intercept air-gapped machines using lasers on windows. So don't tell me otherwise. You're accepting risk. So that's the whole point. Our job is to mitigate as much as we can, offset what we can, and then explain to management how they can limit the risk that has to be accepted. That's it. And we've got to do that in a way that's effective for management. A tool is just going to give you an output, and management is not going to know what to make of that, and they're not going to take it seriously.



Andra Zaharia: In a way, hackers add meaning. Well, not just in a way. In many ways, they add meaning. They add meaning to data, they add meaning to consequences, they add meaning to risk. And they make that part of life instead of something that is far removed from us. It's not that business doesn't care about it, it's that they haven't found a connection to it. We haven't offered them that connection yet.


Jayson E. Street: And also, every hacker in existence has never, ever created a vulnerability in someone else's program, ever. What they've done is they discovered the vulnerability that was already there, and they're just letting everybody know. Now, they could have created vulnerabilities if they coded their own tools and they realized, oh, I screwed up on this programming, but they've never created the vulnerabilities. It's like I've never created the unlocked door or circumvented the lock. That vulnerability was always there. I just made you aware of it so you can fix it. Criminals are not going to make you aware of it. You don't pay pentesters to attack you. That's not what you're paying them for. You're not paying a pentester to perform a pentest and go after your networks. That's not what you're paying for. You're getting that for free right now, this very second, from all over the world, people are giving you that pentest.


You're paying the pentester for the report, for the findings, and for the remediations and the suggestions and how to show the impact so management can make those business decisions to limit that risk. That's what they're paying for. So don't talk to me about the different tools that you use or all the things like, it's like, no, they're not paying you for all that. They're paying for the human that can put that all in words and ways to communicate it and stuff and give it to management so management can make effective changes and effective decisions on their security posture.


Andra Zaharia: I love that. That's not just relatable; I feel like a lot of people need these kinds of words to be able to express the thoughts in their minds in a way that makes them more persuasive and gives them a voice. It takes a while to develop that, and that's perfectly fine. We're not all natural-born communicators, and we should be helping each other find those words, find that vocabulary, find those real techniques that are rooted in understanding human nature.


So thank you for offering that. I love that you went through all of these examples showing that oversimplification is never the answer. We live in a complex society. We deal with complex things. Human nature in itself is really complex. So it's never going to be easy to do these things, and that's okay. Accepting that level of complexity is step one in being a hacker. Things are going to be difficult, but we thrive on difficult challenges, because that's the fuel, that's the passion. That's what actually drives hackers to keep going and keep wanting to do what they do. And that assembly line idea, I thought, was such a good illustration of how things work up to a point, up until the assembly line doesn't work anymore. You cannot innovate with an assembly line. It was an innovation when it first appeared, but we're way past that at this point.


Jayson E. Street: Oh, yeah.


Andra Zaharia: Given that you've seen so many things and doing physical infiltration gives you such a rich range of experiences and ways of interacting with people and technology, what's something that you're excited about right now? What's something that you're looking into that gets you like, "This is something that I really want to spend more time on"?


[49:40] Jayson E. Street: This is funny, considering it's the topic of my talk for next year. But using AI in a more organizational role: an AI that can actually go and say, "Hey, did you remember to take your vitamins?" "Hey, you're going to do this thing, this is what you should be doing." Having an AI that can do that, that's mobile and just around, like an assistant. Not just a digital thing, but something you can tell, "Can you set this up for me?" and then have it do the programming tasks. People talk about how AI is going to replace humans in job functions, how it's going to take people's jobs. And my answer is: yes, exactly. Because in 1902, the electric street lamp wiped out a whole industry of gas lighters. That's when gas lighters were actually good, because they were actually lighting the gas lamps.


Okay, so you don't want to be gaslit today, but back then it was okay. It wiped out the whole industry. That's why we shouldn't have gas lighting anymore: it's a dead industry. And that was because of electricity. The horseless carriage destroyed industries, livery stables and stagecoaches and things. But for all the jobs it took, it just replaced people who couldn't adapt to the new technology. Every person who lost their job to that could have joined the electrical revolution and worked in engineering, setting up the light posts, working on the light bulbs, doing the wiring.


Jayson E. Street: Every person who lost a job to the horseless carriage could have gotten a horseless carriage and become a delivery driver or a bus driver. Where we had stagecoaches, we got buses, and then airplanes. We look at all this technology, and we look at AI like it's something different. It's not; it's just like the light bulb. It's a new technology, and if you can't adapt, it will definitely replace you. And if your job was replaceable anyway, then that's on you. You have to adapt to use that technology. The technology is never using us; it's us using it as a tool. And if you don't know how to effectively use that tool, well then, yeah, that's why your job was replaced: you couldn't learn how to make that tool effective.


That's what that technology does. And a friend of mine created a sticker because of a quote of mine, and I said I wanted the sticker. It says: stop complaining about AI replacing your job when you could already be replaced by a bash script. Okay, seriously, that's what we're complaining about. Because current AI is only replacing redundant tasks, stuff that could be done anyway. When people say, well, they're programming their own hacking tools with it, all these other new tools. Yeah, poorly written ones, because humans taught them the kind of basic coding that can be easily intercepted and detected. Once again, it's a tool. It's never going to be the tool that is the end-all or the solution.

It is always how you use and operate that tool. That is the solution.


But we keep looking at AI as something different when in actuality it's just another tool, it's just another light bulb, a new kind of light bulb or a new kind of technology. It's like it's not something apart, it's just more technology that we have to learn with.


There were flyers, it's like back in the 1900s, about the dangers and the sickness you could get from having street lights, electric street lights. And we thought the 5G, Covid stuff was crazy. Okay, yeah, that was crazy. But still, it's like we've always been afraid of change.


Andra Zaharia: Yeah.


Jayson E. Street: And things that we don't understand, we fear. And what we fear, we try to destroy. It's just human nature. That's the reason why hackers get so much grief is because it's so unknown.


I did a talk at DEF CON 22 about that, where it's like, we all want to be that special snowflake in the special snowflake blizzard, and have our hoodies and be dark and mysterious. Well, that's great, but the more mysterious you are, the more people are going to be afraid of you. That is the reason why, when you hack and you do something wrong and you screw up as a kid and you break into something that you shouldn't have, you're going to get ten years in prison. But if you break into a person's house and trash it, you'll get a month of community service and a fine.


Because we understand the physical breaking in; the computer break-in is still mysterious, and we don't understand it, so we have to punish it. There are people spending more time in prison right now for computer crime than for murder and rape. And that boggles my mind, because it's just that it's not as understood, and unfortunately, that's human nature. We're way more familiar with murders and such, but not with the computer stuff. So we're more afraid of it, so it gets the harsher penalties. But at the end of the day, it's technology.


Technology is always going to be a tool. And if you make it something else or you try to put it somewhere different, that's you failing, not the technology. It's like, always remember that it's just a tool. And tools can be put away. Tools can be used to create things or you could stab someone to death with a screwdriver. It's still a tool.


Andra Zaharia: It still is. And this was not just incredibly educational, but also clarifying. You provided so many strings to pull at for everyone listening, depending on what they resonate with. There's something in there to chase, something to sit with as you're going through your day, as you're thinking about your career, as you're thinking about where your life is going and what kind of meaning you get from your work, and just how you contribute to the community, to society, to your life and other people's lives.


Thank you so much for this and for everything that you do in general, and for helping articulate these really important ideas, and for helping bring clarity.

And for just pushing this industry, this space, this community into the limelight and helping take away some of that mystery, like taking away some of the darkness and bringing a lot of light in.


Thank you so much, Jayson. This has been amazing.


Jayson E. Street: Well, thank you. I appreciate it. Thanks for having me on and letting me ramble again.


[58:10] Andra Zaharia: Ever wondered how deep the rabbit hole goes in the world of ethical hacking? Well, we're still falling, and we're dragging you along with us, one question at a time. Thanks for wandering through this maze with us as we tackled the nitty-gritty, flipped misconceptions on their heads, and maybe, just maybe, made you rethink some of the things that are important to you.

This has been the We think we know podcast by Pentest-Tools.com and before I sign off, keep this in mind. There's always a back door, or at the very least, a sneaky side entrance.

See you next time.
