Stuart King

CTO & Chief Architect at AnzenOT

 

Show Notes:

In this captivating podcast episode, host Kristin and her guest and fiancé, Stuart King, explore their rich experiences in cybersecurity, specifically focusing on the food industry.  They discuss the transformative impact of AI and machine learning on security practices within this sector.  The episode begins with personal stories that connect their professional paths to memorable food experiences, illustrating how these moments deeply intertwine with their work.

Stuart recounts his extensive career, starting with his initial experiences in the Royal Air Force and moving into significant roles in cybersecurity, particularly emphasizing his contributions to safeguarding food production systems.  The conversation then shifts to their joint venture, "AnzenOT," a tool they co-developed to simplify and democratize risk assessments for small to mid-size enterprises in the food industry.  They discuss how AI enhances this process through improved analytics to give context to risk, which is essential for food manufacturers aiming to fortify their defenses against cyber threats.

The duo highlights the crucial need to understand the human side of cybersecurity in food manufacturing, advocating for solutions that address the daily challenges employees face on production floors.  They discuss how "AnzenOT" uses AI to simulate realistic threat scenarios and offer practical, actionable insights, helping companies navigate the complex landscape of food safety and cybersecurity.

Throughout the episode, Kristin's cat, Kai, also makes a charming appearance, adding a touch of warmth as she quietly joins in, much to the amusement of both hosts.

As the episode wraps up, Kristin and Stuart invite listeners to join them at upcoming industry conferences, including RSA, where they will discuss cybersecurity and operational technology in the food industry.  They encourage a broader dialogue about enhancing security measures to protect the vital processes that feed nations and maintain public health.

 

Episode Key Highlights:

(02:52 - 03:53) Unforgettable Food Adventures in Japan

(06:13 - 08:23) Career Evolution and Shared Experiences

(13:44 - 15:56) Critical Infrastructure Vulnerabilities in Production

(17:56 - 20:09) Importance of Understanding Environment in Cybersecurity

(24:04 - 26:00) Risk Management and AI in Industry

(29:27 - 31:18) AnzenOT Tool Benefits and Features

(34:08 - 35:40) Value for Money and Risk Assessment

(37:45 - 38:41) Network Security Vulnerabilities From Unknown Applications

 

How to connect with Stuart King: 

LinkedIn and Email

Our talk on the S4 Main Stage, “Factories Are Families: How Does Security Join The Family?”

Panel at IC3 Games 2023 on protecting critical infrastructure.

 

If you want to learn more about AnzenOT, please check the website or reach out on LinkedIn.

Also, if you would like to schedule a demo of AnzenOT or have a chat, please use this link to schedule a meeting.

 

Additional Show notes and information are on the Bites and Bytes Website.

If you would like to schedule a meeting with Kristin to discuss the Bites and Bytes Podcast, please use this link.

Audience Survey can be found here.

Bites and Bytes Podcast Merch Shop!! (click here)

 


Listen to the full episode:


Episode Guide:

(00:00) - Favorite Food Memories and Career Highlights

(06:25) - Learning on the Job in OT

(17:06) - Security and Environment Challenges in Industry

(22:57) - AI and Risk Assessment in Industry

(27:55) - Resilient Business Planning Tool Benefits

(32:35) - Navigating Risk and Security Challenges

  • Kristin [00:00:00]:

    To start off, before I introduce Stuart officially, let's go through the best questions that I always ask on any of my shows. What is your favorite food and your favorite food memory?

    Stuart King [00:00:10]:

    Favorite food. Absolute easy one. I love Indian food. I love hot, spicy Indian curries. It's been my favourite food for years and I'm totally addicted to it.

    Kristin [00:00:21]:

    Yeah, you are, that's for sure. Favorite food memory?

    Stuart King [00:00:24]:

    Well, I mean, so many to choose from. I mean, some of them have been with us together. I think one of the ones that definitely comes to mind, I think, is from our travels together on our travels over in the Far East. I think some of my personal favorite food memories have been there. One of them actually relates to one of the jobs that we were doing together at a. I think it was at a factory in Malaysia where we went to get lunch and found that lunch had actually been brought in for us. It was all home cooked food from people in the factory.

    Kristin [00:01:03]:

    Was that when it was also a holiday for them as well? Like this one you were thinking of?

    Stuart King [00:01:08]:

    It may have been that one when.

    Kristin [00:01:09]:

    They were going through all the different food and telling us what it was and what it was made out of. Oh, that was brilliant because I was so worried I was going to have banana. I'm allergic to banana. You don't know this, listeners, but, yeah, it's a struggle sometimes, especially when you're traveling.

    Stuart King [00:01:23]:

    Yeah. I think both of us felt very privileged that people had gone to that trouble for us, and we weren't there to make anyone's lives easier. We were there doing a factory security audit, but these people had sort of brought in their homemade food and laid it all out for us, and we were the special guests.

    Kristin [00:01:44]:

    I actually, I remember kind of fighting back tears a little bit because it was that impactful that they were so excited to share this food with us. And I don't remember not having. I remember that food being so amazing and delicious and. And it was just amazing that they did that all week for us, too. In fact, I think we had, like, cookies and stuff that were brought in the next day and people were so excited about what we liked. They brought back in that food that we liked more. I don't think I've ever had such loving food. It was so amazing.

    Kristin [00:02:13]:

    That was. I will never forget that, too. I'm glad that you brought that up. I was expecting you to say something like a restaurant we went to, but no, you brought up, like, this sentimental factory moment, which is brilliant.

    Stuart King [00:02:23]:

    Well, yeah, I mean, there were lots. I mean, we had so much great food on our travels around Japan and Singapore. Some of the places that we went to sort of come to mind.

    Kristin [00:02:37]:

    I mean, and some of these weren't even planned. We just sort of, like, ended up and wherever we were eating whatever it was at the time, mainly out of desperation or hangriness. But some of them were planned, obviously, and then others were just on the whim. We won't ever forget the ice cream coming out of the atomic bomb museum in Nagasaki with that woman who was just so smiley and made a little flower on my ice cream as I'm weeping at the cart because the museum was so intense.

    Stuart King [00:03:05]:

    Yes, yes. Have a tour around images of genocide and now. Now have an ice cream.

    Kristin [00:03:11]:

    I mean, I felt like it was kind of the only way I was going to get through that. And also, you know, couldn't find a place that was open because there was some type of holiday. And we ended up at this, like, amazing steak place in. I think it was in Nagoya.

    Stuart King [00:03:23]:

    No. Was it Nagoya or Nagasaki?

    Kristin [00:03:27]:

    I think it was Nagasaki. I don't know. I don't remember, actually. Yes, it was Nagasaki. Because we had to walk through the Chinatown. I will never forget that because that steak was amazing and all the different things. I think it was, like, supposed to be like a hibachi grill for parties and stuff, but we didn't get that experience. Cause we were just two people.

    Stuart King [00:03:43]:

    That's the trip we turned up at the factory, and everyone had conveniently forgotten that you were going to be.

    Kristin [00:03:49]:

    Yeah. And also that I was a woman, which was quite shocking to them as well, since my name is Kristin. A lot of people think it's Christian and that I am a man. I am not a man. And that ends up causing quite a bit of drama, obviously. Drama just seems to follow me around whenever I'm in these places. It's just not my fault.

    Kristin [00:04:08]:

    I just exist, and there is drama. And Aries. Stuart, thanks for the memories. I'm sure we'll have more of them as we talk. Go ahead and introduce yourself to everybody that's listening who doesn't know you by now.

    Stuart King [00:04:19]:

    I'm Stuart. Stuart King. I'm a cybersecurity and OT security professional. I've been in the industry for, oh, many years, going back 25, maybe nearly 30 years now.

    Kristin [00:04:33]:

    I think it's like 30, but I'm not counting for you.

    Stuart King [00:04:36]:

    Yeah. Started off in the. In the Royal Air Force, where I sort of learned a lot about sort of software development, programming, and networking. Coming out of the air force as a. As a programmer, eventually sort of found my interests sort of veering more towards the security, the application security side of things. As sort of the Internet became the dominating environment for development work. I found a skill in being able to do some vulnerability and pen testing on web applications.

    Kristin [00:05:10]:

    I forgot that you pen tested earlier in your career.

    Stuart King [00:05:12]:

    Joined a big organization, the Reed Elsevier group, or RELX as they are now known. In fact, when I first applied for that job, they turned me down. Did you? Yeah, I found that, yeah. And because it was a job I really, really wanted at the time and I sent an email to the CISO at the time, a gentleman named Leo Cronin. He won't mind me name checking him on it, but sort of pleading my case. And I thought at the time I had really nothing to lose by saying, look, I know I can do this job. Give me the opportunity to sort of demonstrate that I can. And I made them an offer.

    Stuart King [00:05:56]:

    I said give me a three month contract, set me objectives for me to demonstrate that I have.

    Kristin [00:06:05]:

    You have the ability to do this job and also you had the moxie or the gumption to even say I'm qualified, give me a shot.

    Stuart King [00:06:14]:

    Yep. And he took me up on it and I worked for them for the next sort of twelve and a half years. Yeah.

    Kristin [00:06:22]:

    The majority of your career.

    Stuart King [00:06:24]:

    Yeah, I spent a few years then sort of working, working in that role as a security consultant, then became head of IT security for the Reed Elsevier exhibitions group, Reed Exhibitions, and then after that got a transfer to a role over here in the US. I'd been working in London and was invited to take up a position here in the US as the head of security assurance and I spent the next six years doing that.

    Kristin [00:06:52]:

    I suppose for the listeners, he actually means he worked in London. Like downtown, not like "London" as in you don't know where anything is in the UK. He actually worked in London.

    Stuart King [00:07:01]:

    Yeah, it was a great location. Reed Elsevier's offices actually overlooked Trafalgar Square. So it was a brilliant location to go into. Back in the days when we went into an office every day of the week. A lot of people still do that. But then out of that role I sort of subsequently found my way working in more sort of OT focused roles. Ended up working for the same company that you were working for.

    Kristin [00:07:30]:

    Yeah, I suppose we should probably mention that we did work together. For those listeners who don't know our whole story. I was commuting between Boston and Atlanta when I was working for a bakery ingredients company and I found him sitting in my original cubicle that I started at about three years before. And I thought I better introduce myself since he is the new director of security. I was in Andrew's support and executive support at that time. And I leaned over and I saw him and, yeah, my world, obviously altered from that moment on, has never been the same since. In a good way. Yeah, I got to show Stuart kind of the ropes of what it meant to work in OT, and especially from a user side, got to show them around some of the factories in the US and the UK.

    Kristin [00:08:17]:

    It was quite an amazing experience because I remember walking into, one of my favorite memories, actually, of that time in the bakeries company was walking into one of the factories in California. And I had spoken to them ahead of time before I got there because I was very close with that factory, Colton, and I told him to go easy on you and not to haze you too badly because as everyone knows, working inside of factories, it's sort of like a lot of aunties and uncles coming in and, yeah, we were walking the factory floor and we were doing these extended security assessments, and they're like. And they kept looking at each other and they kept giggling. Some of the production people that we were with, and I was, I just kept saying, don't do it. No, don't do it. And one of the things they like to do is walk people over to the ovens because they wanted to see if people would, you know, get squirmy or, you know, be like, oh, it's so hot. Why am I over here? Or they just wanted to basically harass you. But no, you instead were like, woo hoo, it's an oven.

    Kristin [00:09:11]:

    And you like, ran up the stairs and they were like, wait, hold on, it's hot there, don't touch that. And you were like, yay. It was quite funny. And I think you earned their respect right then because you had no fear to understand every role and everything that was in that factory, what was going on. And you definitely won yourself some allies that day.

    Stuart King [00:09:29]:

    Definitely learned a lot about, you know, the whole sort of food manufacturing process through working for that organization. And one of the things, you know, particularly that was always a big surprise to me was how much sophisticated automation there was around all of those factories. In fact, a lot more automation than what you and I subsequently saw in some of the electronics factories that we went into.

    Kristin [00:09:52]:

    Yeah, definitely. And I think that's what separates a lot of that out is because the automation in the food factories isn't necessarily for the production. Of course. It always tries to be, but it's about employee safety more often than it is about the production line. You would rather have people not having their limbs severed off by a cutting machine in a poultry plant, but if you can have a robot do it, that's going to be less likely. And also, robots don't have feelings. Sorry to break it to everybody.

    Stuart King [00:10:19]:

    Yeah. But we also sort of learned a lot about how interconnected everything is and how something that has an impact on one system can really have impacts right across all the other sort of parts of the production process as well, around even sort of back the different parts of the supply chain, too. So it was certainly a great sort of learning environment.

    Kristin [00:10:46]:

    Yeah, I mean, I definitely think that learning on the job is the best way to go. I mean, you can learn only so much inside of a classroom. If that's your preferred learning route, that's great. But I think that the experience and having to understand what's important and what's vital and what's not and what's hidden and what's not known, but still really very much important to the whole system and the ecosystem inside of a factory was probably some of the best learning experience I've ever had in my life, in my jobs. Wasn't my favorite job, but it definitely was right up there.

    Stuart King [00:11:18]:

    I will never forget the one. Where was the. Where was the plant? It was somewhere in South Carolina, Spartanburg. And the experience that we had in that particular plant.

    Kristin [00:11:29]:

    Yeah, I do think you should share that experience, but we will say that it was resolved within ten minutes. So, as you're hearing this story, I'm sure some of you are going to cringe, but it was resolved in ten minutes. So, go ahead, Stuart.

    Stuart King [00:11:40]:

    So one of the things that we liked to do as part of the factory, sort of what we called them at the time, extended security assessment.

    Kristin [00:11:50]:

    We still do this to this day, actually. It's not just what we like to do then. We still do.

    Stuart King [00:11:54]:

    That now was take a look inside the server rooms. So, server rooms, data centers. I mean, in some places, it was nothing more than a closet inside of a women's room.

    Kristin [00:12:06]:

    Next to the.

    Stuart King [00:12:07]:

    There was that one, yes. But at this particular factory, we walked into the server room, and it was just an absolute jungle of cabling.

    Kristin [00:12:18]:

    Rat's nest is not even. That doesn't even describe it.

    Stuart King [00:12:22]:

    It was appallingly bad. And I remember I was trying to identify what the machine was that was sort of. It was, like, hidden behind a load of cabling and moved some cables a little bit to see if I could see what the label was in front of the device, whether it was a server or a switch or a firewall, just could not see what it was at all. But while I was doing that, we started hearing from outside the room that there was an outage.

    Kristin [00:12:53]:

    No, that's not how it happened. And there was a knock on the door and they grabbed me immediately because I knew that I knew all about the systems and stuff and said, hey, we have an SAP outage. They didn't even make a connection that the network was down yet or the Internet was down. They just knew that the ERP system went down. And I was like, oh yeah, let me check on that. I had a feeling you did something, but I didn't, obviously wasn't going to throw my boss under the bus at that time. I just took the beating for it. So I quickly called the help desk and was like, hey, are you guys noticing there's an outage over here? And then I got one of the network guys on the phone and they said, yeah, you guys are down, what's going on? Stuart put his hand in the server room and then I remember the director of networking said, do not let Stuart touch anything.

    Kristin [00:13:36]:

    So thankfully you figured out that you hit a switch and just re-hit it within a ten minute block and it didn't halt production. We were fine. But that's really scary that there was one switch that could kill the entire production area.

    Stuart King [00:13:49]:

    Just when I'd been moving the cables, it had just leant on a power button and. Yeah, turned something off that should have been on.

    Kristin [00:13:57]:

    Yeah. Obviously that was marked as a critical priority in our risk assessment report. And also I think we did tell the networking team. So they went down and they fixed that eventually, because I think what scared me was that it was that power adapter that was connected to the core switch for the whole facility. Why? Like, nobody understood why. And this is still the reality of what people are dealing with inside of some of these production factories is there's one power adapter or power cable or an extender that's connected to everything and it can be taken down at any moment, which is terrifying.

    Stuart King [00:14:29]:

    Oh, yeah. I mean, we used to joke about it about how easy it would be just to switch things off so we.

    Kristin [00:14:35]:

    wouldn't cry about it because it was that scary. And I think about some of the other critical infrastructures that are out there and thinking it's probably very similar in some respects. And as we've talked with some of our other OT/ICS colleagues.

    Stuart King [00:14:48]:

    Yeah, but what's interesting is the contrast between the neglect you would sometimes see around the. The technology infrastructure, but the absolute care taken around the. Around the production itself.

    Kristin [00:15:02]:

    That's true, but that was because of food safety and food security. It wasn't about our tech. It was about making sure the food was protected and safe for ingestion going out, which makes total sense in the long run. Why would the two be intertwined? But now they are, because there's so much digital around these lines.

    Stuart King [00:15:19]:

    Yes, there is. I mean, everything's now network connected. I mean, even those ovens were, you know, controlled, you know, from an HMI, you know, somewhere or workstation.

    Kristin [00:15:29]:

    They rolled out a plant dashboard in the timeframe that you worked there as well. I don't think they actually finished the project till after you left, but they had these plant view, like one pane of glass things which are super popular, and I don't disagree that that's needed, but they are connected to IoT devices, they are connected to the Internet. They are very much the vital chart of the facility. And a lot of that doesn't have, in my opinion, the right protection around it at the time. I believe that one didn't even have two factor authentication connected on it, and I believe you and I both put up a stink about it. And I think people were irritated that it was going to take time away from production because they had to enter in their, you know, six digit passcode or whatever it was. I'm sure there was some type of middle ground that was met, but you and I were long gone by then, already in electronic factories.

    Stuart King [00:16:18]:

    Yeah, but this. I mean, that's another thing that people also need to be aware of as well, because it's easy to sort of walk into a place and spot all of the sort of, so to speak, deficiencies in the way that the technology is set up and configured, you know, and a lot of the time. And we've seen this everywhere where we've been as well, which is, you know, somebody from, you know, a corporate office, you know, will come along with a policy in their hand and say, oh, you know, you've got to have this configured in a certain way and that set up in a certain way without really having any notion of what that means for the people who are trying to get production runs completed.

    Kristin [00:16:58]:

    It's like the parent telling the kid, you need to do this. And the kid goes, why? And you're like, because I told you so. It's exactly what it is.

    Stuart King [00:17:07]:

    It's the same with environments. It's easy to set a policy that, you know, a workstation times out after so many minutes of not being used, and then, you know, expecting somebody to put in a, you know, a password to unlock it. But, you know, take into account that maybe the people in that environment are wearing thick gloves for protection, and do they then have to de-glove and, you know, put their password in like that every, you know, every five minutes?

    Kristin [00:17:33]:

    Weren't we in a semiconductor factory? And they had biometrics for their eyes on something, and the workers were complaining that they couldn't use it because they wear goggles, and they. It throws the sensor off when they use it, and they can't lift it because of the environment that they're in as a clean room. So if you lift your goggles, you could expose the environment to your eyelashes or whatever other particles could fall off your eyes. I think knowing your environment and understanding what it is is super important. And that was something that was really frustrating, is dealing with a company that kept forgetting it was a manufacturing company. 53% of its revenue came from there. And yet they were so focused on their entertainment aspect, which is not necessarily a bad thing. But when you go to talk to them about how, hey, we're actually a manufacturing company, and yes, we are.

    Kristin [00:18:18]:

    We are creating things that could disrupt the environments around these factories. What always bothered me is people forget what they're actually serving and protecting. It becomes very siloed. And somebody once said silos of excellence, because they're trying to paint a little bit more of a positive light on it. And it's frustrating because we have to now become almost politicians, in a way. And that's a lot of what cybersecurity can be sometimes, is trying to win hearts and minds and kiss some babies to get people to realize that this is important and you need to pay attention to it.

    Stuart King [00:18:49]:

    Well, I think there's a lot of truth in that. I think this is also something that people get wrong when they talk about the convergence as well.

    Kristin [00:18:58]:

    So the bingo card between OT and IT.

    Stuart King [00:19:01]:

    By convergence, I think people think the important thing is having unified policies and procedures, but, you know, it really means having an empathy for, you know, for how people need to get their jobs done more than anything else. And being able to work within whatever constraints that you have within a particular environment, whether it be a food manufacturing plant or an electronics plant.

    Kristin [00:19:29]:

    It's true. And I think that's how we tried to do the extended security assessments is with a people and process thought in mind rather than just policy and security controls in mind. I think it becomes very sterile if you only focus on the security controls rather than how people interact with them and what kind of processes are around them. Because you and I both saw people circumvent security, like in real time in front of us while we were standing there at a security assessment. And then other times they would break security protocols or controls. And I think people who didn't have an understanding of people and process inside of these facilities would have been freaking out. The example I always go back to is the dummy scanners with the passwords that were on the computers.

    Kristin [00:20:15]:

    It was a media factory, so CDs, DVDs, that kind of stuff. And they had barcoded passwords that were next to the terminals. They just picked up a dummy scanner, just like a scanner you'd scan groceries with, scanned it, and the computer would unlock. I think that works great for a production floor. And then people are like, well, what if someone took a picture of the barcode? Nobody's going to take a picture of the barcode because phones are banned inside those facilities because it's a media factory. So it works in that regard. But would that work in a food plant? Probably not. Not necessarily, because again, these little devices are required in certain places.

    Kristin [00:20:47]:

    I mean, safety uses mobile devices and cell phones to, you know, scrape the side of eggs and test them for salmonella and stuff and then stick it in a little device that sticks in the bottom of the phone. That wouldn't necessarily work, but I think it's about understanding the environment, understanding what the need is, and that's what's really important. And that's how we try to do the assessments.

    Stuart King [00:21:06]:

    Yeah, ultimately, technical controls are the easy part. Setting technical controls and defining what those need to be is the easy bit, because that's all written down somewhere. You know, there's a guide, there's a manual. You know, the vendors have got guidance on how to configure their systems. And, you know, organizations generally have, have policies around how they set up their networks. And the rest of it is technical knowledge that's easy to learn. You can learn from a book. The other bit, the part about knowing how to actually address the issues from the personal perspective, that's the really difficult part.

    Stuart King [00:21:48]:

    And you're right, it is very much about winning hearts and minds and being aware of politics and awareness and knowing the challenges that people face every day in their work. And if you can work within those boundaries, then you can be pretty successful when you're trying to actually implement something new, educate people, or just get new security controls implemented.

    Kristin [00:22:14]:

    I just realized we're starting to sound like our S4 main stage talk that we did. I'll link that in the show notes if you're interested in watching us talk about this at a greater scale. And also, we look very fancy there.

    Stuart King [00:22:26]:

    We managed to finish our 30-minute presentation in 15 minutes.

    Kristin [00:22:29]:

    Yes, because we are efficient at that. So let's transition a little bit, Stuart, because now when we were doing security risk assessments, at least on the scale that we're talking about here, we didn't have the aid of AI or any of this machine learning that's going around now. I'd love to have your thoughts on what that's doing for our part of the industry in terms of governance, risk, compliance and that whole world.

    Stuart King [00:22:56]:

    So I think a lot of the way to do an efficient risk assessment really hasn't changed. All that's changed is that there are now tools that put a different spin on the results from some of the devices and solutions that different organizations are implementing. Now there's more intelligence around things like risk scoring, and obviously there are threat management and anomaly detection solutions out there now which pull a lot more vulnerability data about things like field devices straight off the network and present more intelligence around scoring and vulnerability information and all of the things that have been common on, say, normal IT networks for years. There are solutions out there, especially being developed for OT networks, where AI is now able to be useful. In my opinion anyway, that's around quantifying some of the risk scoring and the descriptions of how to address risk. So if you can get a good perspective on the outcomes and consequences of a particular scenario, and if you can tie that into, you know, a specific type of facility and where that facility is and the type of production that's going on there, regardless of whether it's a food manufacturing plant or an oil refinery, then you can use AI to say, well, given this particular scenario and given the perspective on the effectiveness of the security controls that are there, this is a scenario that you actually need to manage the risk of, or it's a scenario that you really don't need to be particularly concerned about because there's an extremely low likelihood of it happening. And that's what we've been busy building over the last few months in the application that we've developed.
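
For readers who want to see the shape of what Stuart describes here, below is a minimal sketch of scenario-based risk scoring with context: likelihood and impact tempered by how effective the existing controls are. The scenario names, weights, and thresholds are illustrative assumptions, not AnzenOT's actual implementation.

```python
# Illustrative only -- a hypothetical scenario-based risk score, not AnzenOT's code.
from dataclasses import dataclass


@dataclass
class Scenario:
    name: str
    likelihood: float             # 0.0-1.0: chance the scenario occurs at this facility
    impact: float                 # 0.0-1.0: severity if it occurs (safety, production, food safety)
    control_effectiveness: float  # 0.0-1.0: how well existing controls mitigate it


def contextual_risk(s: Scenario) -> float:
    """Residual risk on a 0-100 scale after accounting for controls already in place."""
    residual_likelihood = s.likelihood * (1.0 - s.control_effectiveness)
    return round(residual_likelihood * s.impact * 100, 1)


def rating(score: float) -> str:
    """Turn the number into something a plant team can act on."""
    if score >= 50:
        return "manage now"
    if score >= 20:
        return "monitor"
    return "low concern"


if __name__ == "__main__":
    scenarios = [
        Scenario("Ransomware reaches the plant HMI", 0.4, 0.9, 0.3),
        Scenario("Tampered recipe data ships a contaminated batch", 0.1, 1.0, 0.7),
    ]
    for s in scenarios:
        score = contextual_risk(s)
        print(f"{s.name}: {score} -> {rating(score)}")
```

The point of the context is the last column: the same raw scenario looks very different once control effectiveness for that specific facility is factored in.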

    Kristin [00:24:57]:

    Yeah. And I think we need to just roll it back just a bit and say that AI isn't the answer. It's a tool, a facilitator. And we understand that it's a facilitator to help make better decisions. You still can't take it at face value. You cannot Google on AI search engines yet. That's not a thing. Please do your research.

    Kristin [00:25:17]:

    I want to put that disclaimer up because I know a lot of people get kind of twitchy about that and probably will continue. I do want to say, swinging it back to the food industry quickly: a lot of what AI can do is help bridge the gap of knowledge between the digital frontier that we're now seeing the food industry in, more so than it has in the past, and best practices for cybersecurity and what kind of threats are going to need to be dealt with. As Stuart just mentioned, scenarios. For example, you'll be able to implement a scenario that includes, say, a food contamination or some type of food fraud and be able to have a better understanding of what that's going to look like for your organization in all aspects, not just on a cyber front.

    Stuart King [00:26:00]:

    Yeah, one of the things that I really wanted and one of the things I set out to achieve was to really lower the barrier of entry for organizations to be able to complete some of these complex OT risk assessments. They tend to be the domain of the large consulting companies, are very expensive, and are really out of reach of a lot of smaller organizations who would like to be able to do these risk assessments but don't have the resources, either financially or from a skills perspective, to be able to do them. So I figured that there must be a way to create a solution that has some intelligence built into it. And with the advent of a lot of these new AI tools and the APIs that go along with them, there seemed to be an opportunity to try to develop something.

    Kristin [00:26:58]:

    I think what's interesting, too, is a lot of the talks and a lot of the questions that I get when I speak to the food industry or food professionals in general is they always say, I don't know where to start. I have no idea what our program looks like. Or if I speak to cybersecurity inside of food, they always say it's such a big task because they're trying to understand different processes and people within their own facility. And I think you can really use AI as a tool to try to help you at least get the baseline foundational understanding to be able to bolster your programs in a bit more of a priority order as well, because that's what a lot of people always ask is, well, what's the most important? Well, the most important is protecting your food going out. Right. And then the second thing that's most important, if not close to the first, is keeping people safe inside your facility. Those are the two things that will bring down a food company quicker than anything, than an act of God.

    Kristin [00:27:48]:

    What I think people get hung up on is they start to boil the ocean instead of making that cup of tea. And that's what becomes frustrating. And that lowering the bar of entry wasn't just about the financial cost of a product, but also lowering the bar of entry on the knowledge you need to use a tool, because it doesn't integrate with systems. It's just a SaaS tool that sits on a platform and you do all your data entry and then it does the analysis on that. And I think if we had something like that when we were back in the bakery company, or even when we were working with the electronic company and the semiconductor companies, if we had that knowledge with an AI brain running in the background, we could have created such comprehensive reporting that would have been way more useful to the people on the ground, not just to the executives it was given to.

    Stuart King [00:28:33]:

    Absolutely. Because we used to hypothesize scenarios and then try to figure out ourselves whether or not it was something that was realistic or could have any possibility of happening. So with the software now you enter a nice concise description of the scenario and it will even help you write it into some parameters and then it will create a scenario for you. And then out of that you do your business impact analysis and you consider safety impacts, environmental impact, yeah.

    Kristin [00:29:04]:

    Other ESG impacts, environmental, social, governance impacts. I love that the tool actually has air quality on it and an understanding of how that can affect certain types of production, especially ones that have to control temperature and humidity and things like that in various different places. That just comes from a lot of our time around clean rooms. Obviously pharmaceutical people are acknowledging this, or even people who are making supplements will acknowledge this comment. I also think that, oh, by the way, I don't think we've actually said the name of the tool. It's AnzenOT, by the way. I just realized we probably haven't said it yet. And I don't want you to think this is a complete pitch for the tool; we're just talking about it because we're really proud of it.

    Kristin [00:29:39]:

    It's come up out of grassroots knowledge. It's come out of a lot of frustration. It's come up out of a lot of, we need something better than what we have, and we all do not want to die on this spreadsheet hill because that's what it feels like a lot. Right. And I know that the food professionals also have their own spreadsheet hill that they're on as well. And I think that this particular tool will help bridge the gap between some of the different silos and towers. I think that, you know, food defense will be able to have a different conversation when it comes to business continuity planning with, say, their OT or their ICS or their cybersecurity or IT teams, because they can come at it with a different conversation of, hey, they understand that if XYZ happens, this is going to release a contaminant, and then these are the things that we've got to do to deal with that. Or we can be more preemptive.

    Kristin [00:30:25]:

    It's going to help with tabletop exercises and scenario drill runs, which is something that the industry finally is embracing a little bit more of and having that conversation with different stakeholders at a table. I always consider it like adult Dungeons and Dragons, minus the dice and the role playing, but you're still role playing because you're playing your role because, you know, not everybody knows what they're doing. And I think that helps facilitate a better understanding of what the industry, and your industry specifically, is going through, especially in those exercises. So being able to come up with scenarios on the fly, I think, is pretty huge, and it quantifies it. So you get to see how much money you're going to lose if something bad happens. And not just in the immediate, but also in the long term, which we wish we had something like that type of calculator. And I know there's calculators out there and I know people have done this, but this is done in seconds rather than days.
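
The quantification Kristin mentions is, at its simplest, the classic single-loss and annualized-loss arithmetic from risk management. The sketch below uses hypothetical numbers purely to illustrate the calculation; it is not the actual AnzenOT model.

```python
# Illustrative only -- textbook loss-expectancy arithmetic with made-up numbers.

def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
    """SLE: expected cost of one occurrence; exposure_factor is the fraction of value lost."""
    return asset_value * exposure_factor


def annualized_loss_expectancy(sle: float, annual_rate_of_occurrence: float) -> float:
    """ALE: expected loss per year, useful for prioritizing and budgeting controls."""
    return sle * annual_rate_of_occurrence


if __name__ == "__main__":
    # Hypothetical production-line outage scenario.
    sle = single_loss_expectancy(asset_value=2_000_000, exposure_factor=0.25)
    ale = annualized_loss_expectancy(sle, annual_rate_of_occurrence=0.5)
    print(f"Single incident: ${sle:,.0f}; expected per year: ${ale:,.0f}")
```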

    Stuart King [00:31:19]:

    Well, it was always one of the big questions that would get thrown at me whenever I was presenting the results of a risk assessment would be, look, how much is it going to cost me? What's it worth? How much?

    Kristin [00:31:30]:

    And then also, what is the priority on this, exactly? What's the big red mark?

    Stuart King [00:31:35]:

    So, I mean, really, when I started down the development road, actually building this, the perspective I was building it from was just to create a tool that I would find useful and helpful for myself. And then out of that, somehow we've ended up where we are today with this.

    Kristin [00:31:52]:

    Yeah. And I think it actually works out really well because, you know, I've spent the last X amount of time so focused on the food industry and how do I serve them better? How do I provide a way for them to become more resilient? Because that's really what we're trying to do. We're not trying to solve all the cyber problems again. That would be boiling the ocean. We're just trying to help companies become resilient so they can survive the cyber attack when it hits, because it's going to happen. It's not an if, it's a when thing now. And everybody keeps saying that, but I don't think people realize how serious we are when we say that it's going to happen. But are you going to make it through? And if you're a small to mid sized company, which is obviously where we're trying to help the most, that could completely take your organization to bankruptcy.

    Stuart King [00:32:35]:

    Yeah, I mean, a lot of people that I talk to are all trying to address the same challenges within their environments. And, you know, people struggle as well with how to present risk. So just recently, I've been working with a risk manager who's, you know, implemented one of the new, you know, threat management solutions within the environments that he has to manage. And, you know, it gives him a risk score, and it's a number. And, you know, that number is supposed to represent, you know, the level of risk from vulnerabilities within his environment.

    Kristin [00:33:12]:

    We love to number things as humans. I was just thinking about all the medical scales that are numbered and how you have to fall between these ranges, otherwise you're in big trouble.

    Stuart King [00:33:21]:

    But what does that mean? If you're given a risk score of 55, which is roughly the number that this solution is commonly reporting in this organization, everybody would.

    Kristin [00:33:32]:

    Automatically think that's a failure.

    Stuart King [00:33:33]:

    But is it? I mean, it's sort of somewhat medium, but what does that mean? Does that mean I've got to worry? What's the context? And what these solutions don't have is the whole context. They can identify a vulnerability, and a lot of the devices that they're seeing, the vulnerabilities they're picking up are not new. In some cases, they've been there for years. But now that you've plugged in a solution that can see the vulnerabilities, now you've got to worry about it because it's being reported and you've spent a lot of money on buying this solution in the first place. So how do you show that you're getting value for money? Well, you show you're getting value for money by trying to drive that risk score down and get that risk score as low as possible so that you can show that there are results. And there's a number of ways you can get that risk score down. I mean, some of that comes down to how you actually tune the device, because you can sort of fool the system somewhat and say, well, actually, let's ignore some of these devices because we'll take them out of scope.

    Stuart King [00:34:35]:

    And that could be your way of reducing your risk score.
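
A toy illustration of the point Stuart is making: if a site score is just an aggregate of per-device scores, "rescoping" the noisiest device lowers the number without fixing anything. The scoring scheme and device names here are invented for illustration; this is not any vendor's actual algorithm.

```python
# Illustrative only -- a naive aggregate "site risk score" and how rescoping games it.
from typing import Collection


def site_risk_score(device_scores: dict[str, float],
                    out_of_scope: Collection[str] = ()) -> float:
    """Average per-device risk (0-100) over the devices still in scope."""
    in_scope = {d: s for d, s in device_scores.items() if d not in out_of_scope}
    return round(sum(in_scope.values()) / len(in_scope), 1)


if __name__ == "__main__":
    devices = {"hmi-01": 80, "plc-legacy-03": 95, "eng-workstation": 40, "historian": 25}
    print(site_risk_score(devices))                      # 60.0 with everything in scope
    print(site_risk_score(devices, {"plc-legacy-03"}))   # 48.3 after "tuning" the worst device out
```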

    Kristin [00:34:38]:

    So is that kind of like shoving the skeletons back in the closet?

    Stuart King [00:34:42]:

    You know, so what's needed is the ability to put greater context around it. So if you're going to report a score, a risk score, then at least have the right context around it and make sure that it's actually based on a scenario or based on a threat assessment or based on something other than just the fact that you've now plugged in a device that can see that you've got an old PLC that has a vulnerability with a high CVSS score on it.

    Kristin [00:35:12]:

    You're throwing acronyms out and some of the listeners probably just roll their eyes, but it's fine. It all comes back down to people and process. That's what's going to give you the proper context. And if you are working in isolation instead of looking at it as a holistic system, you won't ever be really resilient. If you are only securing one portion of your business, the other portion is where the damage will happen, more than likely. And I think that was the thing that was also really enlightening.

    Kristin [00:35:40]:

    Working in these different types of manufacturing environments is understanding that the enterprise network was viewed as the protected network and must be secure because of the data and all the other things that came with it, you know, the secret data, but they never thought, oh, hey, we probably have a flat network. We don't have anything segmented away from each other. So if you could get into something on the enterprise side, you could probably get into the production. And we found that many times where I would be working at my desk after we'd just gotten back from, I don't know, Malaysia or Thailand or wherever we were after doing a very long extended assessment for a couple weeks. And I'd be going through different IP address ranges and looking through, and this is just a little bit of purple teaming. It wasn't super red or anything. And then I would happen to come into a program inside of a file folder. And that program was the robot arms that were doing the screws on TVs inside of this factory in Malaysia.

    Kristin [00:36:40]:

    And they had told us that it was air gapped. And my response at the time was challenge accepted. But I never thought I would find it in a, buried in a file share, like three layers down just because I was just ding ding, ding, ding, just teetering around like I normally do when I'm just checking stuff out. But that was terrifying because first of all, I mean, it was a moment of like, woohoo, I proved you wrong because I don't like it when people use the term air gapped because it's just an automatic challenge to me. But the second part was, ugh. Oh, you know, great, this needs to be dealt with now because this was a, they said it was a very expensive line and they were really excited about it. And, yeah, that was very, very concerning and frustrating.

    Stuart King [00:37:21]:

    Well, I mean, a lot of things get lost in the complexity, don't they? And a lot of things are left open through firewalls and through file shares.

    Kristin [00:37:31]:

    And they don't even have the firewall configured because they think that having a firewall is compliance. But you need to configure the firewall. Sure, you're up to compliance because you have a firewall installed as a device, but you need to have it configured.

    Stuart King [00:37:44]:

    We used to find that it was the applications that were generally creating the holes through networks. You'd have something connecting out from the OT network that was being serviced from somewhere on the IT side that required some file share to be accessible so that data could be sent back. And some of the applications were just cutting through network security like the proverbial hot knife through butter.

    Kristin [00:38:14]:

    Some of my favorite things were, we'd walk in and then you'd say, well, what's that system up on the wall? And they'd be like. And then you're like, what do you mean? You don't know? There's a network cable in it. Like, I don't know. That's from such and such a vendor. They monitor such and such. And we're like, wait, what? Does anybody have a record of these random machines? We found that in just about every factory we'd ever stepped into, there was some random machine that nobody knew about that was connected to the Internet that was pulling data outside, essentially. And that was hilarious.

    Stuart King [00:38:41]:

    People used to get very nervous because I would start peering in the gaps between machines and underneath them.

    Kristin [00:38:46]:

    And you would do that. But then I would storm into a room that I wasn't supposed to be in. To me, I was there to help them do their job, keep people safe, keep production running, and keep food healthy. So I didn't care what I had to do as long as I was being respectful. Of course, I wasn't being disrespectful, but, you know, the little nuances of finding out that Wi-Fi signal didn't go through flour bags in a full warehouse, or that kind of tribal knowledge you picked up as you went through.

    Stuart King [00:39:12]:

    Like CCTV cameras will get coated with sugar and you won't be able to see out of them if you're in an ingredients plant.

    Kristin [00:39:18]:

    Also, the sugar dust is highly flammable. Didn't know that at the time. That was terrifying, because I was constantly breathing that in. Or the fact that monkeys like to rip cables off networking IP cameras that were sitting outside factories. That's another Asia thing. Yeah.

    Kristin [00:39:33]:

    That was pretty fascinating. But that was a bonding moment with that team because I understand primates and that was fun. But yeah, there's just so much. So we really created AnzenOT out of love for the industry and love for what we do, because we wanted to be able to make it easier for people to do their job and keep people safe and keep food safe and keep critical infrastructure safe or whatever you were doing with it. So, yeah, that's why we created it ultimately, and I'm very proud of it. I think you did a brilliant job with it. I have a lot of input.

    Kristin [00:40:04]:

    I still beat on it for you in terms of a beta tester. Probably will always be, you know, the beta tester for you because I'm very happy to click on it like I'm a crazy user.

    Stuart King [00:40:14]:

    Should also mention that AnzenOT uses a lot of AI on the back end. But I couldn't have made it without AI, because I came into this without having done any programming for like 20 years and had no idea even how to get started. And using tools like ChatGPT has really helped me quickly be able to write code. If I was stuck on a problem, then it's always been sort of quite easy to use AI tools.

    Kristin [00:40:48]:

    And that's how a lot of these AI tools are coming up so quickly, because the ability to create is there and to have creativity within it. And I think there's an AI tool just about for everything now in terms of products. Yeah, I think I've seen just about everything at this point, but the other thing that I love about it is how it can be used in practical application like you've done for this, for AnzenOT. And it still uses AI to help do the analysis and the quantification of the risk, and to have that comprehensive risk score and a better understanding. It's not just about numbers and colors, it's about so much more of the context around the people and the process. But anyway, Stuart, thanks. I love reminiscing about our past life there together working.

    Kristin [00:41:31]:

    And now we're still working together, obviously, because we have this product and companies together. I want to thank our special co-host here, the cat Kai, because she's been here and she's just sat there, hasn't she? No, she has not. I'm covered in hair. She's been all over me. And why do you think I'm sitting all the way off the cushion now? Because she is now fully on the cushion. But that's fine. This is cat mom life, hashtag. Anyways, we are both going to be at the RSA conference this year in May.

    Kristin [00:42:01]:

    We'll be there the 5th through the 9th. I'm also speaking on the 6th in the morning about legacy devices inside of operational technology environments. I'm on a panel with some other really amazing people. So if anybody is planning on going from the security world or the food world, please come find us. Or feel free to message us prior because we'd love to say hey and high five. And I'd love to know if you're a listener of the show. But if you're interested in any type of a demo of AnzenOT, that also can be arranged. I'll put everything in the show notes so you can look at it at your leisure. I've really enjoyed this conversation, Stuart.

    Kristin [00:42:34]:

    It's. We've. It's been a long time coming. We've been talking about it since the inception of the podcast that I've needed to have you on. Since we have the background in food and we've really come through together in this world in such a different angle than I think most people have, but also very similar and very relatable, of course, as well. Do you want to say anything to sign us off in terms of what you see for the future? Any other little quirky stories?

    Stuart King [00:42:58]:

    No, this has been. This has been fun. I think I look forward to catching up with anybody who wants to talk at RSA. And obviously, if people want to reach out to me directly, then find me on LinkedIn or through. Through Kristin.

    Kristin [00:43:13]:

    Well, it will be in the show notes. And also people can just click on the link, which is really very convenient in this modern technology world that we live in. All right, everybody, thanks for listening. Thanks for being here, Stuart. I mean, you do live here, so I appreciate it.

    Stuart King [00:43:25]:

    You said it was easy to get here today.

    Kristin [00:43:27]:

    All right, until next time.
