Forensic Fix

Forensic Fix Episode 16

Episode Summary

In this episode of Forensic Fix, host Adam Firman interviews Simon Bailey, CBE QPM, former chief constable and current member of the Policing Institute for the Eastern Region and the Child Rescue Coalition. They discuss the challenges of combating online child sexual abuse, the role of technology companies in preventing abuse, and the importance of officer well-being. Simon emphasizes the need for global collaboration, the development of safety by design in technology, and the establishment of a global hash database. He also highlights the work of the Child Rescue Coalition in saving children and identifying offenders.

Episode Notes

Takeaways

The well-being and welfare of officers and staff dealing with child sexual abuse cases is the greatest challenge in law enforcement.

Technology companies need to prioritize child protection and invest in preventing abuse rather than just identifying and reporting it.

The internet has transformed child abuse in a negative way, but it also allows for global collaboration and the sharing of best practices.

Forensic companies can assist law enforcement by providing fast and accurate digital forensic analysis, developing safety by design in technology, and implementing global hash databases.

Frontline officers should not be exposed to child sexual abuse material, and technology should be used to identify and flag potential abuse for further investigation.

Sound Bites

"Nobody should have to see, watch, listen to what you and I have unfortunately had to watch and listen to."

"Sometimes people saw it as a victimless crime that a lot of offenders didn't believe they were doing any harm because they weren't committing the abuse."

"The privacy lobby is incredibly well funded. It's very, very well organized."

Episode Transcription

Adam Firman (00:04.462)

So hello, welcome to episode 16 of Forensic Fix, a podcast brought to you from MSAB, where we invite guests from the industry to discuss the latest news in digital forensics, current issues, and to have a general chat about all things digital forensics and investigations. So I'm your host, Adam Firman, a tech evangelist here at MSAB. So I'm absolutely delighted to announce that we have Simon Bailey, CBE QPM, joining us for episode 16 of Forensic Fix.

 

So Simon is a former chief constable of Norfolk Constabulary, closely associated with my own previous background. And Simon, you're now working with the Policing Institute for the Eastern Region, and you're also working for the Child Rescue Coalition along with Protect Children. So that sounds like a busy set of roles. So in your current set of roles, Simon, what does a normal day look like for you? Oh, Adam, it can start very early in the morning with Teams calls to

 

Australia and then conclude with, again, Teams calls or phone calls well into the evening with colleagues in North America. The Child Rescue Coalition's tech solution to tackling online child sexual abuse is a global one, so the demands are global, and it is not unusual to have a day that starts very, very early for me,

 

obviously later in the day in Australia, and then conclude into the evening where, of course, we are five, maybe seven hours ahead of American colleagues. So it's a slightly unusual existence, but a very fulfilling one. So I touched briefly on your previous background. Could you give us a bit more detail, Simon, of how you came to be working in this space? Yes, of course. So,

 

in 2013 I was appointed as the temporary chief of Norfolk Constabulary when my predecessor Phil Gormley became the deputy director general of the National Crime Agency, and at the very, very end of 2013, going into 2014, I was asked if I would consider becoming the national police lead for child protection and abuse investigations. I agreed very, very quickly, and was humbled to have been asked

 

Adam Firman (02:29.838)

to do it, then candidly not realizing what the next eight years would bring, how it would unfold. And I'm not sure anybody could have done. You could say, Simon, if you'd done better due diligence, you'd have had some idea of the gathering storms around Savile, around Rotherham, around the online harms agenda, a very critical

 

Her Majesty's Inspectorate of Constabulary's report pending. But actually, Adam, I had the most fulfilling and demanding eight years probably of my career. And those eight years as the National Policing Lead have set the foundations for what I'm doing now. And I'm now almost three years into retirement. Although I don't refer to it as retirement, it's my second career.

 

and I'm getting a huge amount from it, notwithstanding all the challenges that you know only too well, that we collectively in this space face, particularly with the scale of the online threat, let alone the challenges around familial abuse that we know is still so problematical. The peer-on-peer abuse, which seems to be growing more and more, and just generally the societal trends,

 

and the challenges that we face, and what we face as parents and grandparents, and all those things come together in some respects, because it's very, very difficult, as I'm sure you're aware, to switch off and not then be asked for your advice. Well, my 10-year-old is thinking about getting a mobile phone, I'm thinking about it, what's your advice? You then have to try and get that balance right. So now it's a vocation, it's a passion.

 

It's something that I'm dedicated to, and, crikey, I'm into my 11th or 12th year now. So it's a big commitment. And like you were saying, with your vast experience and knowledge in this area, we know too well what law enforcement is dealing with, and this challenge isn't going away. It's getting bigger. But what would you say is the biggest challenge facing

 

Adam Firman (04:56.43)

law enforcement or forensic units who are dealing with child abuse on a daily basis? So I'll start with the most important thing from my perspective, and I'll then come back to answer the question more fully. But the greatest challenge for me, without any doubt at all, is the well-being and the welfare of officers and staff that are having to deal with this. Because candidly, nobody should have to see

 

watch, listen to what you and I have unfortunately had to watch and listen to. And unless you've been there, you've done it, you've seen it, you've heard it, you can't articulate the horror of what that's like. And look, how do you ever discuss at a dinner party the rape of a baby?

 

Nobody wants to talk about it and candidly people find that hard to believe. So I am increasingly aware of the well-being of officers, their mental well-being. I nearly lost a colleague who contemplated suicide because of what he had to deal with. And whilst we're getting so much better, Adam, at the welfare of our former colleagues,

 

We also know from the research that the research institute at Anglia Ruskin, where I'm the chair, has done, that there was a real stigma attached, and perceived, around seeking help and support for those officers. And we've got to break, we have got to break through that. So that is a huge challenge. And then of course you will be aware of this, probably more than I am, just in terms of the sheer scale of the volume

 

of images that are being created, videos that are being created, the generative child sexual abuse material that's now being produced, the threat of virtual reality worlds and what's going to be taking place in those spaces. And it's that evolving technology. And for police officers, how do you know that the image that you are looking at is not now one that's been created by something like Stable Diffusion

 

Adam Firman (07:22.35)

versus there's a real victim there. And how do you start to reconcile the fact that you are going to be looking at potentially tens of thousands, if not hundreds of thousands, of images and identifying a fraction of them? And I think that's a tough gig. And you touched on a really good point with AI and how we can define what's a genuine image and what's an AI-generated image. And it was always,

 

sometimes people saw it as a victimless crime, that a lot of offenders didn't believe they were doing any harm because they weren't committing the abuse. Now with AI generating images, it gives validation to that even more so. I think in some people's minds, Adam, it does. But I think it's the responsibility of professionals working in this space, like you and I, to start to challenge that myth. And I believe that it is a myth. This is not

 

victimless. You only have to look at some of the latest research that's being conducted by my colleagues at Protect Children in Finland. The number of people searching for CSAM who respond to their redirection survey saying that if they had the opportunity to abuse, they would. I think we have to bust through some of the myths. We have to challenge some of the language that is being used

 

and recognise that this is far from a victimless crime. Every time an image of a child being sexually exploited is viewed, that child is being re-victimised. The generative CSAM just creates the opportunity for people with a sexual interest in children to fulfil their most extreme and obscene and depraved realities. And what does that then lead to? And...

 

I think we need to break open this myth. I think we need to have more of a public debate. The public need to understand it. But I also bear the scars from having tried to do this. And I've come to reconcile myself to the fact that we are so reliant upon certain hardware and certain software as a society.

 

Adam Firman (09:50.254)

there are way too many people that are simply prepared to turn a blind eye because of the operating model that they favor, because of the opportunities and the functionality of certain software, certain solutions which they love and don't want to lose. So it's a case of saying, right, well, it's not happening to me, it's not happening to my children, and they can move on. And I think we have to highlight this and we have to turn around and say, well,

 

you know, these companies are now so big and so powerful that they are almost ignoring what I think is the growing lobby, the growing number of politicians around the world that are saying enough is enough. And actually you only have to look at some of those tech giants giving evidence before, I think it was, a Senate committee in the USA. I didn't see many apologies there, Adam.

 

I didn't see any acknowledgments that actually we have a huge problem. And do you know what? It's not good enough to say, from a Meta perspective, we identify more child sexual abuse material on our platform than anybody else, we're the best in the world at identifying it and reporting it. Well, hold on. How about you become the best in the world at stopping it and preventing it being uploaded?

 

And we have to challenge the narrative. We have to challenge the lawyers' spin on it. And we need to start a global debate around what is truly happening. Because as you understand the technology, these companies could prevent it and stop it. They have the money to do so. Like you say, they are probably greater financial powerhouses than our governments. I couldn't agree more. Profit?

 

Profit has unequivocally been put before child protection. There is no doubt about that. There can be no doubt in anybody's mind. I was reading an article the other day where Norfolk County Council had won a civil suit against Apple because of some issues around the fluctuation in the share price and the impact on the county's pension scheme.

 

Adam Firman (12:15.278)

Apple, I think, were fined in the region of 360-odd million pounds. Two days' profit for them. And it's that stark reality. Well, how about putting a fraction of that into the technology that doesn't allow child sexual abuse material to be stored in iCloud? How about building the technology so that a parent can provide their child with an iPhone that actually says:

 

you cannot receive an indecent image, you cannot send an indecent image. And we know the technology exists to be able to do that. Absolutely. It's there, unequivocally. We know it's there. We know they have the ability to scan. We know. You only have to look at the functionality of the phone. You only have to look at the conversations that you have. All of a sudden, you start talking about something and then feeds start to come through on your phone advertising that very thing for sale. Yeah. Again, there's some myth-busting that needs to be done here.

 

Yeah, and it sort of goes back to when Apple did announce that they were going to start scanning people's iCloud accounts for CSAM: the public uproar was absolutely amazing, and yet Apple were already scanning your iCloud account, because how else would they be able to allow you to run a search for text within a photo? They are scanning your images. So why was there that public reaction? Well, Adam, I think

 

I think what I've seen over the years is that the privacy lobby is incredibly well funded. It's very, very well organized. You only have to look at the fact that the privacy lobby can take out an advert at half-time of the Super Bowl to realize just how wealthy they are, how well organized. And actually it was the privacy lobby that I think kicked that all off.

 

It started the hue and cry because Apple handled it badly, and the rest, as they say, is history. And I don't buy this argument that the technology could then be used to support corrupt regimes. I just don't buy into that. And for the likes of you and I, when we are signing up to a contract with Apple or Meta or anybody else, if you were to read in that contract:

 

Adam Firman (14:42.926)

I expect all my data to be kept private. However, I acknowledge, recognize and accept that if I store child sexual abuse material or material related to terrorism, I forgo that right to privacy, and I know that you will report me. Who in their right mind is going to come out and argue that that is not a reasonable expectation?

 

Nobody would object to that. So if you could have these big tech companies in a room, how would you get them to help law enforcement to deal with this global issue? Well, the first thing I would do is I would get them into a room and I would sit them down and I'd get them to listen for an hour to some of the victims. Listen to a child that's been abused. Listen to the parents of a child

 

that's been abused. Understand the impact it has, understand the trauma that it causes, and for some people the lifelong trauma. And once they've got that understanding, you would like to think that there would then be a realization that we have to do more, and then there should be a conversation with digital forensics experts, with

 

law enforcement experts, investigators in this field, to work out: what do you need us to do? What do you need from us? But Adam, do you know what? It actually starts before that, because too many children are being abused and the damage is already done by the time the police become involved. The tech companies need to go on the front foot and turn around and say, do you know what? We're now going to provide a

 

default setting on a phone that's given to a child that actually says: you can't take a nude and you can't receive a nude. Exactly. Well, can you imagine, can you imagine all the abuse that would be stopped by doing that? The reassurance that would be given to parents. How you could prevent, without a parent saying it's okay, you or I that are in, I don't know, our 50s contacting a child that's 14 or 15. Yeah. No.

 

Adam Firman (17:07.15)

If it's our grandson or granddaughter, absolutely fine. But why would you or I be contacting or messaging a child? No reason for it. All these preventative tools could be put in place. And finally, the key thing would be: if Simon Bailey or Adam Firman want an account, want to join Twitter, we've got to provide verification of who we are.

 

Could you imagine the difference that would make to Twitter? All the abuse that takes place, because it would instantly be known: right, it's this person, and they would then be expecting a knock on the door. And it can be done. It can be done. It's probably the same as well with the availability of people being able to purchase pay-as-you-go SIM cards without registering details. It's exactly the same.

 

It is. It is. And let's bring about really effective age verification. Yeah. You shouldn't be able to access pornography unless you're 18. Sorry. And I'm no prude, but I'm seeing the damage that's being done. Yeah. And it is really... I had a previous guest on the podcast, a lady called Jen Hoey, whose daughter was blackmailed and spent years being scared. And

 

Jen is now very, very much of the view that children should not have screens. But it's a real hard balance to strike, because when children have got peer pressure, when their friends have got screens, it's a hard one to get right, isn't it? It really is. And it's the peer pressure that becomes really difficult, because mum and dad then become the bad actors in all of this. You're the ones that are spoiling my fun. All my friends have got it. Mum, why can't I? Dad, why can't I?

 

And you're not; you're being a responsible parent who is concerned about what your son or daughter is going to be exposed to. And it's absolutely the right thing to do. But I also recognize that actually the peer pressure mounts and at some point the child is going to have a phone. So the most important thing, without any doubt, is having that conversation that says: we need to be open and honest, we need to talk, we need to be able to share everything, we need to be talking about what we're viewing online,

 

Adam Firman (19:33.422)

if anybody makes any inappropriate statements or somebody you don't know approaches you. And you just, you gain your child's trust and confidence, so they talk to you about their experiences. They talk to you about the apps they're using. They talk to you about the conversations they're having. And by doing that, I think you are mitigating not all of the risk, because you're never going to be able to mitigate it all, but the majority of it. Yeah. And that's the message that James Tryon portrays. She set up a charity called

 

Not My Kid, because you never believe it's going to be your kid, and it's about education, not only for children but for parents. Yes, I couldn't agree more. I couldn't agree more. And we've spoken about your work with the Policing Institute for the Eastern Region. For people who are unaware of that project and what's being done, can you share the details?

 

You've got a conference coming up soon as well, haven't you? I'm privileged to work with an amazing professor by the name of Samantha Lundrigan, and Sam is just a star in this space. We've now worked together for coming up to three years in our chair and director capacities at the Institute. Sam leads a great team of academics and we're doing

 

groundbreaking work, particularly on the welfare and well-being of officers and staff, understanding routes to viewing child sexual abuse material, understanding what good preventative measures look like for offenders. We're doing that with organizations in Europe and in the UK here, with the Lucy Faithfull Foundation and the Internet Watch Foundation. So Sam leads, and

 

I do my best to support, a group of academics committed to improving the global response to child protection and other areas of policing. And yes, you're absolutely right: on the 21st and 22nd of this month we will have held our annual conference. I've got a stellar lineup of speakers from all over the world attending, hundreds

 

Adam Firman (21:53.102)

of people joining the conference for what is an event dedicated to improving the whole-system response. So when I talk about the whole system, I'm talking about the education and the awareness. I'm talking about how you target offenders, how you work with offenders, how you prevent children being abused, dealing with the trauma, how you help children recover. We've got some phenomenal speakers attending and what should be a really, really

 

good two days. And then on the first evening I'm going to be hosting the Excellence in Global Online Harm Protection Awards. It's a new concept I've created. I'm building this brand, if you like, with a wonderful former Suffolk Constabulary employee by the name of Lucy Sheehan. Between Lucy and me, we've created a global award ceremony, and there will be six awards made after the black-tie dinner

 

on the Tuesday evening, which, I have to say, will be black-tie optional. So Adam, I think you are coming, so you don't have to wear a dickey bow if you don't want to. Six categories, 27 nominations, again from all over the world. So I'm really looking forward to what should be a fantastic two or three days. And we talk about the internet and this space as being a bad actor, but

 

it can also be good, because it's also bringing us together globally. Yes. And allowing us to work together. We're no longer working alone; the silos are coming down, in my opinion. Yeah. I'd agree. Look, I will now routinely say that the internet has transformed the whole world of child abuse in a negative way. Yeah. But

 

when the internet was being created and the software designers were building their solutions, did they ever envisage that their technology would be abused in the way it is now? No, they never did. But Adam, what should now be happening is that new software designers should be building safety by design into their operating models. That should be an absolute must. And these multi-billion-dollar-a-year companies should now be investing so much more

 

Adam Firman (24:08.846)

in preventing the abuse happening in the first place. And actually going on the front foot and saying, right, we now acknowledge that we have facilitated crimes in a way that we never ever envisaged that we would. We now want to do something meaningful about it. And for me, that's got to be the next step or the next solution. Yeah. And you spoke about the Senate and there were no apologies coming out, but no one is blaming these companies, because nobody could foresee what was going to happen,

 

but they now know that the problem exists. Exactly. Exactly. So I'm blaming them now. Yeah. Do something about it. You've got the money, you've got the technical know-how, do something about it. Yeah. You sort of briefly touched on this, and as you said, you were a former chief constable. What made you want to remain working in this space, because it is a hard space, and not just walk away and retire

 

and have those holidays? It gets under your skin. Yeah. It absolutely gets under your skin. And each day I'll get up and come down to my office or travel to London or jump onto an airplane or train, whatever it might be, thinking that if doing what I care about, what I'm passionate about, means that one child is prevented from being abused, then I'll have had a good day. Yeah. It's not something, Adam, that

 

I think a lot of colleagues in this space can ever say you can easily walk away from. And I still feel that I can make a small contribution to that global response. And look, it is small, but what I do know is that through the Child Rescue Coalition we are able to save thousands of children every year and identify thousands of offenders. So that, for me, is rewarding enough. And I also know, through the research that we're doing at the Policing Institute, that we're making a difference.

 

We're understanding the threat in a way that we never have done. We're doing some fantastic work to help ensure that colleagues that are investigating these crimes are able to do it. So it's not difficult to jump out of bed on a Tuesday morning, head down to my office and put in the hard yards, and they are hard yards, but ultimately this is all about trying to make that difference. Yeah. And you're right. It is the hardest, but yet the most rewarding,

 

Adam Firman (26:34.222)

knowing that you've done something to help and prevent a child from being abused. Yeah, it is. It really is. You spoke about the Child Rescue Coalition there, and we recently caught up at their annual gala. What message would you have for anyone working in law enforcement who is dealing with CSAM and is currently not using CRC technology? Well, thankfully, I can say with great confidence, I think everybody in the UK is. So that's...

 

And look, the results speak for themselves. The UK is the world's leader in tackling online threats. And what I am now seeing over the last 12 months is a step change in the number of countries around the world that are now using our technology and using it to great effect. Yeah. And actually our technology and the suite of our technology is growing. We're being able to deal with more threats. We're being able to protect more children. And actually the conversations that I'm having,

 

with my colleagues that I'm working with, encourage me more and more that we are making more than just a little bit of a difference. And for me, that's what it's all about, Adam, as you know. And we know one of the biggest things holding back law enforcement is finance, yet CRC technology for law enforcement is free. It is. It is.

 

And all we ask when we're delivering the training is just to cover the costs of the trainers; the officers' time has been paid for anyway. So look, we've trained thousands of people around the world. And we're now looking at how we can improve our business model, how we can deliver referrals that are the best that a police officer is ever going to be able to get to identify an offender. So that's the direction of travel. That's what we are going to try and do. We've got to keep pace with the scale of change,

 

in the same way that you at MSAB are having to keep pace with the scale of change, and that's difficult, and that requires big investments in technology. So look, it's not cheap doing what we're doing, but through our generous benefactors, our generous donors, through the support that we have from some phenomenal organizations, you know, we are producing tech that is really making a difference. And touching on finance in law enforcement, and obviously I

 

Adam Firman (28:55.214)

was in a department that, in your previous role, pestered you a lot for money because we wanted software. That must have been a hard balance to get right, because obviously we wanted everything. Of course we did, and we were very fortunate that you understood that mission. But across all of the police forces in the UK, how do you get that right, Simon? So Adam, look, that's the...

 

That's one of the great strengths, but also a weakness, of policing here in England and Wales, in that you have independent corporations sole called chief constables that can determine with their police and crime commissioners or their mayors what the priorities are. And actually I was fortunate that I had very supportive police and crime commissioners that understood what I was doing, trusted me to invest in technology where it was appropriate, and look,

 

you were in there. We had one of the best digital forensic departments in the country. There was no doubt about that. Some brilliant, brilliant people doing some phenomenal work. There were never any huge queues. Submissions were being turned around within, what, six, seven weeks? Yes. In some parts of the country, it was 12 to 18 months. Yeah. Ultimately, Adam, that comes down to the chief constable's priorities, his or her priorities.

 

And we could then go off on a whole tangent around: is that model fit for purpose? Should we be looking at regionalization? We both walked a little way down that road when Norfolk and Suffolk got very, very close to sharing a control room and the politics got in the way of it. Ultimately, it's a chief constable's decision where they invest. My organization knew that vulnerability, exploitation, and abuse were very much a part of my tenure

 

in my eight years. And I think we, Norfolk and Suffolk, got very good at dealing with it and tackling the scale of the threat. Yeah. And it is hard, because each chief constable is going to have their own set of issues, because the UK is a very small place, but regionally it's very different. Yes, very much so. Yeah. And like you said, that's the hard part for the respective chief constables, to get that balance right, because they need to meet

 

Adam Firman (31:21.166)

their region's issues. Yes, and of course, you lived through it, I lived through it, austerity was not easy. No. Austerity was very, very difficult for every strategic leader at that time across the whole of the public sector. We all struggled. The one trick we really missed was better collaboration; closer working relationships would have saved jobs and would actually have maintained or perhaps even

 

seen a better service being delivered, if we'd had the foresight to do it. Yeah, yeah. And I totally understand that each police force needs to maintain its identity, but there could have been a lot of closer working to save money. Yeah. You don't have to give up your cap badge. No. I asked you the question about what you would ask these big tech companies to do. And obviously you helped

 

direct and fund money towards forensic companies who are providing software for law enforcement, who I work for now. How do you think the forensic companies can assist law enforcement in tackling this issue, in things such as officer wellness? So the critical things for me are: we, the forensic companies, could become a really powerful lobby in pushing for one global hash

 

database; that would make a big difference. The key for me around the forensic providers is the speed at which you can deal with the huge volumes of memory you're now having to deal with, how you break through encryption and passcodes, how you're able to recover images that people have tried to delete or hide. It is that whole suite that actually affords

 

an officer the confidence of knowing that they will very quickly get all the digital material that they've recovered examined, and a highly, highly accurate evidential report produced, and knowing that they've recovered everything that they can. That, for me, is the key. And actually, 12, 18, 24 month backlogs, Adam, in digital forensic units are not acceptable.

 

Adam Firman (33:49.39)

There's no need for that. The technology is there, and the likes of MSAB provide that type of technology. So why would we, why would forces be waiting that long? And actually, let's understand that there's a victim in so many of these cases. It's not right that you can be waiting that long. Yeah. I think the hard balance is, and I look back and I feel very proud that I was part of the unit that, I believe,

 

got that balance right. And it's about accepting that a forensic lab can no longer deal with every single digital device that's part of evidence, and you need to lose that sense of "it's my precious, I need to keep this". That workload needs to be shared out, but officers need to be trained to understand and report on information that they can take to court and report on. I'm not expecting a frontline officer to report on

 

how a deleted artifact got onto a device. But if it's a simple set of text messages between a drug dealer and their customers, most people can comment on that. They understand the text conversation. And I'm still finding police forces globally who are not allowing frontline officers to review that data because they're not a digital expert. Yeah. Yeah. And I agree. And

 

there needs to be some reflection on what we actually want our cops doing. Do you want to train them up to be able to deal with everything, or is that just a pipe dream? And do you know what, Adam, I firmly believed, and to the best of my knowledge didn't allow it when I was the chief of Norfolk, that your response officers should not be viewing child sexual abuse material. There's no need. You suspect it's there, you complete your forensic

 

documentation and you send it in to your DFU. I don't want cops looking at that. Right. And I know that MSAB has put technology in so that when phones are being extracted on the front line you can now load the CAID database. So while the device's information is being sorted through, before the officer even views it, if there's a CAID hit it tells them: do not look at this, submit it to a DFU. And that's the sort of

 

Adam Firman (36:14.414)

technology that I think needs to be available for frontline officers. Agreed. I couldn't agree more. And all an officer needs to do is run the device over a phone and just get an indication that there's a chance there's abuse material, and they stop. They don't need to do anything more. Complete the forms, get it into the DFU. They need to trust the technology, and you'll save an awful lot of problems further down the road. Yeah.
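The triage workflow described here, hashing each extracted file and checking it against a database of known material so the officer never has to open a flagged file, can be sketched generically. The sketch below is illustrative only: real systems such as CAID use vetted hash sets distributed through specialist tooling, and the function names and directory layout here are hypothetical, not any vendor's actual API.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()


def triage(extraction_dir: Path, known_hashes: set[str]) -> list[Path]:
    """Return files from an extraction whose hashes match the known set.

    Only digests are computed and compared; matched files are flagged
    for DFU submission and are never opened for viewing.
    """
    return [
        p
        for p in sorted(extraction_dir.rglob("*"))
        if p.is_file() and sha256_of(p) in known_hashes
    ]
```

In practice a frontline tool would load the hash set once, run this pass over the extraction before any gallery view is rendered, and suppress display of anything `triage` returns, exactly the "do not look at this, submit to a DFU" behaviour described above.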

 

And that's where you spoke about this global hash-sharing database. That could make it so much easier. It could, it really could. And funnily enough, if any of the big tech companies are listening, they could probably have access to that database as well and easily scan their files, which we know they're scanning anyway. Why would you not want that to happen? Exactly. But Simon, I just wanted to say a huge thank you. I know how busy you are, I know, with the conference coming up,

 

how many loose ends you're trying to tie together, and thank you for giving up your time and joining us. I'm sure our listeners will have found your journey, and why you're still in it, incredibly insightful and humbling. I'm going to post a link to your LinkedIn profile in the show notes so people can connect with you and hear about the work you're doing at the Eastern region and the CRC. And I just want to thank you once again for giving up your time, Simon. Adam, it's been a pleasure. Thank you very much indeed.