WEBVTT
00:00:47.000 --> 00:00:49.159
Hi Warriors, welcome to One and Three.
00:00:49.239 --> 00:00:50.359
I'm your host, Ingrid.
00:00:51.000 --> 00:01:09.879
Whatever stage you're in on the continuum of abuse, whether you're currently cohabitating, trying to figure out if what you're experiencing is abuse, looking for a safe way to exit, trying to heal, or navigating post-separation abuse, it's very common to feel lost.
00:01:10.120 --> 00:01:16.359
Most victims aren't aware of the resources available or they can't access the ones they do know about.
00:01:16.680 --> 00:01:19.799
My guest today, Anne, is here to change that.
00:01:20.039 --> 00:01:31.719
She's built an AI technology specifically designed for domestic abuse, putting critical guidance and information into the hands of the people who need it the most.
00:01:31.959 --> 00:01:33.319
Here's Anne.
00:01:33.640 --> 00:01:35.640
Hi Anne, I'm so excited to have you here.
00:01:35.799 --> 00:01:38.200
Welcome and thank you for joining me on One and Three.
00:01:38.680 --> 00:01:40.439
Thank you so much for having me, Ingrid.
00:01:40.599 --> 00:01:41.640
I'm really excited to be here.
00:01:41.959 --> 00:01:42.359
Yes.
00:01:42.599 --> 00:01:49.159
So before we get into our topic, could you just give a little background on yourself so we can all get to know you some?
00:01:49.719 --> 00:01:50.920
Yeah, absolutely.
00:01:51.079 --> 00:01:52.599
Um, my name is Anne.
00:01:52.680 --> 00:01:57.319
I'm an urban homesteader in Denver, Colorado, here with my family.
00:01:57.480 --> 00:02:06.920
What gets me out of bed every morning is working to really challenge systems, to interrogate, if you will, the systems that are failing women and children.
00:02:07.159 --> 00:02:11.480
So that has had me working in evidence-based maternity care.
00:02:11.639 --> 00:02:15.079
That's had me working in childhood education.
00:02:15.400 --> 00:02:31.319
And now, in this space, I'm really trying to pair the evidence of what we know about domestic violence, coercive control, and unhealthy relationships with giving the people who are experiencing that access to the resources and the support that they need.
00:02:31.560 --> 00:02:41.800
Because right now, this is a huge failing in our system that not only costs a lot of women their lives, but costs far more people their quality of life.
00:02:42.039 --> 00:02:42.680
Absolutely.
00:02:42.840 --> 00:02:50.039
And I'm so excited, I'm fumbling over my words, because I'm really, really excited to jump into this topic.
00:02:50.280 --> 00:02:52.919
Because there are not a lot of... actually, I take that back.
00:02:53.000 --> 00:02:55.240
There are a lot of resources out there.
00:02:55.479 --> 00:02:59.800
They're just not accessible to everyone out there.
00:03:00.439 --> 00:03:07.479
And that's either financially not accessible or location-wise not accessible.
00:03:07.719 --> 00:03:19.560
And that makes for a very scary experience for victims and survivors, because they think they know what to do, but they're not sure, and there's really nobody to bounce that off of.
00:03:59.639 --> 00:04:06.360
Certainly as well, um, around having ended up in an unhealthy or even dangerous relationship.
00:04:06.520 --> 00:04:17.240
It's, you know, people can't access resources that are behind a wall that says domestic violence unless they feel like, well, that's what happened to me, unless they self-identify in that way.
00:04:17.399 --> 00:04:24.279
So yeah, there's a lot of reasons that all of those incredible resources can be hard for people to get at the right time, right?
00:04:24.439 --> 00:04:25.560
As early as possible.
00:04:25.800 --> 00:04:26.120
Yeah.
00:04:26.279 --> 00:04:31.720
And I think a lot of people, and actually a lot of people I personally know, they turn to the internet.
00:04:32.039 --> 00:04:38.840
They Google, this just happened, or they go to ChatGPT and say, hey, this happened, what's going on?
00:04:38.919 --> 00:04:49.639
And they're asking advice from sources that may not have a strong domestic violence library, I guess, to pull from.
00:04:50.519 --> 00:04:52.920
Yeah, so first there was the Google piece.
00:04:53.079 --> 00:05:09.160
When people Google their experiences, what they get back isn't particularly dynamic, unlike AI-generated responses, and it's really easy for people to say, well, okay, I read on that website that physical violence is this, this, and this, but he pinched me.
00:05:09.319 --> 00:05:10.600
He didn't slap me.
00:05:10.839 --> 00:05:12.199
So it probably wasn't.
00:05:12.360 --> 00:05:13.560
It wasn't abuse then.
00:05:13.639 --> 00:05:15.240
It wasn't physical violence then.
00:05:15.399 --> 00:05:21.639
There's this barrier to people accessing supportive resources when they're not dynamic, right?
00:05:21.800 --> 00:05:25.879
Because people want to believe that they did not end up in that situation.
00:05:26.040 --> 00:05:34.199
And then, when we're talking about using something like generative AI, you know, ChatGPT is actually doing a pretty darn decent job in that space.
00:05:34.360 --> 00:05:43.240
There's some recent research that came out in March showing that 91% of the time it was fed information about an abusive relationship, it identified it as such.
00:05:43.399 --> 00:05:53.639
So I know we're not gonna poo-poo ChatGPT in that regard, but it's a generalist tool, and it's not built as what we call an AI vertical, right?
00:05:53.720 --> 00:05:56.680
Like we are making a product for a very specific use.
00:05:56.839 --> 00:06:04.279
That means it's highly trained in that regard, and all of the back-end engineering and everything that's going on is trained to that specific use case.
00:06:04.439 --> 00:06:10.199
And certainly that's not true of ChatGPT, as well as there's some privacy concerns with ChatGPT as well.
00:06:10.519 --> 00:06:15.000
Yes, and I think people are just becoming aware now of that privacy concern.
00:06:15.160 --> 00:06:17.959
So let's talk about your technology.
00:06:18.360 --> 00:06:49.240
Yeah, so Amy Says is the name of our company, and Amy is our AI. She's built in this survivor-first space, like I said, in this vertical, where everything that we're doing is to serve the specific and unique needs of people everywhere from that first red flag, like, I went on a weird date, or I'm thinking about going on a date with this person but I'm not really comfortable with this message that was sent to me, all the way through to the very unfortunate rock bottom.
00:06:49.399 --> 00:06:51.639
I'm afraid I'm gonna lose custody of my children.
00:06:51.720 --> 00:07:00.839
Or, the court has removed my ability to visit the children based on false allegations, because I claimed that there was abuse in the relationship.
00:07:01.000 --> 00:07:04.040
So we're really building out in that space.
00:07:04.120 --> 00:07:10.759
Or even the, you know, I'm co-parenting with my abuser, and that's its own special nightmare of post-separation abuse.
00:07:11.000 --> 00:07:16.439
But we took into consideration the things that something like a general AI is never going to do, that it can't do.
00:07:16.600 --> 00:07:18.920
Um, privacy is a huge one of those.
00:07:19.000 --> 00:07:21.560
Um, we never train on user data.
00:07:21.720 --> 00:07:38.680
We're not reading the things that people submit to Amy, and we're not reading what Amy is saying back to them, because that privacy and confidentiality is important to the survivor, and also in the legal context, where there's the question of whether or not you actually waived confidentiality by using an open model, something like ChatGPT.
00:07:39.160 --> 00:07:44.680
Um, and then building inside of the kind of legal prep space, there's some additional protections that people have.
00:07:44.759 --> 00:08:00.839
Um, because you know, if you create documentation or, you know, your timelines, et cetera, in the context of preparing for future litigation, there are um often some additional protections and privacy for that work that are afforded by your state statute.
00:08:01.079 --> 00:08:06.519
Um, but we get to do really exciting things inside of that space.
00:08:06.680 --> 00:08:16.279
So, you know, if you're talking with ChatGPT, you're likely to get a really pretty high-quality conversation about the dynamics that are present in your relationship.
00:08:16.519 --> 00:08:35.000
If you have the same conversation with Amy, she's extracting pieces of information from it, putting them on a timeline, and labeling them with the different types of abuse that are present, and she's going to be able to export that for you, so you can add it to a digital binder that she creates in case you need to go to court.
00:08:35.160 --> 00:08:49.639
So it's really all of those additional layers that my colleagues and I spent crazy amounts of time thinking through and developing, testing, launching, and getting into the hands of folks who really need it.
00:08:50.039 --> 00:08:50.279
Yeah.
00:08:50.440 --> 00:08:54.920
So there are two different, I guess, tiers.
00:08:55.079 --> 00:08:58.679
You can have the free chat feature, and then there are the paid services.
00:08:58.759 --> 00:09:08.039
So let's just start there. There's not a lot more to explain about the chat feature, because I think a lot of people are already familiar with that sort of technology.
00:09:08.519 --> 00:09:09.000
Yes.
00:09:09.320 --> 00:09:20.360
So right now, you know, 40,000 people from 150-plus countries have had anonymous conversations with Amy whenever they need; it's entirely free.
00:09:20.440 --> 00:09:22.840
No data is stored; it disappears.
00:09:23.000 --> 00:09:26.200
And for many people, that is exactly what is needed.
00:09:26.360 --> 00:09:49.320
You know, our AI is trained in coercive control, in what escalates risk, what kinds of additional calculations might need to be made, and what supports need to be provided in that context. That makes it a safer, more utilitarian, that's the word I'm looking for, tool than just an anonymous chat with ChatGPT.
00:09:49.480 --> 00:09:53.000
We keep that free and publicly available.
00:09:53.159 --> 00:09:59.240
We also offer any organization that works with survivors a free embeddable chat for their website as well.
00:09:59.320 --> 00:10:00.679
So folks do not have to leave the site.
00:10:00.759 --> 00:10:03.000
It's free for them, it's free for the organization.
00:10:03.320 --> 00:10:13.080
But we also offer a subscription model, and that allows us to store the information, so Amy remembers you, right, when you come back.
00:10:13.240 --> 00:10:14.679
And some people, that's it.
00:10:14.759 --> 00:10:18.440
They just want, you know, every time they have a chat with Amy, they don't have to re-explain anything.
00:10:18.600 --> 00:10:25.480
Amy has a growing body of information about who you are and what you've experienced, what you need, what your goals are.
00:10:25.559 --> 00:10:53.320
And then also, all of the data management that is a really unique opportunity of artificial intelligence: to tag all of that in the background, label it all, extract key details, put those things on a timeline, and really do the administrative, clerical, paralegal-type work that is incredibly time-consuming, re-traumatizing, and expensive.
00:10:53.399 --> 00:10:55.639
Certainly, if you're gonna have somebody else try to do it for you.
00:10:55.879 --> 00:10:56.039
Yeah.
00:10:56.200 --> 00:10:59.000
And I've um, I'll be honest, I've played around with it.
00:10:59.159 --> 00:11:06.519
I've told a few friends who are in some difficult situations, and we've looked at it.
00:11:06.759 --> 00:11:09.720
And what's cool is like how you said, there's like a timeline.
00:11:09.799 --> 00:11:17.080
So you can start with today, and you enter in an event for today, and then say, okay, I have a little bit of extra time.
00:11:17.159 --> 00:11:19.240
Let me go back to five years ago.
00:11:19.399 --> 00:11:30.120
This is what happened five years ago, and you can put that in, and it's a lot more organized than an actual hard copy, a bunch of papers thrown into a folder.
00:11:30.679 --> 00:11:34.519
Um, the binder feature is really cool.
00:11:34.679 --> 00:11:45.960
And if you're going to go to court for, I don't know, say, custody timesharing, you can ask Amy, and correct me if I'm wrong.
00:11:46.200 --> 00:11:58.360
Like, we haven't gotten this far in playing with it, but you can say, hey, I'm going to court about my son and whether he can have time, whether there should be more time, whether things should change.
00:11:58.600 --> 00:12:05.000
And Amy will pull the documentation that's particular to that situation.
00:12:05.639 --> 00:12:06.759
Yes, exactly.
00:12:06.919 --> 00:12:13.960
So, first of all, binders work because people have been doing their due diligence and documenting things as they go.
00:12:14.120 --> 00:12:25.480
If a brand new user comes on and says, build me a binder, Amy is building it from what she knows about you, the things you've uploaded, the documents that you have there, and the events that she's either documented or that you have.
00:12:25.720 --> 00:12:35.720
So the binder tool is kind of the product benefit of having maintained a consistent relationship with the AI.
00:12:35.879 --> 00:12:43.960
But you're right, you could be like, well, he hasn't paid his support in X months, and I want to file for enforcement.
00:12:44.200 --> 00:12:45.559
You click the binder button.
00:12:45.720 --> 00:12:53.399
Amy asks you two questions and asks you to name the binder, so you'll be able to, you know, identify it next to another one.
00:12:53.559 --> 00:13:06.039
And then she will, in 30 seconds, pull every chat you've had that's related to non-payment and the impact of it, every entry that's in there, and the documentation that you've uploaded for that.
00:13:06.200 --> 00:13:18.120
She'll put it together in a single binder that you can add to and that you can export; you can give it to an attorney, or you can use it on your own if you're pro se, as a self-represented litigant.
00:13:18.279 --> 00:13:23.399
But instead of you combing through, like you were saying, all those loose pieces of paper, you know, we've got those everywhere.
00:13:23.559 --> 00:13:24.440
Oh, I'll remember this.
00:13:24.519 --> 00:13:28.120
I'll just jot it down on a sticky note, and that will be sufficient.
00:13:28.279 --> 00:13:31.080
Uh, she's able to pull all of that together for you.
00:13:31.159 --> 00:13:36.200
It's an incredibly powerful tool.
00:13:36.519 --> 00:13:43.879
Yeah, and in terms of documents, you can upload your divorce decree or whatever court documents you have.
00:13:44.120 --> 00:13:47.879
You can upload financial documents as well.
00:13:48.279 --> 00:13:48.679
Yes.
00:13:48.919 --> 00:13:49.320
Yeah.
00:13:49.480 --> 00:13:51.879
Um, you know, it lists all the different file types.
00:13:52.039 --> 00:13:55.720
She's better at a more narrative format.
00:13:55.799 --> 00:13:59.559
She can't handle a straight Excel spreadsheet particularly well.
00:13:59.720 --> 00:14:07.000
But you upload those things, and you add them to a conversation so she knows that's something that you want her to be looking at.
00:14:07.159 --> 00:14:17.080
And like, I could upload six months of financial records and say, you know, he says he submitted these in discovery, and he says that he can't afford to pay me.
00:14:17.159 --> 00:14:24.919
And she will be like, well, it looks like he spent $16,000 on motorsports equipment.
00:14:25.080 --> 00:14:31.720
So perhaps the priorities are an issue, and she'll provide that information to you without you having to comb through it.
00:14:31.799 --> 00:14:33.320
Um, certainly orders are a great thing.
00:14:33.399 --> 00:14:43.720
If you have orders in there, then when you're writing messages to your ex, you can do it in accordance with the expectations that are outlined in those orders.
00:14:43.879 --> 00:14:49.240
It's like having an expert on speed dial, which we all kind of want.
00:14:49.320 --> 00:14:51.080
Um, it's a really high stakes place.
00:14:51.159 --> 00:14:52.039
It's high stakes for us.
00:14:52.120 --> 00:14:54.919
Like if I make a mistake, you know, I'm gonna deal with the consequences.
00:14:55.080 --> 00:15:00.039
If I make a mistake, I could have to deal with legal consequences, or my child deals with consequences.
00:15:00.279 --> 00:15:02.120
It's a lot of pressure.
00:15:02.279 --> 00:15:08.600
And it's really nice to be able to do that with a companion who can see the broader picture, who understands what those goals are.
00:15:08.759 --> 00:15:13.080
You know, certainly we want to save people time, money, and mental health.
00:15:13.159 --> 00:15:15.639
Uh, and and that's what we've trained Amy to do.
00:15:15.879 --> 00:15:18.840
Yeah, and the more you use her, the more she gets to know you.
00:15:19.000 --> 00:15:26.120
So she knows more of the dynamics, the abuse tactics that are being used, and then formulates an answer.
00:15:26.200 --> 00:15:39.639
If you go in and say, hey, this happened and I need to communicate with my ex, Amy will say, well, given this information we know about your ex, these are the potential concerns.
00:15:39.879 --> 00:15:42.759
And then helps generate an answer.
00:15:43.159 --> 00:15:43.399
Right.
00:15:43.480 --> 00:15:47.480
If you say, I'm concerned that this could escalate, or he might have a negative response to this.
00:15:47.639 --> 00:15:53.240
She knows all of those times you told her and what kind of flavor of escalation he uses, right?
00:15:53.399 --> 00:16:02.279
So she can help you plan through: when I do a risk-benefit analysis, is that a risk that I do want to take?
00:16:02.360 --> 00:16:05.000
How can I mitigate those concerns for escalation?
00:16:05.159 --> 00:16:11.480
She's really a thought partner who, like you said, remembers all of the things that you've shared with her.
00:16:11.559 --> 00:16:14.360
Um, so you don't ever have to go back and explain yourself again.
00:16:14.679 --> 00:16:15.000
Right.
00:16:15.159 --> 00:16:17.879
And then there's even the um, okay, you're sending this.
00:16:17.960 --> 00:16:26.200
Do you want me to help you come up with a response in case that person does their typical response?
00:16:26.519 --> 00:16:31.000
How you go about responding to that response, the whole conversation.
00:16:31.320 --> 00:16:32.279
Yes, yeah.
00:16:32.440 --> 00:16:34.919
Um, or it's like preparing for live conversations.
00:16:35.000 --> 00:16:37.080
Um, I'm gonna be at student conferences.
00:16:37.320 --> 00:16:41.080
This is what you know has happened in the past, like role play with me.
00:16:41.159 --> 00:16:43.080
What am I gonna say if this thing happens?
00:16:43.240 --> 00:16:45.080
You know, what do I say if he says this?
00:16:45.240 --> 00:16:46.759
Um, what if he doesn't show up?
00:16:47.000 --> 00:16:47.879
What should I do?
00:16:48.039 --> 00:16:51.240
Because there's so much kind of pre-conversation anxiety.
00:16:51.399 --> 00:16:58.200
You know, survivors are like, I got caught flat-footed so many times.
00:16:58.360 --> 00:17:02.279
I'm so anxious anticipating the next time that happens.
00:17:02.360 --> 00:17:04.279
I want to do everything that I can to prepare.
00:17:04.360 --> 00:17:11.320
Um, and it's actually really fun to role play with Amy in a chat, be like, okay, if this, then what do I do?
00:17:11.559 --> 00:17:13.960
Or tell me like three things that he could say to me.
00:17:14.039 --> 00:17:18.759
I don't want to be caught off guard, and really preparing for those moments.
00:17:19.000 --> 00:17:19.320
Right.
00:17:19.480 --> 00:17:21.799
And then you can even use it to anticipate.
00:17:21.960 --> 00:17:27.159
For instance, I have a friend whose son wants to confront uh his father.
00:17:27.559 --> 00:17:33.720
And so uh we were playing around, like, okay, Amy, the son wants to say this.
00:17:33.960 --> 00:17:35.079
How can he say it?
00:17:35.240 --> 00:17:36.839
What should he anticipate?
00:17:37.000 --> 00:17:42.519
You know, what can his response be to what we're expecting his father's response to be?
00:17:42.839 --> 00:17:43.400
Yes.
00:17:43.559 --> 00:17:47.160
And like, first, you know, Amy is trained to support adults.
00:17:47.240 --> 00:17:49.799
So certainly there should be an adult interfacing with this.
00:17:49.880 --> 00:17:55.880
Like, how can I support my child, who really wants to do this; these are my concerns.
00:17:56.039 --> 00:18:00.119
Um, you know, how can I support him in this uh this situation?
00:18:00.440 --> 00:18:03.240
Um, yeah, those are really, really good examples.
00:18:03.400 --> 00:18:07.160
You know, I think a lot of people think, like, oh, I just go and I can talk about abuse.
00:18:07.319 --> 00:18:08.920
Like, no, really dig in.
00:18:09.319 --> 00:18:12.440
This is what I'm trying to get to, but I'm concerned about this.
00:18:12.599 --> 00:18:13.799
How do I think through this?
00:18:13.960 --> 00:18:15.480
Is there a name for this thing?
00:18:15.559 --> 00:18:19.079
Because I find myself chewing on it over and over and over.
00:18:19.319 --> 00:18:21.480
Um, what kind of goals can I set for myself?
00:18:21.640 --> 00:18:24.839
How do I take care of myself in, you know, in the context?
00:18:24.920 --> 00:18:25.960
I feel burnt out.
00:18:26.119 --> 00:18:28.680
I feel, I feel resentful, right?
00:18:28.759 --> 00:18:44.119
Like all of the very colorful, rich tapestry of experiences of victimization and survivorship, those are all areas of exploration and documentation that Amy's able to go into with you.
00:18:44.680 --> 00:18:50.279
And like we were talking about at the beginning, there's the lack of accessibility to resources.
00:18:50.440 --> 00:18:52.279
This is something that is very accessible.
00:18:52.440 --> 00:18:55.640
I've spoken with a lot of coaches and they're fabulous.
00:18:55.720 --> 00:18:58.039
And I would still recommend, you know, their services.
00:18:58.200 --> 00:19:00.119
They are really good at what they do.
00:19:00.279 --> 00:19:03.640
Um, but sometimes those come with a higher price tag.
00:19:03.720 --> 00:19:09.319
And it's difficult, especially if there's financial abuse involved, which a lot of times there is.
00:19:09.559 --> 00:19:09.880
Yeah.
00:19:10.119 --> 00:19:10.359
Yeah.
00:19:10.599 --> 00:19:12.119
I mean, I used to be a coach.
00:19:12.200 --> 00:19:13.960
This is how I came to this.
00:19:14.200 --> 00:19:21.960
And people would pay me sums of money that, like, were required; I needed to pay my bills to be able to provide that service, right?
00:19:22.119 --> 00:19:26.680
Like we get to charge for our services, but from a human rights perspective, right?
00:19:27.079 --> 00:19:35.400
We live in a world where people are allowed to be abused, specifically in the context of an intimate partnership, because we wouldn't tolerate this from anybody else.
00:19:35.559 --> 00:19:38.839
The systems would definitely intervene if you hadn't slept with the person before.
00:19:39.000 --> 00:19:42.839
But the systems, we live in a world that tolerates this ongoing abuse.
00:19:43.000 --> 00:19:50.680
I mean, many of these people are just being machine-gunned, buried in false accusations and constant harassment.
00:19:50.839 --> 00:19:54.519
They have needs, their children have needs that are ever constant, right?
00:19:54.599 --> 00:19:56.039
They're being harmed actively.
00:19:56.200 --> 00:20:01.319
So we have these people in this situation, and then we say, okay, but there is support.
00:20:01.480 --> 00:20:07.640
It's available, it'll cost you $150 an hour, and you will need 11 hours a week of it, right?
00:20:08.200 --> 00:20:18.359
From a human rights perspective, it's unconscionable that we don't have the ability to provide that service for free, right?
00:20:18.519 --> 00:20:21.720
Like, free would really be ideal.
00:20:21.880 --> 00:20:32.200
But even within a barely accessible range of financial burden, it just does not exist.
00:20:32.519 --> 00:20:37.559
And Amy provides the types of services that did not exist, right?
00:20:37.720 --> 00:20:44.279
It's really not just duplicating the same kind of services you can get from an advocate or duplicating the same kind of things you can get from a coach.
00:20:44.359 --> 00:20:46.119
If you have access to those, use them.
00:20:46.440 --> 00:21:00.599
She is really filling a space which was just white-knuckled by the survivor before, or was just internalized, or was just considered the cost of doing business, you know, with an abusive person.
00:21:00.759 --> 00:21:03.079
We just don't question it.
00:21:03.240 --> 00:21:05.240
It's just the thing that you have to do.
00:21:05.400 --> 00:21:11.160
And that constant vigilance, that constant stress is literally killing survivors, right?
00:21:11.319 --> 00:21:12.200
It's increasing.
00:21:12.279 --> 00:21:13.400
I'm getting kind of excited here.
00:21:13.559 --> 00:21:14.599
I hope the mic isn't too much.
00:21:14.759 --> 00:21:16.839
Oh, I'm with you, I'm about to jump in here, too.
00:21:17.079 --> 00:21:17.160
Right?
00:21:17.240 --> 00:21:18.519
I'm escalating.
00:21:18.599 --> 00:21:24.599
I feel myself on the on-ramp of a freeway of righteousness here.
00:21:25.079 --> 00:21:26.279
It's killing people.
00:21:26.440 --> 00:21:31.480
It's increasing their risk of stroke, of cardiovascular disease, of heart disease, right?
00:21:31.640 --> 00:21:50.759
Of diabetes, of fibromyalgia, of autoimmune disorders, of substance dependency that has a whole host of other potential physical and mental health implications, CPTSD, anxiety, depression, increased risk of suicidality.
00:21:51.000 --> 00:22:04.839
These are all things that result from the constant stress that comes from having to white-knuckle your day-to-day existence as a victim, as a survivor, through post-separation abuse.
00:22:05.160 --> 00:22:17.960
And, you know, I'll be darned if I'm gonna rest my head on my final pillow at, hopefully, a late-90s age, without having done everything that I can to try to stop that.
00:22:18.519 --> 00:22:18.920
Yeah.
00:22:19.079 --> 00:22:24.359
And as a victim, you didn't choose to be abused.
00:22:24.519 --> 00:22:26.599
An abuser abused you.
00:22:26.839 --> 00:22:40.519
And then because of that, you are faced with the task, if you're still in the relationship, of safely extracting yourself from that person and healing yourself.
00:22:40.759 --> 00:23:05.400
And if you have any connections with that person, if you were married, if you have financial commitments with each other, if you have children together, then you go into the world of post-separation abuse, where every thought you have, every action that you do, every word that you say, you have to process through because if you take the wrong step, then you end up looking like the bad guy.
00:23:05.559 --> 00:23:09.799
If you say the wrong thing, now you're the controlling person.
00:23:10.440 --> 00:23:22.680
And it's so hard, it's so scary going into that world, having no person to fall back on, no information to fall back on for any sort of guidance.
00:23:22.839 --> 00:23:33.480
And I mean, there are stories throughout the world where the actual victimized parent is being removed from their children's lives.