Reclaimed & Unashamed

AI and the Future of Intimacy: The Struggle for Authentic Connection in a Digital World (With Dustin Freeman)

Kolton Thomas Season 1 Episode 21

Send us a text with feedback about the show or questions you'd like to see addressed.

Imagine a future where artificial intelligence is aimed at our deepest connections and desires. This is the potentially unsettling reality that Dustin Freeman and I wrestle with in this episode centered around AI and its impact on intimacy and sexuality. As technology seeps further into our lives, we confront the jarring reality that AI's role in intimacy could lead to a seismic shift in how we perceive human interaction—possibly sidelining the enriching complexities of real relationships for the convenience of simulated companionship.

As I sit down with Dustin, the pastor of a growing church and a father of five boys, we peel back the layers of how artificial relationships actually pale in comparison to the messy but beautiful tapestry of genuine human bonds. The fear that we're inching towards a world where emotional growth with real people is just an option looms over us. It's a poignant reminder that while AI can attempt to mirror the nuances of human intimacy, it lacks the transformative power of shared experiences that shape our character and resilience. 

Lastly, our discussion takes us to the evolution of pornography aided by AI, an area of concern where synthetic experiences could eclipse the need for authentic intimacy and sexual interaction with a real person. This scenario isn't just speculation; it's a genuine possibility that demands a response. We advocate for approaching technology very cautiously when it comes to our relationships. While we believe AI could serve some useful purposes in the relationship realm, such as an occasional tool for receiving feedback about ourselves, we also believe it is dangerous to rely on AI as a tool for improving our relationships, let alone to treat it as a source of intimacy in and of itself.

Our heartfelt message is one of empowerment, encouraging men to cultivate relational fortitude in the face of seductive digital advances, championing a future where meaningful human touch remains irreplaceable.

Music credit: Music from #Uppbeat (free for Creators!):
https://uppbeat.io/t/ben-johnson/some-kind-of-feelin
License code: QVDADXXPNNVQMPKV

Support the show

If you're enjoying this podcast, Reclaimed and Unashamed, please consider leaving a review and making a donation to help us deliver more life-changing content!

Donate here with PayPal - help me release consistent, well-researched, and zero-cost-to-consumer content by making a monthly recurring donation.

Subscribe on YouTube
Follow us on Instagram
Join Our Private Facebook Group for Men

Speaker 1:

Welcome to the Reclaimed and Unashamed podcast. We are helping men to rewire their brains and overcome the shame that often surrounds unwanted pornography use. I'm your host, Kolton Thomas, and we are back with episode number 21 with guest Dustin Freeman. We've had him on our podcast before. Dustin is a personal friend and a mentor, and I think this conversation is really important. It's about AI and how it relates to relationships and intimacy, as well as pornography. Before we get started, I'd just like to say it's good to be back, guys.

Speaker 1:

For those of you who have been tuning into the podcast since the beginning, you know that we've taken a few months off and it's been a little while since I've released an episode, so I guess we can call this season two, maybe. Anyways, I've got several episodes recorded and ready to air, so for the next several weeks you can expect consistent episodes, released probably bi-weekly, maybe sometimes weekly. It is good to be back with you, guys. I'm really excited to share these conversations I've been having. I think they're so important, and I think they're going to help you move forward if you're looking to quit your habit with pornography, and that's really our goal here, right? I want to help you get results with powerful knowledge and information and also actionable takeaways from the podcast. So it's my prayer that this podcast does just that for you, and I know that Dustin has some really insightful things to say on the subject of AI, so now I'm going to turn it over to him. Enjoy it, guys. Dustin, really excited to have you back on the show.

Speaker 2:

I'm glad you're back, Kolton. Thanks for asking.

Speaker 1:

I really enjoyed our last episode. We talked about the epidemic of loneliness in men, and I feel like that really struck a nerve with a lot of guys who listened; I heard really great feedback about that episode. So today we're doing something that I think naturally relates to that conversation, and it's a danger, I think, for increasing loneliness to a much higher degree, and that is AI. You and I have had some conversations about AI. I've heard it come up at this church; I know you guys are thinking about it and talking about it here at St. Andrews. It's such a vast subject, such a hot topic; you can go and look up tons of interesting articles on it. Today we were hoping to focus on AI and relationships, and AI and pornography, and the implications there. So we're thinking about the future and the dangers in the future, but also right here in the present and the now, because there's a lot already happening. I'm sure we could easily talk about this forever, but I'd love to hear you elaborate more on your thoughts regarding AI.

Speaker 2:

Well, it's funny. I'm really bad at knowing how long ago things happened, but when ChatGPT really dropped, like most people, I was immediately caught up in wonderment. I was like a kid in a candy store: wow, this is so amazing, it's like the future's almost here, that's really cool. And at the same time I was horrified by all the possibilities; my brain ran off thinking about what kinds of new problems something like this can create. I don't know about you and those viewing, but I've watched different documentaries and things. What is the one on Netflix about the rise of the internet and phone addiction?

Speaker 1:

Yeah, the Social Dilemma.

Speaker 2:

The Social Dilemma. Great little documentary, and one of the things the people in there talk about is how, starting out, they really felt like in building the internet they were going to save the world. It was this powerful democratizing force that was going to give everyone a voice and bring us all together. The amount of pure idealism that was in that. And I'm 43, so I was kind of coming of age with the internet, and I felt that. But then you see all these consumer forces that ultimately end up directing how it's used and where it advances. A lot of the people in that documentary find themselves in a place where they wouldn't let their own kids use the services they built, because of the kind of addictive and fundamentally dehumanizing tendencies that no one great big evil person was in the background creating, but that market forces created as an almost inevitability. And I feel like all those same kinds of things are just in the air as AI is on the rise.

Speaker 2:

I'm not someone who's necessarily for or against technology. I just recognize that, as long as human beings have brokenness inside of us, any tool that extends our capabilities is going to mean that there is greater good and greater evil done, greater harm. That's just sort of the nature of things. And so the pornography connection is an important one, and it made me think differently about the possibilities of AI. Not just that there would be AI porn, but, when you think about pornography and how it functions in people's lives, I think in many ways it's so ubiquitous because porn provides a really cheap way, and by cheap I don't just mean inexpensive financially, I mean it just doesn't cost you anything, to have immediate gratification with some of your most basic intimacy needs.

Speaker 2:

It's just really easy, and that's hard to say no to, because those needs are real, and then to just be able to push a button and have some sense of them being addressed is really attractive, and that's a reward on multiple levels.

Speaker 1:

There's an intimacy and relationship reward; it's artificial, but you kind of feel that reward. But there's also a dopamine reward, all kinds of chemicals released. We know from other science podcasts I listen to, like Andrew Huberman's, where he's constantly talking about how dopamine is something that's meant to offer us rewards towards long-term goals that we're gradually working towards. So the feeling of satisfaction we get and some of those feel-good chemicals are supposed to come in conjunction with building something long-term that's tangible, of value, right? But pornography doesn't offer that. It's instantly disposable, but it's giving you that feeling, the feeling without all of the realities that accompany it, right?

Speaker 2:

Well, it just strikes me, and it struck me immediately, that in the same way that pornography cheaply meets a lot of our most basic intimacy needs,

Speaker 2:

AI would create the possibility of having much more complex, advanced, profound intimacy needs met in very cheap ways. And so, again, there are already a number of services out there inviting you to have a relationship with an AI partner in some form.

Speaker 1:

Have you heard of Love Plus?

Speaker 2:

No, I really haven't looked at a lot of the different services, I just know they exist.

Speaker 1:

Yeah, so I was just researching this to prepare for our podcast, and it's from Japan.

Speaker 1:

It's very popular in Japan, and I think it's actually out for the Nintendo Switch, I believe, so a mainstream console. And with men in Japan it's so popular that it's not uncommon to see a man going on a date with his virtual girlfriend at, like, a dinner. You go out to a restaurant and there's a man sitting there with his Switch, having this date. That might sound bizarre to us in our culture right now, but I think we're going to see it become more and more commonplace here. And that's something that was developed before now, when AI is getting so much more powerful. So this Love Plus has actually been around for a while, and it's been popular in Japan for a while, but it's going to get increasingly more sophisticated, to the point where I think it's going to have a market here in the United States, and we're probably going to see that kind of thing more often, especially as the dating landscape is already janked up.

Speaker 2:

That's right, and it's been shaped by technology in ways that, at least from what I hear from folks who are participating in it, have not been whole or healing or constructive or life-giving.

Speaker 1:

The stories where people meet and get married are kind of the exception, not the norm.

Speaker 2:

Yeah, and there's a lot to say about that. But to your point, right now, when you hear about, or maybe if you saw, someone going on a date with an AI in some form, that would strike us as weird. It would seem deviant; we wouldn't feel good about it. But I think, on the trajectory we're on, it won't stay a fringe thing. For now people feel like it's weird, but eventually there will be people who are against it and other people who are saying, look, this isn't hurting anyone, it's making people happy. In fact, and I've even seen this, those who are creating some of these things would say, look, we're offering a cure for loneliness. There's a loneliness epidemic; here's medicine for it.

Speaker 1:

Like it'll be a positive for mental health.

Speaker 2:

Right. So the critique that I would want to bring to that is really what I'm thinking about. It's not just, hey, this is weird, deviant behavior, so stop it for its own sake. It's more about what I see happening, and I've got to step back a little bit here to get into this. I don't know if you're familiar with the idea of the male gaze.

Speaker 2:

It's an idea that has floated around in gender studies and stuff, but basically it's the notion that, like for a long time, women and men have learned how to think about women by watching TV and movies and reading magazines, but that the people who are making those shows and movies and whatever were looking from a male perspective.

Speaker 2:

And so, I mean, maybe the most obvious way of explaining something like this would be a woman in a beer commercial or something, where, whether you're a man or a woman watching something like that, you're basically being trained in a very subtle way to see her not as a person but as an object. Right, it's objectifying. But part of the craziness of it is, again, that it's not just training men to see women as objects; it's also training women to see themselves that way. But I would say that there are other gazes. We're trained by media and different kinds of things to see ourselves and the world in particular ways, and one of the most powerful gazes that I think is out there is the consumer gaze.

Speaker 2:

So if you just exist in the world, you're constantly being asked what your preferences are. There are these big, powerful entities that are trying to understand what you want so they can sell it to you, or they're trying to shape what you want so they can sell it to you. It's a little bit of both.

Speaker 1:

Right.

Speaker 2:

And you learn to relate to the world as a person for whom the truest thing about you is that you have preferences that need to be met, so that your whole way of being in the world is consumer-shaped. And we bring that to our relationships with people, so that we expect the person across from us to relate to us the way a business relates to a customer: what do you want, how do I meet your needs? So it's extremely self-oriented, and I think dating apps, and a lot of things, leverage that and are taking us further in that direction. But the possibility of having an AI intelligence that I can make look the way I want it to, that I can make sound the way I want it to, and whose whole reason for existing, whose whole purpose, is to relate to me in the way I want it to relate to me? Think about how that shapes me as a human being. And what's nuts is this.

Speaker 2:

Again, when people talk about AI and its dangers, a lot of people run to these Terminator scenarios, and it's like, well, that's pretty sci-fi, who knows, maybe we end up there. But AI doesn't even have to be self-aware for there to be a lot of complicated challenges that we're going to face. And one of them, you alluded to this: we're already competing with technology for face time, for real interaction with other people. The better technology gets at being the place I want to be and being a conduit for my communication with other people, the better I get at that, but just by number of hours spent I'm going to tend to get worse at this, at being face to face. And it's not just that I'm becoming a little bit worse at social engagement; it's that everybody else is too. So there's this feedback loop where we're all getting a little bit worse at relating to each other while the machines are getting better and better at relating to us. And when that's going on, it just makes sense that we would become more and more invested in relationships with technology and virtual people rather than real people, because the truth is, real relationships are always hard. They're always complicated.

Speaker 2:

I mean marriage, but I also mean friendship, I mean family, any kind of human relationship. It's hard work all the time, but it's also the place of greatest value in our lives. Being able to work through problems with someone else, learning to forgive, earning the trust and respect of another human being, that's the most meaningful, beautiful stuff in life. And if I don't have to do that work with someone else, I'm never going to grow. That's the kind of thing that's going to make me grow as a human being: showing up for the real relationships with the people I'm growing in love for and with.

Speaker 1:

Right.

Speaker 2:

But if I have an entity whose whole purpose is to know what I want and to give it to me, I'll never have to do any of that.

Speaker 2:

And as AI gets better and better at providing that, it knows when I go and where I go, it's always available to me.

Speaker 2:

It knows all my preferences, it can do all kinds of things for me, and of course, as we're having more and more of our romantic lives happening on our phones anyway, the difference between a real relationship and a relationship with an artificial intelligence would get smaller and smaller. So again, maybe we get to a future where there are love bots and that's a whole thing, and who knows how near or far away something like that is. But I just think, with the technology that currently exists, this is already a challenge, and it's just going to get bigger and bigger, such that you get to a place where, if we're buying into this wholesale, more and more people are having relationships, even friendships, with AI instead of real people. I mean, that sounds really over the top, but it just starts to sort of unwind the social fabric of our world.

Speaker 1:

Yeah.

Speaker 2:

So that's not even an explicitly Christian argument. Obviously I'm a pastor, and obviously there are significant spiritual realities to the things I'm talking about. But just in social terms, if I were a politician or some kind of public philosopher, I would be really worried about this anyway, not just because I would say God doesn't think you should love a machine, but because I think it will be fundamentally destructive to individual lives, giving them the appearance of something without the reality of it, and also to our civilization as a whole. Yeah, it'll create people who are stunted and unable to negotiate and work with each other.

Speaker 1:

Yeah, and we're already feeling the effects of that. You're right. And The Social Dilemma, that documentary about social media, we're already used to the idea of how algorithms are training us, because we see it through social media, and social media has been driving more and more sophisticated algorithms. But everyone's familiar with an exponential multiplier. When's the last time you looked at something like two to the power of 10? It gets crazy really fast.

Speaker 1:

I really see AI as a multiplier for the kinds of issues we're already facing in our society from the algorithms that social media has created. So you can imagine, if social media is the two, then AI comes along, very sophisticated, and makes it two to the power of 10, and it will reinforce all of this. Like you said, it'll blur the line so much between what's lifelike, what we need, and what's real. Man, it's scary to think about. And The Social Dilemma wasn't a Christian film either, you know. So I think people realize it's manifestly obvious these things are kind of tearing us apart.

Speaker 2:

Yeah, we're all addicted. There doesn't appear to be a place to stop the train, and honestly it's hard to conceive of there being any way to get off the train. The way our civilization works in a consumer culture, stuff that sells is going to keep selling, and that's going to drive innovation. So, honestly, to bring it back around to a Christian thing,

Speaker 2:

I will say, I think one of the main opportunities for the church, to be the church as it's meant to be

Speaker 2:

in the days and years ahead, will be as a place where people are willing, I hope, to do the hard work of having relationships with other people, to get into the messiness of that, in spite of the fact that it's hard, but believing intrinsically that it's worth it, because of the value in other humans. I like what you're saying about the multiplier thing. I think a thing that helps give some flesh to that is if you think backwards from where we already are. It's not that long ago that the power of pornography was limited: you had to go to a store and face a person and buy something, buy a magazine, or drive who knows how long to the specialty video shop. I'm thinking of rural areas, where I grew up, where that's not just something you see, but like 50 miles over

Speaker 2:

there's just one place with a big triple-X sign up, right? It was only a very motivated person who wasn't embarrassed, a very specific person, who would go source those things. Now it's ubiquitously with us, yes, but that's only a drop compared to the reality of the power these things will have.

Speaker 1:

Yeah, so another example is Snapchat; it's kind of an early adopter of a customizable AI. I was on Snapchat the other day, and for a fee you can get Snapchat Advanced or Snapchat Plus, and it allows you to type in some sentences to customize an AI friend in your app. So I think that's going to become something we see everywhere, right? More and more.

Speaker 1:

And, like you said, I've been seeing all kinds of research and statistics lately showing how, around 2007, all kinds of markers of mental health and social skills started to decline, and that's when the iPhone came out, right? I think we're going to look back and see the same thing around AI releasing and becoming more and more common. Again, it's so eerie, because we're already seeing the damaging effects of technology, social media, an iPhone in every pocket, and I just can't imagine it getting a whole lot worse. But it could. Oh, absolutely. And I think that's why this conversation is so important, why it's so important to have these conversations and for people to be thinking ahead. Because if you're already allowing algorithms to dictate a lot of your time from day to day and take away from your relationships, and if you're not guarded against that, it's going to get worse; it's going to happen even more so. So I think this is an important conversation.

Speaker 2:

Yeah, I think that's great. So the implicit question in all the stuff we're saying, hopefully, to your point, is that it's not just, hey, everything's going to be terrible, be afraid. That wouldn't be very helpful to say. It's more to say that part of the power of technology is to create new defaults. I think about even small changes that have been made to different apps, like going from having to choose to go to the next page to infinite scroll. That's a huge deal in keeping me on my phone, and many of them are changing the default, so there's a sort of passive movement. To recognize, as you say, that these things are happening creates the possibility of stepping back from them and being intentional.

Speaker 2:

I'm not someone who's thinking of going full Amish, where I'm going to completely step away from all modern technology. Of course the Amish have that reputation; the truth is they do adopt technology, they just do it very, very slowly. But I think there's some wisdom in that: the delivery of new technology to us isn't something that we should just accept in a neutral way, but something that we should be more cautious about, more careful about and more thoughtful about.

Speaker 1:

And we can't trust these big tech companies to do that for us. We can't trust them to protect us or our minds, because economics is what's going to drive their decisions at the end of the day, even if they mean well, even if whoever the CEO is makes a speech and means well. We've got to learn how to guard ourselves against this stuff.

Speaker 2:

And we can trust corporations, we can a hundred percent trust them, to always do whatever is necessary to make a profit, and that is, frankly, to mine our attention and our interests. Anything that's going to be addictive, and often it's lowest-common-denominator kind of stuff: it's going to be fear, or lust, or shame, or one of those kinds of buttons. That's the easiest way to move a human being.

Speaker 2:

So those are going to keep getting pushed harder and more effectively and more often.

Speaker 1:

So yeah, I'm already using ChatGPT on a near-daily basis to give me bullet points of organized thoughts that I'll take and write out for different kinds of things, whether it's my work with Reclaimed or preparing for this podcast episode. It's kind of ironic, using AI: hey, give me some points we could talk about in a podcast episode warning about the dangers of pornography and AI. So I do want to slant this here at the end towards pornography specifically. We've been talking about how AI is going to make certain things seem almost irresistible and how we're going to have to train ourselves to guard against them. Now think about pornography and how irresistible that already is to so many men, and then think about adding an AI component where you can customize things almost in real time. Just to expand your imagination, guys that are listening: once AI and video capabilities get advanced enough, and we're already seeing a lot of clips on the internet showing how much you can do, and it's scary. I've seen videos where it's like, hey, turn this barn into a castle and show me walking up to it in a video, and it's just perfect, right? So with pornography, it's not a question of if, it's when it's going to get to that point. And again, if pornography is capitalizing on your sexual fantasies and relationship fantasies, its ability to lure you in is kind of dependent on how well it can do that, right? So you add AI on as that multiplier, and it's suddenly going to give you the ability to customize your fantasies in real time.

Speaker 1:

And I think here's one of the scariest things to me: even asking AI to depict real people, right? Celebrities, actresses. Take Taylor Swift; you can go and find a news article about this. She bought up a ton of domain names around her name because she knew that eventually AI deepfake pornography would get so sophisticated that she at least wanted to make it a little harder for people to find that kind of thing. Wow. So that just gives you an idea of how serious this is going to be. The fact that you could ask AI to portray people, against their will, in a pornographic video is so frightening. Yeah, it is. Like, what do we even do about that? What is there to say to that?

Speaker 2:

Well, I think, just on the personal moral level, so many of these things are slippery slopes, and I think in general, where sexual morality is concerned, the best defense is to build a fence. I mean, the use of pornography is already shame-inducing for most people, for many people, especially Christians who do use it. But if you found yourself, you know, deepfaking your own porn of somebody you have a crush on, a person you know, the moral implications of it are really terrifying. Because, again, we take seriously that viewing something like that shapes the way you see a real person. It doesn't just objectify them; I have to think it would increase the possibility of real behaviors being acted upon that are dangerous, violent.

Speaker 1:

Because you're no longer watching pornography filmed by someone who lives in Hollywood, right? You're watching pornography where AI has allowed you to use someone in your real circle of relationships, someone you know. And you think about the penalties for, for example, putting up a camera and watching someone you know; there are severe penalties for that. But is the law going to be quick enough to catch up to protect people? Right?

Speaker 2:

And so I guess I'm just saying, on multiple levels, there's a lot there. There's the potential for harm to the other person.

Speaker 1:

Right.

Speaker 2:

To the person being portrayed, or from the person who's been using that kind of pornography acting out, the potential for harm seems much greater. But even then, what I was originally going to say is, if a person gets to the place where they've done that, the amount of shame and self-loathing seems like it would be so much greater, because it's no longer an impersonal kind of wrong that's been done, in some sense, against yourself and some hypothetical stranger.

Speaker 1:

It's, it's much deeper than that, Um, but the way you avoid it is to stay away in the first place, you know, like not to uh start to go down that rabbit hole and so, um, as these new tools come about, just keeping a wide berth seems to me like that yeah, you know we've had podcast episodes where we talked about there's a lot of healthy and positive motivations and reasons to like remove pornography from your life right, or even social media and algorithms, for the most part to a certain degree, not beyond like what's practical or what you can handle, but like you got to start guarding against this stuff. Yeah, this episode I think is helpful because it's more about the dangers. We're trying to help men find a motivation by kind of striking some fear of the realities of what's to come. I think that fear of what could happen to you if you don't start with spiritual disciplines or other disciplines in your life to protect against pornography. Then you could wind up in a place like that.

Speaker 2:

Well, yeah, to connect the dots as clearly as possible: these things are nascent, they're just beginning to take form today, and they'll be significantly more powerful tomorrow. So whatever you've got going on, if you're looking for motivation to step back from pornography or an addiction to pornography today, well, that's where all this is going. And if you think it's hard today, it's only going to get harder.

Speaker 2:

So there's no time like the present to seek healing and to begin to build habits and practices that will protect you from those things, because, as addictive as it is, as we've been saying again and again, it's hard to imagine that it's not going to get exponentially more powerful at getting and holding on to your life.

Speaker 1:

Yeah, well, guys, if that doesn't strike enough fear in you to start doing something about this. One last point I want to make: I think one thing that enables men to sometimes spend years looking at porn but remain in this place where it's kind of secretive and they're still living their life, where their lives are somewhat normal from the outside while they're watching porn all this time, sort of playing the porn game. I think the consequences of watching it will come on more quickly and be more apparent if AI has more powerful dopamine-surging capabilities. So maybe that'll be a good thing; maybe some guys will see, hey, this is too much, and that line will be clear: I cannot cross this line, I need to stop. And I hope that, by listening to this podcast, that's what happens for more men.

Speaker 2:

My fear is that, as pornography becomes more able to provide the appearance of a complete relationship experience, it will take us away from a place where it's seen as, well, this is aberrant behavior, in the sense that it makes it harder for you to have a healthy relationship. Like, if you're looking at porn all the time, you're not going to be able to look at your partner without comparing, and it just kind of destroys intimacy there, those things.

Speaker 2:

But if porn can take you all the way to a partner that talks to you and shows up and meets your higher intimacy needs, maybe it becomes, well, there's no concern about this; this is where I'm going to stay, this is where I'm going to live, this is my sexuality. I don't know what that will be called, but it will have a name; there will be some kind of coinage for it, where I'm a person who chooses to have that kind of relationship. And again, apart from a conversation about gender and sexual expression and preference, those kinds of things, just simply understand that the consumer power of a tool like that will be such that it'd just be easier for anybody to go that way. And when it's getting harder and harder to make real relationships with people,

Speaker 2:

when we're lonelier and lonelier, that's going to seem like a more and more appealing alternative, and it just cannot give you the same things that a real relationship can.

Speaker 1:

Yeah, AI intimacy, that's one of the terms I hear thrown around a lot, a term that's getting coined. Guys, if you hear that, it should scare you, the fact that it's going to become a more and more common phrase and it's going to become more acceptable.

Speaker 2:

Something you said kind of shook another thought loose for me. You were saying that maybe the advancement of these things, by increasing their potency, will shake some people awake or something. But maybe one of the ways that could be true, sort of a different angle, is that we've already been shaped for a long time, in a lot of ways, like I was saying, to objectify each other.

Speaker 2:

It's as if I were looking for a mate simply to find a collection of beautiful body parts or something, just to have access to those things, like that was the point of the relationship, rather than a real human partner, another eternal soul to learn how to love and trust and forgive and do all those kinds of things with that you can only do with another human. Maybe the emergence of something like intimacy with AI helps underscore the ways that we've already often been treating people we have real relationships with like we wish they were just AI. It's like, I'm in a relationship with a woman, but my fantasy is basically that she would treat me the way the AI treats me, that I just want her to do whatever I want her to do, to cater to my preferences, to simply cater to my needs. How many people think that's what they want?

Speaker 2:

And of course we all want a partner who's kind and who listens and those kinds of things, but the deeper truth is that we all need a real, whole human being to relate to, who can and will challenge us when we need to be challenged, who will ask us hard questions, who won't buy our crap, you know what I mean. That's the kind of person that will help us actually grow and mature, and whose respect is worth winning and having. And so maybe, in some ways, an upside in the brokenness of AI intimacy is that we can learn to see how we've already sought that from other people in a broken way.

Speaker 1:

Yeah, I think a great way to end this episode is to challenge guys to find ways to use AI to improve your relationships. There could be some, not going into the deep-end rabbit hole of actually getting your intimacy from it, but if there's ever an opportunity to get feedback, then maybe see what you can do there. I think you don't necessarily have to go Amish on it, although maybe one point of this podcast is that the Amish had it right all along, but there are some ways it could help improve your life, I think, so seek out those ways. Guard yourself against a lot of the things we talked about today, and you know that's what Reclaimed is here for; that's what the mission of Reclaimed is for, and we're not going anywhere. We're going to have a lot more content, more conversations and episodes looking into the future, looking at AI, and our mission is to see as many men as possible learn how to be self-aware, learn how to find and connect with the purpose in their lives, so that their lives would be better than porn: lives that are pornography-free and that don't need AI intimacy when it comes, so it's not as tempting, having the relational skills developed to be able to withstand those kinds of temptations. So, anyway, everybody, thanks so much for listening. Dustin, thanks again for having another conversation. Yeah, really enjoyed it, man. Until next time. Yes, sir. All right, man.

Speaker 1:

This episode was a little bit of a wake-up call for me, I don't know about you guys, just to guard my heart and guard my eyes against things now, because what's to come, I think, will be even more tempting and powerful and alluring. So I'd like to start talking about action steps at the end of episodes, because guys that know me and have done my 10-week program and have worked with me know that I'm really serious about taking action. Thinking back on this conversation, I think one actionable step that really makes sense is to lean into the relationships in your life, even the difficult ones, especially if that's your marriage, but really any relationships, whether it's in your family, whoever it might be. Lean into those and learn to repair those, because when AI comes along providing this artificial companionship, it's going to be really tempting to choose that over relationships that aren't serving us well but are still important, people that we want to be close to but that might take a little work. With AI relationships, we're not going to have to work at all; instead, AI is going to be working to try to please us. So I think we really need to learn the skill of pursuing and repairing relationships.

Speaker 1:

I think that's the actionable step for today, because if you wait to do it, it's just going to get harder, and we already know that if you're hurting in the relationship area and you're feeling lonely, pornography is going to prey on that as well. But now we have AI and pornography, and sometimes those things are going to come together as a package, and they're going to make a formidable predator on our emotions and our loneliness, and they're going to try to leverage our selfishness against us and against other people if we allow them to do so. But we don't have to allow them to do so. We have agency and we have freedom in Christ. Sigmund Freud believed that we were slaves to our instinctive sexual desires and impulses, and here on this podcast we're calling that bogus. We are grown, independent men with a choice, and my prayer is that you believe that and you strive for the right choice, and you wouldn't be here if you weren't. You can do it, guys, so I hope that's helpful.

Speaker 1:

Now that we're back and breathing some new life into the podcast, I'd really appreciate it, guys: if you haven't left a review for the podcast, now's the time, please go and do that. But that's all I have for now. Be looking for another episode to drop in a couple weeks, and you guys take care. And please, if you want to step up and get more involved with Reclaimed and with what I'm doing and what our community is doing, I invite you to check out our community and our app at community.reclaimedrecovery.com, and there you can join. You'll have to answer some questions, because I make sure the community is safe, and those questions are there to safeguard it and make sure everybody's going through an approval process, and that everybody there is there because they legitimately want to grow together with other men who are overcoming pornography. All right, guys, till next time. Take care, and God bless.
