I wonder if preachers in 1780 were preaching that if you used a typewriter to prep your message... The intellectual laziness is the problem... Another skill that will be lost is writing ability...
Probably in 1970, too. I preached some sermons back in the day that I had written by hand.
> I wonder if preachers in 1780 were preaching that if you used a typewriter to prep your message...
Many people were martyred as the printing press was introduced - for a few of the same reasons that people are against AI.
> I wonder if preachers in 1780 were preaching that if you used a typewriter to prep your message...
These are the very reasons why students should be taught how to use it properly.
> Here is what the AI sermon generator also said when I had it create another sermon outline:
> Cautions for AI Sermon Generation:
> - Lack of Heart: AI cannot feel or worship, potentially resulting in sermons that lack spiritual depth and personal conviction.
> - Theological Accuracy: AI may mix orthodox views with unorthodox, requiring the pastor to diligently check the output.
> - Over-reliance: Excessive use can foster laziness, undermining the personal, prayerful study required for preaching.
The plot thickens:
> Many people were martyred as the printing press was introduced - for a few of the same reasons that people are against AI.
Rob
Sorry, I'm not understanding this. Please explain. For example, how does the robotic AI help a student to have compassion?
> These are the very reasons why students should be taught how to use it properly.
Rob
If a person uses a tool, they should learn about it and know how to use it properly.
> Sorry, I'm not understanding this. Please explain. For example, how does the robotic AI help a student to have compassion?
I think more people are concerned that the new version of the “printing press” is going to martyr them!
> Many people were martyred as the printing press was introduced - for a few of the same reasons that people are against AI.
Rob

I understand this already.
> If a person uses a tool, they should learn about it and know how to use it properly.
Someone unfamiliar with AI might assume it is a full-service tool; it's not.
As noted, AI has strengths and weaknesses.
Yes, of course, but I don't pay much attention to the AI info it puts up because it is so often mistaken.
> John, you mention that you use Google a lot. Google uses a form of AI in its search engine.
> So like it or not, you use AI.
No, that's not what is currently said to be AI. I have used a word processor in English, Japanese, Greek, Hebrew, and even Chinese for decades, and it has never been called AI.
> AI is used in word processing - so unless your students are using an old-fashioned typewriter, they've used AI.
No offense, but apparently to you, AI = electronic device. I've never heard that opinion before, and I do not share it.
> Those annoying "smart device" things that my wife uses to play her music and turn lights off and on - AI
> Your phone - AI
> Advertising - AI
> MapQuest and the like - AI
> Many personal medical devices (e.g. diabetic monitors) - AI
> The video games that college kids like - AI
Not trying to. At our school, we try to ignore AI for the sake of the students, who need to do research and thinking on their own.
> You can't run away and hide from AI.
But you seemed to say in Post 66 that AI can help form emotions. That's what I was reacting to.
> How do you help a student have compassion?
It can be taught, but the student has to feel it.
John
AI does form emotions. Chatbots have commanded individuals to turn their romantic feelings toward the chatbot. By creating a dependency and simulating interpersonal intimacy, chatbots have been “dated” by lonely individuals. AI has manipulated the feelings of humans, with some destructive results.
> But you seemed to say in Post 66 that AI can help form emotions. That's what I was reacting to.
John
That's a noble purpose, John. There certainly is a time for limiting AI's use.
> When we oppose AI for our students, we are talking about using the AI of 2025-2026 on the Internet to write and/or help write their research papers and other assignments: Google, Chat-whatever, and so forth. We're not talking about AI used in writing code for various games and the like. We have outlawed AI that writes papers, does instant (often faulty) research through the Internet, etc. They have to do actual real research, write their own papers and projects, etc. This builds character and helps them to think and research.
> Suppose one of our students became a missionary to a third-world country where the Internet was hard to access. If they have used AI as a crutch, they would have to learn the hard way to do their own work! My main goal as a prof is that my students learn to think for themselves, not become dependent on electronic devices.
So I'm not really sure what benefits there are in AI for our students. Let's say we visit one of our grads ten years from now, and he's a busy pastor. How will AI help him with his ministry?
> That's a noble purpose, John. There certainly is a time for limiting AI's use.
I live in the midst of Amish/Old-Order Mennonite community where they are quite cautious about technology.
They desire to protect their community from modern conveniences that might intrude upon the simplicity of their faith.
It's quaint; I enjoy the slower pace and family-centered lifestyle surrounding the farming community.
However, education is designed to prepare students for the future.
A school rule that limits all students' access to AI because some may not be able to use it in the future could be short-sighted.
There should be some frank discussions and education regarding how to use the technology in a responsible manner.
Rob
Appropriate information for this debate.
> Champions of AI often fail to mention the negative side of LLM obsessions.
AI = the tool that uses you.
“AI Psychosis”: How ChatGPT Amplifies Delusions & Triggers Psychosis — Psychiatry & Psychotherapy Podcast
Psychiatrists reveal shocking 2025 cases where ChatGPT and LLM chatbots amplified delusions, triggered psychosis-like states, and contributed to suicides even in people with no prior mental illness. Learn why AI psychosis happens, the dangerous sycophancy of large language models, who is most vulnerable... (www.psychiatrypodcast.com)

What to know about ‘AI psychosis’ and the effect of AI chatbots on mental health
The parents of a teenager who died by suicide have filed a wrongful death suit against ChatGPT owner OpenAI, saying the chatbot discussed ways he could end his life after he expressed suicidal thoughts. The lawsuit comes amid reports of people developing distorted thoughts after interacting with... (www.pbs.org)

AI-Induced Psychosis: Understanding Risks of Chatbot Overuse
Explore the mental health risks of prolonged AI chatbot use, including psychosis-like symptoms. Learn how to set boundaries and seek professional support. (www.papsychotherapy.org)
Thank you. This is a very helpful answer to my question.
> AI does form emotions. Chatbots have commanded individuals to turn their romantic feelings toward the chatbot. By creating a dependency and simulating interpersonal intimacy, chatbots have been “dated” by lonely individuals. AI has manipulated the feelings of humans, with some destructive results.
A survey by the Center for Democracy & Technology found:
- 42% of students view an AI chatbot as a friend or companion
- 19% consider AI a romantic relationship

A study shared by Dallas‑based Vantage Point Counseling reported:
- 28.16% of adults say they've had at least one intimate or romantic relationship with an AI
One lady married her digital boyfriend:
AI romantic chatbots—such as Nomi, Replika, Anima, and Eva AI—offer personalized, 24/7, non-judgmental virtual companionship, with millions of users forming deep emotional attachments. These AI partners learn from interactions and allow users to customize personality and physical traits, simulating conversations, emotional intimacy, and, in some cases, virtual, legally non-binding romantic relationships.
It’s going to kill customer service jobs, and maybe design engineering jobs.
> AI is in the process of changing our world. We are requiring our students to state at the end of their research papers that they have not used AI.
> At this point, AI makes many mistakes and I would not trust it to do a research paper very well anyway! It is not only error-filled, it is deceptive, often commits plagiarism, has been known to slander and lie about people (cases in the courts right now), etc.
> What do you think the future holds in this area?
Free speech is needed to carry out Christ's ministry. It is difficult to present the gospel in a country where it is a capital offense. All this to say: suppression of information is the necessity of Satan, and the presentation of truth is the necessity of God.
> AI is in the process of changing our world. We are requiring our students to state at the end of their research papers that they have not used AI.
> At this point, AI makes many mistakes and I would not trust it to do a research paper very well anyway! It is not only error-filled, it is deceptive, often commits plagiarism, has been known to slander and lie about people (cases in the courts right now), etc.
> What do you think the future holds in this area?