5 Things To Know Before Asking a Chatbot About Cancer
What are the risks of this cancer treatment plan? Why did my doctor recommend chemotherapy instead of other treatments? Why won’t my doctor give me a prognosis? Is there something my doctor’s not telling me?
A cancer diagnosis is a lot for a person to process. The flood of information that comes along with it can be overwhelming. Patients and families naturally have a lot of questions. After meeting with their care team, many turn to the Internet for answers — social media, Dr. Google and, increasingly, ChatGPT and other AI-powered chatbots.
These patients may be asking chatbots the questions that they were not comfortable asking their doctors. Or they may not have easy access to a medical professional to ask follow-up questions. Unfortunately, the answers they’re getting from chatbots may be hurting more than helping.
Ghulam Rasool, PhD, a researcher in Moffitt Cancer Center’s Machine Learning Department, is leading a study to evaluate the accuracy of chatbot answers to cancer-related questions. Although he understands the barriers some patients face in asking sensitive medical questions, he cautions against putting too much stock in AI-generated answers.
“At the end of the day, these models are simple predictors. They predict how often, on average, one word appears after another word,” he explained. “So they don’t have any sense of what they are outputting.”
Rasool’s team has partnered with Health Choice Network to interview 50 cancer survivors about questions they had during their journeys but may or may not have asked their medical teams. The research team is now processing more than 500 questions through various chatbots to gather the AI-generated answers. The responses will then be evaluated by experts across Moffitt for accuracy, relevance, safety and harmfulness.
Although the research is still in the preliminary stages, Rasool notes there are several important tips people should already be taking into consideration when turning to ChatGPT with medical questions:
- Without sharing any private personal information, give the chatbot some context when asking a question. “If you just go to ChatGPT and type, ‘I have these symptoms. What does it mean?’ the information the chatbot gives you is going to be vague and may not be relevant,” Rasool noted. “So providing the right context actually helps the chatbots focus their answer on the question.”
- When you get a vague answer, feed it back into the chatbot and ask for clarity. “You can say: ‘This is the information that you provided. Can you please go back and look for references? Review this information and make sure it is correct,’” Rasool said. “That forces these chatbots to process and reason about their output and, in most cases, identify gaps and hallucinations and provide a better response.”
- Check the sources of the information. “Most of the time chatbots like ChatGPT will give you clickable links so you can see the reference,” Rasool said. Make sure the information comes from reliable sources, such as the National Cancer Institute or other trusted health care providers.
- Beware of outdated information. The chatbots available now have been trained on data that is one or two years old, Rasool warns. In contrast, doctors and scientists are making advances in cancer screening and treatment every day. “The information that patients or survivors are getting from these chatbots, even if it is linked to some webpage, may not reflect the most current guidelines that physicians or the clinical care team are following now,” Rasool said. “New information is coming every day. New clinical trials are being added all the time. So the most updated information may not even be available on the Internet.”
- Always talk to your medical team. “Your clinical team is in the best position to answer questions,” Rasool said, noting that you can use your own research to help get more out of conversations with providers. “Once you have gathered information from different sources, you can make a list of questions and talk to your medical team to get the most updated and current information.”
As Rasool and his team continue their research, he expects the findings from his study will help set the stage for cancer centers to one day design chatbots that are specifically trained to answer questions from patients and survivors. Until then, he recommends that anyone searching online proceed with caution:
“There is a lot of fake information out there. Perhaps this type of study can actually help all of us think about these things and whether we should trust this information before taking any action.”