Forget 'Dr Google'! Five expert-approved tips for getting the best medical advice from ChatGPT

It's something that nearly one in ten people in the UK own up to doing, a figure that doubles among under-35s.

Seeking medical advice from ChatGPT has become increasingly common in recent years, as AI models become more capable and GP appointments scarcer. Recent studies have even shown that ChatGPT can pass medical licensing exams and solve clinical cases more accurately than humans.

But the phenomenon has also raised concern among the medical community, particularly given the chatbot's propensity for 'hallucinating', or making things up.

In some particularly disastrous examples of ChatGPT's poor advice, a man poisoned himself with potassium bromide after being advised to use it instead of salt, and several teenagers were tragically encouraged to take their own lives.

For better or worse, however, it's a phenomenon that shows no signs of going away. Luckily, say experts, there are some simple ways to ensure you get the safest and most accurate medical information from chatbots.

So read on for the best way to get medical advice from ChatGPT, and the scenarios where you need to speak to a human expert rather than an algorithm.

Use it for treatment, not diagnosis

The best way to use a chatbot like ChatGPT for medical advice, says pharmacist Deborah Grayson, is for treatment ideas rather than diagnosis.

'If you're pretty certain you know what's wrong with you, then ChatGPT can be quite a good option,' she said. 'If you know you've got the flu, for example, you can ask what the best way to support you would be, and it will likely recommend paracetamol and rest.

'When you're very clear about what the issue is, you can get some fairly standard advice.'

The danger, says Ms Grayson, is when you're trying to use chatbots to get a diagnosis for something.
'ChatGPT will trawl all sorts of sources, and it's not very good at differentiating between the very rare and the likely,' she explained.

'Much like Googling the cause of a headache and being told you have a brain tumour, chatbots can easily scare users by suggesting a diagnosis that is statistically very unlikely given the symptoms.

'When you're using it to try and figure out what's wrong with you, you lose a lot of the clinical nuance that you would get when seeing a medical professional face to face.'

Give it as much information as possible

When you are trying to get a diagnosis, or ideas of what your symptoms may mean, says Ms Grayson, more information is better than less.

While the urge may be to simply fire off a quick question, giving more detail about what you're experiencing will allow the chatbot to rule out the more unlikely diagnoses.

'What you put in is what you get out,' she said. 'The more information you provide, the better the response you'll get.

'Rather than putting in a quick question, if you want the most accurate response, list all your symptoms, as well as how long they've been going on for, and any other relevant medical information, and ask it to give you some examples of what the issue might be.

'That will give you a much more rounded picture. If you only put in a few symptoms, the chatbot will cast a much wider net.'

Be a proactive patient

Rather than relying on ChatGPT to diagnose a medical issue, consider using it to make you a more proactive patient when seeing your GP.

'We only have a short amount of time with our doctors these days, so having a look beforehand at your symptoms can help you be more targeted when you do have an appointment,' Ms Grayson explained.

'Bring in the results you are given by a chatbot and ask your doctor about them, or use them to ask for more testing for specific issues.

'It can help you be more empowered, especially if you're a nervous patient.
'And sometimes, asking about specific things can help you get treatment.'

ChatGPT can also help you decide whether or not to book an emergency appointment, says Ms Grayson.

'In my pharmacy, I often see patients come in and list off a whole load of symptoms that clearly require urgent treatment, but they didn't feel it warranted an emergency appointment with their GP,' she said.

'Use chatbots to help figure out just how serious the symptoms are, and whether or not it can wait.'

Specify the source

Another trick to ensure the answer you're given is medically accurate is to ask the chatbot to stick to certain sources when finding information.

'Ask ChatGPT to clarify the source it's relying on,' says Ms Grayson. 'Or, before you even get an answer, state in your question which sources it can pick information from.'

Some of the safest options would be the NHS website, government pages, or online research databases like PubMed, says Ms Grayson.

'If you've already received an answer, turn the question back to it,' she explained. 'Ask it to tell you the source and see whether it came from third-party websites, or the NHS website or one of a similar standard.'

Know when it's not appropriate

Most important, says Ms Grayson, is to know when not to use ChatGPT for advice. 'If you've got a red flag symptom, do not go to ChatGPT for advice,' she said.

Red flag symptoms include extreme fatigue, weight loss, unexplained or prolonged pain, unexplained bleeding, issues with heart rate, persistent vomiting, changes in bowel movements, and persistent fever.
'It's also important to caveat that while doctors can be hard to get hold of, pharmacists are quite readily available,' said Ms Grayson.

'If you've gone to a search engine and are not sure how to proceed, it can be helpful to bring it to a real medical professional for further help or guidance.

'People say they go to ChatGPT for accessibility, but your average community pharmacy will likely be equally accessible, and able to cut through the additional noise.'