New Delhi: Frustrated by long waits, rising healthcare costs and the feeling that their symptoms are often dismissed, many women are increasingly turning to AI chatbots for health advice. A recent report highlighted cases of women using tools such as ChatGPT to discuss symptoms, understand possible diagnoses and find validation when they feel unheard by doctors.
The shift may be more widespread than anecdotal accounts suggest. A survey of 1,000 women in the UK aged 20 to 50 found that 53 per cent said they would use a free AI tool for medical advice, even while acknowledging that such tools can have an estimated 20 per cent error rate.
The gender gap in healthcare
The report by Intimina, a Swedish company that makes women’s health products, also pointed to economic pressures shaping healthcare choices. About 66 per cent of women said they had avoided booking an appointment with a general practitioner or collecting a prescription to avoid associated costs, while 47 per cent said the cost of living crisis had forced them to delay buying treatments until symptoms felt “severe.”
However, research suggests AI may reproduce some of the same biases women face in healthcare. A study from the London School of Economics last year found that AI models systematically downplayed women’s symptoms compared with men’s, raising concerns that automated tools could reinforce existing gender disparities.
Other research has questioned whether AI actually improves health-related decision-making. A study by researchers at the University of Oxford, published in the journal Nature Medicine, analysed responses from nearly 1,300 participants and found that AI-generated answers were no more effective than traditional internet searches in identifying health problems.
The researchers concluded that using AI to make medical decisions could pose risks because of its “tendency to provide inaccurate and inconsistent information.” They noted that while chatbots now “excel at standardised tests of medical knowledge,” their real-world use could still be risky for people seeking help for their own symptoms.
Yet the appeal remains strong, particularly for women who say their concerns are often minimised in clinical settings. Studies have long documented that women’s pain is more likely to be dismissed or misdiagnosed compared with men’s.
AI players in health sector
In India, similar gaps in access and comfort with healthcare systems have led to new digital alternatives. In 2022, the women’s health platform Pinky Promise launched as a chat-first service designed to make gynaecological care more private, approachable, and free of judgement.
But researchers warn that AI tools themselves may introduce new inequalities. A study from the MIT Jameel Clinic found that several major AI models, including GPT‑4, Llama 3, and the healthcare-focused model Palmyra‑Med, recommended lower levels of care for female patients and were more likely to suggest self-treatment instead of medical consultation.
The study also found that prompts containing typos, informal language, or uncertain phrasing were 7 to 9 per cent more likely to receive advice against seeking medical care, a pattern that could disadvantage people who are not fluent in English or are less comfortable with technology.
Even as hundreds of millions of people turn to chatbots for health queries, tech companies are expanding their ambitions in the sector. In January, ChatGPT Health was introduced to analyse users’ medical records, wellness apps and wearable device data to answer health-related questions. Yet tests by researchers found that in more than half the scenarios where doctors would recommend emergency care, the chatbot said it was acceptable to delay treatment, underscoring the risks of relying on AI for medical decisions.
(Edited by Ratan Priya)

