Microsoft Copilot Health: Connect Your Medical Records & Wearables

▼ Summary
– Microsoft has launched Copilot Health, a secure feature within Copilot for managing health data, with a phased rollout and a waitlist for access.
– The tool is designed to help users understand their health information by importing records and wearable data, but it explicitly does not provide medical diagnoses or replace a doctor.
– It connects users to provider directories for finding medical professionals and cites information from credible health organizations to improve answer quality.
– Microsoft states that health chats are isolated with enhanced privacy controls, data is not used for AI training, and users can delete their data or disconnect sources at any time.
– Unlike some competitors, Copilot Health is not currently a HIPAA-compliant consumer service, though Microsoft says it follows high standards and holds ISO 42001 certification for responsible AI.

Microsoft has introduced a new feature called Copilot Health, designed as a secure environment within its Copilot AI for managing personal wellness information. This tool allows individuals to ask questions about lab reports, review medical records, search for healthcare providers, and analyze data from fitness trackers. The feature is launching in phases, with a waitlist available for early access. Microsoft emphasizes that this is not a diagnostic tool but a resource for better understanding personal health data.
Users can import their medical history from more than 50,000 U.S. hospitals and clinics via HealthEx and bring in lab results through Function. The platform supports integration with over 50 wearable devices from brands like Apple, Oura, and Fitbit. The homepage can display real-time information such as step counts and reminders for future medical appointments, based on the data each user chooses to share.
Finding a suitable doctor is another function. The service connects to live U.S. provider directories, enabling searches filtered by specialty, geographic area, spoken languages, and accepted insurance plans. To bolster trust, Microsoft states it has enhanced response quality by prioritizing information from reputable health organizations globally. Answers provided within Copilot Health will include citations and links to sources, along with expert-written summaries from institutions like Harvard Health.
Privacy and security are central to the offering. Conversations within Copilot Health are kept separate from general Copilot chats and are governed by stricter access and safety controls. Microsoft asserts that data from these health chats is not used to train its AI models. Users retain control, with the ability to delete their health information or disconnect linked data sources, such as wearable feeds, at any time.
This move follows similar initiatives from other tech firms. OpenAI launched ChatGPT Health earlier this year, which likewise provides a sandboxed space for medical discussions and excludes chat data from training. A key distinction, however, is current compliance with healthcare regulations: unlike OpenAI's healthcare offering and Amazon's recently expanded Health AI, Microsoft's Copilot Health is not yet a HIPAA-compliant product. Anthropic's Claude for Healthcare, for its part, markets itself as "HIPAA-ready."
When questioned about HIPAA compliance, Microsoft's Dr. Dominic King explained that the law does not apply to a direct-to-consumer service in which individuals use their own data. HIPAA sets strict rules for protecting electronic health information, with significant penalties for violations by covered entities such as hospitals. Since Microsoft is not a covered entity here, it is not subject to those legal repercussions. Dr. King noted that while compliance is not mandatory, meeting high standards is a priority, and the company plans to announce updates on its implementation of voluntary "HIPAA controls."
Microsoft highlights that Copilot Health has received an ISO 42001 certification, an international standard promoting responsible AI use with focuses on transparency and reliability. This certification is also held by other Microsoft Copilot products.
Despite these safeguards and certifications, experts advise caution when sharing sensitive medical data with any AI system. Companies can change their privacy policies without notice, and AI chatbots have a documented history of giving incorrect or potentially harmful medical advice, with particular concerns raised about how they handle mental health topics.
(Source: The Verge)