AI and Generative Technology Use at CUIMC
As we embrace innovative technologies to support the Columbia University Irving Medical Center’s (CUIMC) mission, it is crucial to address the responsible use of Artificial Intelligence (AI) and generative technology tools at CUIMC to ensure compliance with healthcare regulatory standards and to safeguard sensitive information. This document outlines CUIMC’s supplemental IT guidance on responsible and appropriate use of AI tools, including generative AI tools, while adhering to Columbia’s Generative AI Policy. Please review the Columbia University Data Classification Policy which defines the types of data referenced throughout this document.
Below are the guidelines outlining privacy and security principles and the prohibited and judicious uses of AI:
- Guidelines for the Use of AI in Clinical Care Delivery
- Guidelines for the Use of AI in Education and Training with Sensitive Data
- Guidelines for the Use of AI in Research with Sensitive Data (including RHI)
Approved Tools and Access Details
Columbia University provides access to HIPAA-compliant versions of OpenAI’s ChatGPT and Microsoft Copilot*, enabling our workforce to leverage these AI tools responsibly and compliantly. Workforce members must access CUIMC-approved enterprise versions of AI tools using their CUIMC-issued accounts. This authentication ensures a secure and compliant environment.
To help users confirm they are using the approved software versions, visual indicators are displayed within the interfaces of CUIMC-sanctioned AI tools. Users should always verify these indicators before entering any CUIMC-related data. If these elements are missing, you may not be using the approved instance. In such cases, immediately stop and report the issue to the CUIMC IT Security Office.
- Columbia ChatGPT Education: Displays the Columbia University logo at the top right of the interface.
- Available via the CUIT website https://www.cuit.columbia.edu/content/chatgpt-education
- Microsoft Copilot*: Included with CUIMC’s Microsoft license, with built-in contractual data protections. When used correctly via your CUIMC-issued O365 account (UNI@cumc.columbia.edu), all prompt data remains within the CUIMC tenant and is not shared externally or used for training.
- Access via https://copilot.cloud.microsoft/
- Displays a shield icon (which may appear as a green outline with a checkmark, a grey outline on a white background, or a white outline on a dark background, depending on the device) in the chat interface, indicating that Enterprise Data Protection applies to the session. Additionally, confirm that your CUIMC email address is shown, indicating you are logged into the CUIMC account.
- The use of third-party plugins and externally accessible prompts is currently restricted on CUIMC’s Microsoft Copilot.
Non-HIPAA Compliant Tools Approved for Use With Non-Sensitive Data:
- Columbia CU-GPT: A streamlined chat interface developed in-house for various language tasks. This offering is currently in its pilot phase.
- Available via https://www.cuit.columbia.edu/content/cu-gpt
- Unlike ChatGPT Education, CU-GPT is not approved for PHI use. CUIMC users can request access for processing non-sensitive data in accordance with institutional guidelines.
Summary of Approved AI Tools
| Name | Access Link | Approved Use |
|---|---|---|
| Columbia ChatGPT Education | https://www.cuit.columbia.edu/content/chatgpt-education | Approved for use with Sensitive Data (PHI) |
| Microsoft Copilot* | https://copilot.cloud.microsoft/ | Approved for use with Sensitive Data (PHI) |
| Columbia CU-GPT | https://www.cuit.columbia.edu/content/cu-gpt | Approved for use with non-sensitive data |
Data Classification and AI Use
| Data Type | Definition | AI Use Guidance |
|---|---|---|
| Sensitive Data | Information requiring protection due to privacy, regulatory, or security concerns (PHI, RHI, PII). | Permitted only on ChatGPT Education and the approved Microsoft Copilot* platforms. Research protocol use requires IRB and TRAC/ACORD approval. |
| Confidential and Internal Data | Contractually protected or proprietary data such as internal policies, drafts, student records, financial data, or unpublished reports. | Permitted on approved ChatGPT platforms (Columbia ChatGPT Education and CU-GPT) and Microsoft Copilot*. |
| Non-Public Data | General CUIMC-related data classified as sensitive, confidential, or internal. | Not allowed on unapproved AI tools (e.g., free, personal, or commercial ChatGPT, Gemini, Claude, etc.). |
Note: CU-GPT is not approved for PHI use.
Additional Guidance and Resources
We encourage our staff to explore the CU learning content and general AI learning material available through Columbia’s LinkedIn Learning portal.
All use of AI, Large Language Model (LLM), Natural Language Processing (NLP), or Machine Learning (ML) systems at the Medical Center must comply with HIPAA and other relevant healthcare and IT regulations to uphold the highest standards of patient privacy and data security. For locally installed AI models (e.g., LLM, NLP, or ML systems), a formal IT Risk Assessment review is required before deployment to evaluate potential security, privacy, and compliance risks.
CUIMC-IT may monitor AI software usage to ensure compliance with these guidelines. When generating AI content for audiences beyond your immediate use, verify all AI-generated information against authoritative sources and report its use in approved research activities. This practice ensures information integrity and helps avoid potential AI-related negative outcomes such as the dissemination of inaccuracies or biases.
Should you have any questions regarding AI Tools, become aware of any data exposures or misuse of sensitive information, or to report issues verifying approved tool instances, contact the CUIMC IT Security Office.
For reviews of AI technology use cases at the Medical Center outside approved scenarios, please submit a request through the IDEAS technology project proposal request form.
*Microsoft Copilot is an overarching brand that covers several separately licensed Microsoft AI products. As of May 2025, CUIMC-approved, HIPAA-compliant products include Copilot Chat (in the Edge browser), Copilot Pro (in Teams), M365 Copilot, a.k.a. Copilot Premium (integrated into the M365 “Office” suite), and Copilot in Microsoft Fabric. At this time, any other Copilot products, such as Copilot Studio and GitHub Copilot in VS Code, are not approved for use with sensitive data.