AI Image & Video Generation

Summary

This statement documents the institutional decision to disable AI image and video generation capabilities within university AI tools (e.g., Microsoft Copilot) and provides the compliance rationale under applicable privacy and data‑protection regulations.

Body

Overview

NEOMED IT has disabled AI image and video generation and image transformation features within the approved generative AI platforms used in NEOMED's environment.

This decision is based on a risk‑based compliance assessment aligned with the institution’s obligations under:

  • The Family Educational Rights and Privacy Act (FERPA)
  • The Health Insurance Portability and Accountability Act (HIPAA)
  • Applicable state privacy laws
  • University data governance, records management, and risk management standards

Compliance Rationale

Elevated Risk of Inadvertent Disclosure of Protected Information

AI image and video generation tools may unintentionally create or reconstruct visual or audiovisual content that includes protected or identifying information related to students, patients, research subjects, or other regulated individuals.

For video content, this risk is amplified because generated or transformed outputs may simultaneously contain:

  • Visual identifiers (faces, name badges, documents, screens, charts)
  • Audio content (voices, names, conversations)
  • Contextual inference (locations, classrooms, clinical settings, timelines)

Any of these elements may independently constitute protected information under FERPA or HIPAA.

Limited Classification, Labeling, and Governance Capabilities

University data classification, sensitivity labeling, and data loss prevention (DLP) controls are not yet sufficiently mature to reliably govern AI‑generated image and video content.

Generated image and video files:

  • Often lack enforceable or consistent metadata
  • Are easily downloadable and redistributable
  • May exist outside University records systems and retention controls

This limits NEOMED's ability to:

  • Apply appropriate classification and retention
  • Enforce deletion requirements
  • Demonstrate compliance and audit trails

Ambiguity Between Creation, Simulation, and Disclosure

From a regulatory standpoint, FERPA and HIPAA focus on control of disclosure, not solely on whether data was intentionally supplied.

AI image and video generation introduces ambiguity regarding whether protected information was:

  • Provided directly by a user
  • Inferred, simulated, or reconstructed by the system
  • Represented in a way that could reasonably be interpreted as real

In such cases, NEOMED applies a conservative compliance posture.

Identity, Likeness, and Realism Risks

Video and image content that depicts people—whether real or synthetic—introduces heightened compliance, ethical, and reputational risk.

Generated content may reasonably be interpreted as depicting:

  • Actual students or patients
  • Real NEOMED activities or environments
  • Authentic educational or clinical interactions

This creates potential exposure even in the absence of deliberate data misuse.

Increased Incident Response and Breach Impact

Privacy incidents involving images or video are substantially more complex to investigate and remediate than text‑based incidents due to:

  • Difficulty conclusively identifying all protected elements
  • Challenges containing downstream sharing
  • Inability to guarantee complete deletion

Disabling image and video generation significantly reduces both the likelihood and the impact of such incidents.

Permitted Use of AI Tools

NEOMED continues to permit AI capabilities assessed as presenting lower regulatory and operational risk, including:

  • Text drafting, editing, and summarization
  • Analytical and informational assistance
  • Policy development and documentation support
  • Non‑visual productivity functions

All permitted uses remain subject to existing acceptable use, privacy, security, and data handling requirements.

Governance and Review

This control reflects NEOMED's current risk tolerance and regulatory environment. The decision may be reviewed periodically as:

  • Regulatory guidance evolves
  • Technical governance controls mature
  • Risk mitigation strategies become available

Any future enablement of image/video generation would require formal compliance review and approval by NEOMED governance bodies.

Details

Article ID: 172676
Created
Fri 4/24/26 12:06 PM
Modified
Fri 4/24/26 12:15 PM