Generative AI Usage Guidelines

Overview

Generative AI tools (such as ChatGPT, Microsoft Copilot, and Google Gemini) are increasingly used for productivity, research, and education. This article explains:

  • What AI tools you may use
  • What data you may or may not enter into those tools
  • Which tools are approved for University data
  • How to request approval for new AI products

What Is Generative AI?

Generative AI refers to tools that create new content (text, images, summaries, code, audio, etc.) based on patterns learned from their training data.

Examples include:

  • ChatGPT
  • Microsoft Copilot
  • Google Gemini
  • Claude
  • MidJourney
  • AI features embedded in other products (e.g., OtterAI, Zoom AI, Fireflies)

Important Privacy Reminder

Anything you enter into a public AI tool may be stored, reused for training, or made accessible to others.

Because of this:

  • Do not enter University Data into AI tools unless the tool is approved and the data type is allowed.
  • Only use generative AI tools that have been reviewed and approved by NEOMED IT when working with non‑public data.

If you are unsure whether a tool or data type is allowed, do not proceed; contact Information Security at itsecurity@neomed.edu.

What Data Can I Use with AI?

To determine whether your data requires special handling, consult the Data Classification Guidelines and the University Data Policy. If your data is L1 (public), you may upload it to generative AI tools, including public ones. Data above L1 may only be processed in generative AI tools that have been approved through NEOMED IT's procurement and security review processes.

  • Public Data (L1): Yes, including public AI tools
  • Internal (L2), Restricted (L3) or Highly Restricted Data (L4): Only in approved tools

Approved AI Tools by Data Type

Public Data (L1)

You may use publicly available AI tools, including:

  • ChatGPT
  • Microsoft Copilot
  • Google Gemini
  • MidJourney
  • Similar public AI services

Internal (L2), Restricted (L3) or Highly Restricted Data (L4)

Only the following tool is currently approved:

  • Microsoft Copilot
    • Must be signed in with your NEOMED Microsoft account
    • Approved for L2 and most L3 data

Important Notes for L3 Data

  • Some L3 data types require additional contractual, legal, or policy controls.
  • Not all L3 data is appropriate for Copilot.
  • You are responsible for understanding your data’s obligations.

Questions? Contact itsecurity@neomed.edu

Requesting Approval for New AI Tools

All AI tools are treated like any other technology purchase. If you want to use a new AI product with University data:

  1. Submit a request through the Technology Procurement Process.
  2. The tool will undergo security, privacy, and compliance review.
  3. Approval is required before using the tool with non‑public data (L2, L3, L4).

Responsible Use Guidelines

When using generative AI, keep the following in mind:

Always Validate Output

  • AI tools can produce inaccurate or fabricated information (“hallucinations”)
  • Do not rely on AI output for decisions requiring accuracy without verification
  • Periodically re‑validate outputs, especially in operational or academic use

Respect Intellectual Property

  • Do not input copyrighted material, unpublished research, student work, or other protected IP without authorization
  • Ensure you have permission to use any original works entered into AI tools

Transparency and Informed Use

  • Users should know when they are interacting with AI
  • Use of AI should support informed decision‑making, not replace judgment

Need Help?

If you have questions about:

  • Whether a specific tool is approved
  • Whether your data type can be used with AI
  • Submitting a procurement or security review

Contact Information Security: itsecurity@neomed.edu
