Responsible use of generative AI for creating research grant applications

Generative artificial intelligence (AI)-based technologies, such as large language models (e.g., ChatGPT) and text-to-image generators (e.g., DALL-E), are online tools that researchers can use during the research application process. These tools can generate text and graphics that may be hard to distinguish from human-generated ones. The output of generative AI (GenAI) can be biased, invalid, inaccurate or completely wrong (“hallucinated”). It can include the texts or concepts of others without citing them, or cite them incorrectly. Below are fundamental points to take into consideration if you use GenAI in the process of producing a grant application.

Protect information through secure GenAI tools. For cloud-based AI systems, use only tools that do not train on or otherwise allow access to your data. Microsoft’s Copilot Chat is an AI technology that can be used to handle confidential material, such as GDPR-applicable or potentially patentable information. Copilot accessed through DTU is secure when you are logged in with your DTU account, indicated by a green “check shield” at the top of the Copilot window. See AI technology at DTU and Copilot Chat - AI for more information.

It is your responsibility to ensure that all AI-generated content used in your research application:

  • is relevant, valid and correct (e.g., correct references),
  • does not include confidential, sensitive, proprietary or protected information introduced by the AI,
  • does not plagiarize, and respects authorship by citing where appropriate.

The principles for AI-assisted research are the same as the general principles for research integrity. See Research Integrity and AI-assisted research for more information. To assess the similarity of text to already published text, as well as the possibility that text is AI-generated, you can use iThenticate. Read more at iThenticate at DTU.

Be aware that the rules on research misconduct also apply to research applications. You can therefore become liable for scientific misconduct if you or a co-applicant breach standards for scientific conduct. See What is scientific misconduct for more information. Note that an AI tool cannot be credited with authorship of a research application.

Document and disclose appropriately. The principles for good scientific conduct and research integrity require that the use and outputs of AI tools are properly documented and disclosed as appropriate. Read more at Referencing when using generative AI - DTU Inside.

Remember that text and images created with GenAI may be protected by copyright, so all the standard copyright rules apply. If you use a tool other than the DTU Copilot, check its Terms and Conditions to make sure that your rights are not transferred. See Copyright and generative AI - DTU Inside for more information.

Make sure that you are aware of and comply with the funder's requirements:

  • Are there any parts of the application for which GenAI may not be used?
  • Must you report the use of AI in the scientific writing process?

As an example, the European Commission has added the following guidance on the use of AI for preparing proposals to the Standard Application Form for RIA and IA proposals:

‘When considering the use of generative artificial intelligence (AI) tools for the preparation of the proposal, it is imperative to exercise caution and careful consideration. The AI-generated content should be thoroughly reviewed and validated by the applicants to ensure its appropriateness and accuracy, as well as its compliance with intellectual property regulations. Applicants are fully responsible for the content of the proposal (even those parts produced by the AI tool) and must be transparent in disclosing which AI tools were used and how they were utilized. Specifically, applicants are required to:

  • Verify the accuracy, validity, and appropriateness of the content and any citations generated by the AI tool and correct any errors or inconsistencies.
  • Provide a list of sources used to generate content and citations, including those generated by the AI tool. Double-check citations to ensure they are accurate and properly referenced.
  • Be conscious of the potential for plagiarism where the AI tool may have reproduced substantial text from other sources. Check the original sources to be sure you are not plagiarizing someone else’s work.
  • Acknowledge the limitations of the AI tool in the proposal preparation including the potential for bias, errors, and gaps in knowledge.’

Further information on the responsible use of GenAI in research, including recommendations for when and how to communicate its use, can be found in the RECOMMENDATIONS FOR RESEARCHERS section of the European Commission’s Living Guidelines on the responsible use of generative AI in research.

Updated 27 January 2026