Fine-tuning generative pre-trained transformers for clinical dialogue summarization
dc.contributor.author | Ronan, Isabel | en |
dc.contributor.author | Tabirca, Sabin | en |
dc.contributor.funder | Science Foundation Ireland | en |
dc.date.accessioned | 2025-02-19T16:19:54Z | |
dc.date.available | 2025-02-19T16:19:54Z | |
dc.date.issued | 2025-01-17 | en |
dc.description.abstract | Automated clinical dialogue summarization can make health professionals' workflows more efficient. With the advent of large language models, machine learning can provide accurate and efficient summarization tools. Generative Pre-Trained Transformers (GPT) have shown considerable promise in this area. While larger GPT models, such as GPT-4, have been used, these models pose their own problems in terms of precision and expense. Fine-tuning smaller models can yield more accurate results at lower computational cost. In this paper, we fine-tune a GPT-3.5 model to summarize clinical dialogue, training with both default and manually selected hyperparameters for comparison. We also compare our default model to past work using ROUGE-1, ROUGE-2, ROUGE-L, and BERTScore, and find that our model outperforms GPT-4 across all measures. As our fine-tuning process is based on the smaller GPT-3.5 model, we show that fine-tuning leads to more accurate and less expensive results. Informal human review also indicates that the generated notes are of acceptable quality. | en |
dc.description.status | Peer reviewed | en |
dc.description.version | Accepted Version | en |
dc.format.mimetype | application/pdf | en |
dc.identifier.citation | Ronan, I. and Tabirca, S. (2025) 'Fine-tuning generative pre-trained transformers for clinical dialogue summarization', 2024 International Conference on Frontiers of Information Technology (FIT), Islamabad, Pakistan, 9-10 December 2024, pp. 1-6. https://doi.org/10.1109/FIT63703.2024.10838420 | en |
dc.identifier.doi | https://doi.org/10.1109/FIT63703.2024.10838420 | en |
dc.identifier.eissn | 2473-7569 | en |
dc.identifier.endpage | 6 | en |
dc.identifier.isbn | 979-8-3315-1050-3 | en |
dc.identifier.isbn | 979-8-3315-1051-0 | en |
dc.identifier.issn | 2334-3141 | en |
dc.identifier.startpage | 1 | en |
dc.identifier.uri | https://hdl.handle.net/10468/17083 | |
dc.language.iso | en | en |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en |
dc.relation.ispartof | 2024 International Conference on Frontiers of Information Technology (FIT), Islamabad, Pakistan, 9-10 December 2024 | en |
dc.relation.project | info:eu-repo/grantAgreement/SFI/SFI Centres for Research Training Programme::Data and ICT Skills for the Future/18/CRT/6222/IE/SFI Centre for Research Training in Advanced Networks for Sustainable Societies/ | en |
dc.rights | © 2025, IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en |
dc.subject | Machine translation | en |
dc.subject | Transformers | en |
dc.subject | Data augmentation | en |
dc.subject | Synthetic data generation | en |
dc.subject | Parameter tuning | en |
dc.title | Fine-tuning generative pre-trained transformers for clinical dialogue summarization | en |
dc.type | Conference item | en |
dc.type | proceedings-article | en |
Files
Original bundle
- Name: fineTuningGenerativePreTrainedTransformersForClinicalDialogueSummarization.pdf
- Size: 530.06 KB
- Format: Adobe Portable Document Format
- Description: Accepted Version
License bundle
- Name: license.txt
- Size: 2.71 KB
- Format: Item-specific license agreed upon to submission