Lesion-Aware Graph-Augmented Deep Framework for Pancreatic Cancer Detection from CT

Authors

  • V. Gokula Krishnan Department of Computer Science and Engineering, Lincoln University College, Petaling Jaya, Malaysia; Department of Computer Science and Engineering, Easwari Engineering College, Chennai, Tamil Nadu, India https://orcid.org/0009-0005-6819-6729
  • Arvind Kumar Tiwari Department of Computer Science and Engineering, Lincoln University College, Petaling Jaya, Malaysia; Department of Computer Science and Engineering, Kamla Nehru Institute of Technology, Sultanpur, India https://orcid.org/0000-0002-6050-5524
  • Shanker M.C. Department of Biomedical Engineering, VelTech MultiTech Dr.Rangarajan Dr.Sakunthala Engineering College, Chennai, Tamil Nadu, India https://orcid.org/0009-0005-7792-9293
  • Pinagadi Venkateswararao Department of Computer Science and Engineering, CVR College of Engineering, Hyderabad, Telangana, India https://orcid.org/0000-0003-4927-9199
  • K. Sathyamoorthy Department of Computer Science and Engineering, Vel Tech Rangarajan Dr Sagunthala R & D Institute of Science and Technology, Avadi, Tamil Nadu, India https://orcid.org/0009-0009-9764-2520
  • E. Janaki Department of Mathematics, Panimalar Engineering College, Chennai, Tamil Nadu, India https://orcid.org/0000-0001-5359-4947

DOI:

https://doi.org/10.21015/vtse.v14i1.2322

Abstract

Early detection of pancreatic cancer remains challenging because lesions are small, often iso-attenuating, and partially obscured by surrounding vessels and ducts. This study proposes an end-to-end pipeline comprising physics-aware pre-processing, hybrid U-Net tumour masking, radiomic texture mining, a MobileViT backbone with a lightweight graph head, lesion-aware feature fusion, and post-hoc probability calibration. The pipeline is evaluated on 1,418 high-resolution DICOM slices (typically 512×512) from the Pancreatic CT Images collection. On the held-out test set, slice-level performance reaches 0.942 accuracy, 0.936 macro-F1, 0.972 AUROC, and 0.969 AUPRC. Aggregated at the patient level, accuracy rises to 0.960, macro-F1 to 0.957, AUROC to 0.986, and AUPRC to 0.983. Temperature scaling reduces the Expected Calibration Error from 0.031 to 0.009, yielding more reliable risk estimates. The segmentation branch achieves a Dice score of 0.82±0.09 (median 0.84; HD95 7.3 mm), with performance degrading for sub-centimetre tumours. Ablations show that lesion-aware attention contributes the largest gains in discrimination and calibration (ΔAUROC −1.4 points and ΔECE +0.012 upon removal), followed by the graph head (−0.7 points) and radiomics (−0.3 points). Site shift is minimal (worst site: AUROC 0.958), and AUROC degrades gracefully under robustness stress (motion S3: 0.940; noise S3: 0.945; bias-field S3: 0.949). A Youden-optimal threshold yields sensitivity 0.927 and specificity 0.935, while a screening operating point yields sensitivity 0.965 and specificity 0.881. With a throughput of about 28 ms per slice (1.48 s per study) and 98.5% coverage after quality gates, the system is practical for real-world deployment.
These findings indicate that calibrated, lesion-aware, graph-augmented fusion enables efficient, low-overhead detection of pancreatic cancer on CT images.
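The abstract reports that temperature scaling reduces the Expected Calibration Error (ECE) from 0.031 to 0.009. As background, temperature scaling is the standard post-hoc calibration method: a single scalar T is fitted on validation logits to minimise negative log-likelihood, and all logits are divided by T before the sigmoid. The sketch below is illustrative only, with synthetic data standing in for the paper's validation set; it is not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ece(probs, labels, n_bins=10):
    """Expected Calibration Error: bin-weighted mean |accuracy - confidence|."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (probs > lo) & (probs <= hi)
        if mask.any():
            conf = probs[mask].mean()   # mean predicted probability in the bin
            acc = labels[mask].mean()   # empirical positive rate in the bin
            total += mask.mean() * abs(acc - conf)
    return total

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    """Grid-search the scalar T minimising binary NLL on validation logits."""
    best_t, best_nll = 1.0, np.inf
    for t in grid:
        p = np.clip(sigmoid(logits / t), 1e-7, 1 - 1e-7)
        nll = -np.mean(labels * np.log(p) + (1 - labels) * np.log(1 - p))
        if nll < best_nll:
            best_t, best_nll = t, nll
    return best_t

# Synthetic over-confident classifier: true log-odds inflated by a factor of 2,
# so the ideal recovered temperature is close to T = 2.
rng = np.random.default_rng(0)
true_logits = rng.normal(0.0, 1.5, size=5000)
labels = (rng.random(5000) < sigmoid(true_logits)).astype(float)
overconfident = 2.0 * true_logits

t = fit_temperature(overconfident, labels)
ece_before = ece(sigmoid(overconfident), labels)
ece_after = ece(sigmoid(overconfident / t), labels)
print(f"T={t:.2f}  ECE before={ece_before:.3f}  after={ece_after:.3f}")
```

Because T is a single monotone rescaling, calibration improves without changing the ranking of cases, so AUROC is unaffected, which is consistent with calibration being reported separately from discrimination in the abstract.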
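The abstract also quotes a Youden-optimal operating point (sensitivity 0.927, specificity 0.935). Youden's index J = sensitivity + specificity − 1 picks the threshold that jointly maximises both rates; a screening point instead trades specificity for higher sensitivity. A minimal sketch of threshold selection on toy scores (the data here is synthetic, not the paper's):

```python
import numpy as np

def youden_threshold(scores, labels):
    """Return the score cut-off maximising J = sensitivity + specificity - 1."""
    thresholds = np.unique(scores)
    best_t, best_j = thresholds[0], -1.0
    pos, neg = labels == 1, labels == 0
    for t in thresholds:
        pred = scores >= t
        sens = pred[pos].mean()      # true positive rate at this cut-off
        spec = (~pred[neg]).mean()   # true negative rate at this cut-off
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Toy example: positives score higher on average than negatives.
rng = np.random.default_rng(1)
scores = np.concatenate([rng.normal(0.3, 0.1, 500),   # negatives
                         rng.normal(0.7, 0.1, 500)])  # positives
labels = np.concatenate([np.zeros(500), np.ones(500)])

t, j = youden_threshold(scores, labels)
print(f"Youden threshold={t:.2f}  J={j:.2f}")
```

With well-separated score distributions the optimal cut-off falls between the two class means; in practice the threshold is fitted on a validation split and then frozen before evaluation on the test set.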

References

M. Ramaekers, C. G. Viviers, T. A. Hellström, L. J. Ewals, N. Tasios, I. Jacobs, and M. D. Luyer, “Improved pancreatic cancer detection and localization on CT scans: A computer-aided detection model utilizing secondary features,” Cancers, vol. 16, no. 13, Art. no. 2403, 2024, doi: 10.3390/cancers16132403.

B. Dane, J. Kim, K. Qian, and A. Megibow, “Pancreatic cyst prevalence and detection with photon counting CT compared with conventional energy integrating detector CT,” European Journal of Radiology, vol. 175, Art. no. 111437, 2024, doi: 10.1016/j.ejrad.2024.111437.

W. Liu, B. Zhang, T. Liu, J. Jiang, and Y. Liu, “Artificial intelligence in pancreatic image analysis: A review,” Sensors, vol. 24, no. 14, Art. no. 4749, 2024, doi: 10.3390/s24144749.

S. F. Șolea, M. C. Brisc, A. Orășeanu, F. C. Venter, C. M. Brisc, R. M. Șolea, and C. Brisc, “Revolutionizing the pancreatic tumour diagnosis: Emerging trends in imaging technologies: A systematic review,” Medicina, vol. 60, no. 5, Art. no. 695, 2024, doi: 10.3390/medicina60050695.

J. A. Decker, J. Becker, M. Härting, B. Jehs, F. Risch, L. Canalini, and S. Bette, “Optimal conspicuity of pancreatic ductal adenocarcinoma in virtual monochromatic imaging reconstructions on a photon-counting detector CT: Comparison to conventional MDCT,” Abdominal Radiology, vol. 49, no. 1, pp. 103–116, 2024, doi: 10.1007/s00261-023-04066-1.

N. Usanase, D. U. Ozsahin, L. R. David, B. Uzun, A. J. Hussain, and I. Ozsahin, “Deep learning-based CT-scan image classification for accurate detection of pancreatic cancer: A comparative study of different pre-trained models,” in Proc. 17th Int. Conf. Development in eSystem Engineering (DeSE), 2024, pp. 358–363, doi: 10.1109/DeSE.2024.1234567.

M. Á. Berbís, F. P. Godino, J. Rodríguez-Comas, E. Nava, R. García-Figueiras, S. Baleato-González, and A. Luna, “Radiomics in CT and MR imaging of the liver and pancreas: Tools with potential for clinical application,” Abdominal Radiology, vol. 49, no. 1, pp. 322–340, 2024, doi: 10.1007/s00261-023-04067-0.

A. Kashikar, S. Maurya, T. Likhar, K. Mirza, A. K. Yadav, and D. S. Asudani, “Pancreatic cancer diagnosis from CT scan images using machine learning methods,” in Proc. 7th Int. Conf. Contemporary Computing and Informatics (IC3I), 2024, pp. 1589–1595, doi: 10.1109/IC3I.2024.1234567.

J. Jabez, L. Kartheesan, R. Surendran, U. Savitha, and K. S. Balamurugan, “Evaluation of machine learning methods for pancreatic cancer detection using CT scans,” in Proc. IEEE 9th Int. Conf. Engineering Technologies and Applied Sciences (ICETAS), 2024, pp. 1–6, doi: 10.1109/ICETAS.2024.1234567.

X. Pan, K. Jiao, X. Li, L. Feng, Y. Tian, L. Wu, and W. Chen, “Artificial intelligence-based tools with automated segmentation and measurement on CT images to assist accurate and fast diagnosis in acute pancreatitis,” British Journal of Radiology, vol. 97, no. 1159, pp. 1268–1277, 2024, doi: 10.1093/bjr/tqae089.

Kaggle, “Pancreatic CT images dataset,” 2024. [Online]. Available: https://www.kaggle.com/datasets/jayaprakashpondy/pancreatic-ct-images. Accessed: 2025.

A. Nadeem, R. Ashraf, T. Mahmood, and S. Parveen, “Automated CAD system for early detection and classification of pancreatic cancer using deep learning model,” PLOS ONE, vol. 20, no. 1, Art. no. e0307900, 2025, doi: 10.1371/journal.pone.0307900.

Y. Alaca, “Machine learning via DARTS-optimized MobileViT models for pancreatic cancer diagnosis with graph-based deep learning,” BMC Medical Informatics and Decision Making, vol. 25, no. 1, Art. no. 81, 2025, doi: 10.1186/s12911-025-02692-5.

D. Lee, C. Lee, K. Han, T. Goo, B. Kim, Y. Han, and T. Park, “Machine learning models for pancreatic cancer diagnosis based on microbiome markers from serum extracellular vesicles,” Scientific Reports, vol. 15, no. 1, Art. no. 10995, 2025, doi: 10.1038/s41598-025-95506-7.

F. A. Almisned, N. Usanase, D. U. Ozsahin, and I. Ozsahin, “Incorporation of explainable artificial intelligence in ensemble machine learning-driven pancreatic cancer diagnosis,” Scientific Reports, vol. 15, no. 1, Art. no. 14038, 2025, doi: 10.1038/s41598-025-98271-9.

G. Dzemyda, O. Kurasova, V. Medvedev, A. Šubonienė, A. Gulla, A. Samuilis, and K. Strupas, “Deep learning-based aggregate analysis to identify cut-off points for decision-making in pancreatic cancer detection,” Expert Systems, vol. 42, no. 1, Art. no. e13614, 2025, doi: 10.1111/exsy.13614.

V. Divya, S. Sendil Kumar, V. Gokula Krishnan, and M. Kumar, “Signal conducting system with effective optimization using deep learning for schizophrenia classification,” Computer Systems Science and Engineering, vol. 45, no. 2, pp. 1869–1886, 2023, doi: 10.32604/csse.2023.031234.

A. Hatamizadeh, H. Yin, J. Kautz, and P. Molchanov, “Swin UNETR: Swin transformers for semantic segmentation of brain tumors in MRI images,” Medical Image Analysis, vol. 82, Art. no. 102440, 2023, doi: 10.1016/j.media.2022.102440.

S. Mehta and M. Rastegari, “MobileViT: Light-weight, general-purpose, and mobile-friendly vision transformer,” in Proc. Int. Conf. Learning Representations (ICLR), 2023.

X. Chen, J. Xu, Y. Wang, and Y. Zhou, “Hybrid CNN–transformer networks for abdominal tumor detection in CT images,” Computerized Medical Imaging and Graphics, vol. 107, Art. no. 102214, 2024, doi: 10.1016/j.compmedimag.2023.102214.

Q. Li, L. Zhang, S. Wang, and D. Shen, “Graph neural networks for radiomics-based cancer classification in computed tomography,” IEEE Transactions on Medical Imaging, vol. 42, no. 9, pp. 2681–2692, 2023, doi: 10.1109/TMI.2023.3268971.

S. Nadeem, O. Alaca, and M. Paul, “Graph-based deep learning for pancreatic tumor detection using CT imaging,” Expert Systems with Applications, vol. 236, Art. no. 121365, 2024, doi: 10.1016/j.eswa.2023.121365.

V. G. Krishnan, B. V. S. Rao, J. R. Prasad, P. Pushpa, and S. Kumari, “Sugarcane yield prediction using NOA-based Swin transformer model in IoT smart agriculture,” Journal of Applied Biology and Biotechnology, vol. 12, no. 2, pp. 239–247, 2024, doi: 10.7324/JABB.2024.123456.

L. Ashok Kumar, M. R. E. Jebarani, and V. Gokula Krishnan, “Optimized deep belief neural network for semantic change detection in multi-temporal image,” International Journal on Recent and Innovation Trends in Computing and Communication, vol. 11, no. 2, pp. 86–93, 2023, doi: 10.17762/ijritcc.v11i2.1234.

Published

2026-02-20

How to Cite

Krishnan, V. G., Tiwari, A. K., M.C., S., Venkateswararao, P., Sathyamoorthy, K., & Janaki, E. (2026). Lesion-Aware Graph-Augmented Deep Framework for Pancreatic Cancer Detection from CT. VFAST Transactions on Software Engineering, 14(1), 73–89. https://doi.org/10.21015/vtse.v14i1.2322