A Novel Blockchain Proof of Validation Scheme Based on Explainable AI for Healthcare Workload
DOI: https://doi.org/10.21015/vtcs.v13i1.2122

Abstract
The use of blockchain with machine learning to improve data validation in terms of transparency, validity, and immutability is growing rapidly. Many complex applications, such as healthcare and related disease-management processes, increasingly require remote resources to be used in a transparent way. Blockchain provides real-time security validation through proof-of-work validation schemes, and machine learning is widely applied to make decisions about the blockchain's dynamic state and to improve security efficiency. However, combining blockchain technology with machine learning still faces many limitations. To address them, we propose a novel blockchain proof-of-validation scheme based on explainable AI for healthcare applications, which makes the joint blockchain and machine learning decision process explainable. The proposed proof-of-work validation with explainable AI (PoWV-XAI) algorithm controls dynamic delay, energy, cost, and security issues relative to existing blockchain schemes that use machine learning. PoWV-XAI incorporates different metaheuristic schemes and supports explainability of healthcare workload execution across nodes, such as local and server nodes. Simulation results show that, compared with existing blockchain methods, PoWV-XAI is more explainable: all of its decisions, including processing delay, validation, security, energy, and cost, can be explained.
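To make the abstract's idea concrete, the sketch below combines a toy proof-of-work validator with a transparent, rule-based offloading decision that reports per-feature contributions, loosely mirroring the paper's goal of an explainable validation decision over delay, energy, and cost. This is a minimal illustration only: the weights, threshold, difficulty, and all function names are hypothetical assumptions, not the authors' PoWV-XAI algorithm.

```python
# Illustrative sketch: toy proof-of-work validation plus an explainable
# (rule-based) local-vs-server offloading decision for a healthcare workload.
# All weights, thresholds, and names are assumed for illustration.
import hashlib
import json

def proof_of_work(block: dict, difficulty: int = 3) -> int:
    """Find a nonce whose SHA-256 digest has `difficulty` leading zeros."""
    payload = json.dumps(block, sort_keys=True)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{payload}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def explainable_offload(delay_ms: float, energy_j: float, cost: float) -> dict:
    """Score server vs. local execution and expose per-feature contributions,
    so the decision is directly inspectable (a stand-in for an XAI output)."""
    weights = {"delay_ms": 0.5, "energy_j": 0.3, "cost": 0.2}  # assumed weights
    features = {"delay_ms": delay_ms, "energy_j": energy_j, "cost": cost}
    contributions = {k: weights[k] * v for k, v in features.items()}
    score = sum(contributions.values())
    return {
        "decision": "server" if score > 50 else "local",  # assumed threshold
        "score": score,
        "contributions": contributions,  # why the decision was made
    }

# Validate one healthcare-workload block, then decide where to run it.
block = {"patient_id": "P-001", "workload": "ecg-analysis"}
nonce = proof_of_work(block)
report = explainable_offload(delay_ms=120.0, energy_j=30.0, cost=5.0)
```

Because the contributions dictionary is returned alongside the decision, an auditor can see exactly how much each factor (delay, energy, cost) pushed the workload toward the server node, which is the kind of transparency the abstract attributes to PoWV-XAI.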
License
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License (CC BY) that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).
This work is licensed under a Creative Commons Attribution License CC BY