Optimizing Metadata Structures for Enhanced Search and Retrieval in Digital Libraries
DOI: https://doi.org/10.51983/ijiss-2025.IJISS.15.3.03

Keywords: Digital Libraries, Information Retrieval, Metadata, Semantic Search, Text Lines, Artificial Gorilla Troops Optimizer-driven Attention-Based Recurrent Neural Network (AGT-ARNN)

Abstract
Purpose: This research aims to enhance search and retrieval in digital libraries by optimizing traditional metadata structures using deep learning. It addresses the semantic limitations of conventional metadata in handling unstructured full-text documents, enabling more precise and context-aware search outcomes.

Methodology: A novel Artificial Gorilla Troops Optimizer-driven Attention-based Recurrent Neural Network (AGT-ARNN) model was developed to extract and classify high-quality semantic metadata from digital library documents. A comprehensive full-text dataset with manually annotated metadata was used. Preprocessing involved tokenization, lemmatization, and stemming to clean and standardize the text, followed by Word2Vec embedding to retain contextual semantics. This approach captured deeper syntactic and semantic dependencies, improving the model's understanding of complex document structures. All phases of data processing, model construction, and evaluation were carried out in Python.

Results: The AGT-ARNN model demonstrated superior performance, with an accuracy of 97%, precision of 90%, recall of 93%, and F1-score of 95%, outperforming existing methods. It significantly improved retrieval speed and relevance, producing semantically rich metadata for digital library systems.

Conclusions: The proposed framework effectively transforms digital libraries into intelligent systems by automating the generation of deep, semantically enhanced metadata. These findings support scalable, real-time, user-centered information access in next-generation knowledge environments.
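To make the described pipeline concrete, the sketch below shows one way the preprocessing, Word2Vec embedding, and attention-based recurrent classification stages could be assembled in Python. It is a minimal illustration, not the authors' implementation: the library choices (NLTK, gensim, PyTorch), the AttentionRNN class, the toy corpus, and all hyperparameters are assumptions, and the Artificial Gorilla Troops Optimizer used in the paper is replaced here by a standard Adam optimizer for brevity.

    # Hypothetical sketch of the AGT-ARNN pipeline described in the abstract.
    # Library choices, names, and hyperparameters are assumptions; the Gorilla
    # Troops Optimizer step is replaced by plain Adam as a placeholder.
    import re
    import torch
    import torch.nn as nn
    from gensim.models import Word2Vec
    from nltk.stem import WordNetLemmatizer, PorterStemmer  # needs nltk.download('wordnet')

    lemmatizer = WordNetLemmatizer()
    stemmer = PorterStemmer()

    def preprocess(text: str) -> list[str]:
        """Tokenize, lemmatize, and stem a raw document string."""
        tokens = re.findall(r"[a-z]+", text.lower())   # simple word tokenization
        return [stemmer.stem(lemmatizer.lemmatize(t)) for t in tokens]

    # Train Word2Vec on the preprocessed corpus to retain contextual semantics.
    corpus = [preprocess(doc) for doc in ["sample digital library record one",
                                          "another full text document two"]]
    w2v = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1)

    class AttentionRNN(nn.Module):
        """Minimal attention-based recurrent classifier over embedded sequences."""
        def __init__(self, embed_dim: int = 100, hidden: int = 64, n_classes: int = 5):
            super().__init__()
            self.rnn = nn.GRU(embed_dim, hidden, batch_first=True, bidirectional=True)
            self.attn = nn.Linear(2 * hidden, 1)   # scores each time step
            self.out = nn.Linear(2 * hidden, n_classes)

        def forward(self, x):                      # x: (batch, seq_len, embed_dim)
            h, _ = self.rnn(x)                     # (batch, seq_len, 2*hidden)
            weights = torch.softmax(self.attn(h), dim=1)
            context = (weights * h).sum(dim=1)     # attention-weighted summary
            return self.out(context)

    # In the paper the network is tuned with the Artificial Gorilla Troops
    # Optimizer; a standard Adam loop stands in here.
    model = AttentionRNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    dummy = torch.randn(2, 30, 100)                # 2 documents, 30 tokens each
    logits = model(dummy)                          # (2, n_classes) metadata class scores

The attention layer in this sketch simply weights the recurrent hidden states before classification; the actual architecture, class inventory, and optimizer configuration reported in the paper may differ.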
License
Copyright (c) 2025 The Research Publication

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.







