Abstract
This paper explores how transformer-based AI architectures, particularly BERT, can be integrated into Biblical Studies research. With AI tools revolutionizing natural language processing (NLP), this research aims to familiarize biblical scholars with transformer models, emphasizing their potential to facilitate traditional textual-analysis tasks such as authorship attribution, genre classification, and text dating. The paper serves as a primer for non-specialists, outlining the transformer architecture, embeddings, and the self-attention mechanism that enables sophisticated linguistic analysis. It provides a basic mathematical overview of these models and a brief survey of the current state of relevant research in text classification tasks. The paper suggests future directions for incorporating BERT models into Biblical Studies and calls on scholars to engage actively with this technology to avoid obsolescence. It also addresses the challenges of implementing AI in Biblical Studies research, such as data scarcity in ancient texts.
Written: 2024, Published: TBD
Word Count: 11,340
Origin: [In Review] DeGruyter, Open Theology