How and why will your New Testaments be changing in the computer era?

(COMMENTARY)

THE QUESTION:

How and why will a new technique for computer analysis of ancient texts affect the New Testaments you’ll be reading?

OSTLING’S ANSWER:

A revolution now under way will gradually change every future English translation of the New Testament you’ll be reading. Translations are based upon some 5,800 hand-written manuscripts of the New Testament in Greek that survived from ancient times, whether fragments or complete books. Scholars analyze their numerous variations to get as close as possible to the original 1st Century wordings, a specialty known as “textual criticism.”

Books by Bart Ehrman at the University of North Carolina tell how such differences turned him from conservative to skeptic regarding Christians’ scriptural tradition. Yet other experts see the opposite: this unusually large textual trove enhances the New Testament’s credibility and authority, though perplexities persist.

Two years ago, a good friend with a science Ph.D. who closely follows biblical scholarship alerted me to the significance of the “Coherence-Based Genealogical Method” (CBGM). What a mouthful. I managed only a shaky grasp of CBGM and hesitated to write a Memo explaining it.

But I now take up the topic, prodded by an overview talk by Peter Gurry, a young Cambridge University Ph.D. who teaches at Phoenix Seminary; the video is posted here (start at 33 minutes): www.facebook.com/phxsem/videos/10155643293078282. I won’t attempt a full description, but you can learn details in Gurry’s article for the Journal of the Evangelical Theological Society at www.etsjets.org/files/JETS-PDFs/59/59-4/JETS_59-4_675-89_Gurry.pdf, or in his co-authored 2017 book “A New Approach to Textual Criticism.” (Gurry’s doctoral dissertation on CBGM is available in book form but pricey and prolix.)

If it’s any encouragement, Gurry confesses he himself needed a year to comprehend CBGM, which he says “is not widely known or understood, even among New Testament scholars.” Despite confusion and controversy, he considers CBGM “a very good thing” that “boosts our confidence” in the New Testament heritage.

Scholars’ minds cannot possibly cope with the vast trove of New Testament textual data that computers can now process. CBGM is essentially a set of theories and tools for computer analysis of relationships among the texts, tracking their history by sorting them into lines of family “connectivity” and working back toward the original wordings.
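To give a feel for the kind of computation involved, here is a minimal, hypothetical sketch in Python of one building block CBGM relies on, often called “pre-genealogical coherence”: the percentage of agreement between two manuscripts across the variation points where both survive. The witness names and readings below are invented for illustration; this is not the Münster institute’s actual software.

```python
# A minimal, hypothetical sketch (not the Münster institute's software).
# One building block of CBGM is "pre-genealogical coherence": the percentage
# of agreement between two manuscripts across the variation points where
# both are readable. Witness names and readings here are invented.

witnesses = {
    "W1": {"Mark 1:1": "with 'Son of God'", "Mark 1:41": "compassionate"},
    "W2": {"Mark 1:1": "with 'Son of God'", "Mark 1:41": "angry"},
    "W3": {"Mark 1:1": "omits 'Son of God'", "Mark 1:41": "angry"},
}

def agreement(a, b):
    """Percentage of shared variation points at which two witnesses agree."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    agreeing = sum(1 for unit in shared if a[unit] == b[unit])
    return 100.0 * agreeing / len(shared)

# Rank the other witnesses by how closely they agree with W1 -- the kind of
# "connectivity" table the method builds before judging genealogical direction.
others = [name for name in witnesses if name != "W1"]
for name in sorted(others, key=lambda n: agreement(witnesses["W1"], witnesses[n]), reverse=True):
    print(f"W1 vs {name}: {agreement(witnesses['W1'], witnesses[name]):.0f}% agreement")
```

In actual CBGM work such agreement figures are only a first step; scholars then weigh, variant by variant, which reading most plausibly gave rise to the others.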

It seems inevitable that CBGM will overturn numerous conclusions from previous textual criticism. In almost all cases, future revisions will be minor and won’t alter basic beliefs, though with Holy Writ every word is important. (Older English renditions such as the King James Version will remain unaffected.)

The new technique was first announced in 1982 by Gerd Mink of the all-important Institute for New Testament Textual Research at Germany’s University of Münster. The institute evaluates and blends varied manuscripts to produce successive editions of the New Testament in Greek used for modern translations.

Already, the institute’s latest Greek edition (number 28) employed CBGM to examine 3,046 variations in 123 key manuscripts of the “Catholic epistles” (James, 1 and 2 Peter, 1, 2, and 3 John, and Jude). The result was 35 changes that are beginning to filter into updated English Bibles now on sale. Work on Mark, John, Acts, and Revelation is in progress, and the institute plans a full CBGM New Testament by 2030.

Tommy Wasserman, a textual critic in Norway, applied CBGM-like theories to a famous problem verse in an article for TC: A Journal of Biblical Textual Criticism. The first verse of Mark’s Gospel in the Revised Standard Version is typical: “The beginning of the gospel of Jesus Christ, the Son of God.” But then a footnote says “other ancient authorities omit ‘the Son of God.’ ” Ehrman cites this as a major example of “corruption” by orthodox churchmen who deliberately rewrote the original version, in this case by adding “Son of God.” CBGM will be central in settling such disputes.

Belief in Jesus as the “Son of God” isn’t at issue here, because the rest of this Gospel teaches that. The question is whether the original version of Mark proclaimed this concept at the very start or let it develop throughout the book. Although the phrase is omitted in two key early manuscripts, Wasserman concludes that the longer wording is most likely the original and some copyist’s mistaken omission was picked up in other texts.

Some other standard puzzles CBGM might settle: In Mark 1:41, was Jesus “angry” or “compassionate”? In Acts 20:28, did God or Jesus “purchase” the church? In Jude 5, did God or Jesus rescue his people from Egypt?  In 2 Peter 3:10, will the Earth and its works be “exposed” or “burned up”?

Greg Lanier of Reformed Theological Seminary, Gurry’s fellow Cambridge Ph.D., explains that textual criticism’s goal, now strengthened by CBGM, is not “getting back to what Mark wrote” but “getting back as early as possible” to “what the early church received as” the originally written text. The German institute says CBGM produces “the hypothetical reconstruction” of the wordings as they existed at “the beginning of a textual tradition.”

This article was published on Get Religion.