Well, the candidates have finished the first round of online testing for the senior ordination exams, and on the whole the early reviews have been very positive. That is not to say there were no problems or no room for improvement.
As is widely known, every decision in life comes with trade-offs. When the Presbyteries’ Cooperative Committee on Examinations for Candidates (PCC) first began discussing a move to computer-based testing back in 2007, it decided to build the system on World Wide Web technology because that provided the maximum flexibility in terms of testing locations and the computers that could be used for the exams. Flexibility in those areas, however, comes at the expense of ensuring an absolutely uniform testing experience for all candidates. In reviewing what we heard from the candidates who completed the surveys about the online exam experience (about 3 of every 4 candidates), I will be pointing out where feedback falls on that fault line between “Internet flexibility” and “uniform experience.”
Almost 9 out of 10 (86%) test takers reported they had no problem accessing the online exams. Among those who did have problems, two primary issues were identified: trouble connecting or staying connected to the Internet, and uncertainty among proctors about how to open access to the exams. We will continue to work with proctors to develop guidelines and procedures that are as simple and easy to follow as possible. At the national level, however, there is little we can do about Internet connection issues. Maximizing testing site availability across almost 100 locations ranging from seminaries to presbytery offices to the studies of pastors serving as special proctors for single candidates means we must rely on what those sites tell us about their ability to get and keep test takers connected to the Internet. We do and will continue to stress the importance of this issue with those who offer to provide proctoring services to our candidates.
Nine out of ten (89.8%) candidates also reported they had no problems moving through the exams. The most common difficulty cited by those who did report problems was that it was cumbersome to have to page through all the questions to move between them. We will explore whether additional navigation options might be added, but will need to balance that against the concern that the system not allow candidates to inadvertently skip past required sections.
Fewer than 1 in 20 (4.4%) reported any difficulty figuring out how to use the system, with 8 in 10 (81.3%) saying its “usability” was as good as or better than other websites they use. Two “usability” enhancements, however, were widely requested. One was to add a “save current page” button between the “previous” and “next” buttons so that candidates could save answers themselves as they worked rather than relying only on the automatic background saves. The other was that candidates be able to see what was stored on the servers after their work was submitted. I have already met with our programmers to discuss ways to implement these requests.
A range of issues identified in the surveys may sound quite distinct on the surface but in the end boil down to a common cause. Those concerns involved the availability and functionality of “spell check,” differences in word counts and formatting options between word processors and the exam system, the need for Unicode Greek and Hebrew, and the display size of text and the response boxes. The common cause is that the exams are taken, and will be evaluated by the readers, within web browsers. Because of that, the system must require that all answers comply with web standards for displaying text (hence the more limited formatting options and the need for Unicode), and the available features will be limited by each user’s browser choice and its configuration (“spell check,” word count, display sizes, etc.).
The best we can do at the national level in these areas, then, is to get information about the issues out in advance and to provide a way for test takers to experiment with the testing environment on their own computers well before the testing periods. Because, as so often happens with projects like this, development continued right up to the last hour, we could not get information out far in advance; things simply were not nailed down. The good news is that we now have the practice area established, and it will be available in all testing cycles from the opening of registration on. We will be working on clarifying the information in the handbook, and also on developing what we hope will be a single-page “frequently asked questions” (FAQ) sheet that can serve as a quick reference and a pointer to fuller discussion of issues in the handbook.
Attention now shifts to taking what we have learned and heard from the test takers to improve the system, both for the readers who will evaluate the exams in a few weeks and for those who will be taking online exams in the future. We are always glad to receive comments and suggestions for how we can find that elusive balance point between “maximizing flexibility” and ensuring a “consistent and quality user experience.”