Disadvantages of open-book testing

The same articles pointed to six disadvantages. (Admittedly, all except one of the articles were favourably disposed towards open-book testing.) Open-book testing is not Edutopia. What are the disadvantages?

  1. They are harder to set. Formulating higher-order thinking questions requires more time, effort, and skill, whether the questions are open-ended or closed. If the examiner does not take the time, or lacks the skill, to ask questions that call for application and critical thinking, the open-book exam will be too easy (Teodorczuk, Frazer, and Rogers 2018).
  2. They are harder to grade. Factual exams are quick and easy to grade, whether they use multiple-choice or essay-type questions; for essay questions, the assessor simply awards marks for points made. Because open-book papers call for the application of concepts, they are more challenging to grade (Teodorczuk, Frazer, and Rogers 2018).
  3. They are harder to write. Because the questions call for deeper thinking, students take longer to answer them, and so can answer fewer questions in an open-book paper than in a closed-book one (Teodorczuk, Frazer, and Rogers 2018).
  4. They are open to unauthorised collaboration. In the case of exams written online without supervision, students can confer or collaborate during the exam (Teodorczuk, Frazer, and Rogers 2018). A simple way to reduce this risk is to draw the questions from a question bank, so that each student gets a different mix of randomly selected questions.
  5. They lull students into poor preparation. Some students breathe a sigh of relief when they hear that an exam is open-book. Assuming it will be easy, they may not prepare as diligently as they would for a closed-book exam (Senkova et al. 2018). There are two simple antidotes. Firstly, teach students how to study for an open-book exam, including giving them access to trial papers; if the trial papers are sufficiently demanding, students will be forewarned and forearmed. Secondly, ensure that the open-book exams themselves are demanding. Students will soon learn that they need to prepare diligently.
  6. They undermine desirable difficulty on factual tasks. The principle of desirable difficulty holds that students’ long-term recall of information improves when they must struggle to recall it on a test. Open-book tests lose this benefit for two reasons: students can look up the answer instead of straining to recall it, and examiners tend not to ask simple, factual questions (Rummer, Schweppe, and Schwede 2019).
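The question-bank mitigation in point 4 can be sketched in a few lines of Python. This is a minimal illustration only, not a description of any real exam platform; the function name `draw_paper` and the idea of seeding the random generator with a student ID (so that each student's selection is reproducible but differs between students) are assumptions of the sketch.

```python
import random

def draw_paper(question_bank, student_id, num_questions=5):
    """Draw a per-student selection of questions from a bank.

    Seeding the generator with the student ID makes each student's
    paper reproducible (the same student always gets the same paper)
    while different students receive different mixes of questions.
    """
    rng = random.Random(student_id)  # deterministic per student
    return rng.sample(question_bank, num_questions)

# Illustrative bank of 20 question identifiers; each student gets 5.
bank = [f"Q{i:02d}" for i in range(1, 21)]
paper_a = draw_paper(bank, student_id="S001")
paper_b = draw_paper(bank, student_id="S002")
```

A larger bank relative to the paper length reduces the overlap between any two students' papers, and therefore the value of colluding.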

The most striking feature of the six objections is that none of them expresses concern that open-book testing is too easy to be valid. The consensus is that it is more difficult, provided the examiner has the skill to set higher-order thinking tests.


For online distance education in Africa, securing traditional closed-book examinations is impractical at present, given the bandwidth and hardware that proctoring software requires. It is also unnecessary, much as designing a pen that writes in zero gravity is unnecessary when a pencil will do. The better solution for higher education is to embrace open-book testing: it reduces the risk of cheating, and it offers many educational advantages besides, provided that the examiners are well trained.

The implications for institutions offering online distance education are significant. As educators, we want to focus our testing on higher-order thinking rather than on the retention of facts. Well-constructed open-book exams are ideally suited to this purpose, and they bring desirable fringe benefits: less stress, equivalent long-term recall, fewer commission errors, and more flexibility in the curriculum. Why waste time and money on surveillance software to perpetuate a method best suited to rote learning when we can embrace a simpler, better approach that presses us to formulate more challenging tests?

Works Cited

Deneen, Chris. 2020. “Assessment Considerations in Moving from Closed-Book to Open-Book Exams.” University of Melbourne. Link.

DiCarlo, S. E. 2009. “Too Much Content, Not Enough Thinking, and Too Little Fun!” Advances in Physiology Education 33: 257–64.

Hughes, Gwyneth. 2021. “Open Book Exams: Open Season for Cheaters or a Better Form of Assessment?” UCL (blog). Link.

Rummer, Ralf, Judith Schweppe, and Annett Schwede. 2019. “Open-Book Versus Closed-Book Tests in University Classes: A Field Experiment.” Frontiers in Psychology 10. https://doi.org/10.3389/fpsyg.2019.00463.

Senkova, Olesya, Hajime Otani, Reid L. Skeel, and Renée L. Babcock. 2018. “Testing Effect: A Further Examination of Open-Book and Closed-Book Test Formats.” Journal of Effective Teaching in Higher Education 1 (1): 20–36.

Teodorczuk, Andrew, James Frazer, and Gary D. Rogers. 2018. “Open Book Exams: A Potential Solution to the ‘Full Curriculum’?” Medical Teacher 40 (5): 529–30.


Short Bio: Kevin, who is the Principal of SATS, obtained his first doctorate from Stellenbosch University and his Ph.D. from SATS. He has a deeply insightful approach to theology and has already made a significant contribution in his relatively short career.