The Benefits of Asking Students to Write Their Questions Down

By C. Michael Smith, PhD, Roanoke College

Abstract

Many students resist the opportunity to ask questions during a classroom lecture. However, several studies have suggested that when students develop questions, they engage higher-level cognitive functions, thus potentially improving their understanding of the material. For lecturers who struggle to motivate students to ask questions orally, then, do the same benefits exist if students are instead required to pose written questions? The findings of this classroom experiment provide some indication that students perform better on testable material for which they generated written questions than on material covered in the lecture and reading alone.

Introduction

Even after thirty years, many moviegoers of the 1980s still quote the classic teen comedy “Ferris Bueller’s Day Off.” In one memorable scene, a business teacher, played by real-life economist Ben Stein, lectures his class on the potential impact of tariffs. As he goes over the material, he repeatedly attempts to obtain feedback from his students by posing a question and following it up with “anyone, anyone.” Unfortunately, the only responses he receives from his students are glazed-over eyes and some occasional drool.

After many months of online learning, it seems that many students have returned to Ferris Bueller’s classroom. While classroom leaders may attempt to encourage questions and feedback, students oftentimes passively resist the opportunity to gain additional clarity through the act of speaking up (Graesser & Person, 1994; Dillon, 1990). Even before the post-pandemic return to the classroom, however, many lecturers found themselves baffled by what they perceived as apathy on the part of silent students. It is important to note that, from the student’s perspective, the pressure associated with asking a good question in a live classroom can be daunting. While a question is first and foremost a request for more information, the question itself implies that the student asking it has sufficient knowledge to structure it properly and understand the answer (Miyake & Norman, 1979). The inability of students to judge whether they have sufficient knowledge to ask a question is therefore a known barrier to student questions in the classroom (Graesser & McMahen, 1993; Pressley, Ghatala, Woloshyn, & Pirie, 1990). Another reason that students oftentimes remain silent is the fear of appearing uneducated in front of their classroom peers (Graesser, McMahen, & Johnson, 1994). No one wants to be judged as having asked a “dumb question.” A third reason for student silence rests on the shoulders of the classroom leader. Lecturing faculty (who from here on will simply be referred to as lecturers) sometimes get on a roll, giving good examples, making strong points, and progressing the class efficiently to its end. Students legitimately do not wish to interrupt the lecture with questions that disturb the lecturer’s solid flow of material.
In addition, lecturers tend to be poor role models for question-asking as most of the questions that are posed to the classroom are low-level, short-answer questions that are simply designed to confirm understanding rather than to provoke thought (Dillon, 1991). In short, many lecturers do not ask good questions, themselves.

There are likely added challenges associated with asking questions in interdisciplinary classrooms. Not only can interdisciplinary topics prove more challenging than many of the disciplines taught in the average course, but students are also being asked to speak about topics with which they are likely to be relatively unfamiliar. In fact, given the nature of interdisciplinary studies, students may worry that in many cases the question may be more important than the response! This is liable to cause interdisciplinary students, in particular, to feel significant pressure when it comes to speaking up and attempting to ask a good question.

Interestingly, when lecturers were polled as to the reason for the silence in their classrooms, the results came back very simply as “fear” and “shame” on the part of the students (Nuri, 2019). However, even though the lecturers deemed the silence to be primarily a student issue, they further posited that the solution was theirs to implement. The lecturers suggested that the best method of overcoming these two powerful emotions would be to find a way to motivate students to participate by providing an interesting and comfortable learning atmosphere.

The principal issue with this arguably good advice, from a practical perspective, is the fact that most lecturers with silent classrooms would claim that they already attempt to create positive learning environments and comfortable classroom atmospheres. Lecturers regularly experiment with different classroom methods in an attempt to efficiently convey the required information while fostering student engagement (Case, Bartsch, McEnery, Hall, Hermann, & Foster, 2008; Holley & Steiner, 2005). Still, many lecturers struggle. Oftentimes, even after considerable efforts to increase engagement, the only reward seems to be more silence from students. Given the difficulty that some lecturers experience in their attempts to evoke questions from their students, some may wonder: Is the effort worth it? Do students truly benefit from asking questions, or do they perform just as well when they allow the lecturer to cover the chosen material sans feedback?

Research would seem to suggest that questioning does have an important role to play in meaningful learning (Aguiar, Mortimer, & Scott, 2010; Chin & Osborne, 2008). According to Brown, Palincsar, and Armbruster (1984) and Rosenshine, Meister, and Chapman (1996), question generation is important to comprehension-fostering because it focuses the student’s attention on content. However, student questions are not the only way to engage higher-level cognitive functions; they are simply one way (Garcia & Pearson, 1990). Is it possible, then, that lecturers frustrated by their efforts to solicit questions from the classroom may find greater benefit by refocusing on other areas? One popular method, given the reluctance of many students to ask questions, is for the instructor to pose questions to the class during the lecture that are designed to focus student attention on the more important concepts. Parkinson and Whitty (2022) found that the use of these “tag questions” significantly aided student understanding of the material, but did not necessarily generate a natural verbal response. But is a verbal response needed to successfully convey classroom material? Higher-level cognitive functions in students could be fostered by an improved delivery of the course material, perhaps by carefully selecting relevant tag questions, or by developing assignments designed to increase student knowledge and retention. In fact, the assignment itself could be to pose relevant and thought-provoking questions to the course instructor.

Many lecturers have experimented with requiring oral student engagement in the form of assignments and other graded material. A popular method is the “flipped classroom,” where the lecturer no longer lectures at all but instead aids students as they take responsibility for their own learning (Lai & Hwang, 2016). While the flipped classroom provides many opportunities for an enhanced learning experience, this type of classroom arrangement also poses many challenges (Akçayır & Akçayır, 2018). Therefore, many lecturers may be naturally inclined to pursue a simpler means of increasing classroom engagement. Another strategy might be to simply require students to ask oral classroom questions, but it is arguable whether this type of activity would truly motivate student engagement or, rather, create additional resistance and tension. However, what if, instead of requiring oral questions, students were only required to write their questions down?

While much research has looked at the potential benefits of asking oral questions naturally in a live, in-person classroom, little research has analyzed student comprehension improvements associated with written questions (Harper, Etkina, & Lin, 2003). Is it possible that if students won’t ask questions in a live classroom, they could still benefit from writing their questions down, perhaps in a journal or some other written assignment? If this activity is constructive for the student, the benefits to the lecturer are obvious. Asking students to simply write down their questions is an extremely practical and tangible activity. Unlike the ambiguous recommendation to create a comfortable classroom environment, or the course-strategy transformation associated with the flipped classroom, a written assignment in which students pose questions to the lecturer is simple, tangible, and easily implementable in most instances. But does it help? Do students benefit when they generate their own written questions on class material, or do the knowledge gains only come from questions generated orally during classroom interactions?

Methodology

To better understand whether students do, in fact, benefit from posing written questions to the course lecturer, a small experiment was devised to quantitatively test student performance on tested material using available pre-pandemic data. The first type of information analyzed was student responses to exam questions based on course material that had been lectured on without written student questions. This information represented the standard lecture format in a quiet classroom with no (or very few) student questions posed. The second type of information was student responses to exam questions that had been addressed with students in response to required written questions posed by students to the lecturer.

Data for the experiment came from the 2019 spring semester “Personal Finance” course at a small liberal arts college. This particular course is an introductory course, but it typically fills to its maximum of twenty-five students long before first- and second-year students are provided the opportunity to enroll. During the semester examined, over ninety percent of students were in their third or fourth years. Approximately one-third of the students in this business elective were business majors, but the remaining students represented a variety of disciplines, including economics, psychology, sociology, public affairs, and human performance. As is the case in interdisciplinary studies, many of these students had little to no background in the course topic. The wealth of disciplines represented in the class may help to explain why the students in this course seemed to be particularly quiet during course lectures, even before the pandemic. As suggested above, the risk of being perceived as having asked a “dumb question” may be higher for students with little background in the subject matter.

The assignments

The course assignments that required students to pose relevant written questions to the course instructor were four personal finance case analyses covering topics relevant to recent lectures. For each assignment, students had to provide an overview of the case along with an analysis of the information they found most important and how they could potentially apply it in their own financial lives. In addition to these traditional case analysis components, the assignment required students to mindfully pose a written question that they still had about the case material or overall topic. Students were informed that the instructor would answer questions deemed “thoughtful, insightful, helpful, and answerable” in the next class. Further, students were advised that these anonymous questions (along with the associated answers) should be considered testable material on upcoming exams.

Examples of questions that did not meet the specified criteria included:

  • What is the best credit card?
  • Should I just stay on my parent’s insurance plan after I graduate?

While these questions relate to the respective topics, they are largely matters of opinion, with answers that may only benefit an individual in a specific financial situation. Truly answering these questions would require much more information, and even then, they offer little insight that would prove helpful to the class as a whole.

Examples of questions that met the specified criteria included:

  • What happens if you are an “authorized user” on a credit card, and the primary user dies with credit card debt?
  • Is it possible that someone could make too much money to be allowed to contribute to a Health Savings Account?

Both of these questions take subject matter from their respective cases and delve deeper into an understanding of the specific concept. They are practical applications, with researchable answers that could potentially prove helpful to the personal finances of many consumers.

An average of approximately seven student questions were answered for each case assignment (twenty-six answered student questions in total over the course of the semester). Numerous student questions did not fit the required criteria; however, some of these questions demonstrated an obvious misalignment between the subject matter taught and a true understanding of the material. Armed with these questions, the instructor was able to clarify previously taught material based on the specific type of misunderstanding expressed. In addition, some questions were so deep and complex that the instructor was forced to carefully research answers before providing responses.

Interestingly, even if students were not benefiting from the experiment, the instructor developed a stronger understanding of the course material in researching and preparing responses to the more thoughtful student-generated questions. It is worth noting that while this outcome was far from the original goal of having students benefit directly from the generation of their own written questions, it was still a noticeable benefit. Given this result, it would appear that questioning in the classroom (written or otherwise) may create a meaningful learning opportunity, not just for students, but for the instructor as well. Thus, the activity of questioning may have an even more important role to play in the classroom than the current literature would seem to suggest. It is possible that student questions not only enhance the learning opportunities for the students, themselves, but also provide the instructor with a natural way to keep the course material fresh and engaging.

The exams

Three exams were scheduled over the course of the semester. The exams consisted of multiple-choice and essay questions. For the essay section, students had to answer three of the four listed questions and were encouraged to skip the question they were least prepared to answer. Each essay question was worth ten points (out of a total of one hundred points on the exam).

For the spring semester of 2019, exams one and three contained one essay question based on the questions developed by the students, themselves, in their case assignments. The second exam contained two questions that were inspired by students. For each exam, the remaining essay questions were created by the instructor based on material covered in class and other readings.

The research question and hypothesis

For the purposes of this experiment, both types of questions (those developed by the students themselves and those developed by the instructor) were analyzed to determine the average number of points deducted for incorrect or partially incorrect responses. The primary research question for this study is: Do students perform better on material that they helped develop through their own question generation?

Results

Before testing the hypothesis, descriptive statistics were run on the values. It is interesting to note that the N = 75 for the instructor-created questions is almost the same as the N = 72 for the student-generated questions. Given that there were eight instructor-created essay questions and only four student-created essay questions, students heavily self-selected the questions they themselves had generated when given the option to leave a question blank (skipping, most likely, the question about which they had the least knowledge).
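As a rough check on this self-selection claim (a back-of-the-envelope baseline assumed here, not an analysis from the original study): if students skipped one of the four essay questions on each exam uniformly at random, answered counts should split 2:1 in favor of the eight instructor-created questions. The sketch below computes that baseline from the observed totals.

```python
# Under purely random skipping, each of the four essay questions on an exam
# is answered with probability 3/4, so expected counts would follow the 8:4
# split of instructor-created vs. student-generated questions.
observed = {"instructor": 75, "student": 72}
n_total = sum(observed.values())            # 147 answered essays in total

expected = {
    "instructor": n_total * 8 / 12,         # 98.0 under random skipping
    "student": n_total * 4 / 12,            # 49.0 under random skipping
}

for group in observed:
    print(f"{group}: observed {observed[group]}, "
          f"expected {expected[group]:.0f} if skips were random")
```

The observed near-even split (75 vs. 72) against an expected 98 vs. 49 is consistent with students preferentially answering the student-generated questions.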

The average number of points deducted on essay questions developed solely by the course instructor was 2.89, while the average number of points deducted on questions generated by the students was 1.64; an independent-samples t-test confirms this difference is statistically significant. Therefore, it would appear that the students were able to respond more accurately to questions they themselves had generated. See Table 1 and Table 2 for more information.

Table 1 – Exam Question Point Deductions: Group Statistics

Question Type        N    Mean   Std. Deviation   Std. Error Mean
Instructor-created   75   2.89   2.518            .291
Student-generated    72   1.64   2.247            .265

Table 2 – Independent Samples Test on Exam Question Point Deductions

                              Levene’s Test            t-Test for Equality of Means
                              F       Sig.    t       df        Sig. (2-tailed)   Mean Diff.   Std. Error Diff.   95% CI (Lower, Upper)
Equal variances assumed       5.604   .019    3.182   145       .002              1.254        .394               (.475, 2.034)
Equal variances not assumed                   3.190   144.246   .002              1.254        .393               (.477, 2.032)
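The group statistics in Table 1 are sufficient to reproduce the t-test in Table 2 from summary values alone. The sketch below recomputes the t statistic and degrees of freedom for the “equal variances not assumed” row using the Welch-Satterthwaite formulas; because the table values are rounded, the results only approximate the reported t = 3.190 and df = 144.246.

```python
import math

def welch_t_from_summary(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t-test computed from group summary statistics."""
    se1_sq = sd1 ** 2 / n1                 # squared std. error of group 1 mean
    se2_sq = sd2 ** 2 / n2                 # squared std. error of group 2 mean
    se_diff = math.sqrt(se1_sq + se2_sq)   # std. error of the mean difference
    t = (mean1 - mean2) / se_diff
    # Welch-Satterthwaite approximation for degrees of freedom
    df = (se1_sq + se2_sq) ** 2 / (
        se1_sq ** 2 / (n1 - 1) + se2_sq ** 2 / (n2 - 1)
    )
    return t, df, se_diff

# Rounded values from Table 1: instructor-created vs. student-generated questions
t, df, se = welch_t_from_summary(2.89, 2.518, 75, 1.64, 2.247, 72)
print(f"t = {t:.3f}, df = {df:.3f}, SE of difference = {se:.3f}")
```

The same computation is available as `scipy.stats.ttest_ind_from_stats(..., equal_var=False)` for readers who prefer a library call.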

Discussion

Many course instructors in the post-pandemic classroom may be thinking about student classroom response techniques. Both students and faculty are adjusting to a new classroom “normal” after months (or years) of online-only instruction. And, while much more research needs to be done in this area, the results of this experiment may provide lecturers who are adjusting (or readjusting) to face-to-face class meetings with additional insight into strategies and techniques that can further enhance student learning.

While lecturers can and should continue to experiment with methods to increase oral student engagement, an additional option may be to simply require students to thoughtfully develop their questions in writing, as this particular experiment noted tangible benefits and addressed several known barriers to oral classroom question-asking. While the significant limitations of this experiment mean the results are far from conclusive evidence in support of the technique, there still appear to be sound benefits to this extremely simple strategy. First, by using written questions, students must overcome the barrier associated with the feeling that perhaps they do not understand the material sufficiently to ask a relevant and insightful question (Graesser & McMahen, 1993). As noted above, this particular barrier may prove particularly troublesome in interdisciplinary studies, where students may feel that they know less about the particular subject matter. The requirement that a question be generated, regardless of the student’s current level of understanding, would seemingly serve to focus students on the topic and improve understanding. In addition, questions that demonstrate a lack of understanding are extraordinarily helpful for course lecturers, as they provide insight into course material that may need to be further explained or expanded upon. Further, written questions negate the temptation for students to remain silent due to social concerns (Graesser, McMahen, & Johnson, 1994). Since the written questions are not, by necessity, publicly disclosed, students need not worry about what their classroom peers might think of their question.

Further adding to a lecturer’s incentive to require written questions, as opposed to the traditional raising of hands, are technological advancements that make the implementation of the written-question strategy both effective and efficient. Depending on the lecturer’s preferences, they may find success with an old-fashioned word processor and a homework assignment, or with smartphone applications and software that allow a student’s written questions to be addressed in the classroom in real time. There is considerable flexibility.

Instructors in interdisciplinary studies may find the flexibility associated with using technology particularly beneficial when it comes to student questioning, due to the increased level of collaboration often found in such programs. When it comes to addressing student-generated questions, students may benefit greatly from the instructor’s ability to consult with cross-disciplinary faculty connections. Technology, as a communication tool, could help significantly in making these connections across campus. Further, while feedback is necessary in any discipline, carefully crafted responses to interdisciplinary course questions are imperative. Technology, once again, can aid in making the process of generating useful feedback as efficient as possible.

In summary, based on the results of this classroom experiment, asking students to pose questions to the course instructor in written form appears to aid them in learning the material, as measured by their performance on exams. In addition, the information gained from the written questions can be used by the instructor to better gauge student understanding, spot disconnects, and provide further explanation or clarity when needed. And lastly, the written questions may even aid instructors in their efforts to stay up-to-date on relevant course topics and material.

References

Aguiar, O. G., Mortimer, E. F., & Scott, P. (2010). Learning from and responding to students’ questions: The authoritative and dialogic tension. Journal of Research in Science Teaching: The Official Journal of the National Association for Research in Science Teaching, 47(2), 174-193.

Akçayır, G., & Akçayır, M. (2018). The flipped classroom: A review of its advantages and challenges. Computers & Education, 126, 334-345.

Brown, A. L., Palincsar, A. S., & Armbruster, B. B. (1984). Instructing comprehension-fostering activities in interactive learning situations. Learning and comprehension of text, 255-286.

Case, K., Bartsch, R., McEnery, L., Hall, S., Hermann, A., & Foster, D. (2008). Establishing a comfortable classroom from day one: Student perceptions of the reciprocal interview. College Teaching, 56(4), 210-214.

Chin, C., & Osborne, J. (2008). Students’ questions: a potential resource for teaching and learning science. Studies in science education, 44(1), 1-39.

Dillon, J. T. (1990). The practice of questioning. Taylor & Francis.

Dillon, J. T. (1991). Questioning the use of questions. Journal of Educational Psychology, 83(1), 163-164.

Garcia, G. E., & Pearson, P. D. (1990). Modifying reading instruction to maximize its effectiveness for all students. Center for the Study of Reading Technical Report; no. 489.

Graesser, A. C., & McMahen, C. L. (1993). Anomalous information triggers questions when adults solve quantitative problems and comprehend stories. Journal of Educational Psychology, 85(1), 136.

Graesser, A. C., McMahen, C. L., & Johnson, B. K. (1994). Question asking and answering. In M. A. Gernsbacher (Ed.), Handbook of psycholinguistics (pp. 517–538). Academic Press.

Graesser, A. C., & Person, N. K. (1994). Question asking during tutoring. American educational research journal, 31(1), 104-137.

Harper, K. A., Etkina, E., & Lin, Y. (2003). Encouraging and analyzing student questions in a large physics course: Meaningful patterns for instructors. Journal of Research in Science Teaching: The Official Journal of the National Association for Research in Science Teaching, 40(8), 776-791.

Holley, L. C., & Steiner, S. (2005). Safe space: Student perspectives on classroom environment. Journal of Social Work Education, 41(1), 49-64.

Lai, C. L., & Hwang, G. J. (2016). A self-regulated flipped classroom approach to improving students’ learning performance in a mathematics course. Computers & Education, 100, 126-140.

Miyake, N., & Norman, D. A. (1979). To ask a question, one must know enough to know what is not known. Journal of verbal learning and verbal behavior, 18(3), 357-364.

Nuri, B. (2019, October). The reluctance of students to ask in mathematics learning: How does the teacher solve it?. In Journal of Physics: Conference Series (Vol. 1320, No. 1, p. 012069). IOP Publishing.

Parkinson, J., & Whitty, L. (2022). The role of tag questions in classroom discourse in promoting student engagement. Classroom Discourse, 13 (1), 83-105.

Pressley, M., Ghatala, E. S., Woloshyn, V., & Pirie, J. (1990). Sometimes adults miss the main ideas and do not realize it: Confidence in responses to short-answer and multiple-choice comprehension questions. Reading Research Quarterly, 25(3), 232-249.

Rosenshine, B., Meister, C., & Chapman, S. (1996). Teaching students to generate questions: A review of the intervention studies. Review of Educational Research, 66(2), 181-221.