Research Article Critique

By Suzannah Alexander

November 28, 2021

Reference

Mamun, M. A., Lawrie, G. A., & Wright, T. (2020). Instructional design of scaffolded online learning modules for self-directed and inquiry-based learning environments. Computers & Education, 144, 103695. https://doi.org/10.1016/j.compedu.2019.103695

Study Purpose or Problem

What is the purpose of the study?

Learner-content interaction is essential to effective learning in online environments, yet little research has focused on how students react to and best learn from an inquiry-based format without the support of instructor-learner and learner-learner interactions. This study explores how scaffolding through a predict, observe, explain, and evaluate (POEE) strategy can support self-directed, inquiry-based online learning. The researchers propose that this framework can be used to support learners’ independent study in blended learning environments (Mamun et al., 2020).

Is the purpose of the study or research problem clearly identified?

The researchers clearly state the purpose of the study and explain the deficiencies of past studies in related areas of inquiry, specifically the lack of studies on effective student-content online interactions.

Literature Review and Theoretical Framework

Is the review logically organized?

The literature review is logically organized and presents brief summaries of theories and past work on constructivism; knowledge reflection; scaffolding; the zone of proximal development; the community of inquiry framework and social constructivism; student-content, student-student, and student-teacher interactions and the equivalency theorem; independent learning and self-regulation; and self-efficacy, prior knowledge, and learning strategies needed for successful learning (Mamun et al., 2020).

Is the theoretical framework appropriate?

The theoretical framework is appropriate, in that it builds on past scaffolding studies that used a predict, observe, and explain (POE) design. The researchers theorized that adding an evaluation element to provide synchronous feedback to learners in an online environment would strengthen the overall pedagogical strategy. They structured the learning modules in a macro-to-micro concept progression that mirrored face-to-face instructor interaction, replacing direct instructor support with carefully crafted guidance from the technology and content (Mamun et al., 2020).

Does it offer a balanced critical analysis of the literature?

The literature analysis is thorough and balanced, as it includes numerous studies over many years from a range of disciplines.

Is the majority of the literature of recent origin?

The cited literature goes back to the 1970s, but the majority is from the past 20 years. The most recent literature includes the studies that are most closely related to this study.

Is it mainly from primary sources and of an empirical nature?

Most of the cited literature is from primary, empirical sources. Some of the cited literature, however, covers specific educational theories and research methodologies that are applicable to the study framework.

Does it address the gap in the literature related to the research problem?

The researchers describe the gap in the literature as a lack of research on how students respond to scaffolding in self-regulated, inquiry-based online learning environments that do not involve instructor or peer interactions. In this study, the scaffolding is provided by the digital learning tools and the content that supports student learning and engagement (Mamun et al., 2020).

Research Questions or Hypotheses

What are the research questions or hypotheses being examined?

The research question being examined is: What type of scaffolding enables students to learn independently so that direct instructor support is unnecessary in an inquiry-based online learning environment? (Mamun et al., 2020).

Are they clearly stated?

The main research question is clearly stated. No hypotheses are examined in the study. The researchers’ goal is to establish an example of a scaffolding strategy that can add to the body of evidence supporting learner-centered online learning options (Mamun et al., 2020).

Population and Sample (Quantitative) or Participants (Qualitative)

Has the target population been clearly identified?

First-year chemistry students are the target population (Mamun et al., 2020).

How was the sample selected?

The researchers used a convenience sampling technique by inviting all first-year chemistry students at a particular institution to participate via the learning management system (Mamun et al., 2020).

Who are the participants? What is the sample selection method?

Only students who were interested in participating were selected for the study (Mamun et al., 2020).

Is the sample of adequate size?

This study included only 30 participants, which is a small sample size. The researchers chose to focus on a more detailed, in-depth exploration of the experiences of a select number of participants (Mamun et al., 2020).

Research Setting or Context

What was the setting or context of the research?

Students were selected from those enrolled in a regular chemistry course, but this study was conducted outside of the regular classroom environment. Students were given brief instructions on how to use the technology tools and then left alone in a study room with only a computer to complete a 50-minute module. Researchers were then able to remotely monitor and record the students’ online activity (Mamun et al., 2020).

Method

What type of qualitative or quantitative method was used?

The researchers describe their study as a mixed methods approach that was heavily qualitative in nature; however, they fail to describe any of their quantitative measures. They also do not report any quantitative results, instruments, or data collection methods, other than to mention that the learning modules included concept check questions that asked students to rate their confidence on a scale from very high to very low, as well as multiple-choice questions to determine students’ understanding of the concepts (Mamun et al., 2020).

Is the method appropriate? Why?

There appears to have been ample opportunity to acquire quantitative data that would show the full student experience more clearly. The researchers could have used a pre-test to determine students’ prior chemistry knowledge and compared those results to post-test scores. They describe using different inquiry methods throughout the learning modules, but they do not quantify any of the results from the concept check and multiple-choice questions. For example, there is no mention of whether the students answered the questions correctly in the evaluation portion of the scaffolding strategy. Moreover, using established instruments, they could have determined whether an interest in technology or science skewed the results (Mamun et al., 2020).

Instead, they reported a subjective review of the qualitative data to determine whether students successfully learned the material. They could also have sought to show that the intervention increased overall scores. They had the opportunity to compare the group that participated in the online learning module to students who did not participate. They could have devised a quiz and given it to both groups after the regular face-to-face instruction on the topic was completed, or they could have recruited multi-year or multi-semester cohorts and compared the results across multiple groups. While the qualitative measures are a good first step, results demonstrating measurable learning gains would have been more powerful (Mamun et al., 2020).

Study Procedures

How was the research study undertaken?

The researchers used self-reported prior knowledge of chemistry to sort the students into two groups, but they failed to explain exactly how the students were sorted. It is unclear whether students with little to no knowledge were placed in one group and those with more knowledge in another, or whether some other sorting strategy was used. The researchers also devised two different topic modules, one on heat and one on phase change, but they do not describe how students were assigned to each topic module (Mamun et al., 2020).

Was the procedure appropriate?

No control group was used. The researchers argue that a control group is unnecessary for this type of study; however, they also state that they could have used a pre-test to determine chemistry skill level (rather than self-reported knowledge) if they had included a control group. There was an opportunity to include a control group, as not all first-year chemistry students participated in working through the topic modules. The researchers do not specify whether prior knowledge affected which topic students were assigned or whether the assignments were random. Thus, more information is needed to evaluate whether the study procedures were appropriate, and given these omissions, the results are questionable (Mamun et al., 2020).

Data Collection

What types of data were collected? What were the data sources? How were they collected?

The researchers recorded the students’ onscreen activity, took observational notes, embedded open-ended questions into the learning modules, and conducted stimulated recall interviews (Mamun et al., 2020).

Are the data collection methods appropriate?

The qualitative data collection methods were appropriate, but the quantitative data collection was lacking.

Is the instrument appropriate? How was it developed? Were reliability and validity testing undertaken and the results discussed?

The researchers did not describe any instruments or reliability and validity testing. While they discussed their qualitative results in depth, they did not discuss any quantitative results. In the methodology section, they mention that they collected quantitative data to strengthen their findings, but they never go into any further detail. They admit that the study has a qualitative bias but do not adequately explain why they chose to forgo discussing the full research methods (Mamun et al., 2020).

Data Analysis

What type of data and statistical analysis was undertaken?

The researchers used thematic analysis, following a previously published blueprint for an inductive approach, to look for patterns in the data. They coded the qualitative data they collected and also used a theory-driven approach to identify themes within the data (Mamun et al., 2020).

Was it appropriate?

It is difficult to ascertain whether the researchers coded the data appropriately, given that they do not describe the exact methods they used.

Was there any inappropriate use of statistics?

No statistical analyses were reported in the study, so their use cannot be evaluated.

How did the study produce and maintain its rigor and trustworthiness during the research?

The study was approved by an institutional ethics committee, and informed consent was required (Mamun et al., 2020).

Suggested improvements

The researchers should have given more detail about their data analysis process, as it is difficult to understand the methods they used. Although they referenced other research, unless the reader has access to those articles or has prior knowledge of the specific methodologies, the information provided is insufficient to form a clear picture of the analysis.

Results and Discussion

Summary of the major findings

The researchers found that the POEE strategy upheld the constructivist criteria: it activated students’ prior knowledge, created cognitive dissonance, allowed students to apply new knowledge, and supported reflection that clarified that knowledge during the learning process. The researchers found that the addition of the evaluation phase was effective in providing synchronous feedback. They believe that the combination of multiple representations of concepts, inquiry-based questions, and instructional guidance through technology successfully mitigated the lack of instructor and peer involvement (Mamun et al., 2020).

Significance of the findings

Prompting students to reveal what they already knew about the topic at the beginning of the module caused them to reflect on prior knowledge and expose aspects of the concepts that were unclear. This confusion made students aware of their knowledge gaps and created a desire to discover the answers. They were then led into an exploration of the concepts through various simulations and visuals, with dynamic visualizations proving the most valuable. The researchers also found that strongly guided explorations were more effective in a self-directed online learning environment. Because students were then required to explain what they had learned, they either clarified their understanding or reinforced misconceptions. The concept check questions at this stage helped them consider whether they had answered correctly and in enough detail. The evaluation phase refocused students on the learning goals for the module, helping them determine whether they needed to revisit the simulations and visuals from the earlier stages to verify the concepts, particularly if they answered questions incorrectly. This feedback loop, which was well developed and specific to the concepts, further strengthened their understanding (Mamun et al., 2020).

Are the findings linked back to the literature review?

The findings were linked back to the literature review, in that they clearly explain how the POEE strategy builds on earlier research.

Were the strengths and limitations of the study, including generalizability, discussed?

The researchers argue that, because the POEE strategy builds on a well-characterized POE strategy, the findings are generalizable. They also believe that the strategy can be applied to other STEM subjects, in addition to chemistry. They admit that further research is needed to determine whether the strategy can be applied to different program levels, since only introductory students participated. The researchers also admit that they did not assess students’ skill level with technology, nor did they account for students who may not be familiar with computers and the technology involved in the modules. Moreover, they did not address how students having little to no previous knowledge of chemistry affected the outcomes (Mamun et al., 2020).

Was a recommendation for further research made?

Further research was recommended to determine the right level of instructional guidance in self-directed online learning environments, as some students still preferred minimally guided options. The researchers also recommended exploring ways to address misconceptions and to provide feedback in various forms at different stages of the learning process. They recommended conducting studies on a larger scale, using multimodal scaffolding strategies without direct instructor or peer involvement in self-directed, inquiry-based online environments (Mamun et al., 2020).

Overall Evaluation

Strengths

This study clearly outlined a scaffolding strategy worth exploring in self-directed, inquiry-based online learning environments. The researchers validated the use of a specific strategy and added a new evaluation phase that proved effective (Mamun et al., 2020).

Limitations

The study had a clear qualitative bias. Many opportunities to associate quantitative data with the findings were missed. Thus, more research is needed to measure the effectiveness of the approach compared to traditional learning approaches.

Suggestions for improving the research design

Quantitative measurements, such as the number of moves, clicks, attempts, and elapsed time, could have been recorded and counted to determine the level of support the students needed as well as the level of engagement with the module activities. It would have been interesting to note how many students went back into the simulations to explore the concepts after they reached the evaluation phase, whether the students viewed all the various types of simulations and visuals, whether they skipped portions of the module, whether they got sidetracked during the exercises, or whether they scored well during the evaluation phase. The study could have been designed to compare the performance of students with and without instructor involvement. Pre-tests and post-tests could have been compared to evaluate learning gains on the specific concepts through a paired t-test design.
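As a minimal sketch of the paired t-test suggested above, the snippet below compares hypothetical pre- and post-test scores for the same students; the study reports no such scores, so the data and sample size here are placeholders for illustration only.

```python
# Illustrative sketch only: the study reports no pre-/post-test scores,
# so the arrays below are hypothetical placeholders.
from scipy import stats

# Hypothetical scores (0-100) for the same 30 students before and after
# completing an online learning module.
pre_scores = [52, 61, 48, 70, 55, 63, 58, 49, 66, 60,
              57, 45, 72, 68, 54, 59, 62, 50, 47, 65,
              53, 69, 56, 64, 51, 67, 73, 46, 71, 60]
post_scores = [64, 70, 55, 78, 60, 74, 66, 58, 75, 71,
               65, 52, 80, 77, 61, 70, 73, 59, 55, 76,
               62, 79, 68, 72, 60, 78, 83, 54, 81, 69]

# Paired (dependent-samples) t-test on within-student score changes.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```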

Now that a strategy has been established, it might be worth exploring how this strategy compares across different scenarios via analysis of variance (ANOVA): comparing students who participate only in a traditional lecture, those who participate in both the online modules and a traditional lecture, and those who participate in the online modules only. In general, it would have been beneficial to know the quantitative results so that a greater understanding of the entire study could have been achieved. Omitting that information leaves the reader to wonder how the quantitative data do or do not support the findings.
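As a companion sketch, the snippet below shows how a one-way ANOVA could compare the three groups proposed above; the group sizes and scores are invented for illustration and do not come from the study.

```python
# Illustrative sketch only: hypothetical quiz scores for three conditions.
from scipy import stats

lecture_only = [58, 62, 55, 67, 60, 63, 59, 61, 56, 64]  # traditional lecture only
module_plus  = [70, 74, 68, 77, 72, 75, 69, 73, 71, 76]  # online modules + lecture
module_only  = [65, 69, 63, 72, 66, 70, 64, 68, 67, 71]  # online modules only

# One-way ANOVA tests whether mean scores differ across the three conditions.
f_stat, p_value = stats.f_oneway(lecture_only, module_plus, module_only)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```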