Evaluating the Effectiveness of Technology Tools in Online Learning
- Ian Briggs
- Jan 19
- 3 min read

In a graduate course on instructional media, I was required to use Adobe Captivate to design a short interactive learning module. The assignment was to create a branching scenario with embedded quizzes, audio narration, and responsive design elements. The task aligned well with the course’s focus on multimedia learning. In practice, however, the tool created more barriers than opportunities.
The goal of the assignment was to design a multi-scene branching scenario that allowed learners to make choices and receive immediate feedback. The challenge was apparent immediately: Adobe Captivate’s interface and workflow were far more complex than the assignment assumed. The instructions presumed prior familiarity with Captivate’s timeline, states, and advanced actions, yet the course provided only a brief overview. Captivate’s responsive design mode was also unstable, causing layout shifts and broken interactions after saving. The software imposed a steep learning curve for basic tasks such as editing audio, adjusting object timing, and configuring branching logic, which diverted attention from the learning objectives to troubleshooting the tool itself. Accessibility features such as keyboard navigation and screen reader compatibility were difficult to implement correctly without extensive review of online tutorials, contradicting the course’s emphasis on inclusive design. Overall, the tool overshadowed the assignment’s purpose: demonstrating effective instructional design, not mastering a complex authoring platform.
To make the assignment more equitable and instructionally aligned, I would revise both the directions and the tool requirements. One option is to keep Captivate but scaffold the learning experience. The course could provide step-by-step tutorials aligned with the exact features students must use (branching, audio, quiz slides). In addition, pre-built starter templates would allow learners to focus on design decisions rather than technical configuration. Accessibility checklists could also be included that show which Captivate features support the Web Content Accessibility Guidelines (World Wide Web Consortium, 2025) and which require workarounds. This approach aligns with Boettcher and Conrad’s (2021) emphasis on reducing unnecessary cognitive load, enabling learners to focus on core objectives.
The other option to consider is that Captivate simply is not the right tool, especially for beginners or for courses emphasizing Universal Design for Learning (UDL) principles and accessibility. Several emerging tools offer more intuitive, cloud-based alternatives: H5P for branching scenarios, quizzes, and interactive videos; Canva Docs and interactive embeds for scenario-based learning; Genially for visually rich, accessible, web‑native interactions; and AI-supported tools that scaffold multimedia creation without requiring advanced technical skills (Willmore, 2023).
If I were redesigning the assignment, I would recommend H5P as the primary tool. It is browser‑based, supports accessibility, and allows students to focus on instructional design rather than software mechanics. Integration would be straightforward: provide an H5P branching scenario template, embed it in the LMS, and require students to customize the content, feedback, and decision points.
I have found significant value in keeping a technology tool evaluation sheet to map out the various tools I encounter throughout my instructional design journey. When determining whether a tool meets the needs of online learners, especially adult learners and those with accessibility needs, the first thing I look for is whether the tool supports alt text, keyboard navigation, captions, screen readers, and the WCAG 2.1 guidelines (ADA.gov, 2022). Second, consistent with UDL, the tool must support multiple means of engagement, representation, and expression. Third, for both learners and instructors, the tool needs to be usable without extensive training so that it does not add extraneous cognitive load. As my own experience showed, the tool must also function consistently, save work reliably, and avoid technical glitches that derail learning. Additionally, it should work across mobile devices, tablets, and assistive technologies without breaking, and, as an added bonus, be affordable or available as an OER option to reduce barriers for learners. These criteria help ensure that technology enhances learning rather than becoming an obstacle.
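To illustrate, here is a minimal sketch of how an evaluation sheet like mine could be captured in a small script. The criterion names, weights, and sample ratings below are my own illustrative assumptions rather than a standard instrument, and the example scores are not a formal review of either product.

```python
# Hypothetical technology tool evaluation sheet (illustrative assumptions only).
# Weights reflect how heavily I might value each criterion, on no official scale.
CRITERIA = {
    "accessibility":        3,  # alt text, keyboard navigation, captions, screen readers, WCAG 2.1
    "udl_support":          3,  # multiple means of engagement, representation, and expression
    "ease_of_use":          2,  # usable without extensive training (low extraneous cognitive load)
    "reliability":          2,  # functions consistently, saves work, no glitches that derail learning
    "device_compatibility": 1,  # works on mobile, tablets, and assistive technologies
    "cost_or_oer":          1,  # affordable or available as an OER option
}

def score_tool(name: str, ratings: dict[str, int]) -> float:
    """Return a weighted average (0-5 scale) from 0-5 ratings per criterion."""
    total_weight = sum(CRITERIA.values())
    weighted = sum(weight * ratings.get(criterion, 0)
                   for criterion, weight in CRITERIA.items())
    score = weighted / total_weight
    print(f"{name}: {score:.1f} / 5")
    return score

# Example ratings based loosely on my experience; purely illustrative.
score_tool("Adobe Captivate", {
    "accessibility": 3, "udl_support": 4, "ease_of_use": 1,
    "reliability": 2, "device_compatibility": 2, "cost_or_oer": 1,
})
score_tool("H5P", {
    "accessibility": 4, "udl_support": 4, "ease_of_use": 4,
    "reliability": 4, "device_compatibility": 4, "cost_or_oer": 5,
})
```

Even a simple weighted checklist like this makes the trade-offs visible at a glance and keeps accessibility and UDL from being afterthoughts when comparing tools.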
My experience reinforced a key operational principle: the tool should serve the learning, not the other way around. Emerging technologies offer exciting possibilities, but they must be selected with intentionality and with accessibility and the learner experience in mind. As Dimitrov (2023) notes, immersive and interactive technologies only succeed when they are meaningfully integrated into pedagogical design, not when they are used for their own sake.
References
ADA.gov. (2022, March 18). Guidance on web accessibility and the ADA. https://www.ada.gov/resources/web-guidance/
Boettcher, J. V., & Conrad, R.-M. (2021). The online teaching survival guide (3rd ed.). Jossey-Bass.
Dimitrov, K. (2023). A debate about emerging immersive technologies in the context of “higher education 4.0.” Trakia Journal of Sciences, 21(Suppl. 1), 242–247.
EDUCAUSE. (n.d.). Emerging technologies and trends. https://er.educause.edu/channels/emerging-technologies-trends
Willmore, J. (2023, December 4). AI education and AI in education. U.S. National Science Foundation. https://new.nsf.gov/science-matters/ai-education-ai-education
World Wide Web Consortium. (2025, October 20). WCAG 2 overview. W3C Web Accessibility Initiative (WAI). https://www.w3.org/WAI/standards-guidelines/wcag