Introduction
In Computer-Assisted Language Learning (CALL), curriculum design decisions increasingly occur in technology-saturated environments where platforms, tools, & analytics promise efficiency & innovation. Instructional designers are persistently bombarded with demands by pundits & the EdTech industry to adopt the latest buzzword technology or risk becoming obsolete, putting them under pressure to be ‘innovative practitioners,’ even though innovation does not necessarily correlate with better learning outcomes.
This article critically contrasts this all-too-common technology-first approach with evidence-informed backward curriculum design by comparing their implications across three evaluative dimensions: achievement of intended learning outcomes (ILOs), student persistence, & student & tutor satisfaction with courses. This article argues that backward design provides pedagogical validity & cohesion, while technology-first design risks misalignment, ‘engagement’ without learning, & pedagogical drift.
A brief & necessary definition of terms
Since the terms syllabus & curriculum are typically poorly defined & frequently used interchangeably in the education literature & professional discourse, for the purposes of this article, they are defined thus: ‘syllabus’ (Latin for list) refers to intended learning outcomes or what students are expected to learn, i.e. the constituent knowledge, skills, & attitudes (KSAs), while ‘curriculum’ (Latin for circuit or course) refers to the instructional sequences, progression, learning activities, materials, scaffolding, & feedback or how students learn the syllabus:
- syllabus = what to learn &
- curriculum = how to learn it.
This article is mostly concerned with planning & designing curricula.
Achievement of intended learning outcomes
Alignment vs. drift
Good backward design begins with clearly articulated & narrowly defined Intended Learning Outcomes (ILOs) & aligns assessment & learning activities accordingly, operationalising the principle of Constructive Alignment (Biggs, 1996; Biggs & Tang, 2007). In CALL, this ensures that language constructs, such as pragmatic competence, genre control, interactional fluency, & integrated skills, inform all design decisions. Technologies are selected instrumentally to engage specific cognitive & linguistic processes, thereby supporting language development, rather than determining what is taught.
Technology-first adoption inverts this logic. Activities are frequently shaped by what a tool can easily deliver, e.g. discussion posts, short recordings, automated quizzes, rather than by what students need to learn & how they need to learn it, i.e. by engaging in activities that require or at least encourage transfer-appropriate processing. As a result, alignment between ILOs & evidence of learning is often weak or implicit, leading to pedagogical drift, where learning goals are retrofitted to technological affordances rather than deliberately planned.
Construct validity vs. surface-level ‘engagement’
Because backward design centres on how learning occurs, it supports task designs aligned with evidence-informed SLA principles, including the primacy of comprehensible input (Boers, 2021; Nation, 2007, 2015), the development of linguistic awareness, e.g. genre, register, & rhetorical moves (Hyland, 2007, 2008; Swales & Feak, 2012), & developing form-meaning pairings. Learning sequences are staged & progressive, moving students from dependent & inflexible KSAs, toward independent & flexible performance (i.e. mastery), regardless of delivery mode. This reduces the likelihood of surface-level engagement that appears to be ‘active learning’ but lacks construct validity.
In contrast, technology-first design often conflates activity & engagement with learning (‘engagement’ is a nebulous, slippery term & a poor proxy for learning). The mere act of clicking, posting, or recording can be mistaken for evidence of learning, producing busywork that is visible & measurable but cognitively shallow (Coe, 2013). Language development is assumed rather than theorised & principled, & learning mechanisms remain implicit or under-specified.
Assessment validity vs. metric substitution
In backward design, construct validity for assessment is a requirement because designers explicitly define & justify what counts as evidence of learning. CALL tools are chosen & activities designed to generate valid artefacts that demonstrate development & mastery of ILOs, such as extended spoken interaction, revisions over time, & control of genre, register, & rhetorical moves, rather than proxy indicators such as usage frequency or time-on-task. Assessment rubrics should reflect specific, syllabus-aligned communicative competence & language development, not mastery of a tool.
Technology-first adoption, however, often prioritises easily captured metrics over meaningful evidence. Learning analytics, which are offered in all of today’s major LMSs, may favour what is technically convenient to measure (an observational bias known as the “streetlight effect”), encouraging assessments that privilege participation traces over linguistic growth, thus undermining construct validity & weakening claims about learning outcomes.
Student persistence
Coherence & transparency vs. fragmentation
Students are more likely to persist in courses that they perceive as logically structured, predictable, & transparent in their expectations (Tinto, 1997). Backward design supports persistence by ensuring that each activity has a clear purpose & that increases in cognitive demand are staged & scaffolded progressively in non-arbitrary ways, from dependent & inflexible performance to independent & flexible performance (i.e. mastery). Students can readily perceive how tasks contribute to outcomes & assessments, encouraging goal-oriented engagement.
Students often perceive technology-first designs as fragmented & lacking cohesion. When activities are introduced because a tool is available rather than because they serve a learning trajectory, students may struggle to see their purpose. Tasks appear episodic rather than cumulative, thereby weakening student persistence over time.
Reduced extraneous load vs. cognitive overload
When EdTech is subordinate to pedagogy, interfaces & activity types are simplified & used consistently. This reduces extraneous cognitive load, allowing students to allocate limited attentional resources to language processing (Sweller et al., 2019), which is already cognitively demanding in L2 contexts.
Technology-first designs frequently introduce multiple tools & interaction formats, each with its own learning curve. In some cases, a single lesson may include the use of 10 or more different tools & interactional formats (the highest I’ve seen is 13 in a 1-hour lesson, each with a separate user account & login, which cannot be managed or overseen by the teacher). This results in tool & format learning competing with language learning, increasing cognitive overload, frustration, & attrition, particularly for lower-proficiency & at-risk students.
Relevance & transfer vs. novelty decay
Backward design explicitly links tasks to assessments, real-world communicative needs, & transferable academic or professional practices. This perceived relevance is a strong predictor of persistence, especially in adult & higher-education contexts (Biggs & Tang, 2007; Hyland, 2004).
Technology-first adoption often relies on novelty, entertainment value (AKA ‘edutainment’), & strategies & techniques from the gaming (gambling) industry, such as point scoring, leaderboards, random rewards, etc. (so-called ‘gamification’), to drive engagement (I can think of several commercial language learning mobile apps that are guilty of this). While new tools may generate short-term motivation spikes, these effects decay rapidly once the “newness” wears off. Without strong intrinsic pedagogical value, persistence declines & engagement becomes compliance-based, or, worse, driven by gaming addiction, rather than motivated by learning gains.
Student & tutor satisfaction
Pedagogical agency vs. tool dependency
From a tutor perspective, backward design enhances satisfaction by grounding decisions in principled pedagogical reasoning. Instructional designers & teachers retain agency over curriculum design rather than adapting to a tool’s default affordances. Courses are easier to revise, justify, & defend, both pedagogically & institutionally. However, it can take many thousands of hours of study, experimentation, & experience developing learning interactions to master the appropriate use of an array of EdTech tools & platforms for effective instructional design.
Technology-first adoption can produce tutor alienation. Software & EdTech tools may implicitly dictate lesson structure, while preparation & troubleshooting demands increase with limited learning gains. Instructors may feel pressured to “innovate” rather than design meaningful learning activities, eroding professional confidence.
Implications for CALL
The contrast between backward design & technology-first adoption reveals that the core issue is not EdTech itself but a kind of technological fetishism: the assumption that EdTech tools are inherently pedagogically valid & that innovation always results in improvement. Backward design resists this assumption by re-centring evidence-informed instructional design, the science of learning, & language development, thus treating EdTech as one set of mediating tools among many, & recognising that non-use of EdTech can be a legitimate & defensible design decision. In this sense, backward design in CALL is not anti-technology but explicitly pro-critical adoption. Anchoring CALL in backward curriculum design mitigates misalignment & surface-level engagement while maintaining pedagogical agency. For CALL practitioners & institutions, the challenge is not to adopt more EdTech, but to design more deliberately.
References
Biggs, J. (1996). Enhancing Teaching through Constructive Alignment. Higher Education, 32(3), 347–364. https://www.researchgate.net/publication/220017462_Enhancing_Teaching_Through_Constructive_Alignment
Biggs, J., & Tang, C. (2007). Teaching for Quality Learning at University (3rd edn). Open University Press. https://www.worldcat.org/title/teaching-for-quality-learning-at-university-what-the-student-does/oclc/772088919
Boers, F. (2021). Evaluating Second Language Vocabulary and Grammar Instruction: A Synthesis of the Research on Teaching Words, Phrases, and Patterns (1st edn). Routledge. https://doi.org/10.4324/9781003005605
Coe, R. (2013, June 18). Improving Education: A triumph of hope over experience [Lecture]. Inaugural Lecture of Professor Robert Coe, Durham University. http://www.cem.org/attachments/publications/ImprovingEducation2013.pdf
Hyland, K. (2004). Genre and Second Language Writing. University of Michigan Press ELT.
Hyland, K. (2007). Genre pedagogy: Language, literacy and L2 writing instruction. Journal of Second Language Writing, 16(3), 148–164. https://doi.org/10.1016/j.jslw.2007.07.005
Hyland, K. (2008). Genre and academic writing in the disciplines. Language Teaching, 41(4), 543–562. https://doi.org/10.1017/S0261444808005235
Nation, P. (2007). The Four Strands. Innovation in Language Learning and Teaching, 1(1), 2–13. https://doi.org/10.2167/illt039.0
Nation, P. (2015). Principles guiding vocabulary learning through extensive reading. Reading in a Foreign Language, 27(1), 136–145. https://scholarspace.manoa.hawaii.edu/bitstream/10125/66705/1/27_1_10125_66705_nation.pdf
Swales, J., & Feak, C. B. (2012). Academic writing for graduate students: Essential tasks and skills (3rd edn). University of Michigan Press.
Sweller, J., Ayres, P., & Kalyuga, S. (2019). Cognitive load theory (2nd edn). Springer.
Tinto, V. (1997). Classrooms as communities: Exploring the educational character of student persistence. The Journal of Higher Education, 68(6), 599–623.
“EdTech will not save you” image & concept credit: Dr. Donna Lanclos https://www.donnalanclos.com/