Specifying elearning resources and strategies

A common challenge facing those about to embark on elearning projects is knowing just what their options are and what those options have to offer. Novice project managers, teachers, and curriculum developers often find themselves at a loss as to where to start and what to look into. This article is not intended to be a definitive guide (that could fill several books!) but rather a general outline and starting point for investigating and gaining a broader understanding of the options currently available and how they can be used.

How does this guide fit in with elearning projects?

Organised learning involves some kind of explicit or implicit learning contract, i.e. an agreement and alignment between learners, teachers, and support staff of shared objectives and goals. Here’s a quick overview of some of the main aspects* of developing an elearning contract:

  • Diagnosing learners’ needs
  • Specifying learning objectives
  • Specifying learning resources and strategies
  • Specifying evidence of accomplishment
  • Specifying how the evidence will be validated
  • Reviewing the learning contract
  • Carrying out the learning contract
  • Evaluating learning

Adapted from: Knowles, M. S. Self-Directed Learning. A guide for learners and teachers, 1975, Englewood Cliffs: Prentice Hall/Cambridge.

This guide is limited to a broad overview of one particular aspect of investigating, planning, and developing an elearning project: specifying learning resources and strategies. In order to limit the scope, it does not take into account strategies such as blended learning, i.e. combined face-to-face and online learning. Blended learning in particular makes many more options available, so curriculum developers, teachers, and learners can have the best (or worst?) of both worlds. This guide also assumes that your project will be centred around a learning management system (LMS) that supports some or all of the features and tools listed.

*Please note that this list is by no means definitive or a set sequence of stages. Developing elearning is often a complex, messy, recursive, fluid activity that frequently revisits and re-evaluates the various aspects in the light of unforeseen discoveries and developments.

What are the options?

This guide is by no means exhaustive and lists only the more commonly researched and used resources, activities, and strategies. There are many more options, including many that are specific to particular topics and subject areas. I’m frequently surprised by the number of qualified, experienced teachers, instructional designers, and curriculum developers working in elearning who appear to be unaware of, or at least uninitiated in, many of the options available to them. Hopefully, this guide can prompt further investigation into and discovery of these options.


Resources

  • Text documents: HTML web pages, plain text, Microsoft Office DOC, OpenOffice ODT, PDF, and ebook formats, e.g. EPUB (a free and open ebook standard), AZW, and MOBI.
  • Images: tables, charts, diagrams, infographics, illustrations, photos, etc.
  • Audio recordings: radio programmes, podcasts, lecture recordings, interviews, self-speech recordings, i.e. listening back to yourself talking your way through an activity or problem, etc.
  • Video recordings: similar to audio recording but also including presentations, visual documentaries, etc.
  • Animations: animated illustrations, animated 2D and 3D models, interactive models, etc.
  • Slide show presentations: PowerPoint, Adobe Captivate, Raptivity, SlideShare, Prezi, etc.
  • 3rd party websites, databases and repositories: external sources of information and media; Wikimedia Foundation, Creative Commons, OER, Google/Yahoo!/Bing Maps, etc.

Synchronous activities


Chat

Chat activities allow participants to have real-time, synchronous text discussions. Pure text discussions have some advantages over voice discussions: although they are generally slower and convey less information, they tend to provide a stronger focus on the content of what participants are saying and can encourage normally reticent learners to make more contributions. Additionally, since it is more difficult for individual learners to dominate the conversation, and more obvious when they try, participation tends to be more evenly distributed and inclusive. Chat sessions are also easier to analyse and assess than voice over internet protocol (VoIP) conversations since they’re already transcribed.

Chat services also allow learners to contact each other more spontaneously and informally to ask questions and/or ask for clarifications, and generally increase their engagement, social presence, and sense of community.

Popular examples: Skype chat, MSN Messenger, Facebook chat, etc. Almost all modern LMSs have chat activities available.

Web meetings

These are real time online virtual spaces that often include multi-way chat, voice over internet protocol (VoIP), audio, and video, shared whiteboards, file uploads/downloads, and slide show presentations. Some services allow participants to attend by traditional telephone for when internet access/bandwidth is an issue. Many web meeting services also offer the option to record sessions so that learners and teachers can review and refer to them at a later date.

Web meetings are an effective way for learners and teachers to increase social presence, get to know each other, build trust, and cultivate a stronger sense of belonging and community between participants.

If users require or would like to access web meetings on mobile devices, it’s essential to check that whichever service you choose provides a native app for it. For the foreseeable future, web browsers on mobile devices are unlikely to have sufficient capacity to reliably support the high demands of multiple participants in multi-way, multimedia communication over the web.

Web meeting software services run on media servers with high processing and bandwidth requirements, and are complex and require highly specialised skills to maintain. Most media servers are consumed as 3rd party web services from independent specialist providers, even by many of the larger media organisations, universities, colleges, and institutions.

Popular examples: BigBlueButton (free and open source), MeetingBurner, TokBox, WizIQ, Google Hangouts, Skype, and Blackboard Collaborate (formerly Elluminate).

Collaborative documents

Shared online text documents, databases, and spreadsheets that can be edited in real time by multiple participants simultaneously. Real time online co-construction of documents can provide a strong focal point to discussions and collaborative projects especially in Social Constructionist learning and teaching approaches, where the emphasis is on the process of creating a document rather than the finished document itself (process vs. product).

Popular examples: Etherpad (free and open source), Google Docs, MS SharePoint, etc. Here’s an example of integrating a collaborative document platform with an LMS: Etherpad and Moodle Integration

Asynchronous activities


Assignments

Similar to traditional college and university essay “drop boxes”, assignment activities enable teachers and assessors to grade and give comments and feedback on uploaded files and assignments created online and offline. Submissions can be documents, images, diagrams, concept maps, infographics, posters, learners’ blog posts, inline web pages, audio, and/or video recordings. Some assignment activities support peer assessment. An advantage of online assignment activities is that learners and teachers can always be sure that they’re looking at the latest version of a document and its comments, avoiding the confusion of trying to manage multiple versions of files from multiple learners via repositories or email (yes, some people do that!). They can also review earlier versions to see the progress of changes.

Also consider using forums, glossaries, databases, and wikis for collaborative assignments.


Databases

Databases enable participants to create, maintain, and search a bank of record entries. Many people think of databases as something like MS Excel spreadsheets (although spreadsheets and databases are quite different). They can be a useful tool for learning how to categorise and organise information, construct overviews, and thereby gain a broader understanding of a process, system, or subject area. Databases needn’t be limited to storing text; they can support multimedia too. Having databases online means that learners can collaborate in editing them, leading to greater discussion, reflection, analytical and critical thinking, and therefore deeper learning.

Popular examples: Open Office Base (free and open source), DHTMLX.com (free and open source), Microsoft Access, etc.

Feedback (surveys)

Feedback activities are for creating and conducting surveys to collect feedback from learners. High quality feedback can give teachers and curriculum developers invaluable information and opinions from learners on resource, curriculum, and course design, as well as on attitudes and relationships towards each other, teachers, and support staff. Feedback that is frequent, easy to administer and participate in, and anonymous when required is an effective way to hand real choice and control over to learners and make their learning experiences more democratic, inclusive, responsive, and engaging.


Forums

Forums allow participants to have asynchronous discussions. For many years, online discussion forums have been one of the main focal points of elearning, communities of practice, and communities of inquiry. They offer many of the benefits of face-to-face discussions and, in addition, give opportunities for different styles of discussion and interaction, as well as providing environments where normally reticent participants can contribute more and have a more influential voice. Because forums are asynchronous, they allow time for participants to reflect on their ideas, do further reading and research, and give more informed and considered responses. Some forums support peer assessment via rating systems.

Popular examples: BuddyPress.org (free and open source software), phpBB.org (free and open source software), Elgg.org (free and open source software), Slashdot.org, LinkedIn.com, Actionscript.org, and Facebook.com.


Glossaries

Glossaries enable participants to create and maintain lists of definitions, like a dictionary. Some glossaries support peer assessment via rating systems, as well as peer and teacher feedback, and hyperlinks can be automatically added to glossary entries whenever they are used in online text within the LMS. Learners can collaboratively build class glossaries, thereby demonstrating their understanding and mastery of learning objectives while they study, and continually use them as a reference resource for key terms and ideas. They can also update and refine their glossary entries as they deepen their experience and understanding.


Lessons

Lessons/presentations are mostly used for bringing together different types of activities into one session and/or creating branching scenarios***. In most cases, lessons amount to presentations of information, maybe with some practice, and maybe with quizzes or tests, i.e. the so-called “present-practice-produce” (PPP) approach to learning and teaching, appropriate for transmitting “useful to know” information. As an alternative or complement, it’s also worth considering reading texts, documentaries, and/or silent demonstrations with follow-up chat and/or forum discussions so that learners and teachers can get a clearer idea of what learners have understood and learned from the information presented.

Beware: There are many elearning “experts” and quiz software vendors who claim that including quizzes throughout presentations promotes deeper learning. They frequently fail to differentiate between quizzes during presentations and spaced repetition (a technique for memorising verbatim information). To my knowledge, there is no conclusive evidence to support these claims. A meta-study of research papers** on present-practice-produce elearning with and without quizzes concluded that there were no measurable differences in learning outcomes and that including quizzes only managed to needlessly take up more of learners’ time for the same gains.

**Source: U.S. Department of Education, Evidence-Based Practices in Online Learning – Review of Online Learning Studies (2009) (PDF)

***A note on branching scenarios: They were an early attempt at adaptive learning, i.e. changing the activities and resources presented to learners according to their responses to choices and questions. They are very difficult and labour intensive to design and set up, and have so far been shown to be of marginal benefit in comparison to learner-centred activities and decision making, e.g. reflective inquiry and reflective practice. Current research is looking into artificial intelligence for solutions, but we’re a long way off from anything broadly productive.


Polls

In a poll, a teacher or learner asks a question, specifies a choice of multiple responses, and encourages participants to vote on them. Polls are a quick and easy way to offer choices and gauge reactions to, and understanding of, learning resources and activities. Many forum software packages, web meeting services, and some learning management systems (LMSs) have polling activities built in or available as extensions.


Quizzes

Quizzes allow the teacher to design and set tests and exams, which can be automatically marked, with feedback and/or correct answers shown. Quizzes can support audio, video, and animations, and some interactive features such as drag-and-drop matching, order sequencing, and identifying points and areas on images. Native learning management system (LMS) online quizzes have mostly taken over from earlier SCORM-based assessment and testing. They are usually easier to create, organise, and maintain, are more flexible, support more features, are easier to make accessible (for Section 508 compliance or similar accessibility legislation), and are more secure; with SCORM, for example, the answers to quizzes are sent to the learners’ web browser cache, where “tech-savvy” learners can access them.

SCORM packages

SCORM packages are usually authored by instructional designers with rapid elearning integrated development environments (IDEs), e.g. Adobe Captivate, Raptivity, and Articulate, among many others. These IDEs present an easy entry point into elearning design and development and offer novice instructional designers with very little technical know-how a shallower learning curve to producing learning resources and activities. SCORM packages were previously used to present content and give quizzes but have since been superseded by open-format resources and tools that are easier to create, edit, and maintain, and that most modern LMSs support, e.g. presentations, lessons, and quizzes. However, they are still widely used in military organisations (e.g. the US Pentagon is a huge “cash cow” for SCORM-based elearning products and services) and in corporations for things like basic health and safety compliance training and training to use software, since they are much cheaper than providing tutored or supervised training.

However, rapid elearning IDEs like Adobe Captivate and Techsmith Camtasia do have legitimate and productive uses, for example rapid prototyping of ideas for learning interactions, quick “How to…” guides for teacher and learner technical support, and silent demonstrations.

Also see: Cheating in SCORM


Surveys

Surveys are for gathering data from students to help teachers and curriculum developers learn about classes, resources, and strategies, and reflect on their own teaching. Appropriately designed surveys can also encourage reflective thinking and help to further develop learners’ analytical and critical thinking skills.

Popular examples: LimeSurvey (free and open source), SurveyMonkey.


Wikis

A wiki is a collection of web pages that participants can add to or edit; a kind of collaborative encyclopedia. Common activities include co-creating documentation, collaboratively constructing narratives and stories, and categorising, ordering, sorting, and organising information. Most wiki software keeps a record of changes, who made them, and when, making wikis useful tools for assessing contributions and collaboration between learners.

Popular examples: Wikipedia.org (MediaWiki, which is free and open source).

Caveats and common issues

Different learners will more than likely have different knowledge, experiences, and abilities, and many will be unfamiliar with some current elearning activities on web platforms. Which of the available options you choose will depend on learners’ and teachers’ needs, prior knowledge, experiences, and abilities.

Despite what many people believe, we tend to be very poor at multi-tasking (only about 2% of people can multi-task efficiently) and we need to focus on one activity (frame of attention) at a time. In particular, learners and teachers frequently report that they sometimes feel overwhelmed by the skills and knowledge they have to learn in order to successfully complete learning activities. It’s possible to overload even the best and brightest of learners by asking them to learn too many things at once. There are three main areas:

  • Tools: Do learners (and teachers!) already know how the tools work and how to use them? Can they easily perform all the actions the learning activity requires of them? e.g. navigate, create, save, edit, submit, download, upload, link to, recover forgotten passwords, etc.
  • Rubrics: The fundamental design of the learning activities. What do learners have to do? How complex are the activities, what are “the rules”, and how long will it take to learn them?
  • Learning objectives: The skill(s) and/or knowledge they are supposed to acquire and/or develop, i.e. the syllabus.

To avoid “cognitive overload” and demotivating learners as soon as they start an activity, it’s important to consider just how much an activity is asking learners to do at once in relation to their existing knowledge, experience, and abilities. Ideally, we’d like to spend as much time as possible on learning objectives and as little time as possible on learning to use tools and understanding rubrics. However, some activities can offer significant learning opportunities that make them worth the time and effort. In such cases, we need to reduce the cognitive load from the learning objectives while learners focus on learning how to use the tools and/or what they have to do (the rubrics); so-called introductory or user interface training activities.

What’s next?

Now that we’ve established a broader overview of some of the options available for developing resources and learning strategies, we have a starting point for further investigation. There are many more specific and comprehensive books and guides available, as well as large and growing bodies of research into online learning and teaching approaches, methods, and strategies.

However, and I can’t stress this enough, there is no substitute for hands-on experience, experimentation, and “learning by doing.” Trying out elearning tools and strategies with learners, watching activities unfold in different contexts, and getting honest, direct feedback from learners and teachers is invaluable. It also gives a better understanding of research papers, providing much needed background procedural knowledge to their usually abstract, declarative generalisations.

A word of caution

Beware of books, guides, and gurus who say things like “This is how it’s done.” or “If you do X, Y will happen.” People are complex and unpredictable, and it’s difficult to say how they’ll react to or behave in a given activity. More reputable researchers report their findings along the lines of, “I did this with these particular learners, here’s the context and their backgrounds, and here’s the data I collected and my interpretation of what unfolded.” Above all, be prepared to deal with uncertainty, ambiguity, and mixed results. As with all learning and teaching activities in any medium, it takes time, insight, discipline, patience, and an understanding of complex concepts and interactions to get to grips with elearning.

Update on the free online interactive c-test generator

In January of this year (2013) I published an article on my C-Test Generator, which I made freely available for everyone to use to generate their own c-tests quickly and easily. Since then, it has occurred to me that it would also be useful to be able to generate text versions of c-tests that can be printed and photocopied, for those who don’t have computers readily available or when teachers would like to create specific language proficiency tests for their learners.

What is it and what does it do?

Please see the previous article, Free online interactive c-test generator, for more information about what c-tests are, how they are an improvement on cloze tests, and what the C-Test Generator app does.

What has changed?

The c-test generator app, in addition to creating an interactive online c-test, now generates an additional plain text version. Clicking on the “Copy” button automatically copies the text to your operating system’s clipboard. You can then paste it into a text document such as OpenOffice/LibreOffice Writer or MS Word. If you adjust the kerning (the space between letters), you’ll be able to see one underscore for each deleted letter, which looks more like betw_ _ _ than betw___. In OpenOffice/LibreOffice:

  1. Highlight the text that you want to adjust the kerning on
  2. Go to Format > Character… > Position
  3. Under “Spacing”, select “Expanded” from the dropdown list
  4. In the “by” section increase the pts to widen the character spacing (You should see a preview). About 2.0 pts should be enough.

The process will be similar in MS Word; check its help and documentation for details. Please note that the same C-Test Generator app is embedded in this page and in the previous C-Test Generator article page.
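For readers curious about how such a plain-text c-test could be produced, here’s a minimal Python sketch of the general c-test rule: delete the second half of every second word and replace each deleted letter with an underscore. This is an illustration of the principle only, under my own simplifying assumptions; it is not the app’s actual implementation, and real c-tests typically also leave the opening sentence intact:

```python
def c_test(text, start=1):
    """Turn plain text into a simple c-test: gap every second word by
    deleting its second half and writing one underscore per deleted
    letter. A simplified sketch, not the C-Test Generator's own code."""
    words = text.split()
    out = []
    for i, word in enumerate(words):
        # Gap every second word from `start`, skipping one-letter words
        if i >= start and (i - start) % 2 == 0 and len(word) > 1:
            keep = (len(word) + 1) // 2  # keep the first half (rounded up)
            out.append(word[:keep] + "_" * (len(word) - keep))
        else:
            out.append(word)
    return " ".join(out)

print(c_test("The space between the letters makes gaps easier to count"))
```

Note that the word “between” comes out as betw___, i.e. the first half kept and three underscores for the three deleted letters, exactly the kind of gap described above.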

[swfobj src="http://blog.matbury.com/wp-content/uploads/2013/06/blog_c_test.swf" width="600" height="600" allowfullscreen="true"]

Click here for direct link to C-Test Generator app for full browser window.

There’s also a licensable version of the C-Test Generator app that integrates with Moodle and saves learners’ grades and c-test texts in Moodle’s grade book.

“Unshuffled” option now available on MILAs

I’ve added a new option to some of my Multimedia Interactive Learning Applications (MILAs). Teachers, curriculum developers, and course content developers can now set MILAs to generate learning activities in sequential order, in other words, unshuffled.

Why generate unshuffled learning activities?

By default, MILAs shuffle the order that items appear in to make them less predictable and ensure that learners don’t rely on non-linguistic cues to find the correct responses, i.e. the answers and distractors are rarely in the same positions or order twice and so learners’ only option is to look and/or listen for linguistic cues. In the same way, it’s usually a good idea to shuffle a deck of cards regularly in a card game to make the order of the cards less predictable and the game more interesting.
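Conceptually, the change simply makes the shuffle step optional. Here’s a hypothetical Python sketch of the idea (the MILAs themselves are Flash applications, so this illustrates the design rather than the actual code); calling with `shuffle=False` corresponds to passing in the “shuffle = false” parameter described below:

```python
import random

def prepare_items(items, shuffle=True):
    """Return activity items either shuffled (the default) or in the
    sequential order they appear in the content file.
    Hypothetical illustration only, not the MILAs' actual code."""
    items = list(items)          # work on a copy; the source order stays intact
    if shuffle:
        random.shuffle(items)    # in-place Fisher-Yates shuffle
    return items

daily_routine = ["wake up", "have breakfast", "go to work", "have lunch"]
print(prepare_items(daily_routine, shuffle=False))  # sequential order preserved
```

Defaulting to shuffled means existing learning interactions behave exactly as before; only interactions that explicitly opt out are presented in sequence.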

However, in some instances it can be beneficial to learners if we present learning activities in sequential order. This allows several possibilities I have thought of and probably many more that I haven’t. For example, if English as a Second, Foreign, or International Language learners are acquiring language for daily routines, it is more helpful if the items are presented in sequence at first, thereby preserving the narrative nature of the language and enabling learners to make sense of it (understand it) more easily, making language acquisition more probable.

Another example would be using storytelling/narratives to help convey meaning by presenting background and contextual story lines to teach salient points about more abstract concepts, ideas, and theories. A narrative could convey background information and context, the HOWs and the WHYs, and the process of discovery that a scientist or thinker went through before they arrived at their “eureka” moment, otherwise known as a case study.

Yet another possible use would be to present incorrect or incomplete narratives based around some idea, topic or event as an introductory stimulus for a broader narrative inquiry based project. Learners would then have to discover what is wrong or missing and construct their own correct or complete narratives. Freer, more expansive narrative inquiry tasks could then follow.

…or it may be something as simple as matching pictures to the lines of a song.

I think (speculatively) such techniques can help learning to be more interesting, engaging and enjoyable for learners.

Which MILAs does this apply to?

The following MILAs now have the unshuffled option:

How do I use the unshuffled option?

The default setting is shuffled, so updating your copy of these MILAs will have no effect on existing learning interactions. Simply passing in the “shuffle = false” parameter sets any learning interaction to generate the activities in the order in which they appear in the Learning Content Cartridge SMIL XML file. In Moodle, it works like this:

  1. Edit or create a new instance of the SWF Activity Module
  2. Select the MILA and Learning Content Cartridge in the usual way and set any other parameters as necessary
  3. In the FlashVars Learning Interaction Data section put: Name: shuffle Value: false
  4. Save and preview
  5. That’s it!


Live demos of all the MILAs shown on this site are available to try out. Click on the matbury.com logo to go to the MILAs demo course on Matt’s R&D Moodle. Look for the SWF Activity Module instances called “Numbers 0 to 120, unshuffled”.


Related links

And finally

I’d be very interested to hear of any other possible uses of unshuffled MILA learning interactions. What could you use them for?

Free talking picture dictionary for Moodle

I’ve just finished putting together a Learning Content Cartridge which I’m releasing for free under a Creative Commons Attribution-ShareAlike 3.0 Unported (CC BY-SA 3.0) licence for anyone to use, edit, and redistribute as they please.

There are 28 entries (33 including plurals), and each one has a corresponding image and an MP3 audio recording. The package is designed for teachers and curriculum developers to test out my CALL software (MILAs) before deciding if it’s suitable for their needs.

What’s more, I’ve used the images and MP3 files to create a talking picture dictionary for the Moodle 1.9 Glossary activity module. There’s an entry for each of the 28 Common objects items, which displays a short definition, the corresponding image, and the corresponding MP3 file, embedded using Moodle’s standard Flash audio player (make sure you have the Moodle MP3 filter turned on in Admin settings for this to work). You can try it out on the Multimedia Interactive Learning Applications (MILAs) demo course on my R&D Moodle (use guest access). See section #1, titled “MILA resource cartridge repurposed for a picture dictionary – Common objects”. I’ve packaged the exported XML glossary data, images, and audio into a downloadable zip file.

Auto linking the dictionary

One of the features of the Moodle Glossary module is that it can be set to automatically link to entries wherever it finds them in other text on Moodle pages (not in PDFs or links to external pages). For example, if the Glossary module finds the word “dictionary” in a paragraph of text, it’ll automatically highlight it and place a link on it. When a learner clicks on the highlighted word, a pop-up window appears with the corresponding Glossary entry for “dictionary”.

Please note: Some of the text formatting in the exported talking picture dictionary is a bit messy but can easily be corrected in Moodle with the Glossary editor.



How do I install it?

  1. Unzip the downloaded package and look for the file called “exported_glossary_data_to_import_to_your_moodle.xml”.
  2. Upload the zip package to your Moodle course files directory.
  3. Unzip the package and make sure the directory structure is as follows:
        • commonobjects/mp3/etc…
        • commonobjects/pix/etc…
        • commonobjects/xml/etc…
  4. Create a new Glossary module instance on your course page.
  5. In the Glossary module instance, at the top right of the screen, click on “Import entries”.
  6. Browse to and select the “exported_glossary_data_to_import_to_your_moodle.xml” file on your computer and click on “Save changes”.
  7. That’s it. You’ve just created a talking picture dictionary on your Moodle course. Feel free to edit and modify it in any way you please.

There are free apps to try out in your Moodle on the SWF Activity Module project downloads page.

Why use a Learning Management System for elearning?

Download and read in ePub format

Many teachers and organisations are now experimenting with collections of free online platforms and systems from commercial service providers such as Google and Yahoo! to use in their teaching practices. Some of the more popular uses include encouraging learners to submit course work via email, and using free online discussion forums, wikis, microblogging, and social networking sites. If all this is available for free, then why bother using a Learning Management System (LMS)?

One place for everything

In their first experiments with elearning, many teachers tend to build a collection of free commercial web services that are “one trick applications”, for example, Facebook or Edmodo for social networking, email or Google Docs for submitting written assignments, drop boxes or file sharing services for media files, Yahoo! Groups for discussion forums, and free test preparation sites. Using all these different services requires learners and teachers to create a number of user accounts, i.e. one for each service, and manage them all accordingly. Admins must keep track of teachers’ and learners’ online activities, and teachers must keep track of their learners, right across the assortment of services and sites. From an organisational perspective this is clearly not an easy arrangement to administer, and it is almost impossible to do on a medium or large scale.

Presumably, any organisation taking this approach with young learners would also have to get parents’ signatures on Children’s Online Privacy Protection Act (COPPA) agreements, or their equivalents in their respective countries, for each site or service used. Some governments and educational authorities are also prohibiting the use of some web sites and services by teachers and learners such as Facebook.

A well designed Learning Management System will have all the tools and services you need to create and manage user accounts, courses, social networks, news, announcements and messages, discussion groups, assignment and file submission systems, quizzes, tests and exams, presentations and lessons, grading and feedback, etc. in one convenient place, with one user account for each participant.

More coherently organised courses

An LMS allows administration, curriculum developers, course content authors and teachers to create courses with activities, assessment, etc., that are coherently organised and easy to follow. Courses and course content can be quickly and easily updated and adjusted and as such are a more responsive and adaptive approach to developing effective elearning programmes. Teachers and learners can follow a clear, concise timeline of activities and projects, look ahead to see what’s coming up and review past work for critical reflection, all in one place.

More appropriate types of roles for users

Most social networking sites are designed to optimise corporate marketing opportunities, gathering users’ personal information and encouraging disclosure of personal and private details, while most Content Management Systems (CMSs) are designed for e-commerce and/or web publishing contexts, and their user accounts tend to reflect this. eLearning, and learning and teaching in general, have different requirements that can’t normally be met by most CMSs or social networking sites. An LMS gives finer, more specific control over what groups of users and individual users can and can’t do on the system, from the administrative level right down to learners and guests. For example, we may want to enable some teachers and staff, but not others, to edit activities and resources, or we may want to make some learners responsible as moderators and/or helpers for some discussion groups. Another frequently requested role allows parents to view their children’s grades and attendance. We may also want users’ roles to differ between courses, for example making teachers learners on professional development courses.

Additionally, in a learning environment it is desirable to monitor learners’ actions and activities in order for teachers, mentors and sometimes even peers to be able to give guidance and feedback. In this respect, most CMSs provide only the most basic user tracking, since guidance, mentoring and feedback are not seen as a high priority.

Record keeping and management

A single centralised record keeping system is easier to manage, analyse and understand than a collection of commercial web services and sites with varying degrees of administrative, teacher and user access and data transparency. A well designed LMS will enable admins and teachers to look up individual users or groups to see their activities and analyse learning outcomes: teachers can see their learners’ grades and learning outcomes, learners can see their own, and curriculum developers and course content authors can see the interactions and learning outcomes both in individual case studies and in aggregated analyses. Want to know how effective your resources, activities and courses are? Look up the data, learners’ activities, and the learning outcomes.

More coherent communication

Keeping channels of communication open is an essential part of running any organisation smoothly and responsively. An LMS allows admins and teachers to send messages or make announcements to groups of learners, teachers and individuals. Ongoing channels of communication are easier to find and respond to, and to review. Teachers can provide feedback in relation to particular assignments, activities and learning outcomes for individuals and groups. Teachers can communicate with each other, and learners can also be enabled to do this, in a variety of synchronous (e.g. chat, VoIP “internet telephony” and web conferencing) and asynchronous (e.g. private messaging, discussion forums and assignment feedback) modes. Planning, co-ordinating, and collaborating on activities suddenly becomes easier.

Assessment tools

One of the better known strengths of LMSs is that they can reduce a lot of the administrative work involved in assessment. Some types of summative assessment can be completely automated with self-marking quizzes, tests and exams, so learners get their results immediately and teachers only have to analyse them. There’s less marking to do and no need to enter the results into a database; it’s already done for you. An added bonus is that it can greatly reduce the amount of photocopying your organisation does. LMSs may also include sophisticated analytical tools to examine test results and assessments, looking for areas where the quality of activities, learning resources, curricula, and learning and teaching approaches could be improved. Additionally, a well designed LMS enables teachers to provide consistent, frequent formative assessment in the form of written or recorded feedback, exchanges of messages, VoIP sessions and learning reviews. Learners can build learning portfolios of compositions, projects and learner generated multimedia for more sophisticated assessments that better reflect real world abilities and practices.

Integration with 3rd party software and web services

For smaller organisations that can’t usually justify the expense of running their own dedicated web conferencing systems, many 3rd party web conferencing service providers offer plugins for the more widely used LMS’ so that the LMS can be used to manage user accounts and conferencing sessions along with access to subsequent recordings of the conferences. That means that, for a fraction of the cost, you can provide online classes from within your LMS. All the necessary co-ordination between the conferencing service and the LMS courses, groups, and learners and teachers can be taken care of simply, quickly and easily.


These are just a few examples of the benefits that LMSs can have for educational departments and organisations of any size. There are many more that won’t be immediately apparent until you start getting more deeply involved in running online communities of learning and teaching. What are you waiting for?

Using the C-Test generator for self testing in extensive reading programmes

There’s growing interest in extensive reading (ER), sustained silent reading (SSR), free voluntary reading (FVR), or reading for pleasure programmes for second language acquisition. They can be highly effective and have been shown to significantly improve the fluency, complexity and accuracy of learners’ language use in authentic, real world tasks. If you’re considering introducing extensive reading programmes in your class(es) at your school, academy or institution, read on…

Why extensive reading programmes?

In short, extensive reading:

  • allows learners to meet the language in its natural context and see how it works in extended discourse beyond the language met in textbooks.
  • builds vocabulary. When learners read a lot, they meet thousands of words and lexical (word) patterns time and time again which helps them master them and predict what vocabulary and grammar may come next.
  • helps learners to build reading speed and reading fluency which allows them to process the language more automatically, leaving space in memory for other things.
  • builds confidence, motivation, enjoyment and a love of reading which makes learners more effective language users. It also helps lower any anxieties about language learning the learners may have.
  • allows learners to read or listen to a lot of English at or about their own ability level so they can develop good reading and listening habits.
  • helps learners get a sense of how grammatical patterns work in context. Textbooks and other study materials introduce language patterns but typically they don’t appear often enough in a variety of contexts to facilitate a deep understanding of how the patterns work.

Source: ER Foundation’s Guide to Extensive Reading handbook (PDF).

There has been a lot of research into extensive reading programmes for both first and second language acquisition in children and adults, which has given promising results. However, relatively little attention has been paid to the results by TESL and TEFL practitioners. A notable study was Dr. Patsy Lightbown’s New Brunswick Experiment* in which school children in French Canada were given 150 minutes (one 30 minute session per day) of extensive reading in English per week in place of traditional classroom instruction. It appears to indicate that learners can indeed “do it themselves” and, in some areas of language use, significantly outperform classroom instructed learners, although the paper calls for further investigation.

The Extensive Reading Foundation provides help, advice, research evidence and reviews of graded readers and other books for anyone who’s interested in trying out extensive reading with their learners. A good place to start is with the ER Foundation’s Guide to Extensive Reading handbook (PDF).

*Lightbown, P. Can They Do It Themselves? A Comprehension Based ESL Course for Young Children, Comprehension based Second Language Teaching, by Courchêne, Glidden, St. John and Thérien (eds.), University of Ottawa Press, 1992.

Why use the C-Test MILA in extensive reading programmes?

It’s important and motivating for learners to be able to assess how much they are learning from their activities, i.e. how much progress they are making. Some learners (and teachers) can be sceptical about the value of extensive reading and reading for pleasure without a dictionary, grammar exercises or reading comprehension questions. The following is an attempt to address this issue constructively by allowing learners to do assessments specifically related to the texts that they are reading without changing the basic nature/procedure of extensive reading activities and requiring little, if any, extra input from teachers. An added benefit is that learners and teachers can easily monitor learners’ reading and overall progress from the c-test results in Moodle’s grade book.

How does it work?

Here’s one possible scenario for using the C-Test MILA for extensive reading. Learners:

  1. pick 3 paragraphs randomly from the text/book
  2. generate 3 c-tests by copying and pasting the 3 paragraphs into 3 instances of the C-Test MILA
  3. attempt the 3 generated c-tests; the c-test results and paragraphs of text are stored in Moodle’s grade book
  4. read the entire text/book over a period of hours/days/weeks
  5. repeat steps 1-3 with different randomly selected paragraphs from the text/book
  6. compare their results from before reading and after reading; Has their performance improved?
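The c-test construction itself follows a fixed, mechanical rule: starting from the second word, the second half of every second word is deleted (conventional c-tests also leave the first and last sentences intact, a refinement this sketch skips). A minimal sketch in Python; the function name is mine, not part of the C-Test MILA:

```python
import re

def make_c_test(paragraph, start=1, step=2):
    """Blank out the second half of every `step`-th word, starting at `start`.

    The first half of each selected word (rounded up) is kept; the rest is
    replaced with one underscore per deleted letter.
    """
    words = paragraph.split()
    for i in range(start, len(words), step):
        match = re.match(r"([A-Za-z]+)(.*)", words[i])
        if not match or len(match.group(1)) < 2:
            continue  # skip numbers, punctuation-only tokens and 1-letter words
        letters, trailing = match.groups()
        keep = (len(letters) + 1) // 2
        words[i] = letters[:keep] + "_" * (len(letters) - keep) + trailing
    return " ".join(words)

print(make_c_test("The cat sat on the mat beside the door."))
# → The ca_ sat o_ the ma_ beside th_ door.
```

Learners then restore the mutilated words from context, which is why their scores rise once they have read and internalised the text’s vocabulary and patterns.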

Additionally, if a learner scores particularly poorly on their initial attempts on the generated c-tests before they read, they may be well advised to postpone reading that particular text until they’re “ready” for it and select an easier, more suitable text in the meantime. Learners can quickly generate as many c-tests as they need until they find a book or text that is at a suitable level for them and that they’re interested in reading.

If your school has a library of graded readers, scanning and copying paragraphs from each book to set up a selection of ready made c-tests for learners to take may well fall within fair use or fair dealing provisions in many countries, but check the copyright law that applies in yours.

How to deploy the C-Test generator MILA

Example: How to set up the C-Test generator in Moodle. On the desired course page:

  • Create a new instance of the SWF Activity Module
  • Write the title, description, etc.
  • Select the C-Test MILA
  • In FlashVars Learning Interaction Data > Name, write “input”
  • In FlashVars Learning Interaction Data > Value, write “true”


  • Click “Save and display” and test it with a suitable paragraph or two of text.
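Outside Moodle, e.g. in a standalone web page, the same Name/Value pairs can be passed to the SWF via Flash’s standard FlashVars mechanism. A sketch of a bare-bones embed (the file name is hypothetical):

```html
<object type="application/x-shockwave-flash" data="ctest.swf" width="800" height="600">
  <!-- equivalent of the Name = “input”, Value = “true” pair set in the SWF Activity Module -->
  <param name="FlashVars" value="input=true" />
</object>
```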

C-Test MILA Demo

There’s a demo set up on the Multimedia Interactive Learning Applications course on my R&D Moodle (login as a guest). Find a text to test yourself with (perhaps from Project Gutenberg?) and copy and paste it into the C-Test user input reading level test generator in the C-Test topic section (#11). There’s also some ready made c-tests there to try out.

Listen and select Multimedia Interactive Learning Application

What is the Listen and select Multimedia Interactive Learning Application (MILA)?


It’s software for learning English as a Second or Foreign Language (ESL/EFL) written in Flash. When deployed in Moodle 1.9 using the SWF Activity Module, it has access to Moodle’s grade book and can save learners’ grades as well as other useful information. It can also be deployed in other learning management systems (LMS), in standalone web pages, and in content management systems (CMS) if you have some basic knowledge of HTML.

What does it do?

It loads multimedia learning content cartridges and uses the images and audio in them to generate simple, intuitive, interactive game-like activities. From a learner’s perspective, they:

  1. Look at the four images
  2. Listen to the audio recording
  3. Select the image which most closely corresponds to the audio
  4. If the selection is incorrect, try again until they find the correct image
  5. Go on to the next four images
  6. Repeat until all the images and audio have been matched

The Listen and select MILA automatically generates the activity and “shuffles” the images into random order each time it’s played. That way, each attempt at the activity is slightly different so learners can’t memorise the order they come in or remember any of the answers by their position. In other words, the only way to successfully select the correct image is to match it to the corresponding audio.
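The shuffling behaviour is simple to reason about. A minimal sketch in Python (the MILA itself is closed-source Flash; names and file names here are illustrative only): shuffle the image/audio pairs, then deal them out four at a time.

```python
import random

def deal_rounds(pairs, group_size=4):
    """Shuffle image/audio pairs and deal them out `group_size` at a time,
    so every attempt presents the items in a fresh order."""
    pairs = list(pairs)
    random.shuffle(pairs)
    return [pairs[i:i + group_size] for i in range(0, len(pairs), group_size)]

# A hypothetical cartridge: eight image/audio pairs
cartridge = [("cat.jpg", "cat.mp3"), ("dog.jpg", "dog.mp3"),
             ("car.jpg", "car.mp3"), ("cup.jpg", "cup.mp3"),
             ("hat.jpg", "hat.mp3"), ("pen.jpg", "pen.mp3"),
             ("key.jpg", "key.mp3"), ("bus.jpg", "bus.mp3")]

for round_items in deal_rounds(cartridge):
    print([image for image, _audio in round_items])  # two rounds of four images
```

Because positions change on every attempt, position-based memorisation is ruled out and only genuine audio-image matching succeeds.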


Images and audio from the Common Objects multimedia learning content cartridge.

Since the multimedia learning content cartridges that it loads are external and interchangeable, the Listen and select MILA can be reused to generate any number of activities with different cartridges. For example, an elementary English course (A1) may include vocabulary related to the alphabet, numbers, phone numbers, jobs, flats and houses, food, times and dates, and common verb expressions. You can deploy an instance of the Listen and select MILA for each of the vocabulary sets or even break the vocabulary sets up into smaller, “bite size” activities and spread them out over the course of the week.

The cartridges themselves are open format, i.e. separate, reusable media files, which means they can be edited to create new combinations of images and audio, and with a little knowledge of SMIL XML (easy to learn) and some basic audio and image editing skills, you can create new cartridges of your own. Additionally, the images and audio in the cartridges can be used in other pages and activities in your CMS or LMS, for example in Moodle’s Glossary activity module to create picture and audio dictionaries.
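SMIL is a small W3C XML vocabulary for grouping and timing media files. I haven’t seen the cartridge specification itself, so the following is only a guess at how a cartridge might pair images with audio (element choice and file names are illustrative):

```xml
<smil>
  <body>
    <!-- each <par> groups one image with its audio recording -->
    <par>
      <img src="images/cat.jpg" alt="cat" />
      <audio src="mp3/cat.mp3" />
    </par>
    <par>
      <img src="images/dog.jpg" alt="dog" />
      <audio src="mp3/dog.mp3" />
    </par>
  </body>
</smil>
```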

When deployed in Moodle, the Listen and select MILA records the number of correct selections as a percentage of the total, the time taken to complete the activity, and the total number of images in the activity in Moodle’s grade book. The SWF Activity Module generates progress charts from Moodle’s grade book entries that show graphs of previous attempts over time, which can encourage learners to develop mastery of the activities.

Where can I try it out?

You can see a working demo of it deployed in a course on Matt’s Moodle. Log in as a guest and go to section 5 of the MILAs course. Guest users cannot save their grades in Moodle’s grade book. Please contact Matt if you’d like a free and confidential learner account on the course so you can see how the progress charts work.

How can I get it for my school’s or organisation’s site?

You can buy a licensed copy of the Listen and select MILA by contacting Matt directly. All MILAs are licensed under a generous 30 year agreement which allows you to deploy as many instances as you like from the specified domain or website as well as any local testing servers or intranets you or your organisation might use. They also come with a free Common Objects starter multimedia learning content cartridge to get you started. Licences include updates, enhancements and bug fixes. Full details of the licence agreement are available on the MILAs demonstration course.

Using the Moodle Chat activity module for learning

The Moodle chat activity module is a simple, basic chatroom platform specifically designed for learning and closely integrated with Moodle courses. Learners can chat; teachers can monitor and participate. Chats can be whole class or in small groups. Moodle saves the transcripts of all chat sessions, which means that teachers and learners can copy transcripts into other activities so that learners can analyse and reflect on their performance, correct errors and devise learning strategies to improve future performances.

Learners can:

  • Role play dialogues
  • Ask for and give informal help
  • Collaborate on problem solving
  • Collaborate on projects
  • Do information gap and jigsaw activities
  • Play guessing games
  • Study transcripts for error correction, communication content analysis, etc.

Using chat

Why use Moodle chat when there are so many free online services that offer internet telephony, video and file transfer capabilities? Simply because Moodle chat rooms give you control over who can enter and interact with your learners: in order to participate in a particular chatroom, participants must be enrolled on that course. More importantly, you and your learners have access to chat transcripts, and since each chat is integrated with a course, it’s easier to keep track of learners’ participation for support, assessment, grading, etc.

Learner support and study groups

Some students find that online classes and courses can feel isolating. Having a strong social presence within a learning community is vital for cultivating learner engagement and reinforcing their cognitive presence. Additionally, as final exams and deadlines approach, learners’ anxiety and stress levels rise and feelings of isolation can increase. Chat sessions provide much needed social contact and support for learners who are stressed and/or in difficulty. Learners can get the support they need from teachers and peers, and chat transcripts are a useful record of others’ help and advice, something that we don’t normally get from face to face contact.

Creating study groups

An important factor in productive chat sessions is limiting the number of participants. Some researchers of group behaviour claim that a coherent interactive social group has at most around five members at any one time, and that larger groups become difficult to manage. With Moodle chatrooms it’s easy to create a single chat activity that automatically breaks large classes up into smaller assigned groups, each with its own sub-chatroom. A chatroom can be entirely separate, or learners can be allowed to see, but not participate in, other groups’ chat.

Initiating new groups with review assignments

For learners who haven’t worked together before, it’s a good idea to get them started by assigning each one a review topic to prepare ahead of the chat session. You could assign the same topic for all learners in a group, e.g. for critical analysis, or you may want to assign different areas of a topic to create information gap activities.

The need for mediators

In online chat sessions, it’s easy for learners to “lurk”, i.e. watch without participating. This is where transcripts can be useful for assessment and for coaching learners towards more productive and collaborative participation, but someone also needs to be responsible for encouraging everyone to participate and drawing lurkers into the conversation. Teachers can monitor chatrooms, or individual learners can be assigned the role of mediator.

Encouraging learners to formulate questions

While it’s often necessary for teachers to provide initial questions to get conversations going, chat sessions can be more engaging and productive if learners are encouraged to think analytically and critically and formulate their own questions. This keeps chats from turning into teacher led question and answer sessions and encourages greater cognitive engagement in learners.

Using chat for tutorial review

Chatrooms are an ideal medium for tutorial reviews with individual learners. Teachers can encourage learners to be more responsible and reflective by discussing their submitted assignments and projects. Learners can review the transcript afterwards for further reflection and maybe even resubmission.

Using Moodle for written corrective feedback

Learning management systems such as Moodle have many advantages over classroom and email based writing programmes. One such area is corrective feedback, and in this article I’m going to explore some of the possibilities for providing written corrective feedback for EFL and ESL learners.

What are the advantages?

  • Firstly, learners can do their writing assignments and submit them online (in the case of blended learning, before their next English class), thereby speeding up the submission-to-feedback process.
  • Unlike email, Moodle stores a single copy of the submitted work online. Only one copy in one place to keep track of. Simple. Easy.
  • Moodle also controls who can view and who can edit learners’ work and when. So learners can submit work, teachers get email notifications and go to Moodle to view the work and can make comments, recommendations and corrections on copies of the work that are stored alongside the original. Learners may be able to resubmit depending on how the writing assignment is set up.
  • Moodle’s Assignment module also facilitates peer assessed writing activities and automatically manages delegating learners’ work to their peers. Teachers and/or learners can define custom assessment rubrics and teachers can assess learners’ peer assessments. How cool is that?
  • Moodle has a central consolidated grade book for each course where teachers and, if permitted, learners can grade work and write comments.

Written corrective feedback options

There are a few options available to Moodle users:

  • Direct written corrective feedback
  • Indirect written corrective feedback
  • Metalinguistic written corrective feedback
  • Reformulation written corrective feedback

For a more in depth examination of these written corrective feedback types, Professor Rod Ellis gave a presentation which is available on YouTube.com. I’ve posted a video of the presentation and some accompanying notes in this article: Dr. Rod Ellis: TESOL Written Corrective Feedback. Right now, I’m going to focus on metalinguistic written corrective feedback because it offers the strongest advantages when implemented with Moodle. The following section quotes from my paraphrase of Dr. Ellis’ presentation:

Metalinguistic written corrective feedback

Metalinguistic written corrective feedback provides learners with some form of explicit comment about the nature of the errors they have made. This can be done in two ways:

Use of error codes

e.g. abbreviated labels for different kinds of errors placed over the location of the error in the text or in the margin: art = article, prep = preposition, sp = spelling, ww = wrong word, t = tense, etc. Overall, there is very limited evidence to show that using error codes helps learners to achieve greater accuracy over time, and it would also seem that they are no more effective than other types of written corrective feedback in assisting self-editing.

Metalinguistic explanations of errors

e.g. numbering errors and providing metalinguistic comments at the end of the text. This is less common than error codes since it’s time-consuming and calls for the teacher to be able to write clear and accurate explanations for a variety of errors. Sheen (2007)* compared direct and indirect metalinguistic written corrective feedback. Both were effective in increasing accuracy in the learners’ use of articles in subsequent writing completed immediately after the written corrective feedback treatment, but the metalinguistic written corrective feedback proved more effective than the direct written corrective feedback in the long term, i.e. in a new piece of writing completed two weeks after the treatment.

* The effect of focused written corrective feedback and language aptitude on ESL learners.

So, metalinguistic explanations of errors appear to be the most effective strategy but it’s time consuming and difficult to implement. This is where Moodle comes to the fore. We can make providing metalinguistic corrective feedback easier and less time-consuming by consolidating resources and making the metalinguistic explanations of errors freely available to learners and teachers. It’s a clever combination of error codes that link to the explanations.

A strategy using the Moodle Glossary module

The Moodle Glossary module allows teachers and learners to collaboratively create glossaries of terms which can include definitions and examples in text, images, audio and/or video. It also has a feature whereby Moodle will automatically insert links to glossary entries on any page in a Moodle course. For example, if we create a glossary entry with the term “lion” with photos and text, anywhere the word “lion” appears on that Moodle course will be automatically linked to the glossary term. When learners click on the link, the glossary term, definition and/or examples appear in a popup window.

For our purposes, we can create a glossary of errors commonly found in learners’ writing. We also use unique error codes as the glossary terms, e.g. cf-article = definite/indefinite article error, cf-count = countable/uncountable error, etc. Here’s the metalinguistic corrective feedback glossary summary:

The metalinguistic corrective feedback glossary is a list of common errors found in learners’ written work that is created and maintained by teachers. It works like this:

  1. Learners submit written work, e.g. a writing assignment.
  2. Teachers identify errors in learners’ submissions, e.g. “a” and “the” as definite and indefinite articles.
  3. Teachers find the corrective feedback entries in this glossary that best match learners’ errors. If there isn’t a best match glossary entry, the teacher creates one.
  4. Teachers copy the corrective feedback glossary entry’s short-code, e.g. cf-article, and paste it into the learner’s writing.
  5. The glossary module automatically creates a link from the short-code to the corrective feedback glossary entry.
  6. Teachers submit learners’ writing assignment feedback.
  7. Learners review their assignments and when they click on the corrective feedback short-codes, the respective corrective feedback glossary entry is displayed in a popup window.
  8. Learners can then correct their writing and resubmit.
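Because the short-code convention is purely mechanical, it’s easy to support with small tools. As an illustration only (the codes, explanations and function name here are mine, not part of Moodle), a sketch that checks a piece of feedback for short-codes that don’t yet have glossary entries, i.e. the gap step 3 asks teachers to fill:

```python
import re

# Illustrative short-codes and explanations; in Moodle these would be entries
# in a teacher-maintained Glossary that auto-links wherever a code appears.
cf_glossary = {
    "cf-article": "Definite/indefinite article error: check whether a/an or the fits here.",
    "cf-count": "Countable/uncountable error: check whether the noun takes a plural.",
    "cf-tense": "Tense error: check the time reference of the verb.",
}

def check_feedback(feedback):
    """Return any cf- short-codes in the feedback text that have no glossary
    entry yet, i.e. the entries a teacher still needs to create (step 3)."""
    codes = re.findall(r"cf-[a-z]+", feedback)
    return [code for code in codes if code not in cf_glossary]

print(check_feedback("Your essay has cf-article and cf-spelling errors."))
# → ['cf-spelling']
```

In practice the Glossary module’s auto-linking does the heavy lifting; the point is that a flat code-to-explanation mapping is all the structure the workflow needs.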

When learners are accustomed to this approach to receiving corrective feedback, the same process can be performed by learners themselves in peer review writing assignments, i.e. learners identify errors in each other’s writing to submit as peer review feedback.

Learners with higher levels of language proficiency can collaboratively create their own glossaries of common errors and find examples, correct them and write summaries of the rules.

Dr. Rod Ellis: TESOL Written Corrective Feedback

Professor Rod Ellis gave a presentation, available on YouTube.com, in which he focuses on written corrective feedback. I’ve written a basic summary below. Get a drink, a snack and your notebook, make yourself comfortable and enjoy an informative explanation of the current state of affairs regarding written corrective feedback: the types and strategies, what we know, what we don’t know and what we should do.

[jwplayer file="http://www.youtube.com/watch?v=wn35iHCljC8"]

Running time: 1:09:08

Why do we give written corrective feedback?

  1. To enable learners to revise their own writing, i.e. produce a better second draft
  2. To assist learners to acquire correct English

A Typology of corrective feedback types

  1. Strategies for providing corrective feedback
  2. How learners respond to the feedback

Written corrective feedback strategies

1. Direct written corrective feedback

Teachers provide the correct form, i.e. crossing out an unnecessary word, phrase or morpheme, inserting a missing word, phrase or morpheme, and writing the correct form above or near to the erroneous form (Ferris 2006)

  • Advantage – Provides learners with explicit guidance about how to correct their errors. Ferris and Roberts (2001) suggest direct written corrective feedback is probably better than indirect written corrective feedback with writers of low levels of language proficiency.
  • Disadvantage – It requires minimal processing on the part of the learner and thus, although it might help them to produce the correct form when they revise their writing, it may not contribute to long-term learning.

However, a recent study by Sheen (2007)* suggests that direct written corrective feedback can be effective in promoting acquisition of specific grammatical features (Low intermediate level learners).

* The effect of focused written corrective feedback and language aptitude on ESL learners.

2. Indirect written corrective feedback

Involves indicating that the learner has made an error but without actually correcting it. This can be done by underlining the errors or using cursors to show omissions in the learners’ text or by placing a cross in the margin next to the line containing the error. In effect, this involves deciding whether or not to show the precise location of the error, i.e. just indicate which line of text the error is on.


Advantages:

  • Caters to ‘guided learning and problem solving’ (Lalande 1982) and encourages learners to reflect on linguistic forms
  • Considered more likely to lead to long-term learning (Ferris and Roberts 2002)

Disadvantages:

  • Learners cannot correct if they do not know the correct form
  • Learners may be able to correct but will not be certain that they are correct

The results of studies that have investigated direct vs. indirect written corrective feedback are very mixed (cf. Lalande 1982 and Ferris and Roberts 2002). No study to date (2012) has compared the effects on accuracy in new pieces of writing.

3. Metalinguistic written corrective feedback

Provides learners with some form of explicit comment about the nature of the errors they have made.

  • Use of error codes, i.e. abbreviated labels for different kinds of errors placed over the location of the error in the text or in the margin. e.g. art = article, prep = preposition, sp = spelling, ww = wrong word, t = tense, etc.
  • Metalinguistic explanations of their errors, e.g. numbering errors and providing metalinguistic comments at the end of the text.

Informal poll: learners were in favour of metalinguistic explanations but teachers were not. Rod Ellis suggested that it had something to do with hard work on the teachers’ part.

Studies on use of metalinguistic error codes

  • Lalande (1982) – A group of learners of L2 German that received correction using error codes improved in accuracy in subsequent writing whereas a group receiving direct correction made more errors. However, the difference between them was not statistically significant.
  • Robb et al. (1986) – The use of error codes was no more effective than the three other types of written corrective feedback they investigated, i.e. direct feedback and two kinds of indirect feedback.
  • Ferris (2006) – Error codes helped learners to improve their accuracy over time in only two of the four categories of error she investigated, i.e. in total errors and verb errors but not in noun errors, article errors, lexical errors or sentence errors (e.g. word order errors).
  • Ferris and Roberts (2001) – Error codes helped learners to self-edit their writing but no more so than indirect feedback.

Overall then, there is very limited evidence to show that error codes help writers to achieve greater accuracy over time and it would also seem that they are no more effective than other types of written corrective feedback in assisting self-editing.

Studies on use of metalinguistic error explanations

This is less common than error codes. It’s time-consuming and calls for the teacher to be able to write clear and accurate explanations for a variety of errors.

Sheen (2007) compared direct and metalinguistic written corrective feedback. Both were effective in increasing accuracy in the learners’ use of articles in subsequent writing completed immediately after the written corrective feedback treatment, but the metalinguistic written corrective feedback proved more effective than the direct written corrective feedback in the long term, i.e. in a new piece of writing completed two weeks after the treatment.

Rod Ellis speculated that metalinguistic written corrective feedback forces learners to formulate some kind of rule about the particular grammatical feature, but that it takes time for them to be able to use this rule effectively. Direct feedback might have an immediate effect but learners soon forget the correction, whereas if they’ve learned the rule, maybe it’s going to have a longer-term effect on learners’ ability to avoid the errors.

4. Focus of the feedback

Focused vs. unfocused written corrective feedback

1. Advantages of focused written corrective feedback, i.e. correcting just one type of error

  • provides multiple corrections of the same error
  • is more likely to be attended to by learners
  • is more likely to help learners to develop understanding of the nature of the error

2. Advantage of unfocused written corrective feedback, i.e. correcting all or most of the errors

  • addresses a range of errors, so while it might not be as effective in assisting learners to acquire specific features as focused written corrective feedback in the short term, it may prove superior in the long term.

The distinction between focused and unfocused written corrective feedback applies to all of the previously discussed options. The bulk of written corrective feedback studies completed to date have investigated unfocused written corrective feedback. In Sheen (2007), focused written corrective feedback (targeting errors in the use of articles for first and second mention) proved effective in promoting more accurate use of this feature. However, to date (2012), there have been no studies comparing the relative effects of focused and unfocused written corrective feedback.

5. Electronic written corrective feedback

Extensive corpora of written English can be exploited to provide learners with assistance in their writing. Electronic resources give learners the means to appropriate the usage of more experienced writers.

An example of electronic written corrective feedback

“Mark My Words” (Milton 2006)

  1. An electronic store of approximately 100 recurrent lexico-grammatical and style errors that Milton found occurred frequently in the writing of Chinese learners
  2. A brief comment on each error, with links to resources showing the correct form
  3. Teachers use the electronic store to insert brief metalinguistic comments into learners’ texts
  4. Learners consult the electronic resources to compare their usage with that illustrated in the samples of language made available. This assists learners to self-correct.
  5. An error log is generated for each piece of writing, drawing learners’ attention to recurrent linguistic problems
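
The five steps above could be sketched roughly as follows. This is a minimal illustration under assumed names (`COMMENT_STORE`, `annotate`) with invented comments and links; it is not Milton’s actual implementation.

```python
from collections import Counter

# Hypothetical store of recurrent errors: code -> (brief comment, link to resources)
COMMENT_STORE = {
    "art":  ("Check article use (a/an/the).", "https://example.org/articles"),
    "prep": ("Check the preposition.",        "https://example.org/prepositions"),
    "t":    ("Check the verb tense.",         "https://example.org/tenses"),
}

def annotate(text: str, errors: list[tuple[int, str]]) -> tuple[str, Counter]:
    """Insert brief metalinguistic comments at the marked positions and
    tally the error codes into a per-text error log.

    `errors` is a list of (character offset, error code) pairs supplied
    by the teacher while marking the text.
    """
    log = Counter(code for _, code in errors)
    # Insert comments from right to left so earlier offsets stay valid.
    for offset, code in sorted(errors, reverse=True):
        comment, link = COMMENT_STORE[code]
        text = text[:offset] + f" [{code}: {comment} See {link}]" + text[offset:]
    return text, log

annotated, log = annotate("She go to school in the morning.",
                          [(6, "t")])  # "go" marked as a tense error
print(annotated)
print(log)  # the error log draws attention to recurrent problems
```

Learners would then follow the inserted links to compare their usage with attested examples and self-correct, as in step 4.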

There has been no research investigating whether this approach is effective, i.e. whether it has any actual effect on language acquisition as measured in new pieces of writing.


Advantages of electronic written corrective feedback:

  • Removes the need for the teacher to be the arbiter of what constitutes a correct form. Teachers’ intuitions about grammatical correctness are often fallible; arguably a usage-based approach is more reliable
  • Allows learners to locate the corrections that are most appropriate for their own textual intentions and encourages learner independence

6. Reformulation written corrective feedback

This involves native speakers rewriting learners’ texts in such a way as ‘to preserve as many of the writers’ ideas as possible, while expressing them in their own words so as to make the pieces sound native-like’ (Cohen 1989: 4). The writers then revise their writing by deciding which of the native speakers’ reconstructions to accept. In essence, then, reformulation combines two options, ‘direct correction’ + ‘revision’, but it differs from how these options are typically executed in that the whole of the learners’ texts is reformulated, thus placing the burden on learners to identify the specific changes that have been made.

Sachs and Polio’s (2007) study

  • This study compared reformulation and direct correction.
  • Learners were shown their reformulated/corrected stories and asked to study them for 20 minutes and take notes if they wanted.
  • One day later, they were given a clean sheet of paper and asked to revise their stories but without access to either the reformulated/corrected texts or the notes they had taken.
  • Both the reformulation and direct correction groups outperformed a control group. However, the correction group produced more accurate revisions than the reformulation group.
  • It should be noted, however, that reformulation serves also to draw learners’ attention to higher order stylistic and organisational errors.

Types of learner response

  1. Revision required
  2. No revisions required
    • Learners asked to study corrections
    • Learners just given back corrected text

Rod Ellis notes that learners may only look at their grade and nothing more if they aren’t required to study their corrected texts.

Ferris (2006) study

Ferris (2006) identified a number of revision categories in the re-drafts of 146 ESL learners’ essays. Out of the corrected errors:

  • 80.4% were eliminated in the redrafted compositions either by correcting the error or by deleting the text containing the error or by making a correct substitution.
  • 9.9% of the errors were incorrectly revised
  • 9.9% of the errors were left unchanged

Overall, research shows that written corrective feedback assists revision. Ferris’ descriptors were as follows:

Label – Description
Error corrected – Error corrected per teacher’s marking.
Incorrect change – Change was made but incorrect.
No change – No response to the correction was apparent.
Deleted text – Student deleted marked text rather than attempting correction.
Substitution, correct – Student invented a correction that was not suggested by the teacher’s marking.
Substitution, incorrect – Student incorrectly made a change that was not suggested by the teacher’s marking.
Teacher-induced error – Incomplete or misleading teacher marking caused student error.
Averted erroneous teacher marking – Student corrected error despite incomplete or erroneous teacher marking.
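
As a purely illustrative aside, summary percentages like Ferris’s can be reproduced by grouping her descriptors: ‘eliminated’ covers corrected errors, deleted text, and correct substitutions, while labels such as ‘Teacher-induced error’ fall outside the three summary groups but still count toward the total. The grouping and the example counts below are invented for illustration; they are not Ferris’s data.

```python
from collections import Counter

# Ferris's (2006) revision descriptors grouped into the three summary outcomes
ELIMINATED = {"Error corrected", "Deleted text", "Substitution, correct"}
INCORRECT  = {"Incorrect change", "Substitution, incorrect"}
NO_CHANGE  = {"No change"}

def summarise(labels):
    """Return the percentage of corrected errors that were eliminated,
    incorrectly revised, or left unchanged in the redraft."""
    counts = Counter(labels)
    total = sum(counts.values())  # includes labels outside the three groups
    def pct(group):
        return round(100 * sum(counts[label] for label in group) / total, 1)
    return pct(ELIMINATED), pct(INCORRECT), pct(NO_CHANGE)

# Invented example data (10 marked errors), not Ferris's:
labels = (["Error corrected"] * 7 + ["Deleted text"]
          + ["Incorrect change"] + ["No change"])
print(summarise(labels))  # → (80.0, 10.0, 10.0)
```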

An important theoretical issue

Theories of language learning differ in the importance they attach to:

  • Noticing the feedback in input
  • Revising the correct linguistic forms in output

But no research has addressed this issue.

Chandler’s (2002) study

This compared indirect written corrective feedback plus the opportunity to revise with indirect written corrective feedback with no opportunity to revise. Results:

  • Accuracy improved from the first to the fifth piece of writing significantly more in the group that was required to correct their errors than in the group that just received indication of their errors
  • This increase in accuracy was not accompanied by any decrease in fluency

However, this study cannot be used to claim that written corrective feedback with revision contributes to L2 learning, as there was no control group, i.e. a group that received no written corrective feedback. Rod Ellis notes that a great weakness of written corrective feedback studies is the absence of control groups, which makes it very difficult to say whether the feedback is actually having any effect on learning.


The situated nature of written corrective feedback

Hyland and Hyland (2006) commented, ‘it may be … that what is effective feedback for one student in one setting is less so in another’ (p.88).
A sociocultural perspective on written corrective feedback would emphasise the need to adjust the type of written corrective feedback offered to learners to suit their stage of development (Aljaafreh and Lantolf 1994) although how this can be achieved practically remains unclear in the case of written corrective feedback.

By teachers

Teachers need to consider the various options and formulate an explicit policy for correcting errors in learners’ written work. They also need to evaluate the effects of that policy, e.g. through action research.

By researchers

There is an obvious need for carefully designed studies to further investigate the effects of written corrective feedback in general and of different types of written corrective feedback in particular. Guenette (2007) observed that it is important that studies are conducted in a way that makes them comparable, but sadly that has not typically been the case. A typology of written corrective feedback provides a classification of one of the key variables in written corrective feedback studies – the type of written corrective feedback – and can serve as a basis for research.