Learning Styles, Mindsets, and Adaptive Strategies

Do learning styles promote learning? Are they helpful for learners at the various stages of developing understanding in their subject areas? Should learners use learning styles psychometric tests to determine how they view their study habits and how they approach studying? In this article, I argue that, far from being helpful, the fixed mindset that learning styles promote hinders learners’ cognitive and metacognitive development and can be counter-productive in the longer term. I describe how learning styles encourage learners to use the same study strategies regardless of context, as personal rules of thumb, and how this leads learners to ossify their study habits rather than develop and grow them.

I argue that encouraging learners to think of their preferences as strategies that they adapt according to their current knowledge, skills, and abilities in a particular domain or topic will put them on a growth trajectory: they come to see themselves as developing from novice learner to expert learner as they learn to think and study in new ways.

What are learning styles?

The basic idea of learning styles is that different learners have intrinsic personality traits that predispose them to particular media, modes, and strategies for learning. The learning styles hypothesis claims that if concepts and subject matter are presented according to a learner’s preferred media, modes, and strategies, learners will learn more effectively and efficiently, a claim that learning styles proponents call meshing. Much has been written and debated about the learning styles hypothesis, and there have been at least two major reviews which outline the research evidence available for the validity and reliability of learning styles (Hayes, 2005; Pashler, McDaniel, Rohrer, & Bjork, 2008).

Rather than discuss validity and reliability, I propose that learning styles are not intrinsic personality traits but strategic adaptations for learning concepts and subject matter, which learners use according to how experienced and knowledgeable they are in a particular domain. In this sense, the conventional assumptions about what learning styles are and how they work fall under the fundamental attribution error: a form of cognitive bias in which we interpret someone’s actions as intrinsic personality traits and disregard the situation and context to which they are responding.

So, rather than being psychometric tests which diagnose our intrinsic personality traits, learning styles preferences can be better understood as indicators of our levels of cognitive development within particular domains of knowledge, i.e. where we are on the spectrum between novice and expert, and they may be useful for adapting our learning strategies in appropriate ways. For example, rather than thinking of themselves as sequential or global thinkers, learners should consider their current level of knowledge and understanding and which strategies will help them best. Novice learners should use a sequential strategy to learn the basic concepts, with related concepts presented close together (in time and/or space) and with authentic examples (observational learning) and/or authentic experiences (experiential learning) that they can relate to personal, subjective experience. More experienced learners should take a more global approach and make more abstract generalisations in order to situate and connect the concepts they have already learned into a coherent framework.

A Hypothetical Example: Novice vs. Expert Musicians

Let’s consider a hypothetical example scenario. When a novice musician is presented with the task of learning a new song, she will normally proceed to pick up her instrument and read the music notation on the page, playing the notes sequentially and listening to how they sound in terms of harmony, rhythm, and melody. A novice will need to play through the song a great number of times in order to develop her knowledge and understanding of it, hearing how the harmony, rhythm, and melody fit together and complement each other, and hearing any musical devices that create and release tension, i.e. what makes songs interesting and catchy, before the song “sticks” and she can perform it well without the aid of sheet music. A novice may or may not know how or why the song “works” and typically arrives at a superficial understanding of it, i.e. “It just sounds good.” This is a perfectly legitimate and appropriate strategy, considering the level of development of her knowledge and experience of music and what she is capable of understanding.

In contrast, an expert musician will read over the whole song, usually without picking up her instrument. She will analyse any harmonic progressions and rhythmic patterns, divide and group the song into sections, e.g. verses and choruses, or head and bridge, and will immediately draw upon her experience and knowledge to relate it to other similar songs, structures, and musical devices used to create tension and resolution. It takes considerably less time for an expert to learn to perform a song well than for a novice because she is able to quickly and effortlessly situate the song in conceptual and contextual frameworks. While it may take a novice musician one or two weeks to learn a new song, an expert can do it in as little as half an hour.

Drawing from this example, and according to Bransford et al. (2000), we can say that experts differ from novices in the following ways:

  • Experts notice features and meaningful patterns of information that are not noticed by novices.
  • Experts have acquired a great deal of content knowledge that is organized in ways that reflect a deep understanding of their subject matter.
  • Experts’ knowledge cannot be reduced to sets of isolated facts or propositions but, instead, reflects contexts of applicability: that is, the knowledge is “conditionalised” on a set of circumstances.
  • Experts are able to flexibly retrieve important aspects of their knowledge with little attentional effort.
  • Though experts know their disciplines thoroughly, this does not guarantee that they are able to teach others.
  • Experts have varying levels of flexibility in their approach to new situations. (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000)

A Strategic, Adaptive, Growth Mindset

So we see that novice and expert musicians use very different strategies to learn new songs, depending on their knowledge and experience. A novice does not have the necessary knowledge, skills, and abilities (KSAs) to examine, deconstruct, and understand songs in the same way that an expert does, so she tends to use strategies that are more hands-on, experiential, experimental, and sequential. She tends to learn by rote, repetition, and memorisation and to follow rules and procedures that she can understand, given the KSAs she has developed so far. With time, practice, experience, guidance, and reflection she will be able to develop her own coherent conceptual and contextual frameworks for learning and understanding songs. A well-guided novice understands that, with effort and perseverance, she will be able to learn new songs in different and more efficient ways and understand them more deeply. In effect, what we are describing is a “growth mindset,” i.e. learning styles are actually adaptive strategies that are developmental and modifiable (Chiu, Hong, & Dweck, 1997; Dweck, 2010; Hong, Chiu, Dweck, Lin, & Wan, 1999).

The Learning Styles Fixed Mindset

Now let us contrast this developmental, growth mindset view with what the various learning styles schemas propose. In his review, Hayes (2005) identified 71 different learning styles schemas, but for the purposes of this article I am going to focus on one of the more popular schemas in use in higher education, the Felder-Silverman Index of Learning Styles (Felder & Silverman, 1988; Felder & Spurlin, 2005). Its authors provide a web-form questionnaire for learners to self-assess their own learning styles according to this schema (Soloman & Felder, 1991a) and offer study strategies advice to learners (Soloman & Felder, 1991b).

Learners complete the Felder-Silverman psychometric questionnaire and are automatically assessed and categorised into balanced (1–3), moderate (5–7), and strong (9–11) learning styles preferences on four scales that run from 11 to 1 to 11, like this:

Active 11–9–7–5–3–1|1–3–5–7–9–11 Reflective
Sensing 11–9–7–5–3–1|1–3–5–7–9–11 Intuitive
Visual 11–9–7–5–3–1|1–3–5–7–9–11 Verbal
Sequential 11–9–7–5–3–1|1–3–5–7–9–11 Global

Felder and his colleagues (Felder & Silverman, 1988; Felder & Spurlin, 2005) claim that these preferences are consistent for each learner across domains and disciplines. When learners complete the Index of Learning Styles questionnaire, it automatically informs them that the strategies they declare preferences for are learning styles, i.e. intrinsic personality traits to which they should adapt their studying habits. They are then referred to a document recommending study strategies that would best accommodate their learning styles preferences. In this sense, the Felder-Silverman schema, like many other learning styles schemas, promotes a “fixed mindset,” i.e. that learning styles are fixed personality traits and not developmental or modifiable (Chiu et al., 1997; Dweck, 2010; Hong et al., 1999).

A Fixed vs. Growth Mindset

What’s wrong with a fixed mindset? It is tempting for musicians and practitioners in any field to view themselves as intrinsically capable or talented. However, Dweck (2010) informs us that for the purposes of learning and development:

“Individuals with a fixed mindset believe that their intelligence is simply an inborn trait – they have a certain amount, and that’s that.” “…when students view intelligence as fixed, they tend to value looking smart above all else. They may sacrifice important opportunities to learn – even those that are important to their future academic success – if those opportunities require them to risk performing poorly or admitting deficiencies.” (Dweck, 2010)

So, far from helping learners to develop the necessary knowledge, skills, and abilities (KSAs) through practice and perseverance, the fixed mindset, which I propose is at the foundation of learning styles, actively discourages learners who perceive themselves as less capable and talented because their KSAs are not yet as well developed as their peers’, and it encourages learners who perceive themselves as more capable and talented to hide their shortcomings and present a wall of (insecure) perfection to their peers. Have you ever noticed how some people feel intimidated and reticent when working with peers who they perceive to be more talented and capable than themselves?

Redefining learning styles in terms of a strategic, adaptive, growth mindset

In order to identify the differences between the fixed mindset promoted by learning styles and a strategic, adaptive growth mindset, let us take a closer look at the Felder-Silverman Index of Learning Styles schema, although this could equally apply to many of the other learning styles schemas. When presented with a learning activity or opportunity, rather than recalling her learning styles diagnosis from the psychometric test (fixed mindset), I argue that it would be advantageous for a learner to ask herself how much she already knows, what experience she has, where she stands on the novice–expert scale, and which strategies are likely to help her most (growth mindset):

Active vs. Reflective: Learning Styles Fixed Mindset

According to the Felder-Silverman Index of Learning Styles:

  • Active learners tend to retain and understand information best by doing something active with it – discussing or applying it or explaining it to others. “Let’s try it out and see how it works” is an active learner’s phrase. Active learners tend to like group work more than reflective learners do.
  • Reflective learners prefer to think about new information quietly first. “Let’s think it through first” is the reflective learner’s response. Reflective learners prefer working alone.

Strategic, Adaptive, Growth Mindset

  • Novice learners’ developmental need: to gain basic, first-hand, experiential, implicit, procedural KSAs. Strategy: make up the shortfall in basic KSAs in some way; a good strategy is to get some hands-on experience and active engagement with the material.
  • Experienced learners’ developmental need: to situate and connect already-learned KSAs and relate them to each other. Strategy: describe and analyse the context/situation we find ourselves in and reflect on how our KSAs apply and relate to it.

Sensing vs. Intuitive: Learning Styles Fixed Mindset

According to the Felder-Silverman Index of Learning Styles:

  • Sensing learners tend to like learning facts. Sensors often like solving problems by well-established methods and dislike complications and surprises, and they are more likely than intuitors to resent being tested on material that has not been explicitly covered in class. Sensors tend to be patient with details, good at memorizing facts and doing hands-on (laboratory) work, and more practical and careful than intuitors. They don’t like courses that have no apparent connection to the real world.
  • Intuitive learners often prefer discovering possibilities and relationships. Intuitors like innovation and dislike repetition. They may be better at grasping new concepts, are often more comfortable than sensors with abstractions and mathematical formulations, and tend to work faster and be more innovative. They don’t like “plug-and-chug” courses that involve a lot of memorization and routine calculations.

Strategic, Adaptive, Growth Mindset

  • Novice learners’ developmental need: to learn the parts that make up the whole and to deepen understanding of KSAs to make them more explicit, i.e. available to consciousness, in order to develop more abstract and general hypotheses about them. Strategy: learn the basics, follow linear procedures, memorise information and methods, etc. Hands-on experience helps to put abstract concepts into context and is useful for testing and exploring boundary conditions, i.e. where methods and procedures start to fail or become inappropriate.
  • Experienced learners’ developmental need: to develop their conceptual frameworks further, locate gaps in KSAs, and situate new KSAs within those frameworks. Strategy: more experienced learners in this field can think more abstractly, explore bending the rules, test boundary conditions (i.e. where and when they break down or fail), and find new ways to apply the knowledge.

Visual vs. Verbal: Learning Styles Fixed Mindset

According to the Felder-Silverman Index of Learning Styles:

  • Visual learners remember best what they see – pictures, diagrams, flow charts, time lines, films, and demonstrations.
  • Verbal learners get more out of words – written and spoken explanations.

Strategic, Adaptive, Growth Mindset

  • Novice learners’ developmental need: to develop an awareness of “the big picture” – that there are frameworks they must come to understand, with gaps and spaces in which to situate new KSAs. Strategy: novices are helped by simplified, graphic overviews and illustrations of concepts and ideas – pictures, diagrams, flow charts, concept maps, time lines, narrative films, and demonstrations. They may need to see conceptual structures and frameworks in order to develop their understanding of them more fully.
  • Experienced learners’ developmental need: to deepen understanding and develop more abstract concepts that they can generalise and use in novel situations and other domains. Strategy: experienced learners already have overviews and can map out the subject area; they are ready to go into greater detail and depth and to reflect on the relationships and implications of the concepts.

Sequential vs. Global: Learning Styles Fixed Mindset

According to the Felder-Silverman Index of Learning Styles:

  • Sequential learners tend to gain understanding in linear steps, with each step following logically from the previous one, and tend to follow logical stepwise paths in finding solutions.
  • Global learners tend to learn in large jumps, absorbing material almost randomly without seeing connections and then suddenly “getting it.” They may be able to solve complex problems quickly or put things together in novel ways once they have grasped the big picture, but they may have difficulty explaining how they did it.

Strategic, Adaptive, Growth Mindset

  • Novice learners’ developmental need: to situate, relate, and connect concepts to theories and frameworks, i.e. “the big picture,” and develop a deeper understanding. Strategy: novices need structured, guided learning in which they encounter related concepts close together (spatially and/or temporally) so as to emphasise the relationships and connections between them. They need to understand some concepts before they can learn others in order to build a coherent picture of the subject or topic area.
  • Experienced learners’ developmental need: to develop and build their KSAs into more abstract concepts so that they can easily transfer them across domains. Strategy: experienced learners can connect the dots; they have constructed larger, more complex, more abstract concepts and so can think more globally, taking the bigger picture and the complex relationships within it into account. Much of their basic thinking has become automatic and barely registers in consciousness (working memory), and they are more able to transfer those more abstract concepts into novel domains and adapt them accordingly.

Summary

I have proposed this alternative interpretation of learning styles, rethinking them not as fixed, psychometric attributes and personality traits, but as adaptive, strategic responses to the challenges that learners frequently face when acquiring and developing KSAs. By understanding learners’ preferences as indicators of their current levels of cognitive and metacognitive development, somewhere between novice and expert, we can help learners to develop learning strategies to situate themselves on trajectories of personal growth and to identify and prioritise the specific areas and aspects where they need to develop their KSAs further. In other words, to be balanced, self-aware, self-directed, strategic, adaptive, and well-rounded learners.

References

Bransford, J. D., Brown, A. L., Cocking, R. R., Donovan, M. S., & Pellegrino, J. W. (Eds.). (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition (2nd ed.). The National Academies Press. Retrieved from http://www.nap.edu/catalog/9853/how-people-learn-brain-mind-experience-and-school-expanded-edition

Chiu, C., Hong, Y., & Dweck, C. S. (1997). Lay dispositionism and implicit theories of personality. Journal of Personality and Social Psychology, 73(1), 19–30. http://doi.org/10.1037/0022-3514.73.1.19

Dweck, C. S. (2010). Even Geniuses Work Hard. Educational Leadership, 68(1), 16–20.

Felder, R. M., & Silverman, L. K. (1988). Learning and Teaching Styles in Engineering Education. Engineering Education, 78(7), 674–681.

Felder, R. M., & Spurlin, J. (2005). Applications, Reliability and Validity of the Index of Learning Styles. International Journal of Engineering Education, 21(1), 103–112.

Hayes, D. (2005). Learning Styles and Pedagogy in Post-16 Learning: A Systematic and Critical Review. Journal of Further and Higher Education, 29(3), 289.

Hong, Y., Chiu, C., Dweck, C. S., Lin, D. M.-S., & Wan, W. (1999). Implicit theories, attributions, and coping: A meaning system approach. Journal of Personality and Social Psychology, 77(3), 588–599. http://doi.org/10.1037/0022-3514.77.3.588

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning Styles: Concepts and Evidence. Psychological Science in the Public Interest, 9(3), 105–119.

Soloman, B., & Felder, R. M. (1991a). Index of Learning Styles Questionnaire [Institution Website]. Retrieved August 4, 2015, from https://www.engr.ncsu.edu/learningstyles/ilsweb.html

Soloman, B., & Felder, R. M. (1991b). Learning Styles and Strategies. North Carolina State University. Retrieved from http://www.cityvision.edu/courses/coursefiles/402/STYLES_AND_STRATEGIES.pdf

Adding bibliography metadata to your web pages

Open Access publishing is gaining popularity and, at the same time, increasing numbers of academics are uploading their papers to their personal websites, faculty pages, and blogs. This is great news for people who don’t have the luxury of an institution to pay for their access to the usually pay-walled research databases. Along with this positive development, I think providing bibliographic metadata on academic websites and blogs should become more of a priority. Metadata enables bibliography managers such as Mendeley and Zotero to quickly, easily, and accurately store, retrieve, and reference academic papers, which can save other academics, science writers, journalists, and students hours of work for each paper or article they write. If academic websites and blogs provide the metadata to support bibliography managers, it becomes that much easier for people to cite their research and provide links back to their websites, faculty pages, or blogs. However, unlike research databases, most websites, faculty pages, and blogs don’t provide this bibliographic metadata.

What is bibliographic metadata?

Metadata is intended for machines, rather than people to read. Bibliographic metadata tells bibliography managers and search engines by whom, when, where, what, and how an article or paper was published and makes it easier for people to find through title, author, subject, and keyword searches.

Can we embed it in a blog?

I use WordPress (this blog is built on my own customised version of WordPress), the most widely used blogging software on the internet. There’s a large and diverse range of plugins and extensions available, but a search shows that while several provide metadata for search engine optimisation (SEO), few offer support for bibliography managers, and none of the ones I’ve found supports the minimum metadata required for academic citations. To find out how difficult or easy embedding metadata is, I tried an experiment on this blog to automatically generate as much relevant metadata as possible from standard WordPress format blog posts.

What does academic bibliography metadata look like?

Here’s an example metadata set for a published academic paper in an academic journal database (Applied Linguistics, Oxford Journals):

<!-- start of citation metadata -->
<meta content="/applij/4/1/23.atom" name="HW.identifier" />
<meta name="DC.Format" content="text/html" />
<meta name="DC.Language" content="en" />
<meta content="Analysis-by-Rhetoric: Reading the Text or the Reader's Own Projections? A Reply to Edelsky et al.1" name="DC.Title" />
<meta content="10.1093/applin/4.1.23" name="DC.Identifier" />
<meta content="1983-03-20" name="DC.Date" />
<meta content="Oxford University Press" name="DC.Publisher" />
<meta content="JIM CUMMINS" name="DC.Contributor" />
<meta content="MERRILL SWAIN" name="DC.Contributor" />
<meta content="Applied Linguistics" name="citation_journal_title" />
<meta content="Applied Linguistics" name="citation_journal_abbrev" />
<meta content="0142-6001" name="citation_issn" />
<meta content="1477-450X" name="citation_issn" />
<meta name="citation_author" content="JIM CUMMINS" />
<meta name="citation_author" content="MERRILL SWAIN" />
<meta content="Analysis-by-Rhetoric: Reading the Text or the Reader's Own Projections? A Reply to Edelsky et al.1" name="citation_title" />
<meta content="03/20/1983" name="citation_date" />
<meta content="4" name="citation_volume" />
<meta content="1" name="citation_issue" />
<meta content="23" name="citation_firstpage" />
<meta content="41" name="citation_lastpage" />
<meta content="4/1/23" name="citation_id" />
<meta content="4/1/23" name="citation_id_from_sass_path" />
<meta content="applij;4/1/23" name="citation_mjid" />
<meta content="10.1093/applin/4.1.23" name="citation_doi" />
<meta content="http://0-applij.oxfordjournals.org.aupac.lib.athabascau.ca/content/4/1/23.full.pdf" name="citation_pdf_url" />
<meta content="http://0-applij.oxfordjournals.org.aupac.lib.athabascau.ca/content/4/1/23" name="citation_public_url" />
<meta name="citation_section" content="Article" />
<!-- end of citation metadata -->

An APA Style (6th Edition) formatted citation from this metadata would look like this:

Cummins, J., & Swain, M. (1983). Analysis-by-Rhetoric: Reading the Text or the Reader’s Own Projections? A Reply to Edelsky et al.1. Applied Linguistics, 4(1), 23–41. http://doi.org/10.1093/applin/4.1.23

How can I add bibliographic metadata to my website or blog?

If you use WordPress, you’re in luck. I’ve made some modifications to my WordPress theme so that the appropriate bibliographic metadata is automatically added to the head section of each blog article.

Pre-requisites

  • A good FTP client. FileZilla is a good free and open source one. NetBeans and Dreamweaver have FTP clients built in. If you’ve never used an FTP client before, look up some beginner tutorials to learn the basics of editing remote server files.
  • FTP access and login credentials to the web server where your blog is hosted.
  • A good text editor, e.g. Notepad++, Notepadqq, Gedit, GNU Emacs, etc., or an HTML integrated development environment, e.g. NetBeans, Brackets, or Dreamweaver.

The metadata format for blogs is a little different from academic metadata, i.e. it uses the Dublin Core standard, but the principles are similar. Here’s what I did:

  • I chose an existing WordPress core theme, twentytwelve (but this should work with any theme), and created a child theme: I created a new directory at /wordpress/wp-content/themes/twentytwelve-child/. WordPress automatically replaces files in a theme with any files provided in its child theme directory. (A child theme also needs its own style.css – see the note after these steps.)
  • I made a copy of the header.php file from /twentytwelve/ and pasted it at /wordpress/wp-content/themes/twentytwelve-child/header.php.
  • In a text editor, I opened the new header.php file and added the following lines of code between the PHP tags at the top of the page. This retrieves the metadata from WordPress’ database:
// Set post author display name
$post_tmp = get_post( get_the_ID() ); // get the current post object (the original relied on an undefined $post_id)
$user_id = $post_tmp->post_author;
$first_name = get_the_author_meta( 'display_name', $user_id );
// Set more metadata values
$twentytwelve_data = new stdClass(); // container object for the metadata values
$twentytwelve_data->blogname = get_bloginfo( 'name' ); // The title of the blog
$twentytwelve_data->language = get_bloginfo( 'language' ); // The language the blog is in
$twentytwelve_data->author = $first_name; // The article author's name
$twentytwelve_data->date = get_the_date(); // The article publish date
$twentytwelve_data->title = get_the_title(); // The title of the article
$twentytwelve_data->permalink = get_the_permalink(); // The permalink to the article
$twentytwelve_data->description = substr( strip_tags( $post_tmp->post_content ), 0, 1000 ) . '...'; // First 1,000 characters of the article as a description
  • After that, in the same header.php file, between the <head> </head> tags, I added the following lines of HTML and PHP code. This prints the metadata on the article page. Please note that metadata is not visible when you read the web page because it’s for machines, not people to read. You can view it in the page source code (Ctrl + u in Firefox and Google Chrome):
<!-- start of citation metadata -->
<meta name="DC.Contributor" content="" />
<meta name="DC.Copyright" content="© <?php echo $twentytwelve_data->author; ?> <?php echo $twentytwelve_data->date; ?>" />
<meta name="DC.Coverage" content="World">
<meta name="DC.Creator" content="<?php echo $twentytwelve_data->author; ?>" />
<meta name="DC.Date" content="<?php echo $twentytwelve_data->date; ?>" />
<meta name="DC.Description" content="<?php echo $twentytwelve_data->description; ?>">
<meta name="DC.Format" content="text/html" />
<meta name="DC.Identifier" content="<?php echo $twentytwelve_data->title; ?>">
<meta name="DC.Language" content="<?php echo $twentytwelve_data->language; ?>" />
<meta name="DC.Publisher" content="<?php echo $twentytwelve_data->blogname; ?>" />
<meta name="DC.Rights" content="http://creativecommons.org/licenses/by-nc-sa/4.0/">
<meta name="DC.Source" content="<?php echo $twentytwelve_data->blogname; ?>">
<meta name="DC.Subject" content="<?php echo $twentytwelve_data->title; ?>">
<meta name="DC.Title" content="<?php echo $twentytwelve_data->title; ?>">
<meta name="DC.Type" content="Text">

<meta name="dcterms.contributor" content="" />
<meta name="dcterms.copyright" content="© <?php echo $twentytwelve_data->author; ?> <?php echo $twentytwelve_data->date; ?>" />
<meta name="dcterms.coverage" content="World" />
<meta name="dcterms.creator" content="<?php echo $twentytwelve_data->author; ?>" />
<meta name="dcterms.date" content="<?php echo $twentytwelve_data->date; ?>" />
<meta name="dcterms.description" content="<?php echo $twentytwelve_data->description; ?>">
<meta name="dcterms.format" content="text/html" />
<meta name="dcterms.identifier" content="<?php echo $twentytwelve_data->title; ?>">
<meta name="dcterms.language" content="<?php echo $twentytwelve_data->language; ?>" />
<meta name="dcterms.publisher" content="<?php echo $twentytwelve_data->blogname; ?>" />
<meta name="dcterms.rights" content="http://creativecommons.org/licenses/by-nc-sa/4.0/">
<meta name="dcterms.source" content="<?php echo $twentytwelve_data->permalink; ?>" />
<meta name="dcterms.subject" content="<?php echo $twentytwelve_data->title; ?>" />
<meta name="dcterms.title" content="<?php echo $twentytwelve_data->title; ?>" />
<meta name="dcterms.type" content="Text" />
<!-- end of citation metadata -->

Please note that I included the Dublin Core metadata twice, in two slightly different formats (DC.* and dcterms.*), for maximum compatibility with search engines and bibliography managers.
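One practical detail worth mentioning: WordPress only recognises and activates a child theme if the child theme directory contains its own style.css with a header comment naming the parent theme. Here’s a minimal sketch (the Theme Name is just an example; the Template value must match the parent theme’s directory name, and the @import line was the common way at the time of writing to keep the parent theme’s styles):

/*
 Theme Name: Twenty Twelve Child
 Template:   twentytwelve
*/
/* Pull in the parent theme's styles so the site keeps its appearance */
@import url("../twentytwelve/style.css");

With the child theme activated from the Appearance > Themes page in the WordPress admin, the modified header.php is used in place of the parent theme’s.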

What about comprehensive academic bibliography metadata?

You’ll probably have noticed that the metadata I’ve included in my article pages, while sufficient for web page citations, doesn’t contain the same degree of detail as academic bibliography data (see the first metadata snippet above), e.g. journal titles, ISSNs, ISBNs, etc. As far as I know, there isn’t yet a convenient way of storing that data in standard WordPress, so it more than likely needs a specialist plugin so that authors can explicitly enter it to be stored and printed on the corresponding article pages. Would anyone like to develop one?
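In the meantime, one rough stopgap I can think of is to use WordPress custom fields: an author could add custom fields to a post, named after the citation tags they want to publish, and have the child theme’s header.php print whichever ones are present. The sketch below assumes hypothetical field names such as citation_journal_title and citation_issn (they are not a built-in WordPress feature); get_post_meta() and esc_attr() are standard WordPress functions:

<?php
// Hypothetical citation fields an author might enter as custom fields on a post
$citation_fields = array( 'citation_journal_title', 'citation_issn', 'citation_volume', 'citation_issue', 'citation_firstpage', 'citation_lastpage' );
foreach ( $citation_fields as $field ) {
    $value = get_post_meta( get_the_ID(), $field, true ); // single value, or '' if the field isn't set
    if ( ! empty( $value ) ) {
        // Print a citation meta tag for each field the author has filled in
        echo '<meta name="' . esc_attr( $field ) . '" content="' . esc_attr( $value ) . '" />' . "\n";
    }
}
?>

This wouldn’t be as friendly as a purpose-built plugin with proper input fields, but it would let determined authors publish journal-level metadata today.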

Online Cognitive Apprenticeship

A Cognitive Apprenticeship Approach to Student and Faculty Online Learning and Teaching Development: Enculturing Novices into Online Practitioner Environments and Cultures in Higher Education

In a previous article, How prepared are learners for elearning?, I wrote about the difficulties in identifying whether learners are “ready” to study online and suggested some possible ways to identify the necessary knowledge, skills, and abilities for successful online learning.

I believe it would be unethical to identify or even diagnose such issues, thereby rejecting some learners who may otherwise be capable of thriving in online learning environments, without exploring some potential ways to address those issues. I’ve just created a small subsection on this blog that outlines a proposal for higher and further education oriented institutions and organisations that may help both learners and teaching practitioners involved in online communities of inquiry. It covers the following areas:

  1. Online Cognitive Apprenticeship Model
  2. Programme Aims and Objectives
  3. Organisational Structure and Context
  4. Programme Participants
  5. The Cognitive Apprenticeship Model
  6. Example Activities/Tasks
  7. Programme Delivery and Integration
  8. Evaluation and Assessment
  9. Participant Support: Necessary and Sufficient Conditions for Psychological Change
  10. The Programme as an Agent of Change
  11. References

Keywords: situated cognition, cognitive apprenticeship, meta-cognitive skills, enculturation, practitioner culture, legitimate peripheral participation, authentic tasks, reflective practice, online academic practice

Read the full proposal here: Online Cognitive Apprenticeship Model

PDF Version

There’s also a PDF version of the entire proposal on my Athabasca University Academia.edu account.

How prepared are learners for elearning?

What makes a learner ready to study online? How do they know? How do they find out?

In an attempt to address the higher student attrition (drop-out) rates in distance education (DE) and elearning compared with face-to-face classes, it’s becoming more common for educational organisations to try to evaluate learners to find out who is unlikely to succeed on their courses and programmes and may require extra support and guidance. In other words, to assess learners’ preparedness for DE and elearning.

I think one of the biggest issues with self-assessment of readiness to study online is that learners often don’t know what the questions and rating scales they’re presented with on questionnaires and application forms mean. Additionally, according to the Dunning-Kruger effect (Kruger & Dunning, 1999), learners with lower knowledge, skills, abilities, and experience tend to overestimate their own proficiency to a greater degree than more experienced and proficient learners do. In other words, the less they know, the less they know what they don’t know, and the poorer they are at judging their own proficiency. In more exaggerated cases, we may actually be turning away more suitable learners and accepting less prepared ones.

In an attempt to address this issue, here are some qualitative questions that I feel are more likely to elicit responses reflecting a learner’s knowledge, skills, and abilities (KSAs), as well as the experiences, beliefs, attitudes, and values relevant to distance education (DE) and elearning, and to identify areas of strength and weakness, rather than simple binary “yes or no” or rating-scale responses. In other words, they encourage learners to describe their preparedness and let the interviewer decide how they compare to the minimum necessary KSAs defined in the learning organisation’s policies. It would also be possible to provide learners whose KSAs are currently insufficient or borderline with personalised plans of action that they can use as a guide to “bring themselves up to speed” for successful DE and elearning.

Technical IT and Practical Requirements

What levels of the technical and IT KSAs, practical facilities, experience, beliefs, and attitudes necessary for successful participation in DE and elearning does the learner have?

  • Why does the learner want to take this course or programme? Is it for personal enrichment, professional advancement, retooling, retraining, or changing careers?
  • How many hours per week is the learner willing to commit to studies? How will the learner manage their time to prioritise and dedicate to uninterrupted periods of study without distractions from colleagues, family members, friends, and/or associates?
  • What tools, services, and technology does the learner have sufficient access to in order to support their learning, e.g. a sufficiently powerful and usable computer, webcam and microphone, reliable high-speed internet, and the software necessary to read digital files and formats?
  • How technology literate is the learner? How proficient and experienced is the learner with communication tools such as email, discussion forums, chat, and video web conferencing? What online websites, discussion groups, etc. has the learner participated in? What can they tell you about their experiences?
  • What support structures does the learner have at home or in the workplace? How understanding and supportive are their colleagues, family members, friends, and/or associates?
  • What are the learner’s previous experiences of education and learning at school, college, university, and/or the workplace?
  • Has the learner taken distance learning courses in the past? If so, what were the learner’s experiences?
  • What beliefs, attitudes, and values does the learner express that are necessary and compatible to participate in their proposed programme of study? How much direction and support will they need? How well developed are their metacognitive/self-directed learning skills?

Although more time-consuming and labour-intensive to conduct and administer, I think these more descriptive, open-ended questions should be more helpful in allowing organisations to assess learners’ preparedness for participating effectively in DE and elearning courses and programmes. They can also be adapted and made more specific so that they more accurately reflect the requirements of particular courses and programmes.

Reference

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. doi:10.1037/0022-3514.77.6.1121 Retrieved from: http://psych.colorado.edu/~vanboven/teaching/p7536_heurbias/p7536_readings/kruger_dunning.pdf

Ratings systems on social platforms can have unexpected effects

This is a quick post to share a recently published paper, How Community Feedback Shapes User Behavior, that examines the effects of ratings systems and up/down voting on social networking platforms and services. I go on to discuss some questions it raises for online social learning.

Abstract

Here’s the abstract to How Community Feedback Shapes User Behavior:

“Social media systems rely on user feedback and rating mechanisms for personalization, ranking, and content filtering. However, when users evaluate content contributed by fellow users (e.g., by liking a post or voting on a comment), these evaluations create complex social feedback effects. This paper investigates how ratings on a piece of content affect its author’s future behavior. By studying four large comment-based news communities, we find that negative feedback leads to significant behavioral changes that are detrimental to the community. Not only do authors of negatively-evaluated content contribute more, but also their future posts are of lower quality, and are perceived by the community as such. Moreover, these authors are more likely to subsequently evaluate their fellow users negatively, percolating these effects through the community. In contrast, positive feedback does not carry similar effects, and neither encourages rewarded authors to write more, nor improves the quality of their posts. Interestingly, the authors that receive no feedback are most likely to leave a community. Furthermore, a structural analysis of the voter network reveals that evaluations polarize the community the most when positive and negative votes are equally split.”

Summary of findings

  • The findings of the study appear to contradict the Skinnerian behaviourist model of operant conditioning (i.e. punishments and rewards or “sticks and carrots”).
  • Up/Down-votes and commenting provide a means for social interaction and “this can create social feedback loops that affect the behavior of the author whose content was evaluated, as well as the entire community.”
  • Authors of down-voted comments/posts tend to post more frequently and their comments/posts tend to be of lower quality.
  • Down-voted authors are also more likely to subsequently down-vote others’ comments/posts.
  • Down-voting tends to percolate throughout online communities having an overall negative effect.
  • Up-voting doesn’t appear to influence authors’ subsequent comments/posts in any significant way.
  • If comment/post authors receive no feedback, they are more likely to disengage with the community, i.e. fewer comments/posts and less up/down-voting.

The article concludes that ignoring/tolerating negative behaviour in online communities, i.e. giving no feedback whatsoever, is a more effective way of discouraging it than addressing it directly, e.g. by down-voting.

How does this relate to online social learning?

Firstly, we should be cautious about drawing any conclusions about online discussions and learning activities in online social learning. The researchers report that, “…we have mostly ignored the content of the discussion, as well as the context in which the post appears… “, both of which can have significant and far-reaching effects on the behaviour of, and interactions between, participants.

Secondly, the social dynamics of social constructivist oriented online courses can be very different: The study focused on massive groups of self-selected users participating in communities based around popular media and entertainment websites, whereas in elearning, we’re typically dealing with smaller cohorts of learners who, at least in an ideal world, establish an atmosphere of mutual support, shared responsibility, and explicitly shared common purpose that is effectively moderated by skilled, experienced mediators/facilitators, e.g. teachers, teaching assistants, and/or moderators.

Rethinking the design of ratings systems

In my opinion, this paper raises more questions for elearning practitioners than it answers, which is a good thing:

  • How do learners use ratings systems and how does this affect their future behaviour in online learning communities? Is it significantly different to the users’ behaviour on social media sites?
  • Is it possible to design ratings/feedback systems that have more positive effects or at least avoid the potential negative effects reported in the paper?
  • How would the range of ratings options available to users affect the way they rate and comment, e.g. if you only include positive options in ratings?
  • How would providing ratings options that are more specific to the learning objectives of the particular learning activity affect the quality and quantity of comments and quantity of ratings?
  • What factors/influences affect learners’ behaviour in online learning communities more significantly with regard to ratings and comments? e.g. Does the degree of familiarity, mutual respect, and trust affect how learners respond to negative and critical ratings and comments?

Some example suggestions

In an earlier article, Implementing star-ratings in Moodle, I described how teachers and curriculum developers can create custom ratings in Moodle. As well as simple star-ratings, I listed some possible options, which included Likert scales, prompts, showing interest, and expressing personal alignment, e.g. “This is(n’t) like me” statements. Most of these omit negative or neutral ratings, my reasoning being that, in order to give negative or critical feedback, learners and/or teachers have to take the time and effort to write sensitively phrased, personalised, specific, reasonable, constructive criticism, ideally with some kind of “what to do next,” so that it’s not just negative or critical but also helpful and purposeful in some way.

One strategy that springs to mind is to use ratings systems that, rather than suggesting learners are being graded, i.e. “good vs. bad” comments, provide a set of prompts and/or questions, and therefore act as a convenient and helpful tool for encouraging further participation. If learners have little experience of social learning and/or need some initial support and guidance, having a convenient list of prompts/questions at hand could be helpful. For example:

Self-reliance questions

  • How do you determine this to be true?
  • Why don’t you consider a different route to the problem?
  • Why does that answer make sense to you?
  • What if I say that’s not true?

Reasoning questions

  • Why do you think this works? Does it always work? Why?
  • How do you think this is true?
  • Show how you might prove that.
  • Why assume this?
  • How might you argue against this?

Clarifying questions

  • Can you explain that in another way?
  • How does this relate to [discussion topic]?
  • Can you be more specific?
  • Can you give us an example?
  • Please tell us more about this.

Finally

It’s worth mentioning that a strong characteristic of these questions and prompts is that they are intended to stimulate analytical and critical thinking, which we usually expect to hear from teachers and mentors rather than from our peers. Learners don’t automatically assume that such questions and prompts are welcome or appropriate from their peers. In order for them to be positive and productive, participants should already be inducted into a familiar, trusting, mutually respectful and supportive group of peers, who all explicitly share a common purpose, i.e. learning objectives and/or “big/essential questions,” in a collaborative climate.

Discussion

I’ve started a discussion thread for this article on the Moodle.org community forums: https://moodle.org/mod/forum/discuss.php?d=261124 Joining the Moodle community is quick, easy, and free.

Image credit Wikimedia Commons

Reference

Cheng, J., Danescu-Niculescu-Mizil, C., & Leskovec, J. (2014). How Community Feedback Shapes User Behavior. arXiv:1405.1429 [physics, Stat]. Retrieved from http://arxiv.org/abs/1405.1429

Are teacher-led and learner-led approaches compatible?

As learner-led/learner-centred learning and teaching methods and principles gain attention and popularity, teachers, curriculum developers, and instructional designers are incorporating them into learning activities and courses. Many report mixed results and issues when they do so. The following article examines one possible contributing factor to such results and issues.

Defining terms

Firstly, I’m not arguing that teacher-led and learner-led views of learning and teaching practice are absolutes or binary states. I view them as being on the same scale from extremely prescribed and controlled by the teacher, e.g. the stereotypical Victorian school master, through to entirely self-organised, defined, controlled, and sustained learning by autonomous learners themselves, e.g. special interest groups and communities of practice, and I believe that most online curricula and learning and teaching practices are situated somewhere in between.

Teacher-led <———————————————————————–> Learner-led

Teacher vs. learner-led scale

When tensions arise

With the best of intentions and carefully and skilfully constructed learning activities, teachers, curriculum developers, and IDs can inadvertently create relational and motivational tensions between teachers and learners, and among cohorts of learners by the way they mix teacher-led and learner-led activities. Here’s a typical case scenario:

An experienced, well-informed teacher has developed an online course that is predominantly teacher-led. The course uses online presentations, readings, webinars, and forum discussions which are intensively monitored and led by the teacher. The teacher conscientiously provides guidance, instructional scaffolding, and links to further resources at every turn. The teacher then decides to introduce some learner-led projects, problems, or tasks to the course (perhaps as a way to make the course less labour-intensive for the teacher?).

However, only a small minority of the learners participate as much as expected and/or required, and the majority go “off track”, waste time, and/or complain about aspects of the activity or the whole activity. The learning outcomes are mediocre at best or even poor, and it’s difficult to regain the previous “learning momentum” of the course.

Why did this happen? Is there something wrong with the activity? Is there some way to make it more productive? I suspect that in most cases, the activity is adequately designed and not the main contributor to the issue.

What contributes to these tensions?

If a course is predominantly teacher-led to start with, it creates an atmosphere and learning experiences that set up learners’ expectations of being led and having critical learning decisions made for them, or the feeling that any decisions they make need to be validated or approved by an authority figure: the teacher.

Additionally, some of the prerequisite conditions necessary for learner-led learning to occur, e.g. social presence and building autonomous, mutually respectful, and productive relationships between teachers and learners, and among the learners themselves, may not be in place and may have gone unnoticed since they aren’t critical to the success of teacher-led approaches. When suddenly faced with the responsibility of thinking autonomously, analytically, and critically, and having to work closely with peers, who they may or may not have got to know very well, and without the supervision, guidance, and approval of their authority figure (the teacher), the majority of learners’ expectations are not met; they feel lost, unsupported, and confused.

In my experience, the majority of learners are perfectly capable of being autonomous, thinking analytically and critically, and taking responsibility for their learning; most people do so from an early age in their public and private lives outside of education. However, because of most people’s previous experiences of education and strongly held cultural beliefs about it, we need to be explicit when asking learners to do so in situations and environments labelled “educational” and cultivate the atmosphere, and provide the environment, support, and resources that are necessary. Learners need to get to know each other and learn about what each of their peers on a course has to offer with regards to the subject matter and learning objectives. They need to build interpersonal relationships and cultivate trust so that they have the confidence to explore, experiment, and take risks and feel that they have the interest, approval, and support of their peers as well as their teacher.

In conclusion

I’m not arguing here that teacher-led and learner-led methods and activities are inherently incompatible, just that from what I’ve seen in practice in the majority of instances, both in face-to-face and online contexts, tensions and issues can and do arise when certain conditions and factors aren’t taken into consideration. When we break with educational traditions and orthodoxies, and/or atmospheres of learning that have been cultivated within organisations, we need to be explicit about what we’re doing and why, and ensure that the prerequisite conditions are in place for learner-directed learning experiences to be purposeful, successful, and productive.

What can learners and teachers do to limit corporate surveillance while working online?

Since security and surveillance expert Edward Snowden blew the whistle and leaked damning NSA documents to investigative journalists Laura Poitras and Glenn Greenwald, the implications and ramifications of the NSA’s dragnet surveillance, partly enabled by IT giants like Google, Microsoft, and Facebook, have been cause for concern everywhere, and not least in elearning. As educators, we bear a responsibility to our learners and other educators to protect their basic civil rights wherever and whenever we can. By being well-informed about internet surveillance and the tools and strategies available, we can offer useful, effective advice and help to reduce both the quality and quantity of personal data collected from them in the course of their online studies and work.

Why is internet surveillance an important issue?

First, here’s the scary bit. Below is an interview with a journalist who’s looked into the business of internet surveillance:

“Pulitzer Prize-winning investigative journalist Julia Angwin joins us to discuss her new book, “Dragnet Nation: A Quest for Privacy, Security and Freedom in a World of Relentless Surveillance.” Currently at ProPublica and previously with the Wall Street Journal, Angwin details her complex and fraught path towards increasing her own online privacy. According to Angwin, the private data collected by East Germany’s Soviet-era Stasi secret police could pale in comparison to the information revealed today by an individual’s Facebook profile or Google search.”

So, what can learners and teachers do to limit corporate surveillance while working online?

What advice can we give and what measures can we put in place? Are they practical, understandable, and easily do-able? Here are some practical suggestions to get the ball rolling…

Turn off local storage on Flash Player

Local Shared Objects (LSOs) are used extensively by surveillance organisations, including Google, because they reveal more information about users’ computers and software, making it easier to uniquely identify individuals, and they aren’t deleted when you clear/purge your browser’s cache, i.e. they’re more persistent. The benefits of allowing LSOs are minor and easy to live without.

Install a cookie manager

Along with your IP address and HTTP headers, cookies are the primary means of identifying and tracking individuals. There are several cookie manager extensions/plugins available that manage cookies for you: you can white-list the cookies you want to keep, e.g. for sites that you want to remain logged into, and everything else gets deleted when you navigate away from the site. My favourite is Self-Destructing Cookies for Firefox.

Block JavaScript from surveillance sites

This one’s a bit more problematic and can “break” page displays on some sites. Where it is practical and workable, JavaScript blocking prevents some very detailed surveillance from taking place; from my experience with free and open source web analytics software, I’ve witnessed how rich and detailed the collected data can be. There are two main approaches: white-lists and black-lists. A white-list is a list of approved sites to allow JavaScript from; a black-list is a list of sites to block JavaScript from. Both require someone to maintain the lists and block or allow new sites as they come up: many sites nowadays load JavaScript libraries from CDNs and/or third parties, and blocking those can make many sites unusable. In short, you have to maintain a list of legitimate JavaScript CDNs and third-party libraries as well as the individual sites. I use a white-list plugin for Firefox called NoScript.

The added benefit of JavaScript blocking with white-lists is that it also prevents the vast majority of web malware attacks. JavaScript has consistently been identified by all the major anti-virus firms as the primary technology used in malware attacks.

Use privacy protecting search engines

Rather than use Google, Yahoo!, or Bing as your search provider, why not use one of the more ethical, privacy-protecting services, e.g. DuckDuckGo or Startpage? They don’t store your search history or your IP address and, as an added advantage, they don’t filter their search results according to a personal search history profile, thereby preventing the “filter bubble” phenomenon which can hide useful, relevant results from users.

Automatically generate random tracking noise

TrackMeNot, a browser plugin for Firefox and Chrome, periodically generates random but convincingly genuine search strings and sends them to search engines that may be tracking you. This “muddies” the profile they can build up of you, making it less accurate and less revealing.

Use different browsers

Using different web browsers, e.g. Firefox, Internet Explorer, Safari, Chrome, Chromium, and Opera, for different purposes puts up barriers between the surveillance companies that use the same tracking techniques, e.g. “+”, “Like”, and “Share” buttons (in many cases you don’t have to click on them; they’re watching you anyway), across multiple websites. For example, use one browser for searches, another for social networking sites, and another for logging into email (or, better still, use a free and open source email client that doesn’t send tracking data to surveillance companies, e.g. Thunderbird).

Use a privacy protecting proxy

This is one that organisations’ IT support can implement on their users’ behalf. A proxy can filter out personally revealing information from HTTP requests and in some cases hide users’ IP addresses.

What if my school/college/university/institution has switched to using Google services?

It would be expensive and difficult to switch back, so that’s more than likely not a feasible option for many organisations. The best advice I can think of is to create Google accounts specifically for use with that educational organisation and not use them for anything else, and, if you already use Google services, e.g. Gmail, Google+, and/or Google Calendar, migrate to a different service provider, preferably a more ethical one, if that’s possible. The idea is to create as many barriers as possible between your private life and your studies and work, and to reduce the quality and quantity of your personal information and internet usage habits that are available to any one single surveillance organisation.

Use Tor

Tor (The Onion Router) is at the extreme end of anti-surveillance techniques. It’s quite restrictive and only practical for searching for and viewing a narrower range of web media, e.g. viewing Flash-based media such as video and audio can reveal personally identifiable information, thereby defeating the purpose of using Tor. Tor is particularly useful for users researching politically and culturally sensitive topics, e.g. child abuse, sexual behaviour, terrorism, copyright infringement, and political activism, that might lead to inappropriate interventions by security and law enforcement agencies (or to being followed around the web by some highly inappropriate targeted advertising!). It’s widely used by journalists, political activists, and people who want to circumvent censorship and surveillance around the world. Tor comes in a ready-configured, optimised, easy-to-use, standalone package called the Tor Browser Bundle, which can also be run from a USB thumb drive or CD-ROM.

In summary

These measures cannot prevent surveillance entirely but they can significantly reduce the quality and quantity of data that corporations and government security agencies can collect about you. If you can add suggestions to this list that are practical, understandable, and easily do-able, please participate in the discussion. I’ve linked to this article from the Moodle.org community forums. Joining Moodle.org is free and quick, and it’s a great place to meet elearning professionals and discuss things that matter to you and to them. (Disclosure: Moodle.org uses Google Search and Analytics but I’d like to get that changed!)

Using chat to facilitate more interactive classes

Here’s how you can make your face-to-face lessons more inclusive and interactive, quickly and simply, by using a chat session during class, and open up a range of benefits that aren’t immediately apparent.

How does it work?

Before a face-to-face lesson or lecture begins, the tutor/teacher/TA opens or schedules a chat room in the course on the school’s, organisation’s, college’s, university’s, or institution’s Moodle*. All the class participants log in and join the chat session. They can use their laptops, netbooks, or mobile devices. Now everyone can submit questions, requests, and comments, and everyone can see each other’s during the lesson or lecture.

*Or any chat client on an elearning platform that has appropriate user management, privacy, and oversight facilities (e.g. most commercial chat services, such as Facebook and Google+, don’t allow right of audit, which is necessary for addressing ethical and behavioural issues), and that allows admins, teachers, TAs, and learners to access transcripts of previous sessions for learning and professional development (PD) purposes.

How does this affect the classroom dynamic?

  • All learners, even in a relatively large class, have the opportunity to participate in significant and meaningful ways.
  • Learners don’t have to raise their hands to interrupt the flow of the class just to have their question, request, or comment expressed and considered.
  • Less gregarious learners don’t have to compete for attention/get noticed and can therefore contribute their questions, requests, or comments more easily; everyone has an equal voice.
  • Learners can see their peers’ questions, requests, or comments whether they are addressed/focused on or not in the lesson.
  • Teachers/tutors can choose which questions, requests, and comments to address/focus on, in what order, and when.
  • Points raised by learners can be dealt with appropriately and in a timely manner and never “get lost in the moment.”
  • The transcript of the chat session is an invaluable record of what actually happened and when during the class, making it an excellent resource for critical reflection.
  • Teachers/tutors can review the transcript to see where the lesson could be improved and/or consider alternatives.
  • Teachers/tutors can see who’s participating more or less than they should be and find out why.
  • Teachers/tutors can assess learners based on their participation, both quantitatively and qualitatively, even if a contribution didn’t get addressed/focused on in class.
  • There’s a record of questions, requests, and comments that it may not have been appropriate to address/focus on during the lesson but could provide productive avenues of inquiry in subsequent classes.

Could it also get learners off Facebook during class?

Presentation on learner-centered (self-directed) learning

In the previous article, Am I a learner-centered or a teacher-led teacher?, I compared and contrasted learner-centered (self-directed) learning with teacher-led (teacher-directed) learning, outlining some of the differences between them regarding learning and teaching theory and practice. In order to further clarify the implications of learner-centered (self-directed) learning and teaching theory and practice, I’ve composed and uploaded a presentation (slideshow).

Click here to view the presentation (opens new tab/window)

I hope you find it interesting and useful!

Technical details about the presentation software

The presentation is hosted on my Moodle installation, using a resource module (plugin) that I’ve developed and am experimenting with. My Moodle Presentation module is an implementation of the free and open source JavaScript-based slideshow project Reveal.js by Hakim El Hattab.
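For readers who haven’t seen Reveal.js before, a Reveal.js slideshow is essentially an ordinary HTML page with nested <section> elements. Here’s a generic sketch of the standard Reveal.js pattern (not the exact markup my module generates; the file paths will vary depending on where Reveal.js is installed):

<link rel="stylesheet" href="reveal.js/css/reveal.css" />
<div class="reveal">
  <div class="slides">
    <section>First slide</section>
    <section>Second slide</section>
    <section>
      <h2>Slides are plain HTML</h2>
      <p>Text, images, lists, and links work as usual.</p>
    </section>
  </div>
</div>
<script src="reveal.js/js/reveal.js"></script>
<script>
  // Start the slideshow with the default settings
  Reveal.initialize();
</script>

Reveal.js then turns each <section> into a slide and handles the navigation, transitions, and keyboard controls.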

Please contact me if you have any difficulties in viewing the presentation.