Monthly Archives: October, 2014

Teaching and assessing problem solving skills – PISA recommendations

In 2014, PISA (the Programme for International Student Assessment), which operates under the OECD, released its evaluation of the problem-solving skills of 15-year-old students, based on the 2012 assessment round. The study defined problem solving, established an assessment regime, and compared results across the participating countries and economies. The full report is at http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-v.htm

I don’t propose to critique, or even attempt to summarize, the report, but as an educator I found some interesting points in it that I would like to highlight. I will skip through the report, pulling out what I see as the key ideas for classroom teachers and mixing in a few personal observations and ideas. For the full context and supporting data, refer back to the original report.

Importance of problem solving skills in the 21st Century

Across the participating countries it was found that a large majority of workers are expected to solve a simple non-routine problem (taking less than 30 minutes) about once a week. One in ten workers is confronted daily with harder problems. Complex problem-solving skills are in greater demand in the faster-growing managerial and professional occupations.

A suggested explanation is that automated systems increasingly deal with the routine problems, leaving workers to handle the unexpected or unfamiliar situations.

As a result, employment is trending away from routine work and towards non-routine analytical and relational skills.

As educators preparing students for the world, we need to consider these trends. Among other things, we need to equip our students with the skills to thrive in non-routine problem-solving situations.

What are problem-solving skills?

In order to teach and assess problem solving, it is necessary to define and understand the skills involved.

PISA defines problem-solving as:

…an individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately obvious. It includes the willingness to engage with such situations in order to achieve one’s potential as a constructive and reflective citizen.

(http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-v.htm pg. 30)

The key domains in the definition are identified as:

  1. Cognitive domain: The problem solver needs to engage with, understand and resolve the problem.
  2. Problem domain: The problem is non-routine, meaning that the goal cannot be achieved by merely applying a previously developed solution.
  3. Affective domain: The problem solver needs the willingness to tackle the problem. I would add that willingness also presupposes confidence and the ability to handle failure positively.

The problem-solving framework defined by PISA involves three main elements, each further divided into parameters or processes: the problem context, the nature of the problem situation (static or interactive), and the problem-solving processes themselves (exploring and understanding; representing and formulating; planning and executing; monitoring and reflecting).

Using this framework a problem can be categorized, and the problem-solving process (which is not necessarily linear) described and evaluated.

Levels of problem-solving competence

To evaluate the level of competence of a problem-solver there needs to be a progressive developmental framework. In previous posts I have discussed the progressions for evaluating collaborative problem-solving skills proposed by the ATC21S (Melbourne University) group and by 21CLD. I have summarized these into four levels relevant to secondary education (https://doncollegegrant.wordpress.com/2014/08/06/digital-badges-for-collaborative-problem-solving/).

PISA presents six progressive levels of development for problem solving:

Level 1

  • At Level 1, students can explore a problem scenario only in a limited way, but tend to do so only when they have encountered very similar situations before. Based on their observations of familiar scenarios, these students are able only to partially describe the behaviour of a simple, everyday device. In general, students at Level 1 can solve straightforward problems provided there is only a simple condition to be satisfied and there are only one or two steps to be performed to reach the goal. Level 1 students tend not to be able to plan ahead or set sub-goals.

Level 2

  • At Level 2, students can explore an unfamiliar problem scenario and understand a small part of it. They try, but only partially succeed, to understand and control digital devices with unfamiliar controls, such as home appliances and vending machines. Level 2 problem-solvers can test a simple hypothesis that is given to them and can solve a problem that has a single, specific constraint. They can plan and carry out one step at a time to achieve a sub-goal, and have some capacity to monitor overall progress towards a solution.

Level 3

  • At Level 3, students can handle information presented in several different formats. They can explore a problem scenario and infer simple relationships among its components. They can control simple digital devices, but have trouble with more complex devices. Problem-solvers at Level 3 can fully deal with one condition, for example, by generating several solutions and checking to see whether these satisfy the condition. When there are multiple conditions or inter-related features, they can hold one variable constant to see the effect of change on the other variables. They can devise and execute tests to confirm or refute a given hypothesis. They understand the need to plan ahead and monitor progress, and are able to try a different option if necessary.

Level 4

  • At Level 4, students can explore a moderately complex problem scenario in a focused way. They grasp the links among the components of the scenario that are required to solve the problem. They can control moderately complex digital devices, such as unfamiliar vending machines or home appliances, but they don’t always do so efficiently. These students can plan a few steps ahead and monitor the progress of their plans. They are usually able to adjust these plans or reformulate a goal in light of feedback. They can systematically try out different possibilities and check whether multiple conditions have been satisfied. They can form an hypothesis about why a system is malfunctioning, and describe how to test it.

Level 5

  • At Level 5, students can systematically explore a complex problem scenario to gain an understanding of how relevant information is structured. When faced with unfamiliar, moderately complex devices, such as vending machines or home appliances, they respond quickly to feedback in order to control the device. In order to reach a solution, Level 5 problem-solvers think ahead to find the best strategy that addresses all the given constraints. They can immediately adjust their plans or backtrack when they detect unexpected difficulties or when they make mistakes that take them off course.

Level 6

  • At Level 6, students can develop complete, coherent mental models of diverse problem scenarios, enabling them to solve complex problems efficiently. They can explore a scenario in a highly strategic manner to understand all information pertaining to the problem. The information may be presented in different formats, requiring interpretation and integration of related parts. When confronted with very complex devices, such as home appliances that work in an unusual or unexpected manner, they quickly learn how to control the devices to achieve a goal in an optimal way. Level 6 problem-solvers can set up general hypotheses about a system and thoroughly test them. They can follow a premise through to a logical conclusion or recognise when there is not enough information available to reach one. In order to reach a solution, these highly proficient problem-solvers can create complex, flexible, multi-step plans that they continually monitor during execution. Where necessary, they modify their strategies, taking all constraints into account, both explicit and implicit.

I believe that this progression is a sufficient basis for an assessment rubric, and for establishing zones of proximal development for students, which is the first step towards developing an educational process for teaching and assessing problem solving. Test questions have also been developed and can be viewed at http://www.oecd.org/pisa/test/.
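
To make this concrete, here is a minimal sketch (in Python) of how the six levels might be encoded as a simple rubric for recording a student's proficiency and their next developmental target. The one-line descriptors are my own abbreviations of the PISA descriptions, and the helper function is purely illustrative, not part of the PISA materials.

```python
# A minimal rubric sketch based on the six PISA proficiency levels.
# The descriptors are my own one-line abbreviations of the full PISA text.

PISA_LEVELS = {
    1: "Solves one- or two-step problems with a single simple condition; no planning ahead.",
    2: "Partially explores unfamiliar scenarios; tests a given hypothesis; plans one step at a time.",
    3: "Infers simple relationships; holds one variable constant; devises tests for a hypothesis.",
    4: "Plans a few steps ahead; adjusts plans on feedback; forms hypotheses about malfunctions.",
    5: "Explores systematically; plans against all given constraints; backtracks when needed.",
    6: "Builds coherent mental models; creates and monitors flexible multi-step plans.",
}

def record_assessment(student, observed_level, notes=""):
    """Return a simple assessment record, with the next level up as a
    zone-of-proximal-development target."""
    if observed_level not in PISA_LEVELS:
        raise ValueError("observed_level must be between 1 and 6")
    target = min(observed_level + 1, 6)
    return {
        "student": student,
        "level": observed_level,
        "descriptor": PISA_LEVELS[observed_level],
        "next_target": PISA_LEVELS[target],
        "notes": notes,
    }

print(record_assessment("Sample Student", 3))
```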

Recommendations

Analyzing the assessment results, PISA found that some countries were doing better than others at teaching problem-solving skills. On this basis they were able to make recommendations for improving education in this area. Their suggestions were aimed at educational policy, but I have adapted the following five points from their recommendations so that they can be implemented at the classroom level. I suggest you refer to the original document for a more detailed discussion.

Don’t teach solutions

In general, problem solving is taught by focusing on rule-based solutions, most obviously in mathematics education. Solving a problem is really a two-step process: the first step is formulating the problem from a messy real-world scenario; the second is applying the solution method. Once the solution path is established, the rest of the process can be automated, so the first step is the more valuable skill.

To help students develop skills in problem analysis and solution formulation, they need to be exposed to numerous real-world problems.

In the language of ATC21S, this means exposing students to real-world problem spaces so that they learn to develop, evaluate and select solutions. Presenting students with restricted problem spaces leading to defined solution paths does not develop effective problem-solving skills.

Teach for skill transfer by looking for connections

Problem-solving skills developed in one domain do not readily transfer to another domain. Teachers can assist this transfer by using diagrams and illustrations to highlight the similarity between strategies across domains, rather than the superficial differences of jargon or context.

In practice this might involve finding the similarities in the design process for a house and a prom dress, or between calculating loads on roof trusses and optimum tacking angles for a yacht. Each pair of problems seems superficially different, but the problem spaces have much in common.

Skills are best developed in meaningful contexts

People are less likely to transfer isolated pieces of knowledge than they are to transfer parts of well-integrated hierarchical knowledge structures. The more connections a learner sees between the learning environment and the outside world, the easier the transfer will be.

(http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-v.htm pg. 121)

Teachers need to be prepared to look at the real world, particularly the world that students live in. I have always been aware that this helps with the affective domain in problem solving (willingness and persistence), but the evidence shows that it also assists with the cognitive domain by aiding skill transfer across contexts.

Encourage metacognition

Students need to be encouraged to think about how they are thinking about a problem. Self-awareness throughout the process is extremely powerful in developing problem-solving skills.

This can be encouraged through “thinking aloud” sequences. Solving problems in a collaborative setting also encourages it, particularly if the communication is managed. For example, if students collaborate through a network chat session, they are pushed to communicate their thinking explicitly to one another, and that communication is recorded for later analysis and discussion. ATC21S made use of this strategy in their work. I have also explored it using Etherpad.

Teachers also need to be courageous enough to model this behavior for their students. Because of their familiarity with the subject, teachers tend to model problem solving as a routine activity; they generally go into class already knowing how to solve all the problems. It is not a bad idea to occasionally attack a problem that the teacher does not know how to solve.

Utilize the visual arts

The visual arts are often devalued by teachers, particularly teachers of core disciplines like maths and literacy, as a place where real problem solving does not happen. Yet the visual arts can be a powerful vehicle for developing problem-solving skills. On a superficial level students are learning skills and techniques, but on a deeper level participating in the visual arts involves:

  • Envisioning: Students are asked to envision what they cannot observe directly.
  • Observing: The skill of careful observation is taught.
  • Reflecting: Teachers often encourage reflection by asking open-ended questions about the work. Students are therefore encouraged to develop metacognitive awareness of their work.
  • Engaging and persisting: Students tackle projects which engage them, and they need to persist through frustration as they refine and develop their skill with the medium.
  • Stretching and exploring: Students stretch themselves and take risks in producing their work.

So the visual arts are a powerful context in which to teach the basic problem-solving tools.

Conclusion

This post constitutes a summary of what I learned from the PISA report. It has given me a lot to think about as I evaluate my teaching practice. 21st-century skills like problem solving are the key to future success for our students. Teaching them effectively is as challenging as it is worthwhile, and the work of PISA, ATC21S and 21CLD is showing the way for classroom teachers like yours truly.

 

 


Badge Taxonomy – Badge Alliance consultation overview

Let me start by making it clear that I have not personally been involved with the Badge Alliance working groups. As an impartial observer I am free to congratulate them on the progress they have made. The report on the first cycle of consultation is at http://www.badgealliance.org/blog/celebrating-our-successes-an-overview-of-cycle-1/

Having posted in the past on badge taxonomy, I was particularly interested in their comments on that subject. Under Badges for Educators & Professional Development, the working group made the following recommendation:

[To]…Loosely standardize a set of badge types. This would not dictate the content of the badge (i.e. assessment), but the class of badges or the general type of activity/assessment it represents. This would help educators, administrators and employers more easily anticipate the value or weight of various badges, and ensure some commonalities in experience across different badge systems.

Potential examples:

  • Participation/Attendance – a badge for attending a conference or seminar, participating in an online community or event. No assessment other than proof of attendance/participation.
  • Skill – a badge representing a distinct skill. Assessment is tied to demonstration of that skill.
  • Achievement – a badge representing a completed set of activities or a set of skills. May have a number of skill badges that ‘stack’ to an achievement badge, or unlock access to it. Assessment involves demonstration of the sub-activities or skills, and perhaps some expression of the cumulative learning.
  • Specialty – a badge representing an interest area, area of training or skill set. Assessment is most likely tied to demonstration of the sub-skills, but also includes evidence from educator’s own experiences and approaches.
  • Peer/Social – a badge representing qualities or skills, awarded peer-to-peer. Assessment is peer review/recognition.
  • Community – a badge representing behaviors, values and roles within a particular community. Badges are defined and issued by members of that community to reflect values, behaviors and roles that are important to them.

These categories are more refined and more expansive than my initial definitions. I do have a couple of comments, though.

In my taxonomy I used the term “mission” badges where the Badge Alliance has referred to “achievement” badges. I still prefer the term mission badge, as it speaks to me more of engaging in a process or journey to earn the badge. To me an achievement is more suggestive of a ‘one-off’, and I reserved that term for badges that mark specific achievements, such as breaking a sports record. On the other hand, these are just names; would a rose by any other name smell any less sweet?

I like the idea of the community badge, participation badge and the specialty badge.

Overall I congratulate the Badge Alliance on their work, and I encourage you to read through all the recommendations.

Flipping classes using Edpuzzle

This year I have been reading about flipped classes. I must say that I am amused by this term. Here in Australia “flipping” is used as a softer version of another less socially acceptable “F” word, as in “He is a flipping idiot!”. As a result, talk about flipping maths classes might generate some smirks or raised eyebrows amongst my peers.

For the uninitiated, this is what flipping a class really means. Typical (teacher-centered) teaching uses class time to teach new material and homework to practice and consolidate it. In a flipped class, students cover the new material in their own time before class, and class time is spent on practice and consolidation. Flipping is made possible by the availability of online multimedia resources. The advantages are that the teacher can spend less time teaching from the front and more time providing individual help: more able students can move ahead more quickly, while less able students have improved access to the teacher.

This is a great idea, but so far I haven’t done much with it. My resistance comes down to two main points. Firstly, it takes significant time to prepare or find suitable resources. Secondly, I don’t really trust my students to do the pre-class work: if I turn up to class and discover half of them didn’t bother with the preparatory work, I need to rewrite my lesson plan on the fly.

Recently I discovered a tool that makes flipping more convenient. EDpuzzle allows me to take clips from any online source (or one of my own) and annotate them at particular points with comments or questions. I can create classes within EDpuzzle, and it records my students’ responses, so I can tell before class who has watched the video and how well they understand the content.

There are other services that do the same sort of thing, such as Blubbr, but I found EDpuzzle very easy and flexible to use. I sat down and was immediately able to work up this simple five-minute flipped lesson presentation (https://edpuzzle.com/media/54338e38ae72d0930a6fd241). There are a lot of other examples on the EDpuzzle site. In fact, all work is available to all users of the site, so a teacher can use the search function to find suitable presentations prepared by other teachers.

So the advantages of flipping lessons with EDpuzzle are:

  1. You arrive at class knowing who already has a good grasp of the topic and who needs extra help.
  2. You spend less time speaking from the front and more time on individual assistance.
  3. Students who miss classes can catch up more easily.
  4. It makes viewing clips fully interactive rather than passive.
  5. You don’t have to do the marking.
  6. The data collected by EDpuzzle can inform your reporting on things like participation and effort.
  7. EDpuzzle presentations can be embedded in a VLE and reused by classes over and over.

I now have no more excuses and I will be starting to flip more of my lessons in the new school year.

Badge Taxonomy – further thoughts

In a post earlier this year I voiced some thoughts on a taxonomy for open badges. In that post I proposed that open badges need some categorization to maintain their utility and integrity. To this end I suggested that badges be classified according to genera and species (to steal terms from biology and maintain the taxonomic feel). I proposed dividing badges into those based on a measurable competency and those not.

Within the non-competency badges we have two species:

  • Encouragement badges are awarded like good work stamps to encourage (mainly) young learners.
  • Social badges are used like friendship cards, or for fun.

Competency-based badges divide into three species:

  • Achievement badges are issued to credential the demonstration of a specific skill or achievement. An achievement badge might be issued for running 100m in 10 seconds, for being elected class captain, and so on. The achievement is defined in the badge and evidence attached.
  • Skill badges are issued to credential expertise in an area. They include a series of criteria that need to be met. For example, they might be issued to staff who demonstrate effective integration of an ICT package into their teaching. Skill badges differ from achievement badges in that they have more complex criteria and do not apply to a single achievement or event.
  • Mission badges are used where a person (usually a student) has embarked on a series of activities with the aim of achieving a badge. These missions are often cross-curricular and involve the development of a skill followed by a culminating achievement. Mission badges occupy the area between skill and achievement badges; not surprisingly, a mission badge might be issued as the culmination of a group of related skill and achievement badges.
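
To illustrate how this taxonomy might be put to work, here is a minimal sketch in Python. The field names and the example badge are my own invention; nothing here is part of any badge standard.

```python
# An illustrative sketch (my own field names, not part of any badge standard)
# of how the proposed genus/species taxonomy could be recorded for a badge.

GENUS_OF = {
    # non-competency badges
    "encouragement": "non-competency",
    "social": "non-competency",
    # competency-based badges
    "achievement": "competency",
    "skill": "competency",
    "mission": "competency",
}

def classify(badge_name, species):
    """Attach the taxonomic classification to a simple badge record."""
    if species not in GENUS_OF:
        raise ValueError(f"Unknown species: {species}")
    return {"name": badge_name, "species": species, "genus": GENUS_OF[species]}

print(classify("Effective ICT Integration", "skill"))
# {'name': 'Effective ICT Integration', 'species': 'skill', 'genus': 'competency'}
```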

Flavio Escribano has since extended this concept of badge taxonomy, incorporating my nascent thoughts and those of Charla Long at Lipscomb University, and developed a more robust classification for badges.

Long describes an excellent badge system implemented at Lipscomb University. They have 7 badge categories, further divided into 41 competencies, each measured at 4 levels of achievement, giving rise to 41 × 4 = 164 badges. The 41 competencies are identified workforce skills, and my impression is that the badge ecosystem at Lipscomb is designed to map college education to the requirements of employers with more granularity than traditional credentials.
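
To make the scale of that matrix concrete, here is a short sketch that generates it. The competency and level names are invented placeholders; I am not reproducing Lipscomb’s actual lists.

```python
# Sketch: generating a badge matrix from competencies and achievement levels.
# The competency and level names are invented placeholders, not Lipscomb's.
from itertools import product

competencies = [f"Competency {i:02d}" for i in range(1, 42)]  # 41 competencies
levels = ["Novice", "Developing", "Proficient", "Advanced"]   # 4 levels

badges = [f"{c} ({lvl})" for c, lvl in product(competencies, levels)]

print(len(badges))  # 164 badges: 41 competencies x 4 levels
print(badges[:3])
```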

Escribano introduces the ideas of BadgeRank and BadgeScore. The BadgeRank is a number representing the rank of the badge, derived from the rank of the institution, the position of the badge in the institutional ecosystem, the teacher, and so on. The BadgeScore is based on the BadgeRank but also takes into account the context in which the badge will be used: it considers things such as the relevance of a badge to an employer, or to the desired career path of the earner.
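
As I understand the distinction, it could be sketched roughly like this. The weights and factors below are entirely my own invention, chosen to illustrate the idea; they are not Escribano’s actual formulation.

```python
# A rough, invented illustration of the BadgeRank/BadgeScore distinction.
# The weights and inputs are arbitrary, not Escribano's actual formulation.

def badge_rank(institution_rank, ecosystem_position, teacher_rank):
    """Context-independent rank of a badge (all inputs normalized to 0-1)."""
    return 0.5 * institution_rank + 0.3 * ecosystem_position + 0.2 * teacher_rank

def badge_score(rank, context_relevance):
    """Context-dependent score: the rank weighted by how relevant the badge
    is to a particular employer or career path (0-1)."""
    return rank * context_relevance

rank = badge_rank(institution_rank=0.9, ecosystem_position=0.7, teacher_rank=0.8)
print(badge_score(rank, context_relevance=0.6))
```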

Extending and generalizing the system at Lipscomb, Escribano proposes that badges be categorized according to fields, competencies and categories. The proposed system also provides for badges that credential a mix of these parameters to varying degrees.

So, in summary, a huge amount of work has gone into better ways to categorize digital badges and to improve their robustness.

My thoughts

As you will gather from the title, my objective with this post is not simply to provide an overview of developments in badge taxonomy, but to document how my thinking has developed in response to this work.

I will start by saying that I am very impressed by the work of Long, Escribano and others. It is not my intention to present a critique of their work. These are merely my thoughts in response.

Taxonomy vs. Ecology

In my reading there seems to be some confusion around these terms. I will go out on a limb and say that they are not synonymous; I would define them as follows:

Taxonomy

A way of classifying different types of badges into groups. These groups are broad and refer to the general characteristics of the badges. I have suggested taxonomic groups such as skill badges, achievement badges, and so on. Taxonomic classification would be a property of all badges.

Ecology

A way of defining the interrelationships between badges: how they span the curriculum/competencies and the levels of competence from novice to mastery. In biology, ecology only has meaning in the context of an ecosystem. Similarly, a badge ecology only makes sense within an institution or educational system.

The systems proposed by Long and Escribano are very good, but I think that much of what they describe is ecology rather than taxonomy. As such they would be difficult to apply to the secondary school curriculum I follow.

Having said that, I believe that badge ecology is a much more interesting problem than taxonomy. Sample ecologies need to be developed and shared so that institutions can easily develop their own robust ecology.

BadgeRank and BadgeScore

This is a powerful idea. I have written previously about the need to make badges comparable, and this is the first attempt I have seen to quantify badges for comparison across institutions. Having said that, it seems to me that the BadgeRank quantifies information which is largely already in the metadata. When someone presents a badge, I can look at the issuing institution, the competencies and the level of competence in the metadata and make a good assessment of the value of the badge (essentially its BadgeScore). If I am instead presented with a bare number, I will not know what to make of it without a lot of interpretive documentation anyway.
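
For illustration, this is the sort of metadata I mean. The sketch below is loosely modelled on the Open Badges BadgeClass structure, but it is simplified and the URLs are hypothetical; it is not a complete or authoritative rendering of the specification.

```python
# Simplified, indicative badge metadata (loosely modelled on the Open Badges
# BadgeClass structure; field selection is illustrative, URLs hypothetical).
badge_class = {
    "name": "Collaborative Problem-Solving - Level 3",
    "description": "Awarded for demonstrated collaborative problem-solving.",
    "criteria": "https://example.edu/badges/cps-level-3/criteria",
    "issuer": "https://example.edu/issuer",
    "tags": ["problem-solving", "collaboration", "level-3"],
}
# A human reader can weigh the issuer, the criteria and the level directly
# from these fields, which is most of what a BadgeRank would encode.
```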

As a secondary school teacher I am not going to be very interested in the BadgeRank and BadgeScore, but colleges and universities are much more preoccupied with these things.

Leveling

I have written before that achievement badges are of limited use educationally. Achievement badges allow students to mark milestones, but they do not support the continued development of skills. For students to recognize achievement and also be guided forward through the stages of skill development, a series of levelled skill badges needs to be built into the badge ecology.

Finally

I am excited by the amount of thought going into open badge development. Some powerful and sophisticated ideas are coming forward, particularly in relation to badge ecology. The work I have discussed here typifies that.

My final observation is that most of this work, for various reasons, is being led by colleges and universities. Other institutions are deploying badges, but usually (based on my reading) achievement badges with little or no ecological context. Badge ecology is vital, and it would be a pity to see its growth and development dominated by higher education institutions.