#H800 Virtual Worlds and Identity

Second Life is one of the topics in week 25 of H800, with a particular focus on how people engage with an ‘online identity’, given shape by their physical manifestation online: their avatar.

A key element in the discussion on virtual worlds is the concept of identity or, better, multiple identities.
The issue of multiple identities reminds me of the identification concept of Wenger (1998).  Wenger considers identity as a social construct, formed by our participation (and non-participation) in a wide range of communities and our influence in those communities (our ability to negotiate meaning, as he calls it).  In this regard, participation in Second Life adds another identity component.  We have one identity (the ‘self’) according to Wenger, but it is composed of a range of ‘identifications’ in communities.  This resonates with the following excerpts:
What is changing is not the self, but the ability we have to explore different manifestations of this self. ‘In conventional terms of reasoning, post-modern identity can be considered schizophrenic; however, it should not be looked upon as pathology but as a virtue’ (Adrian, 2008, p.369, in Peachey, 2009, p.4).
The role of groups in shaping ‘real life’ identities is implicit, as is the multiplicity of ‘real life’ identity. What is interesting and new about virtual worlds is that they make this group-shaping explicit and multiplicity of identity actionable. (Adrian, 2008, p.368, in Peachey, 2009, p.14)
Peachey (2009) describes how people construct their avatar very carefully, spending a lot of time choosing a name and physical appearance.  First impressions matter in virtual realities too, perhaps even more so.  For some, identification with their avatar can become very strong and spill over into their real-life identity, blurring the boundaries between the two.  For example, people create Facebook profiles under their avatar name or adapt their hairstyle to match their avatar’s looks (Peachey, 2009).  Experiences with an avatar may result in increased confidence in real life or, conversely, in real emotional pain.  Such a strong identification with one’s online identity makes me wonder about a person’s satisfaction with his or her real-world identity.
The notion of extending avatar identity into the real world inevitably raises the possibility of overidentification and withdrawal from essential realities (usually where contributing factors in the real world might predispose such withdrawal). (Peachey, 2009, p.13)

For some students, balancing ‘real life’ and ‘virtual’ identities may create tensions, as they fear that their online identity ‘takes over’ their real identity.

What comes through most strongly from the accounts … is that there is a tension in students’ narratives between the ideal of an embodied, authentic, anchoring self, the self that goes along to tutorial classes on Tuesday afternoon, and the possibility of other, deviant, less authentic selves which emerge online and which threaten the anchoring subject with the possibility of their autonomy (Bayne, 2005).

I found the discussion on people with disabilities interesting.  For this group, a virtual world such as Second Life may provide an appearance that doesn’t trigger the instinctive reaction of ‘patronising convention’.  However, some still choose an avatar with physical disabilities.
But what is the potential for education?  The author sees a lot of potential for collaborative learning.  I can see that you can re-create some of the visual cues that are absent in regular online interaction.  I can also imagine that community bonding could be stronger in a Second Life environment than in a regular forum or e-mail discussion.  On the other hand, it is another social network to engage in, another place to log in and check updates.  I’m not sure whether the added value of Second Life weighs up against this reluctance to engage in yet another network.
Within H800 it could be an interesting experiment to move some of the tutorials from Elluminate to Second Life.  A disadvantage of Elluminate is the lack of visual cues, which makes it difficult to engage in a multi-person informal discussion.  Second Life may provide a richer and more intuitive environment for discussions.  However, the considerable time investment needed to get started in Second Life would require using it for at least a few weeks to make the effort worthwhile.

#H800 Reification and Participation in Wenger’s Communities of Practice

Communities of Practice is a powerful and highly influential concept, developed by Lave and Wenger (1991) and later refined by Wenger (1998).  Rooted in sociocultural approaches to learning, it may seem abstract and challenging but, once understood, it yields a better conceptual understanding of learning and of design for learning.
Identity is socially constructed through participation (and non-participation) in a range of communities.  These can be communities at work, at school, in the family or nationwide.  Identity is also determined by the way we ‘negotiate meaning’ in those communities; in other words, by how we influence the activities inside them.
Negotiability refers to the ability, facility and legitimacy to contribute to, take responsibility for and shape the meanings that matter within a social configuration.
Wenger identifies two ways of influencing a community: participation and reification.  Participation is the direct interaction between members of a community.  Reification is the creation and use of artefacts such as lesson plans, guidelines or a curriculum to shape or affect others’ behaviour.  In learning design an optimal combination of the two is necessary to achieve learning outcomes.
The balance between participation and reification is a key duality in communities of practice.  Wenger refers to the ‘double-edged sword of reification’.  Too much focus on reification may stifle creativity and hurt group dynamics.  However, reified artefacts such as guidelines, ‘ways of doing things’ or a specific vocabulary provide an anchor point for novice members and help create a community identity.

In a workshop for teacher trainers, a sole focus on participation may mean that participants find it difficult to ‘transcend’ their practice, cross boundaries and develop new conceptual understanding.  Too much focus on reification, for example by imposing a rigid lesson plan template, may create alienation and stifle creativity among participants, who feel they don’t have any impact on the design process.  It lowers the status of the participants, who are likely to comply formally without taking any ownership.

#H800 Communities of Practice, Networked Individualism, Network Dynamics and Mycorrhizal Networks

Communities of Practice

The concept of Communities of Practice (COP) was introduced by Lave and Wenger (1991) to highlight and conceptualise forms of learning that focus on participation and are not limited to the individual level (that don’t stop ‘at the skin of the individual’).  Legitimate peripheral participation is the conceptualisation of the apprentice learning a craft with a master, gradually gaining mastery in the subject.  Learning is seen as an inevitable aspect of all productive practices, not limited to a formal setting within schools or institutions.  The communities in the COP can be characterised by:

  • A clearly bounded structure, with rules for membership
  • A single centre of supreme skill and mastery
  • A hierarchical structure where new members (apprentices) start at the ‘periphery’ and gradually move toward the centre of the community as they obtain mastery, also called a ‘centripetal’ structure


However, recent work by Engeström (2007) challenges Wenger’s analysis by pointing out oppression by dominant figures in communities and rebellion by apprentices.  He argues that learning is neither smooth nor necessarily centripetal; instead, the movement of learning is shaped by its participants, with those at the hub of the learning experience exerting the greatest influence.
Jones (2004) recalls the successful application of network dynamics in domains like transport and politics.  He argues that mathematical network principles such as power laws, clustering and the strength of weak ties can be applied to learning networks.  Networks can be identified from the level of societies and institutions down to individuals and ideas.  Individuals can be nodes in a network, engaged in linking to other persons, agents and resources; alternatively, ideas can be considered as nodes, with the associations between them forming the links of the network.  His ideas remind me of Connectivism – discussed in an earlier blog post – which claims to be a learning theory for the internet age.  Connectivism describes learning as building networks, not only of physical persons, but of associated ideas.
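Jones’s point about power laws can be sketched with a toy example (the names and links below are entirely hypothetical): in a ‘scale-free’ learning network, counting each node’s connections reveals a few hubs that concentrate most of the links, while most nodes have only one or two.

```python
from collections import Counter

# A hypothetical learning network: each person links to the peers,
# blogs and ideas they draw on (nodes can be people or ideas).
links = {
    "alice": ["bob", "clara", "blog:siemens", "idea:connectivism"],
    "bob": ["alice", "clara"],
    "clara": ["alice", "bob", "dave", "blog:siemens"],
    "dave": ["clara", "idea:connectivism"],
    "blog:siemens": [],
    "idea:connectivism": [],
}

# Degree = number of connections per node, counting both directions.
degree = Counter()
for node, neighbours in links.items():
    degree[node] += len(neighbours)
    for n in neighbours:
        degree[n] += 1

# In a power-law network, a few hubs dominate the degree ranking.
for node, d in degree.most_common():
    print(node, d)
```

Even in this tiny network the hub (clara) has more than three times the degree of the most peripheral node, which is the pattern power-law distributions describe at scale.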
Engeström considers COP to be ahistorical structures (unlike activity systems in Activity Theory), meaning they are not connected with the varying dominant organisational structures in a society.  He argues that COP are mainly useful in a society dominated by craft and industrial production.  Craft is characterised by a master-apprentice relationship; industrial production is characterised by teams, small groups with well-defined membership whose members have complementary competences and work together on certain tasks.  Gradually, however, a ‘co-configuration’ model is evolving in which companies become more receptive to input from customers and other outside sources (the co-creation of value described by Prahalad and Ramaswamy, 2004).  Adler and Heckscher (2006, cited in Engeström, 2007) call the organisational structure of the post-industrial society a ‘collaborative community’.
A collaborative community differs from a team in a number of aspects:
  • Boundaries become more fluid;
  • Very high level of division of labour and diversity of knowledge;
  • Authority based on knowledge and expertise instead of status;
  • Values are orienting and motivating elements for members of the community.
  • Increased importance of motivation follows from looser membership rules.
Engeström points out that such a collaborative community is no longer a team, but a network.
Social production and the mycorrhizae analogy
For Engeström the organisational structure of the information society is social production.  The flagship example of social production is the open source movement.  Thousands of developers and bug fixers work together in a loosely structured network without a clear centre (although a success factor of the Linux and Moodle open source networks is arguably the role of their respective leaders), but with a compelling common goal, a so-called ‘runaway object’.  Examples are global warming and a free operating system.
Engeström develops an analogy with a mycorrhizal system: a symbiotic relationship between a fungus and the roots or rhizoids of a plant.  The fungus delivers nutrients and water to the plant, which returns the favour by providing the fungus with energy-rich sugars.
The analogy plays out as follows:
  • An underground network of roots with interspersed nodes resembles a loosely connected and expanding network of individuals, exchanging information and holding multiple loyalties.
  • Visible fungi resemble the visible, erect, bounded and institutionalised structures that emerge from this network.  Engeström uses the term ‘wildfires’ to describe sudden bursts of activity, in the same or a different place, with long dormant periods in between.
  • There is no hierarchical structure in mycorrhizae, as there is in the root system of plants.
  • Mycorrhizae work in symbiosis with plants, as informal learning networks can work in symbiosis with formal structures.
  • Mycorrhizae are hard to kill, but also vulnerable due to the lack of clear leadership.
Networked individualism
The model seems very suitable for a mobile generation of learners, who easily form temporary networks dominated by weak ties.  Castells et al. (2004) call this ‘networked individualism’, claiming that the individual, not the mobile device, is the defining element of the mobile generation, since people experience a much higher degree of freedom.

Final remarks

These post-industrial modes of organisation seem fit to explain learning activity in a Web 2.0 environment.  Students are developing personal learning environments (PLEs) as a complement to the institution’s learning management system (LMS), in which they communicate not only with peers, but with a wider network of friends, alumni, bloggers, staff and domain experts.  Twitter seems the ultimate tool to facilitate these weak-tie-dominated networks, much more so than Facebook, which has a more personal and bounded network structure.  George Siemens made an interesting analysis of how Google+ mixes weak-tie and strong-tie networks.
Traditional classroom practice can be described as a community of practice, with a clear master (the teacher), a bounded membership structure (the classroom) and a clear, externally defined objective (the curriculum and assessment).  Current trends in (higher) education towards a more student-centred approach (greater responsibility for the student to create a learning network, more flexibility in courses and more fluid boundaries between formal and informal learning) seem to fit a network approach and the mycorrhizae analogy.

Main references

Engeström, Y. (2007) ‘From communities of practice to mycorrhizae’ in Hughes, J., Jewson, N. and Unwin, L. (eds) Communities of Practice: Critical Perspectives, London, Routledge.

Jones, C. (2004) ‘Networks and learning: communities, practices and the metaphor of networks’, Association for Learning Technology Journal, vol.12, no.1, pp.81–93

Lave, J. and Wenger, E. (1991) Situated learning: Legitimate peripheral participation, Cambridge university press.

Castells, M. et al. (2004) ‘The Mobile Communication Society: A Cross-cultural Analysis of Available Evidence on the Social Uses of Wireless Communication Technology’, research report prepared for the International Workshop on Wireless Communication Policies and Prospects: A Global Perspective, Annenberg School for Communication, University of Southern California, Los Angeles, 2004.

#H800 Alternative Representations of PLE

I reconstructed my PLE, reordering the tools according to their main function. When I couldn’t decide, I put the tool twice in the scheme.
If you use a wide definition of ‘technology’, reading articles on paper and jotting down notes in the margins might still be my most frequently used technology.  However, the difference between using and not using a particular tool often lies in the details.  Since my discovery of Instapaper, for example, my e-reader use has increased sharply.

Instapaper is a tool that lets you store interesting web pages such as articles and blog posts for later reading.  All selected texts are nicely ordered and can be exported to the e-reader friendly epub format.

Another way to represent a PLE is to place the tools along three axes (based on Conole, 2008):

– individual vs social
– information vs experience
– active vs passive (like in immersive language learning)

This makes the relation between the use of technology, notably Web 2.0, and pedagogy more explicit.

#H800 Technology’s broader impact on education

Technology arguably has an impact on education which extends beyond the teacher and the learner. The broader impact of technology on education has been described in various reports. In week 22, we had a look at the Horizon Report (2011 edition) from the New Media Consortium, the National Science Foundation (NSF) Cyberlearning Report (2008) and the Becta Report (Harnessing Technology: Next Generation 2008-14). I compared the key elements of these reports with the Cambodia Master Plan for ICT in Education 2009-2013 (Third Draft version, 2009).

All reports consider technology as a major contributor to improving education and to meeting upcoming challenges.  In Cambodia challenges centre on dealing with an increasing influx of students as a result of demographics, reduced drop-outs and increased enrolments. In the US and the UK, they focus on globalisation and competition in a global workforce.

Both sides recognise the importance of aligning education with digital literacy and the employability requirements of graduates.  They regard technology in education as a prerequisite for improving the quality of education.

While reforming the public school system is well beyond the scope of our present task force, positive effects on schooling would certainly result from invigorating and inspiring learners through the rich new environments made possible by the Internet and developments in cyberinfrastructure. (NSF, p.12)

One might say the reports suffer from a light to severe form of ‘techno-utopianism’, regarding technology as the solution to all educational ills.  In this over-reliance on the role of technology in education, there is a lot of writing about ‘potential’ and ‘best practices’, but very little in the way of actual impact in real-life contexts.  There is hardly any discussion of the impact on students, teachers (apart from increasing training needs) and institutions.  The Cambodian plan announces the introduction of e-learning and a national Open University to reach remote areas and non-traditional students, but doesn’t mention the deeply ingrained suspicion of e-learning and less teacher-dependent forms of education among students and the wider population.  The NSF report considers all learners to be ICT-proficient ‘Millennials’ and glosses over indications that a deep digital divide persists, also in developed countries.

Ensuring effective use of technology in education requires that content and pedagogical knowledge are sufficiently developed as well.  Merely focusing on technology provision leads to teachers letting students copy their notes into a Word document instead of a notebook, or reading from a PowerPoint presentation instead of from a book.  The Cambodian Master Plan contains no strategy for how teachers will be trained, from basic skills up to efficiently integrating technology, content and pedagogy, through an intensive in-service programme.  Lack of content knowledge often leads to a teacher-centred approach with little room for experimentation, discussion and exploration, since these activities may expose the teacher’s alleged weaknesses.

There are some striking differences between the Cambodian Master Plan and the other plans, though.  The NSF report stresses the need for ‘cyberlearning’: learning mediated by networked computing and communications technologies.  Students could experience more authentic learning by using large amounts of data, such as climate data sets or databases generated by sensors.  They could learn to find meaningful patterns in the data sets, tweak parameters and experiment with ways of representing the patterns.  These kinds of activities require multidisciplinary approaches and networking between educators and scientists.  The NSF report also refers to the relevance of the ‘long tail’ in markets for education, which allows catering for students’ individual interests.
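A minimal sketch of the kind of pattern-finding activity the report envisages (the readings below are invented, and the window and threshold are exactly the parameters students would be tweaking):

```python
import statistics

# A toy 'climate data set': hypothetical monthly temperature readings.
readings = [24.1, 24.3, 23.9, 24.8, 25.0, 24.6, 27.9, 24.4, 24.7, 24.2]

# Smooth with a short moving average, then flag readings that sit far
# from the smoothed trend. Changing window or threshold changes which
# patterns emerge -- the parameter-tweaking the NSF report describes.
window, threshold = 3, 2.0
smoothed = [
    statistics.mean(readings[max(0, i - window + 1): i + 1])
    for i in range(len(readings))
]
anomalies = [
    i for i, (raw, smooth) in enumerate(zip(readings, smoothed))
    if abs(raw - smooth) > threshold
]
print(anomalies)  # indices of readings that deviate from the trend
```

With these values only the spike at index 6 (27.9) stands out; lowering the threshold or widening the window makes the ‘pattern’ stricter or looser, which is precisely the kind of exploratory reasoning such data activities are meant to teach.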

The global scope of networked educational materials, combined with “recommendation engine” software, helps individuals find special, niche content that appeals to their needs and interests. (NSF, p.16)

The Cambodian ICT Master Plan on the other hand stresses the alignment with the country’s curriculum. Software, textbooks and all ICT resources are to be fully in Khmer and explicitly approved by the Ministry of Education. The focus is more on an ‘isolated’ desktop model of ICT where learners access carefully selected resources, in contrast with a ‘web 2.0’ approach to technology-enhanced learning, let alone stimulating them to create Personal Learning Environments (PLEs).

All plans are surprisingly vague in pointing out evidence or strategies for how technology may improve learning outcomes.  Monitoring is focused on measuring outputs, whereas improved learning outcomes as a result of technology seem to be taken for granted.  The NSF and Becta reports argue that better data collection can allow for better individual tracking of learning progress and better information for parents and educators.
Technology has an important role to play in ensuring greater participation by parents in children’s learning. It enables schools and colleges to report to parents on their child’s progress as it develops, rather than at a few fixed points in the year, and offers parents new opportunities to engage in dialogue with providers. Patterns in achievement, attendance and behaviour become easier to identify, so teachers and other professionals can make more timely and effective interventions (Becta report, p.13).

Finally, the Cambodian plan is heavily focused on infrastructure, which seems logical given the low penetration of ICT equipment and internet connectivity.  There is very little mention of mobile technologies, however, although these may offer a better chance of getting large numbers of students connected to the Web than desktops in computer classes.  The NSF and Horizon reports put heavy emphasis on these mobile devices, including not only smartphones but also tablets and e-readers.


Becta (2008) Harnessing Technology: Next Generation Learning 2008–14, Becta report on behalf of the Department for Children, Schools and Families and the Department for Innovation, Universities and Skill, Coventry, Becta; 

National Science Foundation (NSF) (2008) Fostering Learning in the Networked World: The Cyberlearning Opportunity and Challenge, A 21st century agenda for the National Science Foundation, report of the NSF Task Force on Cyberlearning, Arlington, VA, National Science Foundation

New Media Consortium (NMC) (2009) Horizon Report, report from the New Media Consortium and the Educause Learning Initiative, Austin, TX, The New Media Consortium

Ministry of Education, Youth and Sports (MoEYS) (2010) Master Plan for Information and Communication Technology in Education 2009–2013, available online at http://www.moeys.gov.kh/ict_master_plan_2009-2013.php

#H800 Horizon Report 2011

The 2011 Horizon Report from the New Media Consortium (NMC) and Educause provides some clear indicators of key trends and drivers for technology in education.  As ever, the report outlines key trends, critical drivers, and short- and long-term forecasts.

Each year, the Horizon Report describes six areas of emerging technology that will have significant impact on higher education and creative expression over the next one to five years. The areas of emerging technology cited for 2011 are:

Time to adoption: One Year or Less
  • Electronic Books
  • Mobiles

Time to adoption: Two to Three Years
  • Augmented Reality
  • Game-based Learning

Time to adoption: Four to Five Years
  • Gesture-based Computing
  • Learning Analytics

There is an extensive collection of resources, although there seems to be very little in the way of critical reflection on the role of technology in education, its relation with content and pedagogy etc.

The fact that the NMC was founded in 1993 by a group of hardware manufacturers, software developers, and publishers might have something to do with that. Current corporate partners include Adobe Systems, Apple Inc. and Pearson College.

The report is great if you want a short description of the key characteristics and potential for education of the upcoming technologies, as long as you don’t expect a balanced account, supported by research data.

#H800 My Personal Learning Environment (PLE) and the OU’s VLE


Starting from the different digital tools I’m using, I created a mindmap of my own Personal Learning Environment (PLE).  I tried out different designs, but found that a separation by ‘target group’ or ‘motivation’ wasn’t helpful.  Some tools can be neatly classified, but others, such as Google Reader, Diigo, Calibre and Twitter, I use interchangeably for learning, work and leisure, in line with the concepts of ‘lifelong’ and ‘lifewide’ learning.

My PLE …anno 2011

Creating my PLE made me realise how many tools I’m using.  There are around 20 tools in my PLE, not counting some obvious ones such as Google (for searching), Google Mail and Wikipedia.  Quite a few are Web 2.0 tools.  Tools such as Picasa (for photos), YouTube (for videos) and SlideShare and Prezi (for presentations) are convenient for storage and let you share stuff in the same breath.  Far more people access my presentations through SlideShare than view them at a workshop or conference.  Using these tools for storing (and sharing) instead of my desktop computer has strongly expanded my PLE in a short time.
Studies, both formal with the OU (H800) and informal through participating in MOOCs (LAK11), have expanded my PLE as well, bringing me into contact with tools such as Diigo, Netvibes (ditched again in the meantime) and Elluminate.
Mobile 2.0 tools are rather absent from my PLE, with Calibre (for e-book management) as an exception.  I’m a reluctant user of iTunes – I like Apple devices, but hate being locked in by dreadful programmes such as iTunes – and I receive tweets as SMS on my cell phone, a great service offered for free by the Cambodian telecom industry.
There is some integration of tools in the PLE, mostly grouped around Google (Picasa, Google Reader, Google Mail) and the OU’s Moodle module.  Overall, however, most tools are used in isolation and for one specific purpose.


In the PLE vs VLE debate, my opinion is heavily affected by my experience with the (Moodle) system we’re using within H800.  I find that approach rather well-balanced.  We’re invited to explore a lot of externally hosted tools such as Compendium, Twitter, Diigo and Google Reader.  A VLE with a basic, but robust forum, wiki and blogging functionalities is used for the ‘core’ communication with each other and the tutor. 

I find it important that a course (in particular one on technology-enhanced learning) stimulates learners to develop their own PLE, enabling us to continue our learning informally afterwards.  On the other hand, I also like the ‘brand identity’ a VLE offers.  And I agree with the arguments that not all learners (and lecturers) are equally tech-savvy (and thus able to create an efficient PLE) and that assessment-related tasks should use internally hosted systems.
Within our group of students the VLE offers a convenient one-stop place, where we can read and react to each other’s comments.  Try-outs with everyone blogging (on different blogging platforms) for course posts went nowhere, because it was much more time-consuming to visit everyone’s blog to read and comment.  If you stimulate learners to develop a PLE, the logical outcome is that everyone has a different set of tools, and communication ends up more fragmented than in a one-stop shop where everything happens.
The introduction of Google Apps for Education and Elluminate within the OU VLE indicates that the boundaries between internally hosted VLEs and externally hosted Web 2.0 tools are blurring.  So it looks like the debate might be settling itself.

#H800 Does the Semantic Web create a ‘filter bubble’?

Week 21 makes a little detour from Web 2.0 to Web 3.0 (more aptly called the Semantic Web), examining whether the latter term signals a break with Web 2.0 or rather a logical evolution.  The Semantic Web was a core topic in the Learning Analytics open course (LAK11) earlier this year, and I discussed the concept in an earlier blog post.
An interesting angle is the relation between the Semantic Web and the information overload experienced by many users as they try to find information with search engines and soldier on through all kinds of ‘interesting’ and ‘must-read’ articles, blog posts and videos.  Clay Shirky doesn’t use the term ‘information overload’, preferring to speak of a lack of filtering.  Tools that enable improved filtering therefore seem like a good thing.  I use tools such as Twitter, Google Reader and Diigo to filter interesting content from the abundance around.  These filtering tools are based on social networks, using others’ preferences and selections to help me make my own.
Improved filtering is one of the alleged hallmarks of the Semantic Web.  Instead of HTML pages, which store layout information but are unsuitable for data analysis, a new standard of data storage is proposed that would enable machines to read, interpret and use data from websites more easily.  There are some notable examples of the Semantic Web right under our noses.
Search results from Google, for example, differ from person to person when you are logged in with your Google account.  And online retailers such as Amazon change their homepage, suggestions and (notoriously) their prices depending on who is visiting the website.
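The underlying idea, data that machines can interpret directly rather than scrape out of layout markup, can be illustrated with a small sketch (modelled loosely on JSON-LD-style structured data; the field names are illustrative, not an official vocabulary):

```python
import json

# The same 'fact' as free-form HTML and as machine-readable data.
html = "<p>The course <b>H800</b> is offered by the Open University.</p>"

structured = json.loads("""
{
  "@type": "Course",
  "name": "H800",
  "provider": "Open University"
}
""")

# A machine can answer a question about the structured version
# directly, whereas extracting the same fact from the HTML string
# would require brittle text parsing.
print(structured["provider"])
```

This is the shift the Semantic Web proposes: the page carries its meaning as data, so search engines and recommendation systems can filter on what a page is about rather than on what it merely says.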
Eli Pariser has pointed out that much of this filtering happens beyond our awareness, risking trapping us in ‘content bubbles’, with search engines and recommendation systems systematically selecting what we ‘want’ to see instead of giving an ‘objective’ account of the information available.
The result is a ‘filter bubble’, which he defines as ‘a unique universe of information for each of us’, meaning that we are less likely to encounter information online that challenges our existing views or sparks serendipitous connections.
It’s a worthwhile warning, but I wonder whether it’s new to the internet.  Don’t people tend to select friends, information sources and books that correspond with the views they already hold?
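The mechanism Pariser describes can be caricatured in a few lines (the data below is purely hypothetical): rank items by their overlap with the topics a user has clicked before, and anything outside past interests sinks out of sight.

```python
# Topics the user has clicked on before (the 'profile').
history = {"education", "technology"}

# Candidate articles, each tagged with topics.
articles = {
    "Laptops in the classroom": {"education", "technology"},
    "New open-source e-reader": {"technology"},
    "Election results analysed": {"politics"},
    "Climate summit opens": {"environment", "politics"},
}

def rank(articles, history):
    # Higher topic overlap with past interests ranks higher.
    return sorted(articles, key=lambda title: -len(articles[title] & history))

feed = rank(articles, history)
print(feed)
```

Items matching past interests dominate the top of the feed, and after a few iterations of clicking and re-ranking the user mostly sees their own interests reflected back, which is the bubble in miniature.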

The Filter Bubble: What the Internet Is Hiding From You, by Eli Pariser, Penguin Press, 294 pages.

#H800 Web 2.0 tools and how they affect learners, educators and institutions. Some thoughts.

How do Web 2.0 tools affect educational institutions and the educators and learners within them?  Should institutions prepare for a complete overhaul in order to stay relevant?  Or can they integrate Web 2.0 as an extra layer into their current practices?  These are a few of the key questions addressed in Weeks 21 and 22 of H800.
There’s a lot of recent research on the topic, with, among others, two large studies (Redecker, 2009 and JISC, 2009) offering a plethora of case studies.  The background reading on the topic is a text by Conole (2011).  Below, I describe some personal conclusions.
First, educators often seem to be the driving force behind integrating Web 2.0 into teaching.  Few institutions seem to have an official policy on the topic, although issues such as privacy, reliability and assessment surely affect the institution as a whole.  So, how could institutions deal with Web 2.0?  From the readings, I would conclude the following:
  • Encourage early adopters, the technology enthusiasts who are willing to invest the time and climb the learning curves to design activities.  Although plenty of cases are available, translating them to a particular lesson context is often time-consuming.
  • Document. Stimulate these early adopters to keep track of their experiences, reflections and decisions, if possible publicly.  Support them to monitor and evaluate the experiences of learners, in order to assess improvements or scaling-up options.  The documentation can prove useful for later in-service training activities.
  • Allow for time for trying out, ‘tinkering’ and experimenting with Web2.0 applications.
  • Think about a policy, or at least some guidelines.  How to deal with the privacy of students when using blogs?  How to assess individual contributions when working with a wiki?  How to avoid time-consuming ploughing through forum and blog posts?  How to deal with external software that suddenly becomes unavailable or moves behind a paywall?

The cases also discuss the implications for learners.  Here, I had the following thoughts:
  • Web 2.0 tools are often touted as supporting the way of teaching we currently see as most desirable: collaborative, social, authentic, differentiated, lifelong and life-wide.  With Web 2.0 tools, teachers have an extra battery of instruments to turn their lessons into student-centred feasts.  However, students may also need to make a mental switch, turning from passive ‘receivers’ into active ‘creators’, or from competitive individuals into sharing collaborators.  It’s important to make sure that students are aware of these changing expectations.
  • Learners can no longer hide from digital technologies, which have turned / are turning into a necessary life skill.  However, some technologies may pose a steep learning curve for students and get in the way of the topics they are (also) supposed to be learning.
So, should educational institutions consider sweeping changes in the way they are run in order to incorporate the Web 2.0 army?  I’m not so sure.  I think Web 2.0 offers a wide range of interesting applications to improve teaching and learning; it suffices to skim through the case studies to be convinced of that.  But these case studies also show that students still need guidance, assessment and engaging activities in order to learn.  Web 2.0 tools offer a medium, but it is the input from teachers and students that gives them their added value.