Federal Virtual Worlds Challenge 2012

14 03 2012

The Maya Astronomy Center, built in Second Life (HMS Center Region), is a Finalist in the FVWC’s “Engaging Learning” focus area.

Here is the video tour:

Beverly Gay McCarter notes: “www.HumanMosaicSystems.com — This is a BRIEF video tour of the HMS Maya Astronomy Center_Phases 1 and 2 COMBINED.

The Maya Astronomy Center_Phases 1 & 2 is an interlocking, kinetic, modular learning system and a resource-intensive build. It explores the Maya understanding of astronomical events and how that knowledge shaped their culture and society. This learning module utilizes a multi-floored resource Library, as well as related independent learning modules that expand on the central subject being taught.

The Center is a free-standing learning module that can serve as a stand-alone exhibit or be integrated into other related builds, expanding its capabilities. It demonstrates a pedagogical model that can be applied to a variety of subject matter. The learning environment uses interactive intelligent agents, a HUD learning management system, music and dance to reinforce learning, interactive 3D models, a narrative structure that explains the complex dynamics of the topic and sets the learner on an engaging Quest, hidden traps and reward systems that affect the tokens earned, and multiple quizzes that award prizes.

This self-guided immersive learning environment utilizes Maya cultural mentors who appear and guide participants as they explore the various interactive 3D exhibits in the Maya temple and on the grounds, giving them more in-depth information through interactive 3D models, chat, note cards, slide presentations, web links, and videos.

The exhibits help the participants understand the complexity of the subject by breaking it down into different related units that build upon one another as the participants explore the information in this interrelated learning module.”
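The quest mechanics described above (tokens earned at exhibits, hidden traps that deduct them, quizzes that award prizes) can be sketched in a few lines. This is purely an illustrative Python sketch of that kind of reward loop, not the Center's actual scripts; an in-world Second Life build would implement this in LSL, and all names here are hypothetical.

```python
# Hypothetical sketch of a token/trap/quiz reward loop like the one
# described for the Maya Astronomy Center. Names are illustrative.

class QuestState:
    """Tracks a single learner's progress through the quest."""

    def __init__(self):
        self.tokens = 0
        self.prizes = []

    def award(self, amount):
        """A completed exhibit or correct answer earns tokens."""
        self.tokens += amount

    def trigger_trap(self, penalty):
        """A hidden trap deducts tokens, but never below zero."""
        self.tokens = max(0, self.tokens - penalty)

    def take_quiz(self, answers, key, prize, pass_mark=0.8):
        """Score a quiz; award a prize if the pass mark is reached."""
        score = sum(a == k for a, k in zip(answers, key)) / len(key)
        if score >= pass_mark:
            self.prizes.append(prize)
        return score

learner = QuestState()
learner.award(10)
learner.trigger_trap(3)   # tokens now 7
learner.take_quiz(["b", "c", "a"], ["b", "c", "d"],
                  prize="Maya Glyph Badge", pass_mark=0.6)
```

The design point is simply that traps and quizzes share one token/prize state per learner, which is what lets the exhibits "build upon one another" as the quote says.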

 

Be sure to watch the video and tell us what you think! Thanks!





Daden release results of its Authoring Tools for Immersive Training Survey

16 02 2012

Posted 8 days ago by Soulla Stylianou; reposted here:

“The results of an Authoring Tools for Immersive Training survey have been released by Daden Limited (Daden). The survey indicated that educators want to be able to create 3D training exercises easily themselves, without the help of technologists or expert knowledge of how to build the “sets” in their chosen virtual world. The survey asked questions about current uses of immersive worlds for learning, experiences of PIVOTE (Daden’s existing system), and users’ requirements for the future development of authoring tools.

Nearly 47% of the respondents were from education, 15% from the health professional training sector, and, interestingly, 19% from the corporate sector – notable given that there is little sign of significant uptake of immersive training in that area. Second Life, OpenSim, and Unity were the top three platforms, and Second Life, despite the removal of its educational discount, still dominates, with 39% of respondents using it.

Of those using immersive worlds for learning, 44% were using them at least monthly, with about 17% using them regularly on a daily or weekly basis and over 58% at least quarterly. Encouragingly, most sessions were between half an hour and two hours, matching typical lesson durations. This all suggests that tutors are really using them, not just dabbling or running proofs-of-concept.

Existing PIVOTE users generally valued the product, thought it was heading in the right direction, and liked the “ability to create learning exercises with little scripting”. Top of respondents’ wish list for improvements to PIVOTE were making it even easier to use, improved user interfaces, and smoother incorporation of media. Daden are currently developing a second-generation immersive learning authoring tool, known as OPAL; the “requirements for the future” section of the survey was designed to gauge opinions on its functionality and features.

The top four functions rated as “vital” for a virtual-world authoring tool were the ability to “set up rules/logic for actions” (48%), “allow multiple choice questions” (40%), “choose from a library of objects for an exercise” (39%), and be “object orientated” (select an object to set a behaviour) (38%) – all of which will be available in OPAL. In terms of “deploy and play” functions, users also felt it vital that the authoring tool be flexible enough to support multiple virtual worlds and that exercises can be played and tested in 2D on the web. PIVOTE already offers this, and OPAL will further improve on it.

The ability to log all student/group actions was rated the most important reporting and integration feature of an authoring tool. Being able to pass student results back to Virtual Learning Environments (VLEs) and Learning Management Systems (LMSs) was also high on the agenda. Both features will be present in the first release of OPAL.

David Burden, Daden’s Managing Director, says: “We hope that OPAL will give educators and trainers working in immersive environments (and those new to such environments) a powerful tool with which to create immersive experiences in a cost-effective way.”

The survey results confirmed that OPAL’s first release – planned for March 2012 – will address most if not all of the requirements expressed by respondents. Users want a good, easy-to-use, flexible authoring tool for the standard delivery of immersive learning exercises to PCs on their chosen platform.

The survey can be downloaded from Daden’s website at
http://www.daden.co.uk/tools/download_files.html

Read the original press release: http://www.daden.co.uk/press_releases/daden_release_results_of_its_a.html

Also make sure to browse Daden’s White Papers; they are full of very interesting implementation ideas for classrooms.

So much to read! Here is a badge for your efforts. Let me know in the comments that you’ve checked this out! Thank you!





Learning Without Frontiers

3 02 2012

Repost from http://conta.cc/xlbdK1 (02/04/2012). Here is the latest newsletter, titled “Future Positive”, featuring reviews and the opening talks from the 2012 LWF conference, including:

Noam Chomsky – The Purpose of Education
Noam Chomsky discusses the purpose of education, the impact of technology, whether education should be viewed as a cost or an investment, and the value of standardised assessment.

Ray Kurzweil – Exponential Learning & Entrepreneurship
Inventor and futurist Ray Kurzweil on DNA, 3D-printed buildings, prediction accuracy, adjusting to change, neuroscience, and innovation in schools and learning.

Jaron Lanier – Learning by Experience and Play
Renowned computer scientist, pioneer of virtual reality, artist, musician, and author Jaron Lanier presents this talk about learning, experience, and play.

Also presenting were Ellen MacArthur and Keri Facer.

Tweet the post. Help LWF make some tracks all over the internet.





Mind-controlled avatars and telepresence

7 12 2011

On November 12, the Jerusalem Post broke the news (reposted on the Digital Journal) that the Advanced Virtuality Lab (AVL) team is working on several brain–computer interfaces. The VERE project (Virtual Embodiment and Robotic Re-embodiment) is funded by the European Union. The team recently reported the successful use of a brain scanner to control a computer application interactively in real time. Friedman, the team leader, commented on the potential applications of the achievement: “You could control an avatar just by thinking about it and activating the correct areas in the brain.”

BCI research has already produced some (relatively) mainstream applications, as you may remember from the demonstrations on June 5 and 6, 2010, at a global aerospace convention in Ontario (see https://sabinereljic.wordpress.com/2011/05/31/technology-that-can-help-physically-challenged-humans-communicate/).

The Israeli team is also interested in telepresence and what they call intelligent transformation, regarding avatar behaviors and cultural variables. The BEAMING project (Being in Augmented Multi-modal Naturally-networked Gatherings) aims to develop new approaches to producing lifelike interactions using “mediated technologies such as surround video conference, virtual and augmented reality, virtual sense of touch (haptics) and spatialized audio and robotics.”

As seen in the Second Life recording, a BEAMING proxy is a bot programmed to answer questions and to reproduce the characteristic mannerisms and body language of the human it duplicates.
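To make the idea concrete, here is a minimal Python sketch of what such a proxy could look like: a bot that serves canned answers to known questions and plays back recorded mannerisms of the person it stands in for. This is an assumption-laden illustration of the concept only, not BEAMING's actual implementation, and every name in it is hypothetical.

```python
# Illustrative sketch of a telepresence "proxy bot" (NOT the BEAMING
# project's real code): it answers known questions and periodically
# reproduces recorded mannerisms of the human it duplicates.

import random

class ProxyBot:
    def __init__(self, name, qa_pairs, mannerisms):
        self.name = name
        # Normalize questions so lookup ignores case.
        self.qa = {q.lower(): a for q, a in qa_pairs.items()}
        self.mannerisms = mannerisms  # e.g. recorded gesture names

    def answer(self, question):
        """Return a canned answer, or a polite fallback."""
        key = question.lower().strip("?").strip()
        return self.qa.get(key, f"{self.name} will get back to you on that.")

    def idle_gesture(self):
        """Reproduce one of the human's characteristic gestures."""
        return random.choice(self.mannerisms)

bot = ProxyBot("Alice",
               {"what is beaming": "A telepresence research project."},
               ["nods slowly", "adjusts glasses"])
print(bot.answer("What is BEAMING?"))  # prints "A telepresence research project."
```

The fallback answer matters: a proxy stands in for an absent human, so unanswerable questions are deferred to the person rather than guessed at.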

Go ahead and tweet this post. This is great news spanning BCI, semi-intelligent agents, and telepresence.