How To Make Recordings in Second Life

27 02 2012

Colleagues have asked me how I create inworld recordings. I had already posted my early trials and tribulations on my first virtual blog (http://educedge.edublogs.org/category/how-to/page/2/), but I have not used that blog space in several years, and in re-reading what I posted, I realize that many things have evolved, making the process a bit easier. I have not produced a film with a creative storyline or staged any cinema scenes in a long time. What I can offer here, though, is my experience with recording and archiving a live talk show, hosted on CAVE Island in Second Life. Since the software I use is web-based, I suspect this technique would work with any virtual platform (video games, virtual worlds, …). I work on a PC, so Mac users will have to adapt these steps.
Here are my steps:
1) Create an account on Livestream (free): http://new.livestream.com
2) Download Procaster
3) Watch what this kid does. His tutorial is quick (less than 3 minutes) and clear.

4) Click on Preferences to make sure that you are recording your desktop screen and not your webcam.
5) Before I click “Broadcast”, I open my SL client.
6) I resize my SL window so that the Procaster window sits on the right side of my desktop screen and does not overlap the SL window.
7) Click on “Broadcast”. You are now broadcasting your screen and your audio live.
8) A Procaster user-interface (UI) bar will appear at the bottom of the screen. I then resize my SL window so that only the SL screen shows inside the broadcasting brackets.
9) When I am done broadcasting, I click “Stop” on the bottom UI bar. Procaster will then ask whether to delete or save the broadcast. I type in the name of the show and click on “Save Recording”. Go slowly here, because there is no recovering a show if you click on delete.
10) Then click on the Livestream logo (which directs you to your Livestream account) and click on My Account > My Channels > Studio > Video On-Demand. There you can select your latest recording, play it back, and make sure that it is all right. From there you can also grab the embed code or download the file.
11) Click on your account name, and this will take you to your public channel page. There you’ll find the URL of the latest recording.

Here is the ARVEL Channel on Livestream that I’ve been managing for the ARVEL SIG: http://www.livestream.com/arvelsig
You can also embed the Channel on your Ning: http://arvelsig.ning.com/page/livestream

My Tips:
1) Because I do not want to do any post-editing, I click on “Broadcast” only when I am ready to screencast to the world. Everything that happens from then on is broadcast and will be recorded and published as is.
2) I always use a headset to keep out environmental noise from outside the computer, so that the software records only what comes from inworld. This also means that I mute my mic once I am done introducing the guest speaker.
3) I manage the broadcast on Livestream and the hosting inworld simultaneously, which means that I also log in to SL with my admin avatar on my laptop, again with a headset plugged in (thanks Liz Dorland/Chimera Cosmos) to mute all external sound. Now I can have one avatar doing the hosting and the other busy with the recording. This gives me camera freedom on both sides: the admin avatar handles security and other land permissions, while the hosting avatar can change camera views for a more interesting broadcast and recording.
4) You can also encourage your online listeners to type their comments into the chat room on Livestream, which you can then forward inworld to your discussion table. It makes for an outstanding mixed-media experience.
5) Finally, know that because this is a free account, an ad will play at the beginning of every recording you make. You’ll have no choice over that.
6) Finally, finally: I noticed that I must move the avatar on the livestreaming computer quite often, otherwise the video gets recorded as several short clips rather than one. There is no loss of video or audio, but it’s a bit annoying to have to open a new video to get to the next segment of your talk-show recording. I’ll let you know if or when I find a solution; in the meantime, the sketch below shows one way to stitch the downloaded clips back together.
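
If you download those short clips from Video On-Demand, you can join them into a single file with ffmpeg. This is only a sketch of a workaround, not part of Livestream or Procaster: it assumes the clips were saved as MP4 files with names that sort in broadcast order, and that ffmpeg is installed; the folder and file names are hypothetical.

```python
# stitch_clips.py -- join the short clips left by a split broadcast
# into one recording. Assumes the clips were downloaded as MP4s into
# a "downloads" folder and that ffmpeg is available on your PATH.
import subprocess
from pathlib import Path

clips = sorted(Path("downloads").glob("show_part*.mp4"))  # hypothetical names

# ffmpeg's concat demuxer reads a text file listing the inputs in order.
with open("clips.txt", "w") as listing:
    for clip in clips:
        listing.write(f"file '{clip.resolve()}'\n")

# "-c copy" joins the segments without re-encoding, so no quality is lost.
subprocess.run(
    ["ffmpeg", "-f", "concat", "-safe", "0",
     "-i", "clips.txt", "-c", "copy", "full_show.mp4"],
    check=True,
)
```

Run it once after downloading the clips and you get a single full_show.mp4 with the original audio and video untouched.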

I do not suggest that this is the best way to record live shows, but it is the way that works for me. I would be very pleased if you shared your preferred tips, tricks, and software for creating recordings inworld.

This movie camera icon is yours when you share your tips here in the comments! Thanks.





Daden releases results of its Authoring Tools for Immersive Training Survey

16 02 2012

posted 8 days ago by Soulla Stylianou, reposted here:

“The results of an Authoring Tools for Immersive Training survey have been released by Daden Limited (Daden). The survey indicated that educators want to be able to easily create 3D training exercises themselves, without needing the help of technologists or expert knowledge of how to build the “sets” in their chosen virtual world. The survey asked questions about current uses of immersive worlds for learning, experiences of PIVOTE (Daden’s existing system), and users’ requirements for the future development of authoring tools.

Nearly 47% of the respondents were from education, 15% from the health professional training sector, and, interestingly, 19% from the corporate sector, where there is otherwise little sign of significant uptake of immersive training. Second Life, OpenSim and Unity were the top three platforms, and Second Life, despite the removal of the educational discount, still dominates, with 39% of respondents using it.

Of those using immersive worlds for learning, over 58% were using them at least quarterly, 44% at least monthly, and about 17% regularly on a daily or weekly basis. Encouragingly, most sessions lasted between half an hour and two hours, matching typical lesson durations. This all suggests that tutors are really using them, not just dabbling or doing proofs-of-concept.

Existing PIVOTE users generally valued the product, thought it was heading in the right direction, and liked the “ability to create learning exercises with little scripting”. Top of respondents’ wish list for improvements to PIVOTE were making it even easier to use, improving the user interface, and smoother incorporation of media. Daden is currently developing a second-generation immersive learning authoring tool, known as OPAL. The “requirements for the future” section of the survey was designed to gauge opinions on its functionality and features.

The top four functions rated as “vital” for an authoring tool for virtual worlds were the ability to “set up rules/logic for actions” (48%), “allow multiple choice questions” (40%), “choose from a library of objects for an exercise” (39%), and be “object orientated” (select an object to set a behaviour) (38%) – all of which will be available in OPAL. Users also felt it vital, in terms of “deploy and play” functions, that the authoring tool be flexible enough to support multiple virtual worlds and that exercises can be played and tested in 2D on the web. PIVOTE already offers this, and OPAL will improve on it further.

The ability to log all student/group actions was rated the most important reporting and integration feature of an authoring tool. Being able to pass student results back to Virtual Learning Environments (VLEs) and Learning Management Systems (LMSs) was also high on the agenda. Both features will be present in the first release of OPAL.

David Burden, Daden’s Managing Director, says: “We hope that OPAL will give educators and trainers working in immersive environments (and those new to such environments) a powerful tool with which to create immersive experiences in a cost-effective way.”

The survey results confirmed that OPAL’s first release, planned for March 2012, will address most, if not all, of the requirements expressed by the respondents. Users want a good, easy-to-use, flexible authoring tool for the standard delivery of immersive learning exercises to PCs on their chosen platform.

The survey can be downloaded from Daden’s website at
http://www.daden.co.uk/tools/download_files.html”

Read the original press release: http://www.daden.co.uk/press_releases/daden_release_results_of_its_a.html

Also make sure to browse Daden’s White Papers; they are full of very interesting implementation ideas for classrooms.

So much to read! Here is a badge for your efforts. Let me know in the comments that you’ve checked this out. Thank you!





Interactive Surfaces

7 02 2012

We are hearing more and more about flexible e-paper, flexible screens (especially for mobile devices), and other flexible AMOLED-based technologies…

My personal favorite is the Nokia Morph concept

…but here is something very interesting as well: another really cool, innovative proposal from glass manufacturer Corning, “A Day Made of Glass 2”.

I believe that’s why we should add science fiction resources to our required readings or viewings. Now that we’ve caught up with Minority Report, when do you think we’ll see something similar to the Matrix-style download? Prophets of science fiction, all of them.

Name another prophet of science fiction in the comments. Cheers!





Merging Kinect and inworld objects/avatars

6 01 2012

Repost of “Disembodiment: Interview with Glyph Graves about his Kinect Performance”, Dec 8, 2011 on http://lindenarts.blogspot.com/

Kinect-controlled face in SL

“Artist Glyph Graves has pushed the boundaries between virtual and physical realities once again. He programmed and scripted his own hack for the Kinect to map his real-life physical body and movements onto virtual objects in Second Life. Glyph’s earlier piece, “Faceted Existence” (see: http://www.youtube.com/watch?v=9q5N1X5Cs30), used 2,500 spherical prims to represent his face, which loomed large above avatars in the virtual landscape. It was revolutionary as the first use of the Kinect to make art that others could see, as opposed to merely controlling an avatar with the alternative controller.

“Disembodiment,” the new performance piece that Glyph debuted at the Linden Endowment for the Arts exhibition InterACT! on Sunday, December 4, 2011, extends his previous experimentation to represent not only his face, but now a whole body, in spheres. Here he discusses how and why:

Lori Landay (L1Aura Loire): Can you tell us how you use the Kinect controller to manipulate objects in Second Life?

Glyph Graves: For those who don’t know what a Kinect is, it’s a sensor device that can capture depth and shape data using a combination of infrared and visible light. I take the data and stream it into Second Life, then reconstruct the shapes by positioning prims.

For all of us technical geeks, how about some more detailed specs? The rest of you can take a catnap and come back for the answer to the next question!

I first made this in August right after the face piece but never got round to showing it except to a few people….” […Go to http://lindenarts.blogspot.com/2011/12/disembodiment-interview-with-glyph.html to read the rest of the interview]
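
Glyph has not published his hack, so the details are his own, but the general pattern he describes (capture depth data, stream it out of the sensor, reposition prims to match) can be sketched. Below is a minimal, illustrative Python sketch of the off-world half only; the depth-frame source, grid size, and in-world HTTP-in URL are all my assumptions, not Glyph’s actual code.

```python
# kinect_to_sl.py -- illustrative sketch of the pattern Glyph describes:
# downsample a depth frame and stream point positions to a Second Life
# script that repositions prims. Everything here (URL, grid size, frame
# source) is hypothetical; Glyph's actual implementation is unpublished.
import numpy as np
import requests

SL_HTTP_IN_URL = "https://example.invalid/cap/http-in"  # from llRequestURL() inworld
GRID = 16  # 16 x 16 = 256 points; "Faceted Existence" used ~2,500 prims

def frame_to_points(depth: np.ndarray) -> list[tuple[float, float, float]]:
    """Reduce a depth image (values in millimetres) to a coarse grid of 3D points."""
    h, w = depth.shape
    points = []
    for gy in range(GRID):
        for gx in range(GRID):
            # Take the median depth of each grid cell to reject noise.
            block = depth[gy * h // GRID:(gy + 1) * h // GRID,
                          gx * w // GRID:(gx + 1) * w // GRID]
            z = float(np.median(block)) / 1000.0  # metres
            if 0.4 < z < 4.0:  # roughly the Kinect's usable range
                points.append((gx / GRID, gy / GRID, z))
    return points

def send(points) -> None:
    # One compact "x,y,z" triple per point; a script inworld would parse
    # this message and move its prims to the corresponding positions.
    body = ";".join(f"{x:.3f},{y:.3f},{z:.3f}" for x, y, z in points)
    requests.post(SL_HTTP_IN_URL, data=body, timeout=2)
```

The interesting constraints are presumably on the Second Life side: inbound HTTP messages are size-limited and prim updates are throttled, which would explain why a coarse grid of spheres, rather than a full point cloud, is the natural representation.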

Here is a cute badge for tweeting this post. Thank you! Make sure to add the hashtag #educedge so that I can count how many tweets you’ve posted. There is a special award for that!





Mind-controlled avatars and telepresence

7 12 2011

On November 12, the Jerusalem Post broke the news (reposted on the Digital Journal) that the team at the Advanced Virtuality Lab (AVL) is working on several brain-computer interfaces. The VERE project (Virtual Embodiment and Robotic Re-embodiment) is funded by the European Union. The team recently reported the successful use of a brain scanner to control a computer application interactively in real time. Friedman, the team leader, commented on the potential applications of this achievement: “You could control an avatar just by thinking about it and activating the correct areas in the brain.”

BCI research has already produced some (relatively) mainstream applications, some of which, as you may remember, showed up on June 5 and 6, 2010, at a global aerospace convention in Ontario (ref. https://sabinereljic.wordpress.com/2011/05/31/technology-that-can-help-physically-challenged-humans-communicate/).

The Israeli team is also interested in telepresence and what they call intelligent transformation, regarding avatar behaviors and cultural variables. The BEAMING project (Being in Augmented Multi-modal Naturally-networked Gatherings) aims to develop new approaches to producing lifelike interactions using “mediated technologies such as surround video conference, virtual and augmented reality, virtual sense of touch (haptics) and spatialized audio and robotics.”

As seen in the Second Life recording, a BEAMING proxy is a bot that has been programmed to answer questions and also to reproduce the characteristic mannerisms and body language of the human it duplicates.

Go ahead and tweet this post. This is great news spanning BCI, semi-intelligent agents, and telepresence.