Simulations: Ten Cool Things People are Doing

9 03 2012

Last week on http://www.presterafx.com/wordpress, Gus Prestera blogged about simulations and posted a very useful list that sums up the argument points you might bring to your next PTA meeting, or to whoever you are trying to get on board. Here is the list:

#1: Sims get legit
#2: Sims go to class
#3: Sims creep into page turners
#4: Sims get visibility
#5: Sims go cloning
#6: Sims invade virtual worlds
#7: Sims go back to basics
Read the details of #1 to #7 in Part 1
#8: Sims go 3D
#9: Sims get game!
#10: Sims go mobile
Read the details of #8 to #10 in Part 2

Picture credit: Jonathon Richter at the Tulane Center for Advanced Medical Simulation, New Orleans, 2011





Mind-controlled avatars and telepresence

7 12 2011

On November 12, the Jerusalem Post broke the news (reposted on Digital Journal) that the team at the Advanced Virtuality Lab (AVL) is working on several brain-computer interfaces. The VERE project (Virtual Embodiment and Robotic Re-embodiment) is funded by the European Union. The team recently reported the successful use of a brain scanner to control a computer application interactively, in real time. Friedman, the team leader, commented on the potential applications of the achievement: “You could control an avatar just by thinking about it and activating the correct areas in the brain.”
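The article does not describe the AVL software pipeline, so here is only a minimal, hypothetical sketch (in Python) of the kind of control loop it implies: a decoder classifies each window of brain activity into one of a few trained intents, and each intent is mapped onto an avatar command. All names and labels below are illustrative assumptions, not details from the VERE project.

import random

# Hypothetical mapping from decoded brain-activity classes to avatar commands.
AVATAR_COMMANDS = {
    "imagine_left_hand": "turn_left",
    "imagine_right_hand": "turn_right",
    "imagine_feet": "walk_forward",
    "rest": "idle",
}

def decode_brain_window(window):
    """Stand-in for a real-time decoder (e.g. a classifier trained on scanner data).
    It picks a label at random here so the sketch runs end to end."""
    return random.choice(list(AVATAR_COMMANDS))

def control_loop(n_steps=5):
    for step in range(n_steps):
        window = None                         # would be one scan or EEG window
        intent = decode_brain_window(window)
        command = AVATAR_COMMANDS[intent]
        print(f"step {step}: decoded '{intent}' -> avatar does '{command}'")

if __name__ == "__main__":
    control_loop()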

BCI research has already produced some (relatively) mainstream applications, such as the thought-controlled in-flight entertainment demonstrated on June 5 and 6, 2010, at a global aerospace convention in Ontario (ref. https://sabinereljic.wordpress.com/2011/05/31/technology-that-can-help-physically-challenged-humans-communicate/).

The Israeli team is also interested in telepresence and what they call intelligent transformation, concerning avatar behaviors and cultural variables. The project BEAMING (Being in Augmented Multi-modal Naturally-networked Gatherings) aims to develop new approaches to producing lifelike interactions using “mediated technologies such as surround video conference, virtual and augmented reality, virtual sense of touch (haptics) and spatialized audio and robotics.”

As seen in the Second Life recording, a BEAMING proxy is a bot that has been programmed to answer questions and also to reproduce the characteristic mannerisms and body language of the human it duplicates.
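The recording does not reveal how the proxy is built, but a minimal illustrative sketch might pair canned question-and-answer data with a small library of the owner’s recorded mannerisms, playing one gesture back with each reply. Everything below (data, names, behavior) is an assumption for illustration, not BEAMING code.

import random

# Illustrative sketch of a BEAMING-style proxy: canned answers plus recorded
# mannerisms played back with each reply. All data here is made up.
ANSWERS = {
    "where are you from": "I'm based in London, but I travel a lot for the project.",
    "what do you work on": "Telepresence research: making remote meetings feel co-present.",
}
MANNERISMS = ["tilts head slightly", "taps fingers on the desk", "leans forward"]

def respond(question):
    """Return (answer, gesture) for an incoming chat question."""
    key = question.lower().strip("?! .")
    answer = ANSWERS.get(key, "Good question; I'll ask my human and get back to you.")
    gesture = random.choice(MANNERISMS)
    return answer, gesture

if __name__ == "__main__":
    for q in ("Where are you from?", "What's your favorite color?"):
        answer, gesture = respond(q)
        print(f"Q: {q}\nA: {answer}  [avatar {gesture}]")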

Go ahead and tweet this post. This is great news spanning BCI, semi-intelligent agents, and telepresence.




NatGeo live AR experience

7 12 2011

What a busy month, with so much exciting news. First, great stuff coming our Augmented Reality way. On November 7, 2011, National Geographic worked with AppShaker to create a live AR experience at a shopping mall that anyone could interact with. The clever part of this implementation is that no one in the audience needs a phone or an app to see what is happening. A big screen is set up, showing the AR experience in real time. Now you can share the experience as a group -or a class- without the limitation of a small mobile screen. Watch:

Whether the idea came from promotional applications (such as this Victoria’s Secret Angel falling from the sky, March 14, 2011)

or not, the NatGeo application offers more interactivity than the fallen angel. We have had school-in-the-park programs; what about a museum at the mall?
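AppShaker’s actual system is proprietary; purely as an illustration of the “camera plus one shared big screen” idea described above, here is a rough Python/OpenCV sketch that blends a pre-rendered creature into a live camera feed and shows the composite in a single window. The file name, placement, and blending weights are assumptions, not details of the NatGeo installation.

import cv2

# Rough sketch of the shared-screen AR idea: film the crowd, blend a virtual
# creature into the live feed, and show the composite on one big display.
creature = cv2.imread("dinosaur.png")        # pre-rendered creature frame (assumed file)
camera = cv2.VideoCapture(0)                 # camera pointed at the audience

while True:
    ok, frame = camera.read()
    if not ok:
        break
    h, w = creature.shape[:2]
    roi = frame[0:h, 0:w]                                      # naive fixed placement
    frame[0:h, 0:w] = cv2.addWeighted(roi, 0.4, creature, 0.6, 0)
    cv2.imshow("Big screen", frame)          # the display everyone watches together
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

camera.release()
cv2.destroyAllWindows()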

Enjoy, and let me know if you find something similar.

 





Song of the Machine

17 10 2011

I.am.blown.away. This is absolutely amazing work. It all started with a notification from a TED conversation that I had to jump into this morning:

“As unmanned drones, algorithms and prosthetics blur the distinction between man and machine, what, if anything, does it mean to be human?” hosted by Anab Jain (designer and founder of the London- and India-based collaborative design studio Superflux).

The site is chock-full of amazing projects and proofs of concept. I was particularly interested in the research on retinal prostheses (a virus injected to infect the degenerated eye with a light-sensitive protein, plus an optoelectronic augmented wearable).

“Song of the Machine explores the possibilities of this new, modified – even enhanced – vision, where wearers adjust for a reduced resolution by tuning into streams of information and electromagnetic vistas, all inaccessible to the ‘normally’ sighted.” (read more here: http://superflux.in/work/song-machine)

I understand that this particular project targets sight-challenged people. The implications are heartbreakingly awesome. Imagine a blind person seeing the world for the first time, or recovering his or her sight after an accident or disease. But let’s also imagine using the augmented goggles and their companion portable device to project a visual interface for mainstream, regular, day-to-day augmented reality. Many conversations have brought up rewiring 3D panoramic headsets/glasses to project one’s Layar-enhanced smartphone view, or a pair of glasses preloaded with such an AR layer (maybe similar to the smart windshield, ref. https://sabinereljic.wordpress.com/2011/05/31/autoglass-hints-at-ar-windshield/). But I have yet to see any prototype of such glasses. On the other hand, Superflux offers this proof of concept and has put the device and interface work on the table, with Dr. Degenaar leading the team.

Did you watch the Song of the Machine video? What do you think? Have you found similar work? Post it here and give us your feedback!





Online gamers crack AIDS enzyme puzzle

20 09 2011

Repost from AFP, 09/19/2011 (see source at the end of the article). Online gamers have achieved a feat beyond the realm of Second Life or Dungeons and Dragons: they have deciphered the structure of an enzyme of an AIDS-like virus that had thwarted scientists for a decade.

The exploit is published on Sunday in the journal Nature Structural & Molecular Biology, where — exceptionally in scientific publishing — both gamers and researchers are honoured as co-authors.

Their target was a monomeric protease enzyme, a cutting agent in the complex molecular tailoring of retroviruses, a family that includes HIV.

Figuring out the structure of proteins is vital for understanding the causes of many diseases and developing drugs to block them.

But a microscope gives only a flat image of what to the outsider looks like a plate of one-dimensional scrunched-up spaghetti. Pharmacologists, though, need a 3-D picture that “unfolds” the molecule and rotates it in order to reveal potential targets for drugs.

This is where Foldit comes in.

Developed in 2008 by the University of Washington, it is a fun-for-purpose video game in which gamers, divided into competing groups, compete to unfold chains of amino acids — the building blocks of proteins — using a set of online tools.

To the astonishment of the scientists, the gamers produced an accurate model of the enzyme in just three weeks.

Cracking the enzyme “provides new insights for the design of antiretroviral drugs,” says the study, referring to the lifeline medication against the human immunodeficiency virus (HIV).

It is believed to be the first time that gamers have resolved a long-standing scientific problem.

“We wanted to see if human intuition could succeed where automated methods had failed,” Firas Khatib of the university’s biochemistry lab said in a press release. “The ingenuity of game players is a formidable force that, if properly directed, can be used to solve a wide range of scientific problems.”

One of Foldit’s creators, Seth Cooper, explained why gamers had succeeded where computers had failed.

“People have spatial reasoning skills, something computers are not yet good at,” he said.

“Games provide a framework for bringing together the strengths of computers and humans. The results in this week’s paper show that gaming, science and computation can be combined to make advances that were not possible before.”

source: http://tinyurl.com/3c8lt5d
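The article describes Foldit’s scoring only loosely (the game actually builds on the Rosetta energy function), but the loop the gamers were playing is simple to picture: every candidate fold gets a score, and players push the score up by reshaping the chain. Here is a toy Python sketch of that idea with two made-up terms, a clash penalty and a compactness reward; it is an illustration of the concept, not the game’s actual scoring function.

import math

# Toy illustration of fold scoring: a candidate fold is a list of 3D points
# (one per residue). Overlapping residues are penalised, compact folds are
# rewarded. Foldit itself uses the Rosetta energy function, not these terms.

def score_fold(coords, clash_radius=2.0):
    clash_penalty = 0.0
    compactness_reward = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            d = math.dist(coords[i], coords[j])
            if d < clash_radius:
                clash_penalty += (clash_radius - d) ** 2   # residues crashing into each other
            else:
                compactness_reward += 1.0 / d              # closer (but not clashing) is better
    return compactness_reward - clash_penalty              # higher score = better fold

# Two toy "folds" of a four-residue chain: stretched out vs. loosely packed.
stretched = [(0, 0, 0), (4, 0, 0), (8, 0, 0), (12, 0, 0)]
packed = [(0, 0, 0), (3, 0, 0), (3, 3, 0), (0, 3, 0)]
print(score_fold(stretched), score_fold(packed))           # the packed fold scores higher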

Watch the report from MSNBC:  http://www.huffingtonpost.com/2011/09/19/aids-protein-decoded-gamers_n_970113.html

To quote Jane McGonigal, gamers can save the world.





Designing Digitally, Inc. Wins 2011 Air Force Research Lab Virtual World Contract

20 06 2011

Original post 05/26/2011

“Teaching students how to build airplanes, robots and simulations is serious business, but Designing Digitally, Inc. plans to make it a game — literally.

Designing Digitally, Inc., the web-based training firm that specializes in E-learning, virtual worlds, and 3D simulations, was recently awarded a 2011 Air Force Virtual World contract from the United States Air Force Research Laboratory located in Dayton, Ohio. This means the simulation experts at Designing Digitally, Inc. will dedicate the next year to creating an OpenSim virtual world grid, called Virtual Discovery Lab (ViDL), in which high school and college students can explore these technical topics in a hands-on, educational environment.

“We are extremely honored to have been selected for this contract,” said Andrew Hughes, President of Designing Digitally, Inc. “It is going to be a fantastic and rewarding challenge to create the OpenSim virtual world infrastructure for United States Air Force Research Laboratory’s Discovery Lab.”

The contract award not only calls for Designing Digitally, Inc. to help create the Virtual Discovery Lab OpenSim virtual world grid, but also to work hands-on with students interested in the field over the summer at WPAFB Tec^Edge office. Students in the program are able to explore a number of areas of interest in an educational, hands-on setting that is challenging and fun….”

Read all of it here: http://www.designingdigitally.com/blog/2011/05/designing-digitally-inc-wins-2011-air-force-research-lab-virtual-world-contract





Serious Gaming Demystified

17 06 2011

by Atos Origin

 

Go get your 16-page document at http://www.atosorigin.com/en-us/Business_Insights/Thought_leadership/Thought_leadership_Container/serious_gaming_demystified.htm

There is a request form, but don’t let it stop you: it’s a free document. (Nic Mitham of KZero does the same with his yearly virtual worlds reports and charts: http://www.kzero.co.uk/. It’s just a way to measure interest.)





Virtual Tools for Students’ International Collaboration

6 06 2011

source: http://spotlight.macfound.org/featured-stories/entry/students-use-virtual-tools-to-collaborate-across-the-globe-on-real-world-en/

Students Use Virtual Tools to Collaborate Across the Globe on Real World Environmental Conservation

…”The conservation program is based around two reefs, one real (a reef off the coast of Fiji) and the other virtual (designed by the Field Museum). WhyReef, the digital replica, allows kids to “dive” on two reefs, where they can examine 50 different species and play games that teach about biodiversity and conservation. Students also use a reef journal and interactive guides drawn from the Field Museum’s databases and the “Encyclopedia of Life” to identify unknown fish and test theories about healthy and sick reef environments.

In addition, both groups were exposed to a completely foreign culture via social media tools, all with the goal of understanding the science and ecology of coral reefs while attempting to provide real-world solutions to the problems that they see in their own environments.

“We had started looking into digital worlds as a way to work with youth and teens, to reach out to them and get them really participating in science,” says Beth Sanzenbacher, a coral reef specialist at the Field Museum. “So we designed a program to get teens in Chicago and Fiji to interact and learn about biology, ecology and conservation, and to become stewards of their environment.”…

Also read Civics under the Sea: What Happens when Kids Dive in to WhyReef (http://spotlight.macfound.org/featured-stories/entry/civics-sea-kids-dive-in-to-whyreef/), on how Whyville teaches middle school kids about fragile ecosystems.





Technology that can help physically-challenged humans communicate

31 05 2011

Reposting David Teeghman’s post on Discovery News, July 15th, 2010

“Slowly but surely technology is seeping into airplanes, which up until a couple of years ago felt like a final reprieve from the digital world. You can use your cell phone at certain times before take off and after landing, you can watch DVDs on your laptop and you can surf the Internet using in-flight Wi-Fi.

But it doesn’t stop there.

How about playing a thought-controlled game?

“We think it’s time that in-flight entertainment does more than simply distract you,” said Ariel Garten, CEO of Toronto-based Interaxon, which created the technology.

Passengers wear a special headset sensitive enough to pick up brainwave activity, basically electrical patterns resonating outside your head. Proprietary software converts the brainwave activity into binary code, that is the ones and zeros that make up computer code. That code becomes a signal that controls the game. With practice, a person can learn to manipulate his own brainwave activity.

The system scans your brain to see if you are in the mood for relaxation or concentration. If you are giving off alpha brainwaves and want to relax, the system sees that and sets up something soothing like a meditation program.

If you are giving off beta brainwaves and are looking to do something that requires more concentration, you could try out a program that improves your golf swing.

The ability to operate your electronic gizmos just through your brain is becoming more and more popular. You can even control robots just with your mind now, so it’s not a huge leap forward to be able to play games on an airplane using your mind as the remote control.

This technology was first put on display June 5 and 6, 2010, at On the Wings of Innovation, a global aerospace convention in Ontario. Everyone from astronauts to Apple founder, Steve Wozniak, gave the technology a test drive:”

My comment: How wonderful it would be to adapt this for basic functions such as writing, playing music, or painting. However, in an era when we have to pass laws to stop people from multitasking while driving, I don’t think that using this technology to drive a car is a good idea.
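The repost describes the alpha/beta selection only loosely, so here is a rough, purely illustrative Python sketch of the idea: estimate the power in the alpha (8-12 Hz) and beta (13-30 Hz) bands of a short EEG window, then pick a program accordingly. The sampling rate, band limits, and program names are generic assumptions, not Interaxon’s proprietary implementation.

import numpy as np

# Illustrative sketch of alpha-vs-beta program selection, not Interaxon code:
# estimate band power from a short EEG window and pick a program accordingly.

def band_power(signal, fs, low, high):
    """Sum the power spectrum of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum[(freqs >= low) & (freqs <= high)].sum()

def choose_program(eeg_window, fs=256):
    alpha = band_power(eeg_window, fs, 8, 12)      # dominant when relaxed
    beta = band_power(eeg_window, fs, 13, 30)      # dominant when concentrating
    return "meditation program" if alpha > beta else "golf-swing trainer"

# Demo with a synthetic two-second "recording" dominated by a 10 Hz rhythm.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
fake_eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(len(t))
print(choose_program(fake_eeg, fs))                # most likely "meditation program"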





HumanSim: get the free preview on your iPad

18 05 2011

The free download is a short interactive demo for the iPad
http://itunes.apple.com/us/app/humansim-preview/id437066468?mt=8

This free preview, constructed with Epic Games’ Unreal® Engine 3, demonstrates the look and feel of the HumanSim immersive world in which medical professionals will soon train to proficiency on rare, complicated, or otherwise error-prone tasks. See how state-of-the-art healthcare training and education will soon be provided to physicians, nurses, EMTs, combat medics, clinical students/residents, emergency services organizations, and healthcare education institutions.

Also check out the Virtual Heroes Healthcare page: http://virtualheroes.com/healthcare.asp. It provides more details on the HumanSim training model.