What Will the Future of Interface Design Look Like?

David Leggett tried to answer this question in a blog post written for uxBOOTH. He writes:

The future of how we interact with computers is exciting to say the least. What once seemed like nonsense outside of Hollywood and Science Fiction is now starting to find its way into reality, and some of the technology is a bit overwhelming. Have a taste of what the future of interface design has to offer:

Here are some of the most interesting videos from the post, as well as some additional ones I have found:

Head-Up Displays for Consumer Vehicles

The following video shows an Apple iPhone app using the concept of head-up displays. It actually works quite well.

Originally, head-up displays were developed for military use:

Gesture Based Interfaces

Applications like the following are gesture based and are becoming increasingly popular.

http://www.youtube.com/watch?v=kCJ5dyNDfkE

Another very interesting type of gesture-based interface is the multitouch screen, as seen in the following video:

Spatial Motion Interfaces

With spatial motion interfaces the user becomes the controller and can steer the application through his or her movements. Check out the following video for a demonstration. It looks really interesting.
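To make the idea a bit more concrete, here is a tiny, purely illustrative Python sketch of how tracked movements could drive an application. The get_hand_position() function is a made-up stand-in for whatever motion sensor or camera tracker a real application would use; nothing here is taken from the systems shown in the video.

```python
# Illustrative sketch only: map a tracked hand position to an on-screen cursor.
# get_hand_position() is a hypothetical stub for a real motion sensor that is
# assumed to return normalized (x, y) coordinates in the range [0, 1].

SCREEN_W, SCREEN_H = 1920, 1080

def get_hand_position():
    """Hypothetical tracker stub -- replace with real sensor input."""
    return 0.5, 0.5

def hand_to_cursor():
    x, y = get_hand_position()
    # Scale normalized sensor coordinates to screen pixels, so the user's
    # movement directly drives the application's pointer.
    return int(x * SCREEN_W), int(y * SCREEN_H)

print(hand_to_cursor())  # -> (960, 540)
```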

Augmented Reality

Augmented reality brings the virtual and the real world together and makes it possible to build great applications. As mobile devices increase in computing power, these applications are becoming more and more popular:

http://www.youtube.com/watch?v=ReH9dmqfOqA

Other Sensor-Based Interfaces

Another kind of sensor-based interface is the brain-computer interface (BCI), which makes it possible to communicate with an application simply by thinking of specific commands:

B2B – BrainToBrain: A BCI Experiment

Mind Control Device Demonstration – Tan Le

Smart Surfaces

Smart Surfaces are becoming more and more popular too. Here is a video showing this concept:

Feel free to add any interesting videos you have found on the web. I would be happy if you posted them in the comments.

Related articles

[video] – User Interface Fundamentals for Programmers

Recently I found a great video demonstrating the user interface design fundamentals every programmer should know. Check out the video; it is really interesting!

UI Fundamentals for Programmers by Ryan Singer from ChicagoRuby on Vimeo.

(via Smashing Magazine)

[video] – The Sixth Sense Demonstrates the Great Potential of User Interfaces

Wear Ur World (image by Larry Johnson via Flickr)

Wired.com states the following in an article (found via ReadWriteWeb):

LONG BEACH, California — Students at the MIT Media Lab have developed a wearable computing system that turns any surface into an interactive display screen. The wearer can summon virtual gadgets and internet data at will, then dispel them like smoke when they’re done.

Pattie Maes of the lab’s Fluid Interfaces group said the research is aimed at creating a new digital “sixth sense” for humans.

In the tactile world, we use our five senses to take in information about our environment and respond to it, Maes explained. But a lot of the information that helps us understand and respond to the world doesn’t come from these senses. Instead, it comes from computers and the internet. Maes’ goal is to harness  computers to feed us information in an organic fashion, like our existing senses.

The prototype was built from an ordinary webcam and a battery-powered 3M projector, with an attached mirror — all connected to an internet-enabled mobile phone. The setup, which costs less than $350, allows the user to project information from the phone onto any surface — walls, the body of another person or even your hand.

I’ve found some great videos demonstrating the Sixth Sense, a research project from the Fluid Interfaces Research Group at the MIT Media Lab. They demonstrate a wearable computing system which turns any surface into an interactive display screen. The great thing: it costs less than $350.
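To give a rough idea of the computer-vision side, here is a minimal sketch (my own illustration, not the project's code) of how a webcam feed could track a colored fingertip marker, which is roughly how the prototype is reported to recognize hand gestures. It assumes OpenCV (cv2) is installed and that the marker is red; the color range would need tuning for real lighting.

```python
import cv2
import numpy as np

# Assumed HSV range for a red fingertip marker -- tune for camera and lighting.
LOWER = np.array([0, 120, 120])
UPPER = np.array([10, 255, 255])

cap = cv2.VideoCapture(0)  # an ordinary webcam, as in the prototype
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)  # keep only marker-colored pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        (x, y), r = cv2.minEnclosingCircle(biggest)
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
        # (x, y) is the marker position per frame; a gesture layer would map
        # its trajectory to commands such as "take a photo" or "draw".
    cv2.imshow("marker tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```

In the actual system the tracked positions would then be handed to the projector side, so the projected interface can follow the user's hands.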

Pattie Maes demonstrating the “Sixth Sense” at TED Talks:

(via TED)

ReadWriteWeb writes the following in an article called “The Wearable Internet Will Blow Mobile Phones Away”:

We at ReadWriteWeb are very excited about next-generation Internet interfaces, such as augmented reality and so-called cross reality. These wearable devices strike me as being the most impressive future Web interface that I’ve seen in a while. Check out the video and see if you agree.

I find it extremely interesting because this kind of user interface can be built at very low cost and can therefore spread easily.

The first video demonstrating the “Sixth Sense”

Here you can see the first video, which demonstrates how a wall is used to project and interact with several applications. Using a really small projector lets people project onto a wall while speaking to a friend standing nearby. A projector integrated into a helmet is not very practical, because while you are speaking to your friend the projection ends up on your friend’s face. Switching to a smaller device hanging around your neck is therefore more practical, as your head remains free to move.

(via Wired.com)

Another video demonstrating the “Sixth Sense”

For example, someone in a supermarket going from product to product in search of the cheapest or most appropriate one can do so easily by just scanning each product and retrieving real-time information directly from the internet. This has great potential and can be used in a variety of applications.

(via Wired.com)
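Just to illustrate the supermarket example above in code: a rough sketch of the lookup flow might look like the following. The product endpoint is a placeholder I made up, and the barcodes would normally come from the camera rather than being hard-coded.

```python
import json
import urllib.request

# Hypothetical product endpoint -- substitute a real product database.
PRODUCT_API = "https://example.com/products/{barcode}"

def lookup_product(barcode: str) -> dict:
    """Fetch live product data for a scanned barcode."""
    url = PRODUCT_API.format(barcode=barcode)
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)

def cheapest(barcodes):
    """Compare several scanned products and return the cheapest offer."""
    products = [lookup_product(b) for b in barcodes]
    return min(products, key=lambda p: p["price"])  # assumes a 'price' field

if __name__ == "__main__":
    # Placeholder barcodes -- in the Sixth Sense scenario these come from scanning.
    best = cheapest(["0000000000001", "0000000000002"])
    print(best["name"], best["price"])  # assumes a 'name' field
```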

More resources can be found on Wired.com, on ReadWriteWeb, on the Sixth Sense project website, on the Fluid Interfaces Research Group website and at the official site of the MIT Media Lab.

Scientist Develops a Brain-to-Twitter Interface

Adam Wilson, a biomedical engineering doctoral student at the University of Wisconsin-Madison, has developed a brain-computer interface which makes it possible to write tweets directly through thought. ReadWriteWeb writes:

Technically, what Wilson did was come up with an interface combining an Electroencephalogram, or brain wave monitor, with an on screen keyboard for selecting letters. The system lights up each key on the keyboard but is able to notice a difference in brain activity when the desired letter for input is lit. Wilson compares it to clicking through multiple letters when texting on a mobile phone.

Once you’ve found a new way to input text – what are you going to do with it? Use it to Twitter, of course!

Clearly, there’s some gimmickry going on in the news of Wilson’s interface. Who knows if this is better or worse than saying that a technology is developed to assist physically disabled people when it’s really going to be used by the military? Wilson does say that the technology will be helpful for people with active brains but immobile bodies. Now they’ll be able to Twitter, among other things, he says. Fair enough.
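The selection scheme described in the quote is essentially a flash-and-detect loop: every key is lit up in turn, and the key whose flashes coincide with the strongest response in the brain signal wins. Here is a small sketch of that control flow with the EEG response simulated, so it runs without any hardware; it is only meant to illustrate the logic, not Wilson's actual system.

```python
import random
import string

KEYS = list(string.ascii_uppercase + " ")

def simulated_eeg_response(flashed_key, intended_key):
    """Stand-in for the EEG: the key the user focuses on evokes a stronger signal."""
    noise = random.gauss(0.0, 0.3)
    return noise + (1.0 if flashed_key == intended_key else 0.0)

def select_letter(intended_key, repetitions=5):
    """Flash every key several times and pick the one with the highest total response."""
    scores = {k: 0.0 for k in KEYS}
    for _ in range(repetitions):   # repeating flashes averages out the noise
        for key in KEYS:           # "light up" each key once per round
            scores[key] += simulated_eeg_response(key, intended_key)
    return max(scores, key=scores.get)

# Spell a word letter by letter, just like composing a tweet one character at a time.
message = "".join(select_letter(ch) for ch in "HELLO")
print(message)  # with enough repetitions this reliably prints "HELLO"
```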

[UPDATE] Here is a little video showing how the Brain Twitter Interface works:

[youtube 205dHV55XWQ]

Personally I think it is a great idea to use the currently hyped microblogging service for research activities.

(via ReadWriteWeb)

Web Design Interfaces

I have found two great posts on web design interfaces. The first one deals with 30 essential controls used in web interfaces. The second one describes 15 common components used in web user interface design.

On designwebinterfaces.com you can find many more useful posts dealing with web design interfaces.

The User Interface of the Future! Unbelievable …

Here is a user interface of the future! It is really unbelievable what boundless possibilities this will offer us in the years ahead! Have fun with it!