Tuesday, February 28, 2012

Using AR in Life-Death Applications

The article that I found this time described a recent application that integrated several sources of information into an augmented reality environment.  The target audience for the application is military infantry, though it could be just as well suited to law enforcement and fire response teams.  The challenge was to aggregate and filter the various inputs relevant to a situation and to relay pressing information rapidly for the infantryman's benefit.  A vehicle-mounted prototype was designed as a proof of concept, without the constraints of portability.

The infantrymen in this application are outfitted with several sensors.  Inertial measurement units are placed on one foot and on the helmet to measure distance, speed, and orientation for tracking movement.  A LiDAR system is mounted on the helmet to correct for drift and other navigational error.  This allows the central hub to track friendly infantrymen's positions in both outdoor and indoor environments, independent of GPS.  The augmented reality system overlays colors on structures and targets to mark them as friendly, enemy, neutral, or unknown.  Altogether, this allows a field commander to observe real-time situations more accurately and react to them more quickly.
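The drift-correction idea can be sketched in a few lines.  This is my own minimal illustration, not the article's implementation: step vectors from the foot-mounted IMU are integrated into a position estimate, which accumulates error until an absolute fix (standing in here for the helmet-mounted LiDAR correction) snaps it back.

```python
# Illustrative sketch (not from the article): IMU dead reckoning drifts,
# and a periodic absolute fix corrects the accumulated error.

def dead_reckon(position, step_vector):
    """Advance the position estimate by one IMU-measured step."""
    return (position[0] + step_vector[0], position[1] + step_vector[1])

def apply_fix(estimate, fix, weight=1.0):
    """Blend the drifting estimate toward an absolute fix (weight=1 replaces it)."""
    return (estimate[0] + weight * (fix[0] - estimate[0]),
            estimate[1] + weight * (fix[1] - estimate[1]))

pos = (0.0, 0.0)
for step in [(1.0, 0.1), (1.0, 0.1), (1.0, 0.1)]:  # each step has a small y bias
    pos = dead_reckon(pos, step)
# drift has accumulated in y; a LiDAR-style fix corrects it
pos = apply_fix(pos, (3.0, 0.0))
```

A real system would weight the fix by its confidence rather than replacing the estimate outright, but the structure is the same: integrate relative motion, correct with absolute measurements.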



Source:
http://iospress.metapress.com.lib-ezproxy.tamu.edu:2048/content/bq0632q474310576/fulltext.pdf

Tuesday, February 21, 2012

Sony's Augment-able Reality System

In this great article I found a near-exact precursor of what we hope to achieve in our own project.  Back in 1998, Sony developed a prototype AR system that allowed digital content to be tagged in an environment, attached either to virtual areas or to physical markers.

Sony designed a system composed of a head-mounted unit containing a monocular display screen, a camera, and an infrared sensor.  They coupled the headset with a wearable computer able to connect to the Internet.

Sony chose a wearable computer design because they believed it was the technology of the future and would only grow more popular as the years went on.  Today the wearable computer is all but forgotten, replaced instead by high-performing smartphones.


The team from Sony created a software system that could detect physical contexts, such as rooms, and also recognize physical markers such as black-and-white matrix codes.  Infrared beacons periodically emit codes that identify a room location, allowing the system to track its position on a floor map.  Specific physical objects, such as a VCR, were assigned unique ID codes.
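Both sensing paths ultimately reduce to the same operation: a decoded numeric ID looked up in a table of known rooms or objects.  A hypothetical sketch of that mapping (the codes and names here are invented for illustration, not taken from the paper):

```python
# Hypothetical sketch: IR beacons and printed matrix codes both decode to
# numeric IDs, which the system maps to room contexts or tagged objects.

ROOM_BEACONS = {0x0A: "Meeting Room 3", 0x0B: "Corridor, Floor 2"}
OBJECT_MARKERS = {42: "VCR", 43: "Printer"}

def resolve(code, source):
    """Map a decoded ID to a named room (IR beacon) or object (matrix code)."""
    table = ROOM_BEACONS if source == "ir" else OBJECT_MARKERS
    return table.get(code, "unknown")

location = resolve(0x0A, "ir")    # current room context from an IR beacon
target = resolve(42, "marker")    # object the camera is looking at
```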

As for the head-up display, the user sees a video overlay on the environment, with a side pane listing additional content available for viewing.  The user can also create content, either voice or images, and append it to a location by drag-and-drop on the display.  The microphone is cleverly hidden inside the mini mouse.

Adding voice content to a location


While the concepts share many similarities, Sony's implementation differs substantially from our design.  For example, Sony uses IR light to detect a room location, along with a high-contrast ID matrix code.  We will be using a tracking system that relies on ID matrix codes, images, and GPS location.  Additionally, while their filtering is very similar to our idea (users can filter out content and establish privacy options), our work will additionally allow for collaborative content, which is lacking here.




Reference:
Rekimoto, J.; Ayatsuka, Y.; Hayashi, K.; , "Augment-able reality: situated communication through physical and digital spaces," Wearable Computers, 1998. Digest of Papers. Second International Symposium on , vol., no., pp.68-75, 19-20 Oct 1998
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=729531&isnumber=15725

Thursday, February 9, 2012

An Evaluation of AR Tool Kits Among Platforms

I stumbled on this interesting article while looking for research performed with AndAR, primarily to see if it had been extended to support markerless AR tracking.  What I found is more of a light evaluation of different toolkits for developing AR applications.  The authors were not performing benchmark tests or formal evaluations, but rather experimenting with the feasibility of developing across several platforms and SDKs, relating their experiences and the performance achieved by each SDK.  To test this, the authors designed a basic AR application modeled on typical archaeological surveys.  Each grid square of the "site" was represented by a card or stack of cards, which a handheld device would register.  Each grid square initially displays an undisturbed patch of dirt, and deeper layers of cards may contain objects drawn in 3D.  This, in addition to registered "tools," allows for a more immersive experience in simulated archaeology.
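The card-stack design maps naturally onto a simple data structure.  This is my reading of the description, not the authors' code: each grid square holds a stack of layers, and "excavating" pops the top layer to reveal whatever lies underneath.

```python
# Sketch of the excavation-grid idea: each grid square is a stack of
# layers (bottom to top); digging removes the top layer. The site
# contents here are invented for illustration.

site = {
    (0, 0): ["bronze coin", "pottery shard", "dirt"],   # bottom .. top
    (0, 1): ["dirt"],
}

def excavate(grid, square):
    """Remove the top layer of a grid square; return what is now visible."""
    layers = grid[square]
    if layers:
        layers.pop()
    return layers[-1] if layers else "bedrock"

visible = excavate(site, (0, 0))   # removes "dirt", revealing the shard
```

In the actual application each layer would be a registered card rendered in 3D, but the bookkeeping per grid square amounts to this stack.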

To build the application, a set of base functions was first designed to cover its core functionality.  The AR portion was then implemented on Android and iOS devices using different SDKs.  For iOS, the developers used ARToolKit for the iPhone, which is distributed commercially by ARToolworks, the company that developed ARToolKit.  They found that while the end tracking performance was acceptable (ARToolworks claims this SDK can track at up to 30 fps) and ARToolKit itself was usable, the team spent a significant amount of time trying to understand the iOS framework and implement it properly.  In comparison, AndAR on Android (also based on ARToolKit) was relatively simple to set up but performed poorly.  This no doubt stems from the fact that AndAR is based on the free ARToolKit, which has not been updated since 2007 and is therefore not as advanced as ARToolworks' more recent work, on which the iPhone implementation is based.  Another free SDK, Qualcomm AR (QCAR), was also tested and found to be not only usable but also a strong performer in their evaluations.  So while this article did not make a new contribution to the field of AR, it is helpful to read the results achieved by developers trying different configurations.


Article:
http://www.ideals.illinois.edu/bitstream/handle/2142/27688/AR_Smart_Phone_Note_rev3.pdf?sequence=2

Tuesday, February 7, 2012

PDA Augmented Reality

This time I chose to read an article that discussed augmented reality on a handheld device, to see the challenges and overall performance issues.  The article was published in 2003, so the hardware available then was far more limited than today's.  Still, the authors made use of a PocketPC for its comparably advanced hardware of the time: a 400 MHz processor, a 240x320 16-bit display, 64 MB RAM, an 802.11b wireless network interface, and a camera add-on with 320x240 color resolution.  Compare this to the hardware in the Nexus One phone, which boasts a 1 GHz processor, a dedicated GPU, a 480x800-pixel 16M-color display, and a 2560x1920-pixel camera with geotagging ability.

In the paper, the authors used a hybrid tracking system with ARToolKit as its foundation.  The system is hybrid in that the PDA can act as a standalone computation device, working autonomously, or let a wirelessly connected PC shoulder the expensive tracking computation and thus increase overall performance.  For drawing, the PDA was given SoftGL, a light version of OpenGL.  Because the PDA could not use floating-point numbers, which OpenGL relies on quite extensively, there were slight performance losses from translating between integers and floats.  Overall, the PDA and camera add-on achieved approximately 5 fps when a supporting PC handled computations, and 2.5-3.5 fps otherwise.  This is promising for our project, as it demonstrates that even hardware that was never meant to handle augmented reality, and has comparatively limited computing power, can achieve modest results.
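The integer-versus-float issue comes from fixed-point arithmetic: a device without a floating-point unit stores a real number as a scaled integer and does all math with integer operations.  A minimal sketch of the common Q16.16 format (my illustration of the general technique, not the paper's code):

```python
# Q16.16 fixed-point sketch: 16 integer bits, 16 fractional bits.
# Real value x is stored as the integer round(x * 2**16).

FRAC_BITS = 16
ONE = 1 << FRAC_BITS  # 65536

def to_fixed(x):
    """Encode a real number as a Q16.16 integer."""
    return int(round(x * ONE))

def fixed_mul(a, b):
    """Multiply two Q16.16 numbers; the raw product has 32 fractional
    bits, so shift back down by 16."""
    return (a * b) >> FRAC_BITS

def to_float(a):
    """Decode a Q16.16 integer back to a real number."""
    return a / ONE

a = to_fixed(1.5)
b = to_fixed(2.25)
result = to_float(fixed_mul(a, b))   # 1.5 * 2.25 = 3.375
```

Every conversion between this representation and true floats costs instructions, which is exactly the translation overhead the authors describe.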


Source:
Daniel Wagner; Dieter Schmalstieg; “First Steps Towards Handheld Augmented Reality,” Vienna University of Technology, Favoritenstr.
http://www.icg.tu-graz.ac.at/Members/daniel/Publications/HandheldAR_ISWC03final.pdf





Thursday, February 2, 2012

The Hybrid Tracking Algorithm

     As the project I'm working on will involve implementing augmented reality, I chose to read an article detailing a related tracking algorithm.  The algorithm builds on past approaches to tracking and blends them together to take advantage of each one's strengths.  Early approaches used GPS and magnetic sensors to obtain the camera's position and the direction it is aimed.  Accelerometers and gyroscopes were later added to detect the camera's orientation for additional accuracy.  This allowed augmented reality devices to determine far more effectively where to draw objects on the screen and how to orient them.  Computer vision techniques were also introduced to detect objects such as buildings, adding another level of accuracy for realistic placement of virtual images.

     In this article the authors discuss how they merge these different features.  The goal is a highly accurate, efficient algorithm that can follow and detect real-life images in real time.  The GPS and gyroscope, along with magnetic sensors and accelerometers, allow images drawn on screen to appear in the right place and properly oriented.  To detect surfaces, the algorithm references textured 3D models and uses edge detection on those models to match the real-life locations of corresponding edges.  The algorithm then compares this against the next video frame to detect camera motion and compensate for it.  This allows the augmented reality device to display an augmented image on top of the underlying video accurately and in real time, and also allows for realistic occlusion of hidden surfaces.
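The prediction-correction pattern above can be illustrated in miniature.  This is a deliberately simplified sketch of the fusion idea, not the paper's algorithm: sensors predict a pose value each frame, and an edge-based vision measurement pulls the drifting prediction back toward the truth.

```python
# Simplified prediction-correction sketch (my illustration, not the
# paper's method): blend a sensor-predicted value with a vision-derived
# measurement of the same quantity.

def fuse(predicted, measured, gain=0.8):
    """Blend a sensor prediction with a vision measurement.

    A gain near 1 trusts the vision correction; near 0 trusts the
    sensors. Real trackers set this weight from measurement confidence.
    """
    return predicted + gain * (measured - predicted)

# heading in degrees: gyro integration drifts, edge matching re-anchors it
gyro_heading = 92.0       # sensor prediction, with accumulated drift
vision_heading = 90.0     # heading implied by matched model edges
heading = fuse(gyro_heading, vision_heading)   # about 90.4
```

The paper does this over a full 6-degree-of-freedom camera pose with a proper statistical filter, but the core loop is the same: predict from inertial sensors, correct from the edge matches.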


References:
Reitmayr, G., and T.W. Drummond. "Going out: Robust model-based tracking for outdoor augmented reality." 2006 IEEE/ACM International Symposium on Mixed and Augmented Reality. IEEE, 2006.
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4079263&tag=1