Soldier Targeting Goggles "Augment" Human 3-D Vision Tracking

by Kris Osborn - Warrior Maven

(Washington D.C.) Imagine this land-war scenario: one enemy fighter is several hundred yards away, another is attacking from a mile out, and yet a third fires from a nearby room in close-quarters urban combat. U.S. Army soldiers detect, integrate, and quickly map the locations of all of these targets at once in 3D, all while knowing the range to each of the enemy forces.

How could something like this be possible, one might wonder, given the nuances of perspective, range, and navigational circumstances, and the limitations of the human eye?

These complexities form the conceptual basis upon which the Army is fast-tracking its Integrated Visual Augmentation System (IVAS), a soldier-worn combat goggle engineered with advanced sensors able to overcome some of the limitations of human vision and quickly organize target data.

“We take all soldiers who have IVAS and turn them into a sensor collecting data to share with a greater network. The screen can chart a path and tell you where a reported adversary is. You can see through heat and augment existing light,” Gen. Joseph Martin, Vice Chief of Staff of the Army, told an audience during an event at the Foundation for the Defense of Democracies. “If you have been dismounted, you know it can be lonely. You want to have a link to your fellow soldiers. This is what IVAS is delivering to our formation.”

Martin explained that the IVAS system is being quickly improved and upgraded with new software by virtue of a “soldier touchpoint” collaborative process wherein soldiers exercise with the goggle and offer feedback to developers.

Dr. Bruce Jette, Assistant Secretary of the Army for Acquisition, Logistics and Technology, told Warrior in an interview earlier this year that engineers created IVAS with an ability to compensate for what might otherwise be some of the limitations of the human eye. Operation of IVAS calls upon a degree of what could be described as a "Human-Machine Interface," because it integrates some of the neurological processes of human vision with software engineered to process, organize and display otherwise challenging factors such as depth perception, surrounding peripheral objects and other elements of human visual orientation.

“We don’t perceive distance with one eye, we just see larger or smaller - but if I can put it in both eyes I can get the object in 3D. To do that I need to have the sensing system to know where the eye is looking and focusing. The IVAS does that. It determines what you are looking at and what type of object you are looking at and focusing on to generate a 3D image in front of you. The good part about this is I don’t need all those heavy optics on my face,” Jette said.
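The two-eye depth perception Jette describes is the classic principle of stereo triangulation: an object seen from two vantage points shifts by a "disparity" in each view, and that shift encodes its distance. A minimal sketch of the math follows; the focal length, sensor baseline and disparity values are illustrative assumptions for a generic stereo rig, not IVAS specifications.

```python
def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Classic stereo triangulation: depth = f * B / d.

    focal_length_px -- camera focal length, in pixels
    baseline_m      -- separation between the two sensors, in meters
    disparity_px    -- horizontal shift of the object between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


# Illustrative numbers: two sensors 6.5 cm apart (roughly human
# interpupillary distance), 1000 px focal length, and a target that
# shifts 13 px between the left and right views.
print(depth_from_disparity(1000.0, 0.065, 13.0))  # 5.0 (meters)
```

The same relation explains the quote's point that one eye alone cannot recover distance: with a single view there is no disparity to measure, only apparent size.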

As designed, the IVAS system is built to lessen the hardware footprint, reduce weight and, perhaps of greatest combat relevance, streamline time-sensitive combat data.

“The sensor is seeing where my eyes are looking and preserving it based upon certain measurements. Then if I fly a UAV up there, IVAS can show the UAV coming into the scene - and converge the two onto each other so I can put the UAV right where you want it,” Jette said.

Part of the soldier feedback process, interestingly, involved requests to build even more data, icons, detail and combat information into the display. Developers had deliberately limited the amount of information shown on the IVAS system to avoid overloading soldiers; instead, soldiers liked the system enough to ask for an even more integrated display.

“Soldiers asked if they could see more things on there. The 20-year-olds have done this their entire lives and they said we can use more information,” James E. McPherson, Undersecretary of the Army, said at the event.

-- Kris Osborn is the Managing Editor of Warrior Maven and The Defense Editor of The National Interest --

Kris Osborn is the new Defense Editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army—Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a master's degree in Comparative Literature from Columbia University.
