On February 9, 2001, 9 miles off the south coast of Oahu, Hawaii, a US Navy submarine, the USS Greeneville, was demonstrating some fairly routine underwater maneuvers for a group of visitors on board, including a rapid resurfacing after a deep dive. Unfortunately, the submarine surfaced directly underneath a Japanese fishing vessel, the Ehime Maru. The submarine’s rudder sliced the Ehime Maru’s hull from starboard to port, sinking it in five minutes. Nine people aboard were killed, including four high school students and two teachers. The Ehime Maru was a training vessel for students considering a fishing career.
There was a great deal of finger-pointing and blame as a result, and the submarine’s captain was reprimanded for what turned out to be an operational error: he and his crew were distracted by their visitors and failed to accurately track the Ehime Maru’s position before resurfacing. But while such a collision is rare, it is not unprecedented. These things happen because submarines are at their most vulnerable when surfacing. Stealth is key, so they can’t use active sonar to scan for other boats as they rise, which means there’s always a stretch, a minute or so before breaking the surface, when they’re essentially flying blind.
If there were a way to give submarines a passive set of “eyes” capable of scanning the surface from below for nearby boats, that blind spot could be eliminated without sacrificing stealth, and collision avoidance wouldn’t be such a major issue. If Bruce Hubbard of Arete Associates succeeds in his quest to build a “virtual periscope,” the US Navy will have just that. Arete’s system, still under development, uses a compact set of sensors plus computer algorithms that process the data and correct for the inevitable image distortions caused by waves on the ocean’s surface.
Hubbard was on hand this morning at the IFP meeting to discuss Arete’s progress to date on the Virtual Periscope (VP) system. It seems like a simple enough concept: just place a passive sensor/camera on top of a submarine to collect all that lovely refracted light playing across the surface and then use advanced imaging techniques to reconstruct those bits of light into a usable image. “The VP system uses the ocean as a lens to give submarines the much-needed ability to image the surface,” he said.
Alas, it is an imperfect lens. If ocean waters were always calm, this wouldn’t be a problem, but surface waves of varying intensity are usually rippling across the water, and these cause serious scattering effects.
So Arete turned to computer software to process the raw data into usable imagery, based in part on a program called RenderWorld, the same software used to enhance critical water-centric scenes in the blockbuster films Titanic and Pearl Harbor. Arete adapted RenderWorld to work with two other pieces of software to produce their secret weapon: a three-part algorithm that applies multiple signal processing methods (wave estimation, deconvolution, and matched filtering) to the distorted image to correct for the aberrations.
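To make that three-part scheme a little more concrete, here is a minimal sketch in Python/NumPy of what a wave-estimation, deconvolution, and matched-filtering pipeline can look like. Nothing here reflects Arete’s actual code: the function names, the Gaussian blur model, and the Wiener filter are my own illustrative assumptions, so treat this as a generic version of the technique rather than the real thing.

```python
import numpy as np

def estimate_wave_psf(size=15, sigma=2.0):
    """Step 1, wave estimation: model the wave-induced distortion as an
    effective blur kernel (point-spread function). Here it's just a fixed
    Gaussian; a real system would infer the kernel from the imagery or
    from measured surface slopes."""
    y, x = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
    psf = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def wiener_deconvolve(image, psf, noise_power=1e-2):
    """Step 2, deconvolution: undo the estimated blur with a Wiener filter,
    which backs off at frequencies where the blur has wiped out the signal."""
    padded = np.zeros_like(image, dtype=float)
    padded[:psf.shape[0], :psf.shape[1]] = psf
    # Shift the kernel so its center sits at pixel (0, 0), the FFT convention.
    padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    H = np.fft.fft2(padded)
    G = np.fft.fft2(image)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft2(W * G))

def matched_filter_peak(image, template):
    """Step 3, matched filtering: correlate the restored image against a
    target template (say, a ship silhouette) and return the peak response."""
    t = template - template.mean()
    corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(t, s=image.shape)))
    return np.real(corr).max()

# Toy usage on synthetic data: an 8-frame stack of 128x128 "through-the-surface" images.
frames = np.random.rand(8, 128, 128)
psf = estimate_wave_psf()
restored = wiener_deconvolve(frames.mean(axis=0), psf)
print(matched_filter_peak(restored, np.ones((16, 16))))
```

The hard part, of course, is that the real “blur” changes from instant to instant as the waves move, which is exactly why the wave-estimation step matters so much.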
The critical factor is vertical angular resolution, which is around 30 degrees for the raw images; the goal is 1 degree. That would enable the VP system to detect a 100-foot-tall object at about 2000 yards, or a 50-foot object at about 1000 yards, far enough out for a surfacing submarine to avoid a collision in time.
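Those numbers are consistent with simple geometry: an object is detectable out to roughly the range at which its height subtends one resolution element. A quick back-of-the-envelope check (my arithmetic, not Hubbard’s) reproduces the figures:

```python
import math

def detection_range_yards(object_height_ft, resolution_deg):
    """Range (in yards) at which an object of the given height subtends
    exactly one resolution element of the given angular size."""
    range_ft = object_height_ft / math.tan(math.radians(resolution_deg))
    return range_ft / 3.0  # 3 feet per yard

print(detection_range_yards(100, 1.0))  # ~1900 yards for a 100-foot object
print(detection_range_yards(50, 1.0))   # ~950 yards for a 50-foot object
```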
The algorithm has so far reduced the vertical angular resolution to 3 degrees, and early testing provided proof of principle, enough for the Office of Naval Research to provide much-needed R&D funding. But detecting objects a mere 100 yards out just won’t give sub operators enough time to react and avoid a collision.
So what’s next? According to Hubbard, one goal is to combine the three software pieces into one integrated technique to further enhance clarity of the processed images. (Adaptive optics, used to correct aberrations in telescope images caused by atmospheric turbulence, cannot be used because it is an active system, and the signal would give away a surfacing submarine’s position.)
Ultimately, says Hubbard, the key will be to map the topography and dynamics of the ocean’s surface more precisely and to gain a better understanding of that constantly changing environment. Right now, the algorithms must make certain critical estimates of surface conditions before they can process images, and those estimates are still not accurate enough, although enormous improvements have been made in recent years.
If only Arete hadn’t lost its federal funding! I guess that makes the Virtual Periscope system the latest victim of draconian budget cuts for science. But I hope Arete finds the funds to continue developing the system. Our submarines clearly need better underwater “eyes.”