Tracking Underwater Robots Using “Pesky” Snapping Shrimp

University of California Division of War Research illustration of natural underwater sounds, including snapping shrimp, which cause interference with sonar and other underwater acoustic devices. 1944 Scripps Institution of Oceanography Photographs

Snapping shrimp are amazing creatures. They use their claw to create a powerful “snap” that induces a cavitation bubble that stuns their prey. This tiny bubble reaches temperatures of up to 8,000 degrees Fahrenheit, and its collapse produces a sound of up to 200 decibels. These little shrimp are so amazing that they were even discussed on a Radiolab episode.

Typically, these shrimp are so loud that they create large disturbances in the acoustic field, which negatively affect wireless underwater communication and robotic localization. This is illustrated by an old (1944) UCSD Scripps Institution of Oceanography cartoon from the UCSD Library Digital Collection. But what if we could use these snaps in a beneficial way to find the positions of a swarm of underwater vehicles?

Our recent work published in IEEE Access shows how to track a swarm of underwater vehicles using passive signals present in the ocean’s ambient soundscape. We demonstrate our method using noise from these “annoying” snapping shrimp on a swarm of underwater vehicles deployed off San Diego’s coast. Our self-localization estimate compares well against the current state of the art. This video depicts the localization from both techniques: our snapping-shrimp-derived localization scheme and the standard, infrastructure-heavy technique of deploying fixed buoys with active acoustic pingers. Our method is a first step towards an infrastructure-free, low-power, high-endurance localization technique for underwater vehicle swarms.
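At the heart of any passive acoustic scheme like this is figuring out when the same snap arrives at different receivers. As a rough illustration (this is a toy sketch, not the method from the paper; all names and numbers here are made up), the time difference of arrival between two hydrophones can be estimated by cross-correlating their recordings:

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Estimate the time difference of arrival (in seconds) of an
    impulsive sound between two hydrophones via cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # peak offset in samples
    return lag / fs

# Toy example: a "snap" arriving 5 samples later at hydrophone B.
fs = 1000.0
snap = np.zeros(100)
snap[20] = 1.0              # snap hits hydrophone A at sample 20
delayed = np.roll(snap, 5)  # ...and hydrophone B at sample 25
print(estimate_tdoa(delayed, snap, fs))  # 0.005 s
```

Given such time differences from several receiver pairs, the snap source (or, conversely, the receiver positions) can then be solved for via multilateration.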

Fixing Noisy Neuromorphic Vision Sensors

Neuromorphic vision sensors are an uncommon and new(-ish) way to perform analog feature detection in an event-based manner. They employ a fundamentally different sensing technique compared to “typical” cameras. This results in a more complex pixel detector architecture, which in turn means their noise differs from that of ordinary vision sensors. The picture shows KRG PhD student Alireza Khodamoradi, who recently presented our research in this domain in a paper titled “O(N)-Space Spatiotemporal Filter for Reducing Noise in Neuromorphic Vision Sensors”. In it, we analyzed this noise and introduced a novel filter with higher accuracy and minimal memory complexity compared to previous works. The paper was accepted to the IEEE International Conference on Computer Design (ICCD) and was invited to IEEE TETC, a special honor extended only to the best papers. We plan to have it published, after journal reviews, by early 2018.
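To give a flavor of the problem (this is an illustrative toy, not the exact filter from the paper): event-based denoisers typically pass an event only if correlated activity was seen nearby in time, and the O(N)-space idea is to replace a full per-pixel timestamp map (O(N²) for an N×N sensor) with per-row and per-column memories:

```python
import numpy as np

class SpatiotemporalFilter:
    """Toy event-denoising filter in the spirit of O(N)-memory designs:
    keep one recent-event timestamp per row and per column instead of a
    full 2-D timestamp map. An event is kept only if both its row and
    its column saw activity recently. Illustrative sketch only."""

    def __init__(self, size, window_us=1000):
        self.row_ts = np.full(size, -np.inf)  # last event time per row
        self.col_ts = np.full(size, -np.inf)  # last event time per col
        self.window = window_us

    def process(self, x, y, t):
        # Correlated if recent activity was seen in this row AND column.
        recent = bool((t - self.row_ts[y] <= self.window) and
                      (t - self.col_ts[x] <= self.window))
        self.row_ts[y] = t
        self.col_ts[x] = t
        return recent  # True -> keep event, False -> likely noise

f = SpatiotemporalFilter(128)
print(f.process(10, 10, 0))    # False: isolated first event
print(f.process(10, 10, 100))  # True: correlated follow-up event
```

The memory cost is two small arrays of length N rather than an N×N map, which is what makes this style of filter attractive for on-chip implementation.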

2017 Research Group Retreat

Our research group made our annual pilgrimage to Mammoth Lakes, CA to kick off the beginning of the (academic) year. The evenings were filled with discussions of past, current, and future research over delicious homemade food and drink. The days were spent enjoying all the activities that the Eastern Sierra has to offer. The group hike this year was the Twenty Lakes Basin Loop Trail (though about half of us decided to only tackle the loop around Saddlebag Lake). Those that did the entire 10-mile route were rewarded with a snow cave unlike anything I’ve ever seen. Quentin — our resident group photographer and videographer — created a wonderful video showing the epicness of the hike (the snow cave appears around 0:45).

Team Rabbit Ears places 2nd in Software Defined Radio Contest

The RFNoC & Vivado HLS Challenge is an open invitation to create innovative and useful open-source RF Network on Chip (RFNoC) applications. The goal was to highlight the productivity of Xilinx Vivado High-Level Synthesis (HLS) design tools using the National Instruments/Ettus Research Universal Software Radio Peripheral (USRP) hardware. The USRP is one of the most successful hardware platforms for software defined radio.

Team Rabbit Ears was up to the challenge. Team members Alireza Khodamoradi (CSE PhD student in our research group), Andrew Lanez (Wireless Embedded Systems MAS alumnus), and Sachin Bharadwaj Sundramurthy (CSE MS student) created an HDTV receiver block that can pick up HDTV broadcasts over the air. Have a look at their video below for more details.

They were awarded second place, which comes with a complete USRP system from Ettus Research and a presentation at the 2017 GNU Radio Conference. If you want all of the details, their work is open source and available on the Xilinx GitHub repository.

Congrats to Alireza, Andrew, and Sachin! What a great team spanning multiple graduate programs in CSE!

Cognex Funds Hardware Accelerated Computer Vision Project

Our group has a long history with Cognex. The company itself is headquartered in Natick, MA, but they have a growing research lab in San Diego. Our group was first “raided” for talent when John McGarry (Sr. VP R&D) hired (my then post-doc) Ali Irturk. Since then, they have expanded substantially, including a large number of people with direct ties to the Kastner Research Group (KRG): Isaac Philips (undergraduate KRG researcher and MS student), Janarbek Matai (KRG PhD alumnus), and Wireless Embedded Systems (WES) MAS alumnus Chris Neuhauser. Several of our current graduate students have also spent time at Cognex, including Alric Althoff and Alireza Khodamoradi (also a WES MAS alumnus). The picture shows those at Cognex SD research with UCSD ties.

We are happy to continue our research collaboration with Cognex. Their high-speed image sensors (operating at tens of thousands of frames per second!) provide unique research problems, and we enjoy seeing our ideas transitioned into their products. More specifically, our collaboration has resulted in several major technical contributions, including developing hardware-accelerated architectures for compressed sensing and investigating novel event-based sensors. We look forward to our future research collaborations with Cognex. And hopefully, we will continue to serve as a pipeline for future Cognex employees.

Press releases: UCSD, UCSD Jacobs School, UCSD Computer Science

Using Drones to Automate the Understanding of Mangroves

Our latest Engineers for Exploration (E4E) project aims to use a multispectral camera on an unmanned aerial vehicle to help our scientific collaborators better understand mangrove ecosystems. Research engineer Eric Lo, along with Nikko Bouck and Brynn Hall (two of our summer E4E students), accompanied our SIO collaborators from the Aburto Lab to Bahia Magdalena, Baja California Sur to perform an initial survey of the mangroves in the region. This data is currently being processed, with the goal of automatic, large-scale aerial mangrove classification. We are investigating a number of supervised and unsupervised machine learning techniques to perform species-level classification. Our ultimate goal is to track changes to the mangrove ecosystems over time.
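As a minimal illustration of the kind of per-pixel feature such a pipeline typically starts from (this is a generic remote-sensing sketch, not our actual classifier, and the threshold is made up), a vegetation index computed from the multispectral bands can flag candidate vegetation pixels before any species-level classification happens:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, a standard first feature
    for vegetation mapping from multispectral imagery."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids 0/0

def vegetation_mask(nir, red, threshold=0.4):
    # Pixels above the NDVI threshold are candidate vegetation; a real
    # pipeline would feed band values into a trained classifier instead.
    return ndvi(nir, red) > threshold

# Toy 2x2 scene: one vegetated pixel (high NIR reflectance), rest bare.
nir = np.array([[0.8, 0.2], [0.3, 0.1]])
red = np.array([[0.1, 0.2], [0.3, 0.1]])
print(vegetation_mask(nir, red))
```

Species-level classification is, of course, far harder than thresholding one index; band ratios like this are just the raw features the supervised and unsupervised methods operate on.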

For more information, see the blog post by Aburto Lab member Astrid Hsu – Mangroves from Above and Below.

This project was featured in thisweek@ucsandiego.

Dr. Dajung Lee Successfully Defends Her PhD Thesis

Dajung put the finishing touches on her PhD with a successful defense. She has been working broadly in the realm of hardware acceleration. Her research specifically focused on developing a system that can analyze high-speed cellular images in real time. Imaging flow cytometry is a technique that enables cellular studies, e.g., determining the presence of cancerous cells, identifying mature stem cells, and characterizing sickle cell anemia. It uses cameras operating at greater than 10,000 frames/sec, and thus requires special hardware to analyze the images. This is especially important for enabling cell sorting rather than just cell screening. Dajung’s thesis details her work on algorithmic and hardware design techniques for performing real-time cellular analysis. Her next stop is as a researcher at Intel in San Diego, where she will continue working on hardware acceleration (though targeting a different application). We will miss both Dajung and Coco (pictured in the “Questions?” slide).

Two Research Papers at Design Automation Conference

"Hi, How Are You?" by Daniel Johnston

Our group had two papers at the Design Automation Conference, which was held in Austin, TX. This is the biggest conference for the design and automation of electronic systems. Our two papers were both related to hardware security.

The first paper, “Arbitrary Precision and Complexity Tradeoffs for Gate-Level Information Flow Tracking”, was a highly collaborative work spanning North America (UCSD), Europe (EPFL), and Asia (NPU). This work started during my sabbatical at EPFL (Fall 2015 – Winter 2016). The research looks at the tradeoffs between accuracy and speed for different library elements in our gate-level information flow tracking work.
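To make the idea concrete, here is the textbook GLIFT example of precise taint propagation for a single 2-input AND gate, sketched in Python for readability rather than as the gate-level logic the paper's library elements implement: the output is tainted only if a tainted input can actually change the output value.

```python
def glift_and(a, a_t, b, b_t):
    """Precise GLIFT taint propagation for a 2-input AND gate.
    a, b are the data bits; a_t, b_t are their taint bits."""
    out = a & b
    # Output is tainted if: both inputs are tainted, or one tainted
    # input is paired with an untainted input whose value (1) lets it
    # propagate through the AND gate.
    out_t = (a_t & b_t) | (a_t & b) | (b_t & a)
    return out, out_t

# Tainted 'a' cannot influence the output when b == 0, so no taint flows:
print(glift_and(1, 1, 0, 0))  # (0, 0)
# ...but it does flow when b == 1:
print(glift_and(1, 1, 1, 0))  # (1, 1)
```

This precise rule is more expensive than conservatively OR-ing the input taints, and that accuracy-versus-cost tension for each library element is exactly the tradeoff the paper explores.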

The second paper, “An Architecture for Learning Stream Distributions with Application to RNG Testing”, described Alric’s latest research on a low-complexity hardware estimator for cumulative distribution functions. This is broadly useful for summarizing the internals of integrated circuits. The paper used monitoring the security of a random number generator as an exemplar application, but the technique is applicable to many other domains.
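As a software sketch of the general idea (not the architecture in the paper; the bin count and interface here are invented for illustration), a bank of counters over fixed bins can maintain a streaming estimate of a CDF, the kind of structure that maps naturally to hardware:

```python
import numpy as np

class StreamCDF:
    """Streaming empirical-CDF estimate over fixed bins. Each update
    touches one counter, so the structure is cheap and hardware-friendly.
    Illustrative sketch only."""

    def __init__(self, lo, hi, n_bins=64):
        self.edges = np.linspace(lo, hi, n_bins + 1)
        self.counts = np.zeros(n_bins, dtype=np.int64)
        self.n = 0

    def update(self, x):
        # Increment the counter for the bin containing x (clamped).
        i = np.searchsorted(self.edges, x, side="right") - 1
        self.counts[min(max(i, 0), len(self.counts) - 1)] += 1
        self.n += 1

    def cdf(self, x):
        # Fraction of samples seen so far that fall at or below x.
        i = np.searchsorted(self.edges, x, side="right") - 1
        i = min(max(i, -1), len(self.counts) - 1)
        return self.counts[: i + 1].sum() / max(self.n, 1)

est = StreamCDF(0.0, 1.0)
for k in range(100):
    est.update((k + 0.5) / 100)   # uniform samples on [0, 1)
print(est.cdf(0.5))               # close to 0.5 for uniform data
```

For RNG monitoring, the estimated CDF of the generator's output stream can be compared against the expected uniform CDF, and a drift between the two flags a possible failure or attack.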

Both Alric and Andrew did a great job in their presentations. And congrats to all the authors!

Oh, and by the way, next year I’ll be a part of DAC’s “Special Focus Committee”. If you have any thoughts on how to make the program better (in particular, with respect to security), please get in touch with me. I would like to hear your ideas.

Finally, the picture is an example of “keeping Austin Weird”; it is the “Hi, How Are You?” mural from the great Daniel Johnston.

Wei Hu Begins “Post” Post-Doc Career

“Vinnie” Wei Hu has been in our research group for four years over two separate stints — two years as a visiting graduate student (shortly after I moved from UCSB to UCSD) and the past two years as a post-doc. During that time, he published many of the fundamental papers related to GLIFT and has more recently been the leader of our security research group. As anyone who has worked with him knows, he is a patient mentor and an outstanding researcher. He will certainly be missed. But alas, all the best people eventually need to leave and go on to do other great things. Vinnie will be doing this as a professor at Northwestern Polytechnical University in Xi’an. We look forward to continued collaborations with him and his students.

A Modern Take on Palissy Rusticware

Bernard Palissy was a 16th-century French ceramicist known for making large decorative platters from casts of animals (reptiles, fish, crustaceans, etc.). As part of their class project for CSE 145, Erica Sugimoto and Christopher Chinowth, along with their staff mentor Eric Lo, are recreating his artistic process with a modern spin. Instead of using dead animals as a mold, they are using Intel RealSense depth cameras to capture a 3D model of a live animal. This model will then be used by artist Miljohn Ruperto to create a modern version of Palissy ware. The artwork is scheduled to be displayed at the Haus der Kulturen der Welt (House of World Cultures) in Berlin, Germany within the next year.

The first try at capturing the data happened today. We had the cameras and the 3D data capture software ready; we just needed a subject. For that, we enlisted “snake wrangler” Josh Ruffell, who brought in three large (4–5 ft) Florida King Snakes. The second of the snakes, who goes by the name “Florida King Snake #2”, made for the best model. Erica and Chris have a few more weeks before the end of the quarter to post-process the data and determine the best models for Miljohn.
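For the curious, the core geometric step in turning depth frames into a 3D model is back-projecting each depth pixel through a pinhole camera model. A minimal sketch (assuming known intrinsics, with made-up values below; the real pipeline uses the RealSense SDK and registration across multiple cameras):

```python
import numpy as np

def depth_to_pointcloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into a 3-D point cloud using
    a pinhole camera model. fx, fy are focal lengths in pixels and
    cx, cy the principal point; RealSense SDKs expose these intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)

# Toy 2x2 depth frame, 1 m everywhere; intrinsics are placeholders.
depth = np.ones((2, 2))
pts = depth_to_pointcloud(depth, fx=600.0, fy=600.0, cx=0.5, cy=0.5)
print(pts.shape)  # (2, 2, 3)
```

Merging the per-camera point clouds from several registered viewpoints is what yields the full watertight model the artist can work from.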

More information can be found in a slideshow developed by Erica and Chris, presented earlier this quarter as part of their CSE 145 class project overview (Palissy Snake Project Overview), or on their class project webpage (Palissy Snake Homepage).

The class has 12 other projects. To learn about those, check out the Class Project Webpage.

Special thanks to Intel (in particular Byron Gillespie) for providing the RealSense Cameras.