CSE 237D: Embedded System Design

Note: This class will follow the CSE 145 schedule. Please refer to that class webpage for info.

Instructor: Prof. Ryan Kastner (kastner@ucsd)
Time: Tuesday-Thursday, 12:30-2pm
Location: CSE 2154
Office Hours: After class or by appointment

This is a project-oriented class on the broad topic of embedded systems. As such, the grading is primarily based on your project, and the homeworks are set up as project milestones to ensure that you make progress over the course of the quarter.

Class Materials:

Vision

Mentors: Dr. Jung Uk Cho, Deborah Goshorn, Dr. Ali Irturk

Vision applications are appearing in many embedded systems, including robotics, security, and human-computer interaction. They utilize a number of different computing platforms, including FPGAs, GPUs, DSPs, etc.

FPGAs are excellent devices for image computation. My postdoc has developed an FPGA platform that can hook up to a camera through a variety of image inputs (LVDS, composite, component). There are many things that could be done here, most revolving around real-time image processing. We have a variety of cameras to play with (PTZ, CMOS), including those on the displays set up around the building, and there are many interesting things one could do: real-time tracking, filtering, image distortion, etc. A toy software sketch of this style of per-pixel processing appears after the list below.

More specific ideas include:

  1. GPU implementations of different vision algorithms using Jacket – GPU engine for MATLAB. I am particularly interested in object detection, recognition and tracking.
  2. Utilizing Mentor Graphics Catapult C to implement different vision algorithms.
  3. Creating a Simulink-like library for Jung’s existing image processing implementations and a simple application using them as building blocks.
  4. Performing fish detection and classification using Jung’s existing hardware platform. This would be useful to NOAA, the Birch Aquarium, and Reef Check.
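
As a flavor of the kind of per-pixel processing these projects involve, here is a toy color-threshold detector written with NumPy. The color bounds, the synthetic test frame, and the function name are purely illustrative assumptions; this is not Jung’s FPGA pipeline or Jacket’s MATLAB API, but the same classify-then-reduce structure is what one would map onto a GPU or an FPGA.

```python
# Toy color-threshold object detector (NumPy). The RGB bounds and synthetic
# test frame below are illustrative assumptions, not the actual FPGA pipeline
# or Jacket code; the point is the per-pixel classify + reduction structure.
import numpy as np

def detect_orange_blob(rgb):
    """Return the bounding box (rmin, rmax, cmin, cmax) of orange-ish pixels."""
    r = rgb[:, :, 0].astype(float)
    g = rgb[:, :, 1].astype(float)
    b = rgb[:, :, 2].astype(float)
    # Per-pixel classification: strong red, moderate green, weak blue.
    mask = (r > 150) & (g > 60) & (g < 170) & (b < 100)
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    return rows.min(), rows.max(), cols.min(), cols.max()

# Synthetic 120x160 test frame: gray background with an orange rectangle.
frame = np.full((120, 160, 3), 90, dtype=np.uint8)
frame[40:70, 50:100] = (230, 120, 30)
print(detect_orange_blob(frame))  # -> (40, 69, 50, 99)
```

A real-time version would replace the NumPy array operations with a streaming pixel pipeline on the FPGA or a GPU kernel, but it would be validated against a software model of this form.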

Autonomous Underwater Vehicle

Mentors: Chris Barngrover, Thomas Denewiler

The Stingray is an unmanned underwater vehicle that was initially developed by a group of undergraduates at UCSD and has since evolved into a much bigger project. The team's main goal is to compete in the annual AUVSI AUV contest held every summer. My interest is to turn this into a vehicle that can be used for coastal and harbor exploration. Example applications include mapping coral reef topology, detecting and tracking harmful chemicals, searching boats for unidentified objects (e.g., mines), etc. There are a number of research goals needed for both of these pursuits. These include:

  1. Develop a more efficient networking software infrastructure that allows the various components of the Stingray (control loop, planning, vision, data logging, battery monitoring, etc.) to run more effectively. Much of the initial software is already built, but with hacks that do not scale well. One example would be to utilize the Boost libraries.
  2. Re-write the networking/messaging code. It would be nice to have a proper TCP and UDP server-client library, and then build a messaging library on top of that networking library. Right now the networking and messaging are a bit buggy and the two pieces are dependent on each other.
  3. Re-write the GUI. Clean up the Gtk code or (preferably) re-write it using OpenGL. Add the ability to embed video from the robot in the GUI; this would work for live and recorded video/images because the video server takes care of that part, but it would be nice to have two window widgets for displaying that video. Add the ability to display real-time graphs of data in the GUI, e.g., target yaw vs. current yaw, so we could see how well our controller is working. Maybe even have a tab (or something similar) with ~6 graphs of different things, or a way to turn on the graphs of certain data when you want to see it.
  4. Improving the camera. Calibration: there are offline methods for determining the intrinsic camera properties using known objects in the scene, which could be useful for future video work. Image collection and labeling: this would involve taking the cameras out into the water, recording images/video, and then labeling them based on what objects are in the scene. Different light and water conditions would be useful as well.
  5. Integrate a chemical sensor into the Stingray. Prof. Joe Wang has developed a number of chemical sensors for tracking harmful chemicals (e.g., explosives, chemicals from runoff). I would like to integrate one of these into the Stingray, enabling it to find and track chemicals of interest.
  6. Tune the current image processing algorithms for underwater object detection. We have created a number of algorithms that can find buoys and pipes (for the AUVSI contest) using color classification. We need to enhance these algorithms and further integrate them with the Stingray control systems.
  7. Enhanced battery monitoring and energy saving schemes. We have code that will monitor the batteries, in theory. I would rather see this re-written so that it hooks into the Linux ACPI system, just like laptops monitor their batteries. Then the Stingray computers will shut down when they are supposed to, and we can still get access to the battery information through standard system calls instead of our own program. Based on this you could turn parts of the system off to save energy when necessary.
  8. Figure out how to use quaternions instead of Euler angles so that we can do flips and barrel rolls. Most IMUs (ours included) have a singularity when some of the angles (pitch and roll, not yaw) become too large, because the trigonometry they use ends up dividing by zero; quaternions avoid this. For example, if the Stingray is sitting level and we want to roll 45 degrees clockwise, I just tell the roll control loop to go to 45 degrees and I am done. How would this maneuver be accomplished using quaternions? (A minimal sketch appears after this list.) We could log both Euler angles and quaternions from the IMU and rotate the Stingray in every dimension, so that we could look at when the IMU breaks down (documentation says it’s around 70 degrees) and see whether we get smooth 360-degree rotations using the quaternions.
  9. Create an image processing box for use in the Stingray and other AUVs/UUVs. The Navy is interested in developing a combined optical/acoustic sensing system for detecting things like mines. This would use both visual and acoustic information to identify and inspect potential targets.
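
To make the quaternion idea in item 8 concrete, here is a minimal sketch in plain Python (standard library only). The axis and sign conventions are assumptions for illustration and would have to be matched to the actual IMU; the point is that rotations compose by quaternion multiplication with no angle at which the math breaks down.

```python
# Minimal quaternion sketch (standard library only). The conventions here
# (unit quaternion q = (w, x, y, z), body x-axis = roll axis) are assumptions
# for illustration and must be matched to the actual IMU.
import math

def quat_from_axis_angle(axis, angle_rad):
    """Unit quaternion for a rotation of angle_rad about a unit axis."""
    ax, ay, az = axis
    s = math.sin(angle_rad / 2.0)
    return (math.cos(angle_rad / 2.0), ax * s, ay * s, az * s)

def quat_multiply(a, b):
    """Compose rotations: apply b first, then a (Hamilton product)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q (q * v * q_conjugate)."""
    qv = (0.0, v[0], v[1], v[2])
    w, x, y, z = quat_multiply(quat_multiply(q, qv), (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

# A 45-degree roll about the body x-axis, then another 45 degrees on top of it:
roll45 = quat_from_axis_angle((1.0, 0.0, 0.0), math.radians(45))
roll90 = quat_multiply(roll45, roll45)   # composition has no singular angles
print(rotate(roll90, (0.0, 1.0, 0.0)))   # body y-axis rotated to ~(0, 0, 1)
```

One common approach is for the controller to compare the IMU's quaternion against a target quaternion like roll45 directly, rather than differencing Euler angles, which is what makes full flips and barrel rolls representable.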

Underwater Communication
Mentors: Dr. Feng Tong, Bridget Benson

We are building a hardware platform that interfaces with a variety of sensors, stores and processes sensor data, and wirelessly transmits the data. The end result is something akin to an “underwater mote”. There are a number of potential projects here, including:

  1. Experiment with different physical layer communication protocols. Currently we are using frequency shift keying (FSK); more advanced protocols include phase shift keying (PSK), DSSS, OFDM, etc. We are working on a digital acquisition system that enables us to transmit and record different signals underwater. The first part of the project would be to get this system working. Then you could transmit different signals in a tank, pool, Mission Bay, etc. and analyze their performance. Finally, you could implement the chosen protocol in software/hardware (a minimal FSK modulation sketch appears after this list).
  2. Develop and simulate network protocols for wireless communication to conserve energy.
  3. Integrate the DataTurbine software, a streaming data server, into the system.
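
To give a flavor of the physical-layer work in item 1, here is a minimal binary FSK modulate/demodulate sketch using NumPy. The tone frequencies, bit rate, and sample rate are made-up illustrative values, not the parameters of our acoustic hardware.

```python
# Minimal binary FSK sketch (NumPy). Frequencies, bit rate, and sample rate
# below are illustrative assumptions, not our modem's actual parameters.
import numpy as np

FS = 48000           # sample rate (Hz)
F0, F1 = 2000, 4000  # tone for bit 0 / bit 1 (Hz)
BIT_SAMPLES = 480    # samples per bit (100 bits/s)

def fsk_modulate(bits):
    """Map each bit to a burst of the corresponding tone."""
    t = np.arange(BIT_SAMPLES) / FS
    tones = {0: np.sin(2 * np.pi * F0 * t), 1: np.sin(2 * np.pi * F1 * t)}
    return np.concatenate([tones[b] for b in bits])

def fsk_demodulate(signal):
    """Non-coherent detection: compare energy at F0 vs F1 per bit period."""
    t = np.arange(BIT_SAMPLES) / FS
    ref0 = np.exp(-2j * np.pi * F0 * t)
    ref1 = np.exp(-2j * np.pi * F1 * t)
    bits = []
    for i in range(0, len(signal), BIT_SAMPLES):
        chunk = signal[i:i + BIT_SAMPLES]
        e0 = abs(np.dot(chunk, ref0[:len(chunk)]))
        e1 = abs(np.dot(chunk, ref1[:len(chunk)]))
        bits.append(1 if e1 > e0 else 0)
    return bits

np.random.seed(0)
data = [1, 0, 1, 1, 0, 0, 1, 0]
rx = fsk_modulate(data) + 0.2 * np.random.randn(len(data) * BIT_SAMPLES)
assert fsk_demodulate(rx) == data  # survives mild additive noise
```

Moving to PSK or OFDM would replace the two functions above while the surrounding acquisition, recording, and analysis flow stays the same.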

Hardware Security 
Mentors: Jason Oberg, Vinnie Hu

This area has seen a lot of interest in recent years. I have been working on security for FPGAs for the past few years as part of two NSF-funded projects called RCSec and 3DSec. The goal of the RCSec project is to investigate security primitives for FPGAs. The 3DSec project aims to use 3D integrated circuit technology for the purpose of securing modern processors. An example of this would be to use a separate computational layer to monitor and ensure separation of cache resources. It has been shown that a process can modify the cache entries of another process running some encryption scheme and eventually find the key. I believe that we can use 3D technology to mitigate or eliminate such attacks. There are many other interesting problems that could be solved, including data tagging and tracking, monitoring of shared resources to eliminate side channel attacks, etc.

Financial Computation 
Mentor: Dr. Ali Irturk

Many financial institutions that trade commodities (stocks, options, …) perform a wide range of analysis on prices and trends in trading. These analyses are often done overnight on large servers. There is the potential for huge benefits (aka making a lot of money) if one could do such analysis faster, e.g., in real time. This is a new interest of mine, so projects in this area are somewhat undefined at this point. I would expect you to study the area in great detail and propose new, interesting opportunities for research.

Signal Processing for Embedded Devices
Mentor: Dr. Ali Irturk

Design, simulate, and synthesize a signal processing core. I’m particularly interested in things like matrix multiplication, matrix inversion, FFT, correlation, QR decomposition (QRD), QRD-M, Kalman filters, GSIC/MP, JPEG/MPEG compression, etc. I’m also open to other applications. There are many different ways to develop these cores. One natural, incremental way is to first develop the application in C, Java, MATLAB, or your other favorite high-level language. Then you can port that code to some acceleration engine, e.g., a GPU or FPGA. This requires architecture-specific optimization, e.g., extracting task-level parallelism and adding custom functions to speed up computationally intensive parts. The FPGA implementation could use Catapult C to create the RTL HDL. Another route (for those who are more savvy with HDL) is to directly develop, simulate, and synthesize an RTL HDL core.
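
As a concrete example of the "high-level model first" flow described above, here is a minimal QR decomposition reference model (modified Gram-Schmidt) in Python with NumPy. The matrix size and the use of NumPy are illustrative assumptions; a later Catapult C or GPU port would start from, and be checked against, exactly this kind of golden model.

```python
# Minimal QR decomposition reference model (modified Gram-Schmidt, NumPy).
# Intended as a "golden" software model to validate a later GPU/FPGA port.
import numpy as np

def qr_mgs(A):
    """Return Q (orthonormal columns) and R (upper triangular) with A = Q @ R."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        v = A[:, k].copy()
        for j in range(k):                 # remove components along earlier columns
            R[j, k] = Q[:, j] @ v
            v -= R[j, k] * Q[:, j]
        R[k, k] = np.linalg.norm(v)
        Q[:, k] = v / R[k, k]
    return Q, R

A = np.random.randn(6, 4)
Q, R = qr_mgs(A)
assert np.allclose(Q @ R, A)               # reconstruction matches
assert np.allclose(Q.T @ Q, np.eye(4))     # columns are orthonormal
```

The same model doubles as the test bench reference: hardware or GPU outputs would be compared against Q and R within a floating-point (or fixed-point) tolerance.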

Architectural Synthesis
Mentors: Dr. Ali Irturk, Dr. Gang Wang

Part of the material presented in class will focus on the scheduling problem, its formulation, and effective algorithms for solving it. One of my former PhD students has developed an instruction scheduling framework that you can enhance for part of your project. This framework has a set of benchmarks and many different methods for time- and resource-constrained scheduling. There are several project ideas, including:

  1. Multiple operation to resource mapping. The current framework does not handle the case where a resource can handle multiple operations (e.g. an ALU can do add, subtract, shift, OR). Develop an algorithm to effectively partition operations to resources.
  2. Integer Linear Program (ILP) formulations: Develop code to generate ILP formulations of various scheduling problems. The input to your program is a DFG and the output is an ILP formulation that can be solved by CPLEX, a commercial ILP solver (a minimal sketch of such a formulation appears after this list). Compare the performance of the various benchmarks and scheduling formulations.
  3. Enhance the visualization tools for the scheduling results
  4. Scheduling is a fundamental problem in a variety of areas. The existing framework is highly flexible and is easily adapted. It would be interesting to utilize the framework to tackle other scheduling problems, e.g. multiprocessor scheduling, distributed scheduling, etc.
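
To make the ILP idea in item 2 concrete, here is a minimal sketch that writes a time-indexed formulation (binary variable x_op_t means "operation op starts at time t") for a toy DFG in the LP file format that CPLEX reads. The four-operation DFG, unit latencies, time horizon, and one-unit resource limits are illustrative assumptions, not the framework's benchmarks or its actual model.

```python
# Minimal sketch: emit a time-indexed scheduling ILP (CPLEX LP-file format)
# for a toy DFG. The DFG, unit latencies, horizon, and resource limits are
# illustrative assumptions, not the existing framework's benchmarks.

# Toy DFG: op name -> (resource type, list of predecessor ops)
DFG = {
    "m1": ("mul", []),
    "m2": ("mul", []),
    "a1": ("add", ["m1", "m2"]),
    "a2": ("add", ["a1"]),
}
HORIZON = 6                       # allowed start times 0 .. HORIZON-1
RESOURCES = {"add": 1, "mul": 1}  # units available per type (unit latency)

def x(op, t):
    return f"x_{op}_{t}"

lines = ["Minimize",
         # minimize the start time of the final operation "a2"
         " obj: " + " + ".join(f"{t} {x('a2', t)}" for t in range(1, HORIZON)),
         "Subject To"]

# Each operation starts exactly once.
for op in DFG:
    lines.append(f" start_{op}: " + " + ".join(x(op, t) for t in range(HORIZON)) + " = 1")

# Precedence: start(op) >= start(pred) + 1 (unit latency).
for op, (_, preds) in DFG.items():
    for p in preds:
        pos = " + ".join(f"{t} {x(op, t)}" for t in range(1, HORIZON))
        neg = " - ".join(f"{t} {x(p, t)}" for t in range(1, HORIZON))
        lines.append(f" prec_{p}_{op}: {pos} - {neg} >= 1")

# Resource bound: at most RESOURCES[r] operations of type r start in any step.
for t in range(HORIZON):
    for r, limit in RESOURCES.items():
        ops = [op for op, (rtype, _) in DFG.items() if rtype == r]
        lines.append(f" res_{r}_{t}: " + " + ".join(x(op, t) for op in ops) + f" <= {limit}")

lines.append("Binaries")
lines.append(" " + " ".join(x(op, t) for op in DFG for t in range(HORIZON)))
lines.append("End")
print("\n".join(lines))
```

The printed text can be saved to a .lp file and handed to CPLEX (or any solver that accepts LP format); a real generator would walk the framework's DFG data structures instead of the hard-coded dictionary above.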

Biomedical Imaging
Mentor: Dr. Ali Irturk
I am working on a number of preliminary projects related to accelerating biomedical imaging. These include optical mapping of heart tissue, stem cell identification and selection, and radiotherapy. All of these applications are, for the most part, currently done in a tedious, non-optimized manner and involve hours or days of manual “computation”. Automation of these techniques could have profound effects on experiments, treatments, and therapies for ailments ranging from heart disease to cancer. These projects are all in their early stages and involve understanding the algorithm and translating it onto a computation device (GPU, FPGA, multicore processor) with the hope of creating a real-time implementation.

Other Applications

I am open to other projects. However, these should not be “off the cuff”. They should be well developed ideas that you have experience working on. Please contact me early to discuss potential ideas.

UCSD CSE237D – Spring 2011

Nathan Manning, Julian Fessard – webpage

Cory Li – website

Danny Anderson – website

Aruna Ravinagarajan – website

Joon Lee, Karthik Nagabhushana Sanji, Nameeta Patil – website

Eric Yip, Michael Chin, Seemanta Dutta – website

Jason Oberg – website

Ming Jia, Xia Zhang – website

Avinash Anathakrishnan – website

Yashar Asgarieh – website

UCSD CSE237D – Winter 2010

Per Magnus Østhus, Per Christian Corneliussen – webpage

Krishnam Raju Indukur – webpage

Harshit Chitalia, Po-Chao Huang – webpage

Liang Cheng – webpage

Digvijay Dalapathi – webpage

Janarbek Matai – webpage

Christopher Lei – webpage

Germán Alfaro – webpage

Laura Pina – webpage

Derick Johnson – webpage

Sam Wood, Alex Indaco – webpage

Pingfan Meng – webpage

Pavan K Nimmagadda – webpage

Hayden Gomes – webpage

Hemanth Meenakshisundaram – webpage

Nima Nikzad – webpage

Kaisen Lin – webpage

Yuan Wang – webpage

UCSD CSE237D – Spring 2009

Priti Aghera – webpage

Chris Barngrover – webpage

John Kooker – webpage

Maulin Patel, Grace Wang, Tyler Netherland – webpage

Edoardo Regini – webpage

Erik Rubow – webpage

Ishani Selvaratnam – webpage

UCSD CSE237D – Spring 2008

Liang (Leo) Chen – webpage

Chun Chen Liu – webpage

Jamie Bradley Steck and JunJie Su – webpage

Frank Zhang – webpage

UCSB ECE253 – Winter 2007

Hsiu-Ming (Sherman) Chang – webpage

Prerna Prerna, Ajay Ramji – webpage

Sydney Pang – webpage

Vivek Padi – webpage

Brett Brotherton, Nick Callegari, Ted Huffmire – webpage

Rami Yassine, Tyler Barton – webpage

Navraj Chohan – webpage

Sean Gordoni – webpage

Adam Volk – webpage

UCSB ECE253 – Winter 2006

Ali Irturk, Shahnam Mirzaei – webpage

Jimmy Lin – webpage

Morten Engen – webpage

Dongwoo Hong – webpage

Banit Agrawal – webpage

Ricardo Sanfelice – webpage

Ryan Avery – webpage

Roopa Chari, Justin Kane – webpage

David Ansari – webpage

Kai Yang – webpage

Adam Brill – webpage

Class Calendar: