- My most recent research is in the broad area of error correction coding for storage and communication systems, including the design, analysis, and optimization of innovative algorithms.
- Recent work can be found under:
- the "Founsure" project, storage system modeling, the Mojette-LDPC relationship, green storage (tape), ...
Please contact me at firstname.lastname@example.org if you want to work on interesting and modern problems in data storage, coding, distributed computing, multimedia communications, and related areas. As for the requirements, I'd say motivation and basic-to-intermediate knowledge of computer programming, Linux, and algorithms.
Take a look at the projects page.
- Our team at QTM focused on cloud and cold storage technologies.
Data integrity and protection is one of the mainstream objectives of the data storage business. The reliability, durability, and availability of such coded system architectures are among the contentious issues in the market. Our main aim is to develop algorithms and approximate random processes in order to predict the reliability and durability of Quantum's product line, and to help other research groups predict their own by establishing a common set of tools. Erasure correction coding adds redundancy to systems to protect data against abnormalities and failures. Our team's research has leaned toward modern erasure codes with linear-time encoding and decoding complexity. We consider coding algorithms ranging from Cauchy Reed-Solomon codes to Fountain codes and locally decodable linear codes to achieve our design objectives.
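As a toy illustration of how erasure coding adds redundancy, here is a single-parity XOR scheme sketched in Python. It is far simpler than the Cauchy Reed-Solomon and fountain codes mentioned above, and it is not Quantum's actual design; it only shows the principle that redundancy computed from the data lets a lost block be rebuilt.

```python
def encode_parity(data_blocks):
    # Append one parity block equal to the bytewise XOR of all data blocks.
    parity = bytes(len(data_blocks[0]))
    for blk in data_blocks:
        parity = bytes(a ^ b for a, b in zip(parity, blk))
    return list(data_blocks) + [parity]

def recover(stripe, lost_index):
    # Any single lost block equals the XOR of all surviving blocks (data + parity),
    # because the XOR of the whole stripe is zero.
    size = len(stripe[(lost_index + 1) % len(stripe)])
    rebuilt = bytes(size)
    for i, blk in enumerate(stripe):
        if i != lost_index:
            rebuilt = bytes(a ^ b for a, b in zip(rebuilt, blk))
    return rebuilt
```

Production codes tolerate many simultaneous failures rather than one, but the underlying idea of linear redundancy is the same.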
The biggest challenge in this whole process is adapting such neat and beautiful theory to the actual Quantum product line to develop solutions at the system level. Efficient implementations of these algorithms are integrated into products to increase customer satisfaction. Our deliverables are patents, technical papers, and system-level implementations.
- Research @ Quantum: In the summer of 2011, I started a research effort at Quantum Corp., Irvine, CA, dedicated to finding algorithmic solutions for improving the performance of real-world tape drives. In such a storage medium, performance is constrained by the dominant error patterns at the output of a maximum likelihood detector. First, I identified those dominant error events for the magnetic recording medium of interest (they turn out to be generation specific). More recently, I developed a list noise-predictive maximum likelihood detector based on periodic parity updates as an alternative to standard trellis-based Viterbi decoding for Partial Response (PR4) signals. The proposed algorithm keeps track of L paths for each state of the trellis; at each update step a decision is made and the other paths are discarded. Although a large L improves performance, it also increases computation and memory complexity. This framework is shown to eliminate up to 92% of all error events with reasonable choices of L. These exciting results led to a patent application and a possible journal paper submission.
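The L-path idea can be illustrated with a toy list Viterbi detector for a PR4 (1 - D^2) channel. This is only a sketch of the generic list-decoding mechanism, not the patented parity-update detector; the noiseless channel, the +/-1 signaling, and the known initial state are all simplifying assumptions.

```python
def pr4_output(bits, pad=(-1, -1)):
    # PR4 channel: y_t = x_t - x_{t-2}, with known preamble symbols `pad`.
    x = list(pad) + list(bits)
    return [x[i] - x[i - 2] for i in range(2, len(x))]

def list_viterbi(y, L=2, pad=(-1, -1)):
    # Trellis state = (x_{t-1}, x_{t-2}); keep the L best (metric, path) pairs
    # per state instead of the single best, as in ordinary Viterbi.
    survivors = {(pad[1], pad[0]): [(0.0, ())]}
    for yt in y:
        nxt = {}
        for (a, b), paths in survivors.items():
            for c in (-1, 1):                      # hypothesize the next symbol
                bm = (yt - (c - b)) ** 2           # squared-error branch metric
                for metric, path in paths:
                    nxt.setdefault((c, a), []).append((metric + bm, path + (c,)))
        # Prune: retain only the L lowest-metric paths entering each state.
        survivors = {s: sorted(p)[:L] for s, p in nxt.items()}
    best = min(p for paths in survivors.values() for p in paths)
    return list(best[1])
```

With L = 1 this reduces to ordinary Viterbi; larger L keeps alternative paths alive, which is what a secondary check (such as the parity updates in the actual detector) can then exploit.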
- WIRELESS MULTIMEDIA: A Cross Layer Approach: Current networked systems are often designed using multiple layers. A base layer bears significant information about the source, a skeleton of sorts, whereas the enhancement layers contain information that refines the source for better visual quality. In wireless systems, such layering can lead to highly inefficient network design because of factors unique to wireless environments, such as time-varying channels and interference. Cross-layer approaches have long been recognized as promising but were rarely implemented because of their complex inter-layer interactions and potential loss of robustness. However, recent developments have demonstrated that cross-layer optimization can be achieved while preserving the robustness of the solution.
First, I focused on a progressive source transmission system consisting of a progressive source coder, a rate-compatible punctured convolutional (RCPC) channel coder, and a hierarchical modulation module. Under a bandwidth constraint, we developed a novel packetization strategy and used it in a joint source-channel-modulation coding setting. The parameters of the system are optimized in the mean-distortion sense. See publications for more information.
- CODING FOR EMBEDDED BIT STREAMS: One of the important contributions of my thesis was a simple yet constructive concatenated coding method for embedded bit streams that achieves performance very close (within 0.25-0.3 dB in PSNR) to Shannon's capacity for BSCs. A novel packetization paradigm is coupled with two kinds of channel codes, RCPC and RC-LDPC codes, and judiciously chosen interleavers for burst-error randomization. We developed a tractable functional optimization procedure that allowed us to use constrained exhaustive search to find a globally optimal solution to the optimization problem.
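The flavor of such a constrained exhaustive search can be sketched as follows. All numbers here are made up for illustration: the rate set, the packet-failure probabilities, the bit budget, and the distortion values do not come from the thesis; the point is only that a progressive stream makes the expected distortion tractable enough to search over rate allocations directly.

```python
import itertools

RATES = (1/3, 1/2, 2/3)                      # candidate RCPC rates (hypothetical)
P_FAIL = {1/3: 0.01, 1/2: 0.05, 2/3: 0.15}   # assumed packet-failure probabilities
SRC_BITS = 256                               # source bits carried per packet
BUDGET = 2200                                # total channel-bit budget (constraint)
DIST = [100, 60, 35, 20, 12]                 # distortion after decoding 0..4 packets

def expected_distortion(alloc):
    # Progressive stream: packet j is useful only if packets 0..j all decode.
    prefix_ok, e_d = 1.0, 0.0
    for j, r in enumerate(alloc):
        e_d += prefix_ok * P_FAIL[r] * DIST[j]   # first failure occurs at packet j
        prefix_ok *= 1 - P_FAIL[r]
    return e_d + prefix_ok * DIST[len(alloc)]    # everything decoded

def best_allocation(n_packets=4):
    # Exhaustive search over all rate allocations satisfying the bit budget.
    best = None
    for alloc in itertools.product(RATES, repeat=n_packets):
        if sum(SRC_BITS / r for r in alloc) > BUDGET:
            continue                             # violates bandwidth constraint
        cand = (expected_distortion(alloc), alloc)
        if best is None or cand < best:
            best = cand
    return best
</antml_code_removed>```

The search space here is tiny (3^4 allocations); the functional optimization in the thesis is what keeps the search tractable for realistic packet counts.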
IN THE ABSENCE OF CSI OR MULTICAST: "Fountain idea". In the two previous transmission scenarios, the main assumption was that the transmitter has perfect or imperfect channel state information (CSI). In many multimedia applications, the CSI mechanism is either missing or unreliable. In other applications, such as multicast scenarios, each receiver experiences a different physical channel, so there is no single channel state to which the whole system can adapt.
In such transmission scenarios, fountain codes are a great match. Fountain codes are rateless in the sense that the number of encoded packets that can be generated from the source message is potentially limitless, and that number can be determined on the fly. In the third section of my thesis, we proposed a general fountain code design well suited to multimedia transmission. More specifically, we showed that the parameters of the proposed fountain code can be tailored so that it is best suited for progressive transmission. We also showed that the unequal iteration time (URI) performance of the proposed design is emphasized through the joint design of the different distributions that characterize a fountain code's performance. In this latter research, we aimed at finding the limiting performance bounds of a simple LT code design using a specific source transmission scenario.
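The rateless XOR encoding and the peeling (belief-propagation) decoding behind LT codes can be sketched as below. The neighbor sets are hand-picked here so the example is deterministic; a real LT code draws each packet's degree from a carefully designed distribution (the joint design of those distributions is exactly what the thesis work optimizes), and this sketch is not the proposed design itself.

```python
def lt_encode(source, neighbor_sets):
    # Each encoded packet is the XOR of the source blocks in its neighbor set.
    packets = []
    for nbrs in neighbor_sets:
        pkt = 0
        for i in nbrs:
            pkt ^= source[i]
        packets.append((set(nbrs), pkt))
    return packets

def lt_peel(packets, k):
    # Peeling decoder: a degree-1 packet reveals a source block; XOR that block
    # out of every other packet, possibly creating new degree-1 packets.
    work = [[set(n), v] for n, v in packets]
    recovered = [None] * k
    progress = True
    while progress and None in recovered:
        progress = False
        for nbrs, val in [(set(w[0]), w[1]) for w in work if len(w[0]) == 1]:
            i = next(iter(nbrs))
            if recovered[i] is None:
                recovered[i] = val
                progress = True
            for w in work:                 # peel symbol i out of all packets
                if i in w[0]:
                    w[0].discard(i)
                    w[1] ^= val
    return recovered                       # entries stay None if decoding stalls
```

The rateless property is visible in the interface: the sender can keep appending packets (neighbor sets) until the receiver's peeling completes, with no rate fixed in advance.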
OBJECT TRACKING: "Visible or Invisible?": In the summer of 2009, I had a different kind of experience with the imaging group at Mitsubishi Electric Research Laboratories (MERL), Cambridge, MA. First, I developed a tissue simulation program using a C-based engine and a finite element method for morphing an object (a tumor in our case) within a given volume. An example visualization using MATLAB is shown in the figure to the right. This tool later enabled us to obtain synthetic images and videos to test our tracking and classification algorithms. In essence, we tested each algorithm on the accuracy and reliability of detection and tracking for both visible and invisible tumor scenarios. In that figure, cross-sectional views of the 3D volume are shown at two distinct time instants along with the tracking result. I later developed a set of other algorithms: (1) improved temporal random walk tracking, (2) seedless image segmentation utilizing multiple random walks or graph cuts, and (3) learning and tracking using regression in different Lie groups.
Above photo from www.turningpointbooks.com