Characterization and Detection of Oscillating Magnetic Dipole Signals

Abstract: Magnetic anomaly detection (MAD) is one of the most important sensing techniques for detecting ferromagnetic objects using magnetometers. As a passive method, MAD has an advantage over other techniques in that it is not affected by air, water, soil, or weather conditions. MAD has various applications, including locating unexploded ordnance (UXO), passive submarine detection, and vehicle tracking. However, one of the main problems in magnetic sensing is interpreting magnetic data; therefore, the characterization and detection of magnetic targets have received much research attention in recent years.
In MAD, characterization and detection of magnetic field signals are based on a dipole approximation that is a function of two elementary parameters: the location and moment vectors of the object. These vectors are usually time-dependent, so the dipole's trajectory and moment orientation in space must be modeled. In addition, we need to understand the noise model of the ambient environment and the internal noise of the magnetometer. In order to develop an efficient MAD algorithm that achieves a high probability of detection (PD) under a constrained false alarm rate (FAR), we must understand the complete physical model of the magnetic signal.
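As a reminder, the dipole approximation referred to above is the standard point-dipole field (the notation here is assumed for illustration, not taken from the thesis): for a dipole of moment \(\mathbf{m}\) at displacement \(\mathbf{r}\) from the sensor,

```latex
\mathbf{B}(\mathbf{r}) \;=\; \frac{\mu_0}{4\pi}\,
\frac{3\,(\mathbf{m}\cdot\hat{\mathbf{r}})\,\hat{\mathbf{r}} \;-\; \mathbf{m}}{|\mathbf{r}|^{3}}
```

The object's location enters through \(\mathbf{r}(t)\) and its orientation through \(\mathbf{m}(t)\), which is why both vectors must be modeled as functions of time.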
In the present work we introduce a new model and detection algorithm for cases when the ferromagnetic object oscillates at an extremely low frequency (ELF) in a noisy environment. Initially, an experimental setup with an array of several magnetometers and an acquisition system was built in order to produce magnetic data recordings of noise, as well as oscillating magnetic dipole (OMD) signals. The input signals were analyzed in time and frequency domains in order to best select the physical model.
We show that the magnetic noise can be characterized as an autoregressive (AR) signal, and that the OMD can be characterized as a dipole with a stationary location and a rapidly changing magnetic moment angle. This periodic change of the moment angle was modeled by the equation of motion (EOM) of a non-linear gravity pendulum. The overall model was found to agree well with “real world” experiments.
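The moment-angle dynamics can be illustrated by numerically integrating the non-linear pendulum EOM, \(\ddot{\theta} = -(g/L)\sin\theta\). This is a minimal sketch; the release angle, small-angle frequency, and integration scheme are assumptions for illustration, not parameters from the thesis:

```python
import numpy as np

def pendulum_angle(theta0, omega0, g_over_l, dt, n_steps):
    """Integrate the nonlinear pendulum EOM  theta'' = -(g/L) sin(theta)
    with a velocity-Verlet scheme (good energy behavior for long runs)."""
    theta = np.empty(n_steps)
    th, om = theta0, omega0
    for i in range(n_steps):
        theta[i] = th
        acc = -g_over_l * np.sin(th)
        om_half = om + 0.5 * dt * acc
        th = th + dt * om_half
        om = om_half + 0.5 * dt * (-g_over_l * np.sin(th))
    return theta

# Example: release from rest at 60 degrees; g/L chosen so the
# small-angle frequency is ~1 Hz (an ELF-range oscillation)
theta = pendulum_angle(np.deg2rad(60.0), 0.0, (2 * np.pi * 1.0) ** 2,
                       1e-3, 20000)
```

The resulting angle trace θ(t) would then drive the orientation of the dipole moment vector, producing the periodic field signature at the sensor.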
After the model was developed and verified, the MAD algorithm was proposed. This detection algorithm consists of two parts: pre-whitening of the signal in order to flatten its spectrum, and a multiple-harmonics search based on the periodic characterization of the signal. The proposed approach, applied with a moving window, allows real-time detection and reduces non-periodic interference from temporal sources.
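The two stages can be sketched as follows. This is a minimal illustration assuming Yule-Walker AR fitting for the whitener and a periodogram-bin sum for the harmonic search; the function names, AR order, and harmonic count are assumptions, not the thesis implementation:

```python
import numpy as np

def ar_whiten(x, order=8):
    """Fit an AR(order) model via the Yule-Walker equations and return the
    one-step prediction residual, whose spectrum is approximately flat."""
    x = x - np.mean(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:len(x) - k], x[k:])
                  for k in range(order + 1)]) / len(x)
    # Solve the Toeplitz system R a = r[1:] for the AR coefficients
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    # Residual e[n] = x[n] - sum_k a[k] * x[n-1-k]
    e = x[order:] - sum(a[k] * x[order - 1 - k:len(x) - 1 - k]
                        for k in range(order))
    return e

def harmonic_statistic(x, fs, f0, n_harmonics=4):
    """Sum the periodogram energy at f0 and its first few harmonics."""
    X = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    return sum(X[np.argmin(np.abs(freqs - (h + 1) * f0))]
               for h in range(n_harmonics))
```

In a moving-window scheme, `ar_whiten` would be applied to each window and `harmonic_statistic` evaluated over a grid of candidate oscillation frequencies, with a detection declared when the statistic exceeds a threshold set by the allowed FAR.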
The performance of the algorithm was evaluated using Monte Carlo computer simulations. The simulations show that the developed detection algorithm maintains good detection capabilities at SNRs as low as -10 dB and high immunity to noise and clutter. Furthermore, compared to other benchmark detectors, the proposed algorithm gives significantly better performance. Finally, the detector was tested under realistic operational conditions by processing real data and showed good performance, with a difference of only 2.5 dB relative to the simulated performance.
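This style of PD/FAR evaluation can be sketched in a few lines. The detector below is deliberately reduced to a single periodogram bin on a pure sinusoid in white noise (an assumption for illustration, not the thesis detector or its noise model); the threshold is set empirically from noise-only trials at a chosen FAR, and PD is then estimated from signal-plus-noise trials:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n, f0 = 100.0, 1000, 2.0          # sample rate, window length, known frequency
t = np.arange(n) / fs

def statistic(x):
    """Periodogram value at the known oscillation frequency f0."""
    X = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return X[int(round(f0 * n / fs))]

# Noise-only trials -> empirical threshold for a 1% false alarm rate
noise_stats = np.sort([statistic(rng.standard_normal(n)) for _ in range(2000)])
thr = noise_stats[int(0.99 * len(noise_stats))]

# Signal-plus-noise trials at SNR = -10 dB (SNR taken as amp^2/2 over unit
# noise variance -- a convention assumed here)
amp = np.sqrt(2 * 10 ** (-10 / 10))
sig = amp * np.sin(2 * np.pi * f0 * t)
pd = np.mean([statistic(sig + rng.standard_normal(n)) > thr
              for _ in range(2000)])
```

Even this toy single-bin detector detects reliably at -10 dB because the periodogram concentrates the signal energy into one bin; a multi-harmonic statistic gains further by pooling energy across harmonics.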
*This research thesis was carried out under the supervision of Prof. Zeev Zalevsky from the Faculty of Engineering at Bar-Ilan University and Dr. Roger Alimi from Soreq Research Center.

20/05/2015 - 15:00 - 16:00
Tsuriel Ram-Cohen
Email for registration: 
Bar-Ilan University
Engineering Building 1103, Room 329