
Advanced Vision and Learning Lab (AVLL)

Texas A&M University College of Engineering


Histogram Layer Time Delay Neural Networks for Passive Sonar Classification

July 12, 2023

J. Ritu, E. Barnes, R. Martell, A. Dine, and J. Peeples, "Histogram Layer Time Delay Neural Network For Passive Sonar Classification," in IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA), 2023. doi: 10.1109/WASPAA58266.2023.10248102.

Underwater acoustic target detection in remote marine sensing operations is challenging due to complex sound wave propagation. Despite the availability of reliable sonar systems, target recognition remains a difficult problem. Various methods have been proposed to improve target recognition, but most struggle to disentangle the high-dimensional, non-linear patterns in the observed target recordings. In this work, a novel method combines a time delay neural network (TDNN) and a histogram layer to incorporate statistical contexts for improved feature learning and underwater acoustic target classification. The proposed method outperforms the baseline model, demonstrating the utility of incorporating statistical contexts for passive sonar target recognition. The code for this work is publicly available: https://github.com/Peeples-Lab/HLTDNN.
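
To make the idea concrete, the following is a minimal sketch in PyTorch (not the authors' implementation; see the linked HLTDNN repository for the official code). A soft histogram layer with learnable bin centers and widths pools bin-membership values over the TDNN feature maps, so the classifier receives normalized feature statistics rather than only averaged activations. The layer sizes, bin count, and input shape below are illustrative assumptions.

import torch
import torch.nn as nn

class SoftHistogram1d(nn.Module):
    """Soft histogram over the time axis with learnable bin centers/widths."""
    def __init__(self, channels, bins=16):
        super().__init__()
        self.centers = nn.Parameter(torch.linspace(-1, 1, bins).repeat(channels, 1))  # (C, B)
        self.widths = nn.Parameter(torch.ones(channels, bins))                        # (C, B)

    def forward(self, x):                  # x: (N, C, T) TDNN feature maps
        x = x.unsqueeze(-1)                # (N, C, T, 1)
        c = self.centers.unsqueeze(1)      # (C, 1, B), broadcast over time
        w = self.widths.unsqueeze(1)
        membership = torch.exp(-(w * (x - c)) ** 2)  # RBF-style bin membership
        return membership.mean(dim=2)      # (N, C, B) normalized bin counts

class HistTDNN(nn.Module):
    def __init__(self, n_features=64, n_classes=4, bins=16):
        super().__init__()
        self.tdnn = nn.Sequential(         # dilated 1D convolutions over time
            nn.Conv1d(n_features, 128, kernel_size=5, dilation=1, padding=2), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=3, dilation=2, padding=2), nn.ReLU(),
        )
        self.hist = SoftHistogram1d(128, bins)
        self.fc = nn.Linear(128 * bins, n_classes)

    def forward(self, x):                  # x: (N, n_features, T), e.g. a log-mel spectrogram
        feats = self.tdnn(x)
        return self.fc(self.hist(feats).flatten(1))

model = HistTDNN()
logits = model(torch.randn(2, 64, 400))    # two 400-frame spectrograms
print(logits.shape)                        # torch.Size([2, 4])

Because the bin centers and widths are ordinary parameters, the soft "binning" is differentiable and trains end to end with the rest of the network.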

Quantitative Analysis of Primary Attribution Explainable Artificial Intelligence Methods for Remote Sensing Image Classification

April 4, 2023

A. Mohan and J. Peeples, "Quantitative Analysis of Primary Attribution Explainable Artificial Intelligence Methods for Remote Sensing Image Classification," in IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 2023, in press. doi: 10.48550/arXiv.2306.04037.

We present a comprehensive quantitative evaluation of explainable artificial intelligence (XAI) techniques for remote sensing image classification. Our approach leverages state-of-the-art machine learning models to perform remote sensing image classification across multiple modalities. We investigate the models' results qualitatively through XAI methods and compare the XAI methods quantitatively across several categories of desired properties. Through our analysis, we offer insights and recommendations for selecting the most appropriate XAI method(s) to gain a deeper understanding of the models' decision-making processes. The code for this work is publicly available: https://github.com/Peeples-Lab/XAI_Analysis.
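
As a rough sketch of this kind of comparison (assuming a PyTorch classifier and the Captum attribution library; the backbone, inputs, and noise scale below are placeholders, and the official code is in the linked XAI_Analysis repository), several primary attribution methods can be run on the same inputs and scored with a quantitative faithfulness metric such as infidelity:

import torch
import torchvision.models as models
from captum.attr import IntegratedGradients, Saliency, InputXGradient
from captum.metrics import infidelity

model = models.resnet18(weights=None).eval()   # stand-in for a remote sensing classifier
images = torch.randn(4, 3, 224, 224)           # placeholder image batch
targets = model(images).argmax(dim=1)          # explain the predicted classes

def perturb_fn(inputs):
    # Small Gaussian perturbation required by the infidelity metric.
    noise = 0.03 * torch.randn_like(inputs)
    return noise, inputs - noise

methods = {
    "IntegratedGradients": IntegratedGradients(model),
    "Saliency": Saliency(model),
    "InputXGradient": InputXGradient(model),
}

for name, method in methods.items():
    attributions = method.attribute(images, target=targets)
    score = infidelity(model, perturb_fn, images, attributions, target=targets)
    print(f"{name}: mean infidelity = {score.mean().item():.4f}")   # lower is better

The same loop can be repeated per modality, or with additional metrics alongside infidelity, to build up the kind of quantitative comparison across desired properties described above.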

Histogram Layers for Synthetic Aperture Sonar Imagery

September 2, 2022

J. Peeples, A. Zare, J. Dale, and J. Keller, "Histogram Layers for Synthetic Aperture Sonar Imagery," in IEEE International Conference on Machine Learning and Applications (ICMLA), 2022. doi: 10.1109/ICMLA55696.2022.00032.

Synthetic aperture sonar (SAS) imagery is crucial for several applications, including target recognition and environmental segmentation. Deep learning models have led to much success in SAS analysis; however, the features extracted by these approaches may not capture certain textural information well. To address this problem, we present a novel application of histogram layers to SAS imagery. Adding histogram layer(s) to the deep learning models improved performance on both synthetic and real-world datasets by incorporating statistical texture information.
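
The sketch below (again a minimal PyTorch illustration, not the official implementation; the backbone, bin count, and fusion scheme are assumptions) shows one way a 2D soft histogram branch can sit alongside global average pooling, so the classifier receives both averaged features and statistical texture features:

import torch
import torch.nn as nn
import torchvision.models as models

class SoftHistogram2d(nn.Module):
    """Per-channel soft histogram over spatial locations with learnable bins."""
    def __init__(self, channels, bins=8):
        super().__init__()
        self.centers = nn.Parameter(torch.linspace(0, 1, bins).repeat(channels, 1))
        self.widths = nn.Parameter(torch.ones(channels, bins))

    def forward(self, x):                         # x: (N, C, H, W)
        x = x.flatten(2).unsqueeze(-1)            # (N, C, H*W, 1)
        c = self.centers.unsqueeze(1)             # (C, 1, B)
        w = self.widths.unsqueeze(1)
        membership = torch.exp(-(w * (x - c)) ** 2)
        return membership.mean(dim=2).flatten(1)  # (N, C*B) normalized bin counts

class HistCNN(nn.Module):
    def __init__(self, n_classes=5, bins=8):
        super().__init__()
        backbone = models.resnet18(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-2])  # conv features only
        self.gap = nn.AdaptiveAvgPool2d(1)
        self.hist = SoftHistogram2d(512, bins)
        self.fc = nn.Linear(512 + 512 * bins, n_classes)   # fuse pooled + histogram features

    def forward(self, x):                        # x: (N, 1, H, W) single-channel SAS image
        x = x.repeat(1, 3, 1, 1)                 # match the 3-channel backbone stem
        f = self.features(x)                     # (N, 512, h, w)
        pooled = self.gap(f).flatten(1)          # global average features
        texture = self.hist(f)                   # statistical texture features
        return self.fc(torch.cat([pooled, texture], dim=1))

model = HistCNN()
print(model(torch.randn(2, 1, 128, 128)).shape)  # torch.Size([2, 5])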
