
Position Statement on Application of Bioacoustic AI to Identify Humpback Whale Songs

Submitted by Baldeep Singh Gill, Associate Editor, The Indian Learning, and Manohar Samal, Research Intern, on the application of artificial intelligence to identifying humpback whale songs.


 

Understanding the oceans is a major frontier for Artificial Intelligence (AI). In recent years, a wave of technological advances has been directed at scrutinising large water bodies, and machine learning, neural networks and related algorithms are proving constructive in analysing the vast depths and surfaces of the oceans.


Under Google’s Artificial Intelligence for Social Good Program, the company developed algorithms to identify humpback whale calls (also referred to as humpback whale songs) recorded at twelve locations in the Pacific island region. These algorithms were developed in partnership with the Pacific Islands Fisheries Science Center of the United States National Oceanic and Atmospheric Administration (NOAA).


The AI application under this program was applied to a collection of underwater humpback whale recordings spanning roughly 15 years and amounting to about 9.2 terabytes (TB) of audio, decimated from a 200 kHz to a 10 kHz sampling rate. The analysis has provided new and significant information about the presence, daily calling patterns, population structure and seasonality of humpback whales. The information has proved especially resourceful because it covers uninhabited and remote islands for which scientists previously had no comparable data. The main contribution of this AI-assisted dataset is that it can serve as a vital tool in identifying effective mitigation of anthropogenic impacts on humpback whales.
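To make the decimation step concrete, the following is a minimal sketch, not the program’s actual pipeline, of downsampling a hydrophone recording from 200 kHz to 10 kHz with SciPy. The file name, channel layout and two-stage factor split are illustrative assumptions.

```python
# Illustrative sketch of decimating hydrophone audio from 200 kHz to 10 kHz.
import numpy as np
import soundfile as sf                 # assumed I/O library; any WAV reader works
from scipy.signal import decimate

ORIG_SR = 200_000                      # original hydrophone sampling rate (Hz)
TARGET_SR = 10_000                     # decimated rate used for analysis (Hz)

audio, sr = sf.read("hydrophone_recording.wav")   # hypothetical file name
assert sr == ORIG_SR, f"expected {ORIG_SR} Hz input, got {sr} Hz"
if audio.ndim > 1:
    audio = audio.mean(axis=1)         # collapse to mono for a single hydrophone channel

# Total factor is 20; SciPy recommends keeping each decimation step small,
# so we decimate in two stages (5 x 4 = 20) with the default anti-alias filter.
audio_10k = decimate(decimate(audio, 5, zero_phase=True), 4, zero_phase=True)

print(f"decimated {len(audio)} samples at {ORIG_SR} Hz "
      f"to {len(audio_10k)} samples at {TARGET_SR} Hz")
```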


The most prominent way in which the data was collected is passive acoustic monitoring, the practice of listening for marine mammals with underwater microphones known as hydrophones. The recorded signals are then used offline for detection, classification and localisation tasks. The principal advantage of this technique is that it can pick up animals that are submerged out of sight and allows much longer continuous monitoring and detection periods.


In addition to the passive acoustic monitoring method, the company, in partnership with NOAA, used ResNet-50, a deep residual convolutional neural network architecture that has been extremely successful in image recognition and non-speech audio classification. Such work always faces certain challenges: narrow-band noise, often produced by nearby boats and equipment, is recorded alongside the whale calls. This is addressed with per-channel energy normalization (PCEN), which suppresses stationary narrow-band noise in the spectrogram. The approach therefore combines supervised learning, such as optimising image-style models for humpback detection, with unsupervised learning of semantic audio representations for finding similar song units.
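The sketch below illustrates the general idea of a PCEN mel-spectrogram front end feeding a ResNet-50 backbone for binary humpback-call detection. It is not Google’s or NOAA’s actual model; the sampling rate, window sizes, mel bands, clip length and layer choices are assumptions made for illustration.

```python
# Minimal sketch: PCEN spectrogram + ResNet-50 classifier for humpback detection.
import numpy as np
import librosa
import tensorflow as tf

SR = 10_000          # decimated sampling rate (Hz)
N_MELS = 64          # number of mel bands (assumed)
HOP = 128            # STFT hop length in samples (assumed)

def pcen_spectrogram(clip: np.ndarray) -> np.ndarray:
    """Mel spectrogram followed by per-channel energy normalization (PCEN),
    which suppresses stationary narrow-band noise such as nearby vessels."""
    mel = librosa.feature.melspectrogram(
        y=clip, sr=SR, n_fft=1024, hop_length=HOP, n_mels=N_MELS)
    return librosa.pcen(mel * (2 ** 31), sr=SR, hop_length=HOP)

def build_detector(time_frames: int) -> tf.keras.Model:
    """ResNet-50 backbone with a binary head: humpback call vs. background."""
    inputs = tf.keras.Input(shape=(N_MELS, time_frames, 1))
    x = tf.keras.layers.Conv2D(3, 1)(inputs)          # map 1 channel to 3 for ResNet
    backbone = tf.keras.applications.ResNet50(
        include_top=False, weights=None, input_shape=(N_MELS, time_frames, 3))
    x = backbone(x)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Example: score a 3.9-second clip (illustrative duration, random stand-in audio).
clip = np.random.randn(int(3.9 * SR)).astype(np.float32)
spec = pcen_spectrogram(clip)
model = build_detector(time_frames=spec.shape[1])
score = model.predict(spec[np.newaxis, ..., np.newaxis], verbose=0)
print(f"humpback-call probability (untrained model): {float(score[0, 0]):.3f}")
```

In a real pipeline the model would be trained on labelled spectrogram clips before scoring; the untrained prediction above only demonstrates the data flow.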


Climate change has significantly altered the behaviour of marine animals, making it harder for researchers and scientists to collect and study data. Under such circumstances, AI tools of this kind prove extremely resourceful and pivotal. Google also helped to start Global Fishing Watch, an organisation that monitors fishing activity around the world using big data.


Google’s next goal is to use advances in bioacoustic AI to distinguish between three subpopulations of killer whales, which will enable effective monitoring of their health and support their protection. We believe that Artificial Intelligence will provide comprehensive solutions to ubiquitous environmental challenges and help build a clean and sustainable natural environment.


To understand this development, please refer to:




For queries, mail us at editorial@isail.in.



