BirdNET-Analyzer Documentation

Welcome to the BirdNET-Analyzer documentation! This guide provides detailed information on installing, configuring, and using BirdNET-Analyzer.

Introduction

BirdNET-Analyzer is an open-source tool for analyzing bird calls using machine learning models. It can process large collections of audio recordings and identify (bird) species based on their calls.

Get started by listening to this AI-generated introduction to BirdNET-Analyzer:

[Embedded audio introduction (source: Google NotebookLM)]

Citing BirdNET-Analyzer

Feel free to use BirdNET for your acoustic analyses and research. If you do, please cite as:

@article{kahl2021birdnet,
  title={BirdNET: A deep learning solution for avian diversity monitoring},
  author={Kahl, Stefan and Wood, Connor M and Eibl, Maximilian and Klinck, Holger},
  journal={Ecological Informatics},
  volume={61},
  pages={101236},
  year={2021},
  publisher={Elsevier}
}

About

Developed by the K. Lisa Yang Center for Conservation Bioacoustics at the Cornell Lab of Ornithology in collaboration with Chemnitz University of Technology.

Go to https://birdnet.cornell.edu to learn more about the project.

Want to use BirdNET to analyze a large dataset? Don’t hesitate to contact us: ccb-birdnet@cornell.edu

We also have a discussion forum on Reddit if you have a general question or just want to chat.

Have a question, remark, or feature request? Please start a new issue thread to let us know. Feel free to submit a pull request.

More tools and resources

We also provide Python and R packages to interact with BirdNET models, as well as training and deployment tools for microcontrollers. Make sure to check out our other repositories at https://github.com/birdnet-team.
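
For illustration, here is a minimal sketch of calling a BirdNET model from Python with the birdnet package. The import path, class name (ModelV2M4), and method name (predict_species_within_audio_file) are assumptions based on an earlier package release and may differ in current versions, so please check the package documentation for the exact API.

from pathlib import Path

from birdnet.models import ModelV2M4  # assumed import; module layout may differ by package version

# Load the BirdNET v2.4 model (model files are downloaded on first use).
model = ModelV2M4()

# Run species prediction over an example soundscape recording.
predictions = model.predict_species_within_audio_file(Path("soundscape.wav"))

# Inspect the most confident species in the first 3-second segment.
species, confidence = next(iter(predictions[(0.0, 3.0)].items()))
print(f"{species}: {confidence:.3f}")

In this assumed API, the returned predictions map each analyzed time interval to species/confidence pairs.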

Projects map

We have created an interactive map of projects that use BirdNET. If you are working on a project that uses BirdNET, please let us know and we can add your project to the map.

You can access the map here: Open projects map

Please refer to the projects map documentation for more information on how to contribute.

License

Source Code: The source code for this project is licensed under the MIT License.

Models: The models used in this project are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0).

Please ensure you review and adhere to the specific license terms provided with each model.

Please note that all educational and research purposes are considered non-commercial use; using the BirdNET models for such purposes is therefore freely permitted.

Funding

This project is supported by Jake Holshuh (Cornell class of ’69) and The Arthur Vining Davis Foundations. Our work in the K. Lisa Yang Center for Conservation Bioacoustics is made possible by the generosity of K. Lisa Yang to advance innovative conservation technologies to inspire and inform the conservation of wildlife and habitats.

The development of BirdNET is supported by the German Federal Ministry of Education and Research through the project “BirdNET+” (FKZ 01IS22072). The German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety contributes through the “DeepBirdDetect” project (FKZ 67KI31040E). In addition, the Deutsche Bundesstiftung Umwelt supports BirdNET through the project “RangerSound” (project 39263/01).