Butterfly Monitoring App – Citizen Science for Biodiversity
“Where nature walks meet real science”
Introduction to the Problem
Across the globe, biodiversity is under severe threat. Insects in particular are experiencing dramatic declines – in Germany, long-term studies have shown a loss of more than 75% of insect biomass in just a few decades. Butterflies are among the most visible and beloved insects, but they are also vital: they act as pollinators, part of the food chain, and as indicators of environmental health. When butterfly populations shrink, it signals much wider problems in our ecosystems.
The Helmholtz Centre for Environmental Research, one of Germany’s most important research centers, set us the challenge of exploring this issue. They explained that while professional monitoring networks exist, they are limited:
- Monitoring butterflies requires time, expertise, and consistency, which means only trained specialists can contribute.
- These specialists can only cover certain areas, leaving huge gaps in monitoring data.
- Meanwhile, everyday people – hikers, students, retirees – see butterflies every day but lack a way to turn these sightings into scientifically useful data.
This disconnect between what scientists need and what citizens experience is a key obstacle. Without new approaches, the biodiversity crisis remains invisible to most people and unmeasured in many regions. To fight species decline, we must find ways to close the data gap – empowering citizens to become part of the solution.
The Case
This project was initiated by the Helmholtz Centre for Environmental Research, one of Germany’s leading research centers for environmental studies. Their challenge to us was:
- Traditional butterfly monitoring is time-intensive, expert-driven, and limited in scope.
- Current data collection cannot cover enough geographic areas to detect changes early.
- At the same time, many citizens encounter butterflies daily, but their observations are not systematically recorded or scientifically usable.
Their current butterfly monitoring app “TagfalterMonitoringDeutschland” already allows citizens to record sightings, but during our analysis we noticed several flaws:
- The species catalog is not beginner-friendly: it uses only scientific names.
- The species catalog is incomplete: some butterflies lack images, have confusing or inconsistent names, or are missing a description.
- The app offers little guidance and feedback, which lowers motivation for continued use.


We were tasked with creating a prototype concept that addresses these issues and explores the use of AI image recognition to support reliable data collection.
Our Target Group
Our idea addresses this challenge by focusing on Citizen Scientists: ordinary people who can contribute to real scientific projects through their observations and actions. We identified several groups who would benefit from – and actively use – our solution:
- Students and Teachers: For many young people, learning about biodiversity is abstract and textbook-based. With our app, students can go outside, observe butterflies in their environment, and immediately see how their data contributes to real science. Teachers gain an engaging tool for biology lessons, connecting curriculum content to real-world problems.
- Amateur Naturalists and Hikers: Many adults already spend their free time outdoors, photographing insects and enjoying nature. They are curious and motivated but often lack the scientific expertise to contribute their sightings reliably. The app gives them the tools to make their hobby scientifically valuable.
- Retirees and Families: Older adults often look for meaningful ways to spend their free time. Families with children seek activities that are both educational and fun. Observing butterflies with the app creates an intergenerational experience: children learn while grandparents contribute to science.
- Researchers and Policymakers: Although not the direct “users” of the app in the field, these groups are crucial beneficiaries. They rely on high-quality data to identify trends, develop environmental policies, and design conservation programs.
Personas – our imagined users
Student
Anna, the High-School Student
Age: 16
Background: Attends a secondary school in Hamburg. Interested in biology and participates in an after-school eco-club.
Goals: Wants to learn how to identify butterfly species while contributing to real science. Needs a simple, engaging, and educational tool.
Frustrations: Limited technical knowledge, easily discouraged by complicated apps. Sometimes struggles with poor mobile data outdoors.
Needs: Clear guidance, gamification elements (badges, points), simple interface.
Teacher
Eva, the Biology Teacher
Age: 51
Background: Biology teacher at a secondary school near Munich. Integrates nature observation projects into class.
Goals: Wants to inspire students to care about biodiversity while providing them with hands-on Citizen Science projects. Needs a trustworthy app that fits into teaching.
Frustrations: Students get distracted by technical issues; fears incorrect data might undermine learning outcomes. Needs to manage multiple student submissions easily.
Needs: Robust app that works across devices, simple onboarding process, educational resources within the app, clear validation of student data.
Amateur Naturalist
Markus, the Amateur Naturalist
Age: 42
Background: Works as an IT consultant, spends weekends hiking and photographing insects. Has intermediate knowledge of butterflies.
Goals: Wants to contribute accurate data to support biodiversity monitoring. Interested in seeing trends and maps of populations.
Frustrations: Concerned about misidentifying similar species. Finds manual data entry tedious. Worries about battery drain when outdoors all day.
Needs: Reliable AI identification, offline functionality, efficient photo-to-data workflow, and access to aggregated results (maps, trends).
Retired Nature Enthusiast
Hans, the Retired Nature Enthusiast
Age: 68
Background: Recently retired mechanical engineer living in a small town in Bavaria. Has a lot of free time and enjoys daily walks in nature.
Goals: Wants to stay active, enjoy outdoor activities, and contribute to biodiversity monitoring as a meaningful retirement hobby.
Frustrations: Not very tech-savvy, sometimes intimidated by new apps. Concerned about small text and complicated workflows. May have limited patience with technical errors.
Needs: Large, easy-to-read interface, simple step-by-step guidance, offline mode for rural areas, motivation through seeing his personal impact (statistics, trend maps).
Together, these groups form a broad community of Citizen Scientists. They differ in age, background, and expertise, but they share one thing: a desire to connect with nature and to contribute to something bigger than themselves.
Still, the focus has to be on the younger generations, since they tend to be less patient when learning new apps and are therefore easier to lose as users. That’s why gamification is a key feature of our design: progress badges, personal statistics, and small challenges make the experience rewarding and keep them engaged over time.
Our Solution
Our solution is a mobile application that transforms everyday butterfly sightings into reliable scientific data and addresses the shortcomings of the current Helmholtz app by:
- Providing ease of use for non-experts, with a clean design and clear guidance.
- Offering accurate identification support with AI-assisted recognition.
- Ensuring data reliability through automatic GPS/timestamp metadata.
- Adding motivation and learning tools (badges, guides).
- Attracting younger users with gamification.
What makes our solution special is the two-model AI approach:
- Model 1 filters whether an image is usable and decides which images to process further.
- Model 2 performs butterfly recognition on this filtered set.
This approach increases accuracy and ensures that scientists receive high-quality data instead of large amounts of unreliable input.
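The two-model approach can be sketched as a simple two-stage pipeline. This is a hypothetical illustration: the `Photo` fields, thresholds, and stand-in functions are our assumptions, and both models are stubs where trained image classifiers would sit in the real app.

```python
from dataclasses import dataclass

@dataclass
class Photo:
    sharpness: float       # 0.0 (blurry) .. 1.0 (sharp) – assumed quality score
    butterfly_area: float  # fraction of the frame covered by the butterfly

def model_1_is_usable(photo: Photo) -> bool:
    """Stage 1: discard blurry or too-small images before recognition."""
    return photo.sharpness >= 0.5 and photo.butterfly_area >= 0.05

def model_2_recognize(photo: Photo) -> str:
    """Stage 2: species recognition (stubbed; a trained classifier in reality)."""
    return "Aglais io"  # placeholder result

def pipeline(photos: list[Photo]) -> list[str]:
    # Only images that pass the usability filter reach the recognition model.
    usable = [p for p in photos if model_1_is_usable(p)]
    return [model_2_recognize(p) for p in usable]
```

The design point is that stage 2 never sees unusable input, so the recognition model’s error rate is measured only on images it has a fair chance of classifying.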
Overall, the core functions are the following:
- AI-powered recognition: Using the phone’s camera, the app identifies butterflies in real time.
- Light and understandable design: The app’s design is light, with as few buttons as possible, and uses universal symbols to ensure easy, beginner-friendly use.
- Automatic metadata logging: Every observation is automatically linked with GPS, time, and weather data, which ensures its scientific reliability.
- Gamification & motivation: To keep users engaged, the app includes badges, personal statistics, and maps showing their impact.
- Educational content: Built-in species guides and tips help beginners learn and experts refine their knowledge.
By combining these features, the app bridges the gap between citizens and researchers. It empowers ordinary people to collect high-quality data, while giving them feedback and learning opportunities in return.
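The automatic metadata logging described above could look like the following sketch. The field names are illustrative assumptions, not a defined Helmholtz schema, and in the real app the coordinates and weather would come from the phone’s GPS and a weather service rather than function arguments.

```python
import json
from datetime import datetime, timezone

def build_observation(species: str, lat: float, lon: float,
                      weather: str) -> dict:
    """Attach GPS, time, and weather metadata to a single sighting."""
    return {
        "species": species,
        "lat": lat,
        "lon": lon,
        "weather": weather,
        # UTC timestamp recorded automatically at submission time
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

record = build_observation("Aglais io", 53.55, 9.99, "sunny")
print(json.dumps(record, indent=2))
```

Because the metadata is attached automatically, the user never has to type in location or time, which removes the most error-prone part of manual data entry.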
Revenue for the app itself could come from research grants and government funding, which the Helmholtz Centre already receives.

The Prototype
Our prototype is not a finished product; it is a conceptual demonstration – a visual concept of how an AI-powered Citizen Science app could function in practice. It represents our ideas and assumptions about how users will interact with the technology, which features are most valuable, and how these features might integrate into a broader scientific workflow.
At this stage, the prototype includes:
- Camera mode with live counting (simulated).
- Interval-based snapshots (visual placeholders).
- Metadata collection (conceptual).
- User interface design (clean, minimal, intuitive).
Not yet included:
- No AI recognition implemented (simulated for now).
- No complex dashboards or global networking features.
The prototype is designed to:
- Explore functionality: We wanted to see how features like real-time butterfly detection, interval snapshots, and tap-to-focus might fit together.
- Visualize the experience: By simulating the camera view with outlines, we can demonstrate how users would see and track butterflies in the field.
- Test assumptions: For example, is it realistic to expect that AI can recognize butterflies in real time on a mobile phone? How much processing needs to happen locally vs. via cloud services?
- Evaluate usability: The prototype gives us a way to think about interface design, accessibility for older users, and learning tools for beginners.
- Create a feedback loop: Prototypes spark discussion with scientists, teachers, and nature enthusiasts about what works, what doesn’t, and what should be prioritized in further development.
Ideas embodied in the prototype:
- Camera-based interaction → The camera is the primary interface, making the experience intuitive and visual.
- Automation vs. user control → The prototype balances automatic detection (for beginners) with manual options (for advanced users).
- Integration with scientific workflows → Metadata logging shows how Citizen Science observations could feed into research databases.
- Learning & motivation → Gamification and educational elements illustrate how to keep people engaged long-term, not just during initial curiosity.
This prototype is not meant as a finished product, but as a thinking and communication tool to test ideas, evaluate usability, and highlight what makes our approach unique compared to existing solutions.

AI Transparency & Ethics
While our prototype is primarily a visual demonstrator, the long-term vision involves integrating AI models to recognize butterfly species in real time. Because artificial intelligence is not only a technical tool but also a social one, we reflected on how to ensure that its use is fair, transparent, and responsible.
Data privacy and protection:
We designed the concept so that the app only uploads the minimum amount of data necessary, like small photo subsets of butterflies together with metadata such as GPS, time, and weather. No personal data (like user identities or exact movement patterns) is collected. By reducing the scope of data transfer, we protect user privacy while still providing scientists with reliable information.
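This data-minimization principle can be sketched as a simple whitelist applied before any upload. The allowed field names are hypothetical; the point is that identifiers such as a device ID or user name never leave the phone.

```python
# Illustrative sketch: only the fields scientists need are kept;
# device or user identifiers are stripped before upload.
ALLOWED_FIELDS = {"species", "lat", "lon", "timestamp", "weather", "photo_crop"}

def minimize_payload(raw: dict) -> dict:
    """Keep only whitelisted fields; everything else stays on the device."""
    return {k: v for k, v in raw.items() if k in ALLOWED_FIELDS}

raw = {
    "species": "Aglais io",
    "lat": 53.55,
    "lon": 9.99,
    "timestamp": "2024-06-01T10:00:00+00:00",
    "device_id": "abc123",   # never uploaded
    "user_name": "anna",     # never uploaded
}
print(minimize_payload(raw))
```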
Transparency and trust:
AI often feels like a “black box.” Our concept reduces this problem by keeping the workflow clear: one model filters for usable images, while the second performs recognition. By openly explaining this two-step process, we make the system easier for users and scientists to understand.
Fairness and bias reduction:
AI models are only as good as the data they are trained on. If training images only come from certain regions, seasons, or species, the app might underperform for less common or harder-to-photograph butterflies. To address this, we emphasize the importance of building a diverse and representative training dataset, ensuring that the AI works not only for common species but also for rare or locally important ones.
Accuracy through innovation:
Our concept uses a two-model approach, where the first model filters which images are usable and the second performs the recognition. This not only improves accuracy but also reduces the chance of misleading outputs, since unusable images (blurry, partial, too small) are discarded early.
Responsible engagement:
Finally, we want the app to empower people, not replace them. The AI is a supporting tool, not a substitute for human curiosity or expertise. Users learn from the app, receive feedback, and stay motivated, while scientists gain reliable data. In this way, the project combines the strengths of both worlds: the efficiency of AI and the passion of people.
We as a team
- Moritz B.
- Paul S.
- Jannik E.
We are a team of three colleagues from the same company and country, all currently completing our apprenticeship as IT specialists for application development (Fachinformatiker für Anwendungsentwicklung).
Although we share the same professional background, we combined our skills and creativity to work through this complex case study. Each of us took responsibility for different aspects of the project, from researching the biodiversity problem, to structuring the app concept, to thinking through design and technical feasibility.
To organize our work, we used a Kanban board with the stages Backlog → Work in Progress (Solo) → Work in Progress (All) → Done. This helped us divide tasks, assign responsibilities, and keep track of our progress during the workshop.

What unites us as a team is our motivation to build something that is both technically feasible and socially impactful. Even as apprentices, we wanted to show how modern development methods, combined with curiosity and teamwork, can lead to ideas that matter.
Reflection:
Through this project we learned how to divide roles effectively, adapt based on peer feedback, and refine our prototype step by step. We realized how important it is to think and design first before turning an idea into a real technical solution, and to consider not only technology but also usability, motivation, and scientific reliability.
Our shared vision is simple:
Use our IT skills to make biodiversity research more accessible and impactful.
Our Market Stand

Appendix

Disney Method Table: Disney-Method-Table