ACI 2026 Website Now Live

Prof. Dr. Oliver Bendel will host the next ACI Conference, bringing the event to continental Europe for the first time as it convenes on the FHNW campus in Brugg-Windisch, Switzerland, from December 2–5, 2026. The conference website is already online, and the most important information is available at www.aciconf.org; individual deadlines may still change. Building on a tradition that has taken the community from Glasgow to North Carolina, Newcastle, Bloomington, Milton Keynes, Haifa, and Atlanta, this edition continues the conference’s role as a leading venue for advancing Animal-Computer Interaction. As the field grows, researchers and practitioners explore how technology shapes animals’ lives, welfare, cognition, and social dynamics while developing animal-centered systems and methods that embrace multispecies perspectives. The conference maintains its commitment to interdisciplinary collaboration across biology, technology, and cultural studies, supporting work that seeks to design ethically grounded, welfare-enhancing, and inclusive technological futures for all animals, humans included. ACI 2026 will also feature a Special Issue on Animal-Machine Interaction, a research field shaped in important ways by Oliver Bendel. Proceedings will be published in the ACM Digital Library.

Fig.: The ACI website

ACI Conference Comes to Continental Europe for the First Time

Prof. Dr. Oliver Bendel will host the next ACI Conference, marking the first time the event comes to continental Europe as it convenes on the FHNW campus in Brugg-Windisch, Switzerland, from December 2–5, 2026. Building on a tradition that has taken the community from Glasgow to North Carolina, Newcastle, Bloomington, Milton Keynes, Haifa, and Atlanta, this edition continues the conference's role as the premier venue for advancing Animal-Computer Interaction. As the field grows, researchers and practitioners explore how technology shapes animals' lives, wellbeing, cognition, and social dynamics while developing animal-centered systems and methods that embrace multispecies perspectives. The conference maintains its commitment to interdisciplinary collaboration across biology, technology, and cultural studies, supporting work that seeks to design ethically grounded, welfare-enhancing, and inclusive technological futures for all animals, humans included. Proceedings will be published in the ACM Digital Library, and the official conference website will go live in January 2026. Information on previous ACI conferences is available at www.aciconf.org.

Fig.: Oliver Bendel at the FHNW campus in Brugg-Windisch (Photo: Jork Weismann)

A Book on Animal-Machine Interaction

On November 18, 2025, Prof. Dr. Oliver Bendel submitted his new manuscript to Springer Gabler. The slim volume, titled "Tier-Maschine-Interaktion" ("Animal-Machine Interaction"), will be published in spring 2026. The book opens with an outline of its contents: "This essential provides a compact introduction to the discipline, or the field of research and application, of animal-machine interaction (AMI). It shows how animals and machines meet and coexist in different contexts, which opportunities and risks arise from this, and which perspectives open up for science, business, and politics. The aim is to sensitize readers to the potential and challenges of animal-machine interaction, to provide orientation in the interdisciplinary discourse, and to offer suggestions for research, development, and decision-making processes." It is the first book on this topic. It contains two figures, three tables, and several boxes with definitions and background information.

Fig.: Oliver Bendel at the Brugg-Windisch campus (Photo: Jork Weismann)

An Introduction to Animal-Machine Interaction

"Just.Us + Animal Welfare" is a lecture series organized by Department 10 Veterinary Medicine to promote animal welfare at Justus Liebig University Giessen. On November 12, 2025, Prof. Dr. Oliver Bendel gave a lecture titled "Bao meets Pluto: Grundlagen und Beispiele der Tier-Maschine-Interaktion" ("Bao meets Pluto: Fundamentals and examples of animal-machine interaction"). Animal-machine interaction deals with the encounter and coexistence of animals and machines – from classic devices to vehicles, aircraft, and agricultural machinery to networked, autonomous robots and AI systems. The focus is on perception through sensors and senses, interaction and communication between animals and machines, and the question of how these encounters can be designed technically, organizationally, and ethically in such a way that risks for animals are reduced and potential for them and for humans is tapped. In his lecture, Oliver Bendel laid out the fundamentals of animal-machine interaction and described prototypes and projects. He also outlined what is possible and to be expected in this field of research in the coming years, for example in connection with robotic quadrupeds and bipeds. The online lecture drew 660 listeners. Further information is available at www.uni-giessen.de/de/fbz/zentren/icar3r/akademie/justus.

Fig.: Oliver Bendel with his Bao (Photo: Selina Rohr)

There’s a Large Hippo Resting in the Mud

On November 10, 2025, the article "There's a Large Hippo Resting in the Mud" by Oliver Bendel and Doris Jovic was published, introducing the VISUAL project. "VISUAL" stands for "Virtual Inclusive Safaris for Unique Adventures and Learning". All over the world, there are webcams showing wild animals. Sighted people can use them to go on photo and video safaris comfortably from their sofas. Blind and visually impaired people are at a disadvantage here. As part of Inclusive AI, the project developed a prototype specifically for them. Public webcams around the world that are directed at wild animals are tapped. Users can choose between several habitats on land or in water. They can also select "Adult" or "Child" as a profile and choose a role ("Safari Adventurer", "Field Scientist", "Calm Observer"). When the live video is accessed, three screenshots are taken and combined into a bundle. This bundle is analyzed and evaluated by GPT-4o, a multimodal large language model (MLLM). The user then hears a spoken description of the scene and the activities. The project is likely one of the first to combine Inclusive AI with new approaches in Animal-Computer Interaction (ACI). The article was published in Wiley Industry News and can be accessed at wileyindustrynews.com/en/contributions/theres-a-large-hippo-resting-in-the-mud. It is also available in German.
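The pipeline just described – pick three frames from the live stream, bundle them, and have a multimodal model describe the bundle – can be sketched roughly as follows. All function names, the sampling interval, and the stubbed model call are illustrative assumptions, not the project's actual code:

```python
import base64
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float   # seconds into the stream
    jpeg_bytes: bytes  # encoded screenshot

def take_screenshots(stream_frames, count=3, interval_s=2.0):
    """Pick `count` frames, spaced at least `interval_s` apart, from a frame iterator."""
    picked, next_t = [], 0.0
    for frame in stream_frames:
        if frame.timestamp >= next_t:
            picked.append(frame)
            next_t = frame.timestamp + interval_s
        if len(picked) == count:
            break
    return picked

def bundle(frames):
    """Base64-encode the screenshots into one payload for a multimodal model."""
    return [base64.b64encode(f.jpeg_bytes).decode("ascii") for f in frames]

def describe(bundle_payload, profile="Adult", role="Calm Observer"):
    # Placeholder for the GPT-4o call: a real implementation would send the
    # images plus a profile/role-specific prompt and return the description
    # text, which is then read aloud by a text-to-speech engine.
    return f"[{role}/{profile}] description of {len(bundle_payload)} frames"

# Simulated webcam stream: one frame every 0.5 seconds.
stream = (Frame(t * 0.5, b"\xff\xd8fake") for t in range(20))
shots = take_screenshots(stream)
print(len(shots), describe(bundle(shots)))
```

A real implementation would replace the `describe` stub with a call to the multimodal model and pass the returned text to a text-to-speech engine.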

Fig.: There’s a large hippo resting in the mud

Kick-off Meeting of the DEEP VOICE Project

The DEEP VOICE project started on September 3, 2025. It was initiated by Prof. Dr. Oliver Bendel. Machine learning expert Florian Karrer was recruited as project collaborator; he will write his final thesis at the FHNW School of Business in this context. "DEEP VOICE" stands for "Decoding Environmental and Ethological Patterns in Vocal Communication of Cetaceans". The project aims to decode the symbolic language of whales. To this end, the animal's perspective is to be adopted radically: a model of the animal, its behavior, and its environment is to be created, which will then serve as the basis for the language processing component. The project combines biological and ethological foundations with machine learning approaches and thus aims to contribute to a better understanding of non-human intelligence and communication as well as – via animal-computer interaction (ACI) – to human-animal interaction. Oliver Bendel and his students have so far focused primarily on the body language of domestic and farm animals (The Animal Whisperer Project) as well as the behavior of pets (The Robodog Project) and wild animals (VISUAL). In addition, a concept for facial recognition in bears was developed, with the aim of avoiding invasive tracking.
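Before any machine learning can look for patterns in whale vocalizations, recordings are typically cut into overlapping frames and transformed into spectrogram-like frequency features. The following pure-Python sketch (with assumed frame sizes and a synthetic signal, not project code) illustrates this first step:

```python
import cmath
import math

def frames(signal, frame_len=64, hop=32):
    """Cut a sample list into overlapping frames of length `frame_len`."""
    return [signal[i:i + frame_len]
            for i in range(0, len(signal) - frame_len + 1, hop)]

def dft_magnitudes(frame):
    """Naive DFT magnitude spectrum (first half: the non-redundant bins)."""
    n = len(frame)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(frame)))
            for k in range(n // 2)]

# Synthetic stand-in for a vocalization: a 4 kHz tone sampled at 32 kHz.
sr, f = 32000, 4000.0
signal = [math.sin(2 * math.pi * f * t / sr) for t in range(256)]

spec = [dft_magnitudes(fr) for fr in frames(signal)]           # spectrogram frames
peak_bin = max(range(len(spec[0])), key=lambda k: spec[0][k])  # loudest bin
print(len(spec), peak_bin)  # bin 8 corresponds to 4000 Hz (f * frame_len / sr)
```

In practice, audio libraries compute such spectrograms far more efficiently; the point here is only the framing-plus-frequency-analysis structure that precedes the modeling work.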

Fig.: Two whales in the sea

AI for Non-Human Animal Communication

Recent advancements in artificial intelligence (AI) and bioacoustics have opened a unique opportunity to explore and decode animal communication. With the growing availability of bioacoustic data and sophisticated machine learning models, researchers are now in a position to make significant strides in understanding non-human animal languages. However, realizing this potential requires a deliberate integration of AI and ethology. The AI for Non-Human Animal Communication workshop at NeurIPS 2025 will focus on the challenges of processing complex bioacoustic data and interpreting animal signals. The workshop will feature keynote talks, a poster session, and a panel discussion, all aimed at advancing the use of AI to uncover the mysteries of animal communication and its implications for biodiversity and ecological conservation. The workshop is inviting submissions for short papers and proposals related to the use of AI in animal communication. Topics of interest include bioacoustics, multimodal learning, ecological monitoring, species-specific studies, and the ethical considerations of applying AI in animal research. Papers should present novel research, methodologies, or technologies in these areas, and will undergo a double-blind review process. The paper submission deadline is September 5, 2025, with notifications of acceptance by September 22, 2025. More information is available at aiforanimalcomms.org.

Fig.: Nonhuman primates in conversation

Video on the VISUAL Project

Since August 29, 2025, a video showing the VISUAL system in operation has been available. "VISUAL" stands for "Virtual Inclusive Safaris for Unique Adventures and Learning". All over the world, there are webcams showing wild animals. Sighted people can use them to go on photo and video safaris comfortably from their sofas. Blind and visually impaired people are at a disadvantage here. As part of Inclusive AI, the project developed a prototype specifically for them. Public webcams around the world that are directed at wild animals are tapped. Users can choose between several habitats on land or in water. They can also select "Adult" or "Child" as a profile and choose a role ("Safari Adventurer", "Field Scientist", "Calm Observer"). When the live video is accessed, three screenshots are taken and combined into a bundle. This bundle is analyzed and evaluated by GPT-4o, an MLLM. The user then hears a spoken description of the scene and the activities. The project is likely one of the first to combine Inclusive AI with new approaches in Animal-Computer Interaction (ACI). The video can be accessed at www.informationsethik.net/videos/.

Fig.: The VISUAL system

When Dogs Meet a Robot Dog

The project "The Robodog Project: Bao Meets Pluto" investigated how domestic dogs react to the quadruped robot Unitree Go2 – called Bao by project initiator Prof. Dr. Oliver Bendel – and how their owners perceive such robots in shared public spaces. The project began at the end of March 2025 and was completed in early August 2025. The study addressed three questions: 1. How do dogs react behaviorally to a quadruped robot in six defined runs, namely stationary, walking, and jumping in its original state as well as stationary, walking, and jumping with an additional 3D-printed dog head? 2. What expectations and concerns do the owners have? 3. Which regulatory frameworks could support safe integration? Twelve dogs were observed in six structured interaction phases; their behavior was video-coded using BORIS. Preliminary interviews with eight owners and an expert interview with the biologist and dog trainer Dr. Sabrina Karl provided additional insights. The study, led by Selina Rohr, found that most dogs were cautious but not aggressive. Interest increased as soon as the robot moved, while visual modifications had little effect. A 3D-printed dog head, however, seemed to fascinate the dogs in stationary mode. It was produced and provided by Norman Eskera. The animals frequently sought orientation from their owners, underscoring the importance of human mediation. The owners were cautiously open-minded but emphasized concerns about safety, unpredictability, and liability. The results argue for regulation modeled on drone rules for the use of such robots in public spaces.

Fig.: The dog head from the 3D printer

An Investigation into the Encounter Between Social Robots and Animals

The volume "Animals, Ethics, and Engineering: Intersections and Implications", edited by Rosalyn W. Berne, was published on 7 August 2025. The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. The title of Oliver Bendel's contribution is "An Investigation into the Encounter Between Social Robots and Animals". The abstract reads: "Increasingly, social robots and certain service robots encounter, whether this is planned or not, domestic, farm, or wild animals. They react differently, some interested, some disinterested, some lethargic, some panicked. Research needs to turn more to animal-robot relationships, and to work with engineers to design these relationships in ways that promote animal welfare and reduce animal suffering. This chapter is about social robots that are designed for animals, but also those that – for different, rather unpredictable reasons – meet, interact, and communicate with animals. It also considers animal-friendly machines that have emerged in the context of machine ethics. In the discussion section, the author explores the question of which of the presented robots are to be understood as social robots and what their differences are in their purpose and in their relationship to animals. In addition, social and ethical aspects are addressed." The book was published by Jenny Stanford Publishing and can be ordered via online stores.

Fig.: A monkey with a mirror

Completion of the VISUAL Project

The final presentation of the VISUAL project took place on July 31, 2025. It was initiated by Prof. Dr. Oliver Bendel of the FHNW School of Business and carried out by Doris Jovic, who is completing her Bachelor's degree in Business Information Technology (BIT) in Basel. "VISUAL" stands for "Virtual Inclusive Safaris for Unique Adventures and Learning". All over the world, there are webcams showing wild animals. Sighted people can use them to go on photo and video safaris comfortably from their sofas. Blind and visually impaired people are at a disadvantage here. As part of Inclusive AI, the project developed a prototype specifically for them. Public webcams around the world that are directed at wild animals are tapped. Users can choose between several habitats on land or in water. They can also select "Adult" or "Child" as a profile and choose a role ("Safari Adventurer", "Field Scientist", "Calm Observer"). When the live video is accessed, three screenshots are taken and combined into a bundle. This bundle is analyzed and evaluated by GPT-4o, a multimodal large language model. The user then hears a spoken description of the scene and the activities. The needs of blind and visually impaired people were gathered through an accessible online survey, supported by FHNW staff member Artan Llugaxhija. The project is likely one of the first to combine Inclusive AI with new approaches in Animal-Computer Interaction (ACI).

Fig.: Doris Jovic at the final presentation

"The Robodog Project" Comes to an End

Animal-machine interaction (AMI) is a discipline or field of work that deals with the interaction between animals and machines. This is how Prof. Dr. Oliver Bendel explains it in the Gabler Wirtschaftslexikon. It is primarily concerned with the design, evaluation, and implementation of complex machines and computer systems with which animals interact and which in turn interact and communicate with animals. There are close links to animal-computer interaction (ACI). Increasingly, the machine is a robot that is either remote-controlled or (partially) autonomous. In "The Robodog Project" – also known as "Bao Meets Pluto" – the encounters between robotic quadrupeds and small to medium-sized dogs are explored. The project collaborator is Selina Rohr, who is writing her bachelor's thesis in this context. The walking, running, and jumping Unitree Go2 from Oliver Bendel's private Social Robots Lab is either in its original state or wears a 3D-printed head provided by Norman Eskera. The project is being carried out at the FHNW School of Business and will end on August 12, 2025, after which the results will be presented to the community and, if possible, to the general public.

Fig.: The project is about encounters between robotic quadrupeds and dogs

Online Survey on the VISUAL Project

On June 19, 2025, the interim presentation of the VISUAL project took place. The initiative was launched by Prof. Dr. Oliver Bendel from the FHNW School of Business. The project assistant is Doris Jovic, who is currently pursuing her Bachelor's degree in Business Information Technology (BIT). "VISUAL" stands for "Virtual Inclusive Safaris for Unique Adventures and Learning". All over the world, webcams provide real-time footage of wild animals. Sighted people can use them to go on photo or video safaris from the comfort of their homes. However, blind and visually impaired individuals are at a disadvantage. In the spirit of Inclusive AI – a concept and movement that includes tools like Be My Eyes and its Be My AI feature – this project aims to create an accessible solution. By August 2025, the goal is to develop a prototype that allows blind and visually impaired users to receive audio descriptions of webcam images or videos of wildlife. The system analyzes and interprets the footage using a multimodal large language model (MLLM), presenting the results via an integrated text-to-speech engine. To better understand the needs of the target group, an online survey has been available since June 19, 2025. It is accessible in both English and German.

Fig.: Photo safaris for blind and visually impaired people (Image: ChatGPT/4o Image)

Animals, Ethics, and Engineering

The edited volume "Animals, Ethics, and Engineering: Intersections and Implications" will be published by Jenny Stanford in August 2025. It can already be pre-ordered via online stores. These provide the following information: "'Animals, Ethics, and Engineering: Intersections and Implications' is a seminal work that explores the intricate relationship between technology, ethics, and the welfare of nonhuman animals. Edited by Rosalyn W. Berne, this volume brings together leading scholars and practitioners to examine the ethical responsibilities inherent in technological progress and its impact on animal well-being." (publisher's information) The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. Rosalyn W. Berne is the Anne Shirley Carter Olsson Professor of Applied Ethics and Chair of the Department of Engineering and Society, School of Engineering and Applied Sciences, University of Virginia, where she has been a faculty member since 1999 and co-directs the Online Ethics Center for Engineering and Science (OEC).

Fig.: One of our relatives

14 Animal-Related Concepts and Artifacts

Since 2012, Oliver Bendel has developed 14 concepts and artifacts in the field of animal-computer interaction (ACI) or animal-machine interaction (AMI) together with his students. They can be divided into three categories. The first are animal- and nature-friendly concepts. The second are animal-friendly machines and systems (i.e., forms of moral machines). The third are animal-inspired machines and systems that replace animals or bring them closer to humans. Articles and book chapters have been published on many of the projects, and the names of the developers can be found in them. A few prototypes made it into the media, such as LADYBIRD and HAPPY HEDGEHOG. Oliver Bendel repeatedly refers to Clara Mancini, the pioneer in the field of animal-computer interaction. Recently, ethicists such as Peter Singer have also turned their attention to the topic. Current projects include "Robodog" (in which a robot dog meets real dogs) and "VISUAL" (which is designed to enable photo safaris for the blind).

Fig.: 14 Animal-related concepts and artifacts

The Robodog Project

Robotic four-legged friends – often referred to as robot dogs – are becoming more and more widespread. As a result, they will also encounter more and more real dogs. The question is how to design, control, and program the robot in such a way that the animals do not overreact and no harm comes to robots, animals, or bystanders. As part of "The Robodog Project", smaller dogs are to be confronted with a walking, running, and jumping Unitree Go2. The plan is to visit controllable environments such as dog training grounds and arrange meetings with dog owners. The findings will lead to suggestions for design and control. Robot enhancement can also play a role here. For example, hobbyists have produced heads for the Unitree Go2 using a 3D printer, giving the robot a completely different look. Suggestions for programming will also be made. The project is due to start at the FHNW School of Business in March 2025. It is part of Prof. Dr. Oliver Bendel's research in the field of animal-machine interaction.

Fig.: Oliver Bendel with his Unitree Go2

13 Animal-Related Concepts and Artifacts in 13 Years

Since 2012, Oliver Bendel has developed 13 concepts and artifacts in the field of animal-computer interaction (ACI) or animal-machine interaction (AMI) together with his students. They can be divided into three categories. The first are animal- and nature-friendly concepts. The second are animal-friendly machines and systems (i.e., forms of moral machines). The third are animal-inspired machines and systems that replace animals or bring them closer to humans. Articles and book chapters have been published on many of the projects, and the names of the developers can be found in them. A few prototypes made it into the media, such as LADYBIRD and HAPPY HEDGEHOG. Oliver Bendel repeatedly refers to Clara Mancini, the pioneer in the field of animal-computer interaction. Recently, ethicists such as Peter Singer have also turned their attention to the topic.

Fig.: An overview of the projects

People of Animal-Computer Interaction 2024

The International Conference on Animal-Computer Interaction (ACI) took place at the University of Glasgow from December 2 to 5, 2024. Clara Mancini and Fiona French are among the pillars of the ACI community. Clara Mancini, PhD, is professor of animal-computer interaction (ACI) and founding director of the Open University ACI Laboratory. Her work explores the interaction between animals and technology, and the nexus between technology, animal well-being and justice, and human-animal relationships. Her research spans the theory, methodology, practice, and ethics of designing animal-centered interactive systems for and with animals to contribute to a more just and inclusive multispecies society. She is a co-founder of the ACI and has been promoting animal-centered research and design across disciplines for over a decade, organizing numerous scientific events and serving on various scientific committees. Dr. Fiona French is an associate professor in the School of Computing and Digital Media of London Metropolitan University. She is course leader for the BSc Games Programming and a Fellow of the Higher Education Academy (FHEA). Her research interests are in the area of animal-computer interaction. Prof. Dr. Oliver Bendel attended the ACI International Conference in 2022 and 2024. He has been conducting research in the field of animal-computer interaction and animal-machine interaction since 2012. Dr. Ilyena Hirskyj-Douglas (University of Glasgow) was the host of ACI 2024.

Fig.: Clara Mancini, Fiona French, and Oliver Bendel at the reception of the ACI 2024 (Photo: Jonathan Traynor)

Award for "The Animal Whisperer Project"

"The Animal Whisperer Project" by Oliver Bendel (FHNW School of Business) and Nick Zbinden (FHNW School of Business) won the Honourable Mention Short Paper Award at the 2024 ACI Conference. From the abstract: "Generative AI has become widespread since 2022. Technical advancements have resulted in multimodal large language models and other AI models that generate, analyze, and evaluate texts, images, and sounds. Such capabilities can be helpful in encounters between humans and animals. For example, apps with generative AI on a smartphone can be used to assess the body language and behavior of animals – e.g., during a walk or hike – and provide a recommendation for human behavior. It is often useful to take into account the animal's environment and situation. The apps can help people to avert approaches and attacks, and thus also protect animals. In 'The Animal Whisperer Project', three apps were developed as prototypes based on the multimodal large language model GPT-4 from OpenAI from the beginning to mid-2024. Three specific GPTs resulted: the Cow Whisperer, the Horse Whisperer, and the Dog Whisperer. All three showed impressive capabilities after the first prompt engineering. These were improved by implementing information from expert interviews and adding labeled images of animals and other materials. AI-based apps for interpreting body language, behavior, and the overall situation can apparently be created today, without much effort, in a low-budget project. However, turning them into products would certainly raise questions, such as liability in the event of accidents." The proceedings are available in the ACM Digital Library.
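How such a whisperer app might talk to the underlying model can be sketched roughly as follows. The prompt texts, function names, and request structure are illustrative assumptions, not the project's actual prompts or code:

```python
import base64

# Hypothetical per-species system prompts; the real apps refined theirs
# with expert interviews and labeled images.
SPECIES_PROMPTS = {
    "cow": "You assess cow body language (ear and head position, herd context) "
           "and advise hikers, e.g. to avoid pastures with mother cows and calves.",
    "horse": "You assess horse body language and advise on safe approach.",
    "dog": "You assess dog body language and advise on safe human behavior.",
}

def build_request(species, image_bytes, question):
    """Return a chat-style message list for a multimodal model."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return [
        {"role": "system", "content": SPECIES_PROMPTS[species]},
        {"role": "user", "content": [
            {"type": "text", "text": question},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
        ]},
    ]

msgs = build_request("cow", b"\xff\xd8fake", "Is it safe to cross this pasture?")
print(msgs[0]["role"], len(msgs))
```

Sending `msgs` to a chat completions endpoint of a multimodal model would return the behavioral assessment and recommendation as text.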

Fig.: Nick Zbinden and Oliver Bendel with the Honourable Mention Short Paper Award

Start of the ACI 2024

The International Conference on Animal-Computer Interaction (ACI) started on December 2, 2024, at the University of Glasgow. The lectures will take place on the last two days. The ACI "is the leading venue in the rapidly expanding field of ACI". "Initially held as a one-day affiliated event, since 2016 it has become a three- or four-day independent event and has been attracting a growing number of participants and contributors from diverse backgrounds." (ACI website) ACI's roots lie in the theoretical, methodological, and ethical foundations and values that have informed interaction design for decades. "Growing out of this fertile ground, ACI's theoretical and methodological scope has since been expanding to include all forms of animals' interaction with computing systems and all aspects of animal-centred computing, resulting in an increasing variety of applications." (ACI website) After the welcome address by Ilyena Hirskyj-Douglas (University of Glasgow) on the morning of December 4, 2024, Amanda Seed (School of Psychology and Neuroscience, University of St Andrews) gave her opening keynote entitled "What kind of mind do primates have?".

Fig.: Ilyena Hirskyj-Douglas welcomes the participants

ACI ’24 Proceedings

The "Proceedings of the International Conference on Animal-Computer Interaction 2024" were published at the end of November 2024, a few days before the conference in Glasgow. The following papers received awards: "Wireless Tension Sensors for Characterizing Dog Frailty in Veterinary Settings" by Colt Nichols (North Carolina State University), Yifan Wu (North Carolina State University), Alper Bozkurt, David Roberts (North Carolina State University), and Margaret Gruen (North Carolina State University): Best Paper Award; "Communication Functions in Speech Board Use by a Goffin's Cockatoo: Implications for Research and Design" by Jennifer Cunha (Indiana University), Corinne Renguette (Purdue University), Lily Stella (Indiana University), and Clara Mancini (The Open University): Honourable Mention Award; "Surveying The Extent of Demographic Reporting of Animal Participants in ACI Research" by Lena Ashooh (Harvard University), Ilyena Hirskyj-Douglas (University of Glasgow), and Rebecca Kleinberger (Northeastern University): Honourable Mention Award; "Shelling Out the Fun: Quantifying Otter Interactions with Instrumented Enrichment Objects" by Charles Ramey (Georgia Institute of Technology), Jason Jones (Georgia Aquarium), Kristen Hannigan (Georgia Aquarium), Elizabeth Sadtler (Georgia Aquarium), Jennifer Odell (Georgia Aquarium), Thad Starner (Georgia Institute of Technology), and Melody Jackson (Georgia Institute of Technology): Best Short Paper Award; "The Animal Whisperer Project" by Oliver Bendel (FHNW School of Business) and Nick Zbinden (FHNW School of Business): Honourable Mention Short Paper Award.

Fig.: Facing a horse

Cow Whisperer, Horse Whisperer, Dog Whisperer

On May 28, 2024, the interim presentation for the project "The Animal Whisperer" took place at the FHNW School of Business. It was initiated by Prof. Dr. Oliver Bendel, who has been working on animal-computer interaction and animal-machine interaction for many years. Nick Zbinden, an aspiring business information systems specialist, was recruited as project collaborator. He developed three applications based on GPT-4o: the Cow Whisperer, the Horse Whisperer, and the Dog Whisperer. They can be used to analyze the body language and environment of cows, horses, and dogs, in order to avert danger to humans and animals. For example, a hiker can receive a recommendation on his smartphone not to cross a pasture when a mother cow is present with her calves. All he has to do is open the application and take photos of the surroundings. The tests are already very promising. Nick Zbinden is currently conducting interviews with three human whisperers, i.e., experts in this field who are particularly good at assessing the body language and behavior of these animals and at handling them. In the process, they also describe photos – showing, for example, different positions of ears or heads – which he then feeds into the applications. The final results will be available in August 2024.

Fig.: A horse whisperer