Kick-off Meeting of the DEEP VOICE Project

The DEEP VOICE project started on September 3, 2025. It was initiated by Prof. Dr. Oliver Bendel. Machine learning expert Florian Karrer was recruited as project collaborator. In this context, he will write his final thesis at the FHNW School of Business. “DEEP VOICE” stands for “Decoding Environmental and Ethological Patterns in Vocal Communication of Cetaceans”. The project aims to decode the symbolic language of whales. To this end, the perspective of the animal is to be adopted in a radical way. A model of the animal, its behavior, and its environment is to be created, which will then serve as the basis for the language processing component. The project combines biological and ethological foundations with machine learning approaches and thus seeks to contribute to a better understanding of non-human intelligence and communication as well as, via animal-computer interaction (ACI), to human-animal interaction. So far, Oliver Bendel and his students have focused primarily on the body language of domestic and farm animals (The Animal Whisperer Project) and on the behavior of pets (The Robodog Project) and wild animals (VISUAL). In addition, a concept for facial recognition in bears was developed, with the aim of avoiding invasive tracking.

Fig.: Two whales in the sea

AI for Non-Human Animal Communication

Recent advancements in artificial intelligence (AI) and bioacoustics have opened a unique opportunity to explore and decode animal communication. With the growing availability of bioacoustic data and sophisticated machine learning models, researchers are now in a position to make significant strides in understanding non-human animal languages. However, realizing this potential requires a deliberate integration of AI and ethology. The AI for Non-Human Animal Communication workshop at NeurIPS 2025 will focus on the challenges of processing complex bioacoustic data and interpreting animal signals. The workshop will feature keynote talks, a poster session, and a panel discussion, all aimed at advancing the use of AI to uncover the mysteries of animal communication and its implications for biodiversity and ecological conservation. The workshop is inviting submissions for short papers and proposals related to the use of AI in animal communication. Topics of interest include bioacoustics, multimodal learning, ecological monitoring, species-specific studies, and the ethical considerations of applying AI in animal research. Papers should present novel research, methodologies, or technologies in these areas, and will undergo a double-blind review process. The paper submission deadline is September 5, 2025, with notifications of acceptance by September 22, 2025. More information is available at aiforanimalcomms.org.

Fig.: Nonhuman primates in conversation

Video on the VISUAL Project

Since August 29, 2025, a video has been available that shows the VISUAL system in operation. “VISUAL” stands for “Virtual Inclusive Safaris for Unique Adventures and Learning”. All over the world, there are webcams that show wild animals. Sighted people can use them to go on photo or video safaris from the comfort of their sofa. Blind and visually impaired people are at a disadvantage. In the project, a prototype was developed especially for them in the context of Inclusive AI. Public webcams around the world that are pointed at wildlife are tapped. Users can choose between several habitats on land or in the water. In addition, they can select “Adult” or “Child” as a profile and one of several roles (“Safari Adventurer”, “Field Scientist”, “Calm Observer”). When the live video is called up, three screenshots are taken and combined into a bundle. This bundle is analyzed and evaluated by GPT-4o, a multimodal large language model (MLLM). The description of the scene and the activities is then read aloud to the user. The project is probably one of the first to combine Inclusive AI with new approaches of animal-computer interaction (ACI). The video can be accessed at www.informationsethik.net/videos/.
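The post does not include the project’s source code, but the pipeline it describes (three screenshots from a live stream, bundled and evaluated by GPT-4o under a user-selected role) can be sketched roughly as follows. This is a minimal illustration using the OpenAI Python library; the function names, prompt wording, and file names are assumptions, not the actual implementation:

```python
import base64
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def encode_image(path: str) -> str:
    """Read a screenshot and return it as a base64 string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

def describe_scene(frames: list[str], role: str = "Calm Observer") -> str:
    """Send a bundle of three webcam screenshots to GPT-4o and return a scene description."""
    content = [{
        "type": "text",
        "text": (
            f"You describe wildlife webcam footage for blind users in the role of a {role}. "
            "The three frames were captured a few seconds apart from the same live stream. "
            "Describe the scene and the animals' activities in two or three sentences."
        ),
    }]
    for path in frames:  # attach the screenshot bundle as inline images
        content.append({
            "type": "image_url",
            "image_url": {"url": f"data:image/jpeg;base64,{encode_image(path)}"},
        })
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": content}],
    )
    return response.choices[0].message.content

print(describe_scene(["frame1.jpg", "frame2.jpg", "frame3.jpg"]))
```

The returned text would then be handed to the text-to-speech component that reads the description aloud.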

Fig.: The VISUAL system

When Dogs Meet a Robot Dog

The project “The Robodog Project: Bao Meets Pluto” investigated how domestic dogs react to the quadruped robot Unitree Go2, called Bao by project initiator Prof. Dr. Oliver Bendel, and how their owners perceive such robots in shared public spaces. The project began at the end of March 2025 and was completed at the beginning of August 2025. The study addressed three questions: 1. How do dogs react behaviorally to a quadruped robot in six defined runs, namely stationary, walking, and jumping in its original state as well as stationary, walking, and jumping with an additional 3D-printed dog head? 2. What expectations and concerns do the owners have? 3. What regulatory frameworks could support safe integration? Twelve dogs were observed in six structured interaction phases; their behavior was video-coded using BORIS. Preliminary interviews with eight owners and an expert interview with the biologist and dog trainer Dr. Sabrina Karl provided additional insights. The study, led by Selina Rohr, found that most dogs were cautious but not aggressive. Interest increased as soon as the robot moved, while visual modifications had little effect. The 3D-printed dog head, however, seemed to fascinate the dogs in stationary mode. It was produced and provided by Norman Eskera. The animals frequently sought orientation from their owners, which underscores the importance of human mediation. The owners were cautiously open-minded but emphasized concerns about safety, unpredictability, and liability. The results argue for drone-oriented regulation for the use of such robots in public spaces.

Fig.: The 3D-printed dog head

An Investigation into the Encounter Between Social Robots and Animals

The volume “Animals, Ethics, and Engineering: Intersections and Implications”, edited by Rosalyn W. Berne, was published on August 7, 2025. The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. The title of Oliver Bendel’s contribution is “An Investigation into the Encounter Between Social Robots and Animals”. The abstract reads: “Increasingly, social robots and certain service robots encounter, whether this is planned or not, domestic, farm, or wild animals. They react differently, some interested, some disinterested, some lethargic, some panicked. Research needs to turn more to animal-robot relationships, and to work with engineers to design these relationships in ways that promote animal welfare and reduce animal suffering. This chapter is about social robots that are designed for animals, but also those that – for different, rather unpredictable reasons – meet, interact, and communicate with animals. It also considers animal-friendly machines that have emerged in the context of machine ethics. In the discussion section, the author explores the question of which of the presented robots are to be understood as social robots and what their differences are in their purpose and in their relationship to animals. In addition, social and ethical aspects are addressed.” The book was published by Jenny Stanford Publishing and can be ordered via online stores.

Fig.: A monkey with a mirror

Completion of the VISUAL Project

The final presentation of the VISUAL project took place on July 31, 2025. It was initiated by Prof. Dr. Oliver Bendel of the FHNW School of Business. It was carried out by Doris Jovic, who is completing her bachelor’s degree in Business Information Technology (BIT) in Basel. “VISUAL” stands for “Virtual Inclusive Safaris for Unique Adventures and Learning”. All over the world, there are webcams that show wild animals. Sighted people can use them to go on photo or video safaris from the comfort of their sofa. Blind and visually impaired people are at a disadvantage. In the project, a prototype was developed especially for them in the context of Inclusive AI. Public webcams around the world that are pointed at wildlife are tapped. Users can choose between several habitats on land or in the water. In addition, they can select “Adult” or “Child” as a profile and one of several roles (“Safari Adventurer”, “Field Scientist”, “Calm Observer”). When the live video is called up, three screenshots are taken and combined into a bundle. This bundle is analyzed and evaluated by GPT-4o, a multimodal large language model. The description of the scene and the activities is then read aloud to the user. The needs of blind and visually impaired people were gathered through an accessible online survey, supported by FHNW staff member Artan Llugaxhija. The project is probably one of the first to combine Inclusive AI with new approaches of animal-computer interaction (ACI).

Fig.: Doris Jovic at the final presentation

“The Robodog Project” Comes to an End

Animal-machine interaction (AMI) is a discipline or field of work that deals with the interaction between animals and machines. This is how Prof. Dr. Oliver Bendel explains it in the Gabler Wirtschaftslexikon. It is primarily concerned with the design, evaluation, and implementation of complex machines and computer systems with which animals interact and which in turn interact and communicate with animals. There are close links to animal-computer interaction (ACI). Increasingly, the machine is a robot that is either remote-controlled or (partially) autonomous. In “The Robodog Project” – also known as “Bao Meets Pluto” – the encounters between robotic quadrupeds and small to medium-sized dogs are explored. The project collaborator is Selina Rohr, who is writing her bachelor’s thesis in this context. The walking, running, and jumping Unitree Go2 from Oliver Bendel’s private Social Robots Lab is used either in its original state or wearing a 3D-printed head provided by Norman Eskera. The project is being carried out at the FHNW School of Business and will end on August 12, 2025, after which the results will be presented to the community and, if possible, to the general public.

Fig.: The project is about encounters between robotic quadrupeds and dogs

Online Survey on the VISUAL Project

On June 19, 2025, the interim presentation of the VISUAL project took place. The initiative was launched by Prof. Dr. Oliver Bendel from the FHNW School of Business. The project assistant is Doris Jovic, who is currently pursuing her bachelor’s degree in Business Information Technology (BIT). “VISUAL” stands for “Virtual Inclusive Safaris for Unique Adventures and Learning”. All over the world, webcams provide real-time footage of wild animals. Sighted people can use them to go on photo or video safaris from the comfort of their homes. However, blind and visually impaired individuals are at a disadvantage. In the spirit of Inclusive AI – a concept and movement that includes tools like Be My Eyes and its Be My AI feature – this project aims to create an accessible solution. By August 2025, the goal is to develop a prototype that allows blind and visually impaired users to receive audio descriptions of webcam images or videos of wildlife. The system analyzes and interprets the footage using a multimodal large language model (MLLM), presenting the results via an integrated text-to-speech engine. To better understand the needs of the target group, an online survey has been available since June 19, 2025. It is accessible in both English and German.
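The post names an integrated text-to-speech engine as the output stage but does not specify which one. As a hedged sketch of that final step, OpenAI’s speech endpoint could read such a description aloud; the engine choice, voice, sample text, and file name are assumptions for illustration:

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Hypothetical description as it might come back from the multimodal model
description = "Two elephants are drinking at the waterhole; a calf stays close to its mother."

# Synthesize speech and write it to an MP3 file (the post does not name
# the text-to-speech component actually used, so this engine is an assumption)
with client.audio.speech.with_streaming_response.create(
    model="tts-1",
    voice="alloy",
    input=description,
) as response:
    response.stream_to_file("description.mp3")
```

Any other TTS engine, or a screen reader on the user’s device, could take over this step just as well.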

Fig.: Photo safaris for blind and visually impaired people (Image: ChatGPT/4o Image)

Animals, Ethics, and Engineering

The edited volume “Animals, Ethics, and Engineering: Intersections and Implications” will be published by Jenny Stanford in August 2025. It can already be pre-ordered via online stores. These provide the following information: “‘Animals, Ethics, and Engineering: Intersections and Implications’ is a seminal work that explores the intricate relationship between technology, ethics, and the welfare of nonhuman animals. Edited by Rosalyn W. Berne, this volume brings together leading scholars and practitioners to examine the ethical responsibilities inherent in technological progress and its impact on animal well-being.” (publisher’s information) The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. Rosalyn W. Berne is the Anne Shirley Carter Olsson Professor of Applied Ethics and Chair of the Department of Engineering and Society, School of Engineering and Applied Sciences, University of Virginia, where she has been a faculty member since 1999 and co-directs the Online Ethics Center for Engineering and Science (OEC).

Fig.: One of our relatives

14 Animal-Related Concepts and Artifacts

Since 2012, Oliver Bendel has developed 14 concepts and artifacts in the field of animal-computer interaction (ACI) or animal-machine interaction (AMI) together with his students. They can be divided into three categories. The first are animal- and nature-friendly concepts. The second are animal-friendly machines and systems (i.e., forms of moral machines). The third are animal-inspired machines and systems that replace animals or bring them closer to us. Articles and book chapters have been published on many of the projects. The names of the developers can be found in these. A few prototypes made it into the media, such as LADYBIRD and HAPPY HEDGEHOG. Oliver Bendel repeatedly refers to Clara Mancini, the pioneer in the field of animal-computer interaction. Recently, ethicists such as Peter Singer have also turned their attention to the topic. Current projects include “Robodog” (in which a robot dog meets real dogs) and “VISUAL” (which is designed to enable photo safaris for the blind).

Fig.: 14 Animal-related concepts and artifacts

The Robodog Project

Robotic four-legged friends – often referred to as robot dogs – are becoming more and more widespread. As a result, they will also encounter more and more real dogs. The question is how to design, control, and program the robot in such a way that the animals do not overreact and no harm comes to robots, animals, or bystanders. As part of “The Robodog Project”, smaller dogs are to be confronted with a walking, running, and jumping Unitree Go2. The plan is to visit controllable environments such as dog training grounds and arrange meetings with dog owners. The findings will lead to suggestions for design and control. Robot enhancement can also play a role here. For example, hobbyists have produced heads for the Unitree Go2 using a 3D printer, giving the robot a completely different look. Suggestions for programming will also be made. The project is due to start at the FHNW School of Business in March 2025. It is part of Prof. Dr. Oliver Bendel’s research in the field of animal-machine interaction.

Fig.: Oliver Bendel with his Unitree Go2

13 Animal-Related Concepts and Artifacts in 13 Years

Since 2012, Oliver Bendel has developed 13 concepts and artifacts in the field of animal-computer interaction (ACI) or animal-machine interaction (AMI) together with his students. They can be divided into three categories. The first are animal- and nature-friendly concepts. The second are animal-friendly machines and systems (i.e., forms of moral machines). The third are animal-inspired machines and systems that replace animals or bring them closer to us. Articles and book chapters have been published on many of the projects. The names of the developers can be found in these. A few prototypes made it into the media, such as LADYBIRD and HAPPY HEDGEHOG. Oliver Bendel repeatedly refers to Clara Mancini, the pioneer in the field of animal-computer interaction. Recently, ethicists such as Peter Singer have also turned their attention to the topic.

Fig.: An overview of the projects

People of Animal-Computer Interaction 2024

The International Conference on Animal-Computer Interaction (ACI) took place at the University of Glasgow from December 2 to 5, 2024. Clara Mancini and Fiona French are among the pillars of the ACI community. Clara Mancini, PhD, is Professor of Animal-Computer Interaction and founding director of the Open University ACI Laboratory. Her work explores the interaction between animals and technology, and the nexus between technology, animal well-being and justice, and human-animal relationships. Her research spans the theory, methodology, practice, and ethics of designing animal-centered interactive systems for and with animals to contribute to a more just and inclusive multispecies society. She is a co-founder of the ACI conference and has been promoting animal-centered research and design across disciplines for over a decade, organizing numerous scientific events and serving on various scientific committees. Dr. Fiona French is an associate professor in the School of Computing and Digital Media at London Metropolitan University. She is course leader for the BSc Games Programming and a Fellow of the Higher Education Academy (FHEA). Her research interests are in the area of animal-computer interaction. Prof. Dr. Oliver Bendel attended the conference in 2022 and 2024. He has been conducting research in the field of animal-computer interaction and animal-machine interaction since 2012. Dr. Ilyena Hirskyj-Douglas (University of Glasgow) was the host of ACI 2024.

Fig.: Clara Mancini, Fiona French, and Oliver Bendel at the reception of the ACI 2024 (Photo: Jonathan Traynor)

Award for “The Animal Whisperer Project”

“The Animal Whisperer Project” by Oliver Bendel (FHNW School of Business) and Nick Zbinden (FHNW School of Business) won the Honourable Mention Short Paper Award at the 2024 ACI Conference. From the abstract: “Generative AI has become widespread since 2022. Technical advancements have resulted in multimodal large language models and other AI models that generate, analyze, and evaluate texts, images, and sounds. Such capabilities can be helpful in encounters between humans and animals. For example, apps with generative AI on a smartphone can be used to assess the body language and behavior of animals – e.g., during a walk or hike – and provide a recommendation for human behavior. It is often useful to take into account the animal’s environment and situation. The apps can help people to avert approaches and attacks, and thus also protect animals. In ‘The Animal Whisperer Project’, three apps were developed as prototypes based on the multimodal large language model GPT-4 from OpenAI from the beginning to mid-2024. Three specific GPTs resulted: the Cow Whisperer, the Horse Whisperer, and the Dog Whisperer. All three showed impressive capabilities after the first prompt engineering. These were improved by implementing information from expert interviews and adding labeled images of animals and other materials. AI-based apps for interpreting body language, behavior, and the overall situation can apparently be created today, without much effort, in a low-budget project. However, turning them into products would certainly raise questions, such as liability in the event of accidents.” The proceedings are available here.

Fig.: Nick Zbinden and Oliver Bendel with the Honourable Mention Short Paper Award

Start of the ACI 2024

The International Conference on Animal-Computer Interaction (ACI) started on December 2, 2024 at the University of Glasgow. The lectures will take place on the last two days. The ACI “is the leading venue in the rapidly expanding field of ACI”. “Initially held as a one-day affiliated event, since 2016 it has become a three- or four-day independent event and has been attracting a growing number of participants and contributors from diverse backgrounds.” (Website ACI) ACI’s roots lie in the theoretical, methodological and ethical foundations and values that have informed interaction design for decades. “Growing out of this fertile ground, ACI’s theoretical and methodological scope has since been expanding to include all forms of animals’ interaction with computing systems and all aspects of animal-centred computing, resulting in an increasing variety of applications.” (Website ACI) After the welcome address by Ilyena Hirskyj-Douglas (University of Glasgow) on the morning of December 4, 2024, Amanda Seed (School of Psychology and Neuroscience, University of St Andrews) gave her opening keynote entitled “What kind of mind do primates have?”.

Fig.: Ilyena Hirskyj-Douglas welcomes the participants

ACI ’24 Proceedings

The “Proceedings of the International Conference on Animal-Computer Interaction 2024” were published at the end of November 2024, a few days before the conference in Glasgow. The following papers received awards:

- “Wireless Tension Sensors for Characterizing Dog Frailty in Veterinary Settings” by Colt Nichols (North Carolina State University), Yifan Wu (North Carolina State University), Alper Bozkurt, David Roberts (North Carolina State University), and Margaret Gruen (North Carolina State University): Best Paper Award
- “Communication Functions in Speech Board Use by a Goffin’s Cockatoo: Implications for Research and Design” by Jennifer Cunha (Indiana University), Corinne Renguette (Purdue University), Lily Stella (Indiana University), and Clara Mancini (The Open University): Honourable Mention Award
- “Surveying The Extent of Demographic Reporting of Animal Participants in ACI Research” by Lena Ashooh (Harvard University), Ilyena Hirskyj-Douglas (University of Glasgow), and Rebecca Kleinberger (Northeastern University): Honourable Mention Award
- “Shelling Out the Fun: Quantifying Otter Interactions with Instrumented Enrichment Objects” by Charles Ramey (Georgia Institute of Technology), Jason Jones (Georgia Aquarium), Kristen Hannigan (Georgia Aquarium), Elizabeth Sadtler (Georgia Aquarium), Jennifer Odell (Georgia Aquarium), Thad Starner (Georgia Institute of Technology), and Melody Jackson (Georgia Institute of Technology): Best Short Paper Award
- “The Animal Whisperer Project” by Oliver Bendel (FHNW School of Business) and Nick Zbinden (FHNW School of Business): Honourable Mention Short Paper Award

Fig.: Facing a horse

Cow Whisperer, Horse Whisperer, Dog Whisperer

On May 28, 2024, the interim presentation for the project “The Animal Whisperer” took place at the FHNW School of Business. It was initiated by Prof. Dr. Oliver Bendel, who has been working on animal-computer interaction and animal-machine interaction for many years. Nick Zbinden, a budding business information systems specialist, was recruited as project collaborator. He developed three applications based on GPT-4o: the Cow Whisperer, the Horse Whisperer, and the Dog Whisperer. They can be used to analyze the body language and environment of cows, horses, and dogs. The aim is to avert danger for humans and animals. For example, a hiker can receive a recommendation on their smartphone not to cross a pasture when a mother cow is present with her calves. All they have to do is open the application and take photos of the surroundings. The tests are already very promising. Nick Zbinden is currently conducting interviews with three human whisperers, i.e., experts in this field who are particularly good at assessing the body language and behavior of these animals and at handling them. In the process, they also describe photos, showing for example different positions of the ears or head, which are then fed into the applications. The final results will be available in August 2024.
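The posts do not reveal the apps’ internals, but the workflow they describe (photograph the animal and its surroundings, have GPT-4o assess body language and context, receive a behavioral recommendation) can be sketched as follows. This is a minimal illustration; the system prompt, function name, and file name are assumptions, not the project’s actual prompts:

```python
import base64
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Illustrative system prompt; the actual prompts of the Cow Whisperer are not public
SYSTEM_PROMPT = (
    "You are a cow whisperer. You assess the body language of cows and their "
    "environment in photos, e.g., the position of ears, head, and tail, and whether "
    "calves are present. Give the user a short, cautious recommendation on how to "
    "behave, e.g., whether to cross a pasture or keep a distance."
)

def cow_whisperer(photo_path: str) -> str:
    """Send one photo of a pasture scene to GPT-4o and return a recommendation."""
    with open(photo_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": [
                {"type": "text", "text": "I am hiking. Can I cross this pasture?"},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ]},
        ],
    )
    return response.choices[0].message.content

print(cow_whisperer("pasture.jpg"))
```

The expert knowledge mentioned in the post (descriptions of ear and head positions) would be folded into such a prompt or added as labeled example images.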

Fig.: A horse whisperer

The Animal Whisperer: A GenAI App for Decoding Animal Body Language

When humans come into contact with wildlife, farm animals, and pets, they sometimes run the risk of being injured or killed. They may be attacked by bears, wolves, cows, horses, or dogs. Experts can use an animal’s body language to determine whether or not danger is imminent. Context is also important, such as whether a mother cow is with her calves. The multimodality of large language models enables novel applications. For example, ChatGPT can evaluate images. This ability can be used to interpret the body language of animals, thus drawing on and partly replacing expert knowledge. Prof. Dr. Oliver Bendel, who has been involved with animal-computer interaction and animal-machine interaction for many years, has initiated a project called “The Animal Whisperer” in this context. The goal is to create a prototype application based on GenAI that can be used to interpret the body language of an animal and avert danger for humans. GPT-4 or an open-source language model is to be used to create the prototype. It is to be augmented with appropriate material, taking into account animals such as bears, wolves, cows, horses, and dogs. Approaches may include fine-tuning or prompt engineering. The project will begin in March 2024, and the results will be available in the summer of the same year.

Fig.: The Animal Whisperer (Image: DALL-E 3)

A New Resident of the Social Robots Lab

The Unitree Go2 has been part of Prof. Dr. Oliver Bendel’s privately funded Social Robots Lab since December 2023. It goes by the name Bao (Chinese for “jewel” or “treasure”). Heise writes in an article: “The base model of the Go2 is equipped with a lidar on its head, which has a hemispherical field of view of 90 degrees vertically and 360 degrees horizontally. The minimum range of the lidar is about 5 cm. This allows the robot to navigate autonomously in the field, detect obstacles, and avoid them. The robot walks at a speed of up to 2.5 m/s and is extremely agile. … Also on board is a 2-megapixel camera that can be used to take snapshots and videos.” (Heise News, July 27, 2023) One concern of the business information systems specialist and philosopher of technology is to advance animal-machine interaction. He defined this field of work in 2013 in his paper “Considerations about the Relationship between Animal and Machine Ethics”, following the term animal-computer interaction, whose pioneer is Clara Mancini. Since then, he has developed several artifacts and concepts in this area, including Robocar (modeling for animal-friendly cars), LADYBIRD (prototype of an insect-friendly vacuum cleaning robot), HAPPY HEDGEHOG (prototype of a hedgehog-friendly mowing robot), and ANIFACE (concept for a system with facial recognition to identify brown bears). Bao will be used to test the reactions of domestic animals, farm animals, and wild animals. The aim is to design robots that are not only human-friendly but also animal-friendly.

Fig.: The Unitree Go2 in Oliver Bendel’s office

Artificial Intelligence and Animals

The online event “Artificial Intelligence & Animals” will take place on September 16, 2023. “AI experts and attorneys will discuss the intersection of AI and animals in this UIA Animal Law Commission and GW Animal Law webinar” (Website Eventbrite). Speakers are Prof. Dr. Oliver Bendel (FHNW University of Applied Sciences and Arts Northwestern Switzerland), Yip Fai Tse (University Center for Human Values, Center for Information Technology Policy, Princeton University), and Sam Tucker (CEO VegCatalyst, AI-Powered Marketing, Melbourne). Panelists are Ian McDougall (Executive Vice President and General Counsel, LexisNexis London), Jamie McLaughlin (Animal Law Commission Vice President, UIA), and Joan Schaffner (Associate Professor of Law, George Washington University). Oliver Bendel “has been thinking on animal ethics since the 1980s and on information and machine ethics since the 1990s”. “Since 2012, he has been systematically researching machine ethics, combining it with animal ethics and animal welfare. With his changing teams, he develops animal-friendly robots and AI systems.” (Website Eventbrite) Yip Fai Tse co-wrote the article “AI ethics: the case for including animals” with Peter Singer. Sam Tucker is an animal rights activist.

Fig.: An AI ball and a cat

CfP for the Tenth International Conference on Animal-Computer Interaction

The Tenth International Conference on Animal-Computer Interaction will be held December 4-8, 2023, in Raleigh, North Carolina, hosted by North Carolina State University. “ACI is the leading International Conference on Animal-Computer Interaction. It is a highly multidisciplinary event drawing researchers and practitioners from diverse backgrounds to share and discuss work and topics related to the research and design of computing-enabled and interactive technology for and with animals.” (Website ACI) The Ninth International Conference on Animal-Computer Interaction was held in Newcastle upon Tyne at the end of 2022. This year, too, the organizers are interested in a variety of topics in animal-computer interaction and animal-machine interaction, as the call for papers (CfP) reveals: “Submissions might address topics such as: the role of technology in shaping human-animal relationships; studies and/or analysis of large-scale technology for animal deployments; considerations on the wider context of technology for animal use; methods and reflections on studying the next generation of technology for animals; or how to conduct ACI research in a world where commercial design and deployment of technology for animals outpaces academic thought.” (Website ACI) The CfP can be accessed at www.aciconf.org/aci2023.

Fig.: The William B. Umstead State Park in North Carolina

Vocal Interaction Between Bird Parents and Eggs

After the keynote on the morning of December 8, 2022, ACI2022 continued with “Paper Session 4: Sensors & Signals, Part I: Origin Stories”. David L. Roberts (North Carolina State University) presented on “Motion-Resilient ECG Signal Reconstruction from a Wearable IMU through Attention Mechanism and Contrastive Learning”. The next talk, “TamagoPhone: A framework for augmenting artificial incubators to enable vocal interaction between bird parents and eggs”, was given by Rebecca Kleinberger (Massachusetts Institute of Technology & Northeastern University). The starting point of her research was that some birds engage in pre-hatching vocal communication. The last presentation before the lunch break, which was given online, was “Simultaneous Contact-Free Physiological Sensing of Human Heart Rate and Canine Breathing Rate for Animal Assisted Interactions: Experimental and Analytical Approaches” by Timothy Holder and Mushfiqur Rahman (North Carolina State University). More information on the conference via www.aciconf.org/aci2022.

Fig.: An ostrich chick next to some eggs