The DEEP VOICE project started on September 3, 2025. It was initiated by Prof. Dr. Oliver Bendel. The machine learning expert Florian Karrer was recruited as project collaborator; he will write his degree thesis at the FHNW School of Business in this context. "DEEP VOICE" stands for "Decoding Environmental and Ethological Patterns in Vocal Communication of Cetaceans". The project aims to decode the symbolic language of whales. To this end, the animal's perspective is to be adopted radically: a model of the animal, its behavior, and its environment is to be built, which will then serve as the basis for the language processing component. The project combines biological and ethological foundations with machine learning approaches and thus seeks to contribute to a better understanding of non-human intelligence and communication as well as – via animal-computer interaction (ACI) – to human-animal interaction. Oliver Bendel and his students have so far focused primarily on the body language of domestic and farm animals (The Animal Whisperer Project) and on the behavior of pets (The Robodog Project) and wild animals (VISUAL). In addition, a concept for face recognition in bears was developed with the aim of avoiding invasive tracking.
Recent advancements in artificial intelligence (AI) and bioacoustics have opened a unique opportunity to explore and decode animal communication. With the growing availability of bioacoustic data and sophisticated machine learning models, researchers are now in a position to make significant strides in understanding non-human animal languages. However, realizing this potential requires a deliberate integration of AI and ethology. The AI for Non-Human Animal Communication workshop at NeurIPS 2025 will focus on the challenges of processing complex bioacoustic data and interpreting animal signals. The workshop will feature keynote talks, a poster session, and a panel discussion, all aimed at advancing the use of AI to uncover the mysteries of animal communication and its implications for biodiversity and ecological conservation. The workshop is inviting submissions for short papers and proposals related to the use of AI in animal communication. Topics of interest include bioacoustics, multimodal learning, ecological monitoring, species-specific studies, and the ethical considerations of applying AI in animal research. Papers should present novel research, methodologies, or technologies in these areas, and will undergo a double-blind review process. The paper submission deadline is September 5, 2025, with notifications of acceptance by September 22, 2025. More information is available at aiforanimalcomms.org.
A video showing the system in operation has been available for the VISUAL project since August 29, 2025. "VISUAL" stands for "Virtual Inclusive Safaris for Unique Adventures and Learning". All over the world there are webcams showing wild animals. Sighted people can use them to go on photo or video safaris from the comfort of their sofa. Blind and visually impaired people are at a disadvantage. In the project, a prototype was developed especially for them in the context of inclusive AI. Public webcams around the world that are pointed at wild animals are tapped. Users can choose between several habitats on land or in water. In addition, they can select "Adult" or "Child" as a profile and a role ("Safari Adventurer", "Field Scientist", "Calm Observer"). When the live video is called up, three screenshots are taken and combined into a bundle. This bundle is analyzed and evaluated by GPT-4o, a multimodal large language model (MLLM). The user then has the description of the scene and the activities read aloud to them. The project is probably one of the first to combine inclusive AI with new approaches in animal-computer interaction (ACI). The video can be accessed at www.informationsethik.net/videos/.
The project "The Robodog Project: Bao Meets Pluto" investigated how domestic dogs react to the quadruped robot Unitree Go2 – called Bao by project initiator Prof. Dr. Oliver Bendel – and how their owners perceive such robots in shared public spaces. The project began at the end of March 2025 and was completed at the beginning of August 2025. The study addressed three questions: 1. How do dogs react behaviorally to a quadruped robot in six defined runs, namely stationary, walking, and jumping in its original state, and stationary, walking, and jumping with an additional 3D-printed dog head? 2. What expectations and concerns do the owners have? 3. What regulatory frameworks could support safe integration? Twelve dogs were observed in six structured interaction phases; their behavior was video-coded using BORIS. Preliminary interviews with eight owners and an expert interview with the biologist and dog trainer Dr. Sabrina Karl provided additional insights. The study, led by Selina Rohr, found that most dogs were cautious but not aggressive. Interest increased as soon as the robot moved, while visual modifications had little effect overall. A 3D-printed dog head, however, seemed to fascinate the dogs in stationary mode. It was produced and provided by Norman Eskera. The animals often sought orientation from their owners, underscoring the importance of human mediation. The owners were cautiously open-minded but emphasized concerns about safety, unpredictability, and liability. The results argue for drone-style regulation for the use of such robots in public spaces.
The volume "Animals, Ethics, and Engineering: Intersections and Implications", edited by Rosalyn W. Berne, was published on 7 August 2025. The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. The title of Oliver Bendel's contribution is "An Investigation into the Encounter Between Social Robots and Animals". The abstract reads: "Increasingly, social robots and certain service robots encounter, whether this is planned or not, domestic, farm, or wild animals. They react differently, some interested, some disinterested, some lethargic, some panicked. Research needs to turn more to animal-robot relationships, and to work with engineers to design these relationships in ways that promote animal welfare and reduce animal suffering. This chapter is about social robots that are designed for animals, but also those that – for different, rather unpredictable reasons – meet, interact, and communicate with animals. It also considers animal-friendly machines that have emerged in the context of machine ethics. In the discussion section, the author explores the question of which of the presented robots are to be understood as social robots and what their differences are in their purpose and in their relationship to animals. In addition, social and ethical aspects are addressed." The book was published by Jenny Stanford Publishing and can be ordered via online stores.
The final presentation of the VISUAL project took place on July 31, 2025. The project was initiated by Prof. Dr. Oliver Bendel of the FHNW School of Business and carried out by Doris Jovic, who is completing her bachelor's degree in Business Information Technology (BIT) in Basel. "VISUAL" stands for "Virtual Inclusive Safaris for Unique Adventures and Learning". All over the world there are webcams showing wild animals. Sighted people can use them to go on photo or video safaris from the comfort of their sofa. Blind and visually impaired people are at a disadvantage. In the project, a prototype was developed especially for them in the context of inclusive AI. Public webcams around the world that are pointed at wild animals are tapped. Users can choose between several habitats on land or in water. In addition, they can select "Adult" or "Child" as a profile and a role ("Safari Adventurer", "Field Scientist", "Calm Observer"). When the live video is called up, three screenshots are taken and combined into a bundle. This bundle is analyzed and evaluated by GPT-4o, a multimodal large language model. The user then has the description of the scene and the activities read aloud to them. The needs of blind and visually impaired people were gathered through an accessible online survey, supported by FHNW staff member Artan Llugaxhija. The project is probably one of the first to combine inclusive AI with new approaches in animal-computer interaction (ACI).
Animal-machine interaction (AMI) is a discipline or field of work that deals with the interaction between animals and machines. This is how Prof. Dr. Oliver Bendel explains it in the Gabler Wirtschaftslexikon. It is primarily concerned with the design, evaluation, and implementation of complex machines and computer systems with which animals interact and which in turn interact and communicate with animals. There are close links to animal-computer interaction (ACI). Increasingly, the machine is a robot that is either remote-controlled or (partially) autonomous. In "The Robodog Project" – also known as "Bao Meets Pluto" – the encounters between robotic quadrupeds and small to medium-sized dogs are explored. The project collaborator is Selina Rohr, who is writing her bachelor's thesis in this context. The walking, running, and jumping Unitree Go2 from Oliver Bendel's private Social Robots Lab is used either in its original state or wearing a head produced on a 3D printer and provided by Norman Eskera. The project is being carried out at the FHNW School of Business and will end on August 12, 2025, after which the results will be presented to the community and, if possible, to the general public.
Fig.: The project is about encounters between robotic quadrupeds and dogs
The edited volume "Animals, Ethics, and Engineering: Intersections and Implications" will be published by Jenny Stanford Publishing in August 2025. It can already be pre-ordered via online stores. These provide the following information: "'Animals, Ethics, and Engineering: Intersections and Implications' is a seminal work that explores the intricate relationship between technology, ethics, and the welfare of nonhuman animals. Edited by Rosalyn W. Berne, this volume brings together leading scholars and practitioners to examine the ethical responsibilities inherent in technological progress and its impact on animal well-being." (Publisher's information) The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. Rosalyn W. Berne is the Anne Shirley Carter Olsson Professor of Applied Ethics and Chair of the Department of Engineering and Society, School of Engineering and Applied Sciences, University of Virginia, where she has been a faculty member since 1999 and co-directs the Online Ethics Center for Engineering and Science (OEC).
Robotic four-legged friends – often referred to as robot dogs – are becoming more and more widespread. As a result, they will also encounter more and more real dogs. The question is how to design, control, and program the robot in such a way that the animals do not overreact and that no harm comes to robots, animals, or bystanders. As part of "The Robodog Project", smaller dogs are to be confronted with a walking, running, and jumping Unitree Go2. The plan is to visit controllable environments such as dog training grounds and arrange meetings with dog owners. The findings will lead to suggestions for design and control. Robot enhancement can also play a role here. For example, hobbyists have produced heads for the Unitree Go2 using a 3D printer, giving the robot a completely different look. Suggestions for programming will also be made. The project is due to start at the FHNW School of Business in March 2025. It is part of Prof. Dr. Oliver Bendel's research in the field of animal-machine interaction.
The International Conference on Animal-Computer Interaction (ACI) took place at the University of Glasgow from December 2 to 5, 2024. Clara Mancini and Fiona French are among the pillars of the ACI community. Clara Mancini, PhD, is professor of animal-computer interaction and founding director of the Open University ACI Laboratory. Her work explores the interaction between animals and technology, and the nexus between technology, animal well-being and justice, and human-animal relationships. Her research spans the theory, methodology, practice, and ethics of designing animal-centered interactive systems for and with animals to contribute to a more just and inclusive multispecies society. She is a co-founder of the ACI conference and has been promoting animal-centered research and design across disciplines for over a decade, organizing numerous scientific events and serving on various scientific committees. Dr. Fiona French is an associate professor in the School of Computing and Digital Media at London Metropolitan University. She is course leader for the BSc Games Programming and a Fellow of the Higher Education Academy (FHEA). Her research interests are in the area of animal-computer interaction. Prof. Dr. Oliver Bendel attended the ACI International Conference in 2022 and 2024. He has been conducting research in the fields of animal-computer interaction and animal-machine interaction since 2012. Dr. Ilyena Hirskyj-Douglas (University of Glasgow) was the host of ACI 2024.
Fig.: Clara Mancini, Fiona French, and Oliver Bendel at the reception of the ACI 2024 (Photo: Jonathan Traynor)
"The Animal Whisperer Project" by Oliver Bendel (FHNW School of Business) and Nick Zbinden (FHNW School of Business) won the Honourable Mention Short Paper Award at the 2024 ACI Conference. From the abstract: "Generative AI has become widespread since 2022. Technical advancements have resulted in multimodal large language models and other AI models that generate, analyze, and evaluate texts, images, and sounds. Such capabilities can be helpful in encounters between humans and animals. For example, apps with generative AI on a smartphone can be used to assess the body language and behavior of animals – e.g., during a walk or hike – and provide a recommendation for human behavior. It is often useful to take into account the animal's environment and situation. The apps can help people to avert approaches and attacks, and thus also protect animals. In 'The Animal Whisperer Project', three apps were developed as prototypes based on the multimodal large language model GPT-4 from OpenAI from the beginning to mid-2024. Three specific GPTs resulted: the Cow Whisperer, the Horse Whisperer, and the Dog Whisperer. All three showed impressive capabilities after the first prompt engineering. These were improved by implementing information from expert interviews and adding labeled images of animals and other materials. AI-based apps for interpreting body language, behavior, and the overall situation can apparently be created today, without much effort, in a low-budget project. However, turning them into products would certainly raise questions, such as liability in the event of accidents." The proceedings are available online.
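The abstract describes the three Whisperer apps as custom GPTs built on GPT-4, refined through prompt engineering and expert knowledge. A minimal sketch of how such an assessment request could be assembled in the OpenAI chat format follows; the system prompt wording and function names are illustrative assumptions, not the project's actual prompts.

```python
# Hypothetical sketch of a "Dog Whisperer"-style request: a system prompt
# encoding expert knowledge plus a user photo of the animal. The prompt
# text and function names are illustrative assumptions, not project code.
import base64

SYSTEM_PROMPT = (
    "You are a dog behavior assistant. From a photo, assess the dog's body "
    "language (ears, tail, posture) and the situation, then recommend how "
    "the human should behave to avoid provoking an approach or attack."
)

def build_whisperer_request(image_bytes: bytes) -> dict:
    """Assemble a GPT-4-class chat payload with the expert system prompt."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": "gpt-4o",
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": [
                {"type": "text", "text": "Assess this dog and advise me."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{encoded}"}},
            ]},
        ],
    }
```

Keeping the expert knowledge in the system prompt, as the project's iterative prompt engineering suggests, lets the same request structure be reused for a Cow Whisperer or Horse Whisperer by swapping only that prompt.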
Fig.: Nick Zbinden and Oliver Bendel with the Honourable Mention Short Paper Award
The International Conference on Animal-Computer Interaction (ACI) started on December 2, 2024, at the University of Glasgow. The lectures will take place on the last two days. The ACI "is the leading venue in the rapidly expanding field of ACI". "Initially held as a one-day affiliated event, since 2016 it has become a three- or four-day independent event and has been attracting a growing number of participants and contributors from diverse backgrounds." (Website ACI) ACI's roots lie in the theoretical, methodological and ethical foundations and values that have informed interaction design for decades. "Growing out of this fertile ground, ACI's theoretical and methodological scope has since been expanding to include all forms of animals' interaction with computing systems and all aspects of animal-centred computing, resulting in an increasing variety of applications." (Website ACI) After the welcome address by Ilyena Hirskyj-Douglas (University of Glasgow) on the morning of 4 December 2024, Amanda Seed (School of Psychology and Neuroscience, University of St Andrews) gave her opening keynote entitled "What kind of mind do primates have?".
Fig.: Ilyena Hirskyj-Douglas welcomes the participants
The "Proceedings of the International Conference on Animal-Computer Interaction 2024" were published at the end of November 2024, a few days before the conference in Glasgow. The following papers received awards: "Wireless Tension Sensors for Characterizing Dog Frailty in Veterinary Settings" by Colt Nichols (North Carolina State University), Yifan Wu (North Carolina State University), Alper Bozkurt, David Roberts (North Carolina State University) and Margaret Gruen (North Carolina State University): Best Paper Award; "Communication Functions in Speech Board Use by a Goffin's Cockatoo: Implications for Research and Design" by Jennifer Cunha (Indiana University), Corinne Renguette (Purdue University), Lily Stella (Indiana University) and Clara Mancini (The Open University): Honourable Mention Award; "Surveying The Extent of Demographic Reporting of Animal Participants in ACI Research" by Lena Ashooh (Harvard University), Ilyena Hirskyj-Douglas (University of Glasgow) and Rebecca Kleinberger (Northeastern University): Honourable Mention Award; "Shelling Out the Fun: Quantifying Otter Interactions with Instrumented Enrichment Objects" by Charles Ramey (Georgia Institute of Technology), Jason Jones (Georgia Aquarium), Kristen Hannigan (Georgia Aquarium), Elizabeth Sadtler (Georgia Aquarium), Jennifer Odell (Georgia Aquarium), Thad Starner (Georgia Institute of Technology) and Melody Jackson (Georgia Institute of Technology): Best Short Paper Award; "The Animal Whisperer Project" by Oliver Bendel (FHNW School of Business) and Nick Zbinden (FHNW School of Business): Honourable Mention Short Paper Award.
The Tenth International Conference on Animal-Computer Interaction will be held December 4-8, 2023, in Raleigh, North Carolina, hosted by North Carolina State University. "ACI is the leading International Conference on Animal-Computer Interaction. It is a highly multidisciplinary event drawing researchers and practitioners from diverse backgrounds to share and discuss work and topics related to the research and design of computing-enabled and interactive technology for and with animals." (Website ACI) The Ninth International Conference on Animal-Computer Interaction was held in Newcastle upon Tyne at the end of 2022. This year, too, the organizers are interested in a variety of topics in animal-computer interaction and animal-machine interaction, as the call for papers (CfP) reveals: "Submissions might address topics such as: the role of technology in shaping human-animal relationships; studies and/or analysis of large-scale technology for animal deployments; considerations on the wider context of technology for animal use; methods and reflections on studying the next generation of technology for animals; or how to conduct ACI research in a world where commercial design and deployment of technology for animals outpaces academic thought." (Website ACI) The CfP can be accessed at www.aciconf.org/aci2023.
Fig.: The William B. Umstead State Park in North Carolina
The ACI took place from 5 to 8 December 2022 in Newcastle upon Tyne. It is the world's leading conference on animal-computer interaction. The proceedings were published in the ACM Library on March 30, 2023. They include the paper "A Face Recognition System for Bears: Protection for Animals and Humans in the Alps" by Oliver Bendel and Ali Yürekkirmaz. From the abstract: "Face recognition, in the sense of identifying people, is controversial from a legal, social, and ethical perspective. In particular, opposition has been expressed to its use in public spaces for mass surveillance purposes. Face recognition in animals, by contrast, seems to be uncontroversial from a social and ethical point of view and could even have potential for animal welfare and protection. This paper explores how face recognition for bears (understood here as brown bears) in the Alps could be implemented within a system that would help animals as well as humans. It sets out the advantages and disadvantages of wildlife cameras, ground robots, and camera drones that would be linked to artificial intelligence. Based on this, the authors make a proposal for deployment. They favour a three-stage plan that first deploys fixed cameras and then incorporates camera drones and ground robots. These are all connected to a control centre that assesses images and developments and intervenes as needed. The paper then discusses social and ethical, technical and scientific, and economic and structural perspectives. In conclusion, it considers what could happen in the future in this context." The proceedings can be accessed via dl.acm.org/doi/proceedings/10.1145/3565995.
At the end of the ACI conference, "Paper Session 6" was held, which was titled "Investigating Human-Animal Relations". Sarah Webber (University of Melbourne) gave a talk on "Watching Animal-Computer Interaction: Effects on Perceptions of Animal Intellect". In the experiment, people observed orangutans interacting with computer applications. It was examined how they changed their judgments regarding the animals' intelligence and behavior. The talk that followed came from Alexandra Morgan (Northumbria University) and was titled "Blind dogs need guides too: towards technological support for blind dog caregiving". She addressed the needs of blind dogs and showed what gadgets are on the market to assist them. Her team developed an app called "My Blind Dogo" that could help owners of blind dogs. The session ended with a talk on "A Face Recognition System for Bears: Protection for Animals and Humans in the Alps" by Oliver Bendel (University of Applied Sciences and Arts Northwestern Switzerland). He presented an integrated system with cameras, robots, and drones that Ali Yürekkirmaz and he had designed. The ACI took place from 5 to 8 December 2022 in Newcastle upon Tyne. It is the world's leading conference on animal-computer interaction. More information on the conference via www.aciconf.org/aci2022.
On the last day of the ACI Conference (December 8, 2022), "Session 5: Sensors & Signals, Part II: Electric Boogaloo" started after the lunch break. Carlos Alberto Aguilar-Lazcano (CICESE-UT3) gave a talk on the topic "Towards a monitoring and emergency alarm system activated by the barking of assistant dogs". The next presentation was "WAG'D: Towards a Wearable Activity and Gait Detection Monitor for Sled Dogs" by Arianna Mastali (Georgia Institute of Technology). According to her, studies have shown orthopedic injuries to be the most common injuries among sled dogs. The dogs love to run but repeatedly exceed their own limits. To address this problem, the team developed a special wearable that generates data on the animals' condition. "Spatial and Temporal Analytic Pipeline for Evaluation of Potential Guide Dogs Using Location and Behavior Data" was the title of the next talk, given by David L. Roberts (North Carolina State University), followed by "Comparing Accelerometry and Computer Vision Sensing Modalities for High-Resolution Canine Tail Wagging Interpretation", given by Devon Martin (North Carolina State University). More information on the conference via www.aciconf.org/aci2022.
After the keynote on the morning of December 8, 2022, ACI2022 continued with "Paper Session 4: Sensors & Signals, Part I: Origin Stories". David L. Roberts (North Carolina State University) presented on "Motion-Resilient ECG Signal Reconstruction from a Wearable IMU through Attention Mechanism and Contrastive Learning". The next talk, "TamagoPhone: A framework for augmenting artificial incubators to enable vocal interaction between bird parents and eggs", was given by Rebecca Kleinberger (Massachusetts Institute of Technology & Northeastern University). The starting point of her research was that some birds have pre-hatching vocal communication. The last presentation before the lunch break, which was given online, was "Simultaneous Contact-Free Physiological Sensing of Human Heart Rate and Canine Breathing Rate for Animal Assisted Interactions: Experimental and Analytical Approaches" by Timothy Holder and Mushfiqur Rahman (North Carolina State University). More information on the conference via www.aciconf.org/aci2022.
The fourth day of the ACI2022 conference – December 8, 2022 – began with a keynote by Carys L. Williams (Dogs Trust), titled "Time Savers or Toys? Realities of Animal Technology in Industry". "Carys is a mixed-methods Research Officer at the UK's largest dog welfare charity, Dogs Trust. Carys' work has focused on practical and applicable dog behaviour and welfare research to improve the lives of dogs, especially those in Dogs Trust's 22 rehoming centres (around 12,000 dogs a year!). For the last 2 years Carys has been project lead for the Dogs Trust Post Adoption Support longitudinal research project. She has additionally supported the charity's move to collect more and better dog data, helping build exciting bespoke digital systems. Carys has also spent over a decade in the zoo industry and is currently a volunteer invertebrate keeper at ZSL London Zoo." (Website ACI2022) Carys L. Williams started her keynote with a quote from Vladimir Dinets (University of Tennessee): "What is the best games console for my spider?" … She then turned to real-world issues, such as supporting the welfare of dogs through technological means. More information on the conference via www.aciconf.org/aci2022.
The ACI2022 conference continued on the afternoon of December 7, 2022, after the coffee break ("Paper Session 3: Learning From and With Each Other"). Cristóbal Sepulveda Álvarez (Universidad de Chile) gave a talk on the topic "Measuring Digitally Comparative Abilities Between Discreet and Continuous Quantities through a Digital Enrichment Application". He showed a parrot that had to choose different quantities on a touch screen. Dirk van der Linden (Northumbria University) was present on behalf of Jasmine Forester-Owen (Northumbria University). He spoke about "Noisy technology, anxious dogs: can technology support caregiving in the home?". In their prototype, they combine noise detection and body language identification in dogs. Jérémy Barbay (Universidad de Chile) gave the last three presentations of the day: "Comparing Symbolic and Numerical Counting Times between Humans and Non-Humans Through a Digital Life Enrichment Application", "Popping Up Balloons for Science: a Research Proposal", and "A Loggable Aid to Speech (for Human and Non-Human Animals): A Research Proposal". More information on the conference via www.aciconf.org.
The ACI2022 conference continued on the afternoon of December 7, 2022. "Paper Session 2: Recognising Animals & Animal Behaviour" began with a presentation by Anna Zamansky (University of Haifa). The title was "How Can Technology Support Dog Shelters in Behavioral Assessment: an Exploratory Study". Her next talk was also about dogs: "Do AI Models 'Like' Black Dogs? Towards Exploring Perceptions of Dogs with Vision-Language Models". She went into detail about OpenAI's CLIP model, among other things. CLIP is a neural network which learns visual concepts from natural language supervision. She raised the question: "How can we use CLIP to investigate adoptability?" Hugo Jair Escalante (INAOE) then gave a presentation on the topic "Dog emotion recognition from images in the wild: DEBIw dataset and first results". Emotion recognition based on face recognition is still in its infancy with respect to animals, but impressive progress is already being made. The last presentation in the afternoon before the coffee break was "Detecting Canine Mastication: A Wearable Approach" by Charles Ramey (Georgia Institute of Technology). He raised the question: "Can automatic chewing detection measure how detection canines are coping with stress?" More information on the conference via www.aciconf.org.
ACI2022 continued on December 7 with the paper sessions. Number 1 was "Designing for Human-Animal Relations". Clara Mancini from The Open University gave a talk on the topic "Politicising Animal-Computer Interaction: an Approach to Political Engagement with Animal-Centred Design". She is one of the pioneers in the discipline of animal-computer interaction. This was followed by Dirk van der Linden's presentation "Animal-centered design needs dignity: a critical essay on ACI's core concept". The scientist from Northumbria University referred to the Swiss law, which assumes the dignity of living beings – animals as well as plants, it should be added. Minori Tsuji from Future University Hakodate spoke about "Investigation on Enhancement of the Sense of Life in Safari Park Online Tours with Animal Breathing Reproduction System". Visitors can touch artifacts with different breathing frequencies. The final contribution in the morning came from Jennifer Cunha (Parrot Kindergarten) and Corinne Renguette (Indiana University-Purdue University). It was about "A Framework for Training Animals to Use Touchscreen Devices for Discrimination Tasks". The scientists taught various animals, such as a parrot, a rat, and a dog, how to use tablets. More information on the conference via www.aciconf.org.