LADYBIRD, the animal-friendly vacuum-cleaning robot, was conceived by Oliver Bendel in 2014, introduced at Stanford University (AAAI Spring Symposia) in 2017, and then implemented as a prototype at the School of Business FHNW. In the context of the project, a menu was proposed with which the user can adjust the morality of the vacuum-cleaning robot. As the name implies, it spares ladybirds. It should also let spiders live. However, if the user wants certain insects to be vacuumed up, this could be defined via the menu. It is important that LADYBIRD remains animal-friendly overall. The idea was to specify a proxy morality in detail via the menu: the vacuum-cleaning robot, as a proxy machine, does what its owner would do. In 2018 the moral menu (MOME) for LADYBIRD was created as a design study. It can be combined with other approaches and technologies. In this way, users could learn how others have decided and how the personal morality they have transferred to the machine is assessed. They could also be warned and enlightened if they want to vacuum up not only vermin but also spiders.
Fig.: Moral menu for LADYBIRD
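Such a moral menu with per-species settings could be represented as in the following minimal sketch. The class and method names (MoralMenu, shouldVacuum) and the species defaults are invented for illustration and are not taken from the project; the actual menu is a design study, not this code.

```java
// Hypothetical sketch of a moral menu for an animal-friendly vacuum robot.
// All names and defaults are illustrative, not from the LADYBIRD project.
import java.util.HashMap;
import java.util.Map;

public class MoralMenu {
    // Per-species toggles set by the owner; true = the robot may vacuum it.
    private final Map<String, Boolean> allowVacuum = new HashMap<>();

    public MoralMenu() {
        // Animal-friendly defaults: beneficial insects are spared.
        allowVacuum.put("ladybird", false);
        allowVacuum.put("spider", false);
        allowVacuum.put("fruit fly", true);
    }

    public void setAllowed(String species, boolean allowed) {
        // The robot remains animal-friendly overall:
        // the ladybird setting cannot be overridden by the owner.
        if (species.equals("ladybird")) {
            return;
        }
        allowVacuum.put(species, allowed);
    }

    public boolean shouldVacuum(String detectedSpecies) {
        // Unknown animals are spared by default (cautious proxy morality).
        return allowVacuum.getOrDefault(detectedSpecies, false);
    }

    public static void main(String[] args) {
        MoralMenu menu = new MoralMenu();
        System.out.println(menu.shouldVacuum("ladybird"));  // false
        menu.setAllowed("ladybird", true);                  // ignored by design
        System.out.println(menu.shouldVacuum("ladybird"));  // still false
        System.out.println(menu.shouldVacuum("fruit fly")); // true
    }
}
```

The fixed ladybird rule illustrates the constraint described above: the owner's proxy morality is adjustable, but only within an animal-friendly frame.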
A special session “Formalising Robot Ethics” will take place within the ISAIM conference in Fort Lauderdale (3 to 5 January 2018). The program is now available and can be viewed at http://isaim2018.cs.virginia.edu/program.html. “Practical Challenges in Explicit Ethical Machine Reasoning” is a talk by Louise Dennis and Michael Fisher; “Contextual Deontic Cognitive Event Calculi for Ethically Correct Robots” is a contribution by Selmer Bringsjord, Naveen Sundar G., Bertram Malle and Matthias Scheutz. Oliver Bendel will present “Selected Prototypes of Moral Machines”. A few words from the summary: “The GOODBOT is a chatbot that responds morally adequate to problems of the users. It’s based on the Verbot engine. The LIEBOT can lie systematically, using seven different strategies. It was written in Java, whereby AIML was used. LADYBIRD is an animal-friendly robot vacuum cleaner that spares ladybirds and other insects. In this case, an annotated decision tree was translated into Java. The BESTBOT should be even better than the GOODBOT.”
Fig.: Machine Ethics in Fort Lauderdale
The LADYBIRD project starts in March 2017. More and more autonomous and semi-autonomous machines make decisions that have moral implications. Machine ethics as a discipline examines the possibilities and limits of moral machines. In this context, Prof. Dr. Oliver Bendel has developed various design studies and thus submitted proposals for their appearance and functions. He has focused on animal-friendly machines that make morally sound decisions, and on chatbots with specific skills. The project is about a service robot which shall spare beneficial insects – a vacuum cleaner called LADYBIRD. An annotated decision tree modelled for this objective and a set of sensors will be used. Three students will work on the practice project at the School of Business FHNW from March to October. Since 2013, the principal, Oliver Bendel, has published several articles on LADYBIRD and other animal-friendly machines, e.g., “Ich bremse auch für Tiere” (“I also brake for animals”) in the Swiss magazine inside.it. The robot will be the third prototype in the context of machine ethics at the School of Business FHNW. The first one was the GOODBOT (2013), the second one the LIEBOT (2016). All these machines can be described as simple moral machines.
Fig.: The robot should spare the ladybird
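In strongly simplified form, an annotated decision tree of the kind mentioned above could look like the following sketch. The class, method names, branching conditions, and annotations are invented for illustration; the tree actually modelled in the project is more detailed and is not reproduced here.

```java
// Strongly simplified sketch of an annotated decision tree for an
// animal-friendly vacuum robot. All names and annotations are illustrative.
public class AnnotatedDecisionTree {

    // Each leaf returns an action plus the moral annotation justifying it,
    // mirroring the idea of attaching annotations to branches of the tree.
    public static String[] decide(boolean objectDetected,
                                  boolean isAnimal,
                                  boolean isBeneficial) {
        if (!objectDetected) {
            return new String[]{"continue", "nothing in the way"};
        }
        if (!isAnimal) {
            return new String[]{"vacuum", "dirt may be removed"};
        }
        if (isBeneficial) {
            // Annotation: beneficial insects such as ladybirds are spared.
            return new String[]{"stop and wait", "spare beneficial animals"};
        }
        // Annotation: other animals only as far as the owner has allowed.
        return new String[]{"vacuum", "owner-approved target"};
    }

    public static void main(String[] args) {
        String[] decision = decide(true, true, true);
        System.out.println(decision[0] + " – " + decision[1]);
    }
}
```

As the summary quoted elsewhere on this site notes, in the actual prototype such a tree was translated into Java; this sketch only shows the general pattern of pairing each decision branch with a moral justification.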
The proceedings of the AAAI conference 2016 were published in March 2016 (“The 2016 AAAI Spring Symposium Series: Technical Reports”). The symposium “Ethical and Moral Considerations in Non-Human Agents” was dedicated to the discipline of machine ethics. Ron Arkin (Georgia Institute of Technology), Luís Moniz Pereira (Universidade Nova de Lisboa), Peter Asaro (New School for Public Engagement, New York) and Oliver Bendel (School of Business FHNW) spoke about moral and immoral machines. The contribution “Annotated Decision Trees for Simple Moral Machines” (Oliver Bendel) can be found on pages 195–201. The abstract states: “Autonomization often follows after the automization on which it is based. More and more machines have to make decisions with moral implications. Machine ethics, which can be seen as an equivalent of human ethics, analyses the chances and limits of moral machines. So far, decision trees have not been commonly used for modelling moral machines. This article proposes an approach for creating annotated decision trees, and specifies their central components. The focus is on simple moral machines. The chances of such models are illustrated with the example of a self-driving car that is friendly to humans and animals. Finally the advantages and disadvantages are discussed and conclusions are drawn.” The proceedings can be ordered via www.aaai.org.
Fig.: Oliver Bendel, Cindy Mason, Luís Moniz Pereira and others in Stanford