Insight into the development

In the final design, the body can move with 12 degrees of freedom and the head with 3. There were, of course, also numerous design studies for the display:

https://www.youtube.com/playlist?list=PLRYtaQ-AMUMrN1ZFNsoPXWRQQLXOID4v3

https://www.youtube.com/playlist?list=PLw67VJAuUXEw9uSg4IcOIo0tNaQxBzY9H

For the recognition of emotions, different components were developed:

  • a voice control system, so that the dog "listens to the word" (if it wants to)
  • sentiment analysis based on words, so that the dog can react differently to friendly commands than to unfriendly ones (if it wants to)
  • emotion recognition based on spoken language, so that the dog can distinguish between happy, sad, and aggressive commands
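
The word-based sentiment analysis can be sketched as a tiny keyword classifier. The word lists and the `classify_command` function below are illustrative assumptions for this post, not the project's actual implementation, which would more plausibly use a trained model:

```python
# Minimal sketch of word-based sentiment analysis for voice commands.
# The keyword lists are illustrative stand-ins, not the real vocabulary.
FRIENDLY = {"please", "good", "nice", "thanks"}
UNFRIENDLY = {"stupid", "bad", "shut", "quiet"}

def classify_command(text: str) -> str:
    """Classify a spoken command as friendly, unfriendly, or neutral."""
    words = set(text.lower().split())
    friendly = len(words & FRIENDLY)
    unfriendly = len(words & UNFRIENDLY)
    if friendly > unfriendly:
        return "friendly"
    if unfriendly > friendly:
        return "unfriendly"
    return "neutral"
```

With this, `classify_command("sit down please")` returns `"friendly"`, so the dog could choose to obey, while an unfriendly command could be met with a bark.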

By the way, the dog can also bark.

All components were integrated in hardware and software. The Robot Operating System (ROS) served as the software foundation and communication platform, and the developed software components ran distributed across an NVIDIA Jetson, an Intel Atom board, two Raspberry Pis, and a notebook. A tablet served as the remote control and voice input device.

All mechanical parts were designed in the CAD tool CATIA and then 3D-printed at THI, which allowed design iterations to be implemented very quickly.