Intelligent Systems Lab Project: InfoPlant
Participants
- Alex Walender
- Martin Sauer
- Jonas Gerlach
- Niels Krömker
- Jonas Betzendahl
Supervisors
- Thomas Herrmann
- Jan Hammerschmidt
- Sebastian Zehe
Motivation
- In the 21st century, people are confronted with an avalanche of data from an ever greater number of sources.
- When this data is displayed on screens, it creates distractions and crudely interrupts workflow or leisure experience.
- By representing part of this information in the physical environment rather than on a screen (i.e. as an ambient, non-visual display), the same information is still conveyed, but unobtrusively.
- Users want to attend to the data "on demand" and in a pleasant way.
Application Scenario
A typical scenario in which this project could be used is the modern home or workspace. Instead of displaying all (eco-related) data on a computer screen, we envision a biological plant that is able to display information through one of its many actuators. For example, the plant could shine in a specific colour when there are new messages for the user, or the system could react to excessive power consumption by refusing to give water to the plant, making it look thirsty. All of this happens in an ambient setting, moving data that would otherwise be shown on screen into the environment and thereby relieving the user of unnecessary distractions and obtrusions in his or her workflow or home experience. The plant could be placed in any office, home office or even a living room.
Objectives
The project goal was to use an actual, biological, living plant, equipped with technological actuators, as an ambient interface to display data and create eco-feedback for the user (e.g. how her power usage relates to that of previous months or to a fixed upper limit).
Description / Results
The set of features realised in this project encompasses a number of elements, described in more detail below. The sensors and actuators for these features were originally controlled via BRIX2 microcontrollers (developed by Sebastian Zehe @ CITEC) in the first semester, but were moved to an Arduino Uno in the second term, which in turn was controlled and coordinated by a Raspberry Pi connected to the internet.
- Humidity
For measuring and actuating the humidity of the plant's soil, a humidity sensor and a USB-powered water pump were attached to the plant. Both communicate successfully with the Raspberry Pi, which runs a Python app that compares the current humidity level to a pre-programmed threshold. If the humidity falls below the threshold, the app activates the pump to water the plant as necessary.
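As an illustration of this watering logic, a minimal sketch could look like the following; the sensor and pump helpers as well as the threshold and timing values are placeholders, not the project's actual code:

```python
import random
import time

HUMIDITY_THRESHOLD = 0.35   # illustrative threshold (0.0 = bone dry, 1.0 = saturated)
PUMP_SECONDS = 2            # short burst to avoid over-watering

def read_humidity():
    """Placeholder for reading the soil humidity sensor via the Arduino."""
    return random.uniform(0.0, 1.0)

def run_pump(seconds):
    """Placeholder for switching the USB-powered water pump on for a short burst."""
    print(f"pump on for {seconds}s")

while True:
    if read_humidity() < HUMIDITY_THRESHOLD:
        run_pump(PUMP_SECONDS)
        # soil humidity reacts to watering only with a large delay (see Discussion),
        # so wait a while before checking again
        time.sleep(10 * 60)
    else:
        time.sleep(60)
```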
- Illumination
57 LEDs have been installed on the inside of the plant's pot to shine on the plant in different colours. Due to memory limitations on the Arduino platform, not all LEDs are controllable individually, so we implemented an API to control them in groups of three. Colour values in both the HSV and the RGB colour space can be passed to the Raspberry Pi. Furthermore, the activation waveform can be chosen as either linear or logarithmic (the latter appearing linear in subjective brightness to the human eye). Communication between the Arduino and the Raspberry Pi also works adequately.
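To make the perceptually linear option concrete, here is a small sketch of how a brightness ramp can be gamma-corrected before an HSV colour is converted to RGB PWM values; the gamma constant and the group addressing are assumptions for illustration, not the actual firmware:

```python
import colorsys

GAMMA = 2.2  # assumption: a typical gamma value; the actual firmware curve may differ

def hsv_to_pwm(h, s, v, perceptual=True):
    """Convert an HSV colour (components in [0, 1]) to 8-bit RGB PWM values.

    With perceptual=True the value channel is gamma-corrected so that a linear
    ramp of `v` appears roughly linear in brightness to the human eye.
    """
    if perceptual:
        v = v ** GAMMA
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return tuple(int(round(c * 255)) for c in (r, g, b))

# Example: fade one (hypothetical) group of three LEDs from off to full green
for step in range(0, 101, 10):
    rgb = hsv_to_pwm(h=1/3, s=1.0, v=step / 100)
    print(f"group 2 -> {rgb}")   # in the real system this would go to the Arduino
```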
- Ventilation
Four ventilators have been attached to the casing of the plant to rustle the leaves. These fans are individually controlled, can communicate notifications, support different activation waveforms (sine, triangle) and already interact actively with the LEDs. The only remaining problem is the excess noise the fans produce while active.
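A minimal sketch of the two activation waveforms (illustrative values, not the Arduino code itself):

```python
import math

def fan_waveform(t, period, shape="sine"):
    """Return a fan duty cycle in [0, 1] for time t (seconds).

    Sketch of the two activation shapes mentioned above; the real firmware
    may scale or offset these differently.
    """
    phase = (t % period) / period
    if shape == "sine":
        return 0.5 * (1 - math.cos(2 * math.pi * phase))   # smooth rise and fall
    elif shape == "triangle":
        return 2 * phase if phase < 0.5 else 2 * (1 - phase)
    raise ValueError(f"unknown shape: {shape}")

# Example: sample a 4-second sine cycle for one fan
samples = [round(fan_waveform(t, period=4.0), 2) for t in range(5)]
print(samples)  # [0.0, 0.5, 1.0, 0.5, 0.0]
```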
- Tugging
Another degree of freedom exposed for the plant is the angle at which the branches are positioned. For a few major branches, servos were installed and connected to the branch via nylon wire. The servos turned out to be less reliable than expected and too weak to properly position thicker branches.
- Touch
Human interaction with the plant is an integral part of the ambient setting. To facilitate this, sensors have been installed to register when certain leaves of the plant are touched by a human hand. They communicate so-called touch events to the Raspberry Pi after filtering and checking against an adjustable sensitivity threshold. Regrettably, the sensor for touch events appears to interfere with the humidity sensor and requires improvement. In principle, this should allow detecting not only whether but also where or how the plant is touched; however, the development of more complex classifiers is beyond the scope of this short project.
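The filtering and threshold step can be sketched as follows; the moving-average window, threshold value and raw readings are made up for illustration:

```python
from collections import deque

class TouchDetector:
    """Sketch of the filtering + threshold step for raw touch-sensor readings.

    The window size and threshold below are illustrative, not the values
    used on the actual plant.
    """

    def __init__(self, threshold=40.0, window=8):
        self.threshold = threshold            # adjustable sensitivity threshold
        self.readings = deque(maxlen=window)  # simple moving-average filter

    def update(self, raw_value):
        """Feed one raw sensor reading; return True if a touch event fired."""
        self.readings.append(raw_value)
        filtered = sum(self.readings) / len(self.readings)
        return filtered > self.threshold

detector = TouchDetector()
for value in [3, 5, 4, 80, 95, 90, 88, 85]:   # made-up raw readings
    if detector.update(value):
        print("touch event -> forward to Raspberry Pi")
```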
- Applications
A unifying Python script on the Raspberry Pi, making use of multiple cores, controls all other features and handles the interaction between them well. Furthermore, a Twitter bot runs as an application, so that interactions with the @isyplant account can have effects on the plant. As always, better documentation and cleaner code are areas where work remains to be done.
Results of the second term (summer term 2015)
Maybe the biggest change in the second term was the switch from the BRIX platform to the Arduino. Originally, we planned to use an Arduino Leonardo, as that is the platform BRIX2 is based on, but then ran into delivery problems and ultimately went with an Arduino Uno. The code itself, however, transferred over mostly without problems; the move from multiple BRIX modules to one master sketch only involved minor refactoring and some tailoring of memory-heavy code sections.
Another major undertaking of the second semester was re-writing the Python interface for the Raspberry Pi. It is now thread-safe, can communicate over RSB - as is necessary for integration into the intelligent apartment and working space at CITEC - and supports user-written apps.
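A minimal sketch of what "thread-safe" means here, assuming a pyserial connection between the Raspberry Pi and the Arduino and a simple line-based command format (both assumptions, not the project's actual protocol):

```python
import threading

import serial  # pyserial; assumed transport between Raspberry Pi and Arduino

class ArduinoLink:
    """Sketch of a thread-safe wrapper around the serial connection.

    The device path, baud rate and command format are assumptions; the point
    is that a single lock serialises access from multiple app threads.
    """

    def __init__(self, port="/dev/ttyACM0", baudrate=115200):
        self._serial = serial.Serial(port, baudrate, timeout=1)
        self._lock = threading.Lock()

    def send_command(self, command):
        with self._lock:  # only one app thread talks to the Arduino at a time
            self._serial.write((command + "\n").encode("ascii"))
            return self._serial.readline().decode("ascii").strip()

# e.g. link = ArduinoLink(); link.send_command("LED 2 128 255 64")
```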
Each app is automatically given a priority rating (comparable to the nice level in UNIX systems) such that, for example, while an app may choose to deny the plant a surplus of water to make it look thirsty, a system app (written by the developers) can monitor the humidity and prevent the plant from ultimately dying. We also implemented apps that interact with online services like Twitter and Gmail, notifying the user of new messages.
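The arbitration idea can be sketched like this; the class and app names are hypothetical and only illustrate the priority concept, not the project's actual scheduler:

```python
class PlantApp:
    """Sketch of the priority idea: lower value = higher priority (like nice)."""

    def __init__(self, name, priority, is_system=False):
        self.name = name
        self.priority = priority
        self.is_system = is_system

def resolve_watering(requests):
    """Given (app, wants_water) pairs, let the highest-priority app decide."""
    app, wants_water = min(requests, key=lambda pair: pair[0].priority)
    return wants_water

eco_app = PlantApp("eco-feedback", priority=10)                      # may deny water
guardian = PlantApp("system-guardian", priority=0, is_system=True)   # keeps the plant alive

# eco-feedback wants the plant to look thirsty, but the system app overrules it
print(resolve_watering([(eco_app, False), (guardian, True)]))  # True
```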
For a demonstration and live-fire test, we worked together with team BRAWO (BRain At WOrk) to visualise a user's brain state with the plant. The data from BRAWO's headset was broken down, normalised and sent to the plant via RSB. We then read out the data in a Python app written for this purpose and lit up the LEDs accordingly. Of course, this was only a proof of concept, and many more degrees of freedom of the plant could be used to display more complex input data.
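The mapping itself can be sketched as follows; the colour scheme and the set_leds() placeholder are illustrative choices, and the RSB transport code is omitted:

```python
def brain_state_to_hsv(value):
    """Map a normalised brain-state value in [0, 1] to an HSV colour.

    Illustrative mapping only: calm (0.0) -> blue, highly active (1.0) -> red.
    """
    value = min(max(value, 0.0), 1.0)      # clamp, in case normalisation drifts
    hue = (1.0 - value) * (2.0 / 3.0)      # 2/3 = blue, 0 = red
    return hue, 1.0, value                 # full saturation, brightness follows value

def set_leds(h, s, v):
    """Placeholder for the real LED interface on the Raspberry Pi."""
    print(f"LEDs -> H={h:.2f} S={s:.2f} V={v:.2f}")

# Example: a few incoming samples as they might arrive over RSB
for sample in (0.1, 0.4, 0.9):
    set_leds(*brain_state_to_hsv(sample))
```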
Other than this, all aspects of the project were made more robust and/or reliable. We also went back and improved code quality and documentation in many areas, so that future groups working on the same topic can build on our code with ease.
The following video demonstrates the current development state of the plant system and all its features:
Discussion and Conclusion
Although ultimately successful, the project faced some setbacks along the way. Originally, more features were planned (for example, the plant was to be placed on a rotating plate, or be equipped with small, portable speakers to allow for even more ways to convey input data). Some problems we faced were unreliable hardware and unanticipated software complexity (requiring multiple rewrites to meet memory limitations), as well as some of the hardware components developing a rather excessive amount of noise, interfering with the goal of unobtrusiveness.
Furthermore, sensor data was in part significantly less accurate than hoped (e.g. humidity, touch), calling for the development of more robust signal processing and classification. Watering the plant also turned out to be difficult to control due to an unexpectedly large latency between pump actuation and humidity sensing.
Outlook
The project itself is finished. However, we still see possible areas for further refinement or continuation that future groups could work on:
- Integration of the plant into an actual smart environment, like the Intelligent Apartment @ CITEC, and setting up actual eco-feedback mechanisms via a smart meter
- Additional actuators, like rotation or a sound system
- Application development, as in a more varied selection of apps
- App management: a GUI to display and manage what apps are running