Intelligent Systems Lab Project: Sensor/Actor Network for Adaptive Lighting Control
Participants
- Robert Abel
- Tobias Rodehutskors
- Daniel Seidel
Supervisors
- Dipl.-Ing. Stefan Herbrechtsmeier
- Dipl.-Ing. Thomas Wöstenkühler
Motivation
In conventional rooms, users have to control the lighting manually: each room has fixed switch locations that users must reach to change the lighting conditions.
An Intelligent Room can solve this problem by being “switchless”: light is controlled automatically depending on the situation within the room, the outside world, and user activity.
Moreover, conventional lighting does not adapt to inside and outside conditions: the light intensity inside is independent of the intensity outside, and lights can usually only be switched on or off or must be dimmed manually.
To solve this problem, the project aims to implement a lighting control that behaves in accordance with the inside and outside lighting conditions at all times. It should react to the light intensity and color (mood) outside and adjust the light inside.
Application Scenario
There are several situations where this project improves and facilitates light handling for the user:
A user enters the Intelligent Room when it is dark, either because it is dark outside or because there is insufficient light despite daylight outside, and has to turn the light on manually. We improve this scenario by using sensors to detect users and turning the lights on automatically when a user enters the room and the room is too dark.
A user sits down at their work desk and normally has to turn on a table light. Our sensor-enhanced lighting control turns the lights on automatically as soon as the user sits down.
Furthermore, the light adapts its shade to be cool or slightly blue, making it more suitable for strenuous work.
The light shade in the living room adapts as well. For example, if it is very hot outside, the light may take on a blue tint to give the room a cooler atmosphere; conversely, it may increase its red/orange components to produce a warm atmosphere on cold or rainy days. Another adjustment is to produce comfortable light while watching television or playing console games.
Objectives
The project goals are:
- an effortless, user-centered lighting experience,
- lighting adapted to outside light conditions,
- consistent lighting adaptation according to the users' activity and desired mood,
- use of the existing lighting control standard DMX [3] to control the lighting equipment.
Description
Our project consists of:
- three EES boards [1], each containing a microcontroller, a Spartan-3E FPGA, and a Bluetooth® chip,
- sensors:
- RGB sensors
- light intensity sensors
- two motion sensors
- microphone
- actuators:
- DMX controller
- USB power socket
- RGB spots
- Ambient Intelligence group's light frame
- Android application for user activity pattern recognition [2] and manual control.
Our project setup divides the Intelligent Room into two compartments: the upper area is the working area, while the lower area is the living room of our scenario.
The green icons are the sensor and actuator EES boards, the red icons are sensors, and the yellow icon is the main light source, the light frame. The blue lines represent the compartmentalization of the Intelligent Room and indicate the sensitive areas of the motion sensors we employed.
The EES boards and the Android smartphones communicate with each other in a network via Bluetooth® technology.
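One plausible way to frame the messages on such a serial Bluetooth® link is sketched below in C; the sync byte, the message types, and the XOR checksum are our own illustrative assumptions, not the project's documented protocol.

    /* Hypothetical wire format for the boards' Bluetooth® link; the real
     * protocol is not documented here. Assumes a transparent serial
     * (SPP-style) byte stream between the boards. */
    #include <stddef.h>
    #include <stdint.h>

    enum msg_type {                /* assumed message types */
        MSG_RGB_OUTSIDE = 0x01,    /* 3 payload bytes: R, G, B */
        MSG_LIGHT_LEVEL = 0x02,    /* 2 payload bytes: intensity, big-endian */
        MSG_MOTION      = 0x03,    /* 1 payload byte: zone id */
        MSG_ACTIVITY    = 0x04     /* 1 payload byte: recognized activity */
    };

    /* Frame layout: [0xAA sync][type][len][payload...][XOR checksum].
     * Returns the frame length, or 0 if it does not fit into `out`. */
    size_t frame_msg(uint8_t *out, size_t cap, uint8_t type,
                     const uint8_t *payload, uint8_t len)
    {
        if (cap < (size_t)len + 4)
            return 0;
        uint8_t sum = type ^ len;
        out[0] = 0xAA;
        out[1] = type;
        out[2] = len;
        for (uint8_t i = 0; i < len; i++) {
            out[3 + i] = payload[i];
            sum ^= payload[i];
        }
        out[3 + len] = sum;
        return (size_t)len + 4;
    }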
Each board has a unique role: the actuator EES board controls the lighting via DMX and receives sensory input from the sensor board and the motion detection board.
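To give an idea of what controlling lighting via DMX involves, the following C sketch builds a DMX512 frame for a single RGB spot. The channel assignment (channels 1-3) is an assumption for illustration; the physical layer defined by ANSI E1.11 [3] (break, mark-after-break, 250 kbaud 8N2 serial output) is hardware-specific and omitted.

    /* Sketch of a DMX512 frame for one RGB spot, assumed to listen on
     * channels 1-3 (R, G, B). */
    #include <stdint.h>
    #include <string.h>

    #define DMX_SLOTS 512

    typedef struct {
        uint8_t start_code;        /* 0x00 = dimmer data per ANSI E1.11 [3] */
        uint8_t slots[DMX_SLOTS];  /* one intensity byte (0-255) per channel */
    } dmx_frame_t;

    static void dmx_init(dmx_frame_t *f)
    {
        f->start_code = 0x00;
        memset(f->slots, 0, sizeof f->slots);
    }

    /* Set an RGB fixture whose first DMX channel is `base` (1-indexed). */
    static void dmx_set_rgb(dmx_frame_t *f, uint16_t base,
                            uint8_t r, uint8_t g, uint8_t b)
    {
        f->slots[base - 1] = r;
        f->slots[base]     = g;
        f->slots[base + 1] = b;
    }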
The Android application running on the smartphone recognizes various user activities such as standing, running, sitting and forwards them to the actuator EES board as well.
In addition, the Android application enables the user to control the lighting manually (see screenshot below).
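The activity recognition mentioned above follows the accelerometer-based approach of [2]. A much simplified C sketch of such a classifier is shown below; the variance feature, the tilt heuristic for telling sitting from standing, and all cutoff values are invented for illustration and are not the app's actual implementation.

    /* Simplified sketch of accelerometer-based activity detection in the
     * spirit of [2]; window size, tilt heuristic and cutoffs are invented. */
    #include <stddef.h>

    typedef enum { ACT_SITTING, ACT_STANDING, ACT_RUNNING } activity_t;

    /* Classify a window of acceleration magnitudes (m/s^2): high variance
     * suggests running; otherwise the device tilt separates sitting (phone
     * in the pocket of a seated user, strongly tilted) from standing. */
    activity_t classify_window(const double *mag, size_t n, double tilt_deg)
    {
        double mean = 0.0, var = 0.0;
        for (size_t i = 0; i < n; i++)
            mean += mag[i];
        mean /= (double)n;
        for (size_t i = 0; i < n; i++)
            var += (mag[i] - mean) * (mag[i] - mean);
        var /= (double)n;

        if (var > 4.0)                    /* invented cutoff for motion */
            return ACT_RUNNING;
        return (tilt_deg > 30.0) ? ACT_SITTING : ACT_STANDING;
    }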
The boards are responsible for the following sensors and actuators:
- actuator board:
- DMX controller
- microphone
- sensor board:
- outside RGB sensor
- outside light intensity sensor
- motion board:
- motion detection
- inside RGB sensor
All decisions are based on a state machine running on the actuator EES board. It incorporates all sensory and user inputs and integrates them to provide the correct lighting at all times.
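A condensed C sketch of such a state machine follows; its states, inputs, timeout, and darkness threshold are illustrative stand-ins rather than the values used on the board.

    /* Condensed sketch of an occupancy/lighting state machine; states,
     * thresholds and the timeout are illustrative, not the real values. */
    #include <stdbool.h>
    #include <stdint.h>

    typedef enum { ROOM_EMPTY, ROOM_LIVING, ROOM_WORKING } room_state_t;

    typedef struct {
        bool     motion_work;    /* motion sensor covering the working area */
        bool     motion_living;  /* motion sensor covering the living area */
        uint16_t lux_inside;     /* inside light intensity reading */
    } inputs_t;

    #define LUX_DARK   120   /* invented hard-coded darkness threshold */
    #define IDLE_TICKS 600   /* invented ticks without motion until "empty" */

    /* Advance the occupancy state; `idle` counts ticks without motion. */
    static room_state_t step(room_state_t s, const inputs_t *in,
                             uint16_t *idle)
    {
        if (in->motion_work)   { *idle = 0; return ROOM_WORKING; }
        if (in->motion_living) { *idle = 0; return ROOM_LIVING; }
        if (++*idle >= IDLE_TICKS) { *idle = IDLE_TICKS; return ROOM_EMPTY; }
        return s;
    }

    /* Lights are on only while the room is occupied and too dark. */
    static bool lights_on(room_state_t s, const inputs_t *in)
    {
        return s != ROOM_EMPTY && in->lux_inside < LUX_DARK;
    }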
Results
The project targets have been accomplished to our satisfaction. The system is able to react to basic input gathered from conditions inside and outside the room. The following points are some of the achieved results and limiting factors of the system.
Our basic goals were all met:
- the basic system, i.e. networking via Bluetooth®, sensing user data, and controlling lighting via DMX, has been set up and fully tested,
- all sensors and actuators were successfully integrated with the respective boards,
- sensible hard-coded thresholds for sensor values have been determined,
- a working state machine integrating all inputs into one smooth user experience has been programmed,
- lighting adapts to sensor data, but is still controllable manually by the user.
However, some caveats remain:
The main processing is done on the actuator microcontroller, which puts undue strain on this resource. The other boards could perform partial calculations and communicate the results via Bluetooth® instead of raw sensor data.
Only one Android user can access the system at a time.
Discussion and Conclusion
At first, we planned to use only two EES boards. However, during development we changed the setup to include a third EES board to overcome problems with sensor placement and communication. We also had to add a PC and PC-side Bluetooth® server software to connect to the USB power sockets.
A suitable solution in this scenario was to let all devices communicate with each other. At first, we did not intend to use a server for essential tasks. However, since the server became a requirement rather than an optional resource, the communication should be centralized on the PC server to allow better handling of the aggregated data in future work.
Outlook
The presented work might be improved upon in the following respects. To make the lighting control reusable, a connection through the RSB middleware could be implemented. This improvement would allow other groups working in the Intelligent Systems Lab to access all sensors and actuators in a common, standardized way. Some centralization is needed to realize this, as the microcontroller boards cannot be connected to RSB directly; instead, all sensor and actuator information has to be aggregated and sent to a central server. Communication between the server and the sensor/actuator nodes, however, should still be done via Bluetooth®.
In the current setup, the states of the individual lights depend on hard-coded thresholds for each sensor. This approach might be replaced by a supervised learning system for controlling the desired illumination and light mood. In the training phase, a user chooses a favored lighting on their smartphone. This target is combined with a feature vector, composed of different sensor values and derived features, into a single training instance. Time-intensive evaluation and testing would be required to make such a system robust.
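To make the notion of a training instance concrete, one possible layout is sketched below in C; all field names and the selection of derived features are hypothetical.

    /* Hypothetical layout of one training instance: a feature vector of
     * sensor readings and derived features, paired with the user's choice. */
    #include <stdint.h>

    typedef struct {
        uint8_t  outside_r, outside_g, outside_b;  /* outside RGB sensor */
        uint16_t lux_outside;                      /* outside light intensity */
        uint16_t lux_inside;                       /* inside light intensity */
        uint8_t  activity;                         /* from the Android app */
        uint8_t  hour_of_day;                      /* derived temporal feature */
    } features_t;

    typedef struct {
        features_t x;                              /* snapshot at choice time */
        uint8_t    target_r, target_g, target_b;   /* user's favored lighting */
    } training_instance_t;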
References
[1] René Griessl, Dokumentation: EES Board V3, http://wwwhni.uni-paderborn.de/fileadmin/hni_sct/lehre/ees/download/ees_1011/EES_Board.pdf (26.10.2011)
[2] Sauvik Das et al., “Detecting User Activities using the Accelerometer on Android Smartphones”, Team for Research in Ubiquitous Secure Technology (TRUST), http://www.truststc.org/reu/10/Reports/DasGreenPerezMurphy_Paper.pdf (01.03.2012)
[3] ANSI E1.11-2008, http://webstore.ansi.org/RecordDetail.aspx?sku=ANSI+E1.11-2008 (01.03.2012)