My life with Android :-)
http://mylifewithandroid.blogspot.com/feeds/posts/default





Updated: 2018-02-16T06:52:49.096+01:00

 



Data from the light bulb

2017-12-23T19:46:55.636+01:00

Once I read a blog post from Oona Räisänen that really inspired me. In her project she encoded a digital signal into the referee's whistle. I asked myself if there is any other household object that emits radiation and could be used as a carrier for a low-speed digital signal. I looked around and there was a light bulb.

Using light as a signal transmitter is not at all a novelty. Beside all that optoelectronics stuff, recently Apple stirred some waves with their Li-Fi announcements that promise ultra-fast communication by means of a light bulb. I had something else in mind. I did not care about the speed; ad-hoc transmission of small data chunks that carry location IDs or sensor measurements was much more interesting for me. The use case would be that you walk with your smartphone into a room lit by an innocent-looking light bulb and presto, the smartphone picks up info from the light bulb without any special arrangement, e.g. you don't have to point your device anywhere. Speed and data size are of secondary concern in this use case, which is quite common in the world of the Internet of Things (IoT). Also, the light from the light bulb must look completely ordinary to a human observer, e.g. no blinking, etc. This post is a tale of that adventure.

As I had prior experience with reading infrared signals from ordinary remote controls and sending them to the smartphone, I started with the method IR remote controls use to communicate with their receivers. Very briefly, if you have not yet met this system: the emitter sends out periods of "0s" (no light) and "1s" (IR light modulated by a certain frequency, typically 38 kHz). It is the length of these periods that carries the information. The common approach is that after a series of sync bursts, a digital 0 is encoded as a no light-light sequence of specified periods. A digital 1 is encoded similarly, except that the no light-light periods are different.
You can observe this operation if you watch the LED of an ordinary remote control through a smartphone camera. As the smartphone camera sensor "sees" in infrared (even though the manufacturers try to filter out this behaviour), you will see flashes of infrared light if you push a button on the remote.

Compared to the IR remote, we have an additional requirement: the human observer is not allowed to notice that the light bulb is doing something weird. For this purpose, I changed the modulation scheme. "No light" does not mean that the light bulb is switched off (that would result in a very annoying blinking sensation that humans are very sensitive to); only the modulation frequency is changed. In our system, the "light" periods are modulated with 38 kHz so that the popular IR remote decoder chips can be used, and the "no light" periods with 48 kHz. For the steep band-pass input filters of those IR receiver chips, 48 kHz is essentially "no light". Considering the very noisy environment in the visible light domain, I also changed a popular IR modulation scheme: now the 0 bit is 564 microsec of 48 kHz signal followed by 564 microsec of 38 kHz signal, and the 1 bit is 1692 microsec of 48 kHz signal followed by 564 microsec of 38 kHz signal. The whole payload is 20 bits long, allowing the transmission of a 16-bit value and a 4-bit data type selector. The data type selector lets the emitter send multiple types of data sequentially. In our demo, these are: station ID (for location), temperature and humidity (obtained from a DHT-22 sensor on the emitter board).

The system therefore consists of 3 elements: the emitter circuit that drives the light source, the adapter that receives these light signals and adapts them to the smartphone and finally the smartphone that acts upon those[...]
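The timing scheme above maps cleanly onto a table of alternating "no light"/"light" durations. The sketch below is a hypothetical illustration of that mapping (function and constant names are my own, not the author's actual emitter code): it encodes the 20-bit payload (16-bit value plus 4-bit type selector, MSB first) into microsecond durations that a 38/48 kHz modulator would then play back.

```c
#include <stdint.h>
#include <stddef.h>

/* Durations from the post: a "space" is modulated at 48 kHz (seen as
   "no light" by the IR receiver chip), a "mark" at 38 kHz ("light"). */
#define SPACE_0_US   564   /* 0 bit: 564 us of 48 kHz ...            */
#define SPACE_1_US   1692  /* 1 bit: 1692 us of 48 kHz ...           */
#define MARK_US      564   /* ... each followed by 564 us of 38 kHz  */
#define PAYLOAD_BITS 20

/* Encode value+type into alternating space/mark durations (us).
   'durations' must hold at least 2*PAYLOAD_BITS entries.
   Returns the number of entries written. */
size_t encode_payload(uint16_t value, uint8_t type, uint32_t *durations)
{
    uint32_t payload = ((uint32_t)(type & 0x0F) << 16) | value;
    size_t n = 0;
    for (int bit = PAYLOAD_BITS - 1; bit >= 0; bit--) {
        durations[n++] = ((payload >> bit) & 1) ? SPACE_1_US : SPACE_0_US;
        durations[n++] = MARK_US;
    }
    return n;
}
```

A real emitter would walk this table and switch the modulation frequency accordingly; the receiver-side IR decoder chip only ever sees the 38 kHz bursts.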



Android weather station with a solar-powered BLE sensor

2017-06-14T01:25:00.937+02:00

The ultimate test of low energy consumption is a sensor that can survive on its own, without maintenance. My Android weather station supported by BLE weather sensors has been functioning for more than a year, but this year has not passed without adventures on the battery front. First the station was powered by 2 AA NiMH batteries - that was 2 weeks of lifetime. Then came the motorcycle battery, which took much longer to expire, but eventually the battery itself failed. Now the 2 sensors run on a discarded laptop battery which may not be able to power a laptop but nicely powers the two sensors with their combined 5 mA consumption.

5 mA, however, is a lot, so when I found this solar-powered lamp at Jysk, I immediately realized that I had to turn the lamp into the solar-powered version of this weather sensor. Why another weather sensor? Because I wanted to concentrate on the solar-powering aspects and wanted to reuse as much as possible from the old sensor. This prototype may serve as a template, however, for different kinds of sensors too.

Let's see first the solar lamp that I used as a base.

Solar lamp already containing the weather sensor. The two red LEDs indicate that the solar cell is charging the battery.

This is a quite cheap device with a solar cell on top and a circuit built around the XD5252F LED driver that takes care of everything from the charging of the small NiMH battery (if there's sunlight) to switching on the LED (if there's darkness). Unfortunately the circuit is so specialized to solar LED lamps that I could not reuse much of it except for the solar panel and the LED itself. The solar panel is not very high-powered; it is a 2V, 20 mA cell. So it became clear immediately that the Android client app has to work more (consuming more energy) to obtain sensor data while the sensor has to sleep more to conserve its own battery, which charges only very slowly from the low-powered solar cell.
Also, surviving the night (or longer periods without sufficiently strong sunlight) requires a quite beefy battery in the sensor if we want it to transmit BLE messages to the Android application frequently enough.

Click here to download the sources of the Android application. Read this post to figure out how to create an Android Studio project from the downloaded sources.

The previous Android app has therefore been changed so that instead of 15 seconds of scanning, it now scans for 70 seconds. The sensor sleeps 60 seconds, then transmits the measurements for 5 seconds. This results in a quite low daily energy consumption of about 4 mAh that even the low-powered solar cell can refill if sunny periods occur from time to time. To make sure that the sensor survives long without enough sunlight, a 2700 mAh Li-Ion battery was installed (of the 14500 type, with the AA form factor). As in the previous version, the measurement data is transmitted in the BLE advertisement packets. I wanted to transmit a battery indicator in this case too, so I dropped one byte from the 8-byte long station ID (so it is now 7 bytes long) and instead of that byte the supply voltage of the microcontroller is now transmitted. It is generally 3.3V; if it drops below that, then the battery is really not charging. This additional measurement data required that the sensor's UUID be changed; that's how the Android app recognizes this new parameter and displays it in a graph.

Battery indicator in the measurement screen of the new sensor

The schematics of the sensor can be seen below (click to enlarge).

Sensor circuit installed into the solar lamp case

Nothing much changed from the previous version, except for the solar cell-battery charger power chain. I wanted to save myself the pain of designing a Li-Ion charger so I used building blocks already available: this DC-DC converter to produce 5V from the solar cell's varying output voltage and this battery charger circuit to take care of the Li-Ion battery.
The result is a less than optimal efficiency (almost 50% of the solar cell's energy is lost during the[...]
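The 60 s sleep / 5 s transmit duty cycle can be sanity-checked with a back-of-the-envelope calculation. The transmit and sleep currents below are my assumptions for illustration (roughly plausible for an nRF51-class advertiser); the post only states the resulting ~4 mAh/day figure.

```c
/* Average daily consumption of a duty-cycled sensor.
   tx_s / sleep_s: seconds spent transmitting / sleeping per cycle,
   tx_ma / sleep_ma: current drawn in each state (mA). Returns mAh/day. */
double daily_mah(double tx_s, double sleep_s, double tx_ma, double sleep_ma)
{
    double avg_ma = (tx_ma * tx_s + sleep_ma * sleep_s) / (tx_s + sleep_s);
    return avg_ma * 24.0;
}
```

With assumed currents of 2 mA while advertising and 3 µA asleep, `daily_mah(5, 60, 2.0, 0.003)` comes out a little under 4 mAh/day, consistent with the figure in the post; a 2700 mAh cell then covers many months without any sunlight at all.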



Adding more power to the BLE-enabled Christmas light

2017-01-02T21:55:28.718+01:00

The truth is that the low-voltage LED strip I used in the previous post was a backup solution. Originally I bought a 230V-operated Christmas light with two independent LED strips, but adapting that beast to Bluetooth Low Energy turned out to be a bit more problematic than I expected. I had to learn a bit about power electronics first.

The LED light I used as a base in this post is a standard-issue Chinese-made device. Below you can see how it looks, its original controller already stripped of its plastic protective housing.

The circuit is very similar to this one, except that mine had only two LED strips instead of 4. In my version the controller chip had HN-803 marking and the strip-controlling thyristors are of type PCR 406. The modes the original controller supported were all zero-crossing ones, so I retained this operation.

Very shortly about the zero-crossing vs. phase-angle mode of controlling thyristors or triacs (a good introduction can be found here). The thyristor is fed with a current that has frequent zero-crossings. This is necessary because once the thyristor is switched on, the simplest way to turn it off is to remove the current on the load. That is why the Graetz bridge converting the 230V alternating current into direct current does not have the usual filtering capacitors. This guarantees that the current feeding the LED strips/thyristors has zero-crossings with 100 Hz frequency. After the zero-crossing, the thyristor can be switched on again by just a mA-range current applied to its gate electrode. The phase difference between the zero-crossing and the moment the gate current is applied determines whether we use dimming or not. Then the thyristor will remain switched on until the next zero-crossing. As the frequency of these zero-crossings is just 100 Hz, the pulse-width modulation we used in the previous post for dimming cannot be used; the human eye would notice the flickering with such a low PWM frequency.
So the simple circuit I am going to present here can only be used to flash the LED strips but not to dim them. Implementing phase angle-based dimming would not be too hard with the features of our microcontroller, but I did not want to get into that in this post.

Warning: part of the circuits described in this post use high-voltage 230 V current. Do not try to build them if you do not have experience with high-voltage electronics because you risk electrocution!

Our exercise looks very simple. We need to remove the HN-803 controller circuit, replace it with our nRF51822 BLE SoC and use 2 of the output pins of the SoC to turn on the thyristors. Once the SoC drives the output pin low, the thyristor will switch off at the next zero-crossing, which allows us to flash the LED strips with frequencies lower than 100 Hz. Unfortunately nothing is simple if high-voltage current is involved, because this simple circuit would connect the ground of the microcontroller board to a wire with high-voltage current (HVGND on the schematic), risking electric shock if someone touches the microcontroller board or ground-connected metal parts (like connectors) when the circuit is in operation. So I built an optocoupler-based isolator, pictured below (click to enlarge). The isolator ensures that the microcontroller side has nothing to do with high-voltage current, so no special precautions need to be taken when handling the MCU board. The isolator itself, along with the remaining parts of the original controller circuit (D1-D4, T1/T2 and of course LED1 and LED2 representing the two LED strips), is placed in a separate enclosure box. R3 dissipates around 1W, so make sure that the resistor in question can withstand this power; I used a 2W resistor. Driving the low-voltage side of the optocoupler s[...]
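With zero-crossing control, the finest switching granularity is one 10 ms half-wave, so flash patterns are naturally expressed in whole half-waves rather than PWM duty cycles. The helper below is a hypothetical sketch of such flashing logic (names are mine, not from the post's firmware): per half-wave it decides whether the SoC should drive the thyristor gate, giving a 50% on/off flash at any period longer than two half-waves.

```c
#include <stdint.h>
#include <stdbool.h>

/* Zero-crossing flash control: the mains rectified by the Graetz bridge
   crosses zero 100 times per second, so the LED strip can only be
   switched in 10 ms half-wave units. Decide, for a given half-wave,
   whether the gate output pin should be high. period_halfwaves must be
   >= 2; e.g. 100 gives a 1 Hz flash (0.5 s on, 0.5 s off). */
bool gate_state(uint32_t halfwave_index, uint32_t period_halfwaves)
{
    /* on for the first half of the period, off for the second half */
    return (halfwave_index % period_halfwaves) < period_halfwaves / 2;
}
```

Calling this once per detected zero-crossing and copying the result to the output pin flashes the strip without ever switching mid-half-wave, which is exactly the constraint the thyristor imposes.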



BLE-enabled Christmas light

2016-12-24T17:26:57.039+01:00

The idea came from the light-themed Budapest Makers' Meetup and from the cheap Christmas LED strip that my wife bought for about 1 euro. Plus my other hobby project had not gone as smoothly as expected but produced connection-oriented nRF51822 code, and bang, the idea was born: let's couple the LED strip with the Bluetooth Low Energy System-on-Chip (SoC), add an Android application and see what comes out of it. This post is the tale of that adventure.

First, about the LED strip. This is a battery-operated device with two states: off or on. It has surprisingly low power consumption considering its 10 LEDs. I removed the battery case and put it away - it will serve me well in other projects. Then I hooked up the LED strip with the nRF51822 as shown in the schematics below (click to enlarge).

As described in some previous blog posts, I use a breakout board containing little more than a sole nRF51822 and the Bus Pirate programmer to upload the application into the chip. The components on the breakout board (quartz, etc.) are not shown in the schematics. If you use the same type of breakout board that I do, make sure that you connect both GNDs together, otherwise instability may result.

Other than that, the circuit is very simple. The LED strip is driven by P0.22 of the nRF51822. Even though the strip consumes in the mA range, I played it safe and inserted a 2N7000 FET between the MCU and the LEDs. The 1 Ohm resistor was already part of the circuit in its original form so I thought it was a good idea to preserve it. One can also observe the 3.3V stabilizer circuit that transforms the POWER_INPUT (5V in my case) to the 3.3V consumed by the circuit. Any stabilizer circuit will do; I just point out that without the C1 capacitor I experienced instability when the Bluetooth radio was in operation. The whole circuit sits nicely in a plastic electronic enclosure box.

Now on to the software.
Click here to download the nRF51822 sources. Click here to download the Android sources.

The programmer software that drives the Bus Pirate tool is the same forked version that we used before. The project assumes the 12.1.0 version of the nRF5 SDK. I propose that you stick to this version too because upgrading the project to another version may involve a lot of work (as I experienced previously). Convert the soft device with the convert_s130.sh script, upload it into the device with upload_softdevice.sh, compile the code with the "make" command, then upload the application with upload.sh. While doing this, you need to modify the SDK path and device files in the scripts/Makefile according to the directory layout of your system. After a power cycle, you can observe the device spitting out a large amount of debug messages on the debug serial port. Also, LED1 starts flashing, showing BLE advertising activity.

Before installing the Android part, let's check that the device works correctly. Download nRF Connect from the Google Play store, start scanning for BLE devices, look for "ledctr" (the default name given to our device), connect and open the custom service with the 128-bit UUID of 274b15a4-b9cd-4e5e-94c4-1248b42b82f8 that it advertises. You should see something like this:

Write the following byte array into the characteristic with the UUID of 274b0000-b9cd-4e5e-94c4-1248b42b82f8:

03 60 00 00 00 00 00 03 60

This means 3 seconds ramp-up to 0x60 intensity (96%), no flashing, 3 seconds ramp-down from 0x60 intensity. If you see this light effect, the device is ready. Don't forget to disconnect: the application can handle only one active connection and there's no timeout mechanism implemented.

The BLE device implements the following light effect. First there is the ramp-up phase when the light intensity increases from 0 to a maximum. The ramp-up time and the [...]
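A client could assemble such a command programmatically. The byte layout below is only my guess from the single example write in the post (byte 0: ramp-up seconds, byte 1: ramp-up intensity, five zeroed flashing-related bytes, then ramp-down seconds and intensity); the authoritative format is in the downloadable nRF51822 sources.

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical layout inferred from the example write
   03 60 00 00 00 00 00 03 60 ("3 s ramp-up to 0x60, no flashing,
   3 s ramp-down from 0x60"). Not the documented characteristic format -
   check the downloaded sources before relying on it. */
void build_light_cmd(uint8_t buf[9], uint8_t up_s, uint8_t up_lvl,
                     uint8_t down_s, uint8_t down_lvl)
{
    memset(buf, 0, 9);      /* middle bytes zero = no flashing (assumed) */
    buf[0] = up_s;
    buf[1] = up_lvl;
    buf[7] = down_s;
    buf[8] = down_lvl;
}
```

Writing the resulting 9 bytes to characteristic 274b0000-b9cd-4e5e-94c4-1248b42b82f8 would then reproduce the ramp-up/ramp-down effect described above.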



Android phone as weather station with improved sensor

2016-10-18T12:49:40.259+02:00

The previous two posts introduced a BLE-enabled weather sensor (temperature and humidity) and the Android application that extracts data from this sensor. I hinted that I intended to proceed with a more sophisticated sensor, Bosch Sensortec's BME280, but other projects diverted my attention (shameless self-promotion: read this paper about microcontrollers, image processing and low-power wide area networks if you are curious what kind of projects took my time). But I never forgot my BME280 temperature/humidity/pressure sensors sitting in my drawer and once I had a bit of time, I resurrected the project.

The idea is the same as with the DHT-22. The nRF51822 combined ARM Cortex-M0 microcontroller/Bluetooth Low Energy radio unit will make the sensor data available over Bluetooth Low Energy (BLE). The smartphone will read this data and display it to the user. Later (not in this post) I intend to upload the data into some web service for analysis. We have already done this with the DHT-22; now we extend the sensor range with the BME280.

Click here to download the Android application. Click here to download the nRF51822 projects that are running on the sensor hardware.

Let's start with the sensor. Below is the schematic for the hardware (click to enlarge). The difference between this and the previous one is that the DHT-22 was connected to a simple GPIO pin while the BME280 uses the more sophisticated I2C bus. As the BME280 is quite miniature, I used a breakout board for the sensor too. LED1 and the serial debug port are optional (but quite useful). Debug messages are emitted to the serial port; you need a 3.3V-to-RS232 converter on the TxD pin if you want to observe those.

As described previously, the circuit is realized with a low-cost nRF51822 breakout board and is programmed with the Bus Pirate programmer, adapted to nRF51822 by Florian Echtler.
The only thing I changed in this setup is that this time I moved the projects to the latest version of the SDK, which is 12.1.0. Also, the soft device (the program module implementing the BLE stack) was bumped from S110 to S130. These decisions caused quite a headache because there's a significant difference between the old SDK and 12.1.0. Therefore I decided that in the nRF51822 project file I share not only the sensor project (called adv_bme280) but two simpler ones (blinky_new and blinky_new_s130) as additions to the instructions on Florian's page. As a recap: the soft device needs to be flashed into the device before any BLE application is flashed, and the starting address of the BLE application depends on the size of the soft device. This has changed between S110 and S130, hence the updated projects. In both blinky_new_s130 and adv_bme280 you will find the convert_s130.sh and upload_softdevice.sh scripts that convert into binary format and flash the S130 soft device that came with the Nordic SDK.

Once you have uploaded the S130 soft device, compile the project in adv_bme280 and upload it into the nRF51822. The sensor node works the same way as the DHT-22 version. The MCU in the nRF51822 acquires measurements from the BME280 by means of the I2C bus (called Two-Wire Interface, TWI, in the nRF51822 documentation), once every second. This includes temperature, humidity and pressure. Then the measurement values are compensated by the read-only calibration data also stored in the BME280 that the MCU reads in the initialization phase. The BME280_compensate_T, BME280_compensate_P and bme280_compensate_H functions come from the BME280 user's manual. The result is the compensated temperature, humidity and pressure values that the MCU puts into the BLE advertisement data. The advertisement data also contains the nRF51822's unique ID that is used to identify the sensor. The sensor has no name as the measurement+ID data is now too long to allow sensor [...]
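To give a flavour of what the compensation step looks like, here is the integer temperature compensation routine as published in the Bosch BME280 datasheet. The calibration words dig_T1..dig_T3 are read once from the sensor at init time; t_fine is shared with the pressure and humidity compensation functions. The calibration values used in the usage example below are the datasheet's worked example, not values from the author's sensor.

```c
#include <stdint.h>

/* BME280 integer temperature compensation (from the Bosch datasheet).
   adc_T is the raw temperature reading; the return value is the
   temperature in units of 0.01 degC (e.g. 2508 -> 25.08 degC). */
static uint16_t dig_T1;          /* calibration data, read from the    */
static int16_t  dig_T2, dig_T3;  /* sensor's NVM during initialization */
static int32_t  t_fine;          /* carried over to P and H compensation */

int32_t BME280_compensate_T(int32_t adc_T)
{
    int32_t var1, var2;
    var1 = ((((adc_T >> 3) - ((int32_t)dig_T1 << 1))) *
            ((int32_t)dig_T2)) >> 11;
    var2 = (((((adc_T >> 4) - ((int32_t)dig_T1)) *
              ((adc_T >> 4) - ((int32_t)dig_T1))) >> 12) *
            ((int32_t)dig_T3)) >> 14;
    t_fine = var1 + var2;
    return (t_fine * 5 + 128) >> 8;
}
```

With the datasheet's example calibration (dig_T1=27504, dig_T2=26435, dig_T3=-1000) and a raw reading of 519888, this yields 2508, i.e. 25.08 degC.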



Android phone as weather station

2016-03-02T20:57:24.000+01:00

The previous post was about a low-cost Bluetooth Low Energy sensor (really, one sensor unit that includes the BLE-enabled microcontroller too costs less than 15 USD, and that's just a single prototype; economies of scale come on top of that) and its accompanying Android app that allows obtaining sensor readings manually. That's not bad, but manually reading data is sort of inconvenient. If you want to know what the temperature and humidity were at dawn, you have to be awake at that early hour. Personally, I prefer to sleep then, so I decided to automate the whole process.

Click here to download the sources of the Android application. The content of the archive is the app/src/main subtree of an Android Studio project. In addition to extracting the sources into the app/src/main subtree, update app/build.gradle like this:

dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    testCompile 'junit:junit:4.12'
    compile 'com.jjoe64:graphview:4.0.1'
}

The project depends on Jonas Gehring's GraphView project, hence this new dependency.

So what can we expect from this new app? In case of the app that came with the sensor in the previous post, you started a manual scan and if the sensor was in range, you got the humidity/temperature data. The new app scans and stores data in the background. Once it is started, it sets up a periodic timer (the default timeout is 1 hour but can be changed in the settings menu) and when the timer fires, it makes a scan. If it finds a BLE node whose advertisement fits our criteria (e.g. it advertises services with the UUID I allocated), then it extracts the measurement data from the advertisement message and stores it in a database on the device. This variant does not yet upload the data to a server; that may come later. However, it can visualize the measurements on simple graphs, hence the dependency on GraphView.
Like this:

Let's see the interesting bits of this app. First and foremost, it is an interesting feature of this application that the BLE layer is used in such a way that reading the sensor is not an extra cost for the sensor. As the measurement data is embedded into the advertisement packets that the device broadcasts anyway, it does not matter if 1 or 1000 phones read and store the data. So this sort of sensor network can grow into an entire ecosystem - the more phone users install and use the app, the more precisely the measured quantity will be available once the phones upload their catch to the server.

If you observe how the data is stored (DHT22SensorDataProvider.java), you can recognize an important shortcut that I made: the database structure depends on the sensor being used. This provider depends on the fact that the DHT-22 (the actual measurement device) provides temperature and humidity data in the same reading. A different sensor (like the Bosch BME280 sensors sitting in my drawer waiting for their turn) will require a new provider and also a modification of the visualization part. So there's significant development potential in making the app more flexible when it comes to adding a new sensor type.

The actual sampling of the service happens in BLESensorGWService, using the AlarmManager to trigger the scan. Now, getting the device awake if it was just sleeping is not a simple business. Observe in the list below that even though there's always an hourly reading, there's a significant variation in when the reading happens. In case of our weather reading it was not a problem, but some sensors may have more variable data. A large number of devices reading and uploading would solve the problem of reading time variations.

GraphMeasurementActivity is the activity that depends on Jonas [...]



Thermometer application with nRF51822 and Android

2016-02-02T23:02:51.608+01:00

I have built quite a few prototypes on this blog with RFDuino, based on Nordic Semiconductor's nRF51822, and I can still recommend that small development board to people who want to get initiated quickly and painlessly into the world of Bluetooth Low Energy programming. The limitations of RFDuino became apparent quite soon, however, and it was time to go deeper. On the other hand, I wanted to stay with the excellent nRF51822, so I looked for a breakout board - as simple as possible. This is how I stumbled onto Florian Echtler's page about a low-cost BLE development environment based on the nRF51822 and the Bus Pirate programming tool. So I quickly ordered a no-name variant of Waveshare's Core51822 breakout board, a Bus Pirate tool and a bunch of DHT-22 sensors (because I wanted to measure something in the environment). Also note that the breakout board has a connector with 2 mm pin spacing, which is not the usual 0.1 inch pitch. It helps if you have a prototyping board with both 2 mm and 0.1 inch pitch like this one, which cannot be found in every store.

Generally speaking, following the instructions on Florian's page was easy enough. I ran into two issues. First, I had no success with the SWD programming software he refers to, but Florian's fork (which is based on an earlier version of the programming software) worked well for me. Second, I experienced instability if the GND pins of the breakout are not connected (there are 2 of them).

First about the hardware. The schematic below shows only the parts that are connected to the pins of the breakout board; the schematic of the breakout board itself is not included. Highlights:

The DHT-22 is connected to P0.17, which is both input and output depending on the communication phase.

The P0.21 LED provides feedback about the BLE activities. This convention comes from the PCA10028 dev board that we told the Nordic tool chain we have.
You can omit this LED if you want to save some energy.

The SV1 header is a TTL serial port where the example program emits some debug messages. You can omit this header if you are extremely confident. I use a level converter like this to connect this port to a standard RS232C port. The UART operates on P0.18 (RxD) and P0.20 (TxD).

The SV2 header goes to the Bus Pirate. Check out Florian's document about the connection. Make sure that this cable is as short as possible.

Here is how the board looks in all its glory, with the Bus Pirate and the RS232C level converter boards in the background. These are of course not needed for deployment; the board runs standalone after the testing is successful.

Click here (adv_dht22.zip (nRF51822), bledht22.zip (Android)) to download the example programs related to this blog post.

Let's start with the code that goes into the nRF51822, which can be found in adv_dht22.zip. The assumption is that you completed Florian's instructions, including the upload of the S110 soft device. Then unzip adv_dht22.zip and do the following modifications:

Edit Makefile and make sure that the RF51_SDK_PATH variable points to the location where you installed the Nordic SDK.

Edit upload.sh and make sure that the paths point to the location where you installed Florian's version of the SWD programmer. Also, make sure that the USB device is correct (/dev/ttyUSB0 by default).

Now you can say "make" and then "upload.sh". If all goes well, the code is installed in the nRF51822 and you get debug messages on the serial port. At this moment, the nRF51822 is already advertising the measurements it obtained from the DHT-22 sensor. You can check the content of the advertisements with this test tool.

The code looks quite frightening compared to the super-simple RFDuino equivalent, but most of it is just template. My highlights: check out in advertising_init() how the advertisement packet is set up. We transfer the measurements in a service data GAP field and I to[...]
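Putting measurements into a service data GAP field boils down to serializing them into a few bytes of the advertisement payload. The byte layout below (16-bit temperature in 0.1 degC, 16-bit relative humidity in 0.1 %, little-endian) is an assumption for illustration; the actual layout used by the firmware is in adv_dht22.zip.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical packing of one DHT-22 reading for the service data field.
   temp_d:  temperature in tenths of a degree C (signed, e.g. 235 = 23.5 C)
   hum_d:   relative humidity in tenths of a percent (e.g. 512 = 51.2 %)
   Writes 4 bytes, little-endian, and returns the length. */
size_t pack_dht22(uint8_t *buf, int16_t temp_d, uint16_t hum_d)
{
    buf[0] = (uint8_t)(temp_d & 0xFF);
    buf[1] = (uint8_t)((temp_d >> 8) & 0xFF);
    buf[2] = (uint8_t)(hum_d & 0xFF);
    buf[3] = (uint8_t)((hum_d >> 8) & 0xFF);
    return 4;
}
```

In the nRF SDK, a buffer like this would be attached to the advertisement via the service data member of the advertising configuration; any phone scanning nearby can then decode the reading without ever connecting.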



Data transfer to Android device over infrared link

2016-01-07T00:29:36.588+01:00

The three previous parts (here, here and here) of this series introduced the infrared-to-Android gateway. Even though those prototypes captured the signals of an ordinary IR remote, I already hinted that I was aiming for something more exciting. In this part, we will replace the IR remote with our own IR transmitter. Once we have our own IR transmitter, we will be able to transfer our own data over IR light. This data link is not reliable enough to transfer large amounts of data, but in a lot of cases that's not required. E.g. to transfer the data of a temperature/humidity sensor, 32 bits is more than enough.

So what can we expect from IR-based data transfer with respect to the more popular radio-based transfer? There are advantages and disadvantages. An advantage of the IR solution is that it is extremely cheap and also extremely power-efficient. An advantage of radio is that the IR-based solution always requires line of sight, while radio waves can - to some extent - traverse walls, etc. Another advantage of radio is that the IR transmitter has to be aimed at the receiver; if for some reason the receiver and the transmitter move with respect to each other, they lose contact very easily.

There's also the question of range. Ordinary TV remotes work up to distances of 3-4 meters, which is nice for an inexpensive consumer device but is not enough even for indoor sensor network use cases. In this part I try to figure out what the distance limit may be. For starters, this question is not defined precisely. With sophisticated optics, high-powered transmitters and careful targeting, IR data transfer can be accomplished over significant distances. But we said that we are looking for cheap hardware, so we can't rely on sophisticated devices. We need some sort of optics, but it should be simple and cheap. That's why I went to a second-hand toy shop and bingo, I found the IR transmitter of Thinkway Toys' Lazer Stunt Chaser.
The small toy car has long been lost, but the handgun-like IR transmitter somehow made its way to Hungary. This cheap, plastic toy is a marvelous device. It promises 12 meters of effective range and even though the mounting of the IR light source and the plastic optics are made of cheap materials, it is surprisingly efficient. It even has a normal (red) LED emitting its light through the transparent housing of the infrared LED, which produces a visible red light circle of about 10 cm in diameter, facilitating the targeting of the IR transmitter.

So how far can it transmit our IR codes? In order to try it out, I took apart Thinkway's IR transmitter and replaced the circuitry driving the IR LED with mine. The new emitter circuit is based on an Arduino Pro Mini 3.3V/8MHz and the schematics look like this:

The software works on any Atmel ATmega328p-based board, e.g. on the Arduino Uno. If your MCU uses a power source with higher voltage than 3.3V, adapt R2 accordingly. E.g. for a 5V Arduino Uno board, you need about 160 Ohm. So this is how I hacked my circuit into the IR transmitter. Note the two LEDs: the IR LED in the tube-like mounting and the ordinary red LED behind it that is used for targeting. Also, note how the IR LED is connected to the ATmega328p's PD3 pin - IRLib, which is the software used to construct the transmitter, selects the OC2B PWM output, so the primary Timer2 PWM pin (PB3) would not work.

Click here to download the source code of the IR transmitter. Open sketch/Makefile and update ARDUINO_DIR according to your installation. Also, update ISP_PROG according to the programming tool you use to deploy the code. I used USBtinyISP; the Makefile is set accordingly. If your board has a USB programming port (like the Arduino Uno has), then setting ISP_PROG is unnecessary. In order to deploy the code into the ATmega328p, say:

make ispload

if you use a programming tool, or simply

make upload

if you don't need a tool.

Abou[...]
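The advice to adapt R2 for a higher supply voltage follows from the usual LED series resistor formula, R = (Vsupply - Vf) / If. The forward voltage and current below are my assumptions (a typical IR LED Vf of about 1.5 V and roughly 22 mA of drive), chosen so that the 5 V case reproduces the ~160 Ohm figure from the post.

```c
/* Series resistor for an LED: drop the supply voltage above the LED's
   forward voltage Vf across the resistor at the desired current.
   vsupply, vf in volts; i_a in amperes; returns ohms. */
double led_series_resistor(double vsupply, double vf, double i_a)
{
    return (vsupply - vf) / i_a;
}
```

With the same assumed Vf and current, the 3.3 V Pro Mini case works out to roughly 80 Ohm, which is why R2 must shrink along with the supply voltage.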



Infrared-to-Android gateway implementation with interrupts

2015-12-30T02:06:55.365+01:00

In the previous parts of this series, the infrared-to-Android gateway based on the RFDuino hardware and an improved version of the hardware were presented. The improved hardware offered quite reliable IR code recognition even when the BLE connection was in operation. The trouble with that implementation was the polling nature of the code; even though the IR reader support hardware is capable of raising an interrupt when a new time measurement is available, the code did not handle that interrupt, instead it polled the interrupt signal.

Click here for the updated gateway code. Click here for the (unchanged) Android application serving the gateway.

The reason I did not implement proper interrupt handling was the I2C library (called "Wire") that comes with the RFDuino. Even though the nRF51822 (on which the RFDuino is based) is able to handle its I2C interface (called TWI, two-wire interface) by means of interrupts, this was not implemented in the "Wire" library. When the MCP23008 GPIO port extender raised an interrupt, the MCU was expected to read the MCP23008's capture register by means of an I2C transaction. As the "Wire" library was polling-based, this transaction held back the GPIO-handling interrupt for too long, freezing the system.

The solution was to transform the "Wire" library into an interrupt-based implementation. As my goal was limited functionality (reading one register of an I2C peripheral), I did not do it properly. I moved the entire "Wire" library into the application project (see it under the "lib" directory), renamed it to "Wire2" and introduced a couple of new methods, most importantly sendReceiveInt (in lib/Wire2.cpp). This method initiates the write transaction of a data array followed by a read transaction of another data array over I2C, all handled by TWI interrupts. This means that sendReceiveInt returns immediately and the actual data transfer happens in the background. This new method is invoked in the GPIO interrupt handler (GPIOTE_Interrupt in sketch/irblegw3.ino), but this time the GPIO interrupt handler completes very quickly as its only job is to initiate the TWI transaction. When the TWI transaction is finished, the TWI interrupt handler invokes the onReceive callback that ends up in the application code (twi_complete in sketch/irblegw3.ino).

The most important outcome of this - quite significant - change is that the MCU does not spend its time spinning in the GPIO port reading loop. Instead, it waits for interrupts in ultra-low power mode (sketch/irblegw3.ino, IRrecvRFDuinoPCI::GetResults method, RFduino_ULPDelay invocation), which is important if the infrared-to-Android gateway is powered by a battery. As you may have guessed, my goal is not to fiddle with IR remote controllers; I intend to build a short-range network comprising infrared, BLE and cellular links, and the infrared-to-BLE gateway was just one step.
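The split-transaction flow described above (the GPIO handler only starts the transfer; a completion callback fires later from the TWI interrupt) can be modelled as a small state machine. A minimal sketch of that pattern in plain Java, with hypothetical names; the real implementation lives in lib/Wire2.cpp:

```java
import java.util.function.Consumer;

// Models an interrupt-driven I2C write-then-read: start() returns immediately,
// and onByteTransferred() plays the role of the per-byte TWI interrupt handler.
public class AsyncI2cTransaction {
    private enum State { IDLE, WRITING, READING, DONE }
    private State state = State.IDLE;
    private final byte[] toWrite;
    private final byte[] readBuf;
    private int pos;
    private final Consumer<byte[]> onReceive; // completion callback, like twi_complete

    public AsyncI2cTransaction(byte[] toWrite, int readLen, Consumer<byte[]> onReceive) {
        this.toWrite = toWrite;
        this.readBuf = new byte[readLen];
        this.onReceive = onReceive;
    }

    // Called from the "GPIO interrupt": just kicks off the transfer and returns.
    public void start() { state = State.WRITING; pos = 0; }

    // Called once per "TWI interrupt"; busValue is the byte seen on the bus.
    public void onByteTransferred(byte busValue) {
        switch (state) {
            case WRITING:
                if (++pos == toWrite.length) { state = State.READING; pos = 0; }
                break;
            case READING:
                readBuf[pos] = busValue;
                if (++pos == readBuf.length) { state = State.DONE; onReceive.accept(readBuf); }
                break;
            default: // spurious interrupt, ignore
        }
    }
}
```

The key property mirrored here is that no caller ever blocks: the GPIO handler returns right after start(), and all data movement happens in the interrupt context.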



Improved hardware for the infrared-to-Android gateway

2015-12-16T18:27:09.402+01:00

In the previous post I presented the results of my experiments with the nRF51822-based RFDuino as an infrared-to-Bluetooth Low Energy gateway, accompanied by an Android client app. The outcome of that experiment was that the nRF51822 BLE soft stack and a purely software-based IR receiver are not a good match: the BLE soft stack "steals" enough cycles from the Cortex-M0 CPU that the IR reading becomes very unreliable no matter which implementation alternative we go for (3 different alternatives were attempted). I promised an improved hardware that overcomes this limitation, and this post is about that improved hardware.

The essence of our problems with the BLE soft stack was that, given the very tight timings the IR receiver requires, the main CPU is no longer suitable for measuring time periods. Typical IR timing is in the 500 microsec-2 millisec range; this is the time period we should be able to measure reliably. With the BLE soft stack in operation, delays are introduced into the time measurement code by the background interrupt routine serving the soft stack, and time measurements of this precision will be wildly off. I considered two options.

The most evident option is to drop the integrated MCU-BLE radio combo that the nRF51822 is and go for a separate MCU plus BLE modem. For example, an Arduino Pro Mini with a BLE121LR modem would have been a perfect fit, as both of these modules are in my drawer. While this hardware would have been definitely more hassle-free than the nRF51822, setting it up would have required two different programming tools (I have both, but that's not necessarily true for the general blog reader out there) and I am still uneasy about soldering the BLE121LR - those pads are smaller than my capabilities.

Extending the RFDuino with dedicated hardware for time period measurement sounded more attractive to me, as it was less evident. The functionality we expect is that the MCU is not doing any time measurement.
The external hardware must be capable of measuring the time periods between the edges of the TSOP1738 output signal and delivering these measurements to the MCU. The measuring range is about 500 microsec-2 msec. Larger time periods are still measured by the MCU, but in that case the disturbances caused by the soft stack are not that relevant. A further complication is that while the nRF51822 has 31 general-purpose pins, RFDuino makes only 7 of those available and 2 of them are reserved for USB communication. This requires that the circuit is interfaced with the RFDuino using the lowest possible number of wires, and I2C is the best option (2 wires). The nRF51822 has an I2C option (called TWI, two-wire interface), so this would work. I was not able to find a single-chip solution that measures and captures time periods in this range with an I2C interface, but the circuit is not that complicated.

A 74HC4060 is set up as oscillator and counter. The frequency of the oscillator is about 350 kHz, yielding about 43 microsec time resolution for one counter step, making it convenient to measure between 43 microsec and 5.5 msec with 7-bit resolution. An MCP23008 GPIO extender with I2C interface provides the conversion to the I2C two-wire connection and also has capture logic. This means that whenever the output level of the IR receiver changes, the MCP23008 stores the current value of the counter in its capture register and raises an interrupt. This way the MCU is not doing any time measurements, and the time measurement hardware is able to survive about 1 msec autonomously without service from the MCU.

Click here to download the schematic in Eagle SCH format.

Click here to download the updated RFDuino gateway sources. As with the previous version, edit the Makefile in the irblegw2/sketch directory and update the RFDUINO_BA[...]
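The numbers above fit together: with a ~43 microsec counter step, a 7-bit counter covers the 500 microsec-2 msec range of IR pulses with margin before wrapping. A quick arithmetic check (the 43 microsec step is taken from the post; the rest is plain arithmetic):

```java
public class TimerRange {
    // Full-scale time an n-bit counter can measure before wrapping,
    // given the duration of one counter step in microseconds.
    public static double maxMicros(double stepMicros, int bits) {
        return stepMicros * (1 << bits);
    }

    public static void main(String[] args) {
        double step = 43.0; // ~350 kHz oscillator after the 74HC4060's divider
        System.out.printf("resolution: %.0f us%n", step);
        System.out.printf("7-bit full scale: %.1f ms%n", maxMicros(step, 7) / 1000.0);
    }
}
```

43 us x 128 steps gives about 5.5 ms of full scale, comfortably above the 2 ms maximum IR period the gateway needs to capture.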



Controlling Android device with an infrared remote

2015-12-30T02:15:40.211+01:00

I came across several posts about microcontrollers decoding the signals of infrared remote controllers and started to think about how an IR remote could be integrated with an Android smartphone. This post is about a simple use case in which I control the media volume of an Android phone with an IR remote.

A long time ago, in a distant galaxy, an IR transmitter-receiver was a standard feature of almost any mobile phone. Technological progress has eliminated that feature, so we are now forced to build some hardware. We certainly need an infrared sensor to capture the remote's infrared signal. But how can we connect that to the phone? Last year's experiments prompted me to choose Bluetooth Low Energy (BLE). The microcontroller platform was determined by my experience with RFDuino, and I happened to have an RFDuino set in my drawer.

The idea is the following. An infrared receiver is connected to the microcontroller, which captures and interprets the infrared signals. If the phone is connected to the BLE radio, the key codes sent by the IR remote are forwarded to the phone over BLE. The phone does whatever it wants with the key codes; in my example the volume up, down and mute buttons are handled and used to control the volume of the music played by the media player.

The infrared implementation is based on the excellent IRLib code. IRLib assumes that the IR receiver is connected to one pin of the microcontroller, so I built the circuit below. A TSOP1738 IR receiver is directly connected to a GPIO pin of the RFDuino, which is expected to do all the decoding. The SV1 header is only for monitoring debug messages; it is not crucial for the operation of the gateway. If you intend to use it, connect an RS232C level converter similar to this one.

Then came the complications. "Arduino code" is used with great liberty on the internet, as if Arduino meant a single platform. In reality, Arduino is a very thin layer on top of the underlying microcontroller.
Most of the code, tools and articles out there are based on Atmel's extremely popular AVR family of MCUs. These are 8-bit CPUs equipped with a host of peripherals. The challenger in this domain is not Intel but ARM's Cortex family. RFDuino is based on Nordic Semiconductor's nRF51822 System-on-Chip (SoC), which is a Cortex-M0 core with an integrated 2.4 GHz radio. BLE is supported by means of a software stack. ARM Cortex is not supported as well as the AVRs by the Arduino community, and this miniproject is a cautionary tale about that fact.

Click here to download the RFDuino project. In order to compile it, you need to install the Arduino IDE with the RFDuino support as described here.

For starters, the Arduino Makefile I used with such great success previously does not support Cortex-based systems, only AVRs. A short explanation: I don't like IDEs, particularly not in a project where I would like to see exactly what goes into the compile/link process. The Arduino IDE is nice for beginners but is a very limiting environment for more ambitious projects. After several days of heavy modifications, I adapted this makefile to the ARM Cortex tool chain of the RFDuino. Go into the irblegw/sketch directory, open the Makefile and look for the following line:

RFDUINO_BASE_DIR = /home/paller/.arduino15/packages/RFduino

Adapt this according to the layout of your file system. Then type "make" and the entire code should compile. "make upload" uploads the compiled code into the RFDuino, provided that the port (AVRDUDE_COM_OPTS = /dev/ttyUSB0 in the Makefile) is also correct.

Then came further complications. The IRLib code is also AVR-specific. The differences between Cortex-M0 and AVR are mainly related to interrupt han[...]



Infrared imaging with Android devices

2016-01-03T01:56:21.259+01:00

One of the most evident sensors of Android devices is the camera. An ordinary smartphone's camera is able to capture a lot of interesting information but has its limitations too. Most evidently, its viewing angle depends on the position of the device (so it is not fixed and is hard to measure) and its bandwidth is (mostly) restricted to visible light. It is therefore an exciting idea to connect special cameras to Android devices.

In this post, I will present an integration of the FLIR Lepton long-wavelength infrared camera with an Android application over a Bluetooth Low Energy connection. Long-wavelength IR (LWIR) cameras are not new. Previously, however, they were priced in the thousands-of-dollars range (if not higher). Lepton is still pricey (currently about 300 USD) but its price is low enough that mere mortals can play with it. FLIR sells a smartphone integration product (called FLIR One) but it is currently only available for the iPhone and locks the camera to one device. Our prototype allows any device with a BLE connection to access this very special camera.

The prototype system presented here needs a relatively long list of external hardware components and it is also not trivial to prepare these components. The list is the following:

  • An Android phone with Bluetooth 4.0 capability. I used a Nexus 5 for these experiments.
  • An FLIR Lepton module. My recommendation is the FLIR Dev Kit from Sparkfun that has the camera module mounted on a breakout panel, which is much easier to handle than the original FLIR socket.
  • A BeagleBone Black card with an SD card >4GB.
  • A BLED112 BLE dongle from Silicon Labs (formerly Bluegiga).

The software for the prototype can be downloaded in two packages. This package contains the stuff for the embedded computer. This package is the Android application that connects to it.

Once you have got all these, prepare the ingredients.

1.
Hook up the FLIR camera with the BeagleBone Black

Fortunately the BBB's SPI interface is completely compatible with the Lepton's, so the "hardware" just needs a couple of wires. Make the following connections (P9 refers to the BBB's P9 extension port):

  FLIR   BBB
  CS     P9/28 (SPI1_CS0)
  MOSI   P9/30 (SPI1_D1)
  MISO   P9/29 (SPI1_D0)
  CLK    P9/31 (SPI1_SCLK)
  GND    P9/1 (GND)
  VIN    P9/4 (DC, 3.3V)

2. Prepare the BBB environment

I use Snappy Ubuntu. Grab the SD card and download the image as documented here. Before flashing the SD card, we have to update the device tree in the image so that the SPI port is correctly enabled. Unpack bt_ircamera.zip that you have just downloaded and go to the dt subdirectory. There you find the device tree file that I used for this project. Beside the SPI1 port, it also enables some serial ports. These are not necessary for this project but may come in handy.

Compile the device tree:

dtc -O dtb -o am335x-boneblack.dtb am335x-boneblack.dts

The output is the binary device tree (am335x-boneblack.dtb) that needs to be put into the kernel image file. Let's suppose that the downloaded image file is ubuntu-15.04-snappy-armhf-bbb.img and you have an empty directory at /mnt/img. Then do the following:

fdisk -l ubuntu-15.04-snappy-armhf-bbb.img

Look for the first partition and note the partition image name and the offset:

ubuntu-15.04-snappy-armhf-bbb.img1   *        8192      139263       65536    c  W95 FAT32 (LBA)
...

Note that the actual partition ima[...]
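The fdisk output above gives the start of the first partition in 512-byte sectors (8192 here). To work on that partition inside the whole-disk image, the usual trick is to loop-mount it at the corresponding byte offset. A sketch, assuming the image file and mount point names used in the post:

```shell
# Byte offset of the first partition: start sector * 512-byte sector size
OFFSET=$((8192 * 512))
echo "$OFFSET"

# With that offset the partition can be loop-mounted from the image
# (needs root, so it is shown as a comment to keep the snippet runnable anywhere):
# sudo mount -o loop,offset=$OFFSET ubuntu-15.04-snappy-armhf-bbb.img /mnt/img
```

After mounting, the compiled am335x-boneblack.dtb can be copied into the partition and the image flashed to the SD card.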



BLED112 on BeagleBone

2015-04-30T13:45:30.809+02:00

In the previous post I demonstrated how a Bluetooth Low Energy dongle can be used to connect a PC and an Android device. While this sort of project is appealing, connecting PCs and smartphones is not such an interesting use case. It is much more interesting, however, to transfer the PC-side program directly to an embedded device, and that's what I will demonstrate in this post.

The Android application used in this post did not change; you can download it here. The BLE server application was updated according to the embedded platform's requirements; you can download the new version here.

There are two baskets of embedded platforms out there. One of them is optimized for low power consumption. These platforms are too limited to run a full-scale operating system, so their system software is often proprietary. Arduino (of which we have seen the RFDuino variant) is one of them, but there are many more; e.g. Bluegiga modules also have a proprietary application model. We can typically expect power consumption in the 1-10 mA range, with some platforms offering even lower standby consumption.

The other basket contains scaled-down computers that are able to run stripped-down versions of a real operating system. Their power consumption is in the 100-500 mA range and they often sport hundreds of megabytes of RAM and gigabytes of flash memory. They are of course not comparable to the low-power platforms when it comes to power consumption, but their much higher performance (which can be relevant for computation-intensive tasks) and compatibility with mainstream operating systems make them very attractive for certain tasks. The card I chose is the BeagleBone Black, and my main motivation was that Ubuntu chose this card as a reference platform for its Ubuntu Core variant.

The point I try to make in this post is how easy it is to port an application developed for a desktop PC to these embedded computers.
Therefore let's just port the BLE server part of the CTS example demo to the BeagleBone Black.

There are a handful of operating systems available for this card. I chose Snappy Ubuntu - well, because my own desktop is Ubuntu. Grab an SD card and prepare a Snappy Ubuntu boot medium according to this description. It worked for me out of the box. You can also start with this video - it is really that easy. Once you have hooked up the card with your PC, let's prepare the development environment.

First fetch the ARM cross-compiler with this command (assuming you are on Ubuntu or Debian):

sudo apt-get install gcc-arm-linux-gnueabihf

Then install the snappy developer tools according to this guide. Then unpack the BLE server application into a directory and set up these environment variables:

export CROSS_COMPILE=arm-linux-gnueabihf-; export ARCH=arm

Enter the beagle_conn_example directory that you unpacked from the ZIP package and execute:

make

This should re-generate cts_1.0.0_all.snap, which is already present in the ZIP archive in case you run into problems with building the app. The snap is the new package format for snappy. Then you can install this package on the card:

snappy-remote --url=ssh://192.168.1.123 install ./cts_1.0.0_all.snap

You have to update the IP address according to what your card obtained on your network. The upload tool will prompt you for username/password; it is ubuntu/ubuntu by default.

Update the GATT tree in the BLED112 firmware as described in the previous post. Plug the BLED112 dongle into the BeagleBone's USB port. Then open a command prompt on the BeagleBone, either using the serial debug interface or by connecting to the instance with ssh, and execute the following command:

sudo /apps/cts.sideload/1.0.0/bin/cts /dev/ttyACM0

The familiar console messages appear and you can connect with the Android app as depicted in the image below.

One thing you c[...]



Integrating an Android smartphone application with the BLED112 module

2014-12-31T00:13:11.811+01:00

My conclusion from the RFDuino adventures was that RFDuino is a perfect platform to start familiarizing yourself with the Bluetooth Low Energy (BLE) technology. BLE programming is made so simple by RFDuino that it provides quick success. Simplification comes with limitations, however, and eventually the time came for me to step toward a more flexible BLE platform. Bluegiga's BLE121LR long-range module seems to have outstanding range, but first I tried a piece of hardware that is equivalent to the BLE121LR from the API point of view but is easier to start with: Bluegiga's BLED112 USB dongle.

The BLED112 implements the same API (called BGAPI; check out the Bluetooth Smart Software API reference) that other Bluegiga BLE modules do, but there's no need to buy the pricey DKBLE development board or build any hardware. It plugs neatly into the USB port and is functional without any additional piece of hardware. From the serious BLE application development perspective it has drawbacks too. Firstly, its USB interface draws a constant 5 mA current, so this solution is not very "low energy". The second disadvantage is that its single USB interface is shared between the BGAPI API and the programming interface, so installing scripts into the BLED112 is a risky enterprise: if the script running on the BLED112 occupies the USB port, there's no way to update it, so the module is essentially bricked. Hence in this exercise we will keep the BLE application logic on the PC hosting the module and talk to the module with BGAPI. This is very similar to the setup in which the application logic runs on a microcontroller or embedded PC.

Click here to download the Android client and the PC server example programs.

In this exercise, we will implement the Current Time Service (CTS) and access this service from an Android application. CTS is a standard Bluetooth service. The PC application will fetch the current time from its clock and will update the characteristic exposed by the BLED112.
The Android application will detect the advertised CTS service, connect to it, retrieve the time and display it to the user. The Android application will also subscribe to time changes, demonstrating the notification feature of BLE GATT.

Let's start with the PC part. Unpack the cts_example.zip file; inside, a set of C files belonging to the PC application is in the root directory. I developed and tested the application on Ubuntu 14.10, so if you use a similar system, there is a good chance that you just type "make" and it will compile.

Preparing the BLED112 dongle is more complicated, however, and this is a result of the quite cumbersome Bluegiga tool chain. Any change to the GATT services (read this presentation if you don't know what GATT is) requires a firmware update of the BLED112. This sounds scary, but it is not too complicated if you have a Windows system. The Bluegiga SDK supports only Windows, and there is one element of the tool chain, the firmware downloader, that does not run on emulated Windows either - you need the real thing. So the steps are the following:

  • Grab a Windows machine, download the Bluegiga SDK and install it.
  • Get the content of the config subdirectory in cts_example.zip and copy it somewhere in the Windows file system. Then generate the new firmware with the \bin\bgbuild.exe cts_gattBLED112_project.bgproj command. The output will be the cts_BLED112.hex file, which is the new firmware. We could have placed application logic into the firmware with a script but, as I said, that is a bit risky with the BLED112, so this time the new firmware contains only the GATT database for the CTS service.
  • Launch the BLE GUI appl[...]
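For reference, the value the server writes into the Current Time characteristic is, to my reading of the Bluetooth specification, a 10-byte structure: year as little-endian uint16, then month, day, hours, minutes, seconds, day-of-week, 1/256 second fractions, and an adjust-reason byte. A hedged sketch of such an encoder, independent of the downloadable C sources:

```java
import java.time.LocalDateTime;

public class CtsEncoder {
    // Encodes a timestamp as a Bluetooth Current Time characteristic value (10 bytes).
    public static byte[] encode(LocalDateTime t) {
        byte[] v = new byte[10];
        int year = t.getYear();
        v[0] = (byte) (year & 0xFF);          // year, little-endian uint16
        v[1] = (byte) ((year >> 8) & 0xFF);
        v[2] = (byte) t.getMonthValue();      // 1..12
        v[3] = (byte) t.getDayOfMonth();      // 1..31
        v[4] = (byte) t.getHour();
        v[5] = (byte) t.getMinute();
        v[6] = (byte) t.getSecond();
        v[7] = (byte) t.getDayOfWeek().getValue(); // 1 = Monday .. 7 = Sunday
        v[8] = 0;                             // fractions of a second (1/256 units)
        v[9] = 0;                             // adjust reason: none
        return v;
    }
}
```

On notification, the server pushes this same byte layout to subscribed clients, which is how the Android app's time display stays current.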



Parsing BLE advertisement packets

2016-05-09T11:54:47.196+02:00

Ever since I created the Gas Sensor demo (post here, video here, presentation here), I have had the feeling of unfinished business. That demo sent the sensor data in BLE advertisement packets, so the client never connected to the sensor but received data from the sensor in a broadcast-like fashion. The implementation looked like this:

        public void onLeScan(final BluetoothDevice device, int rssi, byte[] scanRecord) {
            String deviceName = device.getName();
...
                int addDataOffs = deviceName.length() + 16;
                int siteid = ((int)scanRecord[addDataOffs]) & 0xFF;
                int ad1 = ((int)scanRecord[addDataOffs+1]) & 0xFF;

This was a quick & dirty solution that remained there from my earliest prototypes. It sort of assumes that the structure of the BLE advertisement packet is fixed, so that the sensor data can always be found at fixed locations of the advertisement packet. This does not have to be the case: Bluetooth 4.0 Core Specification, Part C, Appendix C (or the Core Specification Supplement in the case of version 4.2) describes how the fields of the advertisement packets look. It just so happens that with the given version of the RFDuino BLE module, the Manufacturer Specific Data field, where RFDuino puts the user data for the advertisement packet, can always be found at a specific location.

The proper way is of course to parse this data format according to the referenced appendix of the specification, and in this post I will show you how I implemented it. Here are the three example programs mentioned in this post: blescan.zip, gassensordemo.zip, gas_adv.ino.

Let's first look at the BLEScan project. Update: the project has been updated to support more of the 4.2 elements. It has also been converted into an Android Studio project, but the download material contains only the app/src part of the tree.

Update: I was asked by e-mail how to import the project (in blescan.zip) into Android Studio.
Here is a simple process:

  • Create a new project in Android Studio under any name. Make sure that your project supports at least API level 18. Choose the "create no Activity" option.
  • Once your project is created, go and find it on the disk. On my Ubuntu system, the project files go under ~/StudioProjects/, in a directory named after the project. We will call this the project directory.
  • Go into app/src under the project directory and delete everything there. Copy blescan.zip into app/src and unzip it. It will create a single directory called "main" and the sources below it.
  • In Android Studio, do File/Synchronize. After that is completed, you can open your project files, build the APK, etc.

The parser code is under the hu.uw.pallergabor.ble.adparser package. You just give the scanRecord array to AdParser's parseAdData method like this:

            ArrayList ads = AdParser.parseAdData(scanRecord);

and you get back an array of objects, each describing an element in the scan record. These objects can also produce a printable representation like this:

Now let's see the revised GasSensorDemo project and how the gas sensor measurement is properly parsed out of the scan record. First we parse the scan packet fields:

                ArrayList
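The advertisement payload itself is a sequence of AD structures: one length byte, one AD type byte, then length-1 bytes of data (Manufacturer Specific Data is AD type 0xFF). A self-contained sketch of such a parser, written independently of the AdParser class in the download:

```java
import java.util.ArrayList;
import java.util.List;

public class AdStructureParser {
    public static final int TYPE_MANUFACTURER_SPECIFIC = 0xFF;

    public static class AdStructure {
        public final int type;
        public final byte[] data;
        AdStructure(int type, byte[] data) { this.type = type; this.data = data; }
    }

    // Splits a raw scanRecord into AD structures: [len][type][len-1 data bytes]...
    public static List<AdStructure> parse(byte[] scanRecord) {
        List<AdStructure> result = new ArrayList<>();
        int pos = 0;
        while (pos < scanRecord.length) {
            int len = scanRecord[pos] & 0xFF;
            if (len == 0) break;                          // zero length terminates the record
            if (pos + 1 + len > scanRecord.length) break; // truncated structure, stop
            int type = scanRecord[pos + 1] & 0xFF;
            byte[] data = new byte[len - 1];
            System.arraycopy(scanRecord, pos + 2, data, 0, len - 1);
            result.add(new AdStructure(type, data));
            pos += 1 + len;
        }
        return result;
    }
}
```

With this, the sensor data is located by walking the parsed structures and picking the one with type 0xFF, instead of assuming a fixed offset into scanRecord.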



Connect your Android to the real world with Bluetooth Low Energy

2014-11-24T07:22:33.330+01:00

This is my presentation at the Londroid IoT meetup, 19 November 2014.


Connect your Android to the real world with Bluetooth Low Energy from Gabor Paller



Motor boat control with Bluetooth Low Energy

2015-11-13T07:17:30.406+01:00

My previous post about Bluetooth Low Energy applications with RFDuino and Android presented a connectionless gas sensor. That prototype was based solely on BLE advertisements; no connection was built between the scanner device (Android phone or tablet) and the sensor. While this connectionless operation is advantageous for sensors that just broadcast their measurement data, more complex scenarios that e.g. require authentication or build a communication session cannot be implemented in this model. The prototype I am going to present in this post demonstrates connection-oriented operation between RFDuino and an Android application.

Watch this video to see what the application is about: https://www.youtube.com/embed/IWID35cOLZg

The story behind this motor boat project is that I bought this RC-controlled model boat while I worked in the UK. But when we moved back to Hungary, I lost the RC controller. So the boat had been unused for years until I realized how great it would be to use an Android device as a controller. Hence I quickly integrated the RFDuino with the motor boat's original control circuitry and wrote the necessary software. As you can see in the video, it has quite respectable range, even though I did not dare go into the October water of Lake Balaton where the second part of the video was shot (water temperature: some 10 degrees centigrade).

First about the "hardware". I did not have the circuit schematic of the original RC controller in the boat, so I had to experiment a bit. By following the motors' cables I quickly found two three-legged stocky elements that looked like switching transistors (although the labels on them were no longer readable after all those years in service).
I removed one end of the resistors that I thought connected the bases of these transistors to the rest of the RC control circuit and tried out how much current is needed to switch on the motors. To my pleasant surprise, 1 mA was enough, so I rather believe that these are actually not transistors but power-switching ICs. Anyway, RFDuino outputs can provide 1 mA of switching current, so I just connected the other ends of those removed resistors to two spare RFDuino I/O ports. Lo and behold, it worked: if RFDuino raises either of these pins to 1, the respective motor starts. One minor additional problem was the power supply of the RFDuino. The motor boat employs a 7.2 V battery and RFDuino needs 3.3 V. I added an LM1117-3.3V power regulator circuit between the battery and the RFDuino, and the "hardware" was ready.

Do you know about BLE concepts like services and characteristics? If not, please read this presentation for a quick introduction. In short: BLE services (also called GATT profiles) are composed of characteristics, which are key-value pairs decorated by meta-information that the BLE specification calls descriptors. RFDuino with its default firmware is not able to implement any standard GATT profile, only its own custom GATT profile. This is a major disadvantage in product-level development but makes RFDuino code super-easy, because the programmer does not have to deal with BLE details. In the RFDuino custom service, a "read" and a "write" characteristic are defined. Whatever the client (in our case, the Android application) writes into the "write" characteristic appears for the RFDuino code as an incoming data callback. If the RFDuino code calls the RFduinoBLE.send(v) method, the data appears in the "read" characteristic. The Android client can register a callb[...]
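The control path described above boils down to mapping a byte received in the "write" characteristic to the two motor pins. A minimal model of that dispatch in plain Java; the command encoding below is entirely my own assumption for illustration, not taken from the project sources:

```java
public class BoatControl {
    // Hypothetical one-byte commands sent over the "write" characteristic.
    public static final byte STOP = 0, LEFT = 1, RIGHT = 2, FORWARD = 3;

    public boolean leftMotor, rightMotor; // true = GPIO pin raised, motor running

    // Plays the role of the incoming-data callback on the RFDuino side.
    public void onReceive(byte command) {
        leftMotor  = (command == LEFT)  || (command == FORWARD);
        rightMotor = (command == RIGHT) || (command == FORWARD);
    }
}
```

Whichever single-motor command is used for turning is a design choice of the real firmware; the point is that each received byte deterministically sets the two output pins.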



Award for our Android Gas Sensor

2014-10-07T21:21:28.319+02:00

I just got a mail saying that our gas sensor entry (a gas sensor with Bluetooth Low Energy connectivity and the associated Android application) has won 3rd place in the We Know RFDuino contest. Thanks to everyone who viewed our video and thus helped us to compete successfully! Meanwhile, the source code of the prototype has been made open source, so you may want to check that out too!



Gas sensor prototype explained

2014-09-29T21:22:52.123+02:00

The "We know RFDuino" contest has not ended yet, but its end is sufficiently close that I can explain our prototype application. Our entry is a Bluetooth Low Energy-connected gas sensor and it is presented in the video below. Make sure that you watch it; you will help us win the competition.

https://www.youtube.com/embed/iYWIzbJK81U

The prototype demonstrates a unique capability of Bluetooth Low Energy device advertisement messages: you can embed user data into these broadcasts. This comes in handy if you just want to send out some measurement data to whoever cares to listen, without creating a session between the BLE client and server. This broadcast-type data transfer may support an unlimited number of clients with very low energy consumption on the sensor side.

Click here to download the Android client application project.
Click here to download the RFDuino source code.

The prototype works like the following. The microcontroller presented in the video measures the Lower Explosion Limit and sends this value to the RFDuino microcontroller over a super-simple serial protocol. A message of this protocol looks like this:

0xA5 <seq_no> <LEL%>

where seq_no is an increasing value and LEL% is the measured Lower Explosion Limit value. The microcontroller code is not shared here but you get the idea. The RFDuino code receives the LEL% value over the serial port it creates on GPIO pins 3 and 4, creates a custom data structure for BLE advertisements consisting of the site ID and the LEL% value, then starts advertising. This is performed cyclically, so the LEL% value in the sensor's BLE advertisement is updated every second.

Now let's see what happens on the Android side. This is a non-trivial application with multiple activities, but the Real Thing (TM) happens in the MapScreenActivity, in the onLeScan method.
This method is called every time the Android device's BLE stack discovers a device. In this case we check whether the device's name is "g" (this is how we identify our sensor) and we retrieve the LEL% data from the advertisement packet. We also use the Received Signal Strength Indicator (RSSI) value for proximity indication. Bluetooth device discovery is restarted every 2 seconds so that we can retrieve the latest LEL% value. The rest is just Plain Old Android Programming.

The identification of the sensor and the encoding of the sensor data are obviously very naive, but that is not really the point. You can make it as complex as you like; e.g. you can protect the sensor data with a hash and place that hash also into the advertisement, so that the receiver can make sure that it gets data from an authorized sensor and not a fake one. The important thing is that the entire framework is sufficiently flexible that relatively complex functionality can be implemented, and RFDuino really simplifies sensor programming a lot.

If you enjoyed the example application, make sure you watch the video (many times if possible :-)) and if you happen to be in London on 19 November 2014, you might as well come to the Londroid meetup where I present this and another BLE project (a connection-oriented one, called MotorBoat).[...]
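The serial frame described earlier (a 0xA5 header followed by seq_no and the LEL% value) is easy to parse on the receiving side. A sketch, assuming one byte per field, which is my reading of the post rather than the (unshared) microcontroller sources:

```java
public class GasFrameParser {
    public static final int HEADER = 0xA5;

    // Parses one frame of the sensor's serial protocol: 0xA5, seq_no, LEL%.
    // Returns {seqNo, lelPercent}, or null if the frame is too short or
    // does not start with the header byte.
    public static int[] parse(byte[] frame) {
        if (frame.length < 3 || (frame[0] & 0xFF) != HEADER) return null;
        return new int[] { frame[1] & 0xFF, frame[2] & 0xFF };
    }
}
```

The sequence number lets the receiver notice dropped frames; the RFDuino side only needs the LEL% byte, which it copies into the advertisement payload.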



Camera shot on charger connection

2014-09-06T20:25:48.692+02:00

Somebody came to me with the idea of turning a cheap Android phone into an automatic camera: some external sensor would send a signal to the phone and the phone would take a picture automatically. While discussing possible ways of connecting the external sensor, an interesting idea came up: the charger connection.

Android delivers an event whenever the charging power is connected or disconnected: can it be used to send a binary signal to an application in a very simple way, without fiddling with Bluetooth or USB?

Click here to download the example application.

You have to start the application once. Then whenever you connect the charger, it takes a picture. When the application is in the foreground, a preview is shown but as long as the application is active (not destroyed) it works from the background too.

Here are the experiences:

  • On my high-end device the application reacted quickly to charger connection, the reaction time from connecting the charger to the camera shot was less than a second. But when the application was tested on the very low-end Android target device, the picture was much less rosy: the delay increased to 3-4 seconds, effectively making the solution unusable.
  • In order for this application to work, it has to be started at least once manually. This pretty much kills all unattended use cases.
  • The shutter sound is almost impossible to remove. Update: on certain devices (Nexus 4 and Nexus 7 confirmed) there is no shutter sound in silent mode.
The takeaway for us was to reject the idea. But I share the example program anyway, maybe it can be useful for somebody.





Android gas sensor application with Bluetooth Low Energy/RFDuino

2014-09-02T22:33:22.243+02:00

I have always had a fascination with sensors linked up with mobile devices, so a competition seemed a good opportunity to try out the latest fashionable technology in the area, Bluetooth Low Energy. SemiconductorStore.com announced the "We know RFDuino" competition for applications of the RFDuino module. RFDuino is an Arduino module with Bluetooth Low Energy (BLE) support. It is ideal as an interface between a sensor and a BLE-enabled mobile device like the Nexus 7.

Eventually I will publish the entire source code of this prototype application on this blog. But as this is a contest, I will wait until the contest ends (Sept. 30). Till then, watch the (very amateurish) video we have prepared about our sensor and the Android application. The entry with the most views wins the contest so if you like the concept, share the video with others! Thanks in advance. :-)

(Video: https://www.youtube.com/embed/iYWIzbJK81U?feature=player_embedded)




Beyond RenderScript - parallelism with NEON

2015-11-13T07:20:37.578+01:00

My last post about the parallel implementation of the Dynamic Time Warping (DTW) algorithm was a disappointment. The RenderScript runtime executed the parallel implementation significantly slower than the single-core implementation (also implemented with RenderScript). It turned out that parallelizing the processing of 10000-50000 element vectors on multiple cores was not worth the cost of multi-threaded processing and all the overhead that comes with it (threads, semaphores, etc.). Each core must be allocated a significantly larger workload, but our DTW algorithm cannot generate such large, independent workloads because rows of the DTW matrix depend on each other. So in order to exploit RenderScript multi-core support, it is best to have an algorithm where the output depends only on the input and not on some intermediate result, because this type of algorithm can easily be sliced up across multiple cores.

It would have been such a waste to discard our quite complicated parallel DTW algorithm, so I turned to other means of parallel execution. Multi-core is one option, but the ARM processors in popular Android devices have another parallel execution engine, internal to the core: the NEON execution engine. One NEON instruction can process four 32-bit integers in parallel (see picture below). Can we speed up DTW fourfold with this option?

NEON is actually quite an old technology; even the Nexus One was equipped with it. It is therefore much more widely deployed than multi-core CPUs. While ordinary applications can take advantage of multi-core CPUs (e.g. two processes can execute in parallel on two cores), NEON programs are difficult to write. Although some compilers claim the ability to generate NEON code and template libraries are available, the experience is that the potential performance benefits cannot be exploited without hand-coding in assembly, and that's not for the faint-hearted.

The example program can be downloaded from here.

The relevant functions are in jni/cpucore.c. There are three implementations, processNativeSlow, processNative and processNativeNEON, each progressively more optimized than the previous one. processNativeSlow and processNative are in C; in processNativeNEON the most time-critical loop (the "tight loop") is implemented entirely in mixed ARM/NEON assembly. This tight loop produces four result elements in parallel, so we expect a huge performance gain over the single-core RenderScript implementation (dtw.rs).

The experience is completely different. While the NEON implementation is significantly faster on small datasets, one second of voice is 8000 samples, so data sizes grow quickly. On 10-second datasets (80000 samples, a 6.4 billion element DTW matrix) the simple nested-loop C99 implementation and the complex, hard-to-understand NEON implementation produce about the same execution time.

How is this possible? Let's take the example of 10-second reference and evaluation samples. This means 80000 elements, i.e. 80000*80000 = 6.4 billion values to calculate. Calculating each value requires accessing 20 bytes: two input samples (2 bytes each), three neighbor cells (4 bytes each) and storing the result (4 bytes). A1 SD Bench measures 800 Mbyte/sec copying performance on my Galaxy Nexus (and similar values on the two cheap Android tablets the family has), which obviously means two accesses (one read and one write). For simplicity, let's assume that reads and writes take about the same time. This me[...]
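The bandwidth argument above can be checked with a few lines of arithmetic. The 80000 samples, 20 bytes per cell and 800 Mbyte/sec copy figure come from the measurements in the post; the only assumption added here is that a copy benchmark exercises one read plus one write per byte moved.

```java
// Back-of-the-envelope check of the memory-bandwidth argument.
public class BandwidthEstimate {
    // Estimated wall-clock time if the DTW computation ran purely at
    // memory speed. copyBandwidth is a *copy* figure, i.e. it already
    // includes one read and one write per byte moved.
    static double secondsAtMemorySpeed(long samples, long bytesPerCell,
                                       double copyBandwidth) {
        double cells = (double) samples * samples;    // full DTW matrix
        double totalBytes = cells * bytesPerCell;     // bytes touched in total
        double accessBandwidth = 2 * copyBandwidth;   // reads + writes per sec
        return totalBytes / accessBandwidth;
    }

    public static void main(String[] args) {
        // Figures from the post: 80000 samples (10 s of 8 kHz voice),
        // 20 bytes per cell, 800 Mbyte/s measured with A1 SD Bench.
        double s = secondsAtMemorySpeed(80000, 20, 800e6);
        System.out.printf("~%.0f seconds just to move the data%n", s);
        // prints: ~80 seconds just to move the data
    }
}
```

With 128 GB touched against roughly 1.6 GB/s of combined read/write bandwidth, the computation is memory-bound, which is consistent with the NEON and plain C versions finishing in about the same time.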



RenderScript in Android - the parallel version

2015-11-13T09:21:06.291+01:00

In the previous post I promised to revisit the parallel case. The big promise of RenderScript is to exploit parallelism among the different CPUs, GPUs and DSPs in the device at no additional cost. Once the algorithm is properly transformed into a parallel version, the RenderScript runtime grabs whatever computing devices are available and schedules the subtasks automatically.

The problem with DTW is that it is not so trivial to parallelize. Each cell in the matrix depends on the cells at (x-1,y), (x-1,y-1) and (x,y-1) (provided that the cell to calculate is at (x,y)). By traversing the matrix horizontally or vertically, only two rows (one horizontal and one vertical) can be evaluated in parallel.

Michael Leahy recommended a paper that solves this problem. This algorithm traverses the matrix diagonally. Each diagonal row depends on the two previous diagonal rows, but cells in one diagonal row don't depend on each other. One diagonal row at a time can then be fed to RenderScript to iterate over. The picture below illustrates the concept.

The example program can be downloaded from here.

You will notice that there are two parallel implementations. findReferenceSignalC99Parallel() is the "proper" implementation that closely follows the RenderScript tutorial. Here the diagonal rows are iterated in Java and only the parallel kernel is implemented in RenderScript. This version, even though it is functional, is not invoked by default because it delivers completely unacceptable performance on my 2-core Galaxy Nexus. By looking closely at the execution times, I concluded that even though RenderScript runtime invocations (copying into Allocations and invoking forEach) are normally fast, sometimes very innocent-looking invocations (like copying 5 integers into an Allocation) can take about a second. This completely ruined this implementation's performance.

The other parallel implementation, the one that is actually invoked and whose performance is compared to the 1-core RenderScript implementation (the fastest one), is findReferenceSignalC99ParallelRSOnly(). This version is implemented entirely in RenderScript. Unfortunately its performance is 2-2.5 times slower than the 1-core implementation. How can that be?

First, if you compare dtw.rs and dtwparallel2.rs, you will notice that the parallel implementation is considerably more complex. Indexing out those varying-length diagonal rows takes a bit of fiddling, while the 1-core implementation can take advantage of fast pointer arithmetic to move from cell to cell sequentially. So the parallel implementation starts with a handicap, and this handicap is not compensated by the 2 cores of the Galaxy Nexus.

OK, the Galaxy Nexus is the stone age, but what happens on a 4-core processor like the one in the Nexus 4? The runtime does launch with 4 cores, but then the Adreno driver kicks in and the result is that the parallel implementation is about 3 times slower than the serial one. What happens in the driver, I don't know; as far as I can see, the source code is not available. Jason Sams recommended disabling the GPU driver with

    adb shell setprop debug.rs.default-CPU-driver 1

but I decided to stop my adventures here. The conclusion I drew for myself is that RenderScript in its present form is not ready for parallel programming. Clang-LLVM is a very promising compilation technology but the parallel runtime suffers from a number of problems. IMHO, there should be a way to programmatically control the way the wor[...]
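The diagonal traversal idea can be sketched in plain Java. This is only an illustration of the iteration order, using a minimal |a[i]-b[j]| cost recurrence rather than the project's actual dtwparallel2.rs: all cells with the same i+j form one anti-diagonal, depend only on the two previous diagonals, and could therefore be handed to a parallel kernel as one batch.

```java
// Sketch of the diagonal ("wavefront") DTW traversal: cells on one
// anti-diagonal are independent of each other, so the inner loop over i
// is the part that a parallel runtime could execute concurrently.
public class DiagonalDtw {
    // Classic row-by-row DTW over |a[i]-b[j]| costs, for reference.
    static int[][] dtwSequential(int[] a, int[] b) {
        int n = a.length, m = b.length;
        int[][] d = new int[n][m];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < m; j++)
                d[i][j] = Math.abs(a[i] - b[j]) + best(d, i, j);
        return d;
    }

    // Same recurrence, visiting the matrix anti-diagonal by anti-diagonal.
    static int[][] dtwDiagonal(int[] a, int[] b) {
        int n = a.length, m = b.length;
        int[][] d = new int[n][m];
        for (int k = 0; k <= n + m - 2; k++) {        // one wavefront per k
            int iStart = Math.max(0, k - m + 1);
            int iEnd = Math.min(k, n - 1);
            for (int i = iStart; i <= iEnd; i++) {    // parallelizable loop
                int j = k - i;                        // cell (i,j) on diagonal k
                d[i][j] = Math.abs(a[i] - b[j]) + best(d, i, j);
            }
        }
        return d;
    }

    // Minimum of the three already-computed neighbors (boundary-aware):
    // they all lie on diagonals k-1 and k-2, so the wavefront order is valid.
    static int best(int[][] d, int i, int j) {
        if (i == 0 && j == 0) return 0;
        if (i == 0) return d[i][j - 1];
        if (j == 0) return d[i - 1][j];
        return Math.min(d[i - 1][j - 1], Math.min(d[i - 1][j], d[i][j - 1]));
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 4, 9};
        int[] b = {1, 2, 4, 8, 9};
        int[][] seq = dtwSequential(a, b);
        int[][] diag = dtwDiagonal(a, b);
        System.out.println(java.util.Arrays.deepEquals(seq, diag));
        // prints: true
    }
}
```

Both orders fill the matrix identically; the difference is purely which cells become available for concurrent computation, which is exactly what the "varying-length diagonal rows" fiddling in the RenderScript version pays for.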



RenderScript in Android - anatomy of the benchmark program

2015-11-13T09:25:53.554+01:00

In the previous post I presented our RenderScript benchmark and demonstrated that a RenderScript implementation of the same algorithm can be 2-3 times faster than Java. How can a "script" be so fast? In order to understand this speed difference, let's see how the RenderScript fragment is executed.

The example program is available here.

First, let's see what the script looks like. The source can be found in dtw.rs, in the same directory as the other Java sources (just one file in this case). It looks like an innocent C function but there are some specialties. Global variables like these:

    int32_t s2len = 0;
    int32_t *d0;

can be used to pass data to the script. The toolchain generates a Java wrapper for each .rs file; ours is called ScriptC_dtw.java. In order to set s2len, for example, one calls the set_s2len function of the ScriptC_dtw class. The d0 global variable is a pointer type; setting this variable requires an Allocation Java object. Open MainActivity.java and look up the findReferenceSignalC99() method. There you will find:

    Allocation signal1Allocation = Allocation.createSized(
            rsC,
            Element.I16(rsC),
            refSignal.length);
    signal1Allocation.copyFrom(refSignal);
    script.bind_signal1(signal1Allocation);

Here we created an allocation that holds 16-bit integers, copied the input signal into it and bound the allocation so that its data is available to the script. When the script is invoked, the data area of this allocation is simply available to the script as:

    int16_t *signal1;

This sort of parameter passing is one-way for simple values but two-way for allocations. So whatever you write in your script into e.g. s2len, you won't be able to read in the Java layer after the script finishes executing. In contrast, Allocations provide two-way data transfer; that's why the result value is passed back to Java in an Allocation.

In MainActivity.java:

    Allocation rAllocation = Allocation.createSized(
            rsC,
            Element.I32(rsC),
            1);
    script.bind_r(rAllocation);

In dtw.rs:

    *r = d1[s1len-1];

And again in MainActivity.java, after the script has finished executing:

    int result[] = new int[1];
    rAllocation.copyTo(result);
    ...
    int maxc = result[0];

The execution of the script seems simple enough but there's more than meets the eye. The execution context and the script instance are created:

    RenderScript rsC = RenderScript.create(this);
    ScriptC_dtw script = new ScriptC_dtw(
            rsC,
            getResources(), [...]



RenderScript in Android - Java optimization in the benchmark program

2015-11-13T09:26:34.043+01:00

I got into a discussion with Michael Leahy regarding the benchmark program posted at the end of the previous post. Michael claimed that the Java implementation of the benchmark algorithm in my test program was suboptimal, and he contributed an optimized version that he claimed could deliver 70% better performance.
Therefore I decided to re-evaluate the benchmark results, but with a twist: not only did I add Michael's optimized Java implementation, I optimized the RenderScript implementation too. The results are the following.
  • Michael's implementation did improve the execution time of the Java implementation by about 2.6 times.
  • RenderScript implementation is still about 2.3 times faster than Michael's optimized Java implementation.
The new version is available here. I will continue with explaining this example program in the next post.