Just do a Google search for "Android Gingerbread" and you will see a host of results of this pattern:
Here is what Android has to say in the official Gingerbread release notes.
That is quite a mouthful. But stripping off the marketing spiel, we can say:
- Gingerbread provides sensor support to native C/C++ apps.
- Gingerbread provides more accurate and precise sensor data.
- Gingerbread provides APIs to recognise complex user gestures.
- Gingerbread supports the gyroscope and barometer sensors.
Real sensors map 1-to-1 to actual hardware. Virtual sensors, on the other hand, export data to apps that is derived by performing calculations on the readings of two (or more) real sensors.
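To make the real-vs-virtual distinction concrete, here is a minimal sketch of a "virtual" orientation sensor: it fuses readings from two real sensors (accelerometer and magnetometer) into a compass azimuth, similar in spirit to what `SensorManager.getRotationMatrix()` and `getOrientation()` do on Android. The math is plain vector algebra, so it is shown here as standalone Java; the sample vectors in the test are hypothetical readings, not real hardware data.

```java
// Sketch of a virtual sensor: a compass azimuth derived by combining
// two real sensors -- a gravity (accelerometer) reading and a
// geomagnetic (magnetometer) reading -- both given in device coordinates.
public class VirtualOrientation {

    // Cross product of two 3-vectors.
    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static double[] normalize(double[] v) {
        double n = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new double[] { v[0] / n, v[1] / n, v[2] / n };
    }

    // Azimuth (rotation about the vertical axis, in radians) derived
    // from a gravity vector and a geomagnetic vector.
    static double azimuth(double[] gravity, double[] geomagnetic) {
        double[] h = normalize(cross(geomagnetic, gravity)); // points East
        double[] a = normalize(gravity);
        double[] m = cross(a, h);                            // points North
        // With rotation-matrix rows H, M, A, azimuth = atan2(H[1], M[1]).
        return Math.atan2(h[1], m[1]);
    }

    public static void main(String[] args) {
        // Hypothetical sample: device flat, facing magnetic north.
        double az = azimuth(new double[] {0, 0, 9.81},
                            new double[] {0, 22, -42});
        System.out.printf("azimuth: %.3f rad%n", az);
    }
}
```

Neither the accelerometer nor the magnetometer alone can tell an app which way the device is facing; only the combination can, which is exactly why such derived streams are called virtual sensors.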
While points [1], [2], [3] do mention tremendous improvements over FroYo, none of them has anything to do with new sensors. Moving on to [4], we see the first mention of supposedly new sensors. But here are two things that most people overlook:
- Both the gyroscope and barometer (i.e. pressure) sensors were already available in previous releases of Android.
- The 4 "NEW" sensors (shown alongside) are just wrapper APIs around existing hardware; they simply provide "easy-to-digest" data.
These newly introduced wrapper APIs process the raw sensor data into a format that is ready for Android apps to use. This proves especially useful for apps doing advanced 3D math. ( read Games ;) )
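One of those wrappers, the Gingerbread rotation-vector sensor, illustrates the "ready to use" point well: it reports `(x, y, z) = axis * sin(theta/2)`, i.e. the vector part of a unit quaternion, which a game can expand into a 3x3 rotation matrix (as `SensorManager.getRotationMatrixFromVector()` does) and hand straight to its 3D pipeline. A minimal sketch of that expansion, with a hypothetical sample value:

```java
// Expand a rotation-vector sample into a row-major 3x3 rotation matrix,
// recovering the quaternion's scalar part w from the unit-norm constraint.
public class RotationVectorDemo {

    static double[] toRotationMatrix(double[] rv) {
        double x = rv[0], y = rv[1], z = rv[2];
        double w = Math.sqrt(Math.max(0, 1 - (x * x + y * y + z * z)));
        return new double[] {
            1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y),
            2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x),
            2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)
        };
    }

    public static void main(String[] args) {
        // Hypothetical sample: a 90-degree rotation about the Z axis,
        // so the rotation vector is (0, 0, sin(45 degrees)).
        double s = Math.sin(Math.PI / 4);
        double[] r = toRotationMatrix(new double[] {0, 0, s});
        System.out.printf("%.1f %.1f %.1f%n", r[0], r[1], r[2]);
    }
}
```

Without this wrapper, every game would have to reimplement this quaternion math (and the underlying sensor fusion) itself.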
The gyroscope sensor was supported in FroYo, and so was the barometer. Since those were early days for Android sensors, not much attention was paid to them; they may even have been added as an afterthought to the existing array of accelerometer, compass, and orientation sensors.
With Android apps ( again read Games ;) ) really pushing sensors to the limit, the limitations of a "pure" accelerometer device became evident.
Apart from the rudimentary API that Android provides, the Invensense Motion Processing Library (MPL) sits alongside the sensor HAL and provides a feature-rich API to obtain gesture, glyph, and pedometer data from the sensors.
All this data is derived from a combination of the accelerometer, gyroscope, and compass hardware modules. The MPL processes each sensor's data and combines them appropriately to overcome the individual limitations of each sensor, providing an overall stream of more precise and accurate (processed) samples. Advanced operations such as pattern matching and counting are also done by the MPL, so any app can directly obtain data pertaining to gestures or step counts (pedometer) etc. using the MPL APIs.
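The MPL itself is proprietary, but the core idea of combining sensors to cancel out their individual weaknesses can be sketched with a simple complementary filter: the gyroscope gives smooth, fast tilt changes but drifts over time, while the accelerometer gives a drift-free but noisy tilt reference. Blending the two yields a better angle than either sensor alone. This is a generic illustration, not the MPL's actual algorithm, and all constants and samples below are made up.

```java
// Generic complementary filter fusing a gyroscope rate with an
// accelerometer-derived tilt angle (illustrative, not the MPL algorithm).
public class ComplementaryFilter {
    private final double alpha;   // weight of the gyro path (e.g. 0.98)
    private double angle;         // fused tilt angle, in radians

    ComplementaryFilter(double alpha) { this.alpha = alpha; }

    // gyroRate: angular velocity (rad/s); accelAngle: tilt computed from
    // the accelerometer (rad); dt: time since the last sample (s).
    double update(double gyroRate, double accelAngle, double dt) {
        // Integrate the gyro for short-term accuracy, then pull the
        // result toward the accelerometer estimate to cancel drift.
        angle = alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle;
        return angle;
    }

    public static void main(String[] args) {
        ComplementaryFilter f = new ComplementaryFilter(0.98);
        // Hypothetical scenario: device held still at a 0.5 rad tilt.
        // The gyro reads a small spurious drift; the accelerometer reads
        // a noisy-but-unbiased 0.5. The fused angle converges to ~0.5.
        double fused = 0;
        for (int i = 0; i < 2000; i++) {
            fused = f.update(0.001 /* drift */, 0.5, 0.01);
        }
        System.out.printf("fused tilt: %.3f rad%n", fused);
    }
}
```

Step counting and gesture recognition then become pattern matching on top of such cleaned-up streams, which is exactly the layer the MPL exposes through its APIs.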
Here is a "short" video by David Sachs (Invensense) which explains the advantages of the Invensense MPL extensions on Android...
To conclude, one can say that the sensor subsystem has undergone a huge overhaul in Gingerbread, and one can only hope that what it delivers is well worth all the hyped-up expectations.