posted by Bryon Moyer
I’ll round out the last of the things that caught my attention at this year’s ISSCC with a proposal and implementation of an AC-biased microphone. The work is motivated by projections that the biasing resistor for a traditional DC approach will head into ridiculously high territory – teraohms and higher.
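To get a feel for why the numbers get so large, consider the R-C high-pass corner formed by the bias resistor and the tiny MEMS sense capacitance: the corner (and the resistor’s noise) must sit well below the audio band. The capacitance and corner frequency below are illustrative assumptions of mine, not figures from the paper, but they show how a picofarad-scale capacitor pushes the resistor into teraohm territory.

```python
import math

def bias_resistor(c_farads, f_corner_hz):
    """Resistor value that puts the R-C high-pass corner at f_corner_hz."""
    return 1.0 / (2 * math.pi * f_corner_hz * c_farads)

# Illustrative values (my assumptions, not from the article):
C = 1e-12     # ~1 pF MEMS sense capacitance
f_lo = 0.1    # corner pushed far below the 20 Hz - 20 kHz audio band
              # so the resistor's noise stays out of band

R = bias_resistor(C, f_lo)
print(f"R = {R:.2e} ohms")   # ~1.6e12 ohms, i.e. teraohms
```

With a corner at 20 Hz instead, the same capacitance would “only” demand about 8 GΩ; it is the need to keep the resistor’s noise contribution far below the audio band that drives the value into teraohms.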
The team, from NXP and Delft University, lists a number of problems that this causes.
- The connection between the MEMS die and the ASIC can easily pick up stray electrical noise due to its high impedance, meaning expensive packaging is required to shield this node.
- A poly resistor of this value would be enormous; instead, active devices biased below their turn-on voltages are used. But leakage currents from neighboring ESD structures can find their way through these, ultimately increasing noise.
- They say that chopping can’t be used to reduce flicker noise because of the extra input current the switching would cause; increased transistor sizes are needed instead.
- The on-chip bias generator will typically be a charge pump, and ripple noise could push the shut-off active devices used as resistors to turn on slightly; therefore large filtering caps are needed.
Their approach is differential, and they modulate the signal while cancelling out the carrier using cross-coupling caps; there are, in fact, three caps that have to be tuned to match the microphone sensing cap, and they have an 11-bit register for each of them.
Critically, feedback resistors are used to set the common-mode bias level; that, plus the fact that the modulation keeps their contribution to in-band noise low, means that resistor values can be brought back down well below a gigaohm.
While you might expect the increased complexity to make the ASIC larger, in fact quite the reverse is true (presumably due to smaller components): the ASIC is 1/12 the size of the current state of the art. Expensive shielding is also no longer required to reject external noise.
They weren’t overwhelmed by the SNR they achieved, in the 58–60-dB range, but they commented that, with some focus, they could easily get to 64–65-dB levels.
For those of you with the proceedings, you can get much more detail in session 22.2.
posted by Bryon Moyer
Not long ago, in our coverage of 3D vision, we discussed time-of-flight as one of the approaches to gauging distance. Even though it and the other 3D vision technologies are gunning for low-cost applications, it’s easy, at this point, to view them as exotic works in progress.
Well, time of flight is now being put to use for the most prosaic of duties: making sure your cheek doesn’t accidentally hang up on you.
Of course, our phones already have this feature via their proximity sensor, installed specifically for this purpose. It detects when the phone is near the face and shuts down the touchscreen, both saving power and rendering it immune to the random input it would otherwise get as it hit your cheek now and again.
As STMicroelectronics sees it, however, the existing way of judging proximity leaves something to be desired. Right now, it’s a simple process of sending light out and measuring how much gets reflected back, a method that can depend on a lot of factors besides proximity. How often such sensors fail isn’t clear to me, but ST has come forward with a new approach: using time of flight to measure how long it takes the light (regardless of the quantity of light) to make a round trip.
They do this by co-packaging an IR LED emitter, an “ultra-fast” light detector, and the circuitry needed to calculate the distance from the measurements. The package also contains a wide-dynamic-range ambient light sensor.
Is all of that needed just to keep your phone from getting too cheeky? Well, it’s clear that that’s simply the “marquee” function they address. On the assumption that you can do a lot more interesting stuff if you can measure with reasonable accuracy how far away something is (as opposed to a more binary near/far assessment), they’re betting that phone makers will want to include it so that both they and enterprising app writers can come up with all kinds of interesting new things to do. It changes the class of apps it can manage from digital to analog (in the sense I defined them when discussing accelerometer applications).
Used in such other applications, they’re targeting a distance range of up to 100 mm (about 4 inches for those of us who grew up with non-metric intuitions). They think it will work beyond that, but they’re not committing to more at this time.
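The arithmetic behind the measurement is simple: the light travels out and back, so the distance is the speed of light times the round-trip time, divided by two. A quick sketch (my own, just to put numbers on it) also shows why the detector has to be “ultra-fast”: at the 100 mm range, the whole round trip takes well under a nanosecond.

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def distance_from_tof(round_trip_s):
    """Distance implied by a measured round-trip time of flight."""
    return C_M_PER_S * round_trip_s / 2.0

def tof_from_distance(d_m):
    """Round-trip time for a target at distance d_m."""
    return 2.0 * d_m / C_M_PER_S

t = tof_from_distance(0.100)                    # target at 100 mm
print(f"round trip for 100 mm: {t * 1e12:.0f} ps")  # prints ~667 ps
```

Resolving millimeter-scale differences at these ranges means distinguishing timing differences of a few picoseconds, which is what makes the co-packaged detector and timing circuitry the hard part of the product.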
You can find more info in their release.
posted by Bryon Moyer
Smart meters represent a major infrastructural change in our energy-delivery industry. The rollout has gone smoothly in some places and has met with fierce resistance in others.
- The promise is that we will receive more detailed information about how we use power, allowing us to make smarter decisions.
- The gray zone is that the power companies can actually come in and adjust our energy usage once a full smart grid is in place.
- And the dark zone is that the energy companies are making money selling the personal data so that marketers can mine it.
For now, the only benefit that we can see outright is the first one. And yesterday I had the opportunity to see the results of all of the hard work that has gone into this new technology. It probably wasn’t the first report I’d gotten, but it was the first one I paid attention to: the new, smarter update from PG&E that provides more detailed information about my power usage. (I haven’t explored the online version yet.)
So I took a minute to see what I could learn. And, aside from comparing my usage to other households, the big thing it offered up was to identify what times of the day my power usage was highest. This might let me learn, for instance, that I was doing the laundry when everyone else was, and I could shift that around to even things out. This could have monetary impact if and when they start demand-based pricing.
So I studied the chart and the summary with great anticipation. What was my peak usage period?
Drum roll please:
My peak usage is between 5AM and midnight.
Yup. I was so surprised. I actually use more power during the hours when I’m awake. I had no idea.
I may try to shift things around by doing most of my computer work at night while I’m asleep.
The hopeful news is that, if the marketers are just as clunky about studying my data details, then I have nothing to worry about.
Then there’s the possibility that this was never about me optimizing my power, but was always simply about getting data to the big data guys…