posted by Bryon Moyer
As noted in today’s article on some of the characteristics of the DDS data transport standard, it’s missing a rather important component: formalized security. Proprietary schemes have been layered on top of it, but the OMG has a beta standard that they’re now finalizing (a process that could take up to a year).
But that doesn’t stop early adoption. RTI has announced an implementation of the new OMG security standard for DDS – something likely made easier since, by their claim, they contributed much of the content of the standard.
There are a couple of particular challenges with respect to security on DDS. First, because of its decentralized nature, there are no brokers or single points of security (which would also be single points of failure). This means that each device or node has to handle its own security.
Second, DDS runs over many different transport protocols, some of which provide their own security and some of which don't. Because of that, you can't rely on the underlying transport for protection. This means adding DDS-level security (which may complement security at a lower level).
We usually think of security as protecting the privacy of a message so that only the intended receiver can read it. While this is true, RTI points out that, in many cases, the content isn’t really secret – you just want to be sure that it’s authentic. They use as an example a weather data transmission: you may not care if anyone else sees it, but you want to be sure you’re getting the real thing and not some spoofed message that’s going to send your boats out into the heart of a hurricane. (I hear that competition amongst fishermen is fierce!)
So RTI’s Connext DDS Security includes authentication, access control, encryption (using encryption standards), data tagging (user-defined tags), and logging.
Image courtesy RTI
If all you’re interested in is authentication, you can improve performance by taking a hash of the message (much faster than encrypting) and then encrypting only the hash (much smaller – hence quicker – than the entire message). Full encryption (needed to obscure the entire payload) can be 100 times slower.
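The general idea can be sketched in a few lines of Python. The actual DDS Security plugin mechanisms are defined by the OMG spec, and it signs rather than HMACs in some configurations; this is just an illustration of why authenticating a fixed-size hash is cheaper than protecting the whole payload (the key and message here are made up):

```python
import hashlib
import hmac

def tag_message(key: bytes, payload: bytes) -> bytes:
    # Hash the full payload (fast), then authenticate only the
    # fixed-size digest -- the receiver can check authenticity
    # without the cost of encrypting the entire message.
    digest = hashlib.sha256(payload).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

def verify_message(key: bytes, payload: bytes, tag: bytes) -> bool:
    # Constant-time comparison avoids leaking the tag via timing.
    return hmac.compare_digest(tag_message(key, payload), tag)

key = b"shared-secret"  # hypothetical pre-shared key
msg = b"wind 12 kt NE, seas 1 m, no storms"
tag = tag_message(key, msg)

assert verify_message(key, msg, tag)             # authentic message passes
assert not verify_message(key, msg + b"!", tag)  # tampered payload fails
```

Anyone can still read the weather report; they just can't forge it without the key.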
You can also customize your own encryption and authentication code if you wish.
They claim that this is the first “off the shelf” security package; the prior proprietary approaches ended up being written into the applications explicitly. Here it’s provided as a library for inclusion in the overall DDS infrastructure.
You can find more in their announcement.
posted by Bryon Moyer
Any new foundry would want to grow up to be a megalith like TSMC, right? Isn’t that how you prove you’ve “made it”? Well, not if you’re Novati. They’re a different sort of foundry, one you don’t hear about so often over the noise of the Big Guys.
Here’s the thing: when you’re in the foundry mainstream, you do one thing: you chase Moore’s Law and try to keep it going. You figure out what the masses want, and you trim everything extraneous away so that you can sate the masses in enormous volumes at competitive costs.
But what if you’re in the market for something that can’t be made using the techniques that suit the masses? That’s where smaller… ok, I’m going to use the dreaded word (investors: please cover your ears): niche players can find plenty of business, even if, by so doing, they can maybe achieve only kilolithic or decalithic status.
I met with them at Sensors Expo. Sensors are a typical opportunity for a more flexible fab, since they may use unusual techniques and materials, and each one may be slightly different, making it hard to put everyone onto one high-volume recipe.
Novati does CMOS and MEMS (particularly silicon microfluidics) – jointly and severally. When jointly, with both on the same wafer, they typically do MEMS-last, placing the MEMS elements above the CMOS circuitry. They can do this either by growing more silicon epitaxially over the CMOS or by stacking a separate wafer.
They also work on silicon photonics projects and 2.5D (silicon interposer) and 3D integration.
Most of what they do leverages a common set of equipment (largely for 200-mm wafers, with some 300-mm ones), but where the diversity really comes in is with materials. They can work with 60 different elements – far more than would be found in your average foundry.
Most foundries want to keep the number of elements they allow through the door to the absolute minimum. A new material, if not handled carefully, brings with it the risk of unexpected contamination with potentially calamitous results – something that’s just not worth messing with if you’re spinning oodles of wafers an hour.
But smaller guys need to be more flexible, and a willingness to work with more materials can be a boon to developers trying new ideas. Gold is the one element that Novati is particularly careful with: they segregate it in a separate room. For all the others, they study each one under consideration and develop specific protocols to ensure that the material goes only where they want it to – which may be limited to some nanolayer a few atoms thick laid down by atomic layer deposition (ALD) on a wafer.
Once a project gets to production volumes, they can handle it to an extent, but they may also hand off to a partner that can handle higher volumes. Of course, if the volume production involves odd materials, then they’ll need to work with someone willing to handle that material.
As with any business, there’s always opportunity on the fringes of the mainstream. In this case, they’re entertaining many of those opportunities; they’re just being careful not to step on Moore’s toes.
You can find out more on their site.
(Image courtesy Novati)
posted by Bryon Moyer
You may recall that PNI Sensor has a sensor hub called SENtral. It features an unusual partitioning between hardware and software intended to reduce power and size. Its focus was primarily motion-oriented sensors, which, at the time, were the bulk of what system designers were paying attention to.
Since then, Google has issued their sensor requirements for Android 4.4 (Kit Kat). It requires very specific sensors, some of which are actual physical sensors, and others of which are “virtual” sensors – fused out of data from the real sensors. A step counter is an example of a virtual sensor: there is no dedicated step-counting hardware in any device, but the information from the inertial sensors can be combined to create one.
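As a toy illustration of what “fusing” a virtual sensor means, here's a naive step counter built from raw accelerometer samples. Real implementations filter, debounce, and reject false positives far more carefully; the threshold and synthetic data below are assumptions for the sketch:

```python
import math

def count_steps(samples, threshold=11.0):
    """Count upward crossings of an acceleration-magnitude threshold
    (in m/s^2) -- a crude 'virtual' step counter fused from raw
    inertial data."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1      # rising edge: one step
            above = True
        elif mag <= threshold:
            above = False   # re-arm for the next step
    return steps

# Two synthetic impact spikes over the ~9.8 m/s^2 gravity baseline
data = [(0, 0, 9.8), (0, 0, 13.0), (0, 0, 9.8),
        (0, 0, 9.8), (0, 0, 12.5), (0, 0, 9.8)]
assert count_steps(data) == 2
```

The point is simply that the “step counter” is pure computation over other sensors' outputs, which is why the question of where that computation runs (AP, microcontroller, or dedicated hub) matters so much for power.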
So PNI Sensor has updated their SENtral hub to meet the Kit Kat requirements; they call it SENtral-K. It supports more sensors than their original version did, meeting the list that Google has sent down. Some of what the –K version does could have been done in the older one by adding new functions in the RAM space; this new version implements the functions in the ROM space.
One of their focuses is on what they call “simultaneity.” The idea is that it takes time to do the calculations required for the virtual sensors, and yet Android makes no allowances for virtual sensors being slower than real ones. Heck, it thinks it knows which sensors are real and which are virtual, but in fact it doesn’t. (For example, the gyroscope could be a “soft gyro.”)
What that means is, if you’re sampling your real sensors at 100 Hz, then Kit Kat expects all sensors – real or virtual – to be available at 100 Hz. Which means the calculations had better be fast enough to keep up. Yeah, they’re not rocket science, but we’re talking tiny platforms drawing as little power as possible, making the burden non-trivial.
That power is lowered by implementing many of the fusion algorithms in hardware. They claim to be the lowest-power option, at least compared with microcontroller-based sensor hubs, drawing under 200 µA at 1.8 V, which is 360 µW. That would appear to be higher than QuickLogic’s claimed 250 µW (yes, that’s for their wearable version, but it’s the same hardware as the Kit Kat version – just different libraries), but it’s an order of magnitude less than what they show for Cortex-based hubs.
The other Kit Kat requirement they meet is that of “batching.” In and of itself, that term isn’t particularly helpful, since I can imagine a number of ways of batching sensor data. A conversation with PNI’s George Hsu clarified Google’s intent, and it wasn’t one of the scenarios I had envisioned.
The idea is that the real sensors, from which all the virtual sensors are determined, should be buffered for some amount of time – like 10 s or so (there’s no hard spec on the time; it’s left to designers to do the right thing for their systems). If something goes wonky with the calculation and the application processor (AP) sees a sensor value that it finds suspect, it can actually go back to the original sensors, grab the historical raw data, and redo the calculations itself to confirm or correct the suspect values.
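This batching scheme amounts to a rolling window of raw samples. Here's a minimal sketch (the class name, rates, and API are my own invention, not PNI's or Google's) of a buffer that keeps the last ~10 s of data so the AP can pull history and redo a suspect calculation:

```python
from collections import deque

class SensorBatchBuffer:
    """Rolling buffer of raw sensor samples: keep roughly the last
    `seconds` worth at `rate_hz` so the AP can re-fetch history and
    recompute a virtual sensor it doesn't trust."""
    def __init__(self, rate_hz=100, seconds=10):
        # deque with maxlen silently drops the oldest sample on overflow
        self.buf = deque(maxlen=rate_hz * seconds)

    def push(self, timestamp, sample):
        self.buf.append((timestamp, sample))

    def history(self, since):
        # Raw samples from `since` onward, for redoing a fusion calculation
        return [(t, s) for t, s in self.buf if t >= since]

buf = SensorBatchBuffer(rate_hz=100, seconds=10)
for t in range(2000):                 # 20 s of samples at 100 Hz
    buf.push(t / 100.0, (0.0, 0.0, 9.8))

assert len(buf.buf) == 1000           # only the most recent 10 s survive
assert buf.history(since=19.5)[0][0] == 19.5
```

The fixed-size deque mirrors the hardware constraint: the buffer depth is whatever RAM the designer is willing to give it, and older raw data simply falls off the back.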
SENtral buffers five sensor streams: the accelerometer, the gyroscope (with and without bias correction), and the magnetometer (with and without offset correction). The buffer size is flexible; it uses RAM, so the available RAM must be allocated between buffers and any other functions using it.
Oh, and they go to pains to point out that this thing is small. (I’ve seen it; it’s small.)
Image courtesy PNI Sensor
You can find more in their announcement.