Not Just Software vs. Hardware
We recently took a look at Lattice’s approach to sensor hubs. We’ve seen many other ways of implementing sensor hubs in the past, but all of those were software-based; it was just a question of where the software executes. Lattice’s approach is hardware, and that raises all kinds of new questions.
The biggest red flag that it raises for me is that moving a task from software to hardware in the design phase is not trivial. (Trying to keep the task with the software guys by using tools that automatically generate hardware is, for the most part, a quixotic goal that seems largely to have been lovingly placed back on the shelf.) In my quest to figure this part out, I found that there’s more to the sensor hub world than all-software and all-hardware. And that makes the design question even more complex.
Taiwanese CPU Company is Happy to Keep Cool, Cash Checks
You know that feeling when you discover a great little restaurant that nobody else knows about? Or listen to a terrific band that’s flying under the radar?
That’s how the designers of a few hundred million SoCs must feel. They’ve discovered the Andes, a small 32-bit microprocessor core that sits in the middle of a burgeoning array of small-scale electronic devices. Once known only to the Asian cognoscenti, Andes is going global, including a push into the United States. Who knows – Andes may even be seen in South America before long.
A New Approach from KAUST
When you think of silicon, what characteristics come to mind? Soft and pliable? Or hard and edgy? Well, your basic silicon chip isn’t something you’d want floating free in your arteries as part of some health monitor; you’d want something covering it to smooth the edges.
Even the amorphous form of silicon’s oxide – glass – isn’t exactly associated with soft and gentle. Given that silicon chips are largely made of layers of silicon and silicon dioxide, well, it’s why we call them “chips” instead of “drops” or “pads” or something else suitably soft.
Indoor Location System Knows Where Your Treasure is Buried
As first-world problems go, losing your car keys is a bad one. Losing a whole warehouse full of shippable merchandise is probably worse, but warehouses typically have lots of people standing around watching over the goods. But what do you do when you’ve lost your car keys somewhere inside the warehouse? Needle, meet haystack. Not so easy now, huh? What are you going to do?
If you’ve planned ahead, you’d have little badges on your keys – badges stuffed with technology designed by DecaWave. The Dublin, Ireland startup has created an itty-bitty little chip it calls the ScenSor that’s designed to solve this problem, and much larger ones.
In the Short and Long Term
It’s time to take another look at the grid, yet another part of our world that is supposed to be getting smarter. And, for this update, there are two decidedly distinct aspects to address: the here-and-now – bits that can be used today, in particular for smart meters; and the yet-to-come – a look at some insights provided by Imec last month into their view of where things are going.
SoC me ASAP
The obvious main theme for what’s become available in the last few months has everything to do with SoCs and platforms for smart meters. Now, smart meters are, for some of us, old news. I’ve seen the battles, I’ve seen the chained-and-padlocked analog meters, and, well, all of that has disappeared from the headlines. Every house I’ve been in for the last several years has had a smart meter. So… we’re done with smart meters. Right?
Microchip’s New PIC32MZ Puts Progress in Perspective
Sometimes the smallest things can knock you back on your heels and make you go, “Wow.”
Sure, we work in an industry of constant innovation. Computers get faster all the time, software gets more impressive, Internet startups come and go… we thrive on change and “destructive creation.” But still, you look around sometimes and wonder how we got here.
Not that long ago, the computer world was in the midst of a big RISC-versus-CISC battle. Would those newfangled RISC machines overturn our ideas of processor design and software’s role in computer architecture? Whole companies were founded on the basis of new CPU instruction sets. Fortunes were won and lost. Research went into overdrive. Stanford University produced MIPS, while, barely 20 miles away across San Francisco Bay, academic and athletic rival UC Berkeley’s RISC project – the one that begat SPARC – joined the charge.
Imec/Delft Tool Manages Tradeoffs
It used to be pretty straightforward to figure out the cost of a finished IC. You had a linear progression of steps, each of which cost something to perform, and each of which might cause some fallout. In the end, your die cost was simply the sum total of all of those steps amortized over however many dice survived the whole process.
We’ll call creating a wafer a single step, even though, obviously, it’s enormously complex – and getting more so by the hour. But some number of the chips on the wafer (hopefully a lot) will be good. You then dice up the wafer into dice [typically referred to as “die” or “dies” in this industry, for some reason]. That will damage some of the erstwhile-good dice. You then take the remaining good dice and assemble them into packages, which entails yet further fallout. In the end, some number of chips see the light of day as good, finished units.
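That old linear cost model is simple enough to sketch in a few lines. The numbers below are hypothetical, purely for illustration (the article gives no figures), but the arithmetic is the one described above: every stage loses some dice, and the total spend is amortized over whatever survives to the end.

```python
# Hypothetical figures for illustration only -- not from the article.
WAFER_COST = 4000.0      # assumed cost to fabricate and test one wafer
DICE_PER_WAFER = 500     # assumed gross dice per wafer
WAFER_YIELD = 0.85       # fraction of dice good after wafer test
DICING_YIELD = 0.98      # fraction of good dice surviving singulation
ASSEMBLY_COST = 0.50     # assumed packaging cost per die assembled
ASSEMBLY_YIELD = 0.95    # fraction surviving assembly and final test

def finished_unit_cost():
    """Cost per good, finished unit: total spend / surviving dice."""
    good_after_fab = DICE_PER_WAFER * WAFER_YIELD      # 425 good dice
    good_after_dicing = good_after_fab * DICING_YIELD  # some lost to dicing
    good_units = good_after_dicing * ASSEMBLY_YIELD    # some lost to assembly
    # Spend: one wafer, plus assembly cost for every die that reached packaging
    # (including the ones that assembly then ruins).
    total_cost = WAFER_COST + ASSEMBLY_COST * good_after_dicing
    return total_cost / good_units

print(f"${finished_unit_cost():.2f} per good unit")
```

With these made-up numbers, 500 gross dice shrink to roughly 396 shippable units, and the per-unit cost lands a little above the naive wafer-cost-divided-by-gross-dice figure – which is exactly the point of tracking fallout at every step.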
New Synopsys Processor Makes Leaps in Performance
Pop quiz! What’s the second-most-popular CPU core in the world? First place goes to ARM, of course, but who’s the runner-up?
If you guessed MIPS, PowerPC, x86, Tensilica, 8051, or XMOS, you’re wrong. (In good company, but still wrong.) The correct answer is: ARC.
According to Synopsys, 1.3 billion ARC processors were embedded into chips last year, and that number is growing by about 300 million per year. That puts ARC second only to the mighty ARM. Must be something about the name. Maybe all those designers thought they were getting ARM but licensed ARC by accident.
Power Plays the Death Card
Moore’s Law is a maddening mistress. As our engineering community has collectively held the tail of this comet for the past forty-seven years, we’ve desperately struggled to divine its limits. Where and why will it all end? Will lithography run out of gas, bringing the exponential curve of semiconductor progress to a halt? Will packaging and I/O constraints become so tight that more transistors would make no difference? Or will economics bring the whole house of cards crashing down - putting us in a situation where there is just no profit in pushing the process envelope?
These are still questions that keep many of us employed - predicting, prognosticating, and pontificating from our virtual pedestals - trying to read the technological tea leaves and triangulate a trend line that will serve up that special insight we seek. We want to know the form of the destructor. When the exponential constants of almost fifty years make a tectonic shift and our career-long assumptions change forever, we’d appreciate some forewarning. We want to look the end of an era in the eye.
More MIPS, Less Power
Dateline: November 1, 2013 - 1100 hours. Sunnyvale, California. ARMed and Dangerous. We’re in the high-tech trenches - MCUs activated, knee deep in IP, connecting to the IoT, reading from SSD, wearing our GUIs like they are going out of style, and loving every big.LITTLE minute of it. Where o’ where could we be? ARM TechCon, of course. (Come on, like you didn’t know.) In a special ARM TechCon double header, I check out hardware and software debug with Glenn Woppman (CEO - ASSET InterTech) and open source software for ARM SoCs with Charlene Marini from ARM. Join me.