A Look at Recent Software Development Process Tool Announcements
Software development processes can vary dramatically. If you program only occasionally as a hobby, like me, then you dream up what you want to do and immediately start coding. Working units then randomly materialize and just as quickly disappear like quantum fluctuations. Moving into the more professional arena, if there is a process, it can vary from something light, agile, and extreme, where code is generated quickly and converges towards requirements using Brownian successive approximation, all the way to heavyweight rational processes with which you risk spending your entire career just reading the manual (understanding it would take even longer; lifting the entire document is out of the question).
The idea behind any process is that you can generate better software more quickly if you employ some form of discipline during development. And discipline is generally not considered fun when you’re coding cool new stuff. So the cost of being disciplined has to be balanced by a tangible benefit. Just knowing that “it’s a better way to do things” usually won’t suffice. There has to be a cost to slovenly coding.
According to Gartner’s Jim Tulley, there are around 7,000 ASIC design starts a year, a number that is in slow decline. By way of contrast, there are around 100,000 FPGA design starts a year, of which 30,000 include a microprocessor of some kind. Yet for eleven years at the IP conference in Grenoble, the leading get-together for IP suppliers and users, the only mention of FPGAs has been in the context of building blocks for ASIC prototyping tools or, more recently, as a way of testing the market before undertaking the far more costly task of building an ASIC.
Obviously, this situation has to change, and it was with an eye to making a change that we helped organise a panel session at IP08. On the platform were Actel, Xilinx and Altera, as well as Synopsys, as a third-party IP provider, and Alcatel-Lucent as a user. The conference organiser, Design and Reuse, also runs a web site that provides a search mechanism for IP, both for ASICs and for FPGAs, and Gabriele Saucier, the CEO of Design and Reuse, was on the panel, which I tried to keep in order. The title of the session was “IP vision for FPGA: Do complex FPGA designs rely on the use of vendor-created and third-party IPs?”
Synopsys Announces Base Curve Compaction for CCS Models
If you had all the time in the world, you could simulate an entire SoC using SPICE, but you don’t, so you can’t. At least not for digital circuits; analog is different, since detailed analysis is required there, and it’s not a billion transistors. And yet, even with digital, we can’t quite abstract all the way up to pure 1s and 0s, but we can start to use some abstraction in the form of library cells for basic circuit chunks like transistors, inverters, gates, and flip-flops. Those cells can be characterized using SPICE (and/or physical measurement), and, from that information, models can be built that higher-level tools can use to help determine the delay and/or power and/or noise characteristics of your circuit. But any abstraction, pretty much by definition, means you give up some accuracy; as long as that sacrifice is small, it’s a reasonable price to pay.
For the purposes of figuring out how long it takes a signal to get through a gate, one used to use a pretty high level of abstraction. Pick a voltage, slew rate, and load, and look up the delay in a table. It doesn’t completely abstract away the analog (there is a slew rate, after all), but damn near. That was the old non-linear delay model (NLDM) method. The problem is, it gets less and less accurate at the more aggressive technology nodes. So we need to move back a bit towards the analog realm, giving up some abstraction. A benefit of abstraction is doing more, more quickly, and with less data, so giving up abstraction means more data and slower runtimes. More on that later.
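To make the table-lookup idea concrete, here is a minimal sketch of how an NLDM-style lookup works: the cell’s delay is characterized at a handful of input-slew/output-load points, and anything in between is interpolated. All axis values and delay numbers below are invented for illustration; real tables live in a Liberty (.lib) file and are considerably denser.

```python
# NLDM-style delay lookup sketch: delay is tabulated against input slew
# and output load; points between grid entries are bilinearly interpolated.
# All numbers are hypothetical characterization data, not from any real cell.

slews = [0.05, 0.10, 0.20]     # input transition times (ns)
loads = [0.001, 0.004, 0.016]  # output capacitances (pF)

# delay[i][j] = cell delay (ns) at slews[i] and loads[j]
delay = [
    [0.020, 0.035, 0.070],
    [0.025, 0.040, 0.078],
    [0.034, 0.050, 0.092],
]

def _bracket(axis, x):
    """Return index i such that axis[i] <= x <= axis[i+1], clamped to the table."""
    for i in range(len(axis) - 1):
        if x <= axis[i + 1]:
            return i
    return len(axis) - 2

def nldm_delay(slew, load):
    """Bilinear interpolation in the slew/load delay table."""
    i, j = _bracket(slews, slew), _bracket(loads, load)
    t = (slew - slews[i]) / (slews[i + 1] - slews[i])
    u = (load - loads[j]) / (loads[j + 1] - loads[j])
    return ((1 - t) * (1 - u) * delay[i][j]
            + t * (1 - u) * delay[i + 1][j]
            + (1 - t) * u * delay[i][j + 1]
            + t * u * delay[i + 1][j + 1])

print(round(nldm_delay(0.10, 0.004), 3))  # exact grid point -> 0.04
```

The appeal of the scheme is obvious: two small axes and a tiny matrix per timing arc. The drawback is equally obvious at aggressive nodes: a single delay number can’t capture the full current waveform, which is exactly the gap that current-source models like CCS fill, at the cost of more data per arc.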
A few weeks ago I wrote about model-based development in Modelling: not just for big boys? At the end of the article I said “So is modelling just for big boys? Well, I hate to say it, but unless companies change their views on an appropriate level of expenditure for tools, yes it is, today. However, I have heard that one company is planning an announcement that could change this scenario quite markedly. And if the rumour is true, there may soon be a tool that will be very cost-effective for the single user.”
Well, today, this announcement is being made. Artisan Software Tools is releasing a fully featured version of their modelling tool, Artisan Studio. And Artisan Studio Uno is free. Gratis. Costs nothing, nada, bupkis, zip, zilch, zero. This is certainly cost-effective, at least in cash terms.
Generally, if a thing looks too good to be true, it usually is. Like the emails that I get every day from Nigeria and elsewhere offering me a chance of a share in several million dollars. But Artisan Studio Uno doesn’t want your bank account details; just register (name, address, and email) and download.
The Engineering Tide
We engineers are unusually comfortable with periodicity. We find ourselves fooling around with frequency domain from the first days of our undergraduate education, and by the time we become practicing professionals, we whip in and out of Fourier's follies with the facility of wild monkeys traversing the forest canopy. We eat, drink, and breathe periodic waveforms. We handle harmonics, passbands, s-planes, and corners with reckless abandon. We own the spectrum.
When it comes to our own careers, however, some of us switch to DC psychology almost immediately. We paradoxically refuse to acknowledge that technologies, markets, companies, and the economy all exhibit complex periodic behaviors that affect our jobs, our areas of expertise, and our successes and failures. If we applied our understanding of our craft to our career, we might save ourselves scads of sleepless nights, angry rants at "the man," and hopeless plunges into the abyss of romanticized obsolescence.
If you were able to record the development of a town as it grew into a city over years and decades and then speed up the film in a super-fast-mo replay, you’d notice, assuming you weren’t thrown into an epileptic seizure by the rapid day/night flashing, that things start in a small center and move out for a while. Farmlands are replaced by tract homes, forests are cut down, hills may be leveled or developed, and the town inexorably creeps outward like mold in a Petri dish.
At some point, a limit starts to impede the amoebic outward spread. The constraining factor may be geographical; perhaps the extent of a valley has finally been covered, or open space or a greenbelt was declared, halting further encroachment. It may be sociological; commute times from the outskirts to where the jobs are may have become intolerable. If nothing else, the community may decide they’ve become dull as dirt and want to inject a little urban spirit into their wan suburban style. The reasons may vary, but little by little, the outward push will give way to an upward push.
This doesn’t come without a cost; clearly it costs more to dig into the ground in order to put in a parking garage with two underground floors and three above-ground floors than it does simply to pave over a chunk o’ mud and call it parking. Making a tall building earthquake-proof and stabilizing it against winds is harder than throwing together some ugly tilt-up walls and slapping a Wal-get sign on it. But building up eventually becomes cheaper than the alternatives.