How Physical Synthesis Enables FPGA Design Productivity

by Ajay Jagtiani, Altera Corporation

As FPGAs increase in density, system designers are pushing these densities to the maximum by creating larger and more complex designs. These large designs are driven by requirements such as adding new functionality to an existing application (for example, a channel card or line card used in wireless systems), reducing board real estate by combining the functionality of two chips into a single device, or creating entirely new designs for new applications.

These varied designs could contain legacy code or a DSP-class design with tight latency requirements. For such classes of designs, synthesis tools may fall short of optimal results, leaving long critical paths. The reason for these long critical paths is that logic synthesis tools depend on estimated, rather than actual, interconnect delays to synthesize designs.
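As a rough illustration (all delay numbers below are made up), the sketch shows how a path that looks harmless under estimated wire delays can turn out to be the real critical path once placement fixes the actual delays. Physical synthesis addresses exactly this gap by re-optimizing logic with placement-aware timing.

```python
# Minimal sketch with hypothetical delays: why pre-placement estimates can
# mislead synthesis. Each path is (name, cell_delay_ns, estimated_wire_ns,
# actual_post_place_wire_ns).
paths = [
    ("ctrl_fsm",  3.2, 0.8, 0.9),  # short local wires: estimate is close
    ("dsp_accum", 2.1, 1.0, 3.6),  # long cross-chip route: estimate is far off
]

def path_delay(path, use_estimate):
    name, cell_ns, est_ns, actual_ns = path
    return cell_ns + (est_ns if use_estimate else actual_ns)

for label, use_estimate in (("estimated", True), ("actual", False)):
    worst = max(paths, key=lambda p: path_delay(p, use_estimate))
    print(f"{label} critical path: {worst[0]} "
          f"({path_delay(worst, use_estimate):.1f} ns)")

# Synthesis optimizes ctrl_fsm (4.0 ns estimated), but after placement the
# true critical path is dsp_accum (5.7 ns actual).
```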

 

GateRocket Blasts Off

FPGAs Verifying FPGAs

by Kevin Morris

The system is both elegant and enigmatic.

When visitors see the RocketDrive sitting on your lab bench (particularly if it is plugged into the handsome show-floor-worthy box currently making the rounds at trade shows), your "cool factor" will definitely creep up a notch or two. When you use it to help you knock bugs out of your next FPGA design, you'll most likely be pleased with your purchase. GateRocket's RocketDrive is a useful tool for FPGA designers.

You have to be careful, though, not to think about it too hard.

You see, if you've been doing FPGA design for a while, you probably have first-hand experience with the history of FPGA debugging and "verification" methodologies. If you've been reading here for a while, you also probably know why "verification" is in quotes. In our world, verification is a process for vetting your design before it goes over the wall to manufacturing and tooling. The very idea of verification is to make sure that everything is great and straight before crossing the one-way barrier from the domain of flexible, iterative design to the world of irreversible investment in expensive masks and physical inventory. In "measure twice, cut once," verification is the second "measure" - giving us peace of mind that all is well before we commit our concept to materials.

 

Emulate This!

Stirrings in the Hardware-accelerated Verification World

by Bryon Moyer

When each chip you design is going to cost you millions in mask charges and other associated fees, and when any mistake in such a chip can cost you millions more, it makes sense that you’re willing to fork out some cash to help reduce the chances of a flub. And when getting to market sooner means dollars in your pocket, it’s likely that getting a chance to test your software earlier will also be worth some coin.

Of course, this is the whole reason anyone pays for good chip design tools (as opposed to simple software design, where a mistake – in theory – costs nothing but a follow-up patch). And it’s why a ton of that payment is for verification. And a non-trivial part of such verification can be allocated to hardware acceleration. Such acceleration not only gets you through more testing more quickly, but it also lets you emulate the system fast enough that you can actually test some of your software before the hardware is available.
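To put rough numbers on that claim (the cycle count and throughputs below are assumed, round-number figures, not vendor benchmarks), a quick back-of-the-envelope comparison shows why software bring-up on a simulator is hopeless while an emulator makes it practical:

```python
# Back-of-the-envelope sketch; all figures are assumed for illustration.
CYCLES_TO_BOOT = 2_000_000_000  # e.g., cycles for an OS boot on the design

rates_hz = {
    "RTL software simulation": 100,        # ~100 cycles/s (assumed)
    "hardware emulation":      1_000_000,  # ~1 MHz (assumed)
}

for engine, hz in rates_hz.items():
    days = CYCLES_TO_BOOT / hz / 86_400  # seconds per day
    print(f"{engine}: {days:,.2f} days")

# RTL software simulation: ~231 days -- software testing is out of reach
# hardware emulation: ~0.02 days (about half an hour) -- entirely workable
```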

 

Scopes

Much More than Just a Wriggly Line

by Dick Selwood

If you are really up-to-date on what is happening in the world of oscilloscopes, then I am afraid that this Embedded Technology Journal Update is not for you – unless you want to go to our comments page and add your two cents’ worth of correction. But if, like me, you were vaguely aware that things are changing in the measurement field, then brace yourself.

The cathode ray tube, with its wriggly signal (OK, with its waveform), was so much the shorthand for “electronics” that The Plessey Company, for a while Britain’s leading electronics company, used a stylized screen trace as its logo. With the rise of digital systems, another form of analysis tool, the logic analyzer, was developed to look at the zeros and ones and provide, as its name suggests, some analysis of what was happening. These two sat beside each other on the bench, but the two boxes are now increasingly merging into just one.

 

A Synthesis & Partitioning Strategy for Effective Multi-FPGA Prototyping

by Nang-Ping Chen, Auspy, Inc. and Ehab Mohsen, Mentor Graphics

Prototyping an ASIC, ASSP, or SoC onto a single FPGA is not without its challenges. You have to deal with differences in ASIC and FPGA architectures, optimize for performance and area requirements, and plan a debug strategy. Unfortunately, this is only the tip of the iceberg when implementing an ASIC on a multi-FPGA platform. Currently, the largest FPGAs have a capacity of roughly 1.5M equivalent ASIC gates, so when prototyping a chip larger than this, a multi-FPGA strategy must be in place, and several more pitfalls must be accounted for.

And yet it is well worth the effort. Over the years, FPGA prototyping has proven indispensable for functional verification and early software integration. With mask costs approaching $3M for 45nm designs, prototyping with FPGAs is a small price to pay to avoid a re-spin, even if it means a minor deviation from the final ASIC environment (e.g., clocking, memories, and speed). The larger the design, the higher the development and manufacturing costs, and designs beyond a single device's capacity must be partitioned across several FPGAs if they are to be prototyped. It comes as no surprise that for multi-FPGA prototyping, a little pre-planning can go a long way.
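As a first cut at that pre-planning (device count only; the design size and utilization figures below are assumed, while the 1.5M-gate capacity comes from the article), a simple capacity estimate might look like this:

```python
import math

# Rough pre-planning sketch with hypothetical figures: how many FPGAs does
# a prototype need, before even worrying about cut nets and pin budgets?
DESIGN_GATES  = 6_000_000   # ASIC-equivalent gates in the design (assumed)
FPGA_CAPACITY = 1_500_000   # ~1.5M ASIC gates per device, per the article
UTILIZATION   = 0.7         # usable fraction after routing/timing margin (assumed)

fpgas_needed = math.ceil(DESIGN_GATES / (FPGA_CAPACITY * UTILIZATION))
print(f"Minimum devices: {fpgas_needed}")  # -> 6

# Each partition boundary also consumes I/O: every cut net needs a pin (or
# a time-multiplexed slot) on both sides, so a good partitioner must
# minimize cut nets as well as balance gate counts across devices.
```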

 

FPGAs and the IC Bubble

The Techonomics of Programmability

by Kevin Morris

Exponentials are exciting!

Anything in the real world that follows an exponential curve is a recipe for increased adrenalin production. If we're bopping along in our normal linear lives and we bump into a geometric progression, we (those of us who took math, anyway) naturally expect that we're in for a short and exciting ride. Something that happens in twos or fours today will be exploding into the 128s and 256s by the end of the week, and next month will be flaming out in the bazillions. Although these events can have huge amplitudes, their short duration typically prevents the integral from amounting to much, and their lasting effect is minimal.

What the heck was that last paragraph talking about?

Let's come back from the arena of abstract arithmetic for a bit and drop into the real world. Your e-mail box catches a less-than-funny forward from one of those "forwarding friends" (the type who sends you about twelve uninteresting e-things each day - ranging from virus alerts to chain letters to pictures of political candidates with farm animals photoshopped onto their heads). If you're early in the wave, you may see the e-joke only once this week. Next week, however, you'll get three copies - the week after, maybe sixty - and the week after that they'll fill your spam bucket as the exponential explosion of forwards gets the joke to every man, woman, and child in the world with more bandwidth than reading time. By the fourth week, the joke is gone completely, flamed out in a fiery flash of fuel deprivation. The world - largely unchanged from the event.
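For the arithmetically inclined, here is a toy sketch of that same flameout (the audience size and growth factor are invented for illustration):

```python
# Toy model of an exponential "flameout": each week the forward reaches
# many times as many inboxes, until it runs out of fresh readers.
audience = 7_000_000_000  # everyone with more bandwidth than reading time
reached, week = 2, 0
while reached < audience:
    week += 1
    reached = min(reached * 128, audience)  # invented weekly growth factor
    print(f"week {week}: {reached:,} inboxes")
# Saturates in about five weeks: huge amplitude, tiny integral.
```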

