Archive for the 'EDA' Category

Platform innovation will drive EDA

Wednesday, November 15th, 2006

An interesting take on the evolution of the EDA industry, and a look at where it is headed, came from Leon Stok, director of EDA for IBM's Systems and Technology Group, in a keynote address at the International Conference on Computer Aided Design (ICCAD).

Stok identified three previous innovation eras in the EDA industry — those of invention, implementation, and integration. The fourth era, he said, is the one we’re about to enter and is centered on design implementation “platforms”. To make platform innovation happen, Stok said, we will need to define standards as APIs, not ASCII formats. This will allow tools to talk to each other, instead of producing data that another tool can barely read, he noted.

With APIs, smaller companies with potentially innovative solutions to the ultra-deep-submicron (UDSM) technology challenges will have a more level playing field. Each can plug in its solution and let market forces decide. It will also pave the way for the bigger players, who can then focus on the overall user flow with a market-driven mix of their own in-house tools and point tools from other companies.
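To make the API-versus-ASCII point concrete, here is a minimal sketch of what an API-based standard might look like; the `TimingDatabase` interface and its methods are hypothetical illustrations, not any actual standard:

```python
from abc import ABC, abstractmethod

class TimingDatabase(ABC):
    """Hypothetical standard API that a host tool exposes to plug-ins.

    With an ASCII-format standard, a plug-in would instead have to parse
    a report file dumped by the host tool and hope the dialect matches.
    """

    @abstractmethod
    def cell_delay(self, instance: str, arc: str) -> float:
        """Return the current delay (in ns) of a timing arc on an instance."""

    @abstractmethod
    def set_cell_delay(self, instance: str, arc: str, delay: float) -> None:
        """Annotate a new delay back into the host tool's database."""

# A niche vendor's plug-in works directly against the API: no file
# formats, no parsers, no version skew between ASCII dialects.
def derate_arc(db: TimingDatabase, instance: str, arc: str,
               factor: float) -> None:
    """Apply a derating factor to one timing arc via the shared API."""
    db.set_cell_delay(instance, arc, db.cell_delay(instance, arc) * factor)
```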

Designers give CAD research gurus an earful

Wednesday, November 15th, 2006

The organizers of ICCAD (International Conference on Computer Aided Design) decided to do something different and added a designer’s perspective track this year.

“Our goal is to bridge the gap between practitioners and research,” said ICCAD general chair Soha Hassoun, an associate professor of electrical engineering and computer science at Tufts University (Medford, Mass.), in opening remarks at the conference. “We would like them [designers] to tell you [researchers] what critical issues should drive CAD research in the next few years.”

Now, I would term this going back to basics. Users tell researchers their requirements, which in turn steers the research in the right direction in terms of the practical benefit and actual usage of their efforts – an optimal lab-to-fab transition.

Another tenet was emphasized by STMicroelectronics' Pascal Urard: "We need academia and the EDA community to think at the flow level, not only at the tool level".

It is apt to remind the EDA community that it should enable the end user with his final flow, not only with point tools. True, tools provide the differentiating edge amongst EDA vendors, and users should have the flexibility to pick the ones which best suit their flow.

However, to justify the “Automation” in “EDA”, it pays to facilitate the flow.

Why can’t we do it in EDA?

Wednesday, July 26th, 2006

This was the question posed by Joe Costello, chairman of Orb Networks and former CEO of Cadence, in his DAC 2006 keynote speech.

The “it” referred to here is the mix and match of new plug-ins (internal and external), bundling things on top of others’ offerings and selling directly to customers.

With the increasing complexity of technological challenges, compounded by rising market pressures, it does indeed benefit both the big EDA companies and the small start-ups with niche solutions to collaborate. However, opening up the tools and making them pluggable is not without its major share of teething issues. While standards take a long time to be formulated and then adopted, they'll still be required to an extent for "universal plug-ins".

One scenario is where the EDA companies provide the basic engines for the standard design activities. To these, smaller niche companies add their plug-ins for value addition and for tackling issues in leading-edge designs. With a uniform standard, these companies can take their plug-ins to various EDA companies. In its absence, however, each EDA company will need to work closely with these smaller companies and sell the complete "bundle with options" to the user.
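As a minimal sketch of this scenario (all names are invented for illustration; real engines expose far richer data models than this), a pluggable base engine might look like:

```python
from typing import Callable, Dict, List

class Design:
    """Stand-in for the big vendor's design database."""
    def __init__(self, name: str):
        self.name = name

# A check takes the design and returns a list of violation messages.
CheckFn = Callable[[Design], List[str]]

class BaseEngine:
    """The big EDA vendor's standard engine, opened up for plug-ins."""
    def __init__(self) -> None:
        self._checks: Dict[str, CheckFn] = {}

    def register_check(self, vendor: str, check: CheckFn) -> None:
        # With a uniform plug-in standard, any niche vendor could
        # register here -- and with any other big vendor's engine.
        self._checks[vendor] = check

    def run(self, design: Design) -> Dict[str, List[str]]:
        return {v: check(design) for v, check in self._checks.items()}

# A niche vendor's leading-edge check, sold as a plug-in (stubbed out).
def litho_hotspot_check(design: Design) -> List[str]:
    return [f"{design.name}: 0 litho hotspots found (stub)"]

engine = BaseEngine()
engine.register_check("NicheCo", litho_hotspot_check)
print(engine.run(Design("chip_top")))
```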

A point to note here is that it would be naïve to assume that the present basic engines are implemented in a modular fashion where a plug-in can be used quasi-seamlessly. Then comes the question: if the leading-edge issues are addressed in a modular fashion by smaller companies who are free to sell their wares to all the big EDA companies, what is in it for any one big EDA company? What will be its competitive edge? On the flip side, if the major EDA companies persist in attempting to do everything on their own, given the complexities and constraints, there will not be much growth for the EDA industry.

Interestingly, there are signs of the industry moving in this direction. In DFM, TSMC sharing its process information with the EDA companies, who integrate it into their design flows, is one example. This process data can itself be treated as a "plug-in".

Let's take an example here: Synopsys recently came out with three tools in the DFM space – LCC (lithography compliance checking), CMP (chemical-mechanical polish) checking, and CAA (critical-area analysis). As per the press note, LCC inspects GDS-II files using a rapid-computation model of the lithography process, calibrated with foundry data. This scan predicts the actual shapes the mask features will produce across the focus window of the lithography step. It then examines these features against a rule set to detect pinch-off, end-shortening, bridging, and other faults that could occur with reasonable probability.

The normal output of the tool is a color-coded die map: green for areas that pass, yellow for areas of concern, and red for trouble spots. Design teams that are knowingly pushing the litho rules can look beneath this graphic presentation at a numerical database that gives them actual predictions of critical dimensions.

Designers can then invoke an auto-correction tool, which is based on extensive, process-dependent heuristics, to deal automatically with the majority of the problems—adding space between lines, moving edges or corners, and other such reasonable measures.
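A rough sketch of that checking flow (my own toy code, not Synopsys' implementation; the shape model, thresholds, and names are all invented for illustration):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Feature:
    name: str
    drawn_width_nm: float

def predicted_width(feature: Feature, defocus_nm: float) -> float:
    # Stand-in for the calibrated lithography-process model; a real tool
    # simulates the printed contour across the focus window.
    return feature.drawn_width_nm - 0.1 * abs(defocus_nm)

def classify(feature: Feature, focus_window_nm=(-50.0, 50.0),
             fail_nm: float = 60.0, warn_nm: float = 70.0) -> str:
    # Worst-case printed width anywhere in the focus window.
    worst = min(predicted_width(feature, d) for d in focus_window_nm)
    if worst < fail_nm:
        return "red"      # likely pinch-off / open circuit
    if worst < warn_nm:
        return "yellow"   # marginal; check the numerical predictions
    return "green"        # passes the rule set

layout: List[Feature] = [Feature("m2_route_17", 80.0),
                         Feature("m3_jog_04", 64.0)]
die_map = {f.name: classify(f) for f in layout}
print(die_map)  # {'m2_route_17': 'green', 'm3_jog_04': 'red'}
```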

Now reconsider the situation with a small EDA company working on the basic LCC part. It takes as input the lithography process model provided by the big EDA vendor (I don't think the big foundries would be comfortable working closely with smaller companies on their process data!). The GDS-II also comes from the big EDA vendor's tool(s). Finally, the auto-correction tool can be provided by either.

I cite this example because while these three new tools do attempt to handle the first-order problems, they do not even begin to cover all the important sources of variation in 90-nm and finer geometries. TSMC cites more than 2,000 independent sources of potential trouble.

I see a hybrid approach in the near future…

DFM again

Tuesday, July 4th, 2006

TSMC recently unveiled its 65nm DFM-compliance design-support ecosystem with its DFM data kit, compiled in the DFM Unified Format (DUF). DUF was developed by TSMC to align DFM tools, and the kit should help put fabless designers on an equal footing with the IDMs. The format, though, models only random and systematic defects, with parametric defects planned for a future release.

Now yet another tool has hit the much-in-the-news DFM space.

Blaze DFM Inc. recently rolled out its solution, Blaze MO. It is marketed as improving parametric yield through better control over leakage, timing, and variability.
It has an electrical focus, in contrast with other DFM tools, which have a geometric focus (wire spreading, lithography simulation, and critical-area analysis).

The heat is on…

Thermal Analysis

Tuesday, June 27th, 2006

Thermal analysis is gaining momentum. While such analysis tools have been around for a while, especially for analog and mixed-signal devices, they have lately gained prominence in sub-90nm digital designs too.

Thermal analysis tools track thermal gradients across the die. Uneven shifting of the threshold voltage, timing violations (clock timing is especially sensitive to delay variations caused by thermal gradients), leakage, electromigration, and reliability degradation are some of the resulting problems.
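A toy calculation (the coefficients are invented for illustration, not taken from any real process) shows why a gradient of a few tens of degrees across the die matters:

```python
# Toy numbers illustrating why on-die thermal gradients matter; the
# coefficients below are invented, not from any real process.
def leakage_na(temp_c: float, i0_na: float = 10.0, t0_c: float = 25.0,
               doubling_c: float = 15.0) -> float:
    """Subthreshold leakage grows roughly exponentially with temperature;
    here it doubles every `doubling_c` degrees."""
    return i0_na * 2.0 ** ((temp_c - t0_c) / doubling_c)

def gate_delay_ps(temp_c: float, d0_ps: float = 100.0,
                  pct_per_c: float = 0.2) -> float:
    """Delay drifts roughly linearly with temperature (0.2%/C assumed)."""
    return d0_ps * (1.0 + pct_per_c / 100.0 * (temp_c - 25.0))

# A 40C gradient between a cool corner and a hot spot on the same die:
for t in (65.0, 105.0):
    print(f"{t:5.1f}C  leakage {leakage_na(t):6.1f} nA  "
          f"delay {gate_delay_ps(t):5.1f} ps")
# The hot region leaks ~6x more and its gates are ~7% slower -- a
# gradient-induced skew that clock trees are especially sensitive to.
```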

While some vendors are coming out with standalone thermal analysis tools (e.g. Gradient), others like Magma have thermal analysis built into their power analysis tool, believing that since power and temperature are interlinked, a user should not be shuttling between two separate analysis tools. As the package plays a vital role in thermal behavior, some are bringing package considerations into the product as well.

Along with these tools, it's the thermal management chips that are riding the wave. According to Databeans, thermal management ICs could reach just under $2B in five years. The main growth segments cited are the ones using FPGAs and ASICs.

It’s time for a change

Tuesday, June 6th, 2006

Yes, indeed, the methodology should be defined by the people who know it best, i.e. design engineers. EDA companies should step in to facilitate methodologies, not formulate them. We should not have situations where the design engineer needs to grapple first with the design and process complexities and then with trying to fit the design tool into his design methodology. With the increasing complexities of sub-micron designs, there is a need for more and more collaboration; the tasks are too mammoth and interlinked for any single entity to manage alone.

EDA vendors should offer solutions, not just point tools, that realize the methodology flow within a common, open-source verification environment. Indeed, an open-source community which leverages the combined industry expertise is what is needed. However, such open-source platforms take time to be adopted, as players approach them warily, keeping their IP and niche expertise in mind.

Bring on 2006

Monday, December 5th, 2005

Mentor Graphics CEO Walden Rhines' interview in Electronic Times (posted on 2/12/05) brought out two interesting points about the EDA industry:

The first is on EDA industry growth, which Rhines attributes mainly to developing new solutions to new problems, developing new methodologies, and applying technology to different applications. With very small growth in the number of designers, and with tools and methodology already in place, design companies do not tend to spend much on purchasing new licenses/seats.

The second interesting point is about start-ups. They are usually started by professionals from the major EDA or design companies when they see gaps in the design flow which they feel they can plug much better than the existing tools. With the market moving more and more towards point tools, and now towards an open platform, they focus on a niche issue. While they contribute a little over 20% of the market revenue, they represent a major chunk of the EDA methodology mindshare. And except for the few that have a solid business plan in addition to a strong technology base, most get acquired by the major EDA companies – and spur their growth.

More on DFM….

Tuesday, November 22nd, 2005

I read the paper "Yield challenges require new DFM approach" by P.T. Patel in EE Times. It's very well written and informative.

It is yet another pointer towards making the existing design tools (the article's focus was on routing) better at making your design manufacturable.

Can someone explain DFM?

Monday, November 21st, 2005

Quite an interesting article in Electronic Engineering Times by Richard Goering.

Getting to basics…

In the commercial space, a designer designs a chip with the objective that it should not only function as per specs but also be manufacturable in a commercially viable manner. This is implicit. Otherwise, shouldn't we have heard of tools like Design for Silicon Success (DFSS) or DFFTSS?

Yes, we do have flows which aim for first-time silicon success, but not point tools. The point tools facilitate various design phases like simulation, synthesis, routing, etc., but it's the design flow which optimizes their usage to achieve objectives like intended functionality and high yield. In fact, all the existing design tools should have this "DFM" embedded in them by default.

Designers need not become manufacturing experts, and the tools should be good enough to handle yield issues in a transparent and automatic manner. But with the mandatory breaking of walls between design and manufacturing in the DSM zone, it does help for the designer to be aware of potential manufacturing issues and take them into account while designing, in order to avoid corrections at later stages.

Design trends & EDA tools: China & Taiwan

Wednesday, October 19th, 2005

I recently read the latest report from EE Times and Gartner Dataquest on Design Trends & EDA Tools: China & Taiwan. It can be accessed from their website, http://www.eetasia.com.

The report makes very interesting reading and made me ponder a few points…

ASIC Design segment

A. Application segment
While consumer applications remained the major application segment in Taiwan (as in '04), they displaced telecom/datacom in China to become the major segment there too. Some possible underlying reasons (apart from generic market conditions) could be:
· Telecom/datacom designs have traditionally used leading-edge process geometries. The rising mask costs associated with them could have been a factor in the decrease in new ASIC designs.
· The varied and vast set of categories within the consumer applications market encourages more ASIC designs and spin-offs.
· More consumer ASICs are coming out with the rapid growth of the Chinese consumer market.

B. Gate Count
The general increase in gate count follows the rising complexity, which is also aided by the integration of various functions/categories into a single product.
The majority of respondents working on large designs are local subsidiaries of foreign companies or local ventures – not joint ventures with foreign companies. This is mostly due to the high costs involved in large and complex designs. It makes sense for joint ventures with foreign companies to focus on local marketing and enhance their foreign partner's footprint in the local market.

C. Process Geometry
The China figures indicate a more rapid embrace of newer technologies.
In Taiwan, 0.18um, apart from remaining the predominant node for the past three years, has grown from 35% last year to 44%, while 0.13um has increased from 13% to 23%.
0.18um is still the dominant node in China, though its share has slightly decreased, from 49% to 45%. It's in 0.13um that we see the real increase – 12% to 31%.

D. IP core usage rate
The EDA companies' share has decreased, which has been correctly attributed to their partnering with foundries. With the increasing complexities and the focus moving towards design for yield, it is natural for foundries to play a major role in this partnership with EDA vendors and third-party IP suppliers. The growing complexity of selecting the right IP, and the subsequent issues seen while integrating it into the design, compel companies to resort to developing IP in-house. The marked surge in independent third-party suppliers is also due to the fact that they specialize in their niche IPs, and these IPs are their main products.

EDA tools usage
Increased reliability and reduced costs are the paramount factors for electronic designers in Taiwan (increased functionality is no longer the most important goal), while increased functionality and reduced cost are the most important goals for designers in China.

This possibly indicates a more mature market (in terms of EDA tool usage) in Taiwan, where designers seem more conversant and satisfied with the functionality offered by the EDA tools and hence are focusing on reliability, i.e. fewer issues while going through the design flow, shortening their time to market.
Cost reduction remains a common goal; not surprising where consumer applications are the predominant market.

Open-Silicon automates the flow

Monday, September 19th, 2005

Referring to "When infrastructure is essence: Open-Silicon automates the flow", an article posted in Electronic Engineering Times by Ron Wilson on Sep 16, 2005.

ASIC implementation is a complex procedure.
Automating it is more complex.
And adhering to the automated flow & achieving the intended results is an art in itself.

Every ASIC design team would have ventured into automating the complete process at least once in its lifetime. From my experience, it's not the complexities involved (in the methodology or the automation), nor the lack of resources, but good old discipline (or the lack of it) that keeps one from achieving the benefits of this automation. The biggest spoke in the wheel of such automation is the varied set of designs, each with its unique baggage of complexities and requirements. Deviations are bound to occur if each design needs to be optimized. So it's heartening to note that Open-Silicon's automated flow intends to accommodate such creative detours.

It takes time (and restraint) to include all the details: version control, detailed comments, personal tweaking, coding practices, etc. I recollect the time when I had to put all library releases by my team on hold in order to introduce version control; it was not one of my favorite periods! But yes, the subsequent gains more than made up for it.

Having developed, implemented, and managed a gamut of automations across various ASIC implementations spanning various geographies, I adhere to the age-old wisdom: a tool is only as good as its implementation.

Leaky chips test designers’ skills

Thursday, September 1st, 2005

Referring to "Leaky chips test designers' skills" by Mike Clendinin on www.eetasia.com.

Yes, one can no longer rely on deploying power optimization techniques in the later part of the design. For that matter, it's not sufficient to restrict them to any one design phase. Power needs to be strategized and implemented right from the algorithmic level, through the architecture level, and down to the placement-and-route phases. The higher the level, the more power savings one gets.

And there's a constant balancing act between the various design constraints, i.e. power, area, and timing… at least as of now.

Are ESL and DFM false hopes?

Monday, August 22nd, 2005

Referring to "Are ESL and DFM false hopes?" by Richard Goering on www.eetasia.com:

ESL and DFM are the two buzzwords in the DSM design space. With the spiraling costs involved, such techniques are entering the mandatory zone. If we say that ESL is too domain- and application-specific, it's just following another important trend in the market – that of structured/platform ASICs. You get master slices for various applications, and these are further customized as per actual requirements. Is the industry, having moved from a "single EDA vendor toolset for a complete integrated design flow" to a "unified design flow integrated with various point tools from multiple EDA vendors", headed towards one with "point tools with a user-defined interface for point customizations"?

DFM surely requires a strong, close link between the designer and the foundry, but are the foundries ready for this? The skyrocketing costs of setting up new fabs for DSM processes led the foundries to partner with each other. Will the lure of acceptable yield and revenues now produce a similar result between IDMs and foundries?