Custom IC Design (https://community.cadence.com/cadence_blogs_8/b/cic)

The Art of Analog Design: Part 3, Monte Carlo Sampling
Arthur Schaldenbrand | Sat, 23 Sep 2017 05:25:18 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/22/the-art-of-analog-design-part-3-monte-carlo-sampling
<p>In Part 2, we looked at Monte Carlo sampling methods. In Part 3, we will consider what happens once Monte Carlo analysis is complete. Of course, we will need to analyze the results, so let’s look at some of the tools for visualizing what the Monte Carlo analysis is trying to show us about the circuit.</p>
<p>First, let’s review the results from the previous blog. The circuit being simulated is a Capacitor D/A Converter, or CAPDAC. The CAPDAC is used in a Successive Approximation ADC to generate the reference levels for comparison. The mismatch of the unit capacitors in the CAPDAC degrades the CAPDAC SINAD (Signal-to-Noise and Distortion ratio) and is an important contributor in determining the overall SINAD of the ADC. This CAPDAC is used in a 10-bit ADC. Based on the error budget for the ADC, if the CAPDAC has a SINAD of 60dB or better, we will be able to meet our ADC SINAD target. The CAPDAC SINAD was simulated using Monte Carlo with auto-stop, a yield target of 60dB for SINAD, a yield of 3σ or greater, a confidence level of 90%, and the Low Discrepancy Sampling (LDS) method. The simulation required 1755 samples to meet the 90% confidence level requirement.</p>
<p>In the last blog post, we plotted the effect of process variation on the SINAD distribution (see Figure 1). To help show how the CAPDAC performance compares to the specification, the pass/fail limits have been overlaid on top of the distribution: green is pass and red is fail.</p>
<p><a href="https://community.cadence.com/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/2818.blog3a_5F00_figure1_5F00_MonteCarloDistribution.png"><img src="https://community.cadence.com/resized-image/__size/1200x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/2818.blog3a_5F00_figure1_5F00_MonteCarloDistribution.png" style="height:auto;display:block;margin-left:auto;margin-right:auto;" alt=" " /></a></p>
<p align="center">Figure 1: CAPDAC SINAD distribution</p>
<p>The plot also has bars showing the mean value and the standard deviation marks from -3σ to +3σ, allowing us to visualize how much margin the CAPDAC has relative to the specification. For the CAPDAC, there is almost 2σ of margin between the specification and the -3σ limit of the distribution.</p>
<p>One observation from looking at the distribution is that it appears to have a long tail. In statistics, a long tail means that the distribution has a large number of occurrences far from its central part. On the positive side of the distribution, there is only one point that is more than +2σ from the mean, while on the negative side there are many data points less than -3σ from the mean. Next, let’s apply another tool, quantile-quantile plotting, to test whether our simulated distribution is a Normal (or Gaussian) distribution. A quantile-quantile plot is a technique for evaluating whether two distributions are the same by plotting their quantiles against each other, where the quantiles are points taken at regular intervals from the cumulative distribution function (CDF) of a random variable. For example, the 0.5-quantile of a distribution is the median: half the samples in the distribution are higher in value than the median and half are lower. Since the distribution is skewed, the mean value will not be equal to the median value.</p>
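The quantile-quantile construction described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the tool's implementation; the helper name and the sample values are invented for the example.

```python
import numpy as np
from scipy import stats

def qq_points(samples, n_quantiles=99):
    """Quantile pairs for a normal Q-Q plot: quantile levels are taken
    at regular intervals of the CDF, as described in the text."""
    probs = np.linspace(0.01, 0.99, n_quantiles)
    observed = np.quantile(samples, probs)
    # Reference quantiles of a Normal with the sample's own mean/std.
    theoretical = stats.norm.ppf(probs, loc=np.mean(samples),
                                 scale=np.std(samples))
    return theoretical, observed

# For a truly Gaussian sample, the (theoretical, observed) points fall on
# a straight line; a long-tailed distribution bends away at the extremes.
rng = np.random.default_rng(0)
theo, obs = qq_points(rng.normal(61.15, 0.25, 2000))
```

Plotting `obs` against `theo` gives the same picture as Figure 2: deviation from the diagonal line flags a non-Normal distribution.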
<p><a href="https://community.cadence.com/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/3122.blog3a_5F00_figure2_5F00_QuantilePlot.png"><img src="https://community.cadence.com/resized-image/__size/1200x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/3122.blog3a_5F00_figure2_5F00_QuantilePlot.png" style="height:auto;display:block;margin-left:auto;margin-right:auto;" alt=" " /></a></p>
<p align="center">Figure 2: Quantile-quantile plot for CAPDAC SINAD</p>
<p>If the simulated distribution plots as a straight line against the reference distribution (the Normal distribution), then the distributions match and the simulated distribution is Gaussian. As expected, the simulated distribution is not a straight line when plotted against the Normal distribution (see Figure 2). The distribution is only Normal in the region from -1σ to +1σ. Another way to look at the effect of the long tail is to consider how the CAPDAC yield compares to the expected yield of a Normal distribution. For the CAPDAC, there is 1 failure in 1755 samples. The worst-case value of CAPDAC SINAD is 59.85dB, -5.2σ from the mean value. For a Normal distribution, the expected failure probability for a 5σ deviation from the mean is about 1 failure per 3.5 million samples. The effect of the long tail, that is, the non-Normal nature of the distribution, is a significant reduction in yield compared to what a Normal distribution would give. Quantile-quantile plots are therefore a powerful tool for visualizing whether the simulated distribution is Normal or not.</p>
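The 1-per-3.5-million figure quoted above is simply the one-sided tail probability of the Normal distribution, which is easy to check (a minimal sketch; the function name is ours):

```python
import math

def normal_tail_probability(n_sigma):
    """One-sided Normal tail: P(X < mean - n_sigma * std)."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

# A 5-sigma deviation corresponds to roughly 1 failure per 3.5 million
# samples, so the observed 1-in-1755 failure rate is orders of magnitude
# worse than a Normal distribution would predict.
p_fail_5sigma = normal_tail_probability(5.0)
```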
<p>Next, let’s look at another measurement that is useful for designers. First, let’s determine the process capability index or Cpk value. The Cpk is a statistical measure of process capability which is the ability of a process to produce output within specification limits. For the CAPDAC, the Cpk is one of the outputs in the Virtuoso ADE Assembler results window (see Figure 3). The Cpk can only be output if a specification has been defined.</p>
<p>The Cpk is defined as the ratio of the distance from the mean value to the specification limit, measured in standard deviations, to the target yield requirement, also measured in standard deviations. For the CAPDAC, the numerator is 4.6σ, the distance from the mean value of 61.15dB to the 60dB specification (see "sigma to target" in the results window). The target yield was 3σ, so the denominator is 3σ, giving a Cpk of about 1.53.</p>
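The ratio described above is a one-line calculation. The sketch below uses the CAPDAC numbers from the text; the implied standard deviation (0.25dB, from 1.15dB being 4.6σ) and the function name are our assumptions.

```python
def cpk_lower(mean, std, lower_spec, target_sigma=3.0):
    """Cpk against a lower spec limit, per the definition in the text:
    (sigma-to-target) / (target yield in sigmas). With target_sigma = 3
    this matches the textbook one-sided Cpk, (mean - LSL) / (3 * std)."""
    sigma_to_target = (mean - lower_spec) / std
    return sigma_to_target / target_sigma

# CAPDAC example: mean 61.15 dB, spec 60 dB, 4.6 sigma to target
# implies std = 1.15 / 4.6 = 0.25 dB, so Cpk = 4.6 / 3, about 1.53.
cpk = cpk_lower(61.15, 0.25, 60.0)
```

A Cpk above 1 means the distribution limit sits inside the spec with margin; below 1 means the target yield cannot be met.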
<p>A less precise way to think about Cpk is as a measure of design margin. It tells us how much margin we have between the actual limit of the process and the user’s expectation for the process.</p>
<p>To summarize, we have looked at two tools for visualizing the results of Monte Carlo analysis and for using the results to identify problems. Plotting distributions allows us to understand how well centered a design is. Quantile-quantile plots let us examine a distribution and identify whether it has a long tail, since a long tail can translate into poor yield. And by using Cpk, we can quantify how much design margin we have. In the next blog post, we will start to look at what we can do to identify and correct issues.</p>

Virtuosity: Sweeping Multiple DSPF Views in ADE
Arja Hunkin | Fri, 22 Sep 2017 13:47:37 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/22/virtuosity-sweeping-multiple-dspf-files-in-ade
Wouldn't it be great if you could have a view for your DSPF files and sweep them in an ADE session without having to add them as simulation files? Well now you can!
You can create a DSPF view just like any other view (schematic, layout, extracted), and it can be easily included in any ADE simulation. You can also combine this with the config sweep feature to sweep several DSPF views at once. Just note that the top-level test bench must be a config. Let's see how to do this... (<a href="https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/22/virtuosity-sweeping-multiple-dspf-files-in-ade">read more</a>)

Virtuosity: Sweeping Multiple Config Views
Arja Hunkin | Mon, 18 Sep 2017 13:41:36 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/18/sweeping-multiple-config-views
Before IC6.1.7 ISR10, you could sweep multiple views in ADE for only one block in your design. What if you have more than one block with multiple views that you want to sweep? From ISR10 onwards, you can do that.
Here's how. (<a href="https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/18/sweeping-multiple-config-views">read more</a>)

Virtuosity: What Color is Your Virtuoso Wearing Today?
Rishu Misri Jaggi | Fri, 15 Sep 2017 14:11:26 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/15/virtuosity-what-color-is-your-virtuoso-wearing-today
Like you, Virtuoso can dress in a different color every day. Interested to know how?
Read on to find out... (<a href="https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/15/virtuosity-what-color-is-your-virtuoso-wearing-today">read more</a>)

Virtuosity: Driving Along a Longer Route May Take You Home Sooner!
Rishu Misri Jaggi | Tue, 12 Sep 2017 13:16:13 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/12/virtuosity-driving-a-longer-route-may-help-reach-home-sooner
On my way back home every day, I need to make a decision: should I drive less, or more? There are two different routes that I can take home. The shorter route is usually busier at peak traffic times. The other route is long.
When I reach the crossroad, I am almost always swayed into taking the shorter, seemingly straight path. On the days I give in to that temptation, I usually reach home late.
It can be the same when using software: what may seem to be a harmless shortcut could cost you a lot of troubleshooting time. Here's how a customer recently experienced this when copying a library in Virtuoso. (<a href="https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/12/virtuosity-driving-a-longer-route-may-help-reach-home-sooner">read more</a>)

Virtuosity: Saving, Loading and Sharing ADE Annotation Settings
Arja Hunkin | Thu, 07 Sep 2017 09:17:51 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/07/virtuosity-sharing-and-automatically-loading-ade-annotation-settings
The whole ADE annotation flow was overhauled way back in IC6.1.6, but at that time there was no way to share the annotation settings between designs, or to automatically load them. Well, in IC6.1.7 ISR13 we have added the ability to do both!
(<a href="https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/07/virtuosity-sharing-and-automatically-loading-ade-annotation-settings">read more</a>)

Virtuoso Video Diary: What Are Parametric Sets?
Ashu V | Mon, 04 Sep 2017 11:38:52 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/04/virtuoso-video-diary-what-are-parametric-sets
Over the past few IC6.1.7 and ICADV12.3 ISR releases, a lot of new and useful features have been added to Virtuoso ADE Explorer and Virtuoso ADE Assembler. An interesting one that recently caught my attention amidst this forever-increasing feature list is Parametric Sets in Design Variables.
This feature could be a savior if you’re working with a gigantic list of design variables or parameters with sweeps, but don’t want to run all the possible sweep combinations for them. Parametric sets save time and also give you the flexibility to run a specific set of variables. To put it more simply: when you create a parametric set by combining two or more variables, only a selected set of sweep combinations is created, by picking values from the same ordinal position for all the variables or parameters in the parametric set. This reduces the number of design points, thereby reducing the number of simulations. (<a href="https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/09/04/virtuoso-video-diary-what-are-parametric-sets">read more</a>)

Photonics Summit and Workshop 2017
Meera Collier | Wed, 16 Aug 2017 22:45:03 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/08/16/photonics-summit-and-workshop-2017
<h3><span style="font-size:75%;">Interested in learning about system-level integration of electronic/photonic devices?</span></h3>
<p><img src="https://community.cadence.com/resized-image/__size/300x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/bigstock_2D00_Silicon_2D00_wafer_2D00_with_2D00_processor_2D00_c_2D00_21265985_5F00_jpg.jpg" alt=" Silicon Wafer" width="400" title="Silicon Wafer" style="float:right;margin:6px;" />The use of silicon photonics allows semiconductor designers to leverage the billions of dollars invested in existing manufacturing facilities, integrating electronics and optics on the same die or in the same package. Breakfast Bytes blogger Paul McLellan has written quite a few excellent blogs about silicon photonics. In one <a href="https://community.cadence.com/cadence_blogs_8/b/breakfast-bytes/archive/2016/12/16/silicon-photonics" title="Silicon Photonics" target="_blank">blog post</a>, he says, “The basics of silicon photonics are something that <i>every</i> semiconductor designer should have at least a little knowledge about.” Check out this post, it's a fantastic primer!</p>
<p>When integrating hybrid devices in which the electronic and photonic components are combined into a single IC, you need tools that integrate both design methodologies—electronic and photonic—as well as taking packaging and systems designs into account. Cadence has been involved with and has provided solutions for silicon photonic design and packaging for quite some time now, and in 2015, partnered with Lumerical Solutions and PhoeniX Software to create an integrated electronic/photonic design automation (EPDA) environment.</p>
<p>Learn about these solutions while networking with other expert users at the second annual <b>Photonics Summit and Workshop</b> on September 6 and 7, at the Cadence headquarters in San Jose.</p>
<p>On day one, industry experts will present on photonic IC design and packaging. Speakers include representatives from Hewlett Packard Labs, AIM Photonics, IBM, Chiral Photonics, Inc., and more.</p>
<p>Day two will consist of a hands-on workshop where you will learn system-level electronic and photonic design first hand. During the workshop, you will…</p>
<ul>
<li>Add new elements to an existing photonics PDK</li>
<li>Assemble a PIC and its CMOS driving logic, fiber connector (from our partner Chiral Photonics), and a laser as a complete system</li>
<li>Play with Tektronix testing equipment</li>
</ul>
<p>To see the full agenda and to register for the free event, go to <a href="http://www.cadence.com/go/silicon-photonics" title="More Information!" target="_blank">www.cadence.com/go/silicon-photonics</a>.</p>

The Art of Analog Design
Part 2: Monte Carlo Sampling
Arthur Schaldenbrand | Sat, 12 Aug 2017 15:08:58 GMT
https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/08/12/the-art-of-analog-design-part-2-monte-carlo-sampling
<p>Historically, one of the great challenges that analog and mixed-signal designers face has been accounting for the effect of process variation on their designs. Minimizing the effect of process variation is an important consideration because it directly impacts the cost of a design. From Pelgrom’s Law (1), it is understood that device mismatch due to process variation decreases as the square root of increasing device area (see note 1). For example, reducing the standard deviation (sigma) of the offset voltage from 6mV to 3mV means that the transistors need to be four times larger.</p>
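The area penalty implied by Pelgrom's Law follows directly from the square-root relationship. A minimal sketch (the function name is ours, not from the paper):

```python
def area_scale_factor(sigma_initial, sigma_target):
    """Pelgrom's Law sketch: mismatch sigma scales as 1/sqrt(area),
    so the required area grows as (sigma_initial / sigma_target)**2."""
    return (sigma_initial / sigma_target) ** 2

# Halving the offset-voltage sigma from 6 mV to 3 mV costs 4x the area,
# as stated in the text.
factor = area_scale_factor(6e-3, 3e-3)
```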
<p>By increasing transistor size, the die cost is also increased since die cost is proportional to die (and transistor) area. In addition to increasing cost, increasing device area may degrade performance due to the increased device parasitic capacitances and resistances of larger devices. Or the power dissipation may need to increase to maintain the performance due to the larger parasitic capacitances of the larger devices. In order to optimize a product for an application, that is, for it to meet the target cost with sufficient performance, analog and mixed-signal designers need tools to help them analyze the effect of process variation on their design.</p>
<p>Another way to look at the issue is to remember that analog circuits haven’t scaled down as quickly as digital circuits; that is, maintaining the same level of performance has historically required roughly the same die area from process generation to process generation. So, while the density of digital circuitry doubles every eighteen months, analog circuits don’t scale at the same rate. If an ADC requires 20% of the die area at 180nm, then after two process generations, at the 90nm process node, the die areas of the ADC and the digital logic are equivalent. After two more process generations, at 45nm, the ADC requires 4x the area of the digital blocks (see note 2). This example is exaggerated; however, the basic concept, that process variation is an important design consideration for analog design, is valid.</p>
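The arithmetic behind that example can be checked with a toy model (our own sketch, using the text's assumption that digital area halves each generation while the analog block stays fixed):

```python
def analog_to_digital_ratio(generations, analog_area=20.0, digital_area=80.0):
    """Toy scaling model: digital area halves every process generation
    (density doubles), while the analog block's area stays constant.
    Areas are in percent of the original 180nm die."""
    return analog_area / (digital_area / 2 ** generations)

# 180nm -> 90nm is two generations: the ADC equals the digital area.
ratio_90nm = analog_to_digital_ratio(2)
# Two more generations to 45nm: the ADC needs 4x the digital area.
ratio_45nm = analog_to_digital_ratio(4)
```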
<p>Traditionally, the main focus of block-level design has been on parasitic closure, that is, verifying that the circuit meets specification after layout is complete and parasitic devices from the layout have been accounted for in simulation. This focus on parasitic closure meant that there was only limited support for analyzing the effect of process variation on a design. During the design phase, sensitivity analysis allowed a designer to quantitatively analyze the effect of process parameters on performance. During verification, designers have used corner analysis or Monte Carlo analysis to verify performance across the expected device variation, environmental, and operating conditions. In the past, these analysis tools were sufficient because an experienced designer already understood their circuit architecture, its capabilities, and its limitations, so performance specifications could be achieved by overdesigning the circuit. However, ever-decreasing feature sizes have increased the effect of process variation, and market requirements mean designers have less margin to use for guard-banding their designs. The decreasing feature sizes also mean that power supply voltages are scaling down, and in some cases circuit architectures need to change. An example of how power supply voltage affects circuit architecture is ADC design, where there has been a movement from pipeline ADC designs at legacy nodes (180nm) to successive approximation ADCs (SAR ADCs) for advanced-node (45nm) designs. This change has occurred because a SAR ADC can operate at lower power supply voltages than a pipeline ADC. As a result of the changing requirements placed on designers, there is a need for better support for design analysis than ever before.</p>
<p>Let’s look at an example of statistical analysis often performed by analog designers. Shown below is the Signal-to-Noise and Distortion Ratio (SNDR or SINAD) of a Capacitor D/A Converter, or CAPDAC. A CAPDAC is used in a successive approximation ADC to generate the reference voltage levels against which the input voltage is compared in order to determine the digital output code. The SINAD of the CAPDAC determines the overall ADC accuracy.</p>
<p><a href="https://community.cadence.com/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/7612.blog2_5F00_figure1.png"><img src="https://community.cadence.com/resized-image/__size/1880x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/7612.blog2_5F00_figure1.png" style="height:auto;display:block;margin-left:auto;margin-right:auto;" alt=" " /></a></p>
<p style="text-align:center;"><span style="font-size:150%;"><strong>Figure 1: Example of Monte Carlo Analysis Results for Capacitor D/A Converter Signal-to-Noise Ratio</strong></span></p>
<p>On the left is the distribution of the capacitance variation, and on the right is the CAPDAC Signal-to-Noise Ratio (SNR) distribution. From the SNR distribution, the mean and standard deviation of the CAPDAC SNR can be calculated. If the specification requires the SNR to be greater than 60dB, does this result mean that the yield will be 100%? Another question to consider is whether or not the SNR distribution is Gaussian, since the analysis of the results is impacted by the type of distribution. Or we might want to quantify the process capability, C<sub>pk</sub>. C<sub>pk</sub> is a parameter used in statistical quality control to understand how much margin the design has. In the past, this type of detailed statistical analysis has not been available in the design environment. In order to perform statistical analysis, designers needed to export the data and perform the analysis with tools such as Microsoft Excel.</p>
<p>Beginning in IC6.1.7, Cadence<sup>®</sup> Virtuoso<sup>®</sup> ADE Explorer was released with features to support a designer’s need for statistical analysis. Just a note: for detailed technical information, you can explore the Cadence Online Support website or contact your Virtuoso front-end AE. Now let’s take a quick look at enhancements to Monte Carlo analysis, starting with the methods used to generate the samples for the analysis.</p>
<p>In Monte Carlo analysis, the values of statistical variables are perturbed based on the distributions defined in the transistor model. The method of selecting the sample points determines how quickly the results converge statistically. Let’s start with a quick review: in the CAPDAC example we ran 200 simulations and all of them passed. Does that mean that the yield is 100%? The answer is no; it means that for the sample set used for the Monte Carlo analysis, the yield is 100%. In order to know what the manufacturing yield will be, we need to define a target yield, for example, a yield greater than 3 standard deviations, or 99.73%, and define a level of confidence in the result, say 95%. Then we can use a statistical tool called the Clopper-Pearson method to determine if the Monte Carlo results have a greater than 95% chance of having a yield of 99.73%. The Clopper-Pearson method produces a confidence interval, the minimum and maximum possible yield, given the current yield, the number of Monte Carlo iterations, and so on. Often designers perform a fixed number of simulations (50, 100, etc.) based on experience and assume that the results predict the actual yield in production. By checking the confidence interval, we can reduce the risk of missing a yield issue. Another consequence of this rigorous approach to statistical analysis is that more iterations of Monte Carlo analysis are required. As a result, designers need better sampling methods that reduce the number of samples (Monte Carlo simulation iterations) required in order to trust the results.</p>
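The Clopper-Pearson interval can be computed from the beta distribution. The sketch below is our own illustration of the method referenced in the text (reference 2), not the tool's implementation; the exact bounds the tool reports may differ in detail (e.g. one-sided versus two-sided testing).

```python
from scipy import stats

def clopper_pearson(passes, n, confidence=0.90):
    """Two-sided Clopper-Pearson confidence interval on the yield,
    given `passes` passing points out of `n` Monte Carlo iterations."""
    alpha = 1.0 - confidence
    fails = n - passes
    lower = stats.beta.ppf(alpha / 2, passes, fails + 1) if passes else 0.0
    upper = stats.beta.ppf(1 - alpha / 2, passes + 1, fails) if fails else 1.0
    return lower, upper

# CAPDAC example from Part 3: 1 failure in 1755 samples.
lo, hi = clopper_pearson(1754, 1755)
```

Even with only one observed failure, the lower bound stays near the 99.73% (3σ) target, which is why so many samples were needed.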
<p>Random sampling is the reference method for Monte Carlo sampling since it replicates the actual physical processes that cause variation; however, random sampling is also inefficient, requiring many iterations (simulations) to converge. New sampling methods have been developed to improve the efficiency of Monte Carlo analysis by selecting sample points more uniformly. Shown in Figure 2 is a comparison of samples selected for two random variables, for example, n-channel mobility and gate oxide thickness. The plots show the samples generated by random sampling and by a new sampling algorithm called Low Discrepancy Sampling, or LDS. Looking at the sample points, it is clear that LDS has more uniformly spaced sample points. More uniformly spaced sample points mean that the sample space has been more thoroughly explored, so the statistical results converge more quickly. This translates into fewer samples being required to correctly estimate the statistical results: yield, mean value, and standard deviation.</p>
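The uniformity difference in Figure 2 can be reproduced with any low-discrepancy generator. In the sketch below, a Sobol sequence stands in for LDS (the post does not document the tool's exact algorithm), and "discrepancy" quantifies the deviation from perfectly uniform coverage: lower is more uniform.

```python
import numpy as np
from scipy.stats import qmc

# 256 points in 2D, e.g. n-channel mobility vs. gate oxide thickness,
# drawn with plain random sampling and with a scrambled Sobol sequence.
rng = np.random.default_rng(1)
random_pts = rng.random((256, 2))
sobol_pts = qmc.Sobol(d=2, scramble=True, seed=1).random(256)

# Centered discrepancy: deviation from uniform coverage of the unit square.
d_random = qmc.discrepancy(random_pts)
d_sobol = qmc.discrepancy(sobol_pts)
```

The low-discrepancy set covers the square far more evenly, which is exactly why the statistical estimates converge with fewer samples.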
<p><a href="https://community.cadence.com/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/0777.blog2_5F00_figure2.png"><img src="https://community.cadence.com/resized-image/__size/1200x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/0777.blog2_5F00_figure2.png" style="height:auto;display:block;margin-left:auto;margin-right:auto;" alt=" " /></a></p>
<p style="text-align:center;"><span style="font-size:inherit;"><strong>Figure 2: Comparison of Random Variable values using Random Sampling and LDS Sampling </strong></span></p>
<p>The LDS sampling method replaces Latin Hypercube sampling because it is as efficient and also supports Monte Carlo auto-stop. Monte Carlo auto-stop is an enhancement to Monte Carlo that optimizes simulation time. Statistical testing is used to determine if the design meets some test criterion; for example, for the CAPDAC, assume that you want to know with a 90% level of confidence that the SNR yield is greater than 99.73%. The user defines these criteria at the start of the Monte Carlo analysis, and the results are checked after every iteration. The analysis stops if one of two conditions occurs. First, the analysis will stop if the minimum yield from the Clopper-Pearson method is greater than the target, that is, if the SNR yield is greater than 99.73%. More importantly, the Monte Carlo analysis will also stop if Virtuoso ADE Explorer finds that the maximum yield from the Clopper-Pearson method will not exceed 99.73%. Since failing this test means that the design has an issue that needs to be fixed, this result is also important. It also turns out that failure is usually detected quickly, after a few iterations of the simulation. As a result, using statistical targets to automatically stop Monte Carlo can significantly reduce the simulation time. In practice, what does this look like? Consider the plot in Figure 3, which shows the upper bound (maximum yield), the lower bound (minimum yield), and the estimated yield of the CAPDAC as a function of the iteration number. The green line is the lower bound of the confidence interval on the estimated yield. By the 300<sup>th</sup> iteration, we know that the yield is greater than 99% with a confidence level of 90%; in other words, we can be very confident that the CAPDAC yield will be high. In addition, thanks to Monte Carlo auto-stop, we only needed to run the analysis once.</p>
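The two stopping conditions described above can be sketched as a loop over simulation results. This is our own illustration of the auto-stop idea, not the tool's code; it uses a Clopper-Pearson bound as described in the text, and the sample data are invented.

```python
from scipy import stats

def monte_carlo_autostop(sample_results, yield_target=0.9973, confidence=0.90):
    """Auto-stop sketch: after each iteration, check the Clopper-Pearson
    bounds and stop early on a clear pass or a clear fail.
    `sample_results` is an iterable of booleans (True = point met spec)."""
    alpha = 1.0 - confidence
    passes, n = 0, 0
    for n, ok in enumerate(sample_results, start=1):
        passes += ok
        fails = n - passes
        lower = stats.beta.ppf(alpha / 2, passes, fails + 1) if passes else 0.0
        upper = stats.beta.ppf(1 - alpha / 2, passes + 1, fails) if fails else 1.0
        if lower > yield_target:
            return "pass", n      # minimum yield already exceeds the target
        if upper < yield_target:
            return "fail", n      # maximum yield can no longer reach the target
    return "undecided", n
```

As the text notes, a failing design is flagged after only a few iterations, while a passing verdict at a 3σ target needs on the order of a thousand clean samples.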
<p><a href="https://community.cadence.com/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/5557.blog2_5F00_figure3.png"><img src="https://community.cadence.com/resized-image/__size/1200x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-00-15/5557.blog2_5F00_figure3.png" style="height:auto;display:block;margin-left:auto;margin-right:auto;" alt=" " /></a></p>
<p style="text-align:center;"><span style="font-size:inherit;"><strong>Figure 3: Yield Analysis Plot</strong></span></p>
<p>To summarize, the two improvements to Monte Carlo sampling are LDS sampling and Monte Carlo auto-stop. LDS sampling uses a new algorithm to more effectively select the sampling points for Monte Carlo analysis. Monte Carlo auto-stop uses the statistical targets, yield and confidence level, to determine when to stop the Monte Carlo analysis. As a result of these two new technologies, the amount of time required for Monte Carlo analysis can be significantly reduced.</p>
<p align="left">In the next article, we will look into analyzing Monte Carlo results to better understand our designs and how to improve them.</p>
<p align="left">Note 1: Remember that in analog design, designers rely on good matching to achieve high accuracy. A designer can start with a resistor whose absolute accuracy may vary by +/-10% and, by taking advantage of the good relative accuracy (matching) between adjacent resistors, achieve a highly accurate design. For example, the matching between adjacent resistors may be as good as 0.1%, allowing designers to build data converters of 10-bit (one LSB is about 1000 parts per million, ppm, of full scale), 12-bit (about 250 ppm), or even 14-bit (about 60 ppm) accuracy.</p>
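<p>As a quick check on those numbers: one LSB of an N-bit converter, as a fraction of full scale, is 1/2<sup>N</sup>. A short sketch (the helper name <code>lsb_ppm</code> is our own, for illustration):</p>

```python
# One LSB of an N-bit converter, expressed in parts per million (ppm)
# of full scale -- the matching accuracy the components must support.
def lsb_ppm(bits):
    return 1e6 / 2**bits

for bits in (10, 12, 14):
    print(bits, round(lsb_ppm(bits)))   # 10 -> 977, 12 -> 244, 14 -> 61
```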
<p align="left">Note 2: In reality, only the components in the design that are sensitive to process variation fail to scale, so the area of the digital blocks will scale, and the area of some of the analog blocks may scale. The solution designers typically adopt to maintain scaling is to implement new techniques, such as digitally assisted analog (DAA) design, to compensate for process variation. While adopting DAA may enable better scaling of the design, it also increases schedule risk and verification complexity.</p>
<p align="left">References:</p>
<p align="left">1) M.J.M. Pelgrom, A.C.J. Duinmaijer, and A.P.G. Welbers, “Matching properties of MOS transistors,” IEEE Journal of Solid-State Circuits, vol. 24, pp. 1433-1439, October 1989.</p>
<p align="left">2) See Clopper-Pearson interval: <a href="http://en.wikipedia.org/wiki/Binomial_proportion_confidence_interval">http://en.wikipedia.org/wiki/Binomial_proportion_confidence_interval</a></p>
<p><strong>The Art of Analog Design, Part 1: Overview of Variation-Aware and Robust Design</strong> (Arthur Schaldenbrand, August 10, 2017, <a href="https://community.cadence.com/cadence_blogs_8/b/cic/archive/2017/08/09/the-art-of-analog-design-part-1-overview-of-variation-aware-and-robust-design">original post</a>)</p>
<p>In this series, we will focus on advanced concepts for custom IC design, in particular, variation-aware design (VAD). With the emergence of high-speed simulators such as Spectre<sup>®</sup> APS, designers can now run simulations faster than ever before, so they are able to verify their designs more completely before tapeout. However, assuring that a design is successful requires more than verifying proper functionality for different stimulus and performance across corner conditions; it also requires properly allocating design margins based on process variation. Designers can use the Cadence<sup>®</sup> Virtuoso<sup>®</sup> ADE Product Suite not only to analyze the results and verify that the design is specification compliant, reducing the risk of design respins and getting the product to market faster, but also to increase competitiveness by reducing the effect of process variation on a design. Solving this problem requires more than fast simulation; it requires adopting new tools and methodologies.</p>
<p>First, let’s consider the impact of over-margining to avoid the negative effects of process variation on circuit performance. For example, let’s say we are designing a successive approximation ADC and find that the linearity of the capacitor digital-to-analog converter (CAPDAC), used to generate reference values, limits yield to 90%. Also assume that for the current design, the CAPDAC is 25% of the die area and there are 1000 die/wafer. If we can increase the yield to 99% by doubling the CAPDAC area, should we do it? Working through the numbers, we see that the current design has 900 good die per wafer, while the high-yield design has only 792 good die per wafer: doubling the CAPDAC adds 25% to the die area, leaving 800 die/wafer, and 800 die/wafer &times; 99% yield = 792. So even though the yield went up, profit will go down. There are two points to consider:</p>
<ol>
<li>Overdesign (designing with extra margin) is not free. Allowing too much design margin can hurt competitiveness.</li>
<li>The second point is subtler. To borrow from Mark Twain, “There are three kinds of lies: lies, damned lies, and statistics.” That is, we are relying heavily on statistical analysis to make critical decisions.</li>
</ol>
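<p>The economics above are easy to verify. A minimal sketch, using the hypothetical numbers from the example (the function name is our own):</p>

```python
# Compare good die per wafer before and after doubling the CAPDAC area.
def good_die_per_wafer(die_per_wafer, yield_fraction):
    """Good die shipped per wafer."""
    return die_per_wafer * yield_fraction

# Baseline: 1000 die/wafer, 90% yield.
baseline = good_die_per_wafer(1000, 0.90)           # 900.0

# Doubling the CAPDAC (25% of the die) adds 25% to the die area,
# so die/wafer drops to 1000 / 1.25 = 800; yield improves to 99%.
high_yield = good_die_per_wafer(1000 / 1.25, 0.99)  # 792.0

print(baseline, high_yield)
```

Despite the higher yield, the larger die ships fewer good parts per wafer, which is the whole point of the example.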
<p>What type of simulation was performed to generate the yield numbers? Should the results of these simulations be trusted? These are questions that we also need to consider when deciding which design to take to production. In the first part of this series of articles, we will explore variation-aware design. The question to be considered is how to balance immunity to process variation against the cost in terms of product competitiveness. In the second half of these articles, we will explore reliability analysis for devices and interconnect. Again, this is an area where designers have traditionally relied on allocating design margin and overdesign to prevent issues. The question to be considered is, as the importance of designing for automotive, industrial, and infrastructure applications grows, do we have enough design margin? Automotive designs operate in harsher environments and may need to operate reliably for years after a consumer product would have been recycled. The need for these types of solutions has been anticipated, and these capabilities already exist in the design environment.</p>
<p align="left">In the next article, we will look into Monte Carlo sampling methods to see how we can minimize the number of simulations required to determine the yield of the circuit.</p>