
  1. Community Forums
  2. Custom IC SKILL
  3. OverUnder on large polygons

Stats

  • Locked
  • Replies 5
  • Subscribers 144
  • Views 15254
  • Members are here 0
This discussion has been locked.
You can no longer post new replies to this discussion. If you have a question, you can start a new discussion.

OverUnder on large polygons

lkphoto
lkphoto over 11 years ago
Hi Andrew,

I have problems performing the "OverUnder" operation (first sizing with dbLayerSize by a positive amount d, then by -2d, then by +d again), caused by the fact that polygons cannot have more than 4000 points in SKILL. The purpose of the OverUnder operation is to remove acute angles, too-small minimum spacings, and too-small features during data preparation, which would otherwise create DRC errors.
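For reference, the grow/shrink/grow sequence could be sketched in SKILL roughly like this, assuming an open cellview cv, a layer name, and a list of shape objects; the routine name, the parameter names, and the cleanup of intermediate shapes are my own and not from the thread:

```skill
; Hedged sketch of the over-under-over sequence: size by +d, then
; -2*d, then +d again. All names here are assumptions.
procedure(OverUnder(cv layer shapes d)
  let((grown shrunk result)
    grown = dbLayerSize(cv layer shapes d)      ; +d closes gaps narrower than 2*d
    shrunk = dbLayerSize(cv layer grown -2*d)   ; -2*d removes features narrower than 2*d
    result = dbLayerSize(cv layer shrunk d)     ; +d restores the nominal size
    ; delete the intermediate shapes so only the result stays on the layer
    foreach(s append(grown shrunk) dbDeleteObject(s))
    result
  )
)
```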

I am designing photonic components, so the typical size of my cells is 10um x 10um or larger; moreover I have to use orthogonal polygons, and the curves are snapped to a 1nm grid.

I have a routine that writes such large polygons as a sum of subpolygons, each having less than 4000 points.

When I apply the dbLayerSize function to all the subpolygons at the same time, Cadence crashes – I assume because all the subpolygons together have more than 4000 points.

I therefore ended up applying the dbLayerSize operation to single pairs of neighboring subpolygons, one at a time. The problem is that the result of the OverUnder operation then depends on how the big original shape is split into subpolygons. This is clearly not a clean solution, and I discarded it. It is also quite slow (taking a few minutes for a single layer).

As a second attempt, I tiled the big polygon (the union of the many subpolygons) into square tiles (using dbLayerAndNot for this) and applied OverUnder to each tile separately. Once this was done, I repeated the operation with the dual square lattice obtained by translating the tiles by half the tile pitch in both x and y. The result is predictable (as long as the tile size is sufficient), but the algorithm is extremely slow because of the dbLayerAndNot operation.
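One pass of this tiling scheme could be sketched as follows (the dual pass is the same loop with the origin shifted by pitch/2 in x and y). All names are assumptions, and OverUnder stands for a user routine wrapping the dbLayerSize grow/shrink/grow sequence:

```skill
; Hedged sketch: clip the merged shapes to each square tile and run
; the over/under cleanup on the clipped piece. Argument details of
; dbLayerAnd should be checked against the SKILL reference.
procedure(TilePass(cv lpp shapes pitch d x0 y0 nx ny)
  let((tile clipped)
    for(i 0 nx-1
      for(j 0 ny-1
        ; rectangle covering tile (i,j)
        tile = dbCreateRect(cv lpp
          list((x0+i*pitch):(y0+j*pitch)
               (x0+(i+1)*pitch):(y0+(j+1)*pitch)))
        ; clip the shapes to this tile ...
        clipped = dbLayerAnd(cv car(lpp) shapes list(tile))
        ; ... and clean the piece (OverUnder is an assumed user routine)
        OverUnder(cv car(lpp) clipped d)
        dbDeleteObject(tile)
      )
    )
  )
)
```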

I wonder if there is a known solution to this problem.

For example, is there a way to get rid of the 4000 vertices per polygon limitation?

It would be great if I didn’t have to go outside the SKILL environment (e.g. by using Calibre), since these pCells see a number of other post-processing steps in SKILL.

Thanks a lot in advance,
  • tweeks
    tweeks over 11 years ago

    lkphoto said:
    Hi Andrew,

    Gee, can I reply too? :P  Maybe we should rename this the "Ask Andrew Forum".

    I'm going to assume that, if your message was really intended only for Andrew, you would have contacted him by email or the Cadence community private message feature.

    lkphoto said:

    When I apply the dbLayerSize function to all the subpolygons at the same time, Cadence crashes – I assume because all the subpolygons together have more than 4000 points.

    Please report this to your AE so that the bug gets fixed.

    lkphoto said:

    For example, is there a way to get rid of the 4000 vertices per polygon limitation?

    Have you tried using point arrays?

    We have:


    dbCompressPointArray
    dbPointArrayAnd 
    dbPointArrayAndNot 
    dbPointArrayOr 
    dbPointArraySize
    dbPointArrayXor
    dbTransformPointArray 
    geCompressPointArray
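For instance (a hedged sketch of my own, not from the thread: point arrays are bare lists of x:y coordinates, so no database shapes are involved; the exact argument shapes of these functions should be verified in the SKILL API Finder before use):

```skill
; Two overlapping 10x10 squares as raw point lists (no db shapes)
squareA = list(0:0 10:0 10:10 0:10)
squareB = list(5:5 15:5 15:15 5:15)
; Boolean OR of the two point arrays -- assumed to take and return
; lists of point lists; verify the signature in the API Finder
merged = dbPointArrayOr(list(squareA) list(squareB))
; size the merged outline(s) up by 0.5 user units (assumed signature)
sized = dbPointArraySize(merged 0.5)
```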
    

    lkphoto said:

    It would be great if I didn’t have to go outside the SKILL environment (e.g. by using Calibre), since these pCells see a number of other post-processing steps in SKILL.

    Could you stream out, run Calibre, and stream back in to SKILL? Or run Calibre on the OA database directly using sh() or ipcBeginProcess()? And then there's Assura... :)

  • tweeks
    tweeks over 11 years ago

    Another sneaky trick would be to take advantage of the rarely-used "magnification" attribute of an instance's transform to perform the over- and under-sizing. 

  • Andrew Beckett
    Andrew Beckett over 11 years ago

    A few points:

    • Tom is right - this is not an "Ask Andrew" forum - there are others who respond in this forum, and that's really the point - it's a community forum. I reply to these posts when I have a moment (it's not part of my job). Sure I probably respond more than most, but others do too.
    • That said, please don't send me private messages asking for responses for things that can equally well be asked in the forums or to customer support. Given that I do have a full time job here in Europe, I don't really have the bandwidth to deal with direct unsolicited private emails or personal messages (obviously I make exceptions, but it's not a scalable solution for every CIC engineer in the world to send me direct questions - this is why we have customer support).
    • You didn't mention which version you are using. There are some limitations:
      • CDB (the IC5141 and before database) has a limit of 4000 points in a polygon
      • OA (e.g. IC61 and later) does not have a limit
    • Various functions (such as the dbLayer functions) will automatically split a large polygon into pieces to ensure that it stays beneath a point limit. It should not crash (and I'm not aware of any crash issues, so maybe it is specific to the version you're using?). I just created an 8000 point polygon in IC616, and was able to run dbLayerSize on it with no problems.
    • In IC61, by default dbLayerSize will split with a maximum number of vertices of 195. However, I could quite happily do:
         b=dbLayerSize(cv "Metal2" a 50 10000)
      which sizes up by 50um and creates a single polygon because the resulting polygon with 8000 points has fewer than 10000 points.
    • Even in IC5141, the fact that you are limited to 4000 points shouldn't actually cause you any problems because you would operate on the complete list of shapes that you build and the tool will deal with them properly as if they were merged.
    • Tom's suggestion of using the magnification factor in (say) dbMoveFig to scale it up won't help, because that is a scaling rather than a sizing (which are different options). With scaling, it's just like a zoom - it doesn't cause any omission of narrow tracks or merging of narrow gaps. Note that mag of anything other than 1 cannot be used on instances in OA (only on transformations of shapes).

    Regards,

    Andrew.

  • lkphoto
    lkphoto over 11 years ago

     Thanks for the inputs, Andrew.
    -ok, just kidding :/
    Thanks everyone ;b

    I have been using IC6.1.4, IC6.1.5 and IC6.1.6.
    Indeed, in IC6.1.6.101 dbLayerSize as well as the dbLayerAnd and the other Boolean operations work fine. This update is of great help.
    However, dbCreatePolygon still does NOT work with more than 4000 vertices (in IC6.1.6.101). I assumed that since this wasn’t fixed, dbLayerSize and the other functions weren’t fixed either – my mistake.

    A few more observations:

    1. The crash described above happened on IC6.1.4 – below is a portion of the crash report:

    startdate:Wed Jun 25 10:36:06 2014
    crashdate:Wed Jun 25 10:38:53 2014
    appname:virtuoso
    version:@(#)$CDS: virtuoso version 6.1.4 12/02/2010 23:18 (sjfnl010) $
    subversion:IC6.1.4.500.10

    2. Below is the error message which appears in IC6.1.6.101 as soon as one starts making polygons with more than 4000 points:

    Generating Pcell for '1_Test_Circle layout'.
    (4999)
    *Error* dbCreatePolygon: Invalid point list - ((0.006283178 4.999996) (0.01256635 4.999984) (0.0188495 4.999964) (0.02513261 4.999937) (0.03141569 4.999901) ... )
    <<< Stack Trace >>>
    dbCreatePolygon(pcCellView list("rxphot" "drawing") CoordsCircle)
    (XX = dbCreatePolygon(pcCellView list("rxphot" "drawing") CoordsCircle))
    let((Num X0 Y0 R) (Num = 4003) (Num = 4002) (Num = 5000) (X0 = 0.0) ... )
    prog((_pcParameters Xx) if(!pcCellView return()) (_pcParameters = (pcCellView~>parameters)) (Xx = (_pcParameters~>Xx)) if(!stringp(Xx) then (Xx = "X")) ... )
    (... in pcDefinePCell ...)

    3. The dbPointArrayXxx functions suggested above seem to have the same problem with the 4000-point limit (also in IC6.1.6.101), with the same type of error message as for dbCreatePolygon.

    Is there an “official” way of creating polygons starting from a list with more than 4000 points?

    Regards,
    lkphoto

  • Andrew Beckett
    Andrew Beckett over 11 years ago

    I'd obviously encountered this limit before, because I found some code in a small example which showed exactly the same failure with dbCreatePolygon. Doing a bit of searching, I saw one report of this issue with dbCreatePolygon, but that didn't end up with an agreed CCR to implement an increase, partly because it was filed against IC5141, and partly because it didn't really justify why the increase was needed.

    I've seen no reports of the failure message from the dbPointArray functions, which report:

    *Error* dbPointArrayBoolean: Invalid point list - ((2050.0 0.0) (2049.999 1.611) (2049.997 3.221) (2049.994 4.832) (2049.99 6.442) ... )

    There isn't really a workaround other than tiling the shapes yourself first (splitting them into pieces), which isn't entirely trivial (not impossible, just work). You can of course build up the shapes from smaller pieces if that works for you.
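Building from smaller pieces might look like this sketch, assuming `chunks` is a list of point lists (each under the 4000-point dbCreatePolygon limit) produced by a splitting routine of your own, which is not shown; the routine and parameter names are assumptions:

```skill
; Hedged sketch: create one polygon per chunk, then merge the pieces
; with dbLayerOr so only the combined result remains on the layer.
; lpp is a layer-purpose pair such as list("rxphot" "drawing").
procedure(CreateLargePolygon(cv lpp chunks)
  let((subShapes)
    ; each chunk stays under the per-polygon point limit
    subShapes = foreach(mapcar pts chunks
      dbCreatePolygon(cv lpp pts)
    )
    ; merge the pieces, then delete the originals
    prog1(
      dbLayerOr(cv car(lpp) subShapes subShapes)
      foreach(s subShapes dbDeleteObject(s))
    )
  )
)
```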

    If you want this functionality to be enhanced, please contact customer support with details of why this is important for your needs, and then we can get CCRs filed for both the dbCreatePolygon issue, and also for the dbPointArray.* issue (which are similar, but should be dealt with separately). I could file CCRs, but without the customer link and justification it'll be difficult to get it prioritised.

    Kind Regards,

    Andrew.

