
  1. Community Forums
  2. Custom IC SKILL
  3. Something about the panic file

Stats

  • Locked Locked
  • Replies 7
  • Subscribers 145
  • Views 18851
  • Members are here 0
This discussion has been locked.
You can no longer post new replies to this discussion. If you have a question you can start a new discussion

Something about the panic file

nidon
nidon over 13 years ago

Dear sir:

I wrote some code to generate layouts (more than 80,000 of them), but when I run it, the CIW sometimes shows this message:

CellView(test1 layout) from lib(test) is saved in the panic file(/path/test/test1/layout/layout.cd-)

To recover do:

dbOpenPanicCellView("test" "test1" "layout")

Then icfb closes by itself, and I have to rerun the code.

I guess the reason is that I generated too many layouts at one time, isn't it?

If so, what should I do?

Thank you!

  • Quek
    Quek over 13 years ago

    Hi nidon

    This means that your Virtuoso session has crashed. It might be due to your code, or it might be a bug. Please report the issue to your local Cadence support together with the CDS.log file that shows the crash stack.

    Best regards
    Quek

  • nidon
    nidon over 13 years ago

    Hi Quek

    Thanks for your reply. As you said, I checked the CDS.log file, and Virtuoso did indeed crash.

    But I found two other problems:

    1. The crash seems to be caused by generating too many layouts at one time; when I reduced the number of layouts, the code ran well.

    2. I found that at the beginning of a run the code executes very fast, but after a while it begins to slow down. Is there any way to keep up the speed?

  • Quek
    Quek over 13 years ago

    Hi nidon

    It is good to know that you have a workaround for the crash. You might want to examine your code to make it more efficient (e.g. use dbClose to purge unused cellviews) so that it does not take up more and more memory as it runs. You can use the Profiler under "ciw: Tools->SKILL Development" to check the amount of time/memory used by the various functions in your code. It requires a SKILL Development license.
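
    A minimal sketch of the open/save/close pattern Quek describes (library name, cell names, and the placeholder rectangle are illustrative, not from the original post):

    ```skill
    ;; Open each cellview, populate it, save it, then purge it from
    ;; virtual memory with dbClose() so memory use stays flat no
    ;; matter how many layouts are generated.
    procedure( genManyLayouts(libName count)
      for( i 1 count
        let( ((cv dbOpenCellViewByType(libName sprintf(nil "cell%d" i)
                                       "layout" "maskLayout" "w")))
          ;; ... create shapes in cv here, e.g. a placeholder rectangle:
          dbCreateRect(cv list("metal1" "drawing") list(0:0 1:1))
          dbSave(cv)
          dbClose(cv)   ; release the cellview from memory
        )
      )
    )
    ```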


    Best regards
    Quek

  • marcelpreda
    marcelpreda over 13 years ago

    Hi,

    Regarding #1, the reason could be memory: too much memory used and no possibility to allocate more.

    I guess that if this is the case, you should see some message in CDS.log before the message that refers to the panic file.

    Regarding #2, have a look at the function hiSetUndoLimit(n_undoLimit).

    Before starting to create the layouts, call hiSetUndoLimit(0).

    This will disable the Undo edit feature, but procedures like creating/resizing/deleting shapes should run faster.

    I have found it very useful when doing thousands of operations with shapes.
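
    Marcel's suggestion in context, as a short sketch (the restore value afterwards is illustrative, not from the post):

    ```skill
    ;; Disable undo recording before bulk shape edits, restore it after.
    hiSetUndoLimit(0)    ; 0 disables the undo stack entirely
    ;; ... bulk creation/resizing/deletion of shapes here ...
    hiSetUndoLimit(10)   ; restore a small undo limit (value illustrative)
    ```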

    Best Regards,

    Marcel

  • nidon
    nidon over 13 years ago

    Thanks to Quek and marcelpreda.

    I checked my code and found that I use dbOpenCellViewByType but never call dbClose to release the memory that dbOpenCellViewByType allocates.

    So I think that is the key reason.

    I also tested your suggestions for improving the script's speed. I used three variants: dbOpenCellViewByType alone (fileA), dbOpenCellViewByType + dbClose (fileB), and dbOpenCellViewByType + dbClose + hiSetUndoLimit(0) (fileC), each generating 60,000 layouts in total, and ran them at the same time.

    The result: running time of fileB > fileA >= fileC.

    I find this very interesting. Theoretically, since dbClose releases memory, fileA should have the longest running time of the three, but the longest was actually fileB (I ran the three files on three computers).
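
    A compact sketch of how the three variants differ (library/cell names and the shape are placeholders; whether fileA saved its cellviews is not stated in the post):

    ```skill
    ;; fileA: open only               -> memory grows with each cellview
    ;; fileB: open + save + close     -> flat memory, pays disk-write cost
    ;; fileC: as fileB, undo disabled -> flat memory, no undo bookkeeping
    hiSetUndoLimit(0)   ; fileC only
    for( i 1 60000
      let( ((cv dbOpenCellViewByType("myLib" sprintf(nil "cell%d" i)
                                     "layout" "maskLayout" "w")))
        dbCreateRect(cv list("metal1" "drawing") list(0:0 1:1))
        dbSave(cv)      ; fileB and fileC only
        dbClose(cv)     ; fileB and fileC only
      )
    )
    ```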

     

     

  • marcelpreda
    marcelpreda over 13 years ago

    Hi,

    I think it is OK to have exec_time(fileB) > exec_time(fileA).

    The idea is that dbClose() also saves data to disk, and that operation is time-consuming.

    Best Regards,

    Marcel

  • nidon
    nidon over 13 years ago

    Thanks for your advice, marcelpreda.

    It reduced the average execution time of the files by more than 30%.

    Thanks for your kind support, marcelpreda and Quek.



© 2025 Cadence Design Systems, Inc. All Rights Reserved.
