I am still quite a newbie at this, so I apologize if I'm asking a silly question.
As I develop more and more SKILL functions, I'm finding it increasingly difficult to ensure that each individual function keeps working as expected. The obvious solution is to implement some sort of regression testing. Unfortunately, a web search turns up no relevant system for SKILL code.
Has anyone out there implemented such a system? Is there any software that exists to aid in this?
It depends what "working as expected" means.
If you are only checking return values, you can do something like this:
Create a script that calls all your functions and saves the associated return values in a text file.
Run that script from Virtuoso using load(), and save the output.txt file as something like golden_image.txt.
Keep a copy of your CDS.log file.
Later, when you have modified the functions, you just have to run
virtuoso -replay CDS.log
and then compare output.txt with golden_image.txt.
If the output is something more complex (e.g. creating/modifying cells, etc.) then you must do a diff-like comparison between the golden output and the output of the latest run.
Anyway, to automate the job you may want to use "virtuoso -replay", possibly "virtuoso -nograph -replay ..." to avoid splashing windows all over the screen.
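As a rough sketch, the driver script could look like this (myAdd and myScale are hypothetical placeholders for your own functions; substitute whatever you actually need to check):

```skill
; regression_driver.il -- call each function and record its return value
; myAdd and myScale are placeholder names for your own functions
let( ( port )
    port = outfile( "./output.txt" "w" )
    ; %L prints any SKILL object, so complex return values are captured too
    fprintf( port "myAdd( 2 3 ) => %L\n" myAdd( 2 3 ) )
    fprintf( port "myScale( 1.5 4 ) => %L\n" myScale( 1.5 4 ) )
    close( port )
)
```

After a baseline run, copy output.txt to golden_image.txt; subsequent runs can then be checked with a plain "diff output.txt golden_image.txt".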
In reply to marcelpreda:
You could also use functions like assert() in some test code to ensure that functions are returning what you expect - effectively design a set of unit tests.
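A minimal sketch of such a unit test in SKILL (myAdd is a placeholder for one of your own functions; if assert() isn't available in your environment, a small helper like the checkEqual below serves the same purpose):

```skill
; minimal unit-test helper: prints PASS/FAIL for each expectation
; myAdd is a placeholder name for one of your own functions
procedure( checkEqual( name expected actual )
    if( equal( expected actual )
        printf( "PASS: %s\n" name )
        printf( "FAIL: %s - expected %L, got %L\n" name expected actual )
    )
)

; example checks -- one line per expectation
checkEqual( "myAdd returns sum" 5 myAdd( 2 3 ) )
checkEqual( "myAdd handles zero" 2 myAdd( 2 0 ) )
```

Loading the test file prints one PASS/FAIL line per check, so grepping the output for FAIL gives a quick regression summary.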
In reply to Andrew Beckett:
@Marcel, yes, I had thought of doing something like that. I was just curious whether something else already existed.
@Andrew, thanks! I was unaware of the assert function. This should allow me to do what I had in mind.
I think that with assert() statements and the method Marcel mentioned, it should be possible to build a simple regression environment.