Below is the OCEAN script:
simulator('spectre)
hostMode('distributed)
; ... design, analysis & simulator options ...
run(?jobName "job1" ?queue "invglobal" ?lsfResourceStr "rusage[mem=2048]" ?lsfNoOfProcessors "4")
The run() command above throws the error shown below.
ERROR (49): Job Submission Error: Job submission failed; LSF Error 120 - Request aborted by esub. Try to submit an LSF job from the command line using the same setup. Resolve any LSF issue found in this step before submitting the job to LSF through ADE DP. [job1001]
When the run() command arguments are changed as shown below, the LSF job is submitted successfully.
run(?jobName "job1" ?drmsCmd "bsub -n 1 -M 2 -q invglobal -P gpio")
Now, I understand that the former run() command is not given a project name in its arguments, while the latter run() command, which submitted the job successfully, is. The question is: how do I provide the project name in the former run() command?
Also, is there a way to see what shell command the former run() command is executing? If we knew the equivalent shell command, we would know which bsub options are missing.
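For what it's worth, here is my guess at the bsub line the former run() command builds, mapping each run() argument onto its bsub counterpart. This is purely an assumption - the actual command OCEAN issues is not visible to me:

```shell
# Guessed equivalent of the former run() submission (assumption only).
# Argument mapping:
#   ?queue             -> -q invglobal
#   ?lsfResourceStr    -> -R "rusage[mem=2048]"
#   ?lsfNoOfProcessors -> -n 4
bsub -q invglobal -R "rusage[mem=2048]" -n 4 <simulation command>
# Note: no -P <project> option here, unlike the working ?drmsCmd form.
```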
My wild guess would have been run(?lsfProjectName "gpio" ...) but I can't find that this is implemented. It's in the ADE UI, and you can seed the settings with:
envSetVal("asimenv.distributed" "lsfProjectName" 'string "gpio")
envSetVal("asimenv.distributed" "selectLsfProjectName" 'boolean t)
but I'm not sure they influence an OCEAN distributed job.
So I'd suggest you try ?lsfProjectName as above, and then the cdsenv settings (before doing the simulator('spectre) in your OCEAN script), and if that doesn't work you might want to contact customer support.
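Putting that together, a sketch of the OCEAN script I'd try is below. This is untested - whether run() actually picks up these .cdsenv settings is exactly what's in question:

```skill
; Untested sketch: seed the LSF project settings before starting the session.
envSetVal("asimenv.distributed" "selectLsfProjectName" 'boolean t)
envSetVal("asimenv.distributed" "lsfProjectName" 'string "gpio")
simulator('spectre)
hostMode('distributed)
; ... design, analysis & simulator options ...
run(?jobName "job1" ?queue "invglobal" ?lsfResourceStr "rusage[mem=2048]" ?lsfNoOfProcessors "4")
```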
However, if this needs changing, it's unlikely to be implemented because this can be done right now with ADE Explorer/Assembler - you'd set up a job policy where this is filled in and then you could use the Maestro SKILL functions to run the simulation from batch. Any fix would need a change to the ADE L infrastructure (particularly the ADE L job submission infrastructure) which is deprecated in favour of ADE Explorer going forward. So contacting customer support would be to see if a workaround can be found for now (I don't have time to do the research in my spare time - better to have this done through the normal support channel).
I understand that you are suggesting two approaches.
1). run(?lsfProjectName "gpio" ...)
As you suspected, it's not implemented in the run() command; it produces the warning below and the job submission fails.
WARNING (OCN-6153): The argument "?lsfProjectName" in run command is not a valid argument. Therefore, the argument "?lsfProjectName" is ignored.
2). The envSetVal() .cdsenv settings
I found that these settings do not influence OCEAN; the job submission failed again.
However, with these settings in place, the ADE distributed processing flow submits the job successfully while OCEAN still fails.
PS: In my previous post I asked a question, which I'm pasting below.
"Is there a way to see what shell command the former run() command is executing? If we knew the equivalent shell command, we would know which bsub options are missing."
Sorry - I'd not noticed when I'd tried it on my local LSF farm (which doesn't have projects set up) that the ?lsfProjectName gives a warning. You're quite right...
There are some debug options for the "LBS" integration with LSF, but I think the fundamental issue here is that project support doesn't seem to have made it through to the OCEAN layer. That would need customer support interaction (same for the debugging options - I think this can be done, but I don't have the time to check right now).
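In the meantime, the ?drmsCmd form from earlier in the thread is the practical workaround, since it lets you write the complete bsub line yourself, including the -P project option:

```skill
; Known-working workaround from this thread: pass the full bsub command
; (with -P for the project) via ?drmsCmd instead of the individual ?lsf* args.
run(?jobName "job1" ?drmsCmd "bsub -n 1 -M 2 -q invglobal -P gpio")
; Other bsub options (e.g. -R for resource strings, -n for slot count)
; can be appended to the same string as needed.
```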