npn13g2 subckt simulation problem
Closed this issue · 7 comments
Environment
- Klayout Version: 0.29.7
- OS/Platform: linux
- XSchem 3.4.5
- NGSpice 43
Expected Behavior
Both testbenches (inv_ttl_tran and inv_ttl_tran2) are expected to work, since the only difference between them is that the transistors in one are placed inside a subckt.
Actual Behavior
The testbench with the subckt fails with a subckt error message:
Error: unknown subckt: x1.xq1 out x1.net1 0 0 x1.npn13g2
Simulation interrupted due to error!
Steps to Reproduce the Problem
- Just run 'ngspice inv_ttl_tran.spice' in the attached testcase; the error message above will appear.
Both testbench netlists and the xschem files are provided in the attachment as a testcase.
inv_ttl.tar.gz
PS: I didn't have any problems with MOSFETs using the same hierarchical flow in xschem/ngspice.
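For context, the failing pattern can be sketched roughly as below. This is a hypothetical minimal netlist, not the attached testcase: the node names and the inv_core subckt are illustrative, and it assumes npn13g2 is itself defined as a subckt in the PDK model files (which is what the X-prefixed instance in the error message suggests).

```
* Illustrative sketch only -- names are assumed, not from the testcase
.subckt inv_core in out
* npn13g2 is assumed to be a PDK subckt, so it is instantiated with X
XQ1 out in 0 0 npn13g2
.ends

X1 a b inv_core
* ngspice then reports something like:
* Error: unknown subckt: x1.xq1 ... x1.npn13g2
```

The key point is that the device subckt resolves fine at the top level but is not found once its instance is nested inside another subckt.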
Hi @britovski, could you please take a look at PR #241? It should resolve your issue with hierarchical schematic entry.
Hi @KrzysztofHerman
The hierarchical schematic problem may be fixed, but with the PR update the same testcase now produces other errors (for both testbenches, with and without the subckt).
You can run either SPICE netlist to see these errors (the same ones appear for netlists re-generated from xschem).
Warning: singular matrix: check node q.x1.xq2.qnpn13g2#collCI
Note: Starting dynamic gmin stepping
Warning: singular matrix: check node x1.xq2.s1
Warning: singular matrix: check node q.x1.xq2.qnpn13g2#substrate
Note: Dynamic gmin stepping completed
Warning: singular matrix: check node q.x1.xq2.qnpn13g2#collCI
Warning: singular matrix: check node q.x1.xq2.qnpn13g2#collCI
Warning: singular matrix: check node q.x1.xq2.qnpn13g2#collCI
doAnalyses: TRAN: Timestep too small; time = 1.6e-11, timestep = 6.25e-23: trouble with node "vdd"
tran simulation(s) aborted
@KrzysztofHerman When I commented out the 'option klu' line in the .spiceinit, it worked.
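For reference, the change amounts to prefixing the option line with the SPICE comment character '*' in .spiceinit (the rest of the file's contents may differ from this sketch):

```
* .spiceinit -- KLU solver disabled by commenting out the option line
* option klu
```

With the line commented out, ngspice falls back to its default Sparse 1.3 matrix solver.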
@britovski good to hear that. The KLU solver makes sense only for big circuits, where it can speed up the simulation by a factor of about 2. Additionally, not all analyses (e.g. SP) can be executed using KLU. Can we close this thread?
@KrzysztofHerman Yes. We can close. Thanks.