Usually, once the first board samples are built and tested, if a signal waveform looks bad it is better to run a simulation on it and find a solution that can be applied in the second-phase sample build.
If the initial design is not too confidential, it's worth taking some time to simulate.
What is your opinion? With my limited experience, maybe what I've described is not a good approach.
A young HW development engineer from Shanghai, China.
There are a lot of different approaches designers take to signal integrity, but from my experience it's generally better to run simulations on at least your critical nets before that first board spin to verify that there aren't going to be serious problems. If you can find those "bad waveforms" up front, then you can eliminate a lot of headaches and late nights in the lab trying to debug intermittent failures that turn out to be signal-integrity related.
It's even better if you can take the time up front and run some SI analysis to come up with routing constraints for the layout (spacing to control crosstalk, lengths for proper SI and timing, etc). A lot of designers will just use the routing guidelines provided by their IC vendors instead of coming up with their own, but for a majority of cases, your stackup design and routing channels aren't exactly the same as what they suggest. That means that some of the assumptions that go into their constraints may not be valid for your application. It's tough to tell if you're going to be OK or not without doing a little bit of simulation with their constraints on your actual stackup and board conditions.
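To make the idea of deriving your own spacing constraint concrete, here is a deliberately crude sketch. It uses the common rule-of-thumb approximation that saturated coupling between two parallel microstrip traces falls off roughly as 1/(1+(s/h)^2), where s is edge-to-edge spacing and h is dielectric height; the function names, the 5% budget, and the geometry are all illustrative assumptions, not anyone's actual design rule, and a real constraint should come from a field-solver-based simulation of your stackup.

```python
# Rough first-pass spacing screen using the 1/(1 + (s/h)^2) rule of thumb
# for saturated near-end coupling on microstrip. Illustrative only -- real
# constraints should come from a field solver on your actual stackup.

def coupling_estimate(spacing_mil: float, height_mil: float) -> float:
    """Approximate coupled fraction for two parallel microstrip traces."""
    return 1.0 / (1.0 + (spacing_mil / height_mil) ** 2)

def min_spacing(height_mil: float, max_coupling: float) -> float:
    """Invert the estimate: spacing needed to stay inside a coupling budget."""
    return height_mil * ((1.0 / max_coupling) - 1.0) ** 0.5

# Example: 5 mil dielectric height, 5% coupling budget
print(round(min_spacing(5.0, 0.05), 2))  # required spacing in mils
```

Note how quickly the answer moves with dielectric height: halve h and the required spacing halves too, which is exactly why a vendor guideline written for one stackup may not hold on yours.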
Do others agree? Or do you find always designing with the IC vendors design guidelines for routing to be "good enough"?
To simulate or not to simulate, that is the question...
We simulate to understand the electrical effect of layout decisions.
If you have used the same technology a number of times in the past and you are doing something that you understand well, I see no reason to simulate.
If you have a design specification from a trusted source and you are doing EXACTLY what the specification proved works properly, I see no reason to simulate.
If you do not fall into one of those two situations, you need to get information from somewhere. The only choices are to build it or to simulate it. A typical cost to build is $4,000-$6,000 and 2-4 weeks of lost engineering time. If your build does not work, what do you know? You know it does not work, but you do not necessarily know why.
We as engineers love test equipment. Give me a 6 Gig storage scope ($50K) and a nice high frequency probe ($6K) and you will make me a very happy engineer. The problem is that the scope can only tell me something after the board has been built.
I make my living fixing other people's problems. When I come in on a consulting gig, they normally escort me to the lab where the scope and spectrum analyzer are set up. We look at the instruments very intently and note that there is definitely a problem. I am always very polite, but I cannot fix the problem with the oscilloscope. There are no knobs to tweak the circuit. I always drop the problem into a simulator, where I have the knobs to figure out how the circuit is really working.
Comparing a $50K investment in an oscilloscope vs. a $50K investment in a good simulator is like comparing medical doctors. If you have a pain in your chest, who do you want to see first, the cardiologist or the pathologist? They can both figure out that you have a heart problem. The difference is, the cardiologist can tell you how to fix the problem, but the pathologist will just confirm you died of a heart attack. I want to do it right the first time. Life is just better that way.
A final word of advice from a confirmed simulation addict: all simulators work with a set of underlying assumptions. Make sure you understand those assumptions. Using a simulator while ignoring the underlying assumptions is like driving up the "Exit" ramp on the freeway: you may get 100 yards, or you may get a mile or more, but sooner or later there is going to be a problem.
SI problems are always best avoided by good design, rather than fixed during debug. It's always really hard to effect changes in debug, and you usually have to re-spin the board to even know if your changes will solve the problem.
Simulation up front helps prevent that, generally (but not always) allowing you to get it right in design. I prefer to think of it as modelling rather than simulation, because it's more about what might happen than what will happen. The accuracy is not always great, but usually enough to tell you where the problems will be. Often it's just a case of understanding a particular device's ability to drive a certain line. Datasheets are usually thin on meaningful information about this aspect of performance, but a decent IBIS model will take you a long way. Of course, you do need a decent IBIS model - not always a given.
Getting a decent IBIS model... boy, have I heard that before. Having been in the SI simulation business for the last ten years, I have to say it's been the number one issue over the years. Sure, we are always getting feature requests - but this one is huge. With the sheer number of ICs that come out each year, the only way this can get resolved is for the semiconductor vendors to realize the importance of good IBIS models at design-in time. It seems they put new engineers on this problem more often than not... and it shows.
My request to all the simulation users out there: don't let your semiconductor vendor off the hook. They need to provide quality IBIS models so you can do your job without first having to create a model yourself.
I agree that running full SI analysis is time consuming, but I run a crosstalk report on all PCBs at a minimum. If I can give my customers a PCB that passes crosstalk, both through routing and layer management, then I've done better than not doing any post-layout SI analysis at all.
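A minimal batch crosstalk report in that spirit might just flag net pairs whose estimated coupling exceeds a threshold. The sketch below is illustrative only: the net names are invented, and the 1/(1+(s/h)^2) coupling estimate is a rough microstrip rule of thumb standing in for whatever your SI tool actually computes.

```python
# Toy batch crosstalk report: flag net pairs whose estimated coupling
# exceeds a budget. Net data and the 1/(1 + (s/h)^2) estimate are
# illustrative assumptions, not output from any real SI tool.

def crosstalk_report(pairs, height_mil=5.0, limit=0.05):
    """pairs: iterable of (aggressor, victim, spacing_mil). Returns failures."""
    failures = []
    for agg, vic, spacing in pairs:
        coupled = 1.0 / (1.0 + (spacing / height_mil) ** 2)
        if coupled > limit:
            failures.append((agg, vic, round(coupled, 3)))
    return failures

pairs = [("CLK", "DATA0", 25.0), ("CLK", "DATA1", 6.0)]
for agg, vic, xt in crosstalk_report(pairs):
    print(f"FAIL {agg} -> {vic}: ~{xt:.1%} coupling")
```

Even something this simple, run over every routed board, catches the gross spacing mistakes that otherwise surface as intermittent failures in the lab.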
Kenneth J. Wood
Saturn PCB Design, Inc. firstname.lastname@example.org
2737 Bishop Lane Phone: (407) 340-2668
Deltona, Fl 32725 Fax: (386) 789-2765
Ken - you make a great point! There are many levels of value and effort required in SI simulation. As you state, using default technology models and running a batch crosstalk analysis is pretty high value - you don't have to get IBIS models for all components; you set your minimum allowable crosstalk thresholds and you're off. This works sufficiently up to a certain point based on signaling speed - SERDES channels running at 1 gigabit per second and beyond need more analysis.
I think HyperLynx is very smart about helping users simulate quickly, especially its batch mode. Since multicore machines are now common, I hope HyperLynx can automatically use all cores so that batch simulation speeds up linearly with the core count. I also hope HyperLynx can read models, stimulus, and other settings directly from CES; that would make the work easier and save time.
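The fan-out-across-cores pattern the poster is asking for is the easy part to sketch; the hard part (whether the tool's solver scales linearly) is up to the vendor. Below is an illustration using Python's multiprocessing as a stand-in for a batch engine; `simulate_net` is a hypothetical placeholder, not any HyperLynx API.

```python
# Sketch of spreading a batch of independent per-net simulations across all
# cores. simulate_net is a hypothetical placeholder for one net's SI run;
# nothing here is a real HyperLynx or CES interface.
from multiprocessing import Pool
import os

def simulate_net(net):
    """Placeholder for one net's simulation; returns (net, dummy metric)."""
    return net, 0.1 * len(net)   # stand-in for a worst-case overshoot, etc.

if __name__ == "__main__":
    nets = ["NET%d" % i for i in range(100)]
    with Pool(os.cpu_count()) as pool:       # one worker per core
        results = pool.map(simulate_net, nets)
    print(len(results))
```

Per-net SI runs are embarrassingly parallel (each net's simulation is independent), which is why near-linear speedup with core count is a reasonable thing to ask a batch tool for.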