I'm noticing some discrepancies in voltage levels when I run a DDR Batch Simulation versus simulating a single data net in LineSim, and I wanted to know whether I'm running the batch simulation incorrectly. The DDR3L interface runs at VDD = 1.35 V. Attached is a scope view of two waveforms: red is the driver waveform of a single data line from the DDR batch simulation, and blue is the driver waveform of the same data line from LineSim. V_high for the blue trace settles at 1.35 V, as expected. The red trace, however, stays between V_high = 1.08 V and V_low = 0.27 V. These values correspond to 0.8 × 1.35 V and 0.2 × 1.35 V respectively, which are the thresholds indicated in the datasheet, but it puzzles me that LineSim shows the correct voltage level while the DDR batch simulation does not.
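
For reference, here is the arithmetic behind my claim that the red trace's levels match the 20%/80% thresholds of VDD. This is just a sanity check on the numbers in my post, not anything from the tool itself:

```python
# Sanity check: do the observed red-trace levels match 0.8*VDD and 0.2*VDD?
vdd = 1.35  # DDR3L supply voltage (V)

v_high_threshold = 0.8 * vdd  # upper measurement threshold (V)
v_low_threshold = 0.2 * vdd   # lower measurement threshold (V)

# Observed levels on the red (batch-simulation) trace
observed_high = 1.08
observed_low = 0.27

print(f"0.8 * VDD = {v_high_threshold:.2f} V, observed V_high = {observed_high} V")
print(f"0.2 * VDD = {v_low_threshold:.2f} V, observed V_low  = {observed_low} V")
```

So the batch-simulation waveform appears to be clipped (or reported) exactly at the datasheet measurement thresholds rather than at the full rail.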