Any two simulators can produce different results from the same IBIS driver if you make the bit time short enough (i.e., raise the frequency). This happens when the bit time is shorter than the duration of the rising or falling waveforms. The IBIS data does not define how the device behaves in that case, so each simulator must assume something, and the assumptions programmed into the various simulators are all a bit different. If you see different results from two simulators on a single rising or single falling edge, however, then one of them (or both) must be incorrect.
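To make that regime concrete, here is a minimal sketch in Python (the waveform points and the NRZ bit-time assumption are made up for illustration, not taken from a real model) that flags when the bit time falls inside the span of the recorded edge:

```python
# Sketch: flag the regime where the bit time is shorter than the recorded
# edge waveform, which is the case the IBIS data leaves undefined.

rising_waveform = [           # (time in ns, voltage in V), hypothetical
    (0.0, 0.0), (0.2, 0.4), (0.5, 1.9), (0.8, 3.1), (1.2, 3.3),
]

def waveform_span_ns(points):
    """Total time span covered by the tabulated waveform."""
    return points[-1][0] - points[0][0]

span = waveform_span_ns(rising_waveform)
for rate_gbps in (0.5, 1.0, 2.0):
    ui_ns = 1.0 / rate_gbps   # bit time (UI), assuming an NRZ stream
    verdict = "undefined by the data" if ui_ns < span else "fully defined"
    print(f"{rate_gbps} Gb/s: UI = {ui_ns:.2f} ns, edge = {span:.2f} ns -> {verdict}")
```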
I understand the situation you described, but things may be a bit more complex. Even when the bit interval is considerably longer than the rise/fall time, the huge difference still exists.
It seems that neither the HyperLynx native engine nor the ADMS engine ever uses the RAMP data if rising/falling waveform data is present in the IBIS file; the engines use the RAMP data in the IBIS file only in the case where no rising/falling waveform data exists. If you delete all of the rising/falling waveform data from the IBIS file, both the HyperLynx native engine and the ADMS engine will use the RAMP data, and you will see the simulation results coincide.
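In other words, the selection rule appears to be the following (a rough Python sketch of my reading of the engines' observed behavior, with hypothetical field names; this is not their actual code):

```python
def select_edge_model(ibis_model):
    """Sketch of the HyperLynx/ADMS behavior described above: V-T
    waveform tables always win, and RAMP is only a fallback."""
    if ibis_model["rising_waveforms"] or ibis_model["falling_waveforms"]:
        return "waveform"  # RAMP data is ignored entirely, right or wrong
    return "ramp"          # used only when no V-T tables exist

# A model with V-T tables never exercises its RAMP data:
with_vt = {"rising_waveforms": [object()], "falling_waveforms": [object()]}
without_vt = {"rising_waveforms": [], "falling_waveforms": []}
print(select_edge_model(with_vt))     # -> waveform
print(select_edge_model(without_vt))  # -> ramp
```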
An interesting point is that Cadence Sigxp will use the RAMP data if it finds that the bit interval (width) is shorter than the sum of the Tr and Tf values derived from the rising/falling waveform data in the IBIS file. Cadence Sigxp users will therefore always see deceptively better results near the upper limit of the switching frequency if the driver model has wrong RAMP data.
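That check can be sketched like this (assuming, hypothetically, that Tr/Tf are measured 20%-80% from the V-T tables; I don't know the exact thresholds Sigxp actually uses):

```python
def edge_time_20_80(points):
    """20%-80% transition time of a (time, voltage) waveform table,
    using linear interpolation; assumes a monotonic edge."""
    v_start, v_end = points[0][1], points[-1][1]
    lo = v_start + 0.2 * (v_end - v_start)
    hi = v_start + 0.8 * (v_end - v_start)

    def crossing(level):
        for (t0, va), (t1, vb) in zip(points, points[1:]):
            if va != vb and min(va, vb) <= level <= max(va, vb):
                return t0 + (t1 - t0) * (level - va) / (vb - va)
        raise ValueError("threshold never crossed")

    return abs(crossing(hi) - crossing(lo))

def sigxp_style_choice(bit_interval_ns, rising_points, falling_points):
    """Sketch of the described Sigxp rule: fall back to RAMP whenever the
    bit interval is shorter than Tr + Tf derived from the V-T tables."""
    tr = edge_time_20_80(rising_points)
    tf = edge_time_20_80(falling_points)
    return "ramp" if bit_interval_ns < tr + tf else "waveform"

rise = [(0.0, 0.0), (0.3, 1.0), (0.6, 2.8), (1.0, 3.3)]  # hypothetical
fall = [(0.0, 3.3), (0.3, 2.3), (0.6, 0.5), (1.0, 0.0)]  # hypothetical
print(sigxp_style_choice(2.0, rise, fall))  # long UI  -> waveform
print(sigxp_style_choice(0.5, rise, fall))  # short UI -> ramp
```

If this reading is right, it would explain why a wrong RAMP value hides in HyperLynx (where it is never used once V-T tables exist) but quietly takes over in Sigxp exactly at the highest switching frequencies.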