I have a fairly simple RTL design (only two SystemVerilog modules). My testbench reads a pcap file containing 5 million packets, streams them into my DUT, and scoreboards the output against what was sent.
Modelsim's memory usage on the server steadily increases until it reaches 4 GB, at which point the tool crashes. This happens after the testbench has sent roughly 250K packets.
I am not sure whether this is a tool issue, a memory leak in my testbench, or something wrong in the RTL.
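In case it helps, my scoreboard follows roughly this pattern (a simplified sketch, not my actual code; the names are illustrative). My worry is the classic unbounded-queue leak: every transmitted packet pushes an expected entry, and an entry is only freed when a matching packet comes back, so any mismatch or dropped packet would leave entries queued for the entire 5M-packet run:

```systemverilog
// Simplified scoreboard sketch (illustrative names).
class scoreboard;
  byte expected_q[$][];            // queue of expected packet payloads

  function void send(byte pkt[]);
    expected_q.push_back(pkt);     // one entry per transmitted packet
  endfunction

  function void receive(byte pkt[]);
    if (expected_q.size() > 0 && expected_q[0] == pkt)
      void'(expected_q.pop_front()); // matched: entry is released
    // on a mismatch or drop, the entry stays queued forever,
    // so memory grows in proportion to simulation time
  endfunction
endclass
```

Periodically printing `expected_q.size()` during the run should show whether the queue is actually draining or just growing.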
Is it normal for Modelsim's memory usage to steadily increase as simulation time increases?