Reading Large Files Using $readmemh - Verilog



Reading Large Files Using $readmemh

  1. Reading Large Files Using $readmemh

    Hi,

    Is there any way to know at run-time whether a file read using
    $readmemh has been successful or not? Does it have a return value?

    Also, is there a limit to the size of the file which can be read by
    $readmemh in NCSim?

    I have a huge memory array (256 MB) to initialize and an appropriately
    sized file to read through $readmemh. NCSim appears to execute the
    $readmemh and continues the simulation, but the memory is not
    initialized. The statement executes correctly if I fill only 2-3 MB of
    the memory array initially and use the rest of the memory array
    locations for other purposes at run time. For one of the runs, however,
    I need to initialize the full 256 MB, and $readmemh is turning out to
    be a stumbling block.
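
    In rough outline, the setup looks like the following (the array name,
    width and file name here are simplified placeholders, not my exact
    code):

        // Illustrative only: 256M x 8-bit memory loaded in one $readmemh.
        parameter MEM_BYTES = 256 * 1024 * 1024;
        reg [7:0] mem [0:MEM_BYTES-1];

        initial begin
            // In the failing case, NCSim continues past this call but the
            // memory stays uninitialized.
            $readmemh("init_256mb.hex", mem, 0, MEM_BYTES-1);
        end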

    Any help would be greatly appreciated.

    Thanks & Regards
    Ganesh


  2. Re: Reading Large Files Using $readmemh

    Interesting - simultaneous posts on $readmemh with huge memories and
    NCSim. Davy also had some questions on that in another thread.

    In your case, how about:

    1. Split your 256 MB file into, say, 128 files of 2 MB each (probably
       more than you need, but whatever loads cleanly)
    2. Use $readmemh with start_addr and end_addr arguments (a Google
       search or the LRM will help)
    3. Create your file names in a loop using $sformat
    4. Load all 128 files (see the sketch below)
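
    Something along these lines - an untested sketch, where the sizes, the
    memory declaration and the "chunk_%0d.hex" file names are just
    placeholders for whatever you actually have:

        // Untested sketch - adjust widths, depths and file names to your
        // actual design.
        parameter CHUNK_WORDS = 2 * 1024 * 1024;          // 2 MB per slice
        parameter NUM_CHUNKS  = 128;
        reg [7:0]      mem [0:CHUNK_WORDS*NUM_CHUNKS-1];  // 256 MB total
        reg [8*32-1:0] fname;                             // file name buffer
        integer        i;

        initial begin
            for (i = 0; i < NUM_CHUNKS; i = i + 1) begin
                // Build "chunk_0.hex" ... "chunk_127.hex"
                $sformat(fname, "chunk_%0d.hex", i);
                // Load each slice into its own address range
                $readmemh(fname, mem, i*CHUNK_WORDS, (i+1)*CHUNK_WORDS - 1);
            end
        end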

    Tell us how it goes - if you care to!

    Regards
    Ajeetha, CVC
    www.noveldv.com


  3. Re: Reading Large Files Using $readmemh

    $readmemh does not have a return value. However, it should print out
    warnings if it failed for some reason, such as not finding the file, or
    the file ending before the full memory was filled.
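
    If you need a run-time check, one workaround (just a sketch, not
    anything $readmemh itself provides; the declaration and file name are
    placeholders) is to verify that a location which should have been
    loaded is no longer x afterwards:

        // Manual run-time check after $readmemh.
        reg [31:0] mem [0:63];

        initial begin
            $readmemh("init.hex", mem);
            // reg memories power up as x, so an untouched last word means
            // the file did not fill the whole array.
            if (mem[63] === 32'bx)
                $display("ERROR: $readmemh did not initialize the full memory");
        end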

    There should not be a limit to the size of the file that can be read by
    $readmemh in NCSim. If the file is larger than 4G (or maybe 2G), your
    file system needs to be set up to support large files. If you can look
    at the file and it is all there, then presumably that is fine.

    I just tested versions 5.5 and 5.6 using $readmem to read 256 MB (both
    as 8 bits x 256M and 32 bits x 64M) on both Sun and Linux systems. It
    appeared to work fine.

    You should file a bug report, or at least provide me with more detail
    about what you were doing (version, platform, testcase).


