download problems

I'm having problems with large (300+ MB) downloads that aren't coming in properly. I've been checking these downloads with MD5 (comparing against the sum published by whomever I download from; a quick example of how I check is below), and they generally fail the comparison.
Last night I tried command-line FTP, and the same download that had failed the MD5 check a few minutes earlier came in fine.
I used a program called ncftp.
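
For reference, here's roughly how I check the sums. It's just a quick Python sketch; the file name and the expected hash below are placeholders, not real values:

    import hashlib

    def md5_of(path, chunk_size=1 << 20):
        """Hash the file in 1 MB chunks so a 300+ MB download doesn't fill RAM."""
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    expected = "9e107d9d372bb6826bd81d3542a419d6"   # sum published by the download site (placeholder)
    actual = md5_of("big_file.iso")                 # placeholder file name
    print("OK" if actual == expected else "MISMATCH: " + actual)

The md5sum tool does the same thing from a command line, if you have it installed.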

I'm not sure whether this is a new problem (though I've only noticed it over the last few months), and it appears to happen with both the current version (1.0.973) and the previous release.

I've been adding the download to my Global Queue.
I've got 'On File Exist' for downloads set to resume.
My resume rollback is set to 1024 (was smaller, same problems); see the sketch after this list for what resume-with-rollback actually does.
My I/O buffer size is 384 (was 64, same problem).
I've got Ascii/Binary set to Auto.
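
For anyone not familiar with the resume rollback setting mentioned above, here's a rough sketch of what resume-with-rollback amounts to, done by hand with Python's ftplib. The host, paths, and the 1024-byte rollback are placeholders, and this assumes the server supports the REST command:

    import os
    from ftplib import FTP

    HOST = "ftp.example.com"          # placeholder server
    REMOTE_PATH = "pub/big_file.iso"  # placeholder remote file
    LOCAL_PATH = "big_file.iso"
    ROLLBACK = 1024                   # re-fetch the last 1024 bytes already on disk

    ftp = FTP(HOST)
    ftp.login()                       # anonymous login
    ftp.voidcmd("TYPE I")             # binary mode

    # Resume a bit before the end of the partial file, overwriting its tail.
    offset = 0
    if os.path.exists(LOCAL_PATH):
        offset = max(os.path.getsize(LOCAL_PATH) - ROLLBACK, 0)

    with open(LOCAL_PATH, "r+b" if offset else "wb") as f:
        f.seek(offset)
        f.truncate()
        ftp.retrbinary("RETR " + REMOTE_PATH, f.write, rest=offset or None)

    ftp.quit()

I assume SmartFTP does something equivalent internally; the rollback just re-fetches the tail of the partial file in case the last bytes written before a disconnect were bad.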

This doesn't seem to be server-dependent (though I'll be checking whether the servers are running the same software/version).

Does anyone have any advice on what I can do to troubleshoot this further?

Thanks,
Dennis

Just wanted to answer my own question!

It seems my download problem might have been due to some bad RAM. Downloading with SmartFTP looks to have been one of the first indications of the problem; other symptoms started to manifest themselves later.

I finally tracked it down to at least one bad stick (maybe two; I replaced all of them).

I have since downloaded about six different large files (600-700 MB each) and have not had a single problem with them.

In case anyone cared!

Dennis