drasey
I'm having problems with large (300+ MB) downloads that aren't coming in intact. I've been checking these downloads using MD5 (comparing my sum against the one published by whomever I download from), and they generally fail.
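For reference, this is roughly the check I'm doing after each download (a minimal sketch assuming GNU md5sum; the file names here are just placeholders, not the actual files):

```shell
# Stand-in for a downloaded file (the real ones are 300+ MB archives).
printf 'sample download contents\n' > download.bin

# Normally the publisher supplies this .md5 file alongside the download;
# here it's generated locally just to show the workflow.
md5sum download.bin > download.bin.md5

# Verify the download against the published sum.
# Prints "download.bin: OK" on a match, "FAILED" on a mismatch.
md5sum -c download.bin.md5
```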
Last night, I tried a command-line FTP client (ncftp), and the same download that had failed the MD5 check just a few minutes earlier came in fine.
I'm not sure if this is a new problem (though I've noticed it over the last few months), and it appears to happen with both the current version (1.0.973) and the previous release.
I've been adding the download to my Global Queue.
I've got 'On File Exist' for downloads set to resume.
My resume rollback is set to 1024 (was smaller, same problems).
My I/O buffer size is 384 (was 64; same problem).
I've got Ascii/Binary set to Auto.
This doesn't seem to be server-dependent (though I'll be checking whether the servers are running the same software/version).
Does anyone have any advice on how I can further troubleshoot this problem?
Thanks,
Dennis