Simultaneous Downloads...

In all versions prior to version 3, an admin could download/upload a large number of files from one site while transferring a few files for another. It really was a nice and productive feature: send the large downloads/uploads to the Global Queue and transfer a small number of files the regular way. Now that version 3 sends all downloads/uploads to the Global Queue, how do you download/upload a small number of files from another site without having them appended to the end of a very long transfer already in the Global Queue? Changing the number of workers only increases the speed at which the files are transferred; it doesn't let you work on one site while creating backups for another. I even tried opening a second instance of SmartFTP, but the downloads in the second instance were added to the end of the Global Queue in the first. Routing all downloads/uploads through the Global Queue has really neutered SmartFTP's functionality/usability. Surely there is a way to make this more useful for those who work on multiple sites. Version 2 worked well for those who actually needed to get a lot done on more than one site at a time. Is there a workaround for this new problem???


Please take a look here:
https://www.smartftp.com/support/kb/how- ... f2600.html

You just need to limit the max connections in the favorite's settings and then increase the number of total workers in the transfer queue.

That topic seems to say that I can edit a file on one site while doing a full backup on another. But I don't want to use SmartFTP to edit files. I want to download and upload files from one site while running a full backup for another, without the files for the second site going to the end of the Global Queue and having to wait until the full backup is complete. Since I offer technical support for a great many sites, I would like it to work much the way it did in version 2, without having to jump through a lot of hoops. Version 2 was ideal for working with many sites at once while full backups and uploads for others ran in the Global Queue. The article you linked talks about editing files on one site while doing a full backup for another. But can I download and upload multiple files from one site while running a full backup for another, without those files going to the end of the Global Queue where the full backup is running???


It's the same concept. I think it has also been explained in another thread.

Just limit the number of concurrent connections in the favorite with the backup to 1 and then increase the number of total workers (e.g. to 3 or 4). The backup will only use 1 worker and the remaining workers can be used for something else.

Regards,
Mat

So if I understand you correctly, the only way to do what I need is to considerably slow down the large download/upload by limiting it to one worker, so that I can work on more than one site at a time using the extra workers? How is that useful? Version 2 didn't have those unnecessary limitations. How is what you are describing an improvement???

Couldn't you create a class for the Global Queue so that there can be multiple Global Queues, one for each Remote Browser? That way we could go back to downloading from multiple sites without limiting the speed of the larger downloads, and work on multiple sites at once without restriction. I think that would be a good remedy for what we have lost from version 2...


Please, if I have misunderstood what you are saying, I would appreciate you setting me straight. What I understand you to be saying is that to download files from multiple sites simultaneously, I have to limit the large downloads, which benefit most from multiple workers, to one worker; then I can use the other workers to download small numbers of files, which don't benefit much from multiple workers anyway. If that is wrong, please let me know...

If I am understanding you correctly, how hard would it be to create a class for the Transfer Queue and create an instance of it with each remote browser? IOW, each Remote Browser would have its own Transfer Queue. Wouldn't that make far more sense than tying multiple remote browsers to a single Transfer Queue???


Hello ..

Your problem can be solved with the current functionality in SmartFTP; the solution has already been provided to you. If you want to use multiple workers for your large transfers, limit the number of workers in the favorite for the "large transfers" to N, then set the number of total workers in the transfer queue to something greater than N (e.g. N+1, which leaves one free worker to process the "small transfers").

An actual example: you want to use 10 workers for the large transfer, so you set the max number of workers for this favorite to 10. You also want 2 additional workers to process the small files from another server or favorite, so you set the number of workers in the Transfer Queue to 12.
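To make the arithmetic concrete, here is a minimal sketch of the idea in Python (purely illustrative; SmartFTP is not configured or scripted this way, and the favorite names are made up): a global pool of 12 workers, with the "big-site" favorite capped at 10 connections so that 2 workers always remain free for other favorites.

```python
import threading
import time

TOTAL_WORKERS = 12                    # Transfer Queue -> total workers
FAVORITE_CAPS = {"big-site": 10}      # per-favorite max connections (hypothetical name)

pool = threading.Semaphore(TOTAL_WORKERS)
caps = {name: threading.Semaphore(n) for name, n in FAVORITE_CAPS.items()}

def do_transfer(favorite, filename, seconds):
    print(f"[{favorite}] transferring {filename}")
    time.sleep(seconds)               # pretend the transfer takes this long

def transfer(favorite, filename, seconds):
    cap = caps.get(favorite)
    if cap:
        # Take the favorite's cap BEFORE taking a global worker, so the
        # capped favorite can never hold more than its share of the pool.
        with cap, pool:
            do_transfer(favorite, filename, seconds)
    else:
        with pool:
            do_transfer(favorite, filename, seconds)

# 20 queued backup files: at most 10 run at once.
jobs = [threading.Thread(target=transfer, args=("big-site", f"backup-{i}.zip", 0.5))
        for i in range(20)]
for t in jobs:
    t.start()
time.sleep(0.1)                       # let the backup saturate its 10 workers first

# Small files from another site are picked up by the 2 spare workers
# instead of waiting behind the whole backup.
small = [threading.Thread(target=transfer, args=("small-site", f"page-{i}.html", 0.1))
         for i in range(4)]
for t in small:
    t.start()
for t in jobs + small:
    t.join()
```

The acquisition order matters: the per-favorite cap is taken before a global worker, which is what guarantees that the spare workers stay available for the small transfers.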

Multiple transfer queues add more complexity and bring little or no benefit.

Regards,
Mat

I have the feeling that you still do not understand the concept. You can create a copy of a favorite and then set the limit in one favorite only. This way you can connect to the same server twice, and you don't have to wait until the other files are transferred. Example:

1. You have a favorite named "Favorite A"
2. You create a copy of this "Favorite A" which will be named "Favorite A - Copy"
3. Set the number of workers in "Favorite A" to 5, for example
4. Set the number of workers in "Favorite A - Copy" to the default, which is unlimited
5. Set the number of total workers in the Transfer Queue to 7.

Now "Favorite A" will use 5 workers. And "Favorites A Copy" will use the remaining 2 workers (out of the 7). And "Favorite A" und "Favorite A Copy" are both the same server.

Regards,
Mat

A quick question for you:

In your first post you wrote: "I even tried opening a second instance of SmartFTP, but the downloads in the second instance were added to the end of the Global Queue in the first"

How exactly did you download the files? Drag & drop? From where to where? I wasn't able to reproduce this.

Thank you
Regards,
Mat

Yes, I already knew I could do that, but it would be nice to be able to keep up to date. Now that I have clearly pointed out a major design issue that could adversely affect a lot of people, I hope you will reconsider the design of version 3. My suggestion would be easier, simpler, and more intuitive than what is available now. It makes a lot more sense to keep the downloads separate and tied to each remote browser, and it would not require a Knowledge Base article to explain it...

I would love to continue recommending SmartFTP to everyone who requests tech support, and if you can simplify the process and make SmartFTP more user-friendly again, I would be happy to keep doing so. Of the 316 sites I have worked on directly, I imagine I have convinced at least 300 of their admins to use SmartFTP. Then there are hundreds more whose sites I did not have to access whom I also convinced to use it. I have been a big fan for a very long time, and I would hate to see that end now over something that is probably a simple code fix...
