Compression question / problem

Richard Miller wow at together.net
Wed Jan 12 16:42:44 EST 2005


Thanks for all the feedback on this issue.


On Jan 12, 2005, at 4:23 PM, Alex Tweedly wrote:

> Frank D. Engel, Jr. wrote:
>
>>
>> In general, you can use your (client) bandwidth more effectively by 
>> downloading several files at a time, but 50 might be a bit much.  It 
>> would probably be ideal to create (for example) 5 files of 10 
>> pictures each, then download those 5 files in parallel.
>>
>> Or try with 4 files (2 of 13 images, 2 of 12), for example...
>>
>> And if the files can be distributed to several servers, even better 
>> (but don't do that unless there will be a LOT of simultaneous 
>> downloads from numerous clients -- otherwise it's not worth the 
>> expense, the gains would be minimal).
>
> As Frank says, roughly 4 files downloaded in parallel will give you 
> the shortest time to complete the transfer. Be sure to use "load URL" 
> on each one to start the transfers, then "put URL" (or any other 
> similar technique) to retrieve the data once it has arrived. At 4 
> files, each one is only 125K (for 50 files of 10K each); splitting 
> further makes the pieces small enough that per-transfer start-up 
> overhead becomes noticeable, which defeats your purpose.
>
> Note that this approach could be considered slightly anti-social if 
> the Internet connection is slow-ish and shared with other users; 
> doing 4 transfers in parallel will let you take a large share of the 
> bandwidth even while other users are trying to get something done. If 
> that situation is possible, and if it's a 56K or slower connection, 
> I'd limit myself to 2 parallel streams.
>
> -- Alex.
>
>
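For reference, here is a rough sketch of the "load URL" / "put URL" 
approach Alex describes above. The server path, file names, and handler 
names are placeholders, not my actual setup:

local sPending

on startDownloads
   put 4 into sPending
   repeat with i = 1 to 4
      -- placeholder URLs; substitute the real archive addresses
      load URL ("http://www.example.com/photos/part" & i & ".zip") with message "partLoaded"
   end repeat
end startDownloads

on partLoaded pURL, pStatus
   if pStatus is "cached" then
      -- the file is now in the local cache, so this put returns at once
      put URL pURL into tData
      -- ...save or unpack tData here...
      unload URL pURL  -- free the cached copy
   else
      answer "Download failed for" && pURL && "(" & pStatus & ")"
   end if
   subtract 1 from sPending
   if sPending = 0 then
      -- all four archives have arrived; continue here
   end if
end partLoaded

"load URL" starts each transfer without blocking, so all four run at 
once, and the callback fires for each file as it finishes. On a dial-up 
connection, dropping the 4 to 2 follows Alex's caveat about shared 
bandwidth.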


