Checksum via FTP???
Mark Talluto
userev at canelasoftware.com
Mon Sep 12 14:40:26 EDT 2011
I would have a computer running locally in your office poll your web server for uploaded transactions. Have that machine download each transaction and process it locally at the speed of Rev, then upload the result to the web server and have the client poll for the result. You can do a turnaround transaction in milliseconds using this method. The best part is that nothing survives on the web server for more than a few milliseconds, which improves your security. Since the real work is done locally, you are also not over-utilizing your web server; in essence, your web server is nothing more than a file server. We do this all day long to process orders, registrations, updates, and other forms of communication with our clients.
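Mark's poll-process-delete loop might look something like the following in outline. This is a Python sketch, purely illustrative (the Rev original would use its url syntax against the server); the directory names and the `poll_once` helper are my own invention, not from the thread:

```python
import os

def poll_once(inbox, outbox, process):
    """Process every pending transaction file in inbox, write the
    result to outbox, then delete the transaction so nothing
    survives on the 'server' side."""
    handled = []
    for name in sorted(os.listdir(inbox)):
        path = os.path.join(inbox, name)
        with open(path) as f:
            data = f.read()
        result = process(data)  # the real work happens locally
        with open(os.path.join(outbox, name + ".result"), "w") as f:
            f.write(result)
        os.remove(path)  # transaction lives only milliseconds
        handled.append(name)
    return handled
```

In the real setup the inbox and outbox would be directories on the web server reached over FTP or HTTP, and the client would poll the outbox the same way this worker polls the inbox.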
Best regards,
Mark Talluto
http://www.canelasoftware.com
On Sep 11, 2011, at 12:48 PM, Richard Gaskin wrote:
> Roger Eller wrote:
>
>> On Sun, Sep 11, 2011 at 2:22 PM, Richard Gaskin wrote:
>>>
>>> Databases are handy for working with very large data stores, esp. where you
>>> need relationality but for simple things like a checksum value for a file,
>>> Mark Weider's suggestion is probably the simplest and most efficient, to
>>> just store a checksum file with the actual file,
> ...
>>
>> Richard, would you have the same reservations about database usage overhead
>> if a standalone or revlet were used as the client? There are over 100,000
>> files pre-existing, so an initial creation of a server-side md5Digest for
>> every file would be a challenge in itself. How about a standalone which
>> lives on the server-side (always running - kinda cgi-like) which accepts
>> requests for a file's md5Digest and returns that string to the client
>> standalone/revlet before starting the download.
>
> If the checksums are pre-calculated I don't know that it would make much practical difference either way. With so many files in that directory, offhand I see no practical detriment with adding a few thousand more. ;)
>
> You might see a minor performance improvement if you split the files into sub-directories of <32k files each, but I'm not sure it would amount to much.
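For what it's worth, a common way to do that split is to bucket each file by the first couple of hex digits of a hash of its name, which spreads 100,000 files across 256 sub-directories of a few hundred entries each. A Python sketch; `shard_path` and the two-digit bucket width are my own invention, not from the thread:

```python
import hashlib
import os

def shard_path(root, filename, width=2):
    """Return a sharded path for filename: a sub-directory named
    after the first `width` hex digits of the md5 of the name."""
    bucket = hashlib.md5(filename.encode()).hexdigest()[:width]
    return os.path.join(root, bucket, filename)
```

Because the bucket is derived from the name alone, any client can recompute where a given file lives without consulting an index.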
>
> If the values are to be calculated on the fly then a DB may not help much anyway. Writing a CGI to do that on demand would be a snap, provided the files are of a reasonable size to be loaded into RAM and run through md5Digest (or sha1Digest, which is said to be theoretically slightly more reliable).
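The load-into-RAM-and-digest approach Richard describes, translated to Python for illustration (hashlib stands in for LiveCode's md5Digest / sha1Digest; the `checksum` function name is mine):

```python
import hashlib

def checksum(path, algo="md5"):
    """Read the whole file into RAM and return its hex digest,
    mirroring the md5Digest / sha1Digest usage in the thread."""
    with open(path, "rb") as f:
        data = f.read()
    h = hashlib.new(algo)
    h.update(data)
    return h.hexdigest()
```

A CGI wrapper would simply take the requested filename, call this, and write the digest string back to the client before it starts the download.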
>
> I believe there's a limit to the size of files that can be run through LiveCode's checksum functions - anyone here know what that limit is offhand?
>
> If your file sizes exceed LC's limit you could use a shell call for that from your CGI.
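As an alternative to shelling out (e.g. to md5sum or openssl), hashing the file in fixed-size chunks sidesteps any whole-file-in-RAM limit entirely. A hedged Python sketch, with a chunk size chosen arbitrarily:

```python
import hashlib

def checksum_streaming(path, algo="md5", chunk=1 << 20):
    """Hash a file in 1 MB chunks so arbitrarily large files
    never need to fit in RAM at once."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()
```

The digest is identical to the whole-file version, since md5 and sha1 are incremental by design.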
>
> --
> Richard Gaskin
> Fourth World
> LiveCode training and consulting: http://www.fourthworld.com
> Webzine for LiveCode developers: http://www.LiveCodeJournal.com
> LiveCode Journal blog: http://LiveCodejournal.com/blog.irv
>
> _______________________________________________
> use-livecode mailing list
> use-livecode at lists.runrev.com
> Please visit this url to subscribe, unsubscribe and manage your subscription preferences:
> http://lists.runrev.com/mailman/listinfo/use-livecode