CGI question

Richard MacLemale rmaclema at pasco.k12.fl.us
Wed May 7 14:38:02 EDT 2003


I've been using MetaCard to do cool CGI tricks on my network and web server
for two years now and love, love, love it.  However, I'm about to set up
this big new project dealing with student grades, and I'm wondering what
the "best" solution would be.

This is an OS X Mac with 512 MB of RAM, running Apache for web services and
using MetaTalk CGI scripts.

When a student hits a web page, I need to search a 700K text file and pull
out that student's class schedule.  The problem is that this CGI script
may, at times, be called simultaneously from 40 different computers.  My
concern is that I'll end up with 40 copies of the script running at the
same time, each loading 700K into a variable, which will eat RAM.  I've got
quite a bit of RAM, but still...
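
For concreteness, here's roughly what the load-it-all version of the lookup
looks like in my head.  This is just a sketch: the file name, the
tab-delimited layout, and the id= field in the query string are
placeholders, not the real setup.

#!/usr/local/bin/mc
on startup
  -- read the whole 700K schedule file into a variable
  -- (file name and layout are made up: one line per student,
  --  tab-delimited, with the student ID in the first item)
  open file "schedules.txt" for read
  read from file "schedules.txt" until EOF
  put it into tSchedules
  close file "schedules.txt"

  -- assume the student ID arrives in the query string as ?id=12345
  put char 4 to -1 of $QUERY_STRING into tStudentID

  set the itemDelimiter to tab
  put empty into tResult
  repeat for each line tLine in tSchedules
    if item 1 of tLine is tStudentID then
      put tLine into tResult
      exit repeat
    end if
  end repeat

  -- in CGI mode, put with no destination writes to stdout
  put "Content-Type: text/plain" & return & return & tResult
end startup

For what it's worth, 40 simultaneous copies of a 700K variable only comes
to about 28 MB of schedule data, though each hit also pays for launching
its own engine process.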

So which of the following is the best option:

1.  Bust the 700K file down into several smaller files, maybe A-L and M-Z,
and then only hit the file I need?  (I could make one file per letter of
the alphabet, but that's not going to be fun to maintain...)

2.  Search the file and pull out info without loading it into a variable,
which would use far less RAM but would take longer?

3.  Load the whole file into a variable and not worry about it?

4.  Use some type of index file that tells me exactly where each student's
record sits in the big 700K file?  (There's a rough sketch of this idea
after the list.)

5.  Some other solution I haven't thought of?
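
To make option 4 concrete, I'm picturing something like the sketch below: a
small index file with one line per student (ID, character position, record
length, tab-delimited) that gets rebuilt whenever the big file changes.
Again, the file names and layout are made up, and the index-building step
isn't shown.

on startup
  -- same CGI setup as the sketch above, but go through a small index:
  -- each line of schedules.idx is "studentID <tab> charPosition <tab> length"
  put char 4 to -1 of $QUERY_STRING into tStudentID
  set the itemDelimiter to tab

  -- the index is tiny compared to the 700K file, so loading it is cheap
  open file "schedules.idx" for read
  read from file "schedules.idx" until EOF
  put it into tIndex
  close file "schedules.idx"

  put empty into tResult
  repeat for each line tEntry in tIndex
    if item 1 of tEntry is tStudentID then
      -- read only this student's slice of the big file
      open file "schedules.txt" for read
      read from file "schedules.txt" at (item 2 of tEntry) for (item 3 of tEntry)
      put it into tResult
      close file "schedules.txt"
      exit repeat
    end if
  end repeat

  put "Content-Type: text/plain" & return & return & tResult
end startup

That would keep each request's memory use down to the index plus one
record, instead of the whole 700K.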

The thing that confuses me is the speed issue.  The more I load into a
variable, the faster the script can run, but the more memory it uses.  Then
again, if it runs fast enough, it finishes quicker and gets out of the way
of the next request.

Anyone have any thoughts?  As usual, thanks in advance...

-- 
:)
Richard MacLemale
Network Administrator
J. W. Mitchell High School



