How do you join lines in a container (CGUPE)

Sadhunathan Nadesan sadhu at castandcrew.com
Fri Jun 13 13:59:00 EDT 2003


Greetings,

Ok, many thanks to everyone who responded to my post.  Taking various
hints from here and there, Dar, Pierre, etc., I have refactored and
shortened the code and it works without having to process the data in
an external file now.  Much cleaner!  Again, thank you all.

BTW, I asked the original question of how to join the lines so I could
use split, but taking the suggestion of splitting by linefeed worked
instead -- at least with the newer MC engine.  Well, actually, it didn't
work at first: the term "LF" gave an execution error (maybe that is a
Transcript-only reserved word), but "linefeed" worked.  Strangely, that
didn't work on my MC engine at home, but that is another story.
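For anyone who wants to try the same trick, here is a minimal sketch of
the two-delimiter split (the sample data below is made up purely for
illustration):

	-- hypothetical records: "|" separates fields, linefeed separates lines
	put "A001|100" & linefeed & "B002|200" into recordData
	-- splitting by a primary and a secondary delimiter builds an
	-- associative array: the part before "|" on each line becomes
	-- the key, the part after it becomes the value
	split recordData by linefeed and "|"
	put recordData["A001"]  -- 100

This is the same pattern sqlFetch uses below to turn the query output
into a lookup table.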

The other thing I had to do was capture the output of the shell
command on the same line.  That is, this did NOT work:

	put shell(command)
	put it into sqlData

(that kind of two-step approach does work with an HTTP post, though)

But this DID work:

	put shell(command) into sqlData


and the above is how the example is shown in the MC documentation.
That makes sense: shell() is a function that returns the command's
output directly, rather than a command that sets the special variable
"it" the way read or post do.


I don't suppose anyone is really interested in the code of the complete
solution, but, hey, I include it below just in case.  It is a utility
program: a filter that takes a stream of input data from a front end
data collection program, looks up something in the database that the
data collection program is unable to do itself (without extensive
re-engineering), inserts that value into the data stream, and passes it
on to something else (a back end formatting program).  So it is a
pipeline like

collect data | metacard filter | report formatter

And writing this filter was a lot easier than re-engineering the
data collection part.

Sadhu

PS: I'm sure the code could be improved further, for example by passing
values through function calls rather than globals, but it's good enough.



#!/usr/local/bin/mc
###############################################################################
#
#                            -=< Payroll II >=-
#         Copyright (c) Cast & Crew Entertainment Services, Inc. 2003
#
#     Program name:	sony_filter
#        File name:	sony_filter.mt
#           Author:	Sadhunathan Nadesan
#     Date started:	06/11/2003
#
#     Description:
#	Get Walker Numbers from Data Base and replace tselect field
#
###############################################################################


global sqlData, keywords

on startup
  global sqlData, keywords

  -- Grab the tselect output in preparation for translation
  read from stdin until empty
  put it into inputData
  -- Now get the translation table from the data base
  sqlFetch

  -- Put the real product number into the data stream
  repeat for each line thisLine in inputData
    put thisLine into dataArray
    split dataArray by "|"
    -- The set code is in the 5th field, space padded
    put dataArray[5] into setCode
    replace space with empty in setCode
    -- Input data has constant 'walker number' instead of product code
    replace "WALKER_NUMBER" with sqlData[setCode] in thisLine
    -- Send the data on to RPT
    put thisLine
  end repeat

end startup

----------------------------------------------------------
on sqlFetch
-- get the list of set codes and product numbers from the data base
  global sqlData, keywords

  -- build sql query
  put "lines 0" & cr into sqlQuery
  put "select sub_code, string_value from proptl" & cr after sqlQuery
  put "where option_code = 'SONYACCT' /" & cr after sqlQuery
  put sqlQuery into url "file:/tmp/sonylist.sql" 

  -- run sql query and capture results
  put "SQL /tmp/sonylist.sql " into command_string
  set shellCommand to "/bin/sh"
  put shell(command_string) into sqlData

  -- prepare data to be placed in an associative array
  replace space with empty in sqlData
  split sqlData by linefeed and "|"
  put keys(sqlData) into keywords

  -- cleanup
  put "rm -f /tmp/sonylist.sql" into command_string
  put shell(command_string)

end sqlFetch
----------------------------------------------------------


