Lucia Mar Unified School District

Suprtool in Education

By Eric Guisinger

Eric Guisinger
I'm a major advocate of Suprtool - the product has saved me literally hundreds of hours of work. We acquired the product here at Lucia Mar in 1998 at my recommendation, as I'd also used it at Kaiser-Permanente in Colorado and at Atascadero State Hospital here in California. In other words, I've used Suprtool for around 20 years (good grief, is that really possible?). I read COBOL really well, but I hate writing it! I highly recommend Suprtool for virtually any HP 3000-based school, school district, county office of education, or state department of education.

Lucia Mar Unified School District has 15 schools (with 3 more on the drawing board). Our student population is about 11,000. We have 1 High School, 1 Continuation High School, 3 Junior High Schools, and 10 Elementary Schools.

We use the QSS application on the HP 3000 and we are migrating to a Microsoft-based client/server product - Aeries from Eagle Software. In our new environment, the HP 3000 still plays a major role. We will continue using the 3000 as a student data warehouse (still in the QSS data base structure, but with limited use of the application). We will also be using the 3000 for our financials, also from QSS, which include G/L, A/P, A/R, Budget Prep, Payroll, Personnel, Applicant Tracking, Benefit Administration, Stores, Fixed Assets, and more.

It turns out that, while you can implement applications at the school level by having a server at each school (very typical), you still have the ever-present requirement for district-wide (or county-wide or state-wide) data reporting and analysis. The HP 3000, in this case, becomes a large server and data warehouse, yet it remains very capable of running legacy applications and supporting the extensive multi-site reporting requirements. The student-related data needs to be extracted from the school servers and sent to the district server (HP 3000) so that the district-wide data base(s) can be updated.

There is also the circumstance where we wish to centrally maintain certain types of student-related data. The HP 3000 fulfills this role nicely. The data needs to be extracted from the HP and sent out to the various school servers so the school data bases can be updated with data from the centrally maintained data base on the district server (HP 3000).

Each night we extract data from the remote data bases via a wide area network and transmit the extracted files to the HP 3000. On the HP 3000, Suprtool is used to erase existing student information and replace it with the information from the remote system. This is a "replace everything" approach: the remote system is the 'master' copy, and the data existing on the HP is only a copy of that. A typical Suprtool task here might be to read the enrollment history data set of about 285,000 entries, extract just those records of a specific type for a specific school, delete them, and then add the updated records that were extracted from the remote system. Suprtool reads our enrollment history data set in about 6 seconds. We actually read the enrollment data set a total of 4 times for each school being processed. We have 15 schools, so 15 schools x 4 passes x 6 seconds means we've processed a data set with 285,000 records in about 6 minutes - a very acceptable level of performance. Since it is all done in Suprtool, it is relatively easy to maintain (as compared to, say, COBOL). We have similar Suprtool processes that maintain parent/guardian information and student demographic information extracted from the remote systems.
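The nightly delete-and-reload step can be illustrated outside of Suprtool as well. Here is a minimal Python sketch of the same "replace everything" pattern - the field names and IDs are invented for illustration, not our actual QSS layout:

```python
def replace_school_records(enrollment, school_id, rec_type, new_records):
    """'Replace everything': drop this school's existing records of one
    type, then add the freshly extracted copies from the remote system."""
    kept = [r for r in enrollment
            if not (r["school"] == school_id and r["type"] == rec_type)]
    return kept + list(new_records)

# Hypothetical enrollment history entries:
history = [
    {"school": "HS1", "type": "ENR", "student": 1001},
    {"school": "EL3", "type": "ENR", "student": 2002},
]

updated = replace_school_records(
    history, "HS1", "ENR",
    [{"school": "HS1", "type": "ENR", "student": 1001},
     {"school": "HS1", "type": "ENR", "student": 1003}])
# Records for other schools (EL3) are untouched; HS1 now reflects
# exactly what the remote system sent.
```

The point of the pattern is that no record-by-record reconciliation is needed: whatever the remote system sends becomes the truth for that school and record type.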

The data elements in the central HP 3000 data bases that are actually maintained at the remote sites can be as much as one day behind the real-time data, since we only run the extract/update once per day. The reverse view is true for the remote sites: centrally maintained data is only transmitted to the remote sites once per day, so that information could be as much as one day behind real-time.

Receiving ASCII files produced by Microsoft Access has never been a problem. The main thing you have to watch out for is possible data type incompatibilities. I solved this challenge by making sure that no packed-decimal or binary numeric elements are ever in an extract file. I convert the data type at the time the file is generated, i.e., a long integer field in the data base becomes a 'zoned-decimal'/'display' numeric field in the extract file.
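As a rough illustration of that convention, the conversion might look like this in Python (a sketch of the idea only - in practice the conversion happens in the Access application at export time):

```python
def to_display(value, width):
    """Render a non-negative integer as an unsigned 'display'
    (zoned-decimal) field: plain ASCII digits, zero-padded to width."""
    if value < 0:
        raise ValueError("unsigned display fields only")
    field = str(value).zfill(width)
    if len(field) > width:
        raise ValueError("value does not fit in the field")
    return field

def from_display(field):
    """Read a display field back into a native integer."""
    return int(field)
```

A long integer such as 1234 travels as the six ASCII characters 001234, which either system can read without worrying about byte order, word size, or packing.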

We use Minisoft FTP software to effect the transfer of the Microsoft Access export files from our Novell server to the HP 3000. Minisoft provides the necessary object definitions, etc. to allow me to transfer the files using VB code in the Access application. This is another product in which I have great confidence.

When sending files to Microsoft Access, you have options that give you quite a lot of flexibility, but once again, I've opted for keeping it very, very simple. The data type compatibility issue is resolved the same way. Import/Export specifications are used in Access to bring the data into the Access environment.

In both cases, I've opted to use only fixed-length files, as opposed to any type of delimited file. This gets you around the problem of delimiting characters that are embedded in your data (intentional or otherwise), which can really wreak havoc on the results!
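To make the hazard concrete, here is a small Python sketch of fixed-length parsing; the field layout and sample record are invented for illustration:

```python
# Invented layout: (name, start, end) character offsets within each record.
LAYOUT = [("student_id", 0, 8), ("last_name", 8, 28), ("school", 28, 32)]

def parse_fixed(line):
    """Slice a fixed-length record into named fields by position."""
    return {name: line[start:end].rstrip() for name, start, end in LAYOUT}

# Build a 32-character record: 8-digit ID, 20-char name, 4-char school.
line = f"{'1001'.zfill(8)}{'Smith, Jr.':<20}{'HS01':<4}"
rec = parse_fixed(line)
# The embedded comma in "Smith, Jr." is harmless here, whereas
# "00001001,Smith, Jr.,HS01".split(",") yields 4 fields instead of 3.
```

Positional slicing never cares what characters the data contains, which is exactly why the fixed-length approach sidesteps the embedded-delimiter problem.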

We also use Suprtool as a diagnostic tool. A good example from recent experience: I needed to find all "Employee/Employer Paid Benefit" postings to the general ledger from the payroll system on 9/29/2000. The data set in question has 1,200,000 records in it. I was able to locate the records, save them to a file, list them in a user-friendly format, modify/update a value in the located records based on table lookups, and list the records again showing the update, ALL in Suprtool.
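That locate/list/update-via-lookup pattern is quite general. A hedged Python sketch of the same idea follows - the posting fields, dates, and correction table are all made up for illustration:

```python
# Hypothetical G/L postings and a lookup table of account corrections.
postings = [
    {"acct": "5100", "date": "2000-09-29", "type": "EB", "amount": 125.00},
    {"acct": "5200", "date": "2000-09-29", "type": "RG", "amount": 900.00},
]
corrections = {"5100": "5150"}  # old account -> corrected account

# Locate the Employee/Employer Paid Benefit postings for the date...
hits = [p for p in postings
        if p["date"] == "2000-09-29" and p["type"] == "EB"]

# ...then update each located record based on the table lookup.
for p in hits:
    p["acct"] = corrections.get(p["acct"], p["acct"])
```

Records that don't match the selection criteria are never touched, which is the property you want when surgically correcting a 1,200,000-record data set.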

Another example: Recently, one of the district departments needed a listing of students who are enrolled in the Gifted And Talented Education (GATE) program and also enrolled in one or more English Language Learner programs. I cannot think of a way to do this report in Query. I'm sure there is a way, but doing an exhaustive read multiple times on a data set with 285,000 entries in Query is definitely not fast. It did require multiple passes in Suprtool to create lookup tables, but even reading the enrollment data set 4 times, and reading the student data set once, only took about 40 seconds, and the user had their report in less than 15 minutes. This is just one type of task that Suprtool excels at. There are many others.
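Logically, that report is a set intersection built from lookup tables. A minimal Python sketch of the idea, with invented student IDs and program codes:

```python
# Hypothetical enrollments pulled from the enrollment history data set.
gate = {1001, 1003, 1005}                  # students in the GATE program
ell_rows = [(1001, "ELL-1"), (1002, "ELL-2"), (1003, "ELL-1")]

ell = {sid for sid, _ in ell_rows}         # students in any ELL program
both = sorted(gate & ell)                  # enrolled in GATE *and* ELL
# -> [1001, 1003]
```

Suprtool's table facility plays the role of the sets here: one pass builds the lookup table of GATE students, and a later pass over the ELL enrollments keeps only the records whose student ID hits the table.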

I'm to the point now that I usually think of Suprtool as a primary solution to many of our routine and ad-hoc requirements.

Eric Guisinger,
Technical Services, Lucia Mar Unified School District