Introduction
The purpose of this document is to analyze, through the use of case histories, the current state of affairs and the important issues that still need to be addressed in the development of the United States disaster relief infrastructure of the 1990s.  Related legislation and governmental programs will be identified, and historical problems within the disaster recovery infrastructure will be highlighted via the exploration of four recent U.S. disaster case studies: the bombing of the World Trade Center in 1993; the Chicago Flood of 1992; Hurricane Andrew in 1992; and the Great Flood of 1993.  Suggestions for further improvement to the current system will also be tendered.
It had been noted by this author that excellent books were in print which analyzed how information mismanagement and infrastructure miscommunications directly or indirectly caused human misery, whether through political or business misfortune.  One of the best, and a direct inspiration for this paper, was **Great Information Disasters**, edited by Dr. Forest Horton and Dr. Dennis Lewis.  Their case study of the hurricane which struck Southern England on the night of the 15th/16th of October 1987 was a great help when this author attempted to analyze the similar disaster of Hurricane Andrew in 1992.  (In both cases the warning systems were inadequate, leading to classic information disasters.)  Absent from the available literature, however, was any document which looked directly, via case histories, at the historical and current United States disaster relief infrastructure.
It had also been observed that several blatant problems in the U.S. disaster recovery infrastructure were revealed within the 1992-1993 time period in which the four cases featured in this paper occurred.  This author therefore endeavored to fill that particular gap in the literature.  It was the author's objective that this research should exist as a convenient summary of the condition of the current U.S. disaster relief infrastructure and of the disasters described, and that some enlightenment about the disaster relief structures might be obtained through analysis and comparison.
This study was accomplished exclusively via qualitative research methods, using purely unobtrusive measures: the gathering of information from previously published data.  This included government publications such as Congressional reports, insurance reports, newspaper articles, and documents obtained electronically through the Internet.  One strength of this methodology is that it is naturalistic, in that no attempt was made to manipulate the research settings.  Its major weakness is the subjectivity with which materials were selected by this author, which could possibly be looked upon as bias.  For this, the author apologizes in advance.
The research design was non-experimental, in that it was almost impossible to control all biases or to protect the internal validity of the study: the veracity of the published documents sampled is assumed and impossible to verify completely.
This study can also loosely be considered an example of content analysis, as it examines classes of social artifacts in an unobtrusive manner.  Content analysis is defined by Philip Stone in **The General Inquirer: A Computer Approach to Content Analysis** (p. 5) as "...any research technique for making inferences by systematically and objectively identifying specified characteristics within text."
The content analysis method may by definition be applied to almost any form of communication that has been recorded in some form (i.e., written documents; audio and video tapes; paintings; poems; songs; etc.).  Written documents may also include magazines, newspapers, speeches, and legal documents such as laws, bills, or constitutions.  In this case the sampling would be considered systematic and nonrandom in nature.  Latent content, or underlying meaning, was handled qualitatively, in that entire paragraphs were read to determine an overall assessment.  Where this study differed from content analysis was that it was purely qualitative and made no effort to determine quantitative data.  A content analysis study normally contains six major steps, of which the last four tend toward the quantitative.  These steps can be summarized from the literature (Babbie) as: (1) extraction of data; (2) classification of data; (3) weighing of data; (4) determination of criteria for the assessment of data; (5) establishment of reliability measures; and (6) the checking of inferences.
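To make the quantitative steps concrete, the first three (extraction, classification, and weighing of data) can be sketched in a small program.  This is purely a hypothetical illustration, not part of the present study's method (which, as noted, was qualitative); the category names and keyword lists below are invented for the example.

```python
from collections import Counter

# Hypothetical coding scheme: each category is defined by a list of
# keywords.  These categories and keywords are invented for illustration.
CATEGORIES = {
    "warning": ["warning", "alert", "forecast"],
    "damage": ["damage", "destruction", "loss"],
}

def classify_tokens(text):
    """Steps 1-2: extract tokens from the text and classify each one
    into a coded category, counting occurrences."""
    counts = Counter()
    for token in text.lower().split():
        word = token.strip(".,;:!?")
        for category, keywords in CATEGORIES.items():
            if word in keywords:
                counts[category] += 1
    return counts

def relative_weights(counts):
    """Step 3: weigh each category as a proportion of all coded mentions."""
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()} if total else {}

sample = "The forecast gave no warning; the damage and loss were severe."
counts = classify_tokens(sample)
print(counts)                    # frequency of each coded category
print(relative_weights(counts))  # proportional weight of each category
```

Steps 4 through 6 (assessment criteria, reliability measures, and inference checking) would build on such counts, for example by comparing the weights assigned by independent coders.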
Data collection for this paper was gradual, taking place between the fall of 1993 and the spring of 1995, with much debt owed to the electronic resources of Syracuse University and to the generosity of the Army Corps of Engineers.
ENDNOTES:
**Great Information Disasters: Twelve prime examples of how information mismanagement led to human misery, political misfortune and business failure**, edited by Dr. Forest W. Horton, Jr. and Dr. Dennis Lewis (ASLIB, the Association for Information Management: London, 1991), pp. 107-113.
(Updated 9/02/03 D.J. Russell)
