MKOPSC Theses and Dissertations
Browsing MKOPSC Theses and Dissertations by Issue Date
Now showing 1 - 20 of 152
Item: Dense gas dispersion modeling for aqueous releases (Texas A&M University, 1999). Lara, Armando.

Production, transportation, and storage of hazardous chemicals represent potential risks to the environment, the public, and the producers themselves. The release to the atmosphere of materials that may form mixtures denser than air is of special concern, since such mixtures disperse at ground level. Toxic or combustible materials with boiling points below ambient temperature, such as chlorine and ammonia, are usually stored or transported as saturated liquids. A release from such a system is likely to vaporize much or all of the stored liquid, leading to entrainment and/or formation of liquid droplets in the vapor release and affecting the density of the mixture considerably. Current dispersion models limit their scope to aerosols made up of ideal gases or liquids. This work proposes extending HGSYSTEM, a widely used vapor dispersion simulator with a strong performance record, to treat non-ideal solutions. The thesis describes in detail the fundamentals of the vapor dispersion models currently used in industry and of the model proposed here. Finally, the collected data and a statistical comparison between the observed concentrations and those predicted by other simulators are presented.

Item: Role of viscosity in the accurate prediction of source-terms for high molecular weight substances (Texas A&M University, 1999). Shaikh, Irfan Yusuf.

This study shows that better material property predictions result in better source-term modeling for high molecular weight substances. Viscosity, density, and enthalpy are expressed as functions of process variables, namely temperature and pressure, and of mixing effects. The viscosity prediction improves on current practice by combining b-parameter and Modified Chung-Lee-Starling (MCLS) viscosity predictions for the employed pseudo-mixtures.
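The thesis's combined b-parameter/MCLS method is not reproduced here, but a common baseline for blending pure pseudo-component viscosities into a mixture value is the Arrhenius log-linear mixing rule. The components and viscosity values below are invented for illustration:

```python
import math

def mixture_viscosity(x, mu):
    """Arrhenius log-linear mixing rule: ln(mu_mix) = sum_i x_i * ln(mu_i).

    x  -- mole fractions (must sum to ~1)
    mu -- pure-component viscosities, all in the same units (e.g., cP)
    """
    assert abs(sum(x) - 1.0) < 1e-9
    return math.exp(sum(xi * math.log(mi) for xi, mi in zip(x, mu)))

# Equimolar blend of two hypothetical petroleum pseudo-components
mu_mix = mixture_viscosity([0.5, 0.5], [0.8, 3.2])  # geometric mean -> 1.6 cP
```

More elaborate methods, such as the MCLS approach named in the abstract, replace this simple rule with property correlations tied to temperature and pressure.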
The source-term model used is SPILLS, an established, publicly available model that has been incorporated into several proprietary dispersion modeling packages. The model is modified to accommodate the new material property relationships. The final results compared in this work are evaporation rates for pseudo-mixtures of petroleum fractions, which are compared against the predictions of the original SPILLS model. The model is also compared, using pentane, with experimental evaporation data. Currently this work is valid for crude compositions and can be extended to other materials that meet the new property prediction criterion. It can also be extended to other areas of source-term and dispersion modeling, namely aerosol entrainment, rainout prediction, and vapor cloud dispersion.

Item: Development of a relational chemical process safety database and applications to safety improvements (Texas A&M University, 2000). Al-Qurashi, Fahad.

Industrial accidents remain a major concern to both the public and the environment, and minimizing them has been a governmental objective. Several rules and regulations have emerged to reduce the impacts of chemical releases on people and the environment, and as a result many databases were developed to record incidents in an attempt to learn from previous mistakes and hence reduce accidents. Most of these databases are maintained by federal agencies; however, taxonomy inconsistencies among them make it difficult to develop a national picture of the accidental release problem. Part of this research presents an analysis of RMP*Info, the latest EPA database, to determine the most significant chemicals released and other trends. According to this analysis, 85% of the releases in the chemical industry are due to twelve chemicals. The sources of those releases and their consequences are presented.
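A finding like "85% of releases are due to twelve chemicals" comes from tallying release records by chemical and measuring the cumulative share of the top entries. A minimal sketch over invented records (not actual RMP*Info data):

```python
from collections import Counter

# Hypothetical release records: (chemical, facility id) pairs
releases = [
    ("ammonia", "A"), ("chlorine", "B"), ("ammonia", "C"),
    ("chlorine", "A"), ("ammonia", "D"), ("sulfuric acid", "E"),
]

# Count releases per chemical and take the most frequent two
counts = Counter(chem for chem, _ in releases)
top2 = counts.most_common(2)

# Share of all releases accounted for by the top chemicals
share = sum(n for _, n in top2) / len(releases)
# Here ammonia (3) and chlorine (2) account for 5 of 6 releases
```

The same pass, run over the full database with `most_common(12)`, yields the kind of concentration statistic the abstract reports.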
In addition, the effects of the chemical type (toxic or flammable) and of the number of full-time employees in the facilities are discussed. To increase the value of the lessons learned from this database, proposed links with failure rate databases and reactive chemical databases are discussed. The objective of relating these databases is to bring all relevant information on both equipment and chemicals into one database. The new database will give plant personnel a better understanding of the reliability of plant equipment and the dangers of the chemicals they handle, and consequently accidents will be reduced. This research shows that relationships can be established among the three databases, and examples demonstrate the procedure for establishing them. This research is one step in this direction and should be followed by applying the proposed procedure to develop a more complete and beneficial relational database that can help improve the safety performance of industry.

Item: Chemical accident databases: what they tell us and how they can be improved to establish national safety goals (Texas A&M University, 2000). McCray, Eboni Trevette.

The objectives of this research are to examine and critique eight chemical accident databases, document any trends in accident occurrences, develop a strategy for improving current databases, and establish national safety goals on the basis of those improvements. This synopsis found that it is impossible to draw any conclusions about the state of chemical safety, past or present, based on the information in the various databases. The databases are deficient in many ways. First, they have been developed using inconsistent and faulty data-collecting methods, and the terms used to describe accidents are often ambiguous. Second, the ever-changing reporting requirements prevent comparisons from being made from year to year, making trends impossible to identify.
Lastly, the databases provide little to no information about the specifics of accidents, and many accidents are incorrectly lumped under the heading of chemical accidents. All these factors compromise the overall quality of the data, so it is very difficult, if not impossible, to draw definitive conclusions. It is likewise impossible to determine the effectiveness of governmental regulations or industry standards and practices when no reliable data are available for comparison. Finally, this study makes recommendations for database improvement by addressing each of the deficiencies, developing a structure for a new database, and establishing a foundation for the development of national safety goals.

Item: The effects of obstacle geometry on jet mixing in releases of silane (Texas A&M University, 2000). Sposato, Christina F.

Releases of silane into air and the effects of obstacles were modeled with the Computational Fluid Dynamics (CFD) code FLUENT. First the CFD code simulated the release of a free turbulent jet of silane into air to confirm that the code agreed with established trends for turbulent jets. Then FLUENT was used to model the flow of silane when confined by a wall or impinging on an obstacle such as a flat plate or a cylinder. Computer-simulated concentration profiles of a silane-air mixture were analyzed to determine the mixture volumes between the explosive limits, and for each volume of explosive mixture the volume of silane was determined. The volume of the flammable mixture and the amount of silane within it were normalized and expressed as functions of obstacle radius and obstacle distance. If the obstacle confines the entire volume, the volume decreases with increasing obstacle distance while the radial contribution dominates; as the obstacle distance increases further, the axial contribution dominates and the volume increases.
The volumes increase, decrease, or remain constant depending on the obstacle diameter.

Item: Effects of operating conditions on a heat transfer fluid aerosol (Texas A&M University, 2000). Sukmarg, Passaporn.

Heat transfer fluids (HTFs) are extensively used in the chemical process industry and are available with wide ranges of properties. Despite their importance, Factory Mutual Engineering and Research has reported 54 fires and explosions and $150 million in losses due to fires involving HTFs during a recent 10-year period [Febo and Valiulis, 1995]. The vapors of these fluids are flammable above their flash points and can cause explosions. To prevent explosions due to loss of vapor, heat transfer fluids are used as hot liquids at elevated pressures. If loss of containment does occur, the liquid will leak under pressure and may disperse as a fine aerosol mist. Though it has been recognized that aerosol mists can explode, very little is known about their flammability; research is therefore critically needed to measure aerosol properties and the flammability of fluid aerosols. This research is the first part of a study of heat transfer fluid aerosols, focusing on the dispersion and formation of HTF aerosols from process leaks. To simulate industrial leaks, aerosol formation from a plain orifice into ambient air is studied by measuring liquid drop sizes and size distributions at various distances from the orifice. Measurements are made over ranges of temperature, pressure, and orifice diameter. Aerosol drop size distributions of an HTF are measured by a non-intrusive method of analysis using a Malvern Laser Diffraction Particle Analyzer (Malvern laser). The Malvern laser employs the principle of Fraunhofer diffraction, a form of light scattering. It requires no calibration standard, but the laser tube must be aligned frequently to ensure that the detector receives the maximum light intensity.
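Once an analyzer of this kind reports a binned drop size distribution, standard spray summary statistics follow directly; a common one is the Sauter mean diameter, D32 = sum(n_i d_i^3) / sum(n_i d_i^2). The bin midpoints and counts below are invented for illustration:

```python
def sauter_mean_diameter(diameters_um, counts):
    """D32 = sum(n_i * d_i^3) / sum(n_i * d_i^2) for binned drop counts.

    diameters_um -- bin midpoint diameters (micrometers)
    counts       -- number of drops observed in each bin
    """
    num = sum(n * d**3 for d, n in zip(diameters_um, counts))
    den = sum(n * d**2 for d, n in zip(diameters_um, counts))
    return num / den

# Hypothetical bins: many small drops, a few large ones
d32 = sauter_mean_diameter([10.0, 20.0, 40.0], [500, 300, 50])  # -> 24.4 um
```

D32 weights large drops heavily, which is why it is the usual single-number characterization of atomization quality in spray studies.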
The Malvern software converts light intensity information from the detector into drop size distributions. The HTF used in this research was an alkylated aromatic received from an industrial source. The measurements were made in the horizontal direction along the centerline of the HTF spray. The effects of pressure, temperature, and orifice size on fluid spray atomization and aerosol drop size distributions were studied at various distances from the orifice, and trends in the drop size distributions were analyzed with respect to these variables. The results of this research will help industry predict the behavior of fluid releases from leaks, and the information will improve the safety of heat transfer fluid handling and process safety design.

Item: Non-intrusive characterization of heat transfer fluid aerosol formation (Texas A&M University, 2001). Krishna, Kiran.

Heat transfer fluids are widely used in the chemical process industry and are available with a wide range of properties. These fluids are flammable above their flash points and can cause explosions. Though the possibility of aerosol explosions has been widely documented, knowledge about the explosive potential of such aerosols is limited and critically needed. The aerosol droplet size distributions of heat transfer fluids must be studied to characterize their explosion hazards. This research involves non-intrusive measurement of such aerosol sprays using a Malvern Instrument Diffraction Particle Analyzer. The aerosol is generated by plain orifice atomization to simulate the formation and dispersion of heat transfer fluid aerosols through leaks in process equipment. Predictive models relating the aerosol formation distances, aerosol droplet sizes, and volume concentrations to bulk liquid pressure, temperature, fluid properties, leak size, and ambient conditions are developed.
These models will be used to predict the conditions under which leaks will result in the formation of aerosols and will ultimately help in estimating the explosion hazard of heat transfer fluid aerosols. The goal is to provide industry information that will help improve process safety.

Item: A Decision Support System for chemical incident information (Texas A&M University, 2002). Sharma, Gaurav.

Decision Support Systems (DSS) find extensive application in business, enabling industry managers to make intelligent risk decisions. Although widely used in business, DSS have seen little application to Process Safety Management. This thesis proposes the development of such a DSS based on chemical incident information. Chemical incident information is mostly qualitative in nature, so mathematical and statistical analysis of this information is an extremely challenging problem. This thesis introduces indices that quantify the qualitative nature of chemical accident information. The Weighted Scoring Method is the chosen decision aid for the DSS; using it, the various indices are consolidated into a single index that facilitates decision making for process safety. The proposed DSS is meant to be user-specific, leaving scope for individual users to apply the DSS according to their own decision-making criteria.

Item: Measurement and prediction of aerosol formation for the safe utilization of industrial fluids (Texas A&M University, 2004-09-30). Krishna, Kiran; Mannan, M. Sam; Hall, Kenneth R.; Kihm, Kenneth D.; West, Harry H.

Mist or aerosol explosions present a serious hazard to the process industries. Heat transfer fluids are widely used in the chemical process industry, are flammable above their flash points, and can cause aerosol explosions. Though the possibility of aerosol explosions has been widely documented, knowledge about their explosive potential is limited.
Studying the formation of such aerosols by emulating leaks in process equipment will help define a source term for aerosol dispersions and aid in characterizing their explosion hazards. Analysis of the problem of aerosol explosions reveals three major steps: source term calculations, dispersion modeling, and explosion analysis. The explosion analysis, consisting of ignition and combustion, is largely governed by the droplet size distribution of the dispersed aerosol, which is in turn a function of the droplet size distribution of the aerosol formed at the leak. Existing methods of dealing with aerosol explosions are limited to enhancing dispersion to prevent flammable concentrations and to the use of explosion suppression mechanisms; insufficient data and theory on the flammability limits of aerosols render such methods speculative at best. Preventing the formation of aerosols upon leaking would provide an inherently safer solution to the problem. The research involves the non-intrusive measurement of heat transfer fluid aerosol sprays using a Malvern Diffraction Particle Analyzer. The aerosol is generated by plain orifice atomization to simulate the formation and dispersion of heat transfer fluid aerosols through leaks in process equipment. Predictive correlations relating aerosol droplet sizes to bulk liquid pressures, temperatures, thermal and fluid properties, leak sizes, and ambient conditions are presented. These correlations will be used to predict the conditions under which leaks will result in the formation of aerosols and will ultimately help in estimating the explosion hazards of heat transfer fluid aerosols. Heat transfer fluid selection can favor liquids that are less likely to form aerosols, and design criteria can incorporate the data to arrive at operating conditions that are less likely to produce aerosols.
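Correlations of the kind described are typically power laws fit by linear regression in log space. A hedged sketch with NumPy, using an invented pressure/droplet-size data set (the thesis's actual correlations involve more variables):

```python
import numpy as np

# Hypothetical (pressure [bar], mean droplet size [um]) data following d = a * P**b
P = np.array([2.0, 4.0, 8.0, 16.0])
d = np.array([100.0, 70.7, 50.0, 35.4])

# Fit log d = log a + b * log P; polyfit returns (slope, intercept)
b, log_a = np.polyfit(np.log(P), np.log(d), 1)
a = np.exp(log_a)
# b comes out close to -0.5: droplet size shrinking with the
# square root of pressure, a plausible atomization trend
```

Multi-variable correlations extend this to a multilinear regression on the logarithms of all the dimensionless groups involved.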
The goal is to provide information that will reduce the hazards of aerosol explosions, thereby improving safety in the process industries.

Item: Study of formation and convective transport of aerosols using optical diagnostic technique (Texas A&M University, 2004-09-30). Kim, Tae-Kyun; Kihm, Kenneth D.; Mannan, Mahboobul; McIntyre, Peter; Phares, Denis.

The characteristics of liquid and solid aerosols have been investigated intensively by means of optical diagnostic techniques. Part I describes the characteristics of liquid aerosols formed from bulk heat transfer fluids (HTFs). Part II investigates the convective transport behavior of solid particles in a virtual impactor (VI). The objective of Part I is to establish correlations that predict the atomized particle size of HTFs, which are widely used in the process industries. Numerous reports state that mist explosions formed from leakage have caused disastrous accidents in the process industries; for safety, the characteristics of mist formation must be known in order to prevent HTFs from catching fire or exploding. The empirical data on mist formation are collected by an optical measurement technique, Fraunhofer diffraction. The Buckingham Pi theorem is applied to establish a correlation between the empirical data and representative physical properties of the HTFs, and the final correlations are obtained by linear regression. The objective of Part II is to investigate the convective transport behavior in a virtual impactor, which is used to sort polydisperse precursor powder in the superconductor wire industry. The VI is a device that separates polydisperse particles as a function of particle size by exploiting the difference in inertia between particles of different sizes. To optimize VI performance, the characteristics of convective transport must be identified.
This objective is achieved by visualization techniques, namely Mie scattering and laser-induced fluorescence (LIF). For analytical investigation, a local Stokes number is introduced to provide criteria for predicting the efficiency of VI performance and the boundary effect on particle separation. The results can enhance performance and eliminate defects by providing knowledge of the behavior of solid particles in the VI.

Item: Models for multi-strata safety performance measurements in the process industry (Texas A&M University, 2004-09-30). Keren, Nir; Mannan, M. Sam; West, Harry H.; Tretter, Marietta J.; Richmond, William B.

Measuring process safety performance is a challenge, and wide variations in the understanding, compliance, and implementation of process safety programs increase that challenge. Process safety can be measured in three strata: (1) measurement of process safety elements within facilities; (2) benchmarking of process safety elements among facilities; and (3) use of incident data collected from various sources for industrial safety performance assessment. The methods presently available for measuring process safety within facilities are deficient because the results depend strongly on user judgment. Performance benchmarking among facilities is done within closed groups of organizations; neither the questionnaires nor the results are available to the public. Many organizations collect data on industrial incidents. These organizations differ from each other in their interests, data collection procedures, definitions, and scope, and each analyzes its data to achieve its own objectives. However, there have been no attempts to explore the potential of integrating these data sources and harnessing the databases for industrial safety performance assessment. In this study we developed models to pursue measurement in each of the strata described above.
The measurement methodologies employed herein overcome the disadvantages of existing methodologies and increase their capabilities.

Item: Systematic Approach for Chemical Reactivity Evaluation (Texas A&M University, 2004-09-30). Aldeeb, Abdulrehman Ahmed; Mannan, M. Sam; Hall, Kenneth R.; Holtzapple, Mark T.; Caton, Jerald A.

Under certain conditions, reactive chemicals may proceed along uncontrolled reaction pathways with rapid and significant increases in temperature, pressure, and/or gas evolution. Reactive chemicals have been involved in many industrial incidents and have harmed people, property, and the environment. Evaluation of reactive chemical hazards is therefore critical to designing and operating safer chemical plants. Much effort is required for experimental techniques, mainly calorimetric analysis, to measure the thermal reactivity of chemical systems, and studying all the possible reaction pathways experimentally is very expensive and time consuming. It is therefore essential to employ simplified screening tools and other methods to reduce the number of experiments and to identify the most energetic pathways. A systematic approach is presented for the evaluation of reactive chemical hazards, based on a combination of computational methods, correlations, and experimental thermal analysis techniques. This approach helps focus the experimental work on the most hazardous reaction scenarios, with a better understanding of the reactive system chemistry. Computational methods are used to predict reaction stoichiometries, thermodynamics, and kinetics, which are then used to exclude thermodynamically infeasible and non-hazardous reaction pathways. The computational methods include (1) molecular group contribution methods, (2) computational quantum chemistry methods, and (3) correlations based on thermodynamic-energy relationships.
The experimental techniques are used to evaluate the most energetic systems, yielding more accurate thermodynamic and kinetic parameters or replacing inadequate numerical methods. The Reactive System Screening Tool (RSST) and the Automatic Pressure Tracking Adiabatic Calorimeter (APTAC) were employed to evaluate the reactive systems experimentally: the RSST detected exothermic behavior and measured the overall liberated energy, while the APTAC simulated near-adiabatic runaway scenarios for more accurate thermodynamic and kinetic parameters. The validity of this approach was investigated through the evaluation of potentially hazardous reactive systems, including decomposition of di-tert-butyl peroxide, copolymerization of styrene-acrylonitrile, and polymerization of 1,3-butadiene.

Item: Development of a hierarchical fuzzy model for the evaluation of inherent safety (Texas A&M University, 2004-11-15). Gentile, Michela; Mannan, M. Sam; Hall, Kenneth R.; El-Halwagi, Mahmoud; Langari, Reza.

Inherent safety has been recognized as a design approach useful for removing or reducing hazards at the source instead of controlling them with add-on protective barriers. However, inherent safety is based on qualitative principles that cannot easily be evaluated and analyzed, and this is one of the major difficulties for the systematic application and quantification of inherent safety in plant design. The present research introduces the use of fuzzy logic for the measurement of inherent safety by proposing a hierarchical fuzzy model. This dissertation establishes a novel conceptual framework for the analysis of inherent safety and proposes a methodology that addresses several limitations of current inherent safety analysis methodologies. The proposed methodology is based on a hierarchical fuzzy model that analyzes the interaction of variables relevant to inherent safety and process safety in general.
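A fuzzy evaluation of this general kind can be illustrated with a toy sketch: triangular membership functions turn crisp inputs into degrees of truth, and an IF-THEN rule combines them. The variables, membership parameters, and rule below are invented for illustration, not taken from the dissertation:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Degrees to which two hypothetical normalized (0-10) inputs are "high"
inventory_high = tri(7.0, 4.0, 8.0, 12.0)     # 0.75
temperature_high = tri(6.0, 5.0, 10.0, 15.0)  # 0.2

# Rule: IF inventory is high AND temperature is high THEN hazard is high.
# The min operator is a common fuzzy conjunction (AND).
hazard_high = min(inventory_high, temperature_high)
```

A hierarchical model chains layers of such rules, so intermediate outputs like `hazard_high` become inputs to higher-level rules, which is how many interacting variables get consolidated into one index.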
Fuzzy logic is helpful for modeling the uncertainty and subjectivity implied in the evaluation of certain variables and for combining quantitative data with qualitative information; it offers the advantage of modeling numerical and heuristic expert knowledge through fuzzy IF-THEN rules. Safety is traditionally considered a subjective issue because of the high uncertainty associated with its significant descriptors and parameters; this research, however, recognizes that rather than subjective, "safety" is a vague problem. The vagueness derives from the fact that sharp boundaries cannot be drawn between safe and unsafe states; the problem is therefore a "matter of degree." The proposed method is computer-based and process-simulator-oriented in order to reduce the time and expertise required for the analysis. It is expected that in the future, by linking the present approach to a process simulator, process engineers will be able to develop safety analyses rapidly and systematically during the early stages of design. Another important and rarely addressed aspect of inherent safety is the transportation of chemical substances; this dissertation includes an analysis of transportation hazards by truck using a fuzzy-logic-based approach.

Item: Development of a computer-aided fault tree synthesis methodology for quantitative risk analysis in the chemical process industry (Texas A&M University, 2005-02-17). Wang, Yanjun; Mannan, M. Sam; West, Harry H.; Chen, Jianer; Teague, Tom L.

There has been growing public concern regarding the threat to people and the environment from industrial activities, and thus more rigorous regulations. The investigation of almost all major accidents shows that those tragedies could have been avoided with effective risk analysis and safety management programs. High-quality risk analysis is absolutely necessary for sustainable development.
As a powerful and systematic tool, fault tree analysis (FTA) has been adapted to the particular needs of chemical process quantitative risk analysis (CPQRA) and has found wide application. However, the application of FTA in the chemical process industry (CPI) remains limited. One major barrier is the manual synthesis of fault trees: it requires a thorough understanding of the process, is vulnerable to individual subjectivity, and can make the quality of FTA highly subjective and variable. The availability of a computer-based FTA methodology would greatly benefit the CPI. The primary objective of this research is to develop a computer-aided fault tree synthesis methodology for CPQRA. The central idea is to capture the cause-and-effect logic around each item of equipment directly into mini fault trees, with special fault tree models developed to handle special features. Fault trees created by this method are expected to be concise, and a prototype computer program is provided to illustrate the methodology. Ideally, FTA can be standardized through a computer package that reads information contained in process block diagrams and provides automatic aids to assist engineers in generating and analyzing fault trees. Another important issue in QRA is the large uncertainty associated with available failure rate data. In the CPI, the observed ranges of failure rates can be quite wide, and traditional reliability studies using point values of failure rates may lead to misleading conclusions. This dissertation discusses the uncertainty in failure rate data and proposes a procedure for handling data uncertainty when determining the safety integrity level (SIL) of a safety instrumented system (SIS). Effort must be directed toward obtaining more accurate values of those data that actually impact the estimation of the SIL.
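The sensitivity of SIL determination to failure rate data can be seen with the standard simplified single-channel formula, PFDavg ~= lambda_DU * TI / 2 (which ignores diagnostics, common-cause failures, and redundancy). The failure rates below are invented; the banding follows the IEC 61508 low-demand table:

```python
def pfd_avg(lam_du_per_hr, test_interval_hr):
    """Simplified average probability of failure on demand, single channel:
    PFDavg ~= lambda_DU * TI / 2 (no diagnostics, no redundancy)."""
    return lam_du_per_hr * test_interval_hr / 2.0

def sil_band(pfd):
    """IEC 61508 low-demand bands: SIL n <-> 10^-(n+1) <= PFDavg < 10^-n."""
    for n in (4, 3, 2, 1):
        if 10.0 ** -(n + 1) <= pfd < 10.0 ** -n:
            return n
    return 0  # PFDavg too high for any SIL claim

TI = 8760.0  # hours between proof tests (annual testing)
# An order-of-magnitude range on an invented dangerous-undetected failure rate
low, high = pfd_avg(2e-7, TI), pfd_avg(2e-6, TI)
bands = (sil_band(low), sil_band(high))  # the range straddles a SIL boundary
```

Here the same SIS scores SIL 3 at the optimistic failure rate and only SIL 2 at the pessimistic one, which is exactly the kind of data-driven ambiguity the proposed procedure is meant to resolve.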
This procedure guides process hazard analysts toward a more accurate SIL estimation and avoids misleading results due to data uncertainty.

Item: Consequence analysis of aqueous ammonia spills using an improved liquid pool evaporation model (Texas A&M University, 2005-02-17). Raghunathan, Vijay; Mannan, M. Sam; El-Halwagi, Mahmoud; Lindell, Michael K.

Source term modeling is the key to predicting the consequences of releases of hazardous fluids. Aqueous ammonia serves as a reducing medium and is replacing anhydrous ammonia in most selective catalytic reduction (SCR) units. The newly developed model can estimate the vaporization rate and the net mass evaporating into the air from a multicomponent, non-ideal chemical spill. The work is divided into two parts. First, a generic, dynamic source term model was developed that can handle multicomponent non-ideal mixtures. The applicability of this improved pool model to aqueous ammonia spills was then checked to aid in the offsite consequence analysis of such spills. The behavior of the released chemical depends on its inherent properties, the ambient conditions, and the spill scenario. The heat transfer mechanisms associated with the pool depend strongly on the temperature of the liquid pool system at different times. The model accounts for all the temperature gradients within the contained pool and hence supports better estimation of source terms for chemical mixtures. This work will help obtain more accurate and reliable liquid evaporation rates, which are a critical input for dispersion modeling studies.

Item: Data driven process monitoring based on neural networks and classification trees (Texas A&M University, 2005-11-01). Zhou, Yifeng; Hahn, Juergen; Mannan, M. Sam; West, Harry H.; Bhattacharyya, Shankar P.

Process monitoring has been of great practical importance in the chemical and other process industries.
Early detection of faults is critical to avoiding product quality deterioration, equipment damage, and personal injury. The goal of this dissertation is to develop process monitoring schemes that can be applied to complex process systems. Neural networks have been a popular tool for modeling and pattern classification in the monitoring of process systems; however, the prohibitive computational cost caused by high dimensionality and frequently changing operating conditions in batch processes has made their application difficult. The first part of this work tackles this problem with a polynomial-based data preprocessing step that greatly reduces the dimensionality of the neural network process model. The process measurements and manipulated variables go through a polynomial regression step, and the polynomial coefficients, which are usually of far lower dimensionality than the original data, are used to build a neural network model that produces residuals for fault classification. Case studies show a significant reduction in neural model construction time and sometimes better classification results as well. The second part of this research investigates classification trees as a promising approach to fault detection and classification. It is found that the underlying principles of classification trees often produce complicated trees even for rather simple problems, and construction time can be excessive for high-dimensional problems. Fisher Discriminant Analysis (FDA), which provides an optimal linear discrimination between different faults and projects the original data onto perpendicular score directions, is used as a dimensionality reduction tool. The classification trees then use the scores to separate observations into different fault classes, and a procedure identifies the order of FDA scores that results in a minimum tree cost as the optimal order.
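The polynomial-coefficient preprocessing described in this abstract can be sketched with NumPy: each measured trajectory is fit with a low-order polynomial, and the handful of coefficients replaces the full sample vector as the feature set. The trajectories and polynomial order below are invented for illustration:

```python
import numpy as np

def poly_features(t, trajectories, order=2):
    """Compress each trajectory (one value per time point in t) into its
    polynomial coefficients: an (order + 1)-dimensional feature vector."""
    return np.array([np.polyfit(t, y, order) for y in trajectories])

t = np.linspace(0.0, 1.0, 50)
# Two hypothetical batch-run trajectories, roughly quadratic in time
batch_runs = [2.0 * t**2 + 0.5, 2.1 * t**2 + 0.4]

X = poly_features(t, batch_runs)
# 50 samples per run are compressed to 3 coefficients per run,
# so X has shape (2, 3) -- the reduced input for a classifier
```

A neural network (or classification tree) trained on `X` then works in 3 dimensions instead of 50, which is the source of the construction-time savings the abstract reports.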
Comparisons with other popular methods based on multivariate statistical analysis indicate that the new scheme exhibits better performance on a benchmark problem.

Item: Binary mixture flammability characteristics for hazard assessment (Texas A&M University, 2005-11-01). Vidal Vazquez, Migvia del C.; Mannan, M. Sam; Caton, Jerald; Hall, Kenneth R.; Holste, James C.

Flammability is an important factor in safe practices for the handling and storage of liquid mixtures and in the evaluation of the precise level of risk. The flash point, a major property used to determine the fire and explosion hazards of a liquid, is defined as the minimum temperature at which the vapor present over the liquid at equilibrium forms a flammable mixture when mixed with air. Experimental tests over the complete composition range of a mixture are time consuming, whereas a mixture flash point can be estimated using a computational method and available information: the flash points, vapor pressures, and activity coefficients of each component as functions of temperature. Generally, sufficient experimental data are unavailable, and other ways of determining this basic information are needed. A procedure for evaluating the flash point of binary mixtures is proposed, including techniques for estimating a parameter needed in the evaluation. Minimum flash point behavior (MFPB) is exhibited when the flash point of the mixture is below the flash points of the individual components. Identifying this behavior is critical, because taking the lowest component flash point as the mixture flash point then understates the hazard. Flash point predictions were performed for 14 binary mixtures using various excess Gibbs energy (Gex) models for the activity coefficients.
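A common closure for the mixture flash point (a Le Chatelier-type criterion, not necessarily the exact formulation used in this work) solves sum_i x_i * gamma_i * Psat_i(T) / Psat_i(T_fp,i) = 1 for T. A hedged sketch with ideal-solution activity coefficients (gamma_i = 1) and invented Antoine constants, solved by bisection:

```python
import math

def psat_mmHg(T_C, A, B, C):
    """Antoine equation: log10(Psat) = A - B / (T + C), T in deg C."""
    return 10.0 ** (A - B / (T_C + C))

def flash_point(x, antoine, T_fp):
    """Solve sum_i x_i * Psat_i(T) / Psat_i(T_fp_i) = 1 for T by bisection.

    Assumes an ideal solution (gamma_i = 1); with a Gex model the
    activity coefficients would multiply each term. The left side
    increases monotonically with T, so bisection is safe.
    """
    f = lambda T: sum(
        xi * psat_mmHg(T, *ab) / psat_mmHg(tfp, *ab)
        for xi, ab, tfp in zip(x, antoine, T_fp)
    ) - 1.0
    lo, hi = -100.0, 200.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Illustrative Antoine constants (A, B, C) and pure flash points (deg C);
# these are invented values, not vetted property data
antoine = [(6.9, 1211.0, 220.8), (7.0, 1344.8, 219.5)]
T_mix = flash_point([0.5, 0.5], antoine, [-11.0, 4.0])
# For this ideal pair, T_mix lies between the two pure flash points;
# MFPB corresponds to non-ideal cases where gamma_i > 1 push T_mix below both.
```

Swapping in UNIFAC- or other Gex-derived activity coefficients for the constant gamma_i = 1 is what allows the criterion to reproduce minimum flash point behavior.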
Quantum chemical calculations and UNIFAC, a theoretical model that does not require experimental binary interaction parameters, are employed in the mixture flash point predictions, which are validated against experimental data. MFPB is successfully predicted using the UNIFAC model when vapor-liquid equilibrium data are insufficient. The identification of inherent safety principles that can be applied to the flammability of binary liquid mixtures is also studied. The effect on the flash points of three binary mixtures in which octane is the solute is investigated as an application of the inherent safety concept.
Item Robust model-based fault diagnosis for chemical process systems(Texas A&M University, 2006-08-16) Rajaraman, Srinivasan; Hahn, Juergen; Mannan, M. Sam; Bhattacharyya, Shankar P.; El-Halwagi, MahmoudFault detection and diagnosis have gained central importance in the chemical process industries over the past decade. One reason is that copious amounts of data are available from the large number of sensors in process plants. Moreover, since industrial processes operate in closed loop with appropriate output feedback to attain certain performance objectives, instrument faults have a direct effect on the overall performance of the automation system. Extracting essential information about the state of the system and processing the measurements to detect, discriminate, and identify abnormal readings are important tasks of a fault diagnosis system. The goal of this dissertation is to develop such fault diagnosis systems, which use limited information about the process model to robustly detect, discriminate, and reconstruct instrumentation faults. Broadly, the proposed method consists of a novel nonlinear state and parameter estimator coupled with a fault detection, discrimination, and reconstruction system.
The first part of this dissertation focuses on designing fault diagnosis systems that not only perform fault detection and isolation but also estimate the shape and size of unknown instrument faults. This notion is extended to nonlinear processes whose structure is known but whose parameters are a priori uncertain and bounded. Since uncertainty in the process model and instrument fault detection interact with each other, a novel two-time-scale procedure is adopted to perform the overall fault diagnosis. Further, techniques to enhance the convergence properties of the proposed state and parameter estimator are presented. The remaining part of the dissertation extends the proposed model-based fault diagnosis methodology to processes for which first-principles modeling is either expensive or infeasible. This is achieved with an empirical model identification technique called subspace identification for state-space characterization of the process. Finally, the proposed fault diagnosis methodology has been applied in numerical simulations to a non-isothermal CSTR (continuous stirred tank reactor), an industrial melter process, and a debutanizer plant.
Item Novel applications of data mining methodologies to incident databases(Texas A&M University, 2006-08-16) Anand, Sumit; Mannan, M. Sam; El-Halwagi, Mahmoud; Tretter, Marietta J.Incident databases provide an excellent opportunity to study recurring incident situations in the process industry. The databases give insight into the conditions that led to an incident and, if studied properly, can help monitor the processes, equipment, and chemicals involved more closely and reduce the number of incidents in the future. This study examined a subset of incidents from the National Response Center’s incident database, focusing mainly on fixed-facility incidents in Harris County, Texas from 1990 to 2002.
Data mining has been used in the financial and marketing arenas for decades to analyze and find patterns in large amounts of data. Recognizing the limited capabilities of traditional statistical methods, more robust data mining techniques were applied to the subset of data, and interesting patterns involving the chemicals, failed equipment, components, etc. were found. Further, the patterns obtained by data mining were used to modify probabilities of equipment failure and to develop a decision support system.
Item Making the business case for process safety using value-at-risk concepts(Texas A&M University, 2006-10-30) Fang, Jayming Sha; Ford, David M.; Cline, Daren B.H.; Mannan, M. SamAn increasing emphasis on chemical process safety over the last two decades has led to the development and application of powerful risk assessment tools. Hazard analysis and risk evaluation techniques have developed to the point where quantitatively meaningful risks can be calculated for processes and plants. However, the results are typically presented in semi-quantitative “ranked list” or “categorical matrix” formats, which are certainly useful but not optimal for making business decisions. A relatively new technique for performing valuation under uncertainty, Value at Risk (VaR), has been developed in the financial world. VaR is a method of evaluating the probability of a gain or loss by a complex venture by examining the stochastic behavior of its components. We believe that combining quantitative risk assessment techniques with VaR concepts will bridge the gap between the engineers and scientists who determine process risk and the business leaders and policy makers who evaluate, manage, or regulate risk. We present a few basic examples of the application of VaR to hazard analysis in the chemical process industry, and we find that the VaR tool allows us to present data with which management can make better-informed decisions.
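Applying VaR to process hazards, as described above, amounts to simulating many plant-years of stochastic incident losses and reading off a tail quantile. The sketch below is a minimal Monte Carlo illustration; the scenario frequencies and lognormal loss distributions are made-up placeholders, not figures from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # simulated plant-years

# Illustrative hazard scenarios: annual event frequency plus a
# lognormal consequence-cost distribution (placeholder parameters).
scenarios = [
    {"freq": 0.10, "mu": 11.0, "sigma": 1.0},  # e.g., small release
    {"freq": 0.01, "mu": 14.0, "sigma": 1.2},  # e.g., major fire
]

annual_loss = np.zeros(N)
for s in scenarios:
    counts = rng.poisson(s["freq"], size=N)  # events in each year
    for i in np.nonzero(counts)[0]:
        annual_loss[i] += rng.lognormal(s["mu"], s["sigma"], counts[i]).sum()

# 95% VaR: the annual loss exceeded in only 5% of simulated years.
var_95 = np.quantile(annual_loss, 0.95)
print(f"95% annual VaR: ${var_95:,.0f}")
```

A single tail number like this, rather than a ranked list or categorical matrix, is the form of output the abstract argues is more useful for business decisions.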