Insurability Challenges Under Uncertainty: An Attempt to Use the Artificial Neural Network for the Prediction of Losses from Natural Disasters

The main difficulty for natural disaster insurance derives from the uncertainty of an event's damages. Insurers cannot precisely appreciate the weight of natural hazards because of risk dependences. Insurability under uncertainty first requires an accurate assessment of the entire damages. Insured and insurers both win when premiums reflect risk properly; in such cases, coverage will be available and affordable. Using the artificial neural network, a technique rooted in artificial intelligence, insurers can predict annual natural disaster losses. There are many types of artificial neural network models. In this paper we use the multilayer perceptron neural network, the model best suited to the prediction task. In fact, given the natural disaster explanatory variables, the developed neural network can estimate with high accuracy the potential annual losses for the studied country.

Extreme natural events pose great challenges for insurers because of their considerable ambiguity and their highly correlated losses. Consequently, such catastrophic risks pose several problems for insurers. First, because the losses arise from a small number of irregular accidental events, the insurer may not have sufficient resources to cover the resulting losses. Absent adequate reinsurance, the firm may go bankrupt or may choose to exit a market in which there is substantial exposure to such catastrophic risks. The second ramification of natural catastrophic losses is their influence on the rate structure of the firms that remain viable in the presence of disasters. If the insurer writes coverage in a risk area that experiences a major natural disaster once every decade, in the disaster year the firm will suffer losses well in excess of premiums. To remain profitable while continuing to insure major natural hazards in a catastrophe-prone area, the firm must charge a higher premium in the years there are no catastrophes than it would absent the threat of catastrophic risk. Thus, one would expect to observe very high loss ratios (losses incurred/premiums earned) in the catastrophe year and low loss ratios in the non-catastrophe years, compared to the loss ratios of regions not subject to such catastrophic risks.

PANOECONOMICUS, 2010, 1, pp. 43-60

Moreover, major natural events paralyze not only the functioning of the insurance industry but affect all other economic sectors needing suitable coverage to deal with the catastrophic impacts of natural disasters. This paper provides some ideas about the mission of insurance in reducing and managing the devastating impacts of natural disasters. In fact, such hazards threaten sustainable development and people's wellbeing. Accordingly, insurers have a duty to offer appropriate coverage and to incentivize people to undertake preventive measures. In part one, we survey the gravity of natural disasters owing to increasing uncertainty and environmental changes. These changes affect risk characteristics and impacts. For that reason, in the second part we study the capacity of the insurance industry to cover natural catastrophes and we explore how a new economic theory of insurability in uncertainty allows us to remedy the challenges of major hazards insurance. Finally, in the last part we present artificial neural networks, a new technique rooted in artificial intelligence, and we use the multilayer perceptron neural network to develop a predictive model of natural disaster losses.
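The loss-ratio arithmetic described above can be made concrete with a small numeric sketch. All figures here are hypothetical, chosen only to illustrate the one-catastrophe-per-decade pattern; none come from the paper:

```python
# Hypothetical figures (in millions) illustrating loss ratios
# (losses incurred / premiums earned) in catastrophe vs. normal years.
premiums_earned = 100.0          # annual premiums earned
normal_year_losses = 40.0        # losses in a non-catastrophe year
catastrophe_losses = 600.0       # losses in the one-in-ten catastrophe year

normal_ratio = normal_year_losses / premiums_earned        # low loss ratio
catastrophe_ratio = catastrophe_losses / premiums_earned   # very high loss ratio

# Over a decade with one catastrophe, the average annual loss is close to
# the annual premium, which is why a higher premium is needed in quiet years.
avg_annual_loss = (9 * normal_year_losses + catastrophe_losses) / 10

print(normal_ratio, catastrophe_ratio, avg_annual_loss)  # 0.4 6.0 96.0
```

The point of the sketch: even though nine years out of ten look very profitable (loss ratio 0.4), the decade-average loss nearly exhausts the premium, so quiet-year premiums must carry the catastrophe year.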

Uncertainty and Risk Evolution

1.1 The Characteristics of the New Risks
In recent decades most risk characteristics have been altered. In fact, new risks such as terrorism, natural disasters and major technological hazards have been classified as catastrophic risks. Such major disasters are characterized by their high consequences, low probability, high antiselection, moral hazard and loss correlation. These characteristics differ significantly from Baruch Berliner's (1982) earlier insurability conditions (Table 1). Moreover, environmental changes and increasing uncertainty make insuring against catastrophic risks more difficult. In fact, the new generation of risks threatens not only human security and property but also critical infrastructure (such as hospitals, schools, telecommunications and electricity networks). These risks can be called major, catastrophic, extraordinary, disastrous, large-scale, hypothetical or potential risks. Furthermore, a single event can simultaneously trigger many claims in various insurance branches. For such high-consequence, low-probability events, insurers cannot offer the appropriate premium.

The Destructive Impacts of Natural Catastrophic Risks
During the last few years, we have recorded an increase in the number and severity of natural catastrophes worldwide. Natural disasters result in varied losses and damage extents. Large loss events are rare and unique; perhaps one event affects many countries at once, or an event triggers a second (like the 2004 earthquake in southern Asia causing a tsunami in 13 countries, a multi-country disaster). However, the major problem of catastrophic risk is manifold loss dependencies. Hence, to adequately assess loss occurrences, the model must take into account dependencies including:

- The correlation among claims for losses covered by different policies (such as life, estate, car, employment and business interruption);
- The correlation among different events, accounting for possibly cascading effects (such as earthquake → landslide → dam failure → flood → technical accident → diseases).
Disaster hazards exceed the boundaries of classic risks, greatly affecting insurers' capacity to cover the losses incurred.
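The first type of dependency, correlated claims across policy lines driven by a single catastrophic event, can be illustrated with a minimal common-shock simulation. The event probability and the loss distributions below are arbitrary placeholders, not calibrated to any data in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 100_000

# A single catastrophe indicator drives claims in several lines at once
# (e.g. property and life), producing correlated aggregate losses even
# though losses are independent within each regime.
catastrophe = rng.random(n_years) < 0.05   # roughly a 1-in-20-year event

property_loss = np.where(catastrophe,
                         rng.lognormal(4.0, 1.0, n_years),   # severe years
                         rng.lognormal(1.0, 0.5, n_years))   # quiet years
life_loss = np.where(catastrophe,
                     rng.lognormal(3.0, 1.0, n_years),
                     rng.lognormal(0.5, 0.5, n_years))

corr = np.corrcoef(property_loss, life_loss)[0, 1]
print(f"correlation between lines: {corr:.2f}")   # clearly positive
```

Because the two lines share the catastrophe indicator, their losses are strongly positively correlated; an insurer covering both lines cannot treat them as independent when computing capital needs.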

Insurers' Capacity to Cope with Natural Disasters
Following natural disasters, individuals and insurers typically suffer very large losses. These risks pose considerable problems for the insurance industry, as well as the insured. As a consequence, we expect a major effect on the affordability of catastrophe insurance and on coverage availability in natural disaster prone areas.
The lack of data about the frequency and magnitude of natural events precludes insurers from charging appropriate premiums. Consequently, after a catastrophe insurers increase premiums, leading the insured to under-subscribe coverage for their future natural disaster risks. One would expect a decline in insurance policy sales as the price of insurance rises. To be sure, results indicate more influences are at work. In fact, catastrophes lead to a reduction in the net number of firms writing insurance coverage, as well as an increase in the probability of exit, because insurers are reluctant to enter markets that expose them to the risk of bankruptcy. This significantly threatens natural disaster insurance availability. The Insurance Services Office (ISO) (1994), Howard C. Kunreuther (1996) and John D. Cummins, Neil A. Doherty and Anita Lo (1999) indicate many insurers withdrew from natural catastrophe markets following large losses in the last decade.
These effects are greatest for the firms least able to withstand the major financial shock of a catastrophic event. In effect, natural hazards threaten insurers with insolvency, especially when events generate highly correlated claims for losses covered by different policies.

The Emergence of a New Economic Theory of Insurability in Uncertainty
In the neoclassical model, key assumptions are idealistic, including symmetry of information, free entry/exit, homogeneity and absolute rationality. In response, John M. Keynes (1921) and Frank H. Knight (1921) pioneered the concept of uncertainty.
The development of economic theories of uncertainty unfolded over many subsequent decades. The pioneering work in this field is certainly Kenneth Arrow's 1953 paper. In practice, this theory was the first stone laid in the edifice of the economics of insurance.
Thereafter, George A. Akerlof (1970) demonstrated that the neoclassical hypothesis of information symmetry is false. In fact, asymmetrical information causes two types of insurance market failure: antiselection and moral hazard. The theory of contracts, Akerlof's main contribution, suggests remedying insurance market failure by providing a bundle of contracts to the insured.
More recently, insurance faces new categories of risk resulting in extensive losses. These major hazards affect people's wellbeing in various ways; for example, they destroy infrastructure, natural resources and networks. Hence, these catastrophic risks call for an appropriate burden-sharing strategy in which private and public insurance play an important role. Sophie Chemarin and Claude Henry (2005) suggest the public and private sectors must cooperate to reduce the risk of insurer insolvency and to prevent the risk of uncovered losses for the insured. The new economic theory of insurability in uncertainty offers three techniques to facilitate covering natural disaster losses:

- Reinsurance: an insurer's insurance, provided by multinational companies with greater resources to support large risks than a small insurer has;
- Capital markets: a transfer of risks to capital markets through Catastrophe (or "Cat") Bonds. Such markets are growing in importance and attract many kinds of investors (such as insurers and hedge funds);
- Public-Private Partnerships: programs that complement private insurance and reinsurance coverage through risk-sharing and financing mechanisms.

Moreover, the theory recommends ex ante risk management, focusing on preventive measures and loss mitigation. Undoubtedly, prevention helps government and the insurance industry reduce the loss burden when a catastrophe occurs. The new economic theory of insurability in uncertainty presents three further institutional innovations:

- Pools: agreements between insurance and reinsurance companies to mobilize sufficient capacity to cover major risks. These institutions have a mutual form;
- Risk Retention Groups (RRGs): insurance companies with the structure of a commercial society, a mutual form and a specialized civil responsibility. Their development improves coverage availability. However, RRGs are mostly developed in the professional services and health sectors;
- Captives: insurance or reinsurance companies with a mutual form but not belonging to the insurance sector. Their focal vocation is to cover the holder's risks.
Accordingly, this new theory directs attention toward the key position of the institutional economy. In fact, the market solution is not Pareto efficient and public intervention is obligatory when natural disasters affect public goods. This emphasises the role of public bodies such as municipalities and territorial and regional authorities in issuing early warnings of natural disasters and alleviating their damages. Indeed, the conventional wisdom of the new economic theory of insurability in uncertainty notes the complementarity between coverage and prevention. In practice, good prevention is needed to mitigate damages and to guarantee the availability and affordability of natural disaster insurance.

The Application of Artificial Neural Networks (ANNs) on Natural Disaster Insurance Problems
To cover risks, insurers should be able to identify them and assess their frequency and gravity. In the case of natural disasters it is often hard to estimate the risk's probability and the potential amount of losses. In addition, Kunreuther and Erwann O. Michel-Kerjan (2007) indicate natural disaster insurance is constrained by antiselection, moral hazard and loss correlations. Moreover, risk variation over time is a curious phenomenon linked to natural disasters: the distribution of losses due to catastrophes may change over time for a variety of reasons. Somewhat surprisingly, W. Kip Viscusi and Patricia Born (2006) point out there is no econometric analysis whatsoever addressing fundamental aspects of how catastrophic risks affect insurance markets. For these reasons, in the case of natural disasters we cannot apply the usual approach of risk management, which requires available historical data to generate the hazard's probability. To improve the assessment of natural disaster damages, insurance companies must use modeling tools accounting for the complexity of manifold dependencies in the stochastic process of catastrophic events and losses. The models must also manage the lack of data regarding frequency and losses. In this field, ANNs allow the simulation of catastrophes and permit generalization despite the shortage of information and the uniqueness of each natural disaster.

ANNs constitute a relatively new technique rooted in artificial intelligence (AI).
They consist of a mathematical model emulating the behaviour of the human brain and offer an interesting capacity to identify patterns among a group of variables without any assumption about the underlying relationship. According to Gérard Dreyfus (1997), the advantage of ANNs compared to other data analysis techniques is that they are parsimonious universal function approximators: with equal precision, neural networks require fewer adjustable parameters than the universal approximators usually used. Used in research areas such as character and voice recognition, medical diagnosis and financial and economic research, they have proved reliable. In fact, artificial neural networks are best at identifying patterns or trends in data. They are well suited for prediction or forecasting needs, including risk management and weather prediction.
The most commonly used ANN is the Multilayer Perceptron (MLP), a fully connected network (Frank Rosenblatt 1962). The MLP is arranged in layers: an input layer receiving information (input data), an output layer giving results and one or more intermediary layers. These intermediary layers, called hidden layers, are responsible for the network's computation and are able to capture the nonlinear relationship between variables. The basic components of any layer are nodes, or neurons. Each node is connected to every node in the subsequent layer. Information passes forward without feedback from the input layer to the output layer (Figure 1). For this reason, this type of ANN is called a "feedforward neural network".

Figure 1 Schematic Representation of the MLP
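The layered forward pass just described can be sketched in a few lines of numpy. The dimensions (19 inputs, 50 hidden neurons, 1 output) mirror the network used later in the paper, but the weights and input here are random placeholders, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Placeholder weights for a 19-input, 50-hidden, 1-output feedforward MLP.
n_inputs, n_hidden = 19, 50
W1 = rng.normal(0, 0.1, (n_hidden, n_inputs))  # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, n_hidden)              # hidden -> output weights
b2 = 0.0

def mlp_forward(x):
    hidden = sigmoid(W1 @ x + b1)   # hidden layer: nonlinear activation
    return W2 @ hidden + b2         # output layer: linear combination

x = rng.random(n_inputs)            # one hypothetical input vector
y_hat = mlp_forward(x)
print(y_hat)
```

Information flows strictly from input to hidden to output with no feedback loop, which is exactly what makes this architecture "feedforward".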
After constructing the ANN architecture, we start the learning process. ANN learning, as in biological systems, involves adjusting the synaptic connections (weights) between the input neurons and output neurons. Over iterations, we train an ANN to perform a particular function until the ANN performs the desired task. Generally, the learning process uses a part of the data called the "learning sample". Most learning algorithms are optimization procedures seeking to minimize, with nonlinear techniques, a cost function C. This function is an important concept in learning, as it measures how far we are from an optimal solution $f^*$ to the problem we seek to solve. The learning algorithm searches through the solution space to find a function with the least possible cost. There are three major learning paradigms, each corresponding to a particular abstract learning task: supervised learning, unsupervised learning and reinforcement learning. In this paper, we use the MLP neural network with supervised learning. This learning process consists of altering the connection weights until the network outputs (y) nearly reproduce the desired ones (d). In the case of the MLP neural network, the optimal solution of the cost function is the minimum of the mean square error (MSE) between the network's responses and its targets:

$MSE = \frac{1}{N}\sum_{i=1}^{N}(d_i - y_i)^2$,

where N is the number of observations.
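The MSE cost function is a one-liner in practice. The targets and outputs below are small made-up vectors used only to show the computation:

```python
import numpy as np

# Mean squared error between desired targets d and network outputs y
# over N observations (here N = 4, with made-up values).
d = np.array([1.0, 2.0, 3.0, 4.0])   # desired outputs (targets)
y = np.array([1.1, 1.9, 3.2, 3.8])   # network outputs

mse = np.mean((d - y) ** 2)
print(mse)  # 0.025
```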
The MLP with n inputs, one hidden layer and a single output can be expressed as:

$y = f\left(\sum_{j=1}^{h} w_j \, g\left(\sum_{i=1}^{n} w_{ij}\, x_i + b_j\right) + b\right)$

In our case, y is the natural disaster loss output and $x_i$, $i = 1, 2, 3, \ldots, n$, is the associated input vector of explanatory variables; h is the number of neurons in the hidden layer, determined empirically; g and f are respectively the hidden and output activation functions, usually chosen to be monotone and non-decreasing. In this paper g is the sigmoid function and f is the linear function. The $w_{ij}$ and $w_j$ are the weights (or parameters), adjusted (estimated) iteratively by a supervised learning algorithm, the error backpropagation algorithm proposed by David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams (1986).
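The whole supervised-learning loop (sigmoid hidden layer, linear output, gradient-descent backpropagation minimizing the MSE) can be demonstrated end to end on a toy problem. Everything below is illustrative: the target function (a sine curve), the network size, the learning rate and the epoch count are choices made for this sketch, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised-learning run: fit y = sin(x) on [-pi, pi] with a
# 1-input, 10-hidden, 1-output MLP (sigmoid hidden, linear output),
# trained by plain full-batch gradient-descent backpropagation.
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
D = np.sin(X)                                 # desired targets d

n_hidden, lr, epochs = 10, 0.1, 10_000
W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(epochs):
    H = sigmoid(X @ W1 + b1)                  # forward pass: hidden layer
    Y = H @ W2 + b2                           # forward pass: linear output
    err = Y - D                               # gradient of MSE w.r.t. Y (up to a constant)
    # backward pass: propagate the error through the two layers
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * H * (1 - H)           # sigmoid derivative term
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2            # weight updates
    W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - D) ** 2)
print(f"final MSE: {mse:.4f}")
```

The loop is exactly the process the paper describes: adjust the weights iteratively until the network outputs nearly reproduce the desired ones, as measured by the MSE.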

Natural Disasters Data
The best study of any problem requires collecting the maximum information and detail about it. This step is highly recommended to improve our knowledge and to identify the weaknesses and strengths of the subject of study. In this field, the key difficulty for natural disaster study is the absence of accurate and updated databases. In fact, there is often a data shortage about an event's gravity and occurrence. Consequently, the majority of natural disaster databases suffer from incomplete and inaccurate data, especially regarding information about economic losses. Another weakness of natural disaster data is access: many databases are not publicly accessible. In addition, there are few international databases covering natural disasters worldwide. Furthermore, there are few databases covering all types of natural disasters. In fact, most databases cover only one kind of disaster, like the Dartmouth Flood Observatory (DFO) database for floods and the United States Geological Survey (USGS) database for earthquakes. In this study, we use two databases: the Emergency Management Disasters Database (EM-DAT) and the Global Environmental Outlook (GEO) Data Portal.

The Emergency Management Disasters Database (EM-DAT)
The EM-DAT is a publicly accessible international database collecting information on natural and technological disasters. The database contains over 16,000 entries, with an average of 700 new entries per year, and covers the period from 1900 to the present. It is updated daily and is made available to the public after validation. EM-DAT has four main criteria for a disaster's inclusion:

- 10 or more people killed;
- 100 or more people affected;
- A declaration of a state of emergency;
- A call for international assistance.
Events are entered on a country-level basis, and the information collected includes location; date; the number of people reported killed, injured, homeless and affected; and the estimated damage costs. The website provides free access to disaster data on occurrence and impact by country, a disaster profile section and an advanced data search interface. Various analyses, trends, maps and related documents are also available on the website.
According to Debarati Guha-Sapir, Liz Tschoegl and Regina Below (2006), the database is compiled from various sources including United Nations agencies such as the United Nations Environment Programme (UNEP), the Office for the Coordination of Humanitarian Affairs (OCHA), the World Food Programme (WFP) and the Food and Agriculture Organization (FAO); non-governmental organizations such as the International Federation of the Red Cross (IFRC); research institutions; insurance institutions such as Lloyd's; and press agencies.
CRED and numerous other organizations use the EM-DAT to analyze disaster occurrences and impacts, to identify high-risk areas or populations and to highlight priorities for disaster preparedness, mitigation and prevention (e.g., the World Bank's Natural Disaster Hotspots and the United Nations International Strategy for Disaster Reduction's Living with Risk).
It ensures comparability of data through consistent use of common definitions and scientific terminology across countries and time. The methodology for data capture, validation and cross-checking has been developed over two decades and approved by an international group of experts. The most recent experience is the methodology developed for drought and famine data with the International Research Institute for Climate and Society (IRI). In addition, CRED and Munich Re recently agreed on a standardized terminology of disaster classification.
Guha-Sapir (2007) finds that ample experience in data collection and management, use of normative rules and clear definitions, a clearly stated methodology, development and use of validation methods and tools, automation of data entry and outputs, and comparability of data over time and space are specific features boosting EM-DAT's credibility. The database provides natural disaster losses for our period of study.

Global Environment Outlook (GEO) Data Portal
The online GEO Data Portal is an intuitive and flexible resource that can be quickly and easily used for remote data retrieval because of its simplicity and modest technical requirements. The GEO Data Portal is the authoritative source for the data sets used by UNEP and its partners in the GEO report and other integrated environment assessments.
Its online database holds more than 450 different variables, as statistics or as geospatial data sets. The contents of the Data Portal cover environmental themes such as climate, forests and freshwater, as well as socioeconomic categories including education, health, economy, population and environmental policies. The data can be conveniently displayed and explored through maps, graphs and data tables, downloaded in various common formats, or copied and pasted into word processors.
The successful development and wide use of the GEO Data Portal, combined with strong demand from regions and countries, led to various initiatives to install regional manifestations in certain parts of the world. Guided and supported by staff of the Division of Early Warning and Assessment/Global Resource Information Database (DEWA/GRID-Europe), these regional GEO Data Portals provide additional and more detailed data from the region itself and improve access to core environmental data sets for use in GEO-related assessment and reporting work. The Latin America and Caribbean region (LAC) was the first to develop such a data system: after various consultations and review meetings in the region, the GEO LAC Data Portal was formally launched in early 2006. The West Asia region began by applying the global GEO Data Portal template for its regional implementation; this prototype was set up in the summer of 2005 during a consultation in Abu Dhabi. The Asia and Pacific region took a similar approach, with a prototype GEO Data Portal developed in the second half of 2005. In Africa, the regional Data Portal has already progressed substantially, being compatible with the global Data Portal. It is also envisaged to capture much more data and many more indicators to serve the GEO process, including the African Environment Outlook and other initiatives in and of the region.
While we originally considered 29 explanatory variables, we used 19 from the GEO database (Table 2). Natural disasters are complicated phenomena; in effect, determining their causes or determinants is largely ambiguous. After careful study, we restricted the number of variables to the most representative ones, leaving 19 variables we suspect have the greatest influence on natural disaster losses. These final variables correspond to six themes: atmosphere, forests, freshwater, land, urban areas and socioeconomic factors (Table 3). In Section 3.3 we undertake a sensitivity study for more accuracy in the choice of selected variables.

Using the MLP to Predict Natural Disaster Losses
The MLP neural network approach can capture the nonlinear relationship between variables well, approximating any problem despite its complexity. Accordingly, this approach is appropriate when studying a complex phenomenon such as natural disaster risk. Generally, ANNs are useful in capturing dependency and nonlinearity; accordingly, they are applied in domains such as process modeling and control, robotics, aerospace and financial forecasting. Before testing the applicability of this approach to the insurance domain, we must survey previously used methods. Initially, works on natural disaster insurance used probabilistic models for scenario simulation and for determining the "Probable Maximum Loss". However, in recent decades, the increased ambiguity of natural disaster hazards has called for new approaches. In effect, the great uncertainty about the probability of natural disaster hazards and their catastrophic consequences precludes insurers from charging an adequate premium. Therefore, some new works have converged on the artificial intelligence domain and the use of the ANN approach to better understand "Low Probability, High Consequence" natural disaster hazard risks. In this field, we can cite some papers in insurance using the ANN technique. For example, Patrick L. Brockett et al. (1994) use a neural network artificial intelligence model as an early warning system to predict insurer insolvency, Shiva S. Makki and Agapi Somwaru (2001) use an ANN model to explain the choice of a crop insurance contract, and Matthew J. Aitkenhead, Parivash Lumsdon and David R. Miller (2007) develop a remote-sensing-based neural network mapping of tsunami damage in Aceh, Indonesia.
Considering that its predictive capacity is the best compared with other statistical methods (Grids of Score, Rules-Based Decision and Decision Trees), we use the MLP neural network (Figure 2).

Figure 2 The Predictive Power of Neural Networks
Finally, this work aims to improve the assessment of natural disaster consequences. Correspondingly, insurers can precisely predict the gravity of a risk if they know the area's characteristics (the input vector). As a result, they can fix an adequate premium based directly on the risk's magnitude. Insurers can also diversify their risk portfolios geographically to increase profit.
We use MATLAB software for the MLP design. MATLAB is a high-level technical computing language and interactive environment for algorithm development, data visualization, data analysis and numeric computation.
As network inputs, we use the nineteen variables we suspect to influence natural disaster losses. Thus, our MLP has 19 inputs (explanatory variables) and one output (each of the three loss types, people killed, total affected people and economic damages, is modeled in turn). The 21-year period of study, from 1980 to 2000, covers 117 countries. We consider one hidden layer. George Cybenko (1989) states in his theorem that a single-hidden-layer feedforward neural network is capable of approximating any continuous multivariate function to any desired degree of accuracy.
First, we subdivide the database into two samples. The first, the learning sample, represents 80% of the data and serves for our neural network's learning. The second, the test sample, represents 20% of the data and allows us to study the predictive capacity of the obtained network. Next, we establish the network's architecture by determining the number of neurons in the hidden layer and the number of epochs (the number of iterations in the learning process). After that, we start the learning phase. In doing this, we aim to adjust the weights until reaching the minimal MSE between the simulated losses (the network's outputs) and the targets (Marvin Minsky and Seymour Papert 1969).
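The 80/20 subdivision into learning and test samples can be sketched as follows. The dataset dimensions mirror the paper's setting (117 countries over 21 years, 19 explanatory variables), but the data itself is a random placeholder:

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder dataset shaped like the paper's: one observation per
# country-year, 19 explanatory variables, one loss value each.
n_obs, n_vars = 117 * 21, 19          # 2457 observations
X = rng.random((n_obs, n_vars))
y = rng.random(n_obs)

# 80% learning sample, 20% test sample, drawn at random.
idx = rng.permutation(n_obs)
split = int(0.8 * n_obs)
X_train, y_train = X[idx[:split]], y[idx[:split]]
X_test, y_test = X[idx[split:]], y[idx[split:]]

print(len(X_train), len(X_test))  # 1965 492
```

Shuffling before splitting matters here: without it, the test sample would consist of whole blocks of countries or years, and test performance would no longer measure generalization across the mix of observations the network was trained on.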
In this paper, we use an MLP with one hidden layer, in accordance with the Cybenko theorem. Many MLP networks with one intermediary layer were tested, initially for 1,000 epochs. For all the outputs, network RN4, with a structure of [19 50 1], simultaneously gives the greatest reduction of the learning MSE and the smallest test MSE (Table 4). Hence, we retain the MLP [19 50 1] (Figure 3). Once we have fixed the number of layers and the number of neurons in each layer, we must determine the number of iterations (epochs) that allows us to reach the smallest MSE. After running the network many times, we fix the number of iterations at 20,000 epochs, because the MSE decreased considerably and eventually became stable. Globally, the developed network returns good results for the three types of losses. In fact, the MSE for the learning sample decreases considerably. In addition, the test sample applied to the obtained model gives good performance. Consequently, the model illustrates a good predictive capacity. Hence, ANNs are well adapted to complicated problems such as natural disaster risks.
All works using the MLP neural network approach use the MSE as the network performance function, measuring performance according to the mean of squared errors. In this paper, we use the same measure to test the network's performance. During the learning process for output n°1 (people killed), the MSE decreased considerably, to 2.283 × 10⁻. We apply the obtained model to the test sample to study its predictive capacity. The test MSE is about 13,025, indicating good predictive power. Accordingly, the network provides an excellent tool for insurers to predict the fatalities resulting from natural disasters, and they can then calculate an accurate premium amount. Natural catastrophe risks are often qualified as unique cases; the only remaining ambiguity is an event's gravity. Once the gravity is identified, insurers can calculate an adequate premium.
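The paper's premium equation is not reproduced in the extracted text. As an assumption, the sketch below uses a standard actuarial form, premium = predicted expected loss times one plus a loading for expenses and risk; the loss value and loading factor are hypothetical:

```python
# Standard expected-value premium principle (an assumption, not the
# paper's equation): premium = predicted loss * (1 + loading).
predicted_annual_loss = 2_500_000.0   # hypothetical network output (monetary units)
loading_factor = 0.25                 # hypothetical expense/risk loading

premium = predicted_annual_loss * (1 + loading_factor)
print(premium)  # 3125000.0
```

Under this principle, a more accurate loss prediction from the network translates directly into a premium closer to the true cost of the risk, which is the mechanism the paper relies on.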
A similar process applies to output n°2 (total affected people), with the same conclusions. During the learning process, the MSE decreased noticeably, to 7.842 × 10⁻⁵. The MSE given by the test sample equals 47,098. This higher MSE illustrates the difficulty of precisely assessing the total number of affected people. In fact, this type of loss requires accurate information about all missing, injured and homeless persons. Regardless, this error is generally acceptable and can be reduced as data availability improves. We conclude the network has a sufficient predictive capacity for the total number of affected people.
For output n°3 (economic damages), we obtain during the learning phase an MSE equal to 2.991 × 10⁻⁴. Hence, this network exhibits a good learning capacity. The test sample MSE is 44,015. This higher MSE illustrates the difficulty of predicting economic damages. This is reasonable because data about economic losses are always inaccurate; in fact, there is no clear distinction between direct and indirect losses. Moreover, loss information per sector is often unavailable. Overall, though, the network shows an acceptable predictive capacity for economic losses.
Next, we determine each variable's effect on the amount of damages. To do so, we undertake a sensitivity analysis, approximating the impact of each explanatory variable on a natural disaster's gravity. In other words, this allows us to classify explanatory variables according to their influence on each type of natural disaster loss (Table 5). On the one hand, this classification helps us handle prevention more efficiently. On the other hand, insurers can charge their customers different fees according to their degree of prevention. This restores a competitive market for natural disaster insurance. Insurers can promote prevention efforts by attributing low premiums to the insured who undertake higher prevention efforts (better management of the most influential variables). The insurance industry participates greatly in disaster risk reduction when it incentivizes households and industries to undertake preventive measures. In effect, natural disaster insurance can be linked to prevention. For example, Kunreuther and Michel-Kerjan (2007) suggest insurers can encourage firms that are large emitters of greenhouse gases to restrict their future pollution by refusing to provide coverage unless the firms undertake preventive measures.
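One common way to run such a sensitivity analysis is perturbation: nudge one input at a time, measure the change in the network output, and rank variables by the magnitude of that change. The sketch below does this on a random (untrained) 19-input MLP, so the weights, the input point and the resulting ranking are placeholders, not the paper's results:

```python
import numpy as np

rng = np.random.default_rng(7)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Random placeholder MLP with 19 inputs and 50 hidden neurons.
n_inputs, n_hidden = 19, 50
W1 = rng.normal(0, 0.3, (n_inputs, n_hidden)); b1 = rng.normal(0, 0.1, n_hidden)
W2 = rng.normal(0, 0.3, n_hidden); b2 = 0.0

def predict(x):
    return sigmoid(x @ W1 + b1) @ W2 + b2

# Perturb each input in turn and record the absolute output change
# per unit of perturbation (a finite-difference sensitivity).
x0 = rng.random(n_inputs)
base = predict(x0)
eps = 1e-3
sensitivity = np.empty(n_inputs)
for i in range(n_inputs):
    x = x0.copy()
    x[i] += eps
    sensitivity[i] = abs(predict(x) - base) / eps

ranking = np.argsort(-sensitivity)   # most influential variable first
print(ranking[:5])
```

Applied to the trained network, such a ranking is what populates a table like Table 5: variables at the top of the list are the ones whose management (prevention) most affects predicted losses.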

Conclusion
This study aims to help insurers in their coverage task and to emphasise their preventive role. In practice, all actors (public bodies, multilateral organizations, citizens and the private sector) must strengthen their partnerships to achieve better results in disaster risk reduction policy. The insurance industry has a crucial role in this respect because it assesses risks, offers coverage and supports prevention. Prevention greatly reduces natural disaster consequences, widely alleviating an insurer's burden of natural disaster losses. For this reason, insurers can use premiums as price signals to incentivize individuals and industries to undertake preventive measures. Indeed, it is more efficient to invest in prevention than in relief and emergency response.
Traditional statistical tools using probabilities of identically and independently distributed risks cannot deal with natural catastrophe risks. As the distribution of natural disaster losses changes from one event to another, there are no repetitive cases from which to establish a predictive model over time. Consequently, we use an MLP neural network technique to predict natural disaster losses from variables suspected to influence the risk consequences.
Globally, the obtained network displays good learning and prediction capacities, allowing insurers to appropriately assess damages and calculate adequate premiums. Moreover, this method helps identify the variables with more influence on natural disaster losses. In other words, insurers can encourage prevention by charging lower premiums to those undertaking more preventive measures. Hence, the MLP is a good tool for insurance coverage and prevention actions.
Finally, the developed model has the least predictive capacity for the economic damages output. As mentioned, this is linked to the inaccuracy of economic loss data. In fact, we lack standard methods for assessing the full (direct and indirect) economic consequences stemming from natural disasters.
Our prospect is to develop these results and use them to motivate the construction of a more credible database of natural disaster economic losses. Furthermore, we seek to promote preventive measures and to increase awareness of the importance of natural disaster insurance for sustainable development.
EM-DAT was created in 1988. It is maintained by the Centre for Research on the Epidemiology of Disasters (CRED) at the Catholic University of Louvain in Brussels, Belgium. The main aim of EM-DAT is to provide rapid and accurate information serving the humanitarian community. Formal collaborations were established in 1999 with the United States Agency for International Development Office of Foreign Disaster Assistance (USAID-OFDA) and in 2002 with the Climate Information Project of the United States National Oceanic and Atmospheric Administration (US-NOAA). This last joint effort led to the development of the EM-DAT website.

Figure 3 Graphical Representation of the Developed Neural Network MLP [19 50 1]

Table 2
The Original List of Explanatory Variables

Table 3
The Final List of Explanatory Variables Source: UNEP (2009).

Table 4
Comparison of One Hidden Layer MLP Neural Networks

Table 5
Classification of Variables According to their Decreasing Sensitivity Order Source: Authors' calculations.