Introduction
Information about the number of patients or the extent of their injuries in a disaster situation is usually uncertain at the beginning of a response; details are revealed step by step as new information arrives. Reducing this uncertainty is important for managing equipment and rescue personnel in a disaster setting. Comparing entropy, in the sense of Shannon's Information Theory,[1] among different situations permits measurement of the value of newly given messages. Entropy is defined as the average amount of information over all symbols (ie, elements generated from an information source) arising from that source. The amount of information is measured according to the probability of an event: the less likely an event is, the more information it provides when it occurs. It is calculated as the logarithm of the inverse of the probability of each symbol drawn from the source, measured in bits, the basic unit of information in computing and digital communication.[2] Entropy represents the uncertainty of the symbols arising from the source: the larger the entropy of an information source, the more difficult it is to predict which symbols will arise from it. In the clinical field, entropy from Information Theory is applied mainly to the analysis of bioelectrical signals, such as those in electrocardiography[3] or electroencephalography.[4] Here, a simulation study was performed using a simplified disaster model based on Shannon's Information Theory, with the intent of evaluating the reduction of uncertainty of information in a triage setting.[1]
Table 1 Components of Information Theory in Relation to Disaster Triage

Hypothesis
A disaster triage scene with a specific number of injured patients represents a source of information about the severity of patients’ incapacity. It is possible to quantify the uncertainty of information regarding the degree of impairment as entropy if the information source and information arising from the source in Information Theory can be adapted for use in the disaster locale (Table 1), and if the degree of patients’ disability is converted to numerical information that is applicable to the available equipment and rescue personnel.
Methods
Setting
Five different scenarios of a fire disaster in a hospital were set, Case 1 through Case 5, and entropy was calculated according to the situation in each. The amount of information contained in the triage itself or in the message was also calculated for Cases 2 through 5.
The victims of the disaster were 10 hospitalized patients, whose severity of injury was unknown. The case settings were as follows:
1. Case 1: No triage officer and no message;
2. Case 2: One triage officer and no message;
3. Case 3: One triage officer and a message that only six patients could walk;
4. Case 4: One triage officer and a message that all patients could obey commands; and
5. Case 5: One triage officer and a message that all patients could walk.
Medical preparedness of the triage system consisted of the following:
1. Triage officer: absent, or one person; and
2. Triage criteria: Table 2 shows triage criteria created arbitrarily for this study.
Table 2 Triage Criteria

These criteria consisted of four categories adapted from Simple Triage and Rapid Treatment (START) triage.[5] This triage was used not for priority of treatment but for allocation of equipment and rescue personnel. Equipment and rescue personnel were as follows: facemasks, bag masks, oxygen tanks, wheelchairs, and stretchers (10 of each) and rescue personnel (40 total).
Calculation of Entropy and Amount of Information
Entropy
Each element generated from an information source is called a symbol.[6] The expected value of the amount of information from the source is called entropy (H) and is calculated by the following formula:[6]

H = Σ Pi × log2(1/Pi), summed over i = 1 to n  (1)

where Pi is the probability of each symbol arising from the source; log2(1/Pi) is the amount of information of each symbol, measured in bits; and n is the number of symbols contained in the source. H, the entropy, also indicates the uncertainty of a symbol arising from the source and is measured in bits per symbol (bits/symbol).
H takes its maximal value when every symbol arises with the same probability:

Hmax = log2 n  (2)
If several different events coexist, they are considered as a composite event. If the events are independent of each other, the entropy of the composite event is calculated by summing the entropies of the individual events (Hi):

H = Σ Hi, summed over i = 1 to m  (3)

where m is the number of events and Hi is the entropy of each event.
Entropies of the disaster scene in all cases were calculated by applying information on patients' incapacity, converted to numbers in relation to types of equipment and rescue personnel, to formulas (1) through (3).
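Formulas (1) through (3) can be sketched in a few lines of code. The following is an illustrative Python sketch, not part of the original study; the function names are chosen here for clarity.

```python
import math

def entropy(probs):
    """Formula (1): H = sum of Pi * log2(1/Pi), in bits/symbol.
    Zero-probability terms contribute nothing, by convention."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def max_entropy(n):
    """Formula (2): H is maximal, log2(n), when all n symbols are equiprobable."""
    return math.log2(n)

def composite_entropy(entropies):
    """Formula (3): for independent events, entropies simply add."""
    return sum(entropies)

# Four equiprobable severity levels: formula (1) reaches the formula (2) maximum.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(max_entropy(4))                     # 2.0
```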
Amount of Information
If a message reduces H from Ha to Hb, the amount of information in the message (Inf) is calculated by:

Inf = Ha − Hb  (4)
The amount of information for the newly given message in Cases 3 through 5 was calculated by formula (4).
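As a minimal illustration (not from the original paper; the function name is ours), formula (4) is a single subtraction:

```python
def information(h_a, h_b):
    """Formula (4): bits of information in a message reducing entropy from Ha to Hb."""
    return h_a - h_b

# Example values from the text: triage reduces H from 5.49 to 2.00 bits/symbol.
print(round(information(5.49, 2.00), 2))  # 3.49
```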
Results
The entropy and amount of information were calculated in each case scenario.
Case 1: No Triage Officer and No Information
The triage system did not function owing to the absence of a triage officer. Equipment and numbers of rescue personnel were chosen randomly, so all combinations of equipment and rescuers were possible. The probability of each choice, and its entropy according to formula (2), were as follows:

1. P1, the probability of any one oxygen-delivery method (no oxygen, facemask, or bag mask) being chosen, was P1 = 1/3. Its entropy was HP1 = log2(1/P1) = log2 3 = 1.58.
2. P2, the probability of any one transport method (walking, wheelchair, or stretcher) being chosen, was P2 = 1/3. Its entropy was HP2 = log2(1/P2) = log2 3 = 1.58.
3. P3, the probability of any one number of rescue personnel (0, 1, 2, 3, or 4) being chosen, was P3 = 1/5. Its entropy was HP3 = log2(1/P3) = log2 5 = 2.32.
As P1, P2, and P3 were independent of each other, according to formula (3), H in Case 1 was H = HP1 + HP2 + HP3 = 5.49 bits/symbol.
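The Case 1 arithmetic can be checked with a short Python sketch (illustrative only; the variable names are ours):

```python
import math

# Case 1: equipment and rescuer counts are chosen at random.
h_oxygen    = math.log2(3)  # no oxygen, facemask, or bag mask
h_transport = math.log2(3)  # walking, wheelchair, or stretcher
h_rescuers  = math.log2(5)  # 0, 1, 2, 3, or 4 rescue personnel

# Formula (3): the three choices are independent, so their entropies add.
h_case1 = h_oxygen + h_transport + h_rescuers
print(round(h_case1, 2))  # 5.49
```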
Case 2: One Triage Officer and No Information
The triage system functioned with a triage officer, who categorized the 10 patients into four severity levels according to the triage criteria (Table 2). Equipment and numbers of personnel were determined by the criteria. As the severity of the patients' injuries was unknown, the four levels were considered equally likely, each with probability 1/4. According to formula (2), H in Case 2 was H = log2 4 = 2.00 bits/symbol.
The triage system reduced the entropy in the disaster setting of Case 2 to 2.00 from 5.49. Therefore, according to formula (4), Inf of the triage was Inf=5.49 − 2.00=3.49 bits.
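A quick check of the Case 2 figures (an illustrative sketch, not the authors' code):

```python
import math

# Case 2: four equiprobable severity levels under the triage criteria.
h_case2 = math.log2(4)  # formula (2)
print(h_case2)  # 2.0

# Formula (4): information contributed by the triage system itself,
# starting from the Case 1 entropy log2(3) + log2(3) + log2(5).
h_case1 = math.log2(3) + math.log2(3) + math.log2(5)
print(round(h_case1 - h_case2, 2))  # 3.49
```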
Case 3: One Triage Officer and a Message that Only Six Patients Could Walk
The triage system functioned with a triage officer. A message that only six patients could walk (and were thus Level 1) indicated that the other four patients were between Level 2 and Level 4. Equipment and numbers of personnel were allocated according to the triage criteria. The probability of Level 1 in Case 3 was 6/10, and the probability of each of the other three levels was 4/30. According to formula (1), H in Case 3 was H = 6/10 × log2(10/6) + 3 × (4/30) × log2(30/4) = 0.44 + 1.16 = 1.60 bits/symbol.
The message that only six patients could walk reduced entropy to 1.60 from 2.00 in the disaster scenario of Case 3. Therefore, according to formula (4), Inf of the message in Case 3 was Inf=2.00 − 1.60=0.40 bits.
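The Case 3 entropy and message information can be reproduced as follows (an illustrative sketch under the assumptions stated in the text):

```python
import math

# Case 3: 6 of 10 patients are Level 1; the remaining 4 are spread
# evenly over Levels 2 through 4, giving probability 4/30 per level.
probs = [6/10, 4/30, 4/30, 4/30]
h_case3 = sum(p * math.log2(1 / p) for p in probs)  # formula (1)
print(round(h_case3, 2))  # 1.6

# Formula (4), starting from the Case 2 entropy of 2.00 bits/symbol.
inf_case3 = 2.00 - h_case3
print(round(inf_case3, 2))  # 0.4
```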
Case 4: One Triage Officer and a Message that All Patients Could Obey Commands
The triage system functioned with a triage officer. A message that all 10 patients could obey commands restricted the triage choice to two levels: Level 1 or Level 2. Equipment and numbers of personnel were allocated according to the triage criteria as modified by the message. The two levels had the same probability, 1/2. According to formula (2), H in Case 4 was H = log2 2 = 1.00 bits/symbol.
The message that all 10 patients could obey commands reduced entropy to 1.0 from 2.0 in the disaster scenario of Case 4. Therefore, according to formula (4), Inf of the message in Case 4 was Inf=2.00 − 1.00=1.00 bits.
Case 5: One Triage Officer and a Message that All Patients Could Walk
The triage system functioned with a triage officer. A message that all 10 patients could walk, considered as Level 1, left no other choice; according to the triage criteria, no equipment or rescue personnel were needed. The probability of Level 1 in Case 5 was 10/10. According to formula (2), H in Case 5 was H = log2 1 = 0.00 bits/symbol.
The message that all 10 patients could walk reduced entropy to 0.00 from 2.00 in the disaster setting of Case 5. Therefore, according to formula (4), Inf of the message in Case 5 was Inf = 2.00 − 0.00 = 2.00 bits.
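Cases 4 and 5 follow the same pattern; an illustrative check (not the authors' code):

```python
import math

# Case 4: the message restricts choice to two equiprobable levels.
h_case4 = math.log2(2)  # formula (2)
print(h_case4)          # 1.0
print(2.00 - h_case4)   # 1.0 bit of information (formula 4)

# Case 5: Level 1 is certain, so entropy vanishes and the message
# carries the full remaining 2.00 bits.
h_case5 = math.log2(1)
print(h_case5)          # 0.0
print(2.00 - h_case5)   # 2.0
```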
The values of entropy and the amount of information in each case are summarized in Table 3.
Table 3 Entropy and Amount of Information Contained in Messages

a Amount of information on the triage or the message newly added in each case.
Discussion
Information Theory is a theoretical system for information and communication introduced by Claude Elwood Shannon in 1948.[1] Although many studies have made use of the entropy of Information Theory in medicine, reports adapting the theory to disaster medicine or life-saving issues have been rare.[7,8] A disaster scene populated by injured patients represents a source of information regarding the extent of their plight. It may therefore be possible to quantify the uncertainty of medical information about patients' condition using Information Theory, if an information source and the information arising from it can be adapted for use in the disaster location. In a small-scale disaster where the balance between supply and demand is well managed, information on patients' extent of impairment may be convertible to numerical values to aid the appropriate assignment of available equipment and rescue personnel. For this reason, the triage created arbitrarily in this study (adapted from START triage) was used not for priority of treatment but for allocation of equipment and rescuers.
Case 1 was assumed to be an unrealistic situation in which there was no leader to allocate equipment and rescue personnel to each patient using triage criteria. This case was intended to show that, despite sufficient resources, confusion could arise in the absence of an adequate triage system. In a situation where a patient's degree of severity could not be judged, information about the appropriate use of equipment and rescue personnel could not be acquired. The entropy was calculated from the combined probabilities of three independent factors: method of delivering oxygen, means of conveyance, and number of rescue personnel. The entropy reached its maximal value because every combination of resources was equally likely.
Case 2 was a situation in which a triage system functioned without additional information. As the proportion of patients in the four categories was unknown, each level was equally likely, and the entropy in Case 2 took the maximal value for four choices. Case 2 showed that applying a triage system could reduce entropy in a disaster setting by using knowledge to combine and allocate resources according to the extent of patients' disability. In Cases 3 through 5, entropies were calculable under the triage system with additional information. These values demonstrated the convergence of uncertainty of information about patients' disability through new messages.
In this study, uncertainty of information in a small-scale disaster was evaluated as entropy. Shannon’s information source and information arising from the source can be adapted and applied to information on patients’ extent of impairment at a disaster scene. Comparison of entropy derived from information on a patient’s injury severity that was converted to numerical information about available equipment and rescue personnel in differing situations enabled measurement of values of triage and newly given messages. Such values reveal the type of information that should be acquired.
Entropy may also have the potential to evaluate convergence or divergence of confusion in a triage setting or in the aid station of a large-scale disaster under conditions of imbalance between supply and demand.
Limitations
This report presents a theoretical model. Further confirmation is needed regarding to what extent quantification of uncertainty of information by entropy is related to the degree of confusion at a triage scene; for example, by measuring time to response for patients according to triage or to additional information from a disaster exercise.
Conclusion
It was possible to quantify uncertainty of information about extent of disability in patients at a triage location and to evaluate reduction of the uncertainty by using entropy based on Shannon’s Information Theory.