52 AI MAGAZINE
The Human–Machine AI section discusses human–machine teaming (data in use). The Command Guided
Swarms (CGS) section provides a motivation for CGS
(data in motion). We then provide an example of
context-based AI multimodal image fusion building
on the above concepts for CGSs, and then provide
The four types of AI (table 1) build from designing
simple devices to complex machines toward the goal
of self-awareness (Hintze 2016). Self-awareness typically relates to humans, while self-assessment relates to machines; the two are closely related concepts. Situation analysis (for example, assessment, awareness, and understanding) is commonly researched in the information fusion community, leading to human–machine information fusion systems. The data information fusion
group model (Blasch et al. 2012), shown in figure 1,
leverages AI developments at each processing stage to support both assessment (level 0, 1, 2, 3 information fusion) and refinement (level 4, 5, 6 information fusion).
System management (level 6) provides contextual con-
straints based on missions, objectives, and goals. The
data information fusion group model aligns with AI
types (table 3) as: Type I — reactive machines with rules support L0 processing; Type II — limited memory
signal processing methods are L1 functions; Type III —
theory of mind situation representations compose L2/3
goals; and Type IV — self-awareness prediction and interaction result from L4/L5/L6 analyses. The descriptions of the information fusion levels show the functions aligned to each level and the synergistic AI opportunities for autonomy. They divide into sensing (low-level information fusion — assessment, level 0, 1, 2, 3) and action (high-level information fusion — control, level 4, 5, 6).
Level 0 — Data Assessment: The estimation and
prediction of signal/object observable states on
the basis of pixel/signal level data association
(for example, information systems collections).
Level 1 — Object Assessment: The estimation and prediction of entity states on the basis of data association, continuous state estimation, and discrete state estimation (for example, data processing).
Level 2 — Situation Assessment: The estimation and
prediction of relations among entities, to include
force structure and force relations, and communications (for example, information processing).
Level 3 — Impact Assessment: The estimation and
prediction of effects on situations of planned or
estimated actions by the participants; to include
interactions between action plans of multiple
players (for example, assessing threat or intent of actions relative to planned actions and mission requirements, and performance evaluation).
Level 4 — Process Refinement (this is an element
of resource management): The adaptive data
acquisition and processing to support sensing
objectives (for example, fusion process control
and information systems dissemination).
Level 5 — User Refinement (this is an element of
knowledge management): The adaptive determination of who queries information and who has
access to information (for example, information
operations) and adaptive data retrieved and displayed to support cognitive decision-making and
actions (for example, human systems integration).
Level 6 — Mission Management (this is an element of platform management): The adaptive
determination of spatial–temporal control of
assets (for example, airspace operations), route
planning, and goal determination to support
team decision-making and actions (for example,
context operations) under social, economic, and political constraints.
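To make the low-level chain concrete, the handoff from Level 0 (signal-level detection) to Level 1 (object state estimation) can be sketched as a toy pipeline. This is a minimal illustration, not the DFIG algorithms themselves: the threshold, the alpha-beta-style smoothing gain, and the sample values are invented for the example.

```python
# Hypothetical sketch of the low-level fusion chain: Level 0 turns raw
# signal samples into detections, and Level 1 fuses those detections
# into a continuous object state estimate via exponential smoothing.
# All thresholds and gains below are illustrative, not from the article.

def level0_detect(samples, threshold=0.5):
    """L0 data assessment: keep samples whose amplitude clears a threshold."""
    return [s for s in samples if abs(s) > threshold]

def level1_track(detections, alpha=0.5):
    """L1 object assessment: exponential smoothing as a stand-in for
    continuous state estimation over associated detections."""
    estimate = None
    for z in detections:
        estimate = z if estimate is None else estimate + alpha * (z - estimate)
    return estimate

raw = [0.1, 0.9, 1.1, 0.2, 1.0, 0.95]
detections = level0_detect(raw)   # -> [0.9, 1.1, 1.0, 0.95]
state = level1_track(detections)
print(f"L0 detections: {detections}, L1 state estimate: {state:.3f}")
```

In a real system, the Level 1 stage would perform data association and track filtering (for example, a Kalman filter); the smoothing here only stands in for that continuous state estimation.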
From these AI types and information fusion levels, there is a need to further enhance such systems to adapt and respond in support of multiple users, various situations, and distributed machines; this development drives automation, augmentation, and autonomy.
A key aspect of the data information fusion group
model is the ability to align physics-based and human-derived information fusion over sensed data for action.
Scerri et al. (2015) developed a context-aware situation
analysis device. Information fusion methods seek a
similar goal. Physical data processing includes signal processing, feature extraction, and decision making, while human-derived information is analyzed with logic, symbols, and commands, as shown in table 4. AI methods have enhanced the ability to process data, and recent results in AI leverage logical and semantic rules.
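One way to picture this alignment is a decision that combines a physics-based score (signals to features to decisions) with human-derived symbolic rules (logic and commands). The sketch below is a hypothetical illustration under invented assumptions: the rule names, weights, and threshold are not from the article.

```python
# Hypothetical sketch of aligning physics-based and human-derived
# information fusion: a numeric sensor score is combined with symbolic
# mission rules. Rule names and weights are invented for illustration.

def physics_score(signal):
    """Physics-based path: a simple energy feature mapped to a bounded score."""
    energy = sum(x * x for x in signal) / len(signal)  # feature extraction
    return min(energy, 1.0)                            # soft decision in [0, 1]

def human_rules(context):
    """Human-derived path: symbolic commands expressed as weighted rules."""
    score = 0.0
    if context.get("commander_priority") == "high":
        score += 0.4
    if "restricted_zone" in context.get("flags", ()):
        score += 0.3
    return score

def fused_decision(signal, context, threshold=0.8):
    """Act only when joint physical and symbolic evidence is strong."""
    return physics_score(signal) + human_rules(context) >= threshold

ctx = {"commander_priority": "high", "flags": ("restricted_zone",)}
print(fused_decision([0.6, 0.7, 0.5], ctx))  # weak signal, strong context
```

The design point is that neither path alone crosses the threshold here; the fused decision fires only because the symbolic context reinforces the weak physical evidence.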
Data | DDDAS (Autonomy) | Example
Data at rest | Statistical algorithms | Information fusion (Liu, Z. et al. 2018)
Data in collect | High-dimensional model learning | Road networks (Yang and Blasch 2008)
Data in transit | Systems software computing | Container-based agents (Wu et al. 2016)
Data in motion | Instrumentation and control | Imagery collection (Blasch et al. 2018)
Data in use | Human–machine AI | User-defined operating picture (Blasch 2013)

Table 2. Data Management for Context-Based AI.