How Does an Analyst Select M&S to Support the Entire DoD Acquisition Lifecycle Process?


What Should Be: Common Model Framework

While the M&S community works hard to produce solutions that support the needs of the analytical community, modelers and simulation developers often fall into the trap of focusing narrowly on their particular domain. Intent on building exactly what the analyst needs modeled or simulated, they rarely pause to leverage existing representations of the same phenomena. Reuse and composability are perennially hot topics, but persistent barriers have kept both ideals from becoming reality.

The idea of a framework that brings models together as needed is not novel. Some might argue that various simulations have served as de facto frameworks to that end. For example, we continue to develop specialized terrain to support the needs of individual simulations and to recreate physical representations that support kinetic warfare. We do this because “our” particular simulation was not built to use “your” model, owing to differences in fidelity, format, or data. Software engineering, by contrast, relies on libraries that become canonical representations of their functions and can be revised as necessary. Why aren’t we using this approach for simulation development?
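To make the library analogy concrete, consider the minimal sketch below: a single canonical interface that any simulation could link against, with expert-built models behind it. All names here (PhenomenonModel, RainAttenuationModel, the state keys) are hypothetical illustrations, not an existing standard.

```python
# A minimal sketch, assuming a canonical model interface; nothing here
# reflects an existing DoD standard or API.
from abc import ABC, abstractmethod
from typing import Dict

class PhenomenonModel(ABC):
    """Canonical, simulation-agnostic contract for one modeled phenomenon."""

    @abstractmethod
    def step(self, state: Dict[str, float], dt: float) -> Dict[str, float]:
        """Advance the phenomenon by dt seconds and return the new state."""

class RainAttenuationModel(PhenomenonModel):
    """Toy stand-in for an expert-built model of RF signal loss in rain."""

    def step(self, state: Dict[str, float], dt: float) -> Dict[str, float]:
        # Illustrative physics only: loss grows with rain rate and path length.
        loss_db = 0.01 * state["rain_rate_mm_hr"] * state["path_km"]
        return {**state, "signal_dbm": state["signal_dbm"] - loss_db}
```

Because every simulation would program against the same step contract, an updated rain model could be swapped in, just as a revised library is, without touching the simulations that use it.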

Imagine a paradigm where an analyst can pull together models that represent the phenomena necessary to replicate the problem space being explored, each produced by experts in its particular field. This would certainly require some level of regulation and a serious Verification, Validation & Accreditation discussion, but we save that topic for a future paper. From the analyst’s point of view, a common source of models is key to a fair comparison of two systems in a relevant operating environment. Furthermore, being able to pull together those same models for the next analysis, or to run updated models against the parameters of the original analysis, would add great analytic rigor. A potential solution is a repository that houses the models themselves and lets another analyst build on what was previously done, rather than the perishable descriptions of models and simulations we rely on today.
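What might that repository’s interface look like? The sketch below assumes a simple name/version scheme; the classes and fields are hypothetical and describe no existing system.

```python
# A toy, in-memory stand-in for the proposed shared model repository.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class ModelRecord:
    name: str
    version: str
    artifact_uri: str        # where the runnable model itself lives
    parameters: Dict[str, float] = field(default_factory=dict)  # inputs of the original study

class ModelRepository:
    def __init__(self) -> None:
        self._records: Dict[Tuple[str, str], ModelRecord] = {}

    def publish(self, record: ModelRecord) -> None:
        self._records[(record.name, record.version)] = record

    def fetch(self, name: str, version: str) -> ModelRecord:
        return self._records[(name, version)]
```

Rerunning an updated model under the original conditions then reduces to fetching, say, version 2.0 of a model and seeding it with the parameters stored alongside version 1.0: precisely the reproducibility that a perishable textual description cannot offer.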

The paradigm of distributed simulation arguably provides some level of reuse of models and simulations; however, as discussed, taking a black-box approach to simulation interactions leads to interoperability issues and does not support reuse of the fundamental models. Part of the challenge lies in defining the primitives from which those fundamental models would be built. The breadth of M&S uses across acquisition adds a further challenge: the models used for a system-level analysis normally differ from those used for a force-on-force analysis, but must they? We need to derive environment representations from a canonical source, free of the data translation errors that plague terrain generation, simulation gateways, and the like. This is an area ripe for serious research and demonstration to establish where the state of the art really is.
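One way to picture the canonical-source idea: a single authoritative terrain service that every simulation queries at whatever resolution it needs, rather than each regenerating and translating its own terrain. CanonicalTerrain and its fields below are invented for illustration only.

```python
from typing import List

class CanonicalTerrain:
    """Toy single source of truth for elevation; a real service would also
    carry datum, projection, and feature data to prevent translation errors."""

    def __init__(self, elevation_grid: List[List[float]], cell_size_m: float):
        self.grid = elevation_grid       # authoritative elevation posts
        self.cell_size_m = cell_size_m

    def elevation_at(self, x_m: float, y_m: float) -> float:
        # Nearest-post lookup; a production service would interpolate.
        col = int(round(x_m / self.cell_size_m))
        row = int(round(y_m / self.cell_size_m))
        return self.grid[row][col]
```

An engineering-level simulation and a force-on-force simulation could both call elevation_at rather than each converting its own terrain database, eliminating one class of translation error by construction.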

Furthermore, models and simulations are worth little without the data that drive them. Numerous activities in the US and the North Atlantic Treaty Organization (NATO) are addressing data generation, collection, and storage. What remains to be seen is how the M&S space can effectively tie into these efforts, especially in taking advantage of data as it emerges from the battlespace. Would a common framework for models better support this linkage and, in turn, the lifecycle? In addition, while we have discussed the need to represent the kinetics of warfare, research is also needed to better support scenario development. The lack of standardization in scenario generation across simulation environments keeps us from easily sketching out a mission for execution. Would a common model framework ease this problem as well?
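As a thought experiment, a standardized, simulation-neutral scenario might be as small as the sketch below; every field name and value is hypothetical, since no such common vocabulary exists today.

```python
# A minimal sketch of a tool-neutral scenario description (all fields invented).
scenario = {
    "mission": "route_reconnaissance",
    "environment": {
        "terrain_source": "canonical_terrain_v3",  # ties back to the common source above
        "weather": {"rain_rate_mm_hr": 4.0},
    },
    "forces": [
        {"side": "blue", "unit": "scout_platoon"},
        {"side": "red", "unit": "mech_infantry_company"},
    ],
    "measures": ["time_to_detect", "loss_exchange_ratio"],
}
```

Any compliant simulation could, in principle, compile this one description into its native mission file, removing the per-tool scenario rework that makes sketching out a mission so laborious today.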

It is our belief that advances in technology are beginning to solve the problems of computational power, data storage, and distributed access to models and simulations. What remains is a concerted effort to define the capabilities an analyst actually needs to do the job, independent of the current methods and means used to produce and employ M&S. Arguably, better defining how M&S could best support the acquisition lifecycle will let us move forward deliberately rather than continue a slow evolution.

Conclusions

Challenges remain to enabling analysts to select M&S throughout the entire DoD Acquisition Lifecycle Process, but by examining where we have been and where we are now, we can make recommendations for where we should be. The challenges we have identified will persist as long as M&S is designed, developed, and employed as it has been. The EASE research project set out to address many of the challenges we identified through our experience with the MATREX program, and it succeeded against many of them. Unfortunately, while EASE provides many benefits, it cannot fully enable true composability and reuse as long as M&S continues to be developed on disparate timelines and for disparate purposes. If analysts define their ideal system, the Science and Technology community can demonstrate which technologies can achieve that vision, driving toward a more useful paradigm for the future.

