Computer Supported Training Solutions: Discussion of a New Framework for Effective Development and Deployment

Source: DVIDS, https://www.dvidshub.net/image/680983/computer-graphics-verification

Posted: July 21, 2016 | By: Dr. Amela Sadagic, Maj Matthew C. Denney

The Way Ahead: Elements of Proposed Framework

A critical component of success for any complex undertaking is a well-selected and well-connected set of framework elements that all participants in the process support and observe. We provide here a list of recommendations for the most prominent elements of a proposed Framework for Computer Supported Training Solutions in the military training domain. The examples and discussions in this text are drawn from the US Marine Corps (USMC) domain; however, they are very much applicable to the situations and institutions of other services and the DoD in general.

Comprehensive knowledge about training audience, conditions, objectives and standards: Before a specific technical solution is even proposed, the following should be established and defined: a clear identification of the training audience and its characteristics; training objectives; descriptions of the environment and conditions under which the tasks are to be performed; the skills, knowledge, performance standards, expectations and level of expertise to be gained; identified training gaps; future (projected) training needs; users’ value system, concerns and priorities; and skill retention and decay rates. The usability tests and tests of training effectiveness done later on should be conducted using these same parameters and requirements. We also advise acquiring a detailed understanding of the characteristics of the users – their interests, attitudes, expectations, technical skills, and personal ownership of digital devices, to name a few. The latter provides clear insights into users’ interests and motivations, but also into the untapped skills they possess and the expectations they may have of their work environment. Today’s users are unlike any group in the past – a good illustration of the characteristics of this community comes from a study conducted in Summer 2013 [4][5], which identified young Marines as owners of three digital devices – a smartphone (90.91%), a game console (78.64%) and a laptop/desktop computer (73.18%) – with internet connections in their rooms (81.36%), and as avid users of social networking sites (Facebook – 90%), email (85.90%), and first-person shooter games (77.27%) (Figure 3).
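The principle above – fix the training parameters before a solution is proposed, then reuse the same parameters for later effectiveness tests – can be sketched as a simple data record. This is a minimal illustration only; all names and values here are hypothetical and do not describe any fielded system or official form.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TrainingRequirement:
    # Hypothetical record of parameters defined up front, before any
    # technical solution is proposed or purchased.
    audience: str            # who is being trained
    objective: str           # task to be performed
    conditions: str          # environment/conditions for the task
    required_score: float    # measurable performance standard (0.0-1.0)
    retention_days: int      # expected skill-retention window

def meets_standard(req: TrainingRequirement, measured_score: float) -> bool:
    """Later effectiveness tests evaluate against the same pre-defined standard."""
    return measured_score >= req.required_score

# Usage: defined once, reused by both usability and effectiveness evaluations.
rifle_qual = TrainingRequirement(
    audience="entry-level Marines",
    objective="known-distance rifle qualification",
    conditions="daylight, known distance",
    required_score=0.8,
    retention_days=180,
)
```

Because the record is immutable, the standard used to judge a fielded system is guaranteed to be the same one agreed upon before development began.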

Figure 3: Profile of a young Marine [4]

Domain support for research efforts: All research projects, especially those categorized as applied research, require extensive consultation with subject matter experts (SMEs). The military community (sponsors, transition customers, base leadership) should make additional efforts to ensure that research teams get access to a variety of information sources and opportunities for consultations with SMEs. We also see the need for extended relationships between research teams and SMEs/practitioners in order to create true partnerships. This approach enables additional dimensions of the research effort: interaction and collaboration between researchers and practitioners allows easier and more direct integration of project results into the community of training providers and end users. The connections, trust and credibility built through that collaboration will be an important factor in the adoption process. If domain practitioners are given an opportunity to be partners in the project and recognize themselves as co-owners of the process or the results of that work, they are more likely to adopt the results and promote the value of that effort afterwards. Likewise, the same long-term collaboration gives researchers a unique opportunity to gain closer insight into the extensive expertise of practitioners, which maximizes the probability of achieving highly valuable and relevant results in their own work.

Work with domain (expert/end) users: If research studies need to involve a specific user population, the military community should invest maximum effort to ensure that research teams get access to domain (expert/end) users. This is a necessary condition for ensuring that the results obtained in the studies are indeed relevant to the training needs and characteristics of the targeted user population. In addition to engaging domain users in larger user studies, there is a need to embrace and support smaller-scale tests of prototype solutions – this is the only way in which different solutions can be perfected over time and have a better chance of being widely deployed at the end of their development and testing cycle.

Recognition and support of systems interoperability: Those responsible for development process methods, Universal Needs Statements (UNS), research projects and trade show purchases must be made aware of the need for interoperability before products are made or purchased. They should be familiar with the requirements for true interoperability when solutions need to exist and work in concert with other solutions. For example, PMTRASYS has adopted the Army’s Common Training Instrumentation Architecture (CTIA), which standardizes the components of that system. At the basic level this means that all solutions will have to co-exist in each other’s space – if one buys a target system from one company, a controller from another company has to be able to operate that same target without requiring the purchase of another target system. Similarly, all graphical simulations and their 2D viewers must support a common mapping system. This requirement will ensure a basic level of compatibility and interoperability. A second modification should be the requirement that development and purchasing follow the Joint Capabilities Integration and Development System (JCIDS) acquisition process [9], as well as the inclusion of the Marine Corps Operational Test and Evaluation Activity (MCOTEA) [10]. While the UNS submitter, research team or purchaser may not require system connectivity, Training and Education Command (TECOM) does, so the initial requirement should be modified to align with the TECOM requirement. Operational systems (weapons, computer command systems, etc.) are developed to support the Combatant Commanders’ Contingency Plans (responses to potential crises), so training systems should be developed to support a TECOM Training Plan.
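The target-and-controller example above is, in software terms, interoperability through a shared interface: each vendor keeps its internals proprietary, but every product implements the same published contract. The sketch below illustrates the idea only; the class and method names are hypothetical and do not reflect CTIA’s actual interfaces.

```python
from abc import ABC, abstractmethod

class TargetSystem(ABC):
    """Hypothetical common interface every vendor's target must implement."""
    @abstractmethod
    def raise_target(self) -> None: ...
    @abstractmethod
    def lower_target(self) -> None: ...
    @abstractmethod
    def hits(self) -> int: ...

class VendorATarget(TargetSystem):
    """One vendor's target: internals proprietary, interface common."""
    def __init__(self) -> None:
        self.exposed = False
        self._hits = 0
    def raise_target(self) -> None:
        self.exposed = True
    def lower_target(self) -> None:
        self.exposed = False
    def record_hit(self) -> None:
        # Vendor-specific hit sensor; only counts while the target is exposed.
        if self.exposed:
            self._hits += 1
    def hits(self) -> int:
        return self._hits

class VendorBController:
    """A different vendor's controller operates ANY TargetSystem,
    because it depends only on the shared interface."""
    def __init__(self, target: TargetSystem) -> None:
        self.target = target
    def run_engagement(self) -> int:
        self.target.raise_target()
        # ... engagement occurs while the target is exposed ...
        self.target.lower_target()
        return self.target.hits()
```

The design point is that the controller never references `VendorATarget` directly, so replacing the target with another conforming vendor’s product requires no change to the controller – the software analogue of the purchasing requirement described above.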

System development and deployment: What does it include and who is responsible? Fielding systems to bases is sufficient for contract and maintenance support; however, for a system to be most effectively utilized by the training audience, a TECOM Formal Learning Center (FLC) is the ideal system proponent, responsible for both development and post-fielding support. It is recommended that the system and the contractors who support its operation become the responsibility of the FLC. It is also recommended that the FLC actively participate in the GAT to ensure that the system provides proper training prior to fielding. FLCs have processes identified in the Systems Approach to Training (SAT) manual, including Learning Analysis and Learning Objective Development, which assist the instructor staff in developing instruction, training and evaluation. The same documents would prove useful in training system development. For example, the DVTE Combined Arms Network would allow an aircraft to fly through an active artillery trajectory without alerting the users; that same activity and situation would not be allowed by the Expeditionary Warfare Training Group’s (EWTG) Fire Support Coordination Course. If the EWTGs had been involved in the development of DVTE from the beginning, this inconsistency would most likely have been avoided.

Certification of instructors (contractors): It is our firm belief that all individuals who provide instruction with any training system, including computer supported training systems, should go through regular certification and recertification processes – this also applies to contractors who operate the systems or provide instruction. Their level of expertise and readiness should be subjected to the same level of scrutiny imposed on any professional performer in the service. Additionally, the responsibility of each instructor (contractor) should not end with providing instruction – they should also actively look for any instance of negative training transfer that may occur, even if such trends were not initially registered in the system. For example, a DVTE operator certified by an EWTG would identify the conflict between the aircraft flight path and the artillery trajectory if the system (in this case the DVTE Combined Arms Network) were incapable of doing so.

Fielding the systems: Ideally, a system should be fielded with a library of tested progressive scenarios and assessment forms, just as if the system were part of the curriculum. Users should be requested to provide feedback to the FLC through a well-established mechanism, a version of the Instructor Rating Form, and this process should be required by the SAT manual. The FLC would then include the utilization of the system in its Course Content Review Boards (CCRB), where the FLC and operational units regularly review and revise the curriculum. Supervisor and Graduate Surveys and other sources of feedback from system users and supervisors would, together with the CCRBs, ensure that these systems continue to provide necessary and valued training.

Comprehensive support for large scale deployment and adoption of training solutions: Large scale adoption and use of a training solution by all (or almost all) members of a training community is needed when that group decides to adopt a qualitatively different way of accomplishing a task, with the objective of achieving better training results and supporting training situations that are not possible with other means. Those were the very reasons why, for example, all pilots today use flight simulators in their training. The hard lessons we learned from our extensive engagement in the domain of computer supported training simulations suggest that the adoption of a novel technology cannot be left alone and unattended. The expectation that people and institutions will recognize the value of a novel technology on their own, and that large scale adoption will follow, regularly remains just that – an expectation. Considerable effort on promotion, demonstration of value in and by the peer community, strong communication channels and supporting infrastructure are all needed if full success is to be reached [11][12][4][5].
The factors that influence adoption of training solutions range widely: technical characteristics of the systems; human factors (usability, user acceptance and attitudes); leadership endorsement and support; communication channels used to promote the solution in the military community; the human network (user community) and the active engagement of a larger number of agents of change and their aides to support the spread of ideas and adoption; and well-resolved elements of the training domain itself – the existence of a full training package (a training solution/system plus tested advice on how to use it effectively), easy access to the training solution and an unlimited number of training opportunities unrestricted by location and time, good throughput, a train-the-trainer program, a more active and changed role for simulation centers (distributing their expertise across the units), challenge programs and competitions, and a diverse set of ‘push’ strategies instead of reliance on ‘pull’ approaches [4][5].

Harnessing the experiences and insights of users: Once a system is fielded, users acquire invaluable insights through long-term use; their experiences could be of great value both to system makers and to research teams. It is our understanding that this experience-based knowledge rarely, if ever, gets reported or utilized, and issues that could have been addressed in new solutions remain unexamined and unimproved. Ideally, the SAT manual should either require this long-term review process for training systems used at FLCs (in addition to the currently required review of instructional material), or PMTRASYS should partner with the Marine Corps Lessons Learned System (MCLLS) to gather data on training systems much as MCLLS gathers data on operational systems.
