Users’ Perspective: Current Practices
The user community provides input and engages in developing and acquiring computer-supported training solutions in several ways. One path begins when the user community identifies a need and creates a Universal Needs Statement (UNS), which is submitted up the operational chain of command. Upon approval, the requirement is assigned to Marine Corps Systems Command’s (MARCORSYSCOM) PMTRASYS. This process, for example, is how the High Mobility Multipurpose Wheeled Vehicle Egress Assistance Trainer (HEAT) was developed and fielded to Marine Corps bases. A different approach was used to develop the Deployable Virtual Training Environment (DVTE); much of this suite resulted from research efforts funded by the Office of Naval Research. In that effort, university and corporate research teams involved users in several ways: users participated in a task analysis effort, acted as Subject Matter Experts in consultations and system evaluations, and took part in user studies. The prototypes were then ‘productized’ to make them robust and ready for operational use. Other elements of the DVTE suite were purchased and added to the set, and the entire suite was fielded to USMC bases. A third option for adopting training solutions is industry development and demonstration at trade shows, such as the Interservice/Industry Training, Simulation and Education Conference (I/ITSEC), where technology demonstrations and subsequent purchases take place.
Funds used to develop the systems are meant to do just that – support the development phase. Another type of funding – support funding – covers maintenance of the current version of adopted systems and contractor support. As Tactics, Techniques and Procedures (TTP) change and new operational systems are fielded, there is rarely a process to identify or fund the modifications required to keep already fielded training systems current. Additional issues negatively influence the effectiveness of these solutions; we list only a few of the most significant:
(1) Systems are typically fielded without progressive scenarios (crawl, walk, run).
(2) Documentation consists of a technical manual at most, with no manual offering tested advice on how to use the system most effectively in training practice.
(3) Systems do not come with unit assessment methods that would help evaluate the effectiveness of training solutions used by a given training audience.
(4) Job descriptions for contractor support personnel include requirements for relevant experience, but contract documents include no advice or requirements for a process through which support personnel would maintain currency with the evolving operational environment.
(5) System interoperability is frequently not requested in the UNS, resulting in situations such as that of the Supporting Arms Virtual Trainer (SAVT), which was not designed to be compatible with aircraft simulators that could “fly” Close Air Support (CAS) missions in support of Tactical Air Control Party (TACP) training.
(6) Government Acceptance Tests (GAT) focus only on system performance, not on user performance.
(7) A full Verification, Validation, and Accreditation (VV&A) is requested, yet it is rarely conducted.
(8) Most systems are not tested for their training effectiveness prior to deployment.
Consequently, it is very hard for PMTRASYS to know, prior to fielding, whether users will actually benefit from the systems being fielded. In addition, post-fielding user surveys are rarely conducted. The result of these and other similar issues is that training forces are very reluctant to supplement, let alone replace, their current training approaches with computer-supported training solutions.