

COMPUTER-BASED SUPPORT FOR THE EVALUATION OF SHIP HANDLING EXERCISE RESULTS
M. Baldauf, K. Benedict, C. Felsenstein, M. Kirchhoff
(Hochschule Wismar, University of Technology, Business and Design, Dept. of Maritime Studies Warnemünde, R. - Wagner - Str. 31, 18109 Rostock, Germany; k.benedict@sf.hs-wismar.de)
 
 Abstract: One of the most important parts of the simulator exercise is the evaluation of the students' results by the instructor, both during and after the training session. This should be performed in two ways: first, during the exercise run to ensure that the training objective can be achieved, and second, after exercise completion in order to give the students an indication of their performance during the simulation run. For these purposes software tools have been created and implemented at the Maritime Simulation Centre in Warnemünde: the "Surveillance Tool" allows for monitoring during the run, and the "Evaluation Tool" enables a detailed evaluation by the instructor after the run. It allows for in-depth search within the replay data and at the same time facilitates the calculation of the final score for the student's performance based on measured parameters, with penalties for exceeding quality limits. Within this paper a brief overview of the principles of these methods is given and selected examples of applications are described.
 
1. INTRODUCTION
 The role of simulators in the education and training of seafarers has become more important over the last decades. Simulators are now used for the purpose of improving knowledge and assessing competencies according to the STCW convention.
 
 The common procedure in using simulators for training is composed of three elements and normally all of these elements are guided and monitored by instructors:
 
・Briefing: The trainees are prepared for the exercises and familiarised with the scenario and its objectives, and the instructor outlines the optimal results that can be achieved.
 
・Simulation / Exercise run: During the exercise run the instructor's role is to check progress by observing student action and at the same time take notes or fill in checklists about occurrences or actions of the student so that advice and remarks (which cannot be seen in the record files for replay alone) can be offered afterwards.
 
・Debriefing: In the debriefing session the instructor normally assesses the results of the trainees by using the replay to discuss the results and additionally referring to their notes and checklists.
 
 The need for and the advantage of Computer Based Evaluation (CBE) in conjunction with automatic assessment elements as a tool to support the instructors during their work is obvious if:
 
・Many trainees have to undergo simulator training at the same time. Thus the role of Computer Based Training (CBT) as a simulation tool is emerging as a self-study activity without direct involvement of instructors. For this type of training the CBE is a vital component which automatically assesses the student's results.
 
・More than one exercise and trainee group are dealt with at the same time (e.g. parallel simulator sessions as in the MSCW). Therefore the instructor has a remarkable workload to deal with while taking charge of all the simulator groups at the same time.
 
 For this reason the Wismar University has started to look into and develop methods for CBE and is testing software modules in the Maritime Simulation Centre Warnemünde.
 
 This paper offers a general overview of the topic. Some aspects of the development and implementation of CBE into simulator training at the MSCW are discussed, and two examples from the areas of collision avoidance and emergency manoeuvres in the event of "man overboard" are illustrated.
 
2. SIMULATOR FACILITIES AT THE DEPARTMENT OF MARITIME STUDIES
 The Maritime Simulation Centre Warnemünde at Wismar University, Department of Maritime Studies in Rostock-Warnemünde accommodates six simulators embracing a common network and comprised of four ship-handling bridge systems with differing levels of equipment, a ship's engine system and a VTS facility (see Figure 1). The interaction of many components is a principal feature of the centre. At the same time, it additionally provides an ideal platform for a wide range of research and development.
 The Ship Handling Simulator (SHS) is located on the first and second floors of the Centre.
 
・Bridge 1 is capable of simulating a full range of ship handling operations - it essentially consists of a fully integrated replica bridge assembly.
 
・Bridge 2 has a similar projector-based 257° visual display system which can be specifically used for manoeuvring a ship from the bridge wing during coming-alongside or tow-boat operations.
 
・The remaining two units, Bridges 3 and 4, are used mainly as radar cabins, each being additionally equipped with 120° visual display screens.
 
・An additional feature of the system is a facility for computer-based instructor-less training, for which there are four separate exercise stations, equipped with handles for rudder and engine telegraph. These provide trainees with pre-programmed ship handling scenarios and with a secondary screen posing multiple-choice questions for assessment by a special scoring method.
 
 The Ship Engine Simulator (SES) is housed on the Centre's basement floor and is a replica of a typical modern plant, representing a main engine of 22,000 kW (two-stroke, five-cylinder diesel engine, type 5 RTA 84 C by SULZER). Equipped with identical control panels and displays in an engine control room and an engine room, it simulates not only the main engine but also all auxiliaries of the ship's full machinery system.
 
 The VTS Simulator (VTSS) is located on the middle floor. This simulator covers all aspects of radar and AIS transponder-supported traffic surveillance and associated communications disciplines.
 
 Its control section consists of three main instructor consoles. The trainee section is comprised of nine sub-stations, or 'VTS basic units'. They can be configured to form specific work places or alternatively, up to three VTS centres acting either in parallel or within the same operating environment. Additionally there are two instructorless training stations.
 
 Networking of all six simulators at the centre is done via a Complex Operation Mode, which allows for a comprehensive overview of all maritime traffic operations for training and research.
 
 Parallel to the exercises, briefing/debriefing can be done with part-task training or full exercise replays. Similarly, the largest of the four, Bridge 1, can also be directly interfaced to the Ship Engine Simulator and so replace its own integral ship engine module with the equivalent of a full-mission SES. Finally, all bridge assemblies can in turn be directly linked to the VTS simulator in order to facilitate combined training of VTS operators, crews and pilots.
 
 Advantages and examples of this complex operation were discussed in [2]. This complexity is one of the reasons to look for new CBE methods in order to make greater use of the inherent capacity of the system which is also essential for improving the quality of the training and helping to decrease the workload of the instructor.
 
3. CONCEPT AND POTENTIAL ELEMENTS OF A COMPUTER BASED EVALUATION TOOL
3.1 Elements of simulation in training
 
 This chapter describes the way simulation is used for training and how the computer can support the instructor by means of automatic evaluation, complementary to the more conventional methods. Figure 2 shows which elements need to be developed in order to use the simulation process for CBE. On the left side the basic elements of the simulation process are listed; on the right side the elements of the assessment process are noted separately. This basic scheme of an evaluation schedule was prepared for general use and can be modified for various concrete simulation exercises. Some elements will now be described in more detail.
 
 Scenario design and assessment criteria: The scenario should be designed in such a way as to cover all situation elements the trainee should be confronted with in order to prove his competence and to achieve an adequate standard of training, i.e. by reacting with sufficient skill and according to the quality standards. The quality is measured by specific parameters (e.g. speed, distances to other objects, etc.), for which criteria are to be defined as limit values or thresholds. This is one of the crucial points in simulation, because for CBE these parameters and criteria have to be defined clearly in advance and fed into the software before starting the exercise. However, the COLREGS mostly use soft terms (e.g. "proper action", "ample time") instead of precise definitions.
 
 For CBE it is essential that the strategy is agreed upon first i.e. that the objectives for assessment and the methods, parameters and criteria to be applied for accurate assessment, are chosen beforehand. This requires a clear specification of conditions in the scenario and clear definition of criteria as well as preparation of assessment procedures/programs.
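As an illustration only, such agreed limit values might be encoded as simple data records before the exercise starts. The parameter names, thresholds and penalty weights below are hypothetical, not the actual MSCW configuration:

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """One quality limit agreed upon before the exercise run."""
    parameter: str   # name of the recorded signal, e.g. "speed_kn"
    limit: float     # threshold value
    mode: str        # "max" = value must stay below, "min" = must stay above
    penalty: float   # points deducted per violation

    def violated(self, value: float) -> bool:
        return value > self.limit if self.mode == "max" else value < self.limit

# hypothetical criteria for a collision-avoidance scenario
criteria = [
    Criterion("speed_kn", 12.0, "max", 5.0),   # speed limit in the fairway
    Criterion("cpa_nm", 0.5, "min", 10.0),     # minimum closest point of approach
]

print(criteria[0].violated(13.2))  # → True: speed above the limit
print(criteria[1].violated(0.8))   # → False: CPA still safe
```

Fixing such records in advance is exactly the "clear definition of criteria" the strategy step demands; the soft COLREGS terms must be translated into numbers like these before the software can evaluate anything.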
 
Figure 1 
Maritime Simulation Centre at Warnemünde (MSCW) which comprises three interfaced simulator segments for ship handling, ship engine and VTS
 
 Briefing: the trainees are to be prepared for the exercises and familiarised with the scenario. The instructor will advise the trainee about the objectives of the exercise and inform him of the highest score that can possibly be achieved. Sometimes the students have to bring in their own ideas and knowledge during this phase, e.g. if the planning of potential actions is part of briefing - as in passage planning. This could be the first part of the assessment. Now, if necessary, the parameters and criteria for the simulation can be readjusted.
 
 Exercise run with online assessment and surveillance: During the exercise run the instructor controls the exercise, simultaneously responding to any communication from the bridges as if he were a partner on a traffic vessel or ashore, and also has to control and steer the other target vessels. At the same time he must check the trainee's progress by observing his actions and taking notes and/or filling in checklists. In order to decrease the instructor's workload during this phase and to increase the quality of the training, a specific "Data Surveillance Tool" was designed at the MSCW, already described in [1].
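A minimal sketch of such online surveillance, assuming the simulator delivers periodic samples of the monitored parameters (the function name, sample format and limit values are illustrative, not the actual tool's interface):

```python
def surveil(samples, limits):
    """Scan a stream of (time, parameter, value) samples and report
    every moment a monitored parameter leaves its allowed range."""
    alerts = []
    for t, name, value in samples:
        lo, hi = limits.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            alerts.append((t, name, value))
    return alerts

# hypothetical recorded samples: time [s], parameter, value
samples = [(0, "speed_kn", 10.5), (30, "speed_kn", 13.1), (60, "ukc_m", 0.8)]
# allowed ranges: speed 0-12 kn, under-keel clearance at least 1 m
limits = {"speed_kn": (0.0, 12.0), "ukc_m": (1.0, float("inf"))}

print(surveil(samples, limits))
# → [(30, 'speed_kn', 13.1), (60, 'ukc_m', 0.8)]
```

In the real tool such alerts would be presented to the instructor during the run rather than collected in a list, but the principle of comparing live data against pre-defined limits is the same.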
 
 Debriefing and off-line assessment: In the debriefing session the instructor normally assesses the results of the trainees by using the replay to discuss the results and additionally referring to his notes and checklists.
 
Figure 2 
General scheme for using simulators in maritime training and assessment
 
 A specific new "Offline Assessment Tool" was designed at the MSCW; it supplies the instructor with recorded measurement data. This semi-automatic tool increases the quality of the training and allows for a detailed evaluation of the trainees' results, e.g.:
 
・by comparing the ship's track with reference tracks, or
・by analysing the plotted parameters or more complex data (e.g. the risk level) during the exercise.
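Comparing the sailed track with a reference track can be reduced to a cross-track distance computation. The following sketch uses plain 2-D geometry on hypothetical waypoint data (not the actual evaluation tool's algorithm) and deducts penalty points whenever the deviation exceeds a limit, mirroring the penalty-based scoring mentioned in the abstract:

```python
import math

def cross_track_distance(p, a, b):
    """Perpendicular distance of position p from the reference leg a -> b."""
    ax, ay = b[0] - a[0], b[1] - a[1]
    px, py = p[0] - a[0], p[1] - a[1]
    leg = math.hypot(ax, ay)
    return abs(ax * py - ay * px) / leg  # magnitude of the 2-D cross product / leg length

def score_track(track, leg_start, leg_end, limit, penalty, start=100.0):
    """Deduct penalty points for every recorded position outside the corridor."""
    score = start
    for p in track:
        if cross_track_distance(p, leg_start, leg_end) > limit:
            score -= penalty
    return max(score, 0.0)

# hypothetical positions in metres relative to a local origin;
# reference leg runs along the x-axis, corridor half-width 30 m
track = [(0, 5), (100, 40), (200, 10)]
print(score_track(track, (0, 0), (300, 0), limit=30.0, penalty=10.0))  # → 90.0
```

A production tool would work on geographic coordinates and weight deviations by duration rather than per sample, but the idea of turning limit violations into score deductions carries over directly.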
 
 In the following chapters several examples are given in order to explain these functions in more detail.






