Performance Measurement of a Rule-Based Ontology Framework (ROF) for Auto-Generation of Requirements Specification

Amarilis Putri Yanuarifiani, Fang-Fang Chua, Gaik-Yee Chan
DOI: 10.4018/IJITSA.289997

Abstract

Documenting requirements specifications requires considerable effort from stakeholders and developers. Time and knowledge limitations are also obstacles to creating a structured requirements document. Our previous work proposed a framework for the automated generation of requirements specifications called the Rule-Based Ontology Framework (ROF). The requirements documentation phase produces two outputs: a process model following the Business Process Model and Notation (BPMN) standard and a Software Requirements Specification (SRS) document following the ISO/IEC/IEEE 29148:2018 standard. In this paper, we measure the performance of ROF in an IS project case study, which includes validating the ROF prototype by performing a User Acceptance Test (UAT), measuring effectiveness by counting notation errors and requirements errors, and measuring efficiency by calculating the time spent producing the documents. Effectiveness and efficiency are measured by comparing the BPMN graph and SRS document generated by ROF with the BPMN graph and SRS document created manually by the stakeholders.

Introduction

Requirements Engineering is a crucial process in software development (Khan et al., 2015). It consists of four main phases: elicitation, documentation, validation, and management. The elicitation phase gathers requirements from stakeholders and other sources and refines them in greater detail. It is the most difficult task and may increase the risk of project failure (Nisar & Nawaz, 2015).

The requirements list from the elicitation phase is written using natural language or conceptual models. A requirements document must be easily understood by both stakeholders and technical personnel (Verma & Kass, 2008) to ensure the written requirements match the organization's needs. On the other hand, the requirements document must also be structured formally enough that developers can translate it correctly into a programming language. Potential errors or mismatched requirements discovered during the code development process may increase project costs.

Our prior study (Yanuarifiani et al., 2019) proposed a Rule-Based Ontology Framework (ROF). ROF covers two phases of Requirements Engineering: elicitation and documentation. In the elicitation phase, initial requirements are collected using a gap identification method that involves stakeholders and developers. Using Kano's model, we prioritize the requirements to determine which requirements are eliminated and which are implemented. The final requirements are then stored in an ontology taxonomy called the Requirements Ontology (RO). Using the RO as input, ROF automatically generates two types of requirements documents: a semi-formal modeling document as a Business Process Model and Notation (BPMN) graph and a Software Requirements Specification (SRS) document following the ISO/IEC/IEEE 29148:2018 template.
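To make the generation step concrete, the sketch below maps a few simplified requirement records to BPMN task elements and to entries of an SRS section. It is illustrative only: the actual RO is an ontology taxonomy with richer relations, and all names and records used here (the dictionary fields, to_bpmn, to_srs_section, and the workload-management examples) are hypothetical, not part of ROF itself.

```python
# Illustrative sketch only; not the ROF implementation.
from xml.etree.ElementTree import Element, SubElement, tostring

# Simplified stand-ins for requirements stored in the Requirements Ontology.
requirements = [
    {"id": "REQ-01", "actor": "Manager", "action": "assign workload", "priority": "must-have"},
    {"id": "REQ-02", "actor": "Staff", "action": "submit timesheet", "priority": "must-have"},
]

def to_bpmn(reqs):
    """Map each requirement record to a BPMN <task> element inside one <process>."""
    definitions = Element("definitions", xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL")
    process = SubElement(definitions, "process", id="workload_management")
    for r in reqs:
        SubElement(process, "task", id=r["id"], name=f"{r['actor']}: {r['action']}")
    return tostring(definitions, encoding="unicode")

def to_srs_section(reqs):
    """Render the same records as numbered entries for an SRS requirements section."""
    lines = ["3. Specific requirements"]
    for i, r in enumerate(reqs, start=1):
        lines.append(f"3.{i} [{r['id']}] The {r['actor']} shall {r['action']}. ({r['priority']})")
    return "\n".join(lines)

print(to_bpmn(requirements))
print(to_srs_section(requirements))
```

The point of the sketch is the mapping direction: one structured requirement record feeds both the process model and the textual specification, which is what allows ROF to keep the two documents consistent.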

The application of ROF aims to increase the effectiveness and efficiency of requirements documentation and to minimize the risk of human error when preparing documents manually. All ROF phases are implemented as ROF prototype functionalities and used to develop a Workload Management Application as a case study. The aim of this paper is to verify that the ROF objectives are achieved by measuring ROF performance through three tasks. In the first task, we validate the ROF prototype by conducting a User Acceptance Test (UAT) of end-to-end functionality to ensure stakeholders receive all features and confirm the results. In the second task, effectiveness is measured by counting notation errors and requirements errors. In the third task, efficiency is measured by calculating the time needed to produce the requirements documents. The last two tasks compare the requirements documents generated by ROF with the requirements documents created manually by the stakeholders.
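As a rough illustration of how such a comparison could be computed, the snippet below counts errors and compares preparation times for a manually written document and a ROF-generated one. The metric definitions and the numbers are placeholders chosen for this sketch, not the study's actual formulas or results.

```python
# Placeholder metrics for the effectiveness/efficiency comparison described above.
def error_rate(notation_errors: int, requirement_errors: int, total_requirements: int) -> float:
    """Fraction of requirements affected by a notation or requirements error (lower is better)."""
    return (notation_errors + requirement_errors) / total_requirements

def time_saved(manual_minutes: float, rof_minutes: float) -> float:
    """Relative time saved when the documents are generated instead of written manually."""
    return (manual_minutes - rof_minutes) / manual_minutes

# Hypothetical figures for a manually written SRS vs. a ROF-generated one.
manual = {"notation_errors": 7, "requirement_errors": 4, "minutes": 480}
rof = {"notation_errors": 1, "requirement_errors": 2, "minutes": 90}
total_reqs = 40

print("manual error rate:", error_rate(manual["notation_errors"], manual["requirement_errors"], total_reqs))
print("ROF error rate:   ", error_rate(rof["notation_errors"], rof["requirement_errors"], total_reqs))
print("time saved:       ", f"{time_saved(manual['minutes'], rof['minutes']):.0%}")
```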

Section two presents a literature review on the concept of requirements documents and related evaluation methods. Section three explains the concept, implementation, and output of ROF in a case study. Section four explains the performance measurement: prototype validation, effectiveness measurement, and efficiency measurement. Section five discusses a summary of the performance measurement results and their analysis. Section six presents the conclusions of this paper.
