2005 Forum Home
Using Scenario Traceability to Support Scenario-Based Testing
Student: Leila Naslavsky, UCI/ISR
Advisor: Debra J. Richardson, UCI/ISR
Abstract: Software testing remains the central activity for ensuring that a system behaves as expected. It consists of three technical activities: test-case generation, execution, and evaluation. Some of these activities still lack proper automation. As a consequence of this lack of automation, combined with time-to-market pressures and cost constraints, developers must often release their products without adequate testing. To alleviate this problem, we suggest moving testing concerns to earlier development phases while better supporting developers in using actual end-users' expectations to test their systems. These expectations are often expressed as scenarios, so scenario-based testing should be used to show that the system satisfies them. One challenge for scenario-based testing is the need to map modeling concepts to code concepts. Scenarios are artifacts used throughout the development life cycle, in different forms and at different levels of abstraction, and can be mapped from one level of abstraction to another. Creating and maintaining the mapping across those scenarios can address this challenge and automate some scenario-based testing activities. This paper explains the uses of scenarios in software development phases, describes one way to relate scenarios across phases, and outlines future research.
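To make the idea of scenario traceability concrete, the following is a minimal sketch (not the author's actual tooling; all names such as Scenario, TraceLink, and derive_test_skeletons are hypothetical) of how trace links between a requirements-level scenario and its code-level refinement could be used to derive test-case skeletons automatically:

```python
# Hypothetical sketch: trace links across scenario abstraction levels
# drive automated derivation of test-case skeletons.
from dataclasses import dataclass, field


@dataclass
class Scenario:
    name: str
    level: str                      # e.g. "requirements", "design", "code"
    steps: list = field(default_factory=list)


@dataclass
class TraceLink:
    source: Scenario                # more abstract scenario
    target: Scenario                # its refinement at a lower level


def derive_test_skeletons(links):
    """For each requirements-level scenario traced to a code-level
    scenario, emit a test skeleton named after the requirement and
    bodied by the code-level steps."""
    tests = []
    for link in links:
        if link.source.level == "requirements" and link.target.level == "code":
            tests.append((f"test_{link.source.name}", link.target.steps))
    return tests


# Usage: a requirements scenario and the code-level scenario it maps to
withdraw = Scenario("withdraw_cash", "requirements",
                    ["insert card", "enter PIN", "request amount"])
withdraw_impl = Scenario("ATM.withdraw", "code",
                         ["ATM.authenticate()", "ATM.dispense(amount)"])
links = [TraceLink(withdraw, withdraw_impl)]
print(derive_test_skeletons(links))
# prints [('test_withdraw_cash', ['ATM.authenticate()', 'ATM.dispense(amount)'])]
```

Maintaining such links as scenarios evolve is what would let test-generation steps stay synchronized with the requirements they originate from.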
Bio:
Leila is currently a third-year Ph.D. student at the University of California, Irvine, working toward her degree under the supervision of Prof. Debra J. Richardson. She received her B.S. degree in Computer Science from the Universidade Federal de Pernambuco, Brazil, in 1998. Her current research interests include requirements-based testing, in particular scenario-driven testing and ways to maintain traces among software artifacts in order to improve software testing automation.