Workshop on Automation of Software Test (AST'06)

At 28th International Conference on Software Engineering (ICSE'06)


Shanghai, China, 23 May 2006

Sponsored by Avaya Labs, USA

Online Submission Website is now open:

Software testing is indispensable to all software development. Just as every mature engineering discipline needs systematic testing methodologies, software testing is a central subject of software engineering. In software development practice, testing accounts for as much as 50% of total development effort. It is therefore imperative to reduce the cost and improve the effectiveness of software testing by automating the testing process, which comprises many testing-related activities that use various techniques and methods.

Automation is the future of software testing and the key to reducing its cost. Over the past decades, a great deal of research effort has been devoted to automatic test case generation, automatic test oracles, and related problems. However, the current practice of software test automation is still mostly based on recording manual testing activities and replaying the recorded test scripts for regression testing. Bridging this gap between theory and practice will not only significantly improve the current state of software production, but also foster innovative research in the area. As the theories of software testing mature, deeper automation of the testing process becomes feasible. Indeed, a large number of software testing tools have been developed in the past few years and are now available on the market. However, few of them take interoperability into serious consideration. Software systems have become more and more complicated, with components developed by different vendors, built with different techniques in different programming languages, and even running on different platforms, yet few software testing tools can support all testing tasks within one tool. It is therefore timely and important to develop software test automation as a methodical, scientific discipline within software engineering.

The workshop will provide researchers and practitioners with a forum for exchanging ideas, experiences, understanding of the problems, visions for the future, and promising solutions. It will also provide a platform for researchers and developers of testing tools to work together to identify the problems in the theory and practice of software test automation, and to set an agenda and lay the foundation for future development.


The theme of the workshop is bridging the gap between the theory and practice of software test automation. The topics covered by the workshop include, but are not limited to, the following (more details can be found in the call for papers):

1) Methodology: Software test automation in the context of various software development methodologies, such as

*      traditional heavyweight methodologies,

*      rapid prototyping and evolutionary development methodology,

*      component-based software development, and object-oriented software development,

*      agile and test-driven methodology,

*      software architecture and product lines,

*      service-oriented software engineering, etc.

2) Technology: Automation of various testing techniques and methods used in test-related activities, such as

*      The techniques that enable various testing activities to be automated, such as

*      test case generation,

*      test oracle and test result checking,

*      test driver, stubs, harness and test script generation,

*      test adequacy and coverage measurement,

*      test effectiveness analysis and test report generation,

*      test related software artifact generation,

*      test maintenance and reuse, and management of testing activities and resources, etc.

*      The techniques that support various software testing methods, such as

*      structural testing,

*      functional testing, 

*      error-based testing,

*      fault-based testing,

*      partition testing and combinatorial testing,

*      random testing,

*      usability testing, performance testing, load testing and stress testing,

*      program-based testing,

*      specification-based testing,

*      model-based testing,

*      risk-based testing, etc.

*      Techniques that support the testing of various specific types of software in various application domains, such as for testing

*      Internet and Web-based applications, such as web services, peer-to-peer applications, grid systems, the semantic web, search engines, etc.

*      Database applications and information systems,

*      Systems software such as middleware, architecture and reference models, XML schemas, compilers, operating systems, etc.

*      Ubiquitous, pervasive and mobile computing systems,

*      Multimedia and hypermedia applications,

*      Security protocols and application systems, encryption and decryption algorithms,

*      Real-time systems, concurrent and parallel systems, communication systems and protocols, embedded systems,

*      Applications of AI techniques, such as data mining systems, machine learning algorithms, agents and multi-agent systems, ontologies, neural networks, etc.

3) Software testing tools and environments: Issues in the development, operation, maintenance and evolution of software testing tools and environments, such as

*      The functional, architectural and interface design of automated software testing tools and environments;

*      The construction of practical and prototype systems of automated testing and implementation issues;

*      Evolution of testing tools and environments as software artifacts evolve and change, and as software standards and development methods change;

*      Evaluation of software testing tools and environments;

*      Efficient and effective integration and interoperation of various types of software testing tools with one another, and with other software development and maintenance tools and environments, such as

*      model-driven development environments,

*      configuration management tools,

*      model-checking and software verification tools,

*      software metrics and measurement tools, etc.

4) Experiments, empirical studies, experience reports and visions of the future:

*      Experiments, empirical studies and comparisons of software test automation, and reports on real experience using automated testing techniques, methods and tools in industry, such as

*      the effectiveness of automated testing tools, methods and techniques, such as their fault-detection ability;

*      the cost of building the automation versus savings from the automation;

*      the usability of various techniques, methods and tools;

*      the identification of problems that hamper wider adoption of automated testing techniques, methods and tools;

*      the analysis and specification of requirements on automated software testing.


Papers submitted to the workshop must be unpublished original work. Papers must be written in English, formatted according to the ACM conference proceedings style, 5-7 pages in length, and in either PDF or PostScript format. Submissions must be uploaded to the workshop paper submission website before the deadline of 1 February 2006.

All submissions will be reviewed by three workshop PC members. Acceptance of papers for presentation at the workshop will be based on their merits and their relevance to the objectives of the workshop. Notification of acceptance/rejection will be made on 1 March 2006. Authors of accepted papers are required to submit a camera-ready version of the paper by 14 March 2006.


Accepted workshop papers will be published in the ICSE’06 conference proceedings by ACM Press. The proceedings will be distributed on site via USB stick or CD-ROM and will be available in the ACM Digital Library. Authors of accepted papers are required to register for the workshop and present their papers in order for the papers to be included in the proceedings and the ACM Digital Library. After the workshop, authors of selected best papers will be invited to submit significantly revised and extended versions of their papers for consideration for publication, either in a special issue of a journal or in an edited book, subject to another round of refereeing. Details will be announced at the workshop.


*      Submission deadline: 1 February 2006

*      Notification of acceptance: 1 March 2006

*      Camera-Ready version: 14 March 2006

*      Workshop date: 23 May 2006


Workshop Co-Chairs

Prof. Hong Zhu,

Department of Computing, Oxford Brookes University,

Oxford OX33 1HX, UK,


Tel: 0044 1865 484580, Fax: 0044 1865 484545


Dr. Joseph R Horgan

Telcordia Technologies,

One Telcordia Drive, RRC-1M322,

Piscataway, NJ, 08854, USA.


Tel: 732-699-2580, Fax: 732-336-7015


Prof. S.C. Cheung

Department of Computer Science,
The Hong Kong University of Science and Technology,
Clear Water Bay, Kowloon, Hong Kong.
Tel: +852 2358-7016; Fax: +852 2358-1477


Chair of Publicity and Local Organization

*      J. Jenny Li, Avaya Labs, USA



Program Committee

*      W.K. Chan, Hong Kong Univ. of Sci. & Tech., Hong Kong

*      T.Y. Chen, Swinburne University of Technology, Australia

*      Donghui Guo, Xiamen University, China

*      Rob Hierons, Brunel University, UK

*      Chandra Kintala, Stevens Institute of Technology, USA

*      J. Jenny Li, Avaya Labs, USA

*      Shaoying Liu, Hosei University, Japan

*      Aditya Mathur, Purdue University, USA

*      John May, University of Bristol, UK

*      Phil McMinn, University of Sheffield, UK

*      Allen Nikora, Jet Propulsion Lab, USA

*      Jeff Offutt, George Mason University, USA

*      Mark Segal, Telcordia, USA

*      Rudolph Seviora, University of Waterloo, Canada

*      T.H. Tse, The University of Hong Kong, Hong Kong

*      Hasan Ural, University of Ottawa, Canada

*      Ji Wang, National University of Def. Technology, China

*      Eric Wong, University of Texas at Dallas, USA

*      Martin Woodward, University of Liverpool, UK

*      Tao Xie, North Carolina State University, USA

*      Baowen Xu, Southeast University, China

*      Howell Yee, MIT Lincoln Lab, USA

*      Jia Zhang, Northern Illinois University, USA