Software Test Automation
Paperback by Dorothy Graham (et al.)
Language: English

€93.90* incl. VAT

Free shipping via Post / DHL

Delivery time: 1-2 weeks

Description
A sound and practical introduction to automated testing, this book presents a detailed account of its principles. The authors provide practical techniques for designing a good automated testing regime, along with advice on choosing and applying off-the-shelf testing tools for specific needs.
About the Author

Dorothy Graham and Mark Fewster are the principal consultant partners of Grove Consultants, which provides consultancy and training in software testing, test automation, and inspection. Mark Fewster developed the test automation design techniques that are the primary subject of this book; he has been refining and applying his ideas through consultancy with a wide variety of clients since 1991. Dorothy Graham is the originator and co-author of the CAST Report (Computer Aided Software Testing tools), published by Cambridge Market Intelligence, and co-author of Software Inspection, published by Addison-Wesley in 1993. Both authors are popular and sought-after speakers at international conferences and workshops on software testing.

Table of Contents

Preface
Part One: Techniques for Automating Test Execution
1 Test automation context
1.1 Introduction
1.2 Testing and test automation are different
1.3 The V-model
1.4 Tool support for life-cycle testing
1.5 The promise of test automation
1.6 Common problems of test automation
1.7 Test activities
1.8 Automate test design?
1.9 The limitations of automating software testing
2 Capture Replay is Not Test Automation
2.1 An example application: Scribble
2.2 The manual test process: what is to be automated
2.3 Automating Test Execution: inputs
2.4 Automating Test Result Comparison
2.5 The next steps in evolving test automation
2.6 Conclusion: Automated is not automatic
3 Scripting techniques
3.1 Introduction
3.2 Scripting techniques
3.3 Script pre-processing
4 Automated comparison
4.1 Verification, comparison and automation
4.2 What do comparators do?
4.3 Dynamic comparison
4.4 Post-execution comparison
4.5 Simple comparison
4.6 Complex comparison
4.7 Test sensitivity
4.8 Comparing different types of outcome
4.9 Comparison filters
4.10 Comparison guidelines
5 Testware Architecture
5.1 What is testware architecture?
5.2 Key issues to be resolved
5.3 An Approach
5.4 Might this be Overkill?
6 Automating Pre- and Post-Processing
6.1 What are Pre- and Post-Processing?
6.2 Pre- and Post-Processing
6.3 What should happen after test case execution
6.4 Implementation Issues
7 Building maintainable tests
7.1 Problems in maintaining automated tests
7.2 Attributes of test maintenance
7.3 The conspiracy
7.4 Strategy and tactics
8 Metrics
8.1 Why measure testing and test automation?
8.2 What can we measure?
8.3 Objectives for testing and test automation
8.4 Attributes of software testing
8.5 Attributes of test automation
8.6 Which is the best test automation regime?
8.7 Should I really measure all these?
8.8 Summary
8.9 Answer to DDP Exercise
9 Other Issues
9.1 Which Tests to Automate (first)?
9.2 Selecting which tests to run when
9.3 Order of test execution
9.4 Test status
9.5 Designing software for (automated) testability
9.6 Synchronization
9.7 Monitoring progress of automated tests
9.8 Tailoring your own regime around your tools
10 Choosing a tool to automate testing
10.1 Introduction to Chapters 10 and 11
10.2 Where to start in selecting tools: your requirements, not the tool market
10.3 The tool selection project
10.4 The tool selection team
10.5 Identifying your requirements
10.6 Identifying your constraints
10.7 Build or buy?
10.8 Identifying what is available on the market
10.9 Evaluating the short-listed candidate tools
10.10 Making the decision
11 Implementing tools within the organization
11.1 What could go wrong?
11.2 Importance of managing the implementation process
11.3 Roles in the implementation/change process
11.4 Management commitment
11.5 Preparation
11.6 Pilot project
11.7 Planned phased installation or roll-out
11.8 Special problems in implementing
11.9 People issues
11.10 Conclusion
12 Racal-Redac Case History
12.1 Introduction
12.2 Background
12.3 Solutions
12.4 Integration Test Automation
12.5 System Test Automation
12.6 The Results Achieved
12.7 Summary of the case history up to 1991
12.8 What happened next?
13 The Evolution of an Automated Software Test System
13.1 Introduction
13.2 Background
13.3 Gremlin 1
13.4 Gremlin 2.0: A Step Beyond Capture/Replay
13.5 Finding The Real Problem
13.6 Lessons Learned
14 Experiences with Test Automation
14.1

Details
Year of publication: 1999
Subject area: Computing
Genre: Computer science
Category: Science & technology
Medium: Paperback
Pages: 600
Contents: Paperback / softcover
ISBN-13: 9780201331400
ISBN-10: 0201331403
Language: English
Binding: Paperback / softcover
Author: Graham, Dorothy
Fewster, Mark
Compiled by: Graham, Dorothy
Publisher: Pearson Education Limited
Dimensions: 237 x 156 x 31 mm
By: Dorothy Graham (et al.)
Publication date: 28 June 1999
Weight: 0.904 kg