Effective methods for software testing

Accompanying CD-ROM contains ... "work papers and quality control checklists your organization needs to implement an effective software testing process."--P. [4] of cover.

Saved in:

Main author: Perry, William E.
Format: Book
Language: English
Published: Indianapolis, IN : Wiley, 2006
Edition: 3rd ed.
Online access: Table of contents
Physical description: XXVII, 973 p., graphs; 1 CD-ROM (12 cm)
ISBN: 0764598376; 9780764598371
Internal format
MARC
LEADER 00000nam a2200000zc 4500
001 BV021620457
003 DE-604
005 20120220
007 t
008 060619s2006 xxud||| |||| 00||| eng d
010 |a 2005036216
020 |a 0764598376 |c cloth/cdrom |9 0-7645-9837-6
020 |a 9780764598371 |9 978-0-7645-9837-1
035 |a (OCoLC)62732602
035 |a (DE-599)BVBBV021620457
040 |a DE-604 |b ger |e aacr
041 0 |a eng
044 |a xxu |c US
049 |a DE-703
050 0 |a QA76.76.T48
082 0 |a 005.1/4 |2 22
084 |a ST 233 |0 (DE-625)143620: |2 rvk
100 1 |a Perry, William E. |e Verfasser |4 aut
245 1 0 |a Effective methods for software testing |c William E. Perry
250 |a 3. ed.
264 1 |a Indianapolis, IN |b Wiley |c 2006
300 |a XXVII, 973 S. |b graph. Darst. |e 1 CD-ROM (12 cm)
336 |b txt |2 rdacontent
337 |b n |2 rdamedia
338 |b nc |2 rdacarrier
520 3 |a Accompanying CD-ROM contains ... "work papers and quality control checklists your organization needs to implement an effective software testing process."--P. [4] of cover.
650 4 |a Logiciels - Essais
650 4 |a Computer software |x Testing
650 0 7 |a Testen |0 (DE-588)4367264-4 |2 gnd |9 rswk-swf
650 0 7 |a Anwendungssoftware |0 (DE-588)4120906-0 |2 gnd |9 rswk-swf
650 0 7 |a Softwaretest |0 (DE-588)4132652-0 |2 gnd |9 rswk-swf
655 7 |8 1\p |0 (DE-588)4398750-3 |a Checkliste |2 gnd-content
655 4 |a CD-ROMs
689 0 0 |a Softwaretest |0 (DE-588)4132652-0 |D s
689 0 |5 DE-604
689 1 0 |a Anwendungssoftware |0 (DE-588)4120906-0 |D s
689 1 1 |a Testen |0 (DE-588)4367264-4 |D s
689 1 |8 2\p |5 DE-604
856 4 2 |m GBV Datenaustausch |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014835529&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis
999 |a oai:aleph.bib-bvb.de:BVB01-014835529
883 1 |8 1\p |a cgwrk |d 20201028 |q DE-101 |u https://d-nb.info/provenance/plan#cgwrk
883 1 |8 2\p |a cgwrk |d 20201028 |q DE-101 |u https://d-nb.info/provenance/plan#cgwrk
adam_text | EFFECTIVE METHODS FOR SOFTWARE TESTING THIRD EDITION WILLIAM E. PERRY
WILEY PUBLISHING, INC. CONTENTS INTRODUCTION XXV PART I ASSESSING
TESTING CAPABILITIES AND COMPETENCIES 1 CHAPTER 1 ASSESSING
CAPABILITIES, STAFF COMPETENCY, AND USER SATISFACTION 3 THE THREE-STEP
PROCESS TO BECOMING A WORLD-CLASS TESTING ORGANIZATION 3 STEP 1: DEFINE
A WORLD-CLASS SOFTWARE TESTING MODEL 5 CUSTOMIZING THE WORLD-CLASS MODEL
FOR YOUR ORGANIZATION 7 STEP 2: DEVELOP BASELINES FOR YOUR ORGANIZATION
8 ASSESSMENT 1: ASSESSING THE TEST ENVIRONMENT 8 IMPLEMENTATION
PROCEDURES 9 VERIFYING THE ASSESSMENT 13 ASSESSMENT 2: ASSESSING THE
CAPABILITIES OF YOUR EXISTING TEST PROCESSES 13 ASSESSMENT 3: ASSESSING
THE COMPETENCY OF YOUR TESTERS 14 IMPLEMENTATION PROCEDURES 14 VERIFYING
THE ASSESSMENT 16 STEP 3: DEVELOP AN IMPROVEMENT PLAN 16 SUMMARY 18 PART
II BUILDING A SOFTWARE TESTING ENVIRONMENT 35 CHAPTER 2 CREATING AN
ENVIRONMENT SUPPORTIVE OF SOFTWARE TESTING 37 MINIMIZING RISKS 38 RISK
APPETITE FOR SOFTWARE QUALITY 38 RISKS ASSOCIATED WITH IMPLEMENTING
SPECIFICATIONS 39 FAULTY SOFTWARE DESIGN 39 DATA PROBLEMS 39
RISKS ASSOCIATED WITH NOT MEETING CUSTOMER NEEDS 40 DEVELOPING A ROLE
FOR SOFTWARE TESTERS 43 WRITING A POLICY FOR SOFTWARE TESTING 45
CRITERIA FOR A TESTING POLICY 45 METHODS FOR ESTABLISHING A TESTING
POLICY 46 ECONOMICS OF TESTING 47 TESTING: AN ORGANIZATIONAL ISSUE 50
MANAGEMENT SUPPORT FOR SOFTWARE TESTING 50 BUILDING A STRUCTURED
APPROACH TO SOFTWARE TESTING 51 REQUIREMENTS 54 DESIGN 54 PROGRAM 55
TEST 55 INSTALLATION 55 MAINTENANCE 55 DEVELOPING A TEST STRATEGY 56 USE
WORK PAPER 2-1 58 USE WORK PAPER 2-2 58 SUMMARY 60 CHAPTER 3 BUILDING
THE SOFTWARE TESTING PROCESS 63 SOFTWARE TESTING GUIDELINES 63 GUIDELINE
#1: TESTING SHOULD REDUCE SOFTWARE DEVELOPMENT RISK 64 GUIDELINE #2:
TESTING SHOULD BE PERFORMED EFFECTIVELY 65 GUIDELINE #3: TESTING SHOULD
UNCOVER DEFECTS 65 DEFECTS VERSUS FAILURES 65 WHY ARE DEFECTS HARD TO
FIND? 66 GUIDELINE #4: TESTING SHOULD BE PERFORMED USING BUSINESS LOGIC
67 GUIDELINE #5: TESTING SHOULD OCCUR THROUGHOUT THE DEVELOPMENT LIFE
CYCLE 68 GUIDELINE #6: TESTING SHOULD TEST BOTH FUNCTION AND STRUCTURE
69 WHY USE BOTH TESTING METHODS? 69 STRUCTURAL AND FUNCTIONAL TESTS
USING VERIFICATION AND VALIDATION TECHNIQUES 69 WORKBENCH CONCEPT 71
TESTING THAT PARALLELS THE SOFTWARE DEVELOPMENT PROCESS 72 CUSTOMIZING
THE SOFTWARE-TESTING PROCESS 74 DETERMINING THE TEST STRATEGY OBJECTIVES
74 DETERMINING THE TYPE OF DEVELOPMENT PROJECT 75 DETERMINING THE TYPE
OF SOFTWARE SYSTEM 76 DETERMINING THE PROJECT SCOPE 77 IDENTIFYING THE
SOFTWARE RISKS 77 DETERMINING WHEN TESTING SHOULD OCCUR 79 DEFINING THE
SYSTEM TEST PLAN STANDARD 79 DEFINING THE UNIT TEST PLAN
STANDARD 83 CONVERTING TESTING STRATEGY TO TESTING TACTICS 83 PROCESS
PREPARATION CHECKLIST 86 SUMMARY 86 CHAPTER 4 SELECTING AND INSTALLING
SOFTWARE TESTING TOOLS 103 INTEGRATING TOOLS INTO THE TESTER'S WORK
PROCESSES 103 TOOLS AVAILABLE FOR TESTING SOFTWARE 104 SELECTING AND
USING TEST TOOLS 108 MATCHING THE TOOL TO ITS USE 109 SELECTING A TOOL
APPROPRIATE TO ITS LIFE CYCLE PHASE 109 MATCHING THE TOOL TO THE
TESTER'S SKILL LEVEL 111 SELECTING AN AFFORDABLE TOOL 114 TRAINING
TESTERS IN TOOL USAGE 116 APPOINTING TOOL MANAGERS 117 PREREQUISITES TO
CREATING A TOOL MANAGER POSITION 118 SELECTING A TOOL MANAGER 118
ASSIGNING THE TOOL MANAGER DUTIES 119 LIMITING THE TOOL MANAGER'S TENURE
120 SUMMARY 120 CHAPTER 5 BUILDING SOFTWARE TESTER COMPETENCY 125 WHAT
IS A COMMON BODY OF KNOWLEDGE? 125 WHO IS RESPONSIBLE FOR THE SOFTWARE
TESTER'S COMPETENCY? 126 HOW IS PERSONAL COMPETENCY USED IN JOB
PERFORMANCE? 126 USING THE 2006 CSTE CBOK 127 DEVELOPING A TRAINING
CURRICULUM 128 USING THE CBOK TO BUILD AN EFFECTIVE TESTING TEAM 129
SUMMARY 131 PART III THE SEVEN-STEP TESTING PROCESS 151 CHAPTER 6
OVERVIEW OF THE SOFTWARE TESTING PROCESS 153 ADVANTAGES OF FOLLOWING A
PROCESS 153 THE COST OF COMPUTER TESTING 154 QUANTIFYING THE COST OF
REMOVING DEFECTS 155 REDUCING THE COST OF TESTING 156 THE SEVEN-STEP
SOFTWARE TESTING PROCESS 156 OBJECTIVES OF THE SEVEN-STEP PROCESS 159
CUSTOMIZING THE SEVEN-STEP PROCESS 160 MANAGING THE SEVEN-STEP PROCESS
161 USING THE TESTER'S WORKBENCH WITH THE SEVEN-STEP PROCESS 162
WORKBENCH SKILLS 163 SUMMARY 164 CHAPTER 7 STEP 1: ORGANIZING FOR
TESTING 165 OBJECTIVE 165 WORKBENCH 166 INPUT 167 DO
PROCEDURES 167 TASK 1: APPOINT THE TEST MANAGER 167 TASK 2: DEFINE THE
SCOPE OF TESTING 168 TASK 3: APPOINT THE TEST TEAM 168 INTERNAL TEAM
APPROACH 169 EXTERNAL TEAM APPROACH 170 NON-IT TEAM APPROACH 170
COMBINATION TEAM APPROACH 170 TASK 4: VERIFY THE DEVELOPMENT
DOCUMENTATION 171 DEVELOPMENT PHASES 171 MEASURING PROJECT DOCUMENTATION
NEEDS 174 DETERMINING WHAT DOCUMENTS MUST BE PRODUCED 175 DETERMINING
THE COMPLETENESS OF INDIVIDUAL DOCUMENTS 179 DETERMINING DOCUMENTATION
TIMELINESS 180 TASK 5: VALIDATE THE TEST ESTIMATE AND PROJECT STATUS
REPORTING PROCESS 181 VALIDATING THE TEST ESTIMATE 182 TESTING THE
VALIDITY OF THE SOFTWARE COST ESTIMATE 185 CALCULATING THE PROJECT
STATUS USING A POINT SYSTEM 189 CHECK PROCEDURES 200 OUTPUT 200 SUMMARY
200 CHAPTER 8 STEP 2: DEVELOPING THE TEST PLAN 209 OVERVIEW 209
OBJECTIVE 210 CONCERNS 210 WORKBENCH 211 INPUT 212 DO PROCEDURES 212
TASK 1: PROFILE THE SOFTWARE PROJECT 212 CONDUCTING A WALKTHROUGH OF THE
CUSTOMER/USER AREA 212 DEVELOPING A PROFILE OF THE SOFTWARE PROJECT
213 TASK 2: UNDERSTAND THE PROJECT RISKS 215 TASK 3: SELECT A TESTING
TECHNIQUE 222 STRUCTURAL SYSTEM TESTING TECHNIQUES 223 FUNCTIONAL SYSTEM
TESTING TECHNIQUES 229 TASK 4: PLAN UNIT TESTING AND ANALYSIS 235
FUNCTIONAL TESTING AND ANALYSIS 236 STRUCTURAL TESTING AND ANALYSIS 238
ERROR-ORIENTED TESTING AND ANALYSIS 240 MANAGERIAL ASPECTS OF UNIT
TESTING AND ANALYSIS 243 TASK 5: BUILD THE TEST PLAN 244 SETTING TEST
OBJECTIVES 245 DEVELOPING A TEST MATRIX 245 DEFINING TEST ADMINISTRATION
250 WRITING THE TEST PLAN 251 TASK 6: INSPECT THE TEST PLAN INSPECTION
CONCERNS PRODUCTS/DELIVERABLES TO INSPECT FORMAL INSPECTION ROLES
FORMAL INSPECTION DEFECT CLASSIFICATION INSPECTION PROCEDURES CHECK
PROCEDURES OUTPUT GUIDELINES SUMMARY CHAPTER 9 STEP 3: VERIFICATION
TESTING OVERVIEW OBJECTIVE CONCERNS WORKBENCH INPUT THE REQUIREMENTS
PHASE THE DESIGN PHASE THE PROGRAMMING PHASE DO PROCEDURES TASK 1: TEST
DURING THE REQUIREMENTS PHASE REQUIREMENTS PHASE TEST FACTORS PREPARING
A RISK MATRIX PERFORMING A TEST FACTOR ANALYSIS CONDUCTING A
REQUIREMENTS WALKTHROUGH PERFORMING REQUIREMENTS TRACING ENSURING
REQUIREMENTS ARE TESTABLE TASK 2: TEST DURING THE DESIGN PHASE SCORING
SUCCESS FACTORS ANALYZING TEST FACTORS CONDUCTING A DESIGN REVIEW
INSPECTING DESIGN DELIVERABLES TASK 3: TEST DURING THE PROGRAMMING
PHASE DESK DEBUGGING THE PROGRAM PERFORMING PROGRAMMING PHASE TEST
FACTOR ANALYSIS CONDUCTING A PEER REVIEW CHECK PROCEDURES OUTPUT
GUIDELINES SUMMARY CHAPTER 10 STEP 4: VALIDATION TESTING OVERVIEW
OBJECTIVE CONCERNS WORKBENCH INPUT DO
PROCEDURES 412 TASK 1: BUILD THE TEST DATA 412 SOURCES OF TEST DATA/TEST
SCRIPTS 412 TESTING FILE DESIGN 413 DEFINING DESIGN GOALS 414 ENTERING
TEST DATA 414 APPLYING TEST FILES AGAINST PROGRAMS THAT UPDATE MASTER
RECORDS 414 CREATING AND USING TEST DATA 415 PAYROLL APPLICATION EXAMPLE
416 CREATING TEST DATA FOR STRESS/LOAD TESTING 430 CREATING TEST SCRIPTS
430 TASK 2: EXECUTE TESTS 434 TASK 3: RECORD TEST RESULTS 436
DOCUMENTING THE DEVIATION 437 DOCUMENTING THE EFFECT 438 DOCUMENTING THE
CAUSE 438 CHECK PROCEDURES 439 OUTPUT 439 GUIDELINES 439 SUMMARY 440
CHAPTER 11 STEP 5: ANALYZING AND REPORTING TEST RESULTS 459 OVERVIEW 459
CONCERNS 460 WORKBENCH 460 INPUT 461 TEST PLAN AND PROJECT PLAN 461
EXPECTED PROCESSING RESULTS 461 DATA COLLECTED DURING TESTING 461 TEST
RESULTS DATA 462 TEST TRANSACTIONS, TEST SUITES, AND TEST EVENTS 462
DEFECTS 462 EFFICIENCY 463 STORING DATA COLLECTED DURING TESTING 463 DO
PROCEDURES 463 TASK 1: REPORT SOFTWARE STATUS 464 ESTABLISHING A
MEASUREMENT TEAM 465 CREATING AN INVENTORY OF EXISTING PROJECT
MEASUREMENTS 465 DEVELOPING A CONSISTENT SET OF PROJECT METRICS 466
DEFINING PROCESS REQUIREMENTS 466 DEVELOPING AND IMPLEMENTING THE
PROCESS 466 MONITORING THE PROCESS 466 TASK 2: REPORT INTERIM TEST
RESULTS 470 FUNCTION/TEST MATRIX 470 FUNCTIONAL TESTING STATUS REPORT
471 FUNCTIONS WORKING TIMELINE REPORT 472 EXPECTED VERSUS ACTUAL DEFECTS
UNCOVERED TIMELINE REPORT 472 DEFECTS UNCOVERED VERSUS CORRECTED GAP
TIMELINE REPORT AVERAGE AGE OF UNCORRECTED DEFECTS BY TYPE REPORT
DEFECT DISTRIBUTION REPORT NORMALIZED DEFECT DISTRIBUTION REPORT
TESTING ACTION REPORT INTERIM TEST REPORT TASK 3: REPORT FINAL TEST
RESULTS INDIVIDUAL PROJECT TEST REPORT INTEGRATION TEST REPORT SYSTEM
TEST REPORT ACCEPTANCE TEST REPORT CHECK PROCEDURES OUTPUT GUIDELINES
SUMMARY CHAPTER 12 STEP 6: ACCEPTANCE AND OPERATIONAL TESTING OVERVIEW
OBJECTIVE CONCERNS WORKBENCH INPUT PROCEDURES TASK 1: ACCEPTANCE
TESTING DEFINING THE ACCEPTANCE CRITERIA DEVELOPING AN ACCEPTANCE PLAN
EXECUTING THE ACCEPTANCE PLAN DEVELOPING TEST CASES (USE CASES) BASED
ON HOW SOFTWARE WILL BE USED TASK 2: PRE-OPERATIONAL TESTING TESTING
NEW SOFTWARE INSTALLATION TESTING THE CHANGED SOFTWARE VERSION
MONITORING PRODUCTION DOCUMENTING PROBLEMS TASK 3: POST-OPERATIONAL
TESTING DEVELOPING AND UPDATING THE TEST PLAN DEVELOPING AND UPDATING
THE TEST DATA TESTING THE CONTROL CHANGE PROCESS CONDUCTING TESTING
DEVELOPING AND UPDATING TRAINING MATERIAL CHECK PROCEDURES OUTPUT IS
THE AUTOMATED APPLICATION ACCEPTABLE? AUTOMATED APPLICATION SEGMENT
FAILURE NOTIFICATION IS THE MANUAL SEGMENT ACCEPTABLE? TRAINING FAILURE
NOTIFICATION FORM GUIDELINES SUMMARY
CHAPTER 13 STEP 7: POST-IMPLEMENTATION ANALYSIS OVERVIEW CONCERNS
WORKBENCH INPUT DO PROCEDURES TASK 1: ESTABLISH ASSESSMENT OBJECTIVES
TASK 2: IDENTIFY WHAT TO MEASURE TASK 3: ASSIGN MEASUREMENT
RESPONSIBILITY TASK 4: SELECT EVALUATION APPROACH TASK 5: IDENTIFY
NEEDED FACTS TASK 6: COLLECT EVALUATION DATA TASK 7: ASSESS THE
EFFECTIVENESS OF TESTING USING TESTING METRICS CHECK PROCEDURES OUTPUT
GUIDELINES SUMMARY PART IV INCORPORATING SPECIALIZED TESTING
RESPONSIBILITIES CHAPTER 14 SOFTWARE DEVELOPMENT METHODOLOGIES HOW MUCH
TESTING IS ENOUGH? SOFTWARE DEVELOPMENT METHODOLOGIES OVERVIEW
METHODOLOGY TYPES SOFTWARE DEVELOPMENT LIFE CYCLE DEFINING REQUIREMENTS
CATEGORIES ATTRIBUTES METHODOLOGY MATURITY COMPETENCIES REQUIRED STAFF
EXPERIENCE CONFIGURATION-MANAGEMENT CONTROLS BASIC CM REQUIREMENTS
PLANNING DATA DISTRIBUTION AND ACCESS CM ADMINISTRATION CONFIGURATION
IDENTIFICATION CONFIGURATION CONTROL MEASURING THE IMPACT OF THE
SOFTWARE DEVELOPMENT PROCESS SUMMARY CHAPTER 15 TESTING CLIENT/SERVER
SYSTEMS OVERVIEW CONCERNS WORKBENCH INPUT
DO PROCEDURES TASK 1: ASSESS READINESS SOFTWARE DEVELOPMENT PROCESS
MATURITY LEVELS CONDUCTING THE CLIENT/SERVER READINESS ASSESSMENT
PREPARING A CLIENT/SERVER READINESS FOOTPRINT CHART TASK 2: ASSESS KEY
COMPONENTS TASK 3: ASSESS CLIENT NEEDS CHECK PROCEDURES OUTPUT
GUIDELINES SUMMARY CHAPTER 16 RAPID APPLICATION DEVELOPMENT TESTING
OVERVIEW OBJECTIVE CONCERNS TESTING ITERATIONS TESTING COMPONENTS
TESTING PERFORMANCE RECORDING TEST INFORMATION WORKBENCH INPUT DO
PROCEDURES TESTING WITHIN ITERATIVE RAD SPIRAL TESTING TASK 1:
DETERMINE APPROPRIATENESS OF RAD TASK 2: TEST PLANNING ITERATIONS TASK
3: TEST SUBSEQUENT PLANNING ITERATIONS TASK 4: TEST THE FINAL PLANNING
ITERATION CHECK PROCEDURES OUTPUT GUIDELINES SUMMARY CHAPTER 17 TESTING
INTERNAL CONTROLS OVERVIEW INTERNAL CONTROLS CONTROL OBJECTIVES
PREVENTIVE CONTROLS SOURCE-DATA AUTHORIZATION DATA INPUT SOURCE-DATA
PREPARATION TURNAROUND DOCUMENTS PRENUMBERED FORMS INPUT VALIDATION
FILE AUTO-UPDATING PROCESSING CONTROLS DETECTIVE CONTROLS DATA
TRANSMISSION CONTROL REGISTER CONTROL TOTALS DOCUMENTING AND TESTING
OUTPUT CHECKS CORRECTIVE CONTROLS ERROR DETECTION AND RESUBMISSION
AUDIT TRAILS COST/BENEFIT ANALYSIS ASSESSING INTERNAL CONTROLS TASK 1:
UNDERSTAND THE SYSTEM BEING TESTED TASK 2: IDENTIFY RISKS TASK 3:
REVIEW APPLICATION CONTROLS TASK 4: TEST APPLICATION CONTROLS TESTING
WITHOUT COMPUTER PROCESSING TESTING WITH COMPUTER PROCESSING
TRANSACTION FLOW TESTING OBJECTIVES OF INTERNAL ACCOUNTING CONTROLS
RESULTS OF TESTING TASK 5: DOCUMENT CONTROL STRENGTHS AND WEAKNESSES
QUALITY CONTROL CHECKLIST SUMMARY CHAPTER 18 TESTING COTS AND
CONTRACTED SOFTWARE OVERVIEW COTS SOFTWARE ADVANTAGES, DISADVANTAGES,
AND RISKS COTS VERSUS CONTRACTED SOFTWARE COTS ADVANTAGES COTS
DISADVANTAGES IMPLEMENTATION RISKS TESTING COTS SOFTWARE TESTING
CONTRACTED SOFTWARE OBJECTIVE CONCERNS WORKBENCH INPUT DO PROCEDURES
TASK 1: TEST BUSINESS FIT STEP 1: TESTING NEEDS SPECIFICATION STEP 2:
TESTING CSFS TASK 2: TEST OPERATIONAL FIT STEP 1: TEST COMPATIBILITY
STEP 2: INTEGRATE THE SOFTWARE INTO EXISTING WORK FLOWS STEP 3:
DEMONSTRATE THE SOFTWARE IN ACTION TASK 3: TEST PEOPLE FIT TASK 4:
ACCEPTANCE-TEST
THE SOFTWARE PROCESS 702 STEP 1: CREATE FUNCTIONAL TEST CONDITIONS 702
STEP 2: CREATE STRUCTURAL TEST CONDITIONS 703 MODIFYING THE TESTING
PROCESS FOR CONTRACTED SOFTWARE 704 CHECK PROCEDURES 705 OUTPUT 705
GUIDELINES 706 SUMMARY 706 CHAPTER 19 TESTING IN A MULTIPLATFORM
ENVIRONMENT 717 OVERVIEW 717 OBJECTIVE 718 CONCERNS 718 BACKGROUND ON
TESTING IN A MULTIPLATFORM ENVIRONMENT 718 WORKBENCH 719 INPUT 720 DO
PROCEDURES 721 TASK 1: DEFINE PLATFORM CONFIGURATION CONCERNS 721 TASK
2: LIST NEEDED PLATFORM CONFIGURATIONS 723 TASK 3: ASSESS TEST ROOM
CONFIGURATIONS 723 TASK 4: LIST STRUCTURAL COMPONENTS AFFECTED BY THE
PLATFORM(S) 723 TASK 5: LIST INTERFACES THE PLATFORM AFFECTS 725 TASK 6:
EXECUTE THE TESTS 726 CHECK PROCEDURES 726 OUTPUT 726 GUIDELINES 726
SUMMARY 727 CHAPTER 20 TESTING SOFTWARE SYSTEM SECURITY 733 OVERVIEW 733
OBJECTIVE 734 CONCERNS 734 WORKBENCH 734 INPUT 735 WHERE VULNERABILITIES
OCCUR 735 FUNCTIONAL VULNERABILITIES 736 VULNERABLE AREAS 737 ACCIDENTAL
VERSUS INTENTIONAL LOSSES 738 DO PROCEDURES 739 TASK 1: ESTABLISH A
SECURITY BASELINE 739 WHY BASELINES ARE NECESSARY 740 CREATING BASELINES
740 USING BASELINES 749 TASK 2: BUILD A PENETRATION-POINT MATRIX 751
CONTROLLING PEOPLE BY CONTROLLING ACTIVITIES 751 SELECTING SECURITY
ACTIVITIES 752 CONTROLLING BUSINESS TRANSACTIONS 755 CHARACTERISTICS OF
SECURITY PENETRATION BUILDING A PENETRATION-POINT MATRIX TASK 3:
ANALYZE THE RESULTS OF SECURITY TESTING EVALUATING THE ADEQUACY OF
SECURITY CHECK PROCEDURES OUTPUT GUIDELINES SUMMARY CHAPTER 21 TESTING
A DATA WAREHOUSE OVERVIEW CONCERNS WORKBENCH INPUT DO PROCEDURES TASK
1: MEASURE THE MAGNITUDE OF DATA WAREHOUSE CONCERNS TASK 2: IDENTIFY
DATA WAREHOUSE ACTIVITY PROCESSES TO TEST ORGANIZATIONAL PROCESS DATA
DOCUMENTATION PROCESS SYSTEM DEVELOPMENT PROCESS ACCESS CONTROL PROCESS
DATA INTEGRITY PROCESS OPERATIONS PROCESS BACKUP/RECOVERY PROCESS
PERFORMING TASK 2 TASK 3: TEST THE ADEQUACY OF DATA WAREHOUSE ACTIVITY
PROCESSES CHECK PROCEDURES OUTPUT GUIDELINES SUMMARY CHAPTER 22 TESTING
WEB-BASED SYSTEMS OVERVIEW CONCERNS WORKBENCH INPUT DO PROCEDURES TASK
1: SELECT WEB-BASED RISKS TO INCLUDE IN THE TEST PLAN SECURITY CONCERNS
PERFORMANCE CONCERNS CORRECTNESS CONCERNS COMPATIBILITY CONCERNS
RELIABILITY CONCERNS DATA INTEGRITY CONCERNS USABILITY CONCERNS
RECOVERABILITY CONCERNS TASK 2: SELECT WEB-BASED TESTS UNIT OR
COMPONENT INTEGRATION SYSTEM USER ACCEPTANCE PERFORMANCE LOAD/STRESS
REGRESSION USABILITY COMPATIBILITY TASK 3: SELECT WEB-BASED TEST TOOLS
TASK 4: TEST WEB-BASED SYSTEMS CHECK PROCEDURES OUTPUT GUIDELINES
SUMMARY PART V BUILDING AGILITY INTO THE TESTING PROCESS CHAPTER 23
USING AGILE METHODS TO IMPROVE SOFTWARE TESTING THE IMPORTANCE OF
AGILITY BUILDING AN AGILE TESTING PROCESS AGILITY INHIBITORS IS
IMPROVEMENT NECESSARY? COMPRESSING TIME CHALLENGES SOLUTIONS MEASURING
READINESS THE SEVEN-STEP PROCESS SUMMARY CHAPTER 24 BUILDING AGILITY
INTO THE TESTING PROCESS STEP 1: MEASURE SOFTWARE PROCESS VARIABILITY
TIMELINES PROCESS STEPS WORKBENCHES TIME-COMPRESSION WORKBENCHES
REDUCING VARIABILITY DEVELOPING TIMELINES IMPROVEMENT SHOPPING LIST
QUALITY CONTROL CHECKLIST CONCLUSION STEP 2: MAXIMIZE BEST PRACTICES
TESTER AGILITY SOFTWARE TESTING RELATIONSHIPS TRADEOFFS CAPABILITY
CHART MEASURING EFFECTIVENESS AND EFFICIENCY IMPROVEMENT SHOPPING LIST
856 QUALITY CONTROL CHECKLIST 856 CONCLUSION 857 STEP 3: BUILD ON
STRENGTH, MINIMIZE WEAKNESS 857 EFFECTIVE TESTING PROCESSES 857 POOR
TESTING PROCESSES 860 IMPROVEMENT SHOPPING LIST 860 QUALITY CONTROL
CHECKLIST 860 CONCLUSION 861 STEP 4: IDENTIFY AND ADDRESS IMPROVEMENT
BARRIERS 861 THE STAKEHOLDER PERSPECTIVE 861 STAKEHOLDER INVOLVEMENT 863
PERFORMING STAKEHOLDER ANALYSIS 863 RED-FLAG/HOT-BUTTON BARRIERS 864
STAFF-COMPETENCY BARRIERS 865 ADMINISTRATIVE/ORGANIZATIONAL BARRIERS 865
DETERMINING THE ROOT CAUSE OF BARRIERS/OBSTACLES 866 ADDRESSING THE ROOT
CAUSE OF BARRIERS/OBSTACLES 867 QUALITY CONTROL CHECKLIST 869
CONCLUSION 869 STEP 5: IDENTIFY AND ADDRESS CULTURAL AND COMMUNICATION
BARRIERS 869 MANAGEMENT CULTURES 870 CULTURE 1: MANAGE PEOPLE 871
CULTURE 2: MANAGE BY PROCESS 873 CULTURE 3: MANAGE COMPETENCIES 874
CULTURE 4: MANAGE BY FACT 876 CULTURE 5: MANAGE BUSINESS INNOVATION 878
CULTURAL BARRIERS 879 IDENTIFYING THE CURRENT MANAGEMENT CULTURE 879
IDENTIFYING THE BARRIERS POSED BY THE CULTURE 879 DETERMINING WHAT CAN
BE DONE IN THE CURRENT CULTURE 879 DETERMINING THE DESIRED CULTURE FOR
TIME COMPRESSION 879 DETERMINING HOW TO ADDRESS CULTURE BARRIERS 880
OPEN AND EFFECTIVE COMMUNICATION 880 LINES OF COMMUNICATION 881
INFORMATION/COMMUNICATION BARRIERS 882 EFFECTIVE COMMUNICATION 882
QUALITY CONTROL CHECKLIST 884 CONCLUSION 885 STEP 6: IDENTIFY
IMPLEMENTABLE IMPROVEMENTS 885 WHAT IS AN IMPLEMENTABLE? 885 IDENTIFYING
IMPLEMENTABLES VIA TIME COMPRESSION 886 PRIORITIZING IMPLEMENTABLES 888
DOCUMENTING APPROACHES 890 QUALITY CONTROL CHECKLIST 890 CONCLUSION 890
STEP 7: DEVELOP AND EXECUTE AN IMPLEMENTATION PLAN 891
PLANNING 891 IMPLEMENTING IDEAS 891 REQUISITE RESOURCES 893 QUALITY
CONTROL CHECKLIST 894 CONCLUSION 894 SUMMARY 895 INDEX 929
|
adam_txt |
EFFECTIVE METHODS FOR SOFTWARE TESTING THIRD EDITION WILLIAM E. PERRY
WILEY WILEY PUBLISHING, INC. CONTENTS INTRODUCTION XXV PART I ASSESSING
TESTING CAPABILITIES AND COMPETENCIES 1 CHAPTER 1 ASSESSING
CAPABILITIES, STAFF COMPETENCY, AND USER SATISFACTION 3 THE THREE-STEP
PROCESS TO BECOMING A WORLD-CLASS TESTING ORGANIZATION 3 STEP 1: DEFINE
A WORLD-CLASS SOFTWARE TESTING MODEL 5 CUSTOMIZING THE WORLD-CLASS MODEL
FOR YOUR ORGANIZATION 7 STEP 2: DEVELOP BASELINES FOR YOUR ORGANIZATION
8 ASSESSMENT 1: ASSESSING THE TEST ENVIRONMENT 8 IMPLEMENTATION
PROCEDURES 9 VERIFYING THE ASSESSMENT 13 ASSESSMENT 2: ASSESSING THE
CAPABILITIES OF YOUR EXISTING TEST PROCESSES 13 ASSESSMENT 3: ASSESSING
THE COMPETENCY OF YOUR TESTERS 14 IMPLEMENTATION PROCEDURES 14 VERIFYING
THE ASSESSMENT 16 STEP 3: DEVELOP AN IMPROVEMENT PLAN 16 SUMMARY 18 PART
II BUILDING A SOFTWARE TESTING ENVIRONMENT 35 CHAPTER 2 CREATING AN
ENVIRONMENT SUPPORTIVE OF SOFTWARE TESTING 37 MINIRNIZING RISKS 38 RISK
APPETITE FOR SOFTWARE QUALITY 38 RISKS ASSOCIATED WITH IMPLEMENTING
SPECIFKATIONS 39 FAULTY SOFTWARE DESIGN 39 DATA PROBLEMS 39 IX CONTENTS
RISKS ASSOCIATED WITH NOT MEETING CUSTOMER NEEDS 40 DEVELOPING A ROLE
FOR SOFTWARE TESTERS 43 WRITING A POLICY FOR SOFTWARE TESTING 45
CRITERIA FOR A TESTING POLICY 45 METHODS FOR ESTABLISHING A TESTING
POLICY 46 ECONOMICS OF TESTING 47 TESTING*AN ORGANIZATIONAL ISSUE 50
MANAGEMENT SUPPORT FOR SOFTWARE TESTING 50 BUILDING A STRUCTURED
APPROACH TO SOFTWARE TESTING 51 REQUIREMENTS 54 DESIGN 54 PROGRAM 55
TEST 55 INSTALLATION 55 MAINTENANCE 55 DEVELOPING A TEST STRATEGY 56 USE
WORK PAPER 2-1 58 USE WORK PAPER 2-2 58 SUMMARY 60 CHAPTER 3 BUILDING
THE SOFTWARE TESTING PROCESS 63 SOFTWARE TESTING GUIDELINES 63 GUIDELINE
#1: TESTING SHOULD REDUCE SOFTWARE DEVELOPMENT RISK 64 GUIDELINE #2:
TESTING SHOULD BE PERFORMED EFFECTIVELY 65 GUIDELINE #3: TESTING SHOULD
UNCOVER DEFECTS 65 DEFECTS VERSUS FAILURES 65 WHY ARE DEFECTS HARD TO
FIND? 66 GUIDEIINE #4: TESTING SHOULD BE PERFORMED USING BUSINESS LOGIC
67 GUIDELINE #5: TESTING SHOULD OCCUR THROUGHOUT THE DEVELOPMENT LIFE
CYCLE 68 GUIDEIINE #6: TESTING SHOULD TEST BOTH FUNCTION AND STRUCTURE
69 WHY USE BOTH TESTING METHODS? 69 STRUCTURAL AND FUNCTIONAL TESTS
USING VERIFICATION AND VALIDATION TECHNIQUES 69 WORKBENCH CONCEPT 71
TESTING THAT PARALLELS THE SOFTWARE DEVELOPMENT PROCESS 72 CUSTOMIZING
THE SOFTWARE-TESTING PROCESS 74 DETERMINING THE TEST STRATEGY OBJECTIVES
74 DETERMINING THE TYPE OF DEVELOPMENT PRQJECT 75 DETERMINING THE TYPE
OF SOFTWARE SYSTEM 76 DETERMINING THE PROJECT SCOPE 77 TDENTIFYING THE
SOFTWARE RISKS 77 DETERMINING WHEN TESTING SHOULD OCCUR 79 DEFINING THE
SYSTEM TEST PLAN STANDARD 79 CONTENTS DEFINING THE UNIT TEST PLAN
STANDARD 83 CONVERTING TESTING STRATEGY TO TESTING TACTICS 83 PROCESS
PREPARATION CHECKLIST 86 SUMMA RY 86 CHAPTER 4 SEIECTING AND INSTALLING
SOFTWARE TESTING TOOLS 103 INTEGRATING TOOLS INTO THE TESTER'S WORK
PROCESSES 103 TOOLS AVAILABLE FOR TESTING SOFTWARE 104 SEIECTING AND
USING TEST TOOLS 108 MATCHING THE TOOL TO TTS USE 109 SEIECTING A TOOL
APPROPRIATE TO ITS LIFE CYCLE PHASE 109 MATCHING THE TOOL TO THE
TESTER'S SKILL LEVEL 111 SEIECTING AN AFFORDABLE TOOL 114 TRAINING
TESTERS IN TOOL USAGE 116 APPOINTING TOOL MANAGERS 117 PREREQUISITES TO
CREATING A TOOL MANAGER POSITION 118 SEIECTING A TOOL MANAGER 118
ASSIGNING THE TOOL MANAGER DUTIES 119 LIMITING THE TOOL MANAGER'S TENURE
120 SUMMA RY 120 CHAPTER 5 BUILDING SOFTWARE TESTER COMPETENCY 125 WHAT
IS A COMMON BODY OF KNOWLEDGE? 125 WHO IS RESPONSIBLE FOR THE SOFTWARE
TESTER'S COMPETENCY? 126 HOW IS PERSONAL COMPETENCY USED IN JOB
PERFORMANCE? 126 USING THE 2006 CSTE CBOK 127 DEVELOPING A TRAINING
CURRICULUM 128 USING THE CBOK TO BUILD AN EFFECTIVE TESTING TEAM 129
SUMMARY 131 PART IM THE SEVEN-STEP TESTING PROCESS 151 CHAPTER 6
OVERVIEW OF THE SOFTWARE TESTING PROCESS 153 ADVANTAGES OF FOLLOWING A
PROCESS 153 THE COST OF COMPUTER TESTING 154 QUANTIFYING THE COST OF
REMOVING DEFECTS 155 REDUCING THE COST OF TESTING 156 THE SEVEN-STEP
SOFTWARE TESTING PROCESS 156 OBJECTIVES OF THE SEVEN-STEP PROCESS 159
CUSTOMIZING THE SEVEN-STEP PROCESS 160 MANAGING THE SEVEN-STEP PROCESS
161 USING THE TESTER'S WORKBENCH WITH THE SEVEN-STEP PROCESS 162
WORKBENCH SKILLS 163 SUMMARY 164 CHAPTER 7 STEP 1: ORGANIZING FOR
TESTING 165 OBJECTIVE 165 WORKBENCH 166 INPUT 167 XII CONTENTS DO
PROCEDURES 167 TASK 1: APPOINT THE TEST MANAGER 167 TASK 2: DEFINE THE
SCOPE OF TESTING 168 TASK 3: APPOINT THE TEST TEAM 168 INTERNAL TEAM
APPROACH 169 EXTERNAL TEAM APPROACH 170 NON-IT TEAM APPROACH 170
COMBINATION TEAM APPROACH 170 TASK 4: VERIFY THE DEVELOPMENT
DOCUMENTATION 171 DEVELOPMENT PHASES 171 MEASURING PROJECT DOCUMENTATION
NEEDS 174 DETERMINING WHAT DOCUMENTS MUST BE PRODUCED 175 DETERMINING
THE COMPLETENESS OF INDIVIDUAL DOCUMENTS 179 DETERMINING DOCUMENTATION
TIMELINESS 180 TASK 5: VALIDATE THE TEST ESTIMATE AND PROJECT STATUS
REPORTING PROCESS 181 VALIDATING THE TEST ESTIMATE 182 TESTING THE
VALIDITY OF THE SOFTWARE COST ESTIMATE 185 CALCULATING THE PROJECT
STATUS USING A POINT SYSTEM 189 CHECK PROCEDURES 200 OUTPUT 200 SUMMARY
200 CHAPTER 8 STEP 2: DEVELOPING THE TEST PLAN 209 OVERVIEW 209
OBJECTIVE 210 CONCERNS 210 WORKBENCH 211 INPUT 212 DO PROCEDURES 212
TASK 1: PROFILE THE SOFTWARE PROJECT 212 CONDUCTINGA WALKTHIOUGH OF THE
C US TOMER / USER AREA 212 DEVELOPING A PROFILE OF THE SOFTWARE PROJECT
213 TASK 2: UNDERSTAND THE PROJECT RISKS 215 TASK 3: SELECT A TESTING
TECHNIQUE 222 STRUCTURAL SYSTEM TESTING TECHNIQUES 223 PUNCTIONAL SYSTEM
TESTING TECHNIQUES 229 TASK 4: PLAN UNIT TESTING AND ANALYSIS 235
FUNCTIONAL TESTING AND ANALYSIS 236 STRUCTURAL TESTING AND ANALYSIS 238
ERROR-ORIENTED TESTING AND ANALYSIS 240 MANAGERIAL ASPECTS OF UNIT
TESTING AND ANALYSIS 243 TASK 5: BUILD THE TEST PLAN 244 SETTING TEST
OBJECTIVES 245 DEVELOPING A TEST MATRIX 245 DEFINING TEST ADMINISTRATION
250 WRITING THE TEST PLAN 251 CONTENTS XIII CHAPTER 9 CHAPTER 10 TASK 6:
INSPECT THE TEST PLAN INSPECTION CONCERNS PRODUCTS/DELIVERABLES TO
INSPECT FORMAL INSPECTION ROLES FORMAL INSPECTION DEFECT CLASSIFICATION
INSPECTION PROCEDURES CHECK PROCEDURES OUTPUT GUIDELINES SUMMARY STEP 3:
VERIFICATION TESTING OVERVIEW OBJECTIVE CONCERNS WORKBENCH INPUT THE
REQUIREMENTS PHASE THE DESIGN PHASE THE PROGRAMMING PHASE DO PROCEDURES
TASK 1: TEST DURING THE REQUIREMENTS PHASE REQUIREMENTS PHASE TEST
FACTORS PREPARING A RISK MATRIX PERFORMING A TEST FACTOR ANALYSIS
CONDUCDNG A REQUIREMENTS WALKTHROUGH PERFORMING REQUIREMENTS TRACING
ENSURING REQUIREMENTS ARE TESTABLE TASK 2: TEST DURING THE DESIGN PHASE
SCORING SUCCESS FACTORS ANALYZING TEST FACTORS CONDUCDNG A DESIGN REVIEW
INSPECTING DESIGN DELIVERABLES TASK 3: TEST DURING THE PROGRAMMING PHASE
DESK DEBUGGING THE PROGRAM. PERFORMING PROGRAMMING PHASE TEST FACTOR
ANALYSIS CONDUCTING A PEER REVIEW CHECK PROCEDURES OUTPUT GUIDELINES
SUMMARY STEP 4: VALIDATION TESTING OVERVIEW OBJECTIVE CONCERNS WORKBENCH
INPUT 254 255 256 256 258 259 262 262 262 263 291 292 293 294 294 296
296 296 297 298 298 299 302 310 312 314 315 316 316 318 320 322 323 325
326 328 330 331 331 332 409 409 410 410 410 411 XIV CONTENTS DO
PROCEDURES 412 TASK 1: BUILD THE TEST DATA 412 SOURCES OF TEST DATA/TEST
SCRIPTS 412 TESTING FILE DESIGN 413 DEFINLNG DESIGN GOALS 414 ENTERING
TEST DATA 414 APPLYING TEST FILES AGAINST PROGRAMS THAT UPDATE MASTER
RECORDS 414 CREATING AND USING TEST DATA 415 PAYROLL APPLICATION EXAMPLE
416 CREATING TEST DATA FOR STRESS/LOAD TESTING 430 CREATING TEST SCRIPTS
430 TASK 2: EXECUTE TESTS 434 TASK 3: RECORD TEST RESULTS 436
DOCUMENTING THE DEVIATION 437 DOCUMENTING THE EFFECT 438 DOCUMENTING THE
CAUSE 438 CHECK PROCEDURES 439 OUTPUT 439 GUIDELINES 439 SUMMARY 440
CHAPTER 11 STEP 5: ANALYZING AND REPORTING TEST RESULTS 459
OVERVIEW 459
CONCERNS 460
WORKBENCH 460
INPUT 461
TEST PLAN AND PROJECT PLAN 461
EXPECTED PROCESSING RESULTS 461
DATA COLLECTED DURING TESTING 461
TEST RESULTS DATA 462
TEST TRANSACTIONS, TEST SUITES, AND TEST EVENTS 462
DEFECTS 462
EFFICIENCY 463
STORING DATA COLLECTED DURING TESTING 463
DO PROCEDURES 463
TASK 1: REPORT SOFTWARE STATUS 464
ESTABLISHING A MEASUREMENT TEAM 465
CREATING AN INVENTORY OF EXISTING PROJECT MEASUREMENTS 465
DEVELOPING A CONSISTENT SET OF PROJECT METRICS 466
DEFINING PROCESS REQUIREMENTS 466
DEVELOPING AND IMPLEMENTING THE PROCESS 466
MONITORING THE PROCESS 466
TASK 2: REPORT INTERIM TEST RESULTS 470
FUNCTION/TEST MATRIX 470
FUNCTIONAL TESTING STATUS REPORT 471
FUNCTIONS WORKING TIMELINE REPORT 472
EXPECTED VERSUS ACTUAL DEFECTS UNCOVERED TIMELINE REPORT 472
DEFECTS UNCOVERED VERSUS CORRECTED GAP TIMELINE REPORT 473
AVERAGE AGE OF UNCORRECTED DEFECTS BY TYPE REPORT 475
DEFECT DISTRIBUTION REPORT 475
NORMALIZED DEFECT DISTRIBUTION REPORT 476
TESTING ACTION REPORT 477
INTERIM TEST REPORT 478
TASK 3: REPORT FINAL TEST RESULTS 478
INDIVIDUAL PROJECT TEST REPORT 480
INTEGRATION TEST REPORT 480
SYSTEM TEST REPORT 480
ACCEPTANCE TEST REPORT 482
CHECK PROCEDURES 482
OUTPUT 482
GUIDELINES 482
SUMMARY 483
CHAPTER 12 STEP 6: ACCEPTANCE AND OPERATIONAL TESTING 491
OVERVIEW 491
OBJECTIVE 492
CONCERNS 493
WORKBENCH 494
INPUT 495
PROCEDURES 496
TASK 1: ACCEPTANCE TESTING 497
DEFINING THE ACCEPTANCE CRITERIA 498
DEVELOPING AN ACCEPTANCE PLAN 499
EXECUTING THE ACCEPTANCE PLAN 500
DEVELOPING TEST CASES (USE CASES) BASED ON HOW SOFTWARE WILL BE USED 503
TASK 2: PRE-OPERATIONAL TESTING 509
TESTING NEW SOFTWARE INSTALLATION 509
TESTING THE CHANGED SOFTWARE VERSION 512
MONITORING PRODUCTION 513
DOCUMENTING PROBLEMS 513
TASK 3: POST-OPERATIONAL TESTING 514
DEVELOPING AND UPDATING THE TEST PLAN 515
DEVELOPING AND UPDATING THE TEST DATA 517
TESTING THE CONTROL CHANGE PROCESS 518
CONDUCTING TESTING 518
DEVELOPING AND UPDATING TRAINING MATERIAL
CHECK PROCEDURES
OUTPUT
IS THE AUTOMATED APPLICATION ACCEPTABLE?
AUTOMATED APPLICATION SEGMENT FAILURE NOTIFICATION
IS THE MANUAL SEGMENT ACCEPTABLE?
TRAINING FAILURE NOTIFICATION FORM
GUIDELINES
SUMMARY 525
CHAPTER 13 STEP 7: POST-IMPLEMENTATION ANALYSIS 571
OVERVIEW 571
CONCERNS 572
WORKBENCH 572
INPUT 574
DO PROCEDURES 574
TASK 1: ESTABLISH ASSESSMENT OBJECTIVES 574
TASK 2: IDENTIFY WHAT TO MEASURE 575
TASK 3: ASSIGN MEASUREMENT RESPONSIBILITY 575
TASK 4: SELECT EVALUATION APPROACH 575
TASK 5: IDENTIFY NEEDED FACTS 576
TASK 6: COLLECT EVALUATION DATA 577
TASK 7: ASSESS THE EFFECTIVENESS OF TESTING 577
USING TESTING METRICS 577
CHECK PROCEDURES 580
OUTPUT 580
GUIDELINES 581
SUMMARY 581
PART IV INCORPORATING SPECIALIZED TESTING RESPONSIBILITIES 583
CHAPTER 14 SOFTWARE DEVELOPMENT METHODOLOGIES 585
HOW MUCH TESTING IS ENOUGH? 585
SOFTWARE DEVELOPMENT METHODOLOGIES 586
OVERVIEW 586
METHODOLOGY TYPES 587
SOFTWARE DEVELOPMENT LIFE CYCLE 588
DEFINING REQUIREMENTS 592
CATEGORIES 592
ATTRIBUTES 593
METHODOLOGY MATURITY 596
COMPETENCIES REQUIRED 598
STAFF EXPERIENCE 600
CONFIGURATION-MANAGEMENT CONTROLS 600
BASIC CM REQUIREMENTS 600
PLANNING 602
DATA DISTRIBUTION AND ACCESS 602
CM ADMINISTRATION 602
CONFIGURATION IDENTIFICATION 603
CONFIGURATION CONTROL 605
MEASURING THE IMPACT OF THE SOFTWARE DEVELOPMENT PROCESS 605
SUMMARY 606
CHAPTER 15 TESTING CLIENT/SERVER SYSTEMS 611
OVERVIEW 611
CONCERNS 612
WORKBENCH 613
INPUT 614
DO PROCEDURES 614
TASK 1: ASSESS READINESS 614
SOFTWARE DEVELOPMENT PROCESS MATURITY LEVELS 615
CONDUCTING THE CLIENT/SERVER READINESS ASSESSMENT 621
PREPARING A CLIENT/SERVER READINESS FOOTPRINT CHART 621
TASK 2: ASSESS KEY COMPONENTS 622
TASK 3: ASSESS CLIENT NEEDS 622
CHECK PROCEDURES 624
OUTPUT 624
GUIDELINES 624
SUMMARY 624
CHAPTER 16 RAPID APPLICATION DEVELOPMENT TESTING 633
OVERVIEW 633
OBJECTIVE 634
CONCERNS 634
TESTING ITERATIONS 634
TESTING COMPONENTS 635
TESTING PERFORMANCE 635
RECORDING TEST INFORMATION 635
WORKBENCH 635
INPUT 636
DO PROCEDURES 636
TESTING WITHIN ITERATIVE RAD 636
SPIRAL TESTING 638
TASK 1: DETERMINE APPROPRIATENESS OF RAD 639
TASK 2: TEST PLANNING ITERATIONS 640
TASK 3: TEST SUBSEQUENT PLANNING ITERATIONS 640
TASK 4: TEST THE FINAL PLANNING ITERATION 642
CHECK PROCEDURES 642
OUTPUT 643
GUIDELINES 643
SUMMARY 643
CHAPTER 17 TESTING INTERNAL CONTROLS 655
OVERVIEW 655
INTERNAL CONTROLS 657
CONTROL OBJECTIVES 657
PREVENTIVE CONTROLS 658
SOURCE-DATA AUTHORIZATION 658
DATA INPUT 659
SOURCE-DATA PREPARATION 659
TURNAROUND DOCUMENTS 659
PRENUMBERED FORMS 659
INPUT VALIDATION 659
FILE AUTO-UPDATING 661
PROCESSING CONTROLS 661
DETECTIVE CONTROLS 662
DATA TRANSMISSION 663
CONTROL REGISTER 663
CONTROL TOTALS 664
DOCUMENTING AND TESTING 664
OUTPUT CHECKS 664
CORRECTIVE CONTROLS 665
ERROR DETECTION AND RESUBMISSION 665
AUDIT TRAILS 665
COST/BENEFIT ANALYSIS 666
ASSESSING INTERNAL CONTROLS 666
TASK 1: UNDERSTAND THE SYSTEM BEING TESTED 666
TASK 2: IDENTIFY RISKS 668
TASK 3: REVIEW APPLICATION CONTROLS 668
TASK 4: TEST APPLICATION CONTROLS 668
TESTING WITHOUT COMPUTER PROCESSING 669
TESTING WITH COMPUTER PROCESSING 669
TRANSACTION FLOW TESTING 672
OBJECTIVES OF INTERNAL ACCOUNTING CONTROLS 673
RESULTS OF TESTING 677
TASK 5: DOCUMENT CONTROL STRENGTHS AND WEAKNESSES 677
QUALITY CONTROL CHECKLIST 678
SUMMARY 678
CHAPTER 18 TESTING COTS AND CONTRACTED SOFTWARE 685
OVERVIEW 686
COTS SOFTWARE ADVANTAGES, DISADVANTAGES, AND RISKS 686
COTS VERSUS CONTRACTED SOFTWARE 686
COTS ADVANTAGES 687
COTS DISADVANTAGES 687
IMPLEMENTATION RISKS 688
TESTING COTS SOFTWARE 689
TESTING CONTRACTED SOFTWARE 690
OBJECTIVE 691
CONCERNS 691
WORKBENCH 692
INPUT 693
DO PROCEDURES 693
TASK 1: TEST BUSINESS FIT 693
STEP 1: TESTING NEEDS SPECIFICATION 693
STEP 2: TESTING CSFS 695
TASK 2: TEST OPERATIONAL FIT 696
STEP 1: TEST COMPATIBILITY 697
STEP 2: INTEGRATE THE SOFTWARE INTO EXISTING WORK FLOWS 698
STEP 3: DEMONSTRATE THE SOFTWARE IN ACTION 700
TASK 3: TEST PEOPLE FIT 701
TASK 4: ACCEPTANCE-TEST THE SOFTWARE PROCESS 702
STEP 1: CREATE FUNCTIONAL TEST CONDITIONS 702
STEP 2: CREATE STRUCTURAL TEST CONDITIONS 703
MODIFYING THE TESTING PROCESS FOR CONTRACTED SOFTWARE 704
CHECK PROCEDURES 705
OUTPUT 705
GUIDELINES 706
SUMMARY 706
CHAPTER 19 TESTING IN A MULTIPLATFORM ENVIRONMENT 717
OVERVIEW 717
OBJECTIVE 718
CONCERNS 718
BACKGROUND ON TESTING IN A MULTIPLATFORM ENVIRONMENT 718
WORKBENCH 719
INPUT 720
DO PROCEDURES 721
TASK 1: DEFINE PLATFORM CONFIGURATION CONCERNS 721
TASK 2: LIST NEEDED PLATFORM CONFIGURATIONS 723
TASK 3: ASSESS TEST ROOM CONFIGURATIONS 723
TASK 4: LIST STRUCTURAL COMPONENTS AFFECTED BY THE PLATFORM(S) 723
TASK 5: LIST INTERFACES THE PLATFORM AFFECTS 725
TASK 6: EXECUTE THE TESTS 726
CHECK PROCEDURES 726
OUTPUT 726
GUIDELINES 726
SUMMARY 727
CHAPTER 20 TESTING SOFTWARE SYSTEM SECURITY 733
OVERVIEW 733
OBJECTIVE 734
CONCERNS 734
WORKBENCH 734
INPUT 735
WHERE VULNERABILITIES OCCUR 735
FUNCTIONAL VULNERABILITIES 736
VULNERABLE AREAS 737
ACCIDENTAL VERSUS INTENTIONAL LOSSES 738
DO PROCEDURES 739
TASK 1: ESTABLISH A SECURITY BASELINE 739
WHY BASELINES ARE NECESSARY 740
CREATING BASELINES 740
USING BASELINES 749
TASK 2: BUILD A PENETRATION-POINT MATRIX 751
CONTROLLING PEOPLE BY CONTROLLING ACTIVITIES 751
SELECTING SECURITY ACTIVITIES 752
CONTROLLING BUSINESS TRANSACTIONS 755
CHARACTERISTICS OF SECURITY PENETRATION 756
BUILDING A PENETRATION-POINT MATRIX 757
TASK 3: ANALYZE THE RESULTS OF SECURITY TESTING 760
EVALUATING THE ADEQUACY OF SECURITY 761
CHECK PROCEDURES 762
OUTPUT 762
GUIDELINES 762
SUMMARY 762
CHAPTER 21 TESTING A DATA WAREHOUSE 765
OVERVIEW 765
CONCERNS 765
WORKBENCH 766
INPUT 767
DO PROCEDURES 768
TASK 1: MEASURE THE MAGNITUDE OF DATA WAREHOUSE CONCERNS 768
TASK 2: IDENTIFY DATA WAREHOUSE ACTIVITY PROCESSES TO TEST 769
ORGANIZATIONAL PROCESS 769
DATA DOCUMENTATION PROCESS 769
SYSTEM DEVELOPMENT PROCESS 770
ACCESS CONTROL PROCESS 771
DATA INTEGRITY PROCESS 771
OPERATIONS PROCESS 772
BACKUP/RECOVERY PROCESS 773
PERFORMING TASK 2 774
TASK 3: TEST THE ADEQUACY OF DATA WAREHOUSE ACTIVITY PROCESSES 774
CHECK PROCEDURES 780
OUTPUT 780
GUIDELINES 780
SUMMARY 780
CHAPTER 22 TESTING WEB-BASED SYSTEMS 799
OVERVIEW 799
CONCERNS 800
WORKBENCH 800
INPUT 801
DO PROCEDURES 802
TASK 1: SELECT WEB-BASED RISKS TO INCLUDE IN THE TEST PLAN 802
SECURITY CONCERNS 803
PERFORMANCE CONCERNS 803
CORRECTNESS CONCERNS 804
COMPATIBILITY CONCERNS 804
RELIABILITY CONCERNS 806
DATA INTEGRITY CONCERNS 806
USABILITY CONCERNS 806
RECOVERABILITY CONCERNS 807
TASK 2: SELECT WEB-BASED TESTS 807
UNIT OR COMPONENT 807
INTEGRATION 807
SYSTEM 807
USER ACCEPTANCE 808
PERFORMANCE 808
LOAD/STRESS 808
REGRESSION 808
USABILITY 808
COMPATIBILITY 808
TASK 3: SELECT WEB-BASED TEST TOOLS 809
TASK 4: TEST WEB-BASED SYSTEMS 809
CHECK PROCEDURES 809
OUTPUT 810
GUIDELINES 810
SUMMARY 811
PART V BUILDING AGILITY INTO THE TESTING PROCESS 817
CHAPTER 23 USING AGILE METHODS TO IMPROVE SOFTWARE TESTING 819
THE IMPORTANCE OF AGILITY 819
BUILDING AN AGILE TESTING PROCESS 820
AGILITY INHIBITORS 821
IS IMPROVEMENT NECESSARY? 822
COMPRESSING TIME 823
CHALLENGES 824
SOLUTIONS 825
MEASURING READINESS 826
THE SEVEN-STEP PROCESS 826
SUMMARY 827
CHAPTER 24 BUILDING AGILITY INTO THE TESTING PROCESS 831
STEP 1: MEASURE SOFTWARE PROCESS VARIABILITY 831
TIMELINES 832
PROCESS STEPS 833
WORKBENCHES 833
TIME-COMPRESSION WORKBENCHES 834
REDUCING VARIABILITY 835
DEVELOPING TIMELINES 836
IMPROVEMENT SHOPPING LIST 841
QUALITY CONTROL CHECKLIST 841
CONCLUSION 842
STEP 2: MAXIMIZE BEST PRACTICES 842
TESTER AGILITY 842
SOFTWARE TESTING RELATIONSHIPS 843
TRADEOFFS 845
CAPABILITY CHART 847
MEASURING EFFECTIVENESS AND EFFICIENCY 848
IMPROVEMENT SHOPPING LIST 856
QUALITY CONTROL CHECKLIST 856
CONCLUSION 857
STEP 3: BUILD ON STRENGTH, MINIMIZE WEAKNESS 857
EFFECTIVE TESTING PROCESSES 857
POOR TESTING PROCESSES 860
IMPROVEMENT SHOPPING LIST 860
QUALITY CONTROL CHECKLIST 860
CONCLUSION 861
STEP 4: IDENTIFY AND ADDRESS IMPROVEMENT BARRIERS 861
THE STAKEHOLDER PERSPECTIVE 861
STAKEHOLDER INVOLVEMENT 863
PERFORMING STAKEHOLDER ANALYSIS 863
RED-FLAG/HOT-BUTTON BARRIERS 864
STAFF-COMPETENCY BARRIERS 865
ADMINISTRATIVE/ORGANIZATIONAL BARRIERS 865
DETERMINING THE ROOT CAUSE OF BARRIERS/OBSTACLES 866
ADDRESSING THE ROOT CAUSE OF BARRIERS/OBSTACLES 867
QUALITY CONTROL CHECKLIST 869
CONCLUSION 869
STEP 5: IDENTIFY AND ADDRESS CULTURAL AND COMMUNICATION BARRIERS 869
MANAGEMENT CULTURES 870
CULTURE 1: MANAGE PEOPLE 871
CULTURE 2: MANAGE BY PROCESS 873
CULTURE 3: MANAGE COMPETENCIES 874
CULTURE 4: MANAGE BY FACT 876
CULTURE 5: MANAGE BUSINESS INNOVATION 878
CULTURAL BARRIERS 879
IDENTIFYING THE CURRENT MANAGEMENT CULTURE 879
IDENTIFYING THE BARRIERS POSED BY THE CULTURE 879
DETERMINING WHAT CAN BE DONE IN THE CURRENT CULTURE 879
DETERMINING THE DESIRED CULTURE FOR TIME COMPRESSION 879
DETERMINING HOW TO ADDRESS CULTURE BARRIERS 880
OPEN AND EFFECTIVE COMMUNICATION 880
LINES OF COMMUNICATION 881
INFORMATION/COMMUNICATION BARRIERS 882
EFFECTIVE COMMUNICATION 882
QUALITY CONTROL CHECKLIST 884
CONCLUSION 885
STEP 6: IDENTIFY IMPLEMENTABLE IMPROVEMENTS 885
WHAT IS AN IMPLEMENTABLE? 885
IDENTIFYING IMPLEMENTABLES VIA TIME COMPRESSION 886
PRIORITIZING IMPLEMENTABLES 888
DOCUMENTING APPROACHES 890
QUALITY CONTROL CHECKLIST 890
CONCLUSION 890
STEP 7: DEVELOP AND EXECUTE AN IMPLEMENTATION PLAN 891
PLANNING 891
IMPLEMENTING IDEAS 891
REQUISITE RESOURCES 893
QUALITY CONTROL CHECKLIST 894
CONCLUSION 894
SUMMARY 895
INDEX 929 |
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Perry, William E. |
author_facet | Perry, William E. |
author_role | aut |
author_sort | Perry, William E. |
author_variant | w e p we wep |
building | Verbundindex |
bvnumber | BV021620457 |
callnumber-first | Q - Science |
callnumber-label | QA76 |
callnumber-raw | QA76.76.T48 |
callnumber-search | QA76.76.T48 |
callnumber-sort | QA 276.76 T48 |
callnumber-subject | QA - Mathematics |
classification_rvk | ST 233 |
ctrlnum | (OCoLC)62732602 (DE-599)BVBBV021620457 |
dewey-full | 005.1/4 |
dewey-hundreds | 000 - Computer science, information, general works |
dewey-ones | 005 - Computer programming, programs, data, security |
dewey-raw | 005.1/4 |
dewey-search | 005.1/4 |
dewey-sort | 15.1 14 |
dewey-tens | 000 - Computer science, information, general works |
discipline | Informatik |
discipline_str_mv | Informatik |
edition | 3. ed. |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>02132nam a2200529zc 4500</leader><controlfield tag="001">BV021620457</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20120220 </controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">060619s2006 xxud||| |||| 00||| eng d</controlfield><datafield tag="010" ind1=" " ind2=" "><subfield code="a">2005036216</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">0764598376</subfield><subfield code="c">cloth/cdrom</subfield><subfield code="9">0-7645-9837-6</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9780764598371</subfield><subfield code="9">978-0-7645-9837-1</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)62732602</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV021620457</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">aacr</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">xxu</subfield><subfield code="c">US</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-703</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">QA76.76.T48</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield code="a">005.1/4</subfield><subfield code="2">22</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 233</subfield><subfield code="0">(DE-625)143620:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Perry, William E.</subfield><subfield code="e">Verfasser</subfield><subfield 
code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Effective methods for software testing</subfield><subfield code="c">William E. Perry</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">3. ed.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Indianapolis, IN</subfield><subfield code="b">Wiley</subfield><subfield code="c">2006</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">XXVII, 973 S.</subfield><subfield code="b">graph. Darst.</subfield><subfield code="e">1 CD-ROM (12 cm)</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1="3" ind2=" "><subfield code="a">Accompanying CD-ROM contains ... "work papers and quality control checklists your organization needs to implement an effective software testing process."--P. 
[4] of cover.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Logiciels - Essais</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Computer software</subfield><subfield code="x">Testing</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Testen</subfield><subfield code="0">(DE-588)4367264-4</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Anwendungssoftware</subfield><subfield code="0">(DE-588)4120906-0</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Softwaretest</subfield><subfield code="0">(DE-588)4132652-0</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="655" ind1=" " ind2="7"><subfield code="8">1\p</subfield><subfield code="0">(DE-588)4398750-3</subfield><subfield code="a">Checkliste</subfield><subfield code="2">gnd-content</subfield></datafield><datafield tag="655" ind1=" " ind2="4"><subfield code="a">CD-ROMs</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Softwaretest</subfield><subfield code="0">(DE-588)4132652-0</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="689" ind1="1" ind2="0"><subfield code="a">Anwendungssoftware</subfield><subfield code="0">(DE-588)4120906-0</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2="1"><subfield code="a">Testen</subfield><subfield code="0">(DE-588)4367264-4</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2=" "><subfield code="8">2\p</subfield><subfield code="5">DE-604</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">GBV 
Datenaustausch</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014835529&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="999" ind1=" " ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-014835529</subfield></datafield><datafield tag="883" ind1="1" ind2=" "><subfield code="8">1\p</subfield><subfield code="a">cgwrk</subfield><subfield code="d">20201028</subfield><subfield code="q">DE-101</subfield><subfield code="u">https://d-nb.info/provenance/plan#cgwrk</subfield></datafield><datafield tag="883" ind1="1" ind2=" "><subfield code="8">2\p</subfield><subfield code="a">cgwrk</subfield><subfield code="d">20201028</subfield><subfield code="q">DE-101</subfield><subfield code="u">https://d-nb.info/provenance/plan#cgwrk</subfield></datafield></record></collection> |
genre | 1\p (DE-588)4398750-3 Checkliste gnd-content CD-ROMs |
genre_facet | Checkliste CD-ROMs |
id | DE-604.BV021620457 |
illustrated | Illustrated |
index_date | 2024-07-02T14:53:31Z |
indexdate | 2024-07-09T20:40:05Z |
institution | BVB |
isbn | 0764598376 9780764598371 |
language | English |
lccn | 2005036216 |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-014835529 |
oclc_num | 62732602 |
open_access_boolean | |
owner | DE-703 |
owner_facet | DE-703 |
physical | XXVII, 973 S. graph. Darst. 1 CD-ROM (12 cm) |
publishDate | 2006 |
publishDateSearch | 2006 |
publishDateSort | 2006 |
publisher | Wiley |
record_format | marc |
spelling | Perry, William E. Verfasser aut Effective methods for software testing William E. Perry 3. ed. Indianapolis, IN Wiley 2006 XXVII, 973 S. graph. Darst. 1 CD-ROM (12 cm) txt rdacontent n rdamedia nc rdacarrier Accompanying CD-ROM contains ... "work papers and quality control checklists your organization needs to implement an effective software testing process."--P. [4] of cover. Logiciels - Essais Computer software Testing Testen (DE-588)4367264-4 gnd rswk-swf Anwendungssoftware (DE-588)4120906-0 gnd rswk-swf Softwaretest (DE-588)4132652-0 gnd rswk-swf 1\p (DE-588)4398750-3 Checkliste gnd-content CD-ROMs Softwaretest (DE-588)4132652-0 s DE-604 Anwendungssoftware (DE-588)4120906-0 s Testen (DE-588)4367264-4 s 2\p DE-604 GBV Datenaustausch application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014835529&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis 1\p cgwrk 20201028 DE-101 https://d-nb.info/provenance/plan#cgwrk 2\p cgwrk 20201028 DE-101 https://d-nb.info/provenance/plan#cgwrk |
spellingShingle | Perry, William E. Effective methods for software testing Logiciels - Essais Computer software Testing Testen (DE-588)4367264-4 gnd Anwendungssoftware (DE-588)4120906-0 gnd Softwaretest (DE-588)4132652-0 gnd |
subject_GND | (DE-588)4367264-4 (DE-588)4120906-0 (DE-588)4132652-0 (DE-588)4398750-3 |
title | Effective methods for software testing |
title_auth | Effective methods for software testing |
title_exact_search | Effective methods for software testing |
title_exact_search_txtP | Effective methods for software testing |
title_full | Effective methods for software testing William E. Perry |
title_fullStr | Effective methods for software testing William E. Perry |
title_full_unstemmed | Effective methods for software testing William E. Perry |
title_short | Effective methods for software testing |
title_sort | effective methods for software testing |
topic | Logiciels - Essais Computer software Testing Testen (DE-588)4367264-4 gnd Anwendungssoftware (DE-588)4120906-0 gnd Softwaretest (DE-588)4132652-0 gnd |
topic_facet | Logiciels - Essais Computer software Testing Testen Anwendungssoftware Softwaretest Checkliste CD-ROMs |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014835529&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT perrywilliame effectivemethodsforsoftwaretesting |