Unable to cast object of type "Microsoft.Data.Entity.Build.Tasks.EntityDeploy" to type "Microsoft.Build.Framework.ITask"

pxesoft 2023-02-20 09:01:27

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace xx
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Beginner C# test");
        }
    }
}
 

Error 1: The "EntityDeploy" task could not be instantiated from the assembly "C:\Windows\Microsoft.NET\Framework\v3.5\Microsoft.Data.Entity.Build.Tasks.dll". Please verify the task assembly has been built using the same version of the Microsoft.Build.Framework assembly as the one installed on your computer and that your host application is not missing a binding redirect for Microsoft.Build.Framework. Unable to cast object of type "Microsoft.Data.Entity.Build.Tasks.EntityDeploy" to type "Microsoft.Build.Framework.ITask".

Error 2: The "EntityDeploy" task has been declared or used incorrectly, or failed during construction. Check the spelling of the task name and the assembly name.
It builds and runs fine in VS2017, but VS2008 reports these errors. Could anyone help me work out how to resolve these two errors? Thanks!
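For context, the first error message itself points at one possible fix: adding a binding redirect for Microsoft.Build.Framework to the configuration file of whatever host runs the build. Below is a minimal sketch of such a redirect; the file it belongs in (for example devenv.exe.config or MSBuild.exe.config) and the 3.5.0.0 target version are assumptions that depend on the installation, so treat it as illustrative rather than a confirmed fix:

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Hypothetical redirect: map any requested version of
           Microsoft.Build.Framework onto the MSBuild 3.5 assembly. -->
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.Build.Framework"
                          publicKeyToken="b03f5f7f11d50a3a"
                          culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-99.9.9.9"
                         newVersion="3.5.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

The opposite redirect (onto the newer Microsoft.Build.Framework that VS2017 installs) may be what is actually needed; the sketch only shows the mechanism the error message refers to.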

Software Testing and Continuous Quality Improvement

SECTION I SOFTWARE QUALITY IN PERSPECTIVE
1 Quality Assurance Framework
What Is Quality?
Prevention versus Detection
Verification versus Validation
Software Quality Assurance
Components of Quality Assurance
Software Testing
Quality Control
Software Configuration Management
Elements of Software Configuration Management
Component Identification
Version Control
Configuration Building
Change Control
Software Quality Assurance Plan
Steps to Develop and Implement a Software Quality Assurance Plan
Step 1. Document the Plan
Step 2. Obtain Management Acceptance
Step 3. Obtain Development Acceptance
Step 4. Plan for Implementation of the SQA Plan
Step 5. Execute the SQA Plan
Quality Standards
ISO9000
Capability Maturity Model (CMM)
Level 1 — Initial
Level 2 — Repeatable
Level 3 — Defined
Level 4 — Managed
Level 5 — Optimized
PCMM
CMMI
Malcolm Baldrige National Quality Award
Notes
2 Overview of Testing Techniques
Black-Box Testing (Functional)
White-Box Testing (Structural)
Gray-Box Testing (Functional and Structural)
Manual versus Automated Testing
Static versus Dynamic Testing
Taxonomy of Software Testing Techniques
3 Quality through Continuous Improvement Process
Contribution of Edward Deming
Role of Statistical Methods
Cause-and-Effect Diagram
Flow Chart
Pareto Chart
Run Chart
Histogram
Scatter Diagram
Control Chart
Deming’s 14 Quality Principles
Point 1: Create Constancy of Purpose
Point 2: Adopt the New Philosophy
Point 3: Cease Dependence on Mass Inspection
Point 4: End the Practice of Awarding Business on Price Tag Alone
Point 5: Improve Constantly and Forever the System of Production and Service
Point 6: Institute Training and Retraining
Point 7: Institute Leadership
Point 8: Drive Out Fear
Point 9: Break Down Barriers between Staff Areas
Point 10: Eliminate Slogans, Exhortations, and Targets for the Workforce
Point 11: Eliminate Numerical Goals
Point 12: Remove Barriers to Pride of Workmanship
Point 13: Institute a Vigorous Program of Education and Retraining
Point 14: Take Action to Accomplish the Transformation
Continuous Improvement through the Plan, Do, Check, Act Process
Going around the PDCA Circle
SECTION II LIFE CYCLE TESTING REVIEW
4 Overview
Waterfall Development Methodology
Continuous Improvement “Phased” Approach
Psychology of Life Cycle Testing
Software Testing as a Continuous Improvement Process
The Testing Bible: Software Test Plan
Major Steps to Develop a Test Plan
1. Define the Test Objectives
2. Develop the Test Approach
3. Define the Test Environment
4. Develop the Test Specifications
5. Schedule the Test
6. Review and Approve the Test Plan
Components of a Test Plan
Technical Reviews as a Continuous Improvement Process
Motivation for Technical Reviews
Types of Reviews
Structured Walkthroughs
Inspections
Participant Roles
Steps for an Effective Review
1. Plan for the Review Process
2. Schedule the Review
3. Develop the Review Agenda
4. Create a Review Report
5 Verifying the Requirements Phase
Testing the Requirements with Technical Reviews
Inspections and Walkthroughs
Checklists
Methodology Checklist
Requirements Traceability Matrix
Building the System/Acceptance Test Plan
6 Verifying the Logical Design Phase
Data Model, Process Model, and the Linkage
Testing the Logical Design with Technical Reviews
Refining the System/Acceptance Test Plan
7 Verifying the Physical Design Phase
Testing the Physical Design with Technical Reviews
Creating Integration Test Cases
Methodology for Integration Testing
Step 1: Identify Unit Interfaces
Step 2: Reconcile Interfaces for Completeness
Step 3: Create Integration Test Conditions
Step 4: Evaluate the Completeness of Integration Test Conditions
8 Verifying the Program Unit Design Phase
Testing the Program Unit Design with Technical Reviews
Sequence
Selection
Iteration
Creating Unit Test Cases
9 Verifying the Coding Phase
Testing Coding with Technical Reviews
Executing the Test Plan
Unit Testing
Integration Testing
System Testing
Acceptance Testing
Defect Recording
SECTION III SOFTWARE TESTING METHODOLOGY
10 Development Methodology Overview
Limitations of Life Cycle Development
The Client/Server Challenge
Psychology of Client/Server Spiral Testing
The New School of Thought
Tester/Developer Perceptions
Project Goal: Integrate QA and Development
Iterative/Spiral Development Methodology
Role of JADs
Role of Prototyping
Methodology for Developing Prototypes
1. Develop the Prototype
2. Demonstrate Prototypes to Management
3. Demonstrate Prototype to Users
4. Revise and Finalize Specifications
5. Develop the Production System
Continuous Improvement “Spiral” Testing Approach
11 Information Gathering (Plan)
Step 1: Prepare for the Interview
Task 1: Identify the Participants
Task 2: Define the Agenda
Step 2: Conduct the Interview
Task 1: Understand the Project
Task 2: Understand the Project Objectives
Task 3: Understand the Project Status
Task 4: Understand the Project Plans
Task 5: Understand the Project Development Methodology
Task 6: Identify the High-Level Business Requirements
Task 7: Perform Risk Analysis
Computer Risk Analysis
Method 1 — Judgment and Instinct
Method 2 — Dollar Estimation
Method 3 — Identifying and Weighting Risk Attributes
Step 3: Summarize the Findings
Task 1: Summarize the Interview
Task 2: Confirm the Interview Findings
12 Test Planning (Plan)
Step 1: Build a Test Plan
Task 1: Prepare an Introduction
Task 2: Define the High-Level Functional Requirements (in Scope)
Task 3: Identify Manual/Automated Test Types
Task 4: Identify the Test Exit Criteria
Task 5: Establish Regression Test Strategy
Task 6: Define the Test Deliverables
Task 7: Organize the Test Team
Task 8: Establish a Test Environment
Task 9: Define the Dependencies
Task 10: Create a Test Schedule
Task 11: Select the Test Tools
Task 12: Establish Defect Recording/Tracking Procedures
Task 13: Establish Change Request Procedures
Task 14: Establish Version Control Procedures
Task 15: Define Configuration Build Procedures
Task 16: Define Project Issue Resolution Procedures
Task 17: Establish Reporting Procedures
Task 18: Define Approval Procedures
Step 2: Define the Metric Objectives
Task 1: Define the Metrics
Task 2: Define the Metric Points
Step 3: Review/Approve the Plan
Task 1: Schedule/Conduct the Review
Task 2: Obtain Approvals
13 Test Case Design (Do)
Step 1: Design Function Tests
Task 1: Refine the Functional Test Requirements
Task 2: Build a Function/Test Matrix
Step 2: Design GUI Tests
Ten Guidelines for Good GUI Design
Task 1: Identify the Application GUI Components
Task 2: Define the GUI Tests
Step 3: Define the System/Acceptance Tests
Task 1: Identify Potential System Tests
Task 2: Design System Fragment Tests
Task 3: Identify Potential Acceptance Tests
Step 4: Review/Approve Design
Task 1: Schedule/Prepare for Review
Task 2: Obtain Approvals
14 Test Development (Do)
Step 1: Develop Test Scripts
Task 1: Script the Manual/Automated GUI/Function Tests
Task 2: Script the Manual/Automated System Fragment Tests
Step 2: Review/Approve Test Development
Task 1: Schedule/Prepare for Review
Task 2: Obtain Approvals
15 Test Coverage through Traceability
Use Cases and Traceability
Summary
16 Test Execution/Evaluation (Do/Check)
Step 1: Setup and Testing
Task 1: Regression Test the Manual/Automated Spiral Fixes
Task 2: Execute the Manual/Automated New Spiral Tests
Task 3: Document the Spiral Test Defects
Step 2: Evaluation
Task 1: Analyze the Metrics
Step 3: Publish Interim Report
Task 1: Refine the Test Schedule
Task 2: Identify Requirement Changes
17 Prepare for the Next Spiral (Act)
Step 1: Refine the Tests
Task 1: Update the Function/GUI Tests
Task 2: Update the System Fragment Tests
Task 3: Update the Acceptance Tests
Step 2: Reassess the Team, Procedures, and Test Environment
Task 1: Evaluate the Test Team
Task 2: Review the Test Control Procedures
Task 3: Update the Test Environment
Step 3: Publish Interim Test Report
Task 1: Publish the Metric Graphics
Test Case Execution Status
Defect Gap Analysis
Defect Severity Status
Test Burnout Tracking
18 Conduct the System Test
Step 1: Complete System Test Plan
Task 1: Finalize the System Test Types
Task 2: Finalize System Test Schedule
Task 3: Organize the System Test Team
Task 4: Establish the System Test Environment
Task 5: Install the System Test Tools
Step 2: Complete System Test Cases
Task 1: Design/Script the Performance Tests
Monitoring Approach
Probe Approach
Test Drivers
Task 2: Design/Script the Security Tests
A Security Design Strategy
Task 3: Design/Script the Volume Tests
Task 4: Design/Script the Stress Tests
Task 5: Design/Script the Compatibility Tests
Task 6: Design/Script the Conversion Tests
Task 7: Design/Script the Usability Tests
Task 8: Design/Script the Documentation Tests
Task 9: Design/Script the Backup Tests
Task 10: Design/Script the Recovery Tests
Task 11: Design/Script the Installation Tests
Task 12: Design/Script Other System Test Types
Step 3: Review/Approve System Tests
Task 1: Schedule/Conduct the Review
Task 2: Obtain Approvals
Step 4: Execute the System Tests
Task 1: Regression Test the System Fixes
Task 2: Execute the New System Tests
Task 3: Document the System Defects
19 Conduct Acceptance Testing
Step 1: Complete Acceptance Test Planning
Task 1: Finalize the Acceptance Test Types
Task 2: Finalize the Acceptance Test Schedule
Task 3: Organize the Acceptance Test Team
Task 4: Establish the Acceptance Test Environment
Task 5: Install Acceptance Test Tools
Step 2: Complete Acceptance Test Cases
Task 1: Subset the System-Level Test Cases
Task 2: Design/Script Additional Acceptance Tests
Step 3: Review/Approve Acceptance Test Plan
Task 1: Schedule/Conduct the Review
Task 2: Obtain Approvals
Step 4: Execute the Acceptance Tests
Task 1: Regression Test the Acceptance Fixes
Task 2: Execute the New Acceptance Tests
Task 3: Document the Acceptance Defects
20 Summarize/Report Spiral Test Results
Step 1: Perform Data Reduction
Task 1: Ensure All Tests Were Executed/Resolved
Task 2: Consolidate Test Defects by Test Number
Task 3: Post Remaining Defects to a Matrix
Step 2: Prepare Final Test Report
Task 1: Prepare the Project Overview
Task 2: Summarize the Test Activities
Task 3: Analyze/Create Metric Graphics
Defects by Function
Defects by Tester
Defect Gap Analysis
Defect Severity Status
Test Burnout Tracking
Root Cause Analysis
Defects by How Found
Defects by Who Found
Functions Tested and Not
System Testing Defect Types
Acceptance Testing Defect Types
Task 4: Develop Findings/Recommendations
Step 3: Review/Approve the Final Test Report
Task 1: Schedule/Conduct the Review
Task 2: Obtain Approvals
Task 3: Publish the Final Test Report
SECTION IV TEST PROJECT MANAGEMENT
21 Overview of General Project Management
Define the Objectives
Define the Scope of the Project
Identify the Key Activities
Estimate Correctly
Design
Manage People
Leadership
Communication
Solving Problems
Continuous Monitoring
Manage Changes
22 Test Project Management
Understand the Requirements
Test Planning
Test Execution
Identify and Improve Processes
Essential Characteristics of a Test Project Manager
Requirement Analysis
Gap Analysis
Lateral Thinking in Developing Test Cases
Avoid Duplication and Repetition
Test Data Generation
Validate the Test Environment
Test to Destroy
Analyze the Test Results
Do Not Hesitate to Accept Help from Others
Convey Issues as They Arise
Improve Communication
Always Keep Updating Your Business Knowledge
Learn the New Testing Technologies and Tools
Deliver Quality
Improve the Process
Create a Knowledge Base
Repeat the Success
23 Test Estimation
Finish-to-Start (FS)
Start-to-Start (SS)
Finish-to-Finish (FF)
Start-to-Finish (SF)
Critical Activities for Test Estimation
Test Scope Document
Test Strategy
Test Condition
Test Case
Test Script
Execution/Run Plan
Factors Affecting Test Estimation
Test Planning Estimation
Test Execution and Controlling Effort
Test Result Analysis
Effort Estimation — Model Project
24 Defect Monitoring and Management Process
Defect Reporting
Defect Meetings
Defect Classifications
Defect Priority
Defect Category
Defect Metrics
25 Integrating Testing into Development Methodology
Step 1. Organize the Test Team
Step 2. Identify Test Steps and Tasks to Integrate
Step 3. Customize Test Steps and Tasks
Step 4. Select Integration Points
Step 5. Modify the Development Methodology
Step 6. Incorporate Defect Recording
Step 7. Train in Use of the Test Methodology
26 On-Site/Offshore Model
Step 1: Analysis
Step 2: Determine the Economic Tradeoffs
Step 3: Determine the Selection Criteria
Project Management and Monitoring
Outsourcing Methodology
On-Site Activities
Offshore Activities
Implementing the On-Site/Offshore Model
Knowledge Transfer
Detailed Design
Milestone-Based Transfer
Steady State
Application Management
Relationship Model
Standards
Benefits of On-Site/Offshore Methodology
On-Site/Offshore Model Challenges
Out of Sight
Establish Transparency
Security Considerations
Project Monitoring
Management Overhead
Cultural Differences
Software Licensing
The Future of Onshore/Offshore
SECTION V MODERN SOFTWARE TESTING TOOLS
27 A Brief History of Software Testing
Evolution of Automated Testing Tools
Static Capture/Replay Tools (without Scripting Language)
Static Capture/Replay Tools (with Scripting Language)
Variable Capture/Replay Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
Functional Decomposition Approach . . . . . . . . . . . . . . . . . . . 295
Test Plan Driven (“Keyword”) Approach. . . . . . . . . . . . . . . . . 296
Historical Software Testing and Development Parallels. . . . . . . . . 298
Extreme Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 299
28 Software Testing Trends. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 301
Automated Capture/Replay Testing Tools . . . . . . . . . . . . . . . . . . . . 301
Test Case Builder Tools. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302
Advanced Leading-Edge Automated Testing Tools. . . . . . . . . . . . . 302
Advanced Leading-Edge Test Case Builder Tools . . . . . . . . . . . . . . 304
Necessary and Sufficient Conditions. . . . . . . . . . . . . . . . . . . . . . . . . 304
Test Data/Test Case Generation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
Sampling from Production . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
Starting from Scratch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 306
Seeding the Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 306
Generating Data Based upon the Database . . . . . . . . . . . . . . . . . 307
Generating Test Data/Test Cases Based upon the Requirements . . . . . 308
29 Taxonomy of Testing Tools. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
Testing Tool Selection Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
Vendor Tool Descriptions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
When You Should Consider Test Automation . . . . . . . . . . . . . . . . . 312
When You Should NOT Consider Test Automation. . . . . . . . . . . . . 320
30 Methodology to Evaluate Automated Testing Tools . . . . . . . . . . 323
Step 1: Define Your Test Requirements. . . . . . . . . . . . . . . . . . . . . . . 323
Step 2: Set Tool Objectives. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 323
Step 3a: Conduct Selection Activities for Informal Procurement . . . . . 324
Task 1: Develop the Acquisition Plan . . . . . . . . . . . . . . . . . . . . . . 324
Task 2: Define Selection Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . 324
Task 3: Identify Candidate Tools . . . . . . . . . . . . . . . . . . . . . . . . . . 324
Task 4: Conduct the Candidate Review . . . . . . . . . . . . . . . . . . . . 325
Task 5: Score the Candidates. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325
Task 6: Select the Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325
Step 3b: Conduct Selection Activities for Formal Procurement . . . . . 326
Task 1: Develop the Acquisition Plan . . . . . . . . . . . . . . . . . . . . . . 326
Task 2: Create the Technical Requirements Document . . . . . . . 326
Task 3: Review Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
Task 4: Generate the Request for Proposal . . . . . . . . . . . . . . . . . 326
Task 5: Solicit Proposals. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
Task 6: Perform the Technical Evaluation . . . . . . . . . . . . . . . . . . 327
Task 7: Select a Tool Source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 327
Step 4: Procure the Testing Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . 327
Step 5: Create the Evaluation Plan. . . . . . . . . . . . . . . . . . . . . . . . . . . 327
Step 6: Create the Tool Manager’s Plan. . . . . . . . . . . . . . . . . . . . . . . 328
Step 7: Create the Training Plan. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 328
Step 8: Receive the Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 328
Step 9: Perform the Acceptance Test. . . . . . . . . . . . . . . . . . . . . . . . . 329
Step 10: Conduct Orientation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
Step 11: Implement Modifications . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
Step 12: Train Tool Users. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
Step 13: Use the Tool in the Operating Environment. . . . . . . . . . . . 330
Step 14: Write the Evaluation Report. . . . . . . . . . . . . . . . . . . . . . . . . 330
Step 15: Determine Whether Goals Have Been Met. . . . . . . . . . . . . 330
APPENDICES. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 331
A Spiral Testing Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 333
B Software Quality Assurance Plan. . . . . . . . . . . . . . . . . . . . . . . . . 343
C Requirements Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 345
D Change Request Form. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 347
E Test Templates. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
E1: Unit Test Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
E2: System/Acceptance Test Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
E3: Requirements Traceability Matrix. . . . . . . . . . . . . . . . . . . . . . . . 351
E4: Test Plan (Client/Server and Internet Spiral Testing). . . . . . . . 353
E5: Function/Test Matrix. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 355
E6: GUI Component Test Matrix (Client/Server and Internet Spiral Testing) . . . . . 355
E7: GUI-Based Functional Test Matrix (Client/Server and Internet Spiral Testing) . . . . . 356
E8: Test Case. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357
E9: Test Case Log . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357
E10: Test Log Summary Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
E11: System Summary Report. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 361
E12: Defect Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
E13: Test Schedule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
E14: Retest Matrix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 366
E15: Spiral Testing Summary Report (Client/Server and Internet Spiral Testing) . . . . . 368
E16: Minutes of the Meeting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 368
E17: Test Approvals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 370
E18: Test Execution Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
E19: Test Project Milestones. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 372
E20: PDCA Test Schedule. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 373
E21: Test Strategy. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 374
E22: Clarification Request. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 377
E23: Screen Data Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 378
E24: Test Condition versus Test Case . . . . . . . . . . . . . . . . . . . . . . . . 379
E25: Project Status Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 380
E26: Test Defect Details Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
E27: Defect Report. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 383
E28: Test Execution Tracking Manager. . . . . . . . . . . . . . . . . . . . . . . 383
E29: Final Test Summary Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . 384
F Checklists. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
F1: Requirements Phase Defect Checklist. . . . . . . . . . . . . . . . . . . . . 388
F2: Logical Design Phase Defect Checklist . . . . . . . . . . . . . . . . . . . . 389
F3: Physical Design Phase Defect Checklist . . . . . . . . . . . . . . . . . . . 390
F4: Program Unit Design Phase Defect Checklist. . . . . . . . . . . . . . . 393
F5: Coding Phase Defect Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . . 394
F6: Field Testing Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
F7: Record Testing Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
F8: File Test Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 400
F9: Error Testing Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 401
F10: Use Test Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 403
F11: Search Test Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
F12: Match/Merge Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 405
F13: Stress Test Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
F14: Attributes Testing Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . . . 409
F15: States Testing Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 411
F16: Procedures Testing Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . . 412
F17: Control Testing Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 413
F18: Control Flow Testing Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . 418
F19: Testing Tool Selection Checklist . . . . . . . . . . . . . . . . . . . . . . . . 419
F20: Project Information Gathering Checklist . . . . . . . . . . . . . . . . . 421
F21: Impact Analysis Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 423
F22: Environment Readiness Checklist. . . . . . . . . . . . . . . . . . . . . . . 425
F23: Project Completion Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . . 427
F24: Unit Testing Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 429
F25: Ambiguity Review Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . . . 433
F26: Architecture Review Checklist. . . . . . . . . . . . . . . . . . . . . . . . . . 435
F27: Data Design Review Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . 436
F28: Functional Specification Review Checklist. . . . . . . . . . . . . . . . 437
F29: Prototype Review Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . 442
F30: Requirements Review Checklist. . . . . . . . . . . . . . . . . . . . . . . . . 443
F31: Technical Design Review Checklist . . . . . . . . . . . . . . . . . . . . . . 447
F32: Test Case Preparation Review Checklist. . . . . . . . . . . . . . . . . . 449
G Software Testing Techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . 451
G1: Basis Path Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 451
PROGRAM: FIELD-COUNT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 451
G2: Black-Box Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 452
Extra Program Logic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 453
G3: Bottom-Up Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 453
G4: Boundary Value Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 453
Numeric Input Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Field Ranges. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Numeric Output Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Output Range of Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Nonnumeric Input Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Tables or Arrays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Number of Items . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Nonnumeric Output Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Tables or Arrays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
Number of Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
GUI. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
G5: Branch Coverage Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 455
PROGRAM: FIELD-COUNT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 455
G6: Branch/Condition Coverage Testing. . . . . . . . . . . . . . . . . . . . . . 455
PROGRAM: FIELD-COUNT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 456
G7: Cause-Effect Graphing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 456
Cause-Effect Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 457
Specification. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 458
Causes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 458
Effects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 458
G8: Condition Coverage. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 460
PROGRAM: FIELD-COUNT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 460
G9: CRUD Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 461
G10: Database Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 461
Integrity Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 461
Entity Integrity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462
Primary Key Integrity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462
Column Key Integrity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462
Domain Integrity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463
User-Defined Integrity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463
Referential Integrity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463
Data Modeling Essentials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 464
What Is a Model?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 465
Why Do We Create Models?. . . . . . . . . . . . . . . . . . . . . . . . . . . . 465
Tables — A Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 466
Table Names . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 467
Columns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 467
Rows. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 467
Order. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 467
Entities — A Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 467
Identification — Primary Key . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
Compound Primary Keys. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
Null Values. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
Identifying Entities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 469
Entity Classes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 469
Relationships — A Definition. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 470
Relationship Types. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 470
One-to-One. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 470
One-to-Many . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 472
Many-to-Many . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 473
Multiple Relationships. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 475
Entities versus Relationships . . . . . . . . . . . . . . . . . . . . . . . . . . 475
Attributes — A Definition. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 476
Domain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 477
Domain Names. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 478
Attributes versus Relationships. . . . . . . . . . . . . . . . . . . . . . . . 478
Normalization — What Is It? . . . . . . . . . . . . . . . . . . . . . . . . . . . 479
Problems of Unnormalized Entities . . . . . . . . . . . . . . . . . . . . . 479
Steps in Normalization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 480
First Normal Form (1NF) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 480
Second Normal Form (2NF). . . . . . . . . . . . . . . . . . . . . . . . . . . . 482
Third Normal Form (3NF) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 484
Model Refinement. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 485
Entity Subtypes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 486
A Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 486
Referential Integrity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 486
Dependency Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 488
Constraint Rule. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 488
Recursion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 489
Using the Model in Database Design . . . . . . . . . . . . . . . . . . . . 491
Relational Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 491
G11: Decision Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 492
PROGRAM: FIELD-COUNT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 492
G12: Desk Checking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 493
G13: Equivalence Partitioning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 493
Numeric Input Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Field Ranges. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Numeric Output Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Output Range of Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Nonnumeric Input Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Tables or Arrays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Number of Items. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Nonnumeric Output Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Tables or Arrays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
Number of Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
G14: Exception Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
G15: Free Form Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
G16: Gray-Box Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 495
G17: Histograms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 496
G18: Inspections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 496
G19: JADs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 497
G20: Orthogonal Array Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 498
G21: Pareto Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 499
G22: Positive and Negative Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 501
G23: Prior Defect History Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . 502
G24: Prototyping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 502
Cyclic Models. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 502
Fourth-Generation Languages and Prototyping. . . . . . . . . . . . . . 503
Iterative Development Accounting . . . . . . . . . . . . . . . . . . . . . . . . 504
Evolutionary and Throwaway . . . . . . . . . . . . . . . . . . . . . . . . . . . . 504
Application Prototyping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 505
Prototype Systems Development . . . . . . . . . . . . . . . . . . . . . . . . . 505
Data-Driven Prototyping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 505
Replacement of the Traditional Life Cycle. . . . . . . . . . . . . . . . . . 506
Early-Stage Prototyping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 506
User Software Engineering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 507
G25: Random Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 507
G26: Range Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 507
G27: Regression Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 509
G28: Risk-Based Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 509
G29: Run Charts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 510
G30: Sandwich Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 510
G31: Statement Coverage Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . 511
PROGRAM: FIELD-COUNT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 511
G32: State Transition Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 511
PROGRAM: FIELD-COUNT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 512
G33: Statistical Profile Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 512
G34: Structured Walkthroughs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 512
G35: Syntax Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 514
G36: Table Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 514
G37: Thread Testing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 515
G38: Top-Down Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 515
G39: White-Box Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 516
Bibliography. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 517
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 523
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 529
Book Description

Learn how to build web applications from three Microsoft MVPs. After building the data application layer using Entity Framework Core and a RESTful service using ASP.NET Core, you will then build the client side web application three ways: first, using ASP.NET Core, then using Angular 2, and, finally, using React. You will be able to compare and contrast these UI frameworks and select the best one for your needs.

.NET Core is a complete rewrite of the popular .NET and its related frameworks. While many concepts are similar between .NET Core and the .NET 4.6 framework, there are revolutionary changes as well, including updates to Entity Framework Core and ASP.NET Core. The first section of this book covers the three main parts of building applications with C#: Entity Framework, ASP.NET Core Services, and ASP.NET Core Web Applications.

There is also an explosion in popularity of JavaScript frameworks for client side development, and the authors cover two of the most popular UI frameworks. Start with TypeScript for developing clean JavaScript, along with a client side build tool such as Gulp, Grunt, and WebPack. Using the same data access layer and RESTful service from the .NET Core application, you can rebuild the UI using Angular 2. Then, repeat the process using React, for a true comparison of building client side applications using ASP.NET Core, Angular 2, and React.

What You'll Learn
  • Understand the fundamentals of .NET Core and what that means to the traditional .NET developer
  • Build a data access layer with Entity Framework Core, a RESTful service with ASP.NET Core MVC, and a website with ASP.NET Core MVC and Bootstrap
  • Automate many build tasks with client side build utilities

Who This Book Is For
Intermediate to advanced .NET developers

Contents
Part I: Visual Studio 2017 and .NET Core
Chapter 1: Introducing Entity Framework Core
Chapter 2: Building the Data Access Layer with Entity Framework Core
Chapter 3: Building the RES
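As a rough illustration of the kind of Entity Framework Core data access layer the description talks about, here is a minimal sketch. The Book entity, BookContext, and the connection string are hypothetical placeholders for this post, not code from the book:

using Microsoft.EntityFrameworkCore;
using System.Linq;

namespace DataAccessSketch
{
    // Hypothetical entity; the book builds richer models than this.
    public class Book
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    // A DbContext maps entities to database tables.
    public class BookContext : DbContext
    {
        public DbSet<Book> Books { get; set; }

        // Connection string is a placeholder; adjust for your server.
        protected override void OnConfiguring(DbContextOptionsBuilder options)
            => options.UseSqlServer("Server=.;Database=Books;Trusted_Connection=True;");
    }

    class Demo
    {
        static void Main()
        {
            using (var db = new BookContext())
            {
                // LINQ queries are translated to SQL by EF Core,
                // so the layer needs no inline SQL.
                var titles = db.Books.Where(b => b.Title != null)
                                     .Select(b => b.Title)
                                     .ToList();
            }
        }
    }
}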
Book Description

NHibernate is an innovative, flexible, scalable, and feature-complete open source project for data access. Although it sounds like an easy task to build and maintain database applications, it can be challenging to get beyond the basics and develop applications that meet your needs perfectly. The NHibernate Cookbook explains each feature of NHibernate 3.0 in detail through example recipes that you can quickly apply to your applications. Set yourself free from stored procedures and inline SQL. Quite simply, if you build .NET applications that use databases, this book is for you.

The book will take you from the absolute basics of NHibernate through its most advanced features and beyond, showing you how to take full advantage of each concept to quickly create amazing database applications. Beginners will learn several techniques for each of the four core NHibernate tasks – mapping, configuration, session and transaction management, and querying – and which techniques fit best with various types of applications. In short, you will be able to build an application using NHibernate. Intermediate level readers will learn how to best implement enterprise application architecture patterns using NHibernate, leading to clean, easy-to-understand code and increased productivity. In addition to new v3.0 features, advanced readers will learn creative ways to extend NHibernate core, as well as techniques using the NHibernate search, shards, spatial, and validation projects. Get solutions to common NHibernate problems to develop high-quality, performance-critical data access applications.

What you will learn from this book:
  • Create a persistent object model for moving data in and out of your database
  • Build the database from your model automatically
  • Configure NHibernate for use with WebForms, MVC, WPF, and WinForms applications
  • Create database queries using a variety of methods, including the new LINQ to NHibernate and QueryOver APIs
  • Build an enterprise-level data access layer
  • Improve the performance of your applications using a variety of techniques
  • Build an infrastructure for fast, easy test-driven development of your data access layer
  • Extend NHibernate to add data encryption and auditing
  • Implement entity validation, full-text search, horizontal partitioning (sharding), and spatial queries using NHibernate Contrib projects

Approach
This book contains quick-paced, self-explanatory recipes organized in progressive skill levels and functional areas. Each recipe contains step-by-step instructions about everything necessary to execute a particular task. The book is designed so that you can read it from start to end or just open up any chapter and start following the recipes. In short, this book is meant to be the ultimate “how-to” reference for NHibernate 3.0, covering every major feature of NHibernate for all experience levels.

Who this book is written for
This book is written for NHibernate users at all levels of experience. Examples are written in C# and XML. Some basic knowledge of SQL is assumed.

Book Details
Paperback: 328 pages
Publisher: Packt Publishing (October, 2010)
Language: English
ISBN-10: 184951304X
ISBN-13: 978-1849513043
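For readers unfamiliar with the QueryOver API mentioned above, a minimal sketch follows. It assumes a hypothetical, already-mapped Book entity and a hibernate.cfg.xml that supplies connection and mapping settings; neither is taken from the book itself:

using NHibernate;
using NHibernate.Cfg;

namespace NHibernateSketch
{
    // Hypothetical mapped entity (mapping file or mapping-by-code not shown).
    // Members are virtual so NHibernate can create lazy-loading proxies.
    public class Book
    {
        public virtual int Id { get; set; }
        public virtual string Title { get; set; }
    }

    class Demo
    {
        static void Main()
        {
            // Reads hibernate.cfg.xml for connection and mapping settings.
            var factory = new Configuration().Configure().BuildSessionFactory();

            using (ISession session = factory.OpenSession())
            using (ITransaction tx = session.BeginTransaction())
            {
                // QueryOver builds a strongly typed criteria query.
                var books = session.QueryOver<Book>()
                                   .Where(b => b.Title == "NHibernate Cookbook")
                                   .List();
                tx.Commit();
            }
        }
    }
}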
