SUGI 28: Beyond Debugging: Program Validation - SAS

Paper 58-28
Beyond Debugging: Program Validation
Neil Howard, Ingenix, Basking Ridge, NJ

Abstract

"Act in haste and repent at leisure; code too soon, and debug forever." (Raymond Kennington) In their paper on debugging, Lora Delwiche and Susan Slaughter say that good debuggers make good programmers. Let's take that one step further to say that good analysts and problem-solvers make good programmers. Just because a SAS program is free of errors, warnings, notes, and bugs does not guarantee that the program is doing what it is supposed to do. This tutorial addresses the process that follows debugging: program validation. It covers techniques for ensuring that the logic and intent of the program are correct, that the requirements and design specifications are met, and that data errors are detected. It also discusses fundamental SAS program design issues that give purpose and structure to your programming approach.


This paper will address the definitions of verification, validation, testing, and debugging, as well as the structure of the Software Development Life Cycle (SDLC). It will illustrate how the SAS System can be used to help satisfy the requirements of your SDLC and accomplish the tasks of verification and validation. Since as much as 80% of a programmer's time is invested in testing and validation, it's important to focus on tools that facilitate correction of syntax, data, and logic errors in SAS programs. The presentation focuses on a wide variety of SAS features, tips, techniques, tricks, and system tools that can become part of your routine testing methodology.

Introduction

[..Overheard at an interview for a SAS programming position: "But you don't have to test SAS programs!!!"..] As the interviewers quickly escort the confused candidate out the door, they recall how often it is assumed that a fourth-generation language does so much for you that you don't have to test the code.

The SAS System is easy to use, and the learning curve to productivity is relatively short. But SAS is just as easy to abuse. Programmers and analysts must not lose sight of the indisputable facts: data is seldom clean, logic is too often faulty, and fingers walk clumsily over keyboards. Condition codes are not an accurate indicator of successful programs. There are traditional methodologies for preventative pest control, but there is no PROC TEST-MY-CODE or PROC WHITE-OUT or PROC READ-MY-MIND. The SAS System offers many features for identifying syntax, logic, and data errors. The results will undoubtedly include reduced stress and bigger raises for SAS programmers, satisfied clients, accurate output, and quality programs that are reliable and maintainable. This supports the business need to deliver results to the FDA and to minimize time to approval and time to market.
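Because data is seldom clean, a routine first step is to profile incoming data before trusting any downstream logic. A minimal sketch of such a profiling pass, assuming hypothetical dataset and variable names (raw.visits, sex, trtgroup, age, weight):

```sas
/* Hypothetical example: profile raw data before using it.          */
/* PROC CONTENTS documents the structure; PROC FREQ and PROC MEANS  */
/* surface unexpected codes, missing values, and impossible values. */
proc contents data=raw.visits;
run;

proc freq data=raw.visits;
   tables sex trtgroup / missing;   /* unexpected codes show up here */
run;

proc means data=raw.visits n nmiss min max;
   var age weight;                  /* out-of-range values show up here */
run;
```

A quick scan of this output often catches data errors long before they surface as subtle logic bugs in derived analysis datasets.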

Definition of Terms

The responsibility for ensuring program quality remains with the programmer. Today's competitive environment demands that we discuss testing methods and useful SAS System tools that will help us meet the challenges of verification, validation, testing, and debugging. The following definitions were provided by the validation maven at Parke-Davis and serve as the benchmarks for this paper.

VERIFICATION: Checking (often visual) of a result based on predetermined criteria, e.g., checking that a title is correct.

VALIDATION: The process of providing documented evidence that a computerized system performs its functions as intended and will continue to do so in the future.

TESTING: Expected results are predetermined, so actual results can be either accepted or rejected by one or more of the testing types: unit, integration, worst case, valid case, boundary value, alpha, beta, black box, white box, regression, functional, structural, performance, stability, etc.
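Since testing means comparing actual results against predetermined expected results, one common SAS technique is to compare current output against a previously validated benchmark dataset. A hedged sketch, assuming hypothetical dataset names (bench.summary, work.summary):

```sas
/* Hypothetical example: regression-style test with PROC COMPARE.   */
/* bench.summary holds previously validated output; work.summary    */
/* is the current run. Differences are reported in the listing, and */
/* the automatic macro variable SYSINFO is nonzero when they differ. */
proc compare base=bench.summary compare=work.summary
             criterion=0.00001;    /* tolerance for numeric fuzz */
run;

%put NOTE: PROC COMPARE return code (0 means identical): &sysinfo;
```

Checking &SYSINFO after PROC COMPARE lets a driver program accept or reject a run automatically, which is exactly the accept/reject decision the definition of testing describes.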

DEBUGGING: The process of finding and correcting the root cause of an unexpected result.

Terms are Relative

The author conducted a brief informal survey: 1) within Parke-Davis, among the Clinical Reporting Systems management team, selected senior programmers and systems analysts, clinical team leaders, developers and biometricians, and 2) beyond the company, within a selected network of colleagues in the SAS community. The intent of the survey was to see how well our baseline definitions held up against greater scrutiny, perception, and application.

IEEE on Terms

One survey respondent follows the Teri Stokes school of validation, based on IEEE standards. Std 1012-1986, "IEEE Standard for Software Verification and Validation Plans," states: VERIFICATION is "the process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.

" Informally, it means making sure your program is doing what you think it does. VALIDATION is "the process of evaluating software at the end of the software development process to ensure compliance with software requirements." Informally, making sure the client is getting what they wanted. TESTING is "the process of analyzing a software item to detect the differences between existing and required conditions (that is, bugs), and to evaluate the features of the software item." Another IEEE standard, the "IEEE Standard Glossary of Software Engineering Terminology," offers DEBUG: "To detect, locate, and correct faults in a computer program. Techniques include use of breakpoints, desk checking, dumps, inspection, reversible execution, single-step operation, and traces."

Survey Results

For the most part, the survey respondents were consistent in their definitions, especially for validation and debugging.
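Several of the techniques in the IEEE definition of DEBUG map directly onto SAS features: PUT statements in a DATA step provide traces and dumps by writing values to the log, and the DATA step debugger (invoked with the /DEBUG option) provides breakpoints and single-step execution. A minimal sketch, assuming hypothetical dataset and variable names (raw.visits, weight, height, bmi):

```sas
/* Hypothetical example: tracing a DATA step with PUT statements.  */
data work.derived;
   set raw.visits;
   bmi = weight / (height/100)**2;

   /* Conditional trace: dump the entire program data vector to    */
   /* the log whenever a suspicious value appears.                 */
   if bmi > 60 or bmi < 10 then
      put "WARNING: check this observation: " _all_;
run;

/* For breakpoints and single-stepping, run the same step under    */
/* the interactive DATA step debugger instead:                     */
/*    data work.derived / debug;  ...  run;                        */
```

The conditional PUT keeps the log readable by tracing only the observations that matter, rather than dumping every row.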

Verification and testing were murkier subjects, often assumed to be the same thing. Comments on verification included: 1) a quick and dirty check, 2) the systems testing portion of the SDLC, 3) it's definitely a Department of Defense word, not commonly used, 4) identifying that something is correct, 5) creation of test data for a test plan, 6) making sure a program does what [it] says, and that the results are accurate and stand up to the specs. Validation was consistently related to the SDLC: it is requirements, specifications, development, verification, and user acceptance. Respondents said validation was: 1) thorough documentation of the SDLC, 2) formal, accomplishing the specs for inclusion in the research reports, 3) inclusive of change control and retirement, 4) ensuring compliance with regulations, making it legal, 5) formal test plan, test data, system verification, creation of a valid protocol, user acceptance.

Recurring words were: reproducible, efficacious. Testing yielded the vaguest responses. Some felt it was synonymous with verification or that it is part of validation. Other comments: informal testing during the development phase; putting the program through the wringer; comes in many disguises (unit, integration, systems, etc.). "If someone asks me to test something, I ask them if they want it verified or validated." Respondents said a program wasn't ready for validation if it still had bugs in it. Debugging was said to be testing of logical systems to ensure they're not subject to errors of certain a priori identified types. Generally, debugging was felt to be the fixing of errors in the development phase of the SDLC.

Zero-Defect Programs

[..Lubarsky's Law of Cybernetic Entomology: There's always one more bug..] How are errors introduced into a program? Naturally, no one intends to produce flawed code.

Bugs just turn up mysteriously to wreak havoc when we least expect it. However, we should recognize bugs for the errors that they are and attempt to produce code with zero defects. It is not enough to assume without verification that the work of good programmers will be correct. The slightest typo in the code or misunderstanding of the user requirements can cause serious errors during later execution. The purpose of testing is to identify any errors and inconsistencies that exist in a system. Debugging is the art of locating and removing the source of these errors. Together, the testing and debugging tasks are thought to cost 50% to 80% of the total cost of developing the first working version of a system. Clearly, any techniques that facilitate these tasks will lead to improved productivity and reduced costs of system development. We must broaden our philosophy of testing to go beyond the syntax check.

As shown in Figure 1, the cost of correcting errors increases exponentially with the stage of detection. An error in a large system discovered after release of the product to the user can cost over two hundred times as much to correct as the same error discovered at the beginning of the system's development. Costs to fix these later errors include changes in documentation, re-testing, and possible changes to other programs affected by the one in error. While costs of correcting errors in smaller systems or segments of code do not increase so dramatically over time, early testing can still reduce total costs substantially and improve system accuracy and reliability.

[Figure 1: Cost of Correcting Errors (source: Farley). Relative cost (0 to 200) rises with the stage of error detection: Requirements, Design, Coding, Development Testing, Acceptance Testing, Operation.]

Testing and debugging are recommended at all stages of the software development life cycle: determination of functional requirements; systems or program design; coding, unit testing, and implementation; and validation of results.
