
Learning about the Design

  • Frank Rogin
  • Rolf Drechsler

Abstract

A single simulation run can be a valuable source of information for detecting many errors in the system model. Summarizing multiple runs into a general abstraction that holds for all concrete runs, however, opens up completely new debugging opportunities, since new and different aspects of the design can be extracted. Such an induction technique is introduced in this chapter (see Figure 5.1). In general, two main approaches can be distinguished.
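The following is a minimal sketch of this induction idea in Python, not the chapter's actual implementation: candidate properties (here, simple constancy and value-range facts per signal) are proposed from one execution trace and then weakened so that they hold on every further trace; the helper names and the trace format (signal name mapped to one value per clock cycle) are illustrative assumptions.

    # Hypothetical sketch: infer properties that hold across all observed simulation runs.

    def mine_candidates(trace):
        """Propose simple per-signal candidates from one trace.

        `trace` maps a signal name to its list of sampled values (one per clock cycle).
        """
        candidates = {}
        for signal, values in trace.items():
            candidates[signal] = {
                "constant": len(set(values)) == 1,
                "min": min(values),
                "max": max(values),
            }
        return candidates

    def refine(candidates, trace):
        """Weaken the candidates so that they also hold on `trace`."""
        for signal, values in trace.items():
            cand = candidates.get(signal)
            if cand is None:
                continue
            cand["constant"] = cand["constant"] and len(set(values)) == 1
            cand["min"] = min(cand["min"], min(values))
            cand["max"] = max(cand["max"], max(values))
        return candidates

    # Usage: properties surviving all runs are reported as likely invariants.
    traces = [
        {"req": [0, 1, 1, 0], "count": [0, 1, 2, 3]},
        {"req": [0, 0, 1, 1], "count": [0, 1, 2, 2]},
    ]
    cands = mine_candidates(traces[0])
    for t in traces[1:]:
        cands = refine(cands, t)
    print(cands)  # e.g. count stays within [0, 3] in all observed runs

Such inferred statements are only as general as the runs they were mined from, which is why the chapter treats them as property candidates to be confirmed or refuted by further analysis.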

Keywords

Clock Cycle, Complex Property, Execution Trace, Property Candidate, Checker Type
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  1. Institutsteil Entwurfsautomatisierung, Fraunhofer-Institut für Integrierte Schaltungen, Dresden, Germany
  2. Universität Bremen, AG Rechnerarchitektur, Bremen, Germany
