NC State Repository

Browsing by Author "Dr. Laurie Williams, Committee Member"

Now showing 1 - 11 of 11
  • Automatic Generation and Execution of Specification-Based Test Cases in a Network Based Environment
    (2003-07-29) Chalke, Yatin; Dr. Mladen Vouk, Committee Chair; Dr. Edward Gehringer, Committee Member; Dr. Laurie Williams, Committee Member
    Automatic testing of software has long been of interest. The concept has been receiving more and more attention in today's software development climate (e.g., agile software processes), where managers and testers are being asked to turn around their products on shorter schedules and with fewer resources. Automation is seen as a pathway to meeting product testing and delivery deadlines. There are many research methods and tools, as well as a number of commercial tools, that automate test design, generation, and execution. Some are specification-based, some are structure-based, and some are combinations of both. However, the current state of software testing automation has problems. These include the design and maintainability of automation (e.g., current methods may not track product changes in a cost-effective way, and test-case suites do not always evolve easily and consistently), generation of an excessive number of test cases, end-user friendliness of tool interfaces, measurability of test-suite effectiveness, and so on. The goal of the work presented here is to study automation of efficient specification-based test-case design, generation, and execution. The investigation focuses on the principle known as pair-wise testing. Empirical evidence shows that the technique can produce, depending on the algorithm used, test suites that are not only efficient in fault detection but also of manageable size. The objectives of the work were to a) enhance and further assess PairTest, the open-source pair-wise test-case design tool developed previously by K.C. Tai and his students, b) develop a prototype of an adaptable, easy-to-use system for automatic execution of PairTest test cases, something the original release of PairTest does not do, in a setting where one tests network-based systems, and c) assess the usability of the pair-wise testing strategy and of testing automation in an industrial environment.
This work explicitly recognizes two stages of testing automation: i) automated test-case design and generation, and ii) automated construction of executable test cases from the generated suite. In assessing the performance of PairTest, it was compared to automatic test-case generation practices in industry with the help of different user scenarios. The prototype system for converting PairTest-generated test cases into executable test suites is based on the open-source 'Expect' toolset. PairTest performs favorably when it comes to test-case design and generation, as well as fault-detection power. It exhibits an ability to contain the exponential growth in the number of tests generated as the number of specification parameter values increases. A quantitative analysis of pair-wise testing and test automation in two industrial environments shows that, compared with manual methods, automatic test design and execution in general help a) increase the error-detection capability (efficiency) of testing, b) reduce resource consumption, and c) increase the evolvability (maintainability, adaptability) of the testing process. The results confirm that properly automated testing, as a 'complete' testing solution, not only improves the quality of the software and system, but also saves time and resources.
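The pair-wise criterion at the heart of PairTest can be illustrated with a minimal greedy generator. This is a sketch for illustration only, not PairTest's actual algorithm; the function name and structure are hypothetical:

```python
from itertools import combinations, product

def pairwise_suite(parameters):
    """Greedily build a test suite covering every value pair across
    any two parameters (the pair-wise criterion)."""
    keys = list(parameters)
    # Every (param_a, value_a, param_b, value_b) pair that must be covered.
    uncovered = {
        (a, va, b, vb)
        for a, b in combinations(keys, 2)
        for va in parameters[a]
        for vb in parameters[b]
    }
    suite = []
    while uncovered:
        # Pick the full parameter combination covering the most new pairs.
        best = max(
            product(*(parameters[k] for k in keys)),
            key=lambda row: sum(
                (keys[i], row[i], keys[j], row[j]) in uncovered
                for i, j in combinations(range(len(keys)), 2)
            ),
        )
        suite.append(dict(zip(keys, best)))
        for i, j in combinations(range(len(keys)), 2):
            uncovered.discard((keys[i], best[i], keys[j], best[j]))
    return suite
```

For three parameters with three values each, exhaustive testing requires 27 combinations, while a suite built this way covers every pair with far fewer tests — which is the containment of exponential growth described above.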
  • Bootstrapping Referral Systems With Social Network Information
    (2007-05-16) Batalagundu Viswanathan, Arvind; Dr. Edward Gehringer, Committee Member; Dr. Laurie Williams, Committee Member; Dr. Munindar P. Singh, Committee Chair
    This thesis addresses the challenge of facilitating human interactions in solving problems. To this end, it assigns an agent to each user, and models a social network as a multiagent system. A user's agent helps them by sending out and responding to queries on their behalf. Each agent makes its decisions based on its models of the expertise and trustworthiness of other agents. However, such models are not trivial to construct and maintain. This thesis develops an approach wherein the models are seeded based upon information extracted from the user's emails and from existing social networking sites. The main contribution of this thesis is in the specification of heuristics by which expertise and trustworthiness can be computed. It also provides a general schema and methodology by which additional sources of social information can be incorporated.
  • A Conformance Review Strategy for Regulating Safety-Critical Software.
    (2006-09-24) Jetley, Raoul Praful; Dr. S. Purushothaman Iyer, Committee Chair; Dr. Laurie Williams, Committee Member; Dr. Matthias Stallmann, Committee Member; Dr. Alexander Dean, Committee Member
    Safety is an important concern for software used in life-critical systems such as air transport, nuclear power, and medical devices. The critical nature of these systems necessitates that the software used therein be reliable and free of errors. It becomes imperative, therefore, to have a stringent review process in place to ascertain the correctness of the software and to ensure that it meets all requirements and standards. Regulatory agencies encourage the use of formal-methods-based techniques in the development of safety-critical software. However, most manufacturers are reluctant to use these techniques, citing them as too complex and time-consuming. As a result, (potentially life-threatening) errors are often not discovered until the software is already on the market. When such an error is eventually discovered, it becomes essential to trace the failure to its exact source in the implementation and to assure that the error correction restores the overall safety and effectiveness of the device. In this dissertation, we present how efficient premarket and postmarket reviews of designs and implementations can be carried out using formal-methods-based techniques to enable the process of reviewing software in safety-critical devices. To facilitate premarket conformance reviews, we introduce the notion of usage models: standardized formal models that serve as design templates. We present an approach to conformance checking of safety-critical software through formal verification and automated test-case sequences derived from these standardized models. To provide for efficient postmarket reviews, we establish a methodology based on integrating program slicing with model abstraction to trace software failures to their root cause. We formalize this methodology by presenting an iterative algorithm for abstraction-driven slicing and realize this algorithm through the implementation of CAdS, a forensic analysis tool for C programs.
We provide case studies involving typical medical device software to illustrate the various concepts involved and present results from these studies to gauge the effectiveness of our proposed approach.
  • Data Organization and Abstraction for Distributed Intrusion Detection
    (2005-04-06) McBride, Sean Patrick; Dr. Christopher G. Healey, Committee Member; Dr. Robert St. Amant, Committee Chair; Dr. Laurie Williams, Committee Member
    Due to the rapid pace of technological development, old systems are 'thrown away' in favor of newer technology. However, the data created by these earlier systems persists. A Digital Rosetta Stone [16] must be created to allow newer systems to correctly process data created by earlier technology. This document provides a case study of techniques that can be used to create a Digital Rosetta Stone between data formats and within a single evolving format. The intrusion detection domain provides a solid basis for this study. In a distributed intrusion detection system, many sensors and analyzers must communicate with each other. The Intrusion Detection Message Exchange Format (IDMEF) is a standardized XML format for such communication. To its detriment, the IDMEF specification has been evolving since its inception. Also, XML parsing during queries can be cumbersome and hinder intrusion detection. Therefore, two Digital Rosetta Stones were created. One migrates information between different versions of the IDMEF standard. The other translates IDMEF XML information into a relational database management system to improve query performance.
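The second translation, XML into a relational store, can be sketched as follows. The element names and table schema here are simplified placeholders chosen for illustration; the real IDMEF schema is namespaced and far richer, and this is not the thesis's actual database design:

```python
import sqlite3
import xml.etree.ElementTree as ET

# A tiny IDMEF-style alert (element names simplified for illustration).
ALERT_XML = """
<Alert id="1">
  <Analyzer name="sensor-A"/>
  <Classification text="port-scan"/>
  <CreateTime>2005-04-06T12:00:00Z</CreateTime>
</Alert>
"""

def load_alert(conn, xml_text):
    """Flatten one XML alert into a relational row so later queries hit
    plain columns instead of re-parsing XML."""
    root = ET.fromstring(xml_text)
    conn.execute(
        "INSERT INTO alerts (id, analyzer, classification, created) "
        "VALUES (?, ?, ?, ?)",
        (root.get("id"),
         root.find("Analyzer").get("name"),
         root.find("Classification").get("text"),
         root.find("CreateTime").text),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alerts "
             "(id TEXT, analyzer TEXT, classification TEXT, created TEXT)")
load_alert(conn, ALERT_XML)
```

Once alerts are rows, filtering by analyzer or classification becomes an indexed SQL query rather than a scan over XML documents, which is the query-performance gain described above.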
  • Design and Implementation of fauxBay: Test-Bed for Bidding Agents in Online Auction Markets.
    (2006-04-26) Singh, Ratna; Dr. Peter R. Wurman, Committee Chair; Dr. Laurie Williams, Committee Member; Dr. R. Michael Young, Committee Member
    Online auctions provide a very competitive market for a variety of goods and services and have gained a lot of popularity over the last few years. The increasing participation of buyers and sellers in these auctions has triggered the development of software bidding agents: automated agents designed to place bids on behalf of the user. In recent years, simulation of online auction markets has become a major research area in the field of electronic commerce, because testing different bidding strategies in a real auction environment involves risks and can result in heavy losses. Moreover, testing in a real-life auction requires a substantial amount of time and participation in order to derive a definite conclusion about a bidding agent's performance. We propose to solve this problem by designing and implementing a tool for simulating online auction markets. Our simulation tool, called fauxBay, can be configured to develop test scenarios for testing bidding-agent strategies. Currently, fauxBay supports the reserve-price auction mechanism found on eBay; however, support for other auction mechanisms can easily be integrated with fauxBay in the future. We have designed fauxBay to offer two modes of simulating auction environments for testing bidding strategies: event-based and clock-based. Event-based simulation is used to obtain a quicker analysis of a bidding agent's performance by jumping forward to each event rather than moving linearly by the clock. In contrast, clock-based simulation tests and gauges a bidding agent's performance in an auction environment where time moves linearly by the clock, similar to real-life online auctions. A key characteristic of fauxBay is that it tests bidding-agent strategies against real eBay data on completed auctions. Therefore, the results obtained by testing a bidding strategy in fauxBay give better insight into its anticipated performance in a real-life online auction.
Overall, fauxBay is designed to be extensible and configurable, and to give close-to-realistic results on the performance of bidding agents.
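The event-based mode can be sketched roughly as follows; the function and event shape are hypothetical illustrations, not fauxBay's API. Instead of ticking a clock, the simulator pops time-stamped bid events from a priority queue and jumps directly to each one:

```python
import heapq

def run_auction(bids, close_time):
    """Event-based auction run: bids are (time, bidder, amount) events,
    processed in timestamp order; the simulated clock jumps between
    events instead of ticking. Returns the winning bidder."""
    events = list(bids)
    heapq.heapify(events)               # orders events by timestamp
    high_amount, high_bidder = 0.0, None
    while events:
        t, bidder, amount = heapq.heappop(events)
        if t > close_time:
            break                       # auction closed before this bid
        if amount > high_amount:
            high_amount, high_bidder = amount, bidder
    return high_bidder
```

A clock-based mode would instead advance time in fixed increments and check for bids at each step, which mirrors real auctions more closely but runs far slower for sparse event streams.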
  • Improving Software Comprehension In Regulating Safety-Critical Systems
    (2008-06-06) Zhang, Yi; Dr. Tao Xie, Committee Member; Dr. Laurie Williams, Committee Member; Dr. Matthias Stallmann, Committee Member; Dr. S. Purushothaman Iyer, Committee Chair
  • Netset - A Software Framework for Automation of Network based Tests
    (2008-12-16) Arora, Puneet; Dr. Injong Rhee, Committee Chair; Dr. Khaled Harfoush, Committee Member; Dr. Laurie Williams, Committee Member
    Testing network protocols or networking devices to generate performance benchmarks is integral to computer networking research. The general pattern in conducting such tests is configuring a desired topology over a network test-bed, deploying test software and tools onto network nodes, setting up load traffic on the network, executing the deployed software, gathering data, and generating reports for analysis and comparison. A typical example of such a process is performance testing of congestion control protocols. Congestion control protocols are designed to satisfy a complex set of goals, and their performance is sensitive to network topology, network delays, and router queue sizes and policies. It is thus non-trivial to construct testing procedures for congestion control protocols. While there exist network emulation test-bed services like Emulab, WanInLab, Deter, etc., they limit themselves to merely providing a set of nodes which the developer can use for testing. They provide no means of setting up the test environment, generating test traffic, gathering data, or generating reports. As a result, developers tend to write their own automation procedures to carry out these steps. This tendency restricts the portability, repeatability, and comparison of test procedures and their results. This lack of a generally accepted practice or testing tool affects many parts of the networking research community, including researchers, students, standardization bodies, and developers. To address this issue, we develop a tool for network-based testing called Netset. The tool implements a general model which accommodates a wide range of testing processes and provides a programming interface for developing specialized tools for a particular category, such as congestion control protocols. We then apply this tool to two testing scenarios and compare its benefits against using a manual approach in the same scenarios.
  • On the relative advantages of teaching Web services in .NET vs. J2EE.
    (2003-09-17) Kachru, Sandeep; Dr. Munindar Singh, Committee Member; Dr. Edward Gehringer, Committee Chair; Randy Miller, Committee Member; Dr. Laurie Williams, Committee Member
    .NET and J2EE are currently the two leading technologies in enterprise-level application development. In the coming years, according to various surveys, these two technologies will capture almost equal market share. They are also the platforms of choice for developing Web services. There is an ongoing debate about the advantages of developing Web services in one over the other. We look at this question from the perspective of educators. We compare and analyze the two platforms using a number of parameters, such as the features present in each platform, the tools and resources offered by the two, and compatibility with the rest of the curriculum. We study the most significant difference between the two platforms: the platform independence of J2EE and the language independence of .NET, and discuss their relative advantages in an academic environment. We discover that both platforms offer equal support for the development of Web services and teach the concepts equally well. While .NET offers integrated, native support for the various phases of Web services development, the Java platform achieves this with several new libraries. On the other hand, J2EE's major advantage over .NET is the popularity of the Java language in academia. Thus, teaching Web services in Java maintains uniformity in the curriculum. A looming factor is the growth of C# as a teaching language. Though it seems destined to be adopted as a primary language in more schools, it will be some time before it can challenge Java as the most popular language in universities. Finally, we compare the development process of Web services in IBM's WebSphere and Microsoft's Visual Studio .NET and find them remarkably similar. Both tools provide comparable features to develop Web services easily. Thus, the choice of platform will depend on factors other than the relative ease of teaching Web services.
Arguments in favor of J2EE are platform independence, multiple-vendor support, the popularity of Java in universities, a greater number of tools and resources, etc. However, it does not allow programming in any language besides Java and does not offer native support for Web services. On the other hand, the .NET platform has support for multiple languages, integrated support for Web services, an excellent development tool, and a language that is becoming more popular in academia. The factors that go against .NET are inadequate platform independence and single-vendor support. We conclude that there is no clear winner and that the choice of platform will depend on various local factors. Finally, we provide a road-map to help educators make the decision.
  • Optimizing Effectiveness and Efficiency of Software Testing: A Hybrid Approach
    (2006-11-08) Bell, Kera Zakiyah; Dr. Winser Alexander, Committee Member; Dr. Donald Bitzer, Committee Member; Dr. Laurie Williams, Committee Member; Dr. Mladen Vouk, Committee Chair
    The overall goal of software testing is to disclose defects efficiently (i.e., with minimal time and cost) and effectively (i.e., with maximum faults detected). It takes time to understand what to test, to generate test cases, to execute the test suite, and to analyze the results. In a situation where one can parameterize the inputs and variables of interest, the cost of generating random, operational-profile-conformant test cases may be acceptable. It is typically O(N*p), where p is the number of parameters and N is the number of test cases. However, this can result in large test suites that may take a long time to execute. Systematic approaches tend to generate smaller test suites, thus reducing run-time and analysis costs, but they may take much longer to generate since they may require a higher level of initial expertise to develop. Is there a way to combine the benefits of statistical and systematic approaches to simultaneously optimize both efficiency and effectiveness? Hybrid approaches combine one or more testing techniques. A particular parameter-based systematic technique of interest is called n-wise testing. It assumes that most of the faults will be found if all or most of the parameter n-tuple values are covered by the tests. The efficiency and effectiveness of a hybrid approach that combines statistical testing with k-wise testing, where 2 <= k < n (the k-wise hybrid), is explored. Results show that under certain conditions the n-wise hybrid may help maximize efficiency and increase effectiveness beyond statistical testing. Using a hybrid approach seems most effective when the number of values associated with a parameter is very large. The total number of test cases produced by the hybrid technique to detect all n-way defects is approximately v^(n-k) times the number of test cases produced by the systematic approach alone, where n is the level of testing required to guarantee coverage of all n-way constructs.
A potential use of this approach is in testing for failures that may result from a complex combination of interacting parameter values (v), such as those found in security failures, and in testing highly complex network-based systems and workflows in general.
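The v^(n-k) size relationship above can be made concrete with a quick calculation; the numbers here are hypothetical, chosen only to illustrate the factor, and are not results from the thesis:

```python
# With v values per parameter, extending a systematic k-wise suite to
# guarantee all n-way coverage multiplies its size by roughly v**(n - k).
v, n, k = 4, 3, 2            # hypothetical: 4 values, 3-way target, 2-wise base
base_suite_size = 25         # assumed size of the systematic k-wise suite
hybrid_size = base_suite_size * v ** (n - k)
print(hybrid_size)           # 25 * 4 = 100
```

The factor grows quickly with the gap n - k, which is why the approach pays off most when v is large and k can be kept close to n.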
  • RaPTEX: Rapid Prototyping Tool for Embedded Communication Systems
    (2007-05-17) Lim, Jun Bum; Dr. Laurie Williams, Committee Member; Dr. Injong Rhee, Committee Member; Dr. Mihail L. Sichitiu, Committee Chair
    Advances in microprocessor, memory, and radio technology have enabled the emergence of embedded systems that rely on communication systems to exchange information and coordinate their activity in spatially distributed applications. Developing embedded communication systems that are efficient and reliable is a challenge due to the trade-offs imposed by the conflicts between application requirements and hardware constraints. In this thesis, we present RaPTEX, an integrated development environment (IDE) for embedded communication systems. RaPTEX consists of three major subsystems: a graphical module to facilitate component composition, code generation with access to component-level parameters, and a performance evaluation framework that allows system designers to explore what-if scenarios and clearly exposes the trade-offs of their choices. We also present two case studies of developing wireless sensor network applications using RaPTEX.
  • Spectral Clustering for Graphs and Markov Chains
    (2010-03-08) Liu, Ning; Dr. William J. Stewart, Committee Chair; Dr. Harry G. Perros, Committee Co-Chair; Dr. Laurie Williams, Committee Member; Dr. Michael Devetsikiotis, Committee Member
    Spectral graph partitioning based on spectral theory has become a popular clustering method over the last few years. The starting point is the work of Fiedler, who showed that an eigenvector of the Laplacian matrix of an undirected graph (a symmetric system) provides the minimum cut of graph nodes. The spectral technique can also be applied to a Markov chain to cluster states and, in general, is more broadly applicable to nonsymmetric systems. Motivated by these facts, we combine them to show that Markov chains, thanks to the two different clustering techniques they offer, are effective approaches for clustering in more general situations. In this dissertation, we advance the state of the art of spectral clustering and introduce a new algorithm to decompose matrices into blocks. We first prove that the second eigenvector of the signless Laplacian provides a heuristic solution to the NP-complete state clustering problem, which is the dual of the graph partitioning problem. A new method for clustering the nodes of a graph that has negative edge weights is also proposed. Second, a connection between the singular vectors obtained from an SVD decomposition and the eigenvectors from spectral algorithms on data clustering is revealed. We show that the singular vectors of the node-edge incidence matrix generate not only clusters on the nodes but also clusters on the edges. Third, relating spectral clustering and state clustering of Markov chains, we present two clustering techniques for Markov chains based on two different measures and suggest a means of incorporating both techniques to obtain comprehensive information concerning state clusters. Fourth, we show the connection between spectral clustering and dimension-reduction techniques in statistical clustering, and the results obtained from spectral and statistical clustering are shown to be related. Finally, we develop a new, improved spectral clustering procedure for decomposing matrices into blocks.
This algorithm works well in several applications, especially in problems of detecting communities in complex networks, where some existing methods, e.g., MARCA and TPABLO, fail.
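Fiedler's result mentioned above can be demonstrated in a few lines; this is a toy illustration, not code from the dissertation. For two triangles joined by a single bridge edge, the sign pattern of the Laplacian's second eigenvector recovers the obvious two-way cut:

```python
import numpy as np

# Two triangles joined by one bridge edge (2-3); the sign pattern of the
# Fiedler vector should separate nodes {0, 1, 2} from {3, 4, 5}.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A   # graph Laplacian L = D - A
vals, vecs = np.linalg.eigh(L)   # eigh: symmetric input, ascending eigenvalues
fiedler = vecs[:, 1]             # eigenvector of the second-smallest eigenvalue
clusters = fiedler > 0           # sign pattern gives the two-way cut
```

The signless Laplacian variant the dissertation studies replaces D - A with D + A; its second eigenvector plays an analogous heuristic role for the dual clustering problem.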
