Browsing by Author "Dr. Mladen Vouk, Committee Member"
Now showing 1 - 3 of 3
- Analysis and Quantification of Test Driven Development Approach (2002-09-06) George, Boby; Dr. Laurie Williams, Committee Chair; Dr. Aldo Dagnino, Committee Member; Dr. Mladen Vouk, Committee Member

  The software industry is placing ever greater demands on development schedules and resources. Software production often deals with ever-changing requirements and with development cycles measured in weeks or months. To respond to these demands and still produce high-quality software, practitioners have developed a number of strategies over the years. One of the more recent is Test Driven Development (TDD), an emerging object-oriented development practice that purports to aid in producing high-quality software quickly. TDD has been popularized through the Extreme Programming (XP) methodology. TDD proponents profess that, for small to mid-size software, the technique leads to quicker development of higher-quality code. Anecdotal evidence supports this; however, until now there has been little quantitative empirical support for this claim. The work presented in this thesis is concerned with a set of structured TDD experiments on very small programs developed by pair programmers. The programmers were both students and professionals. In each category (students and professionals), one group used TDD and the other (the control group) used a waterfall-like software development approach. The experiments provide some interesting observations regarding TDD. Both student and professional TDD developers appear to achieve higher code quality, as measured by functional black-box testing: the TDD student pairs passed 16% more test cases, and the TDD professional pairs 18% more, than their corresponding control groups. However, the professional TDD pairs did spend about 16% more time on development, and it was not established whether the increase in quality was due to the extra development time or to the TDD process itself. The student experiments, on the other hand, were time-limited: both the TDD and non-TDD student programmers had to complete the assignment in 75 minutes, whereas professional programmers took about 285 minutes on average to complete the same assignment. Consequently, the development cycle of the student-developed software was severely constrained, and the resulting code was underdeveloped and of much poorer quality than the professional code. Still, it is interesting to note that even under these severe restrictions the code developed using the TDD approach appears to be less faulty than the code developed using the more classical waterfall-like approach. It is conjectured that this may be due to the granularity of the TDD process, one to two test cases per feedback loop, which may encourage more frequent and tighter verification and validation episodes. These tighter cycles may result in code that is better than that developed under a coarser-granularity waterfall-like model. As part of the study, a survey was conducted among the participating programmers; the majority thought that TDD was an effective approach that improved their productivity.
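The fine granularity conjectured above, one to two test cases per feedback loop, can be illustrated with a minimal test-first sketch. The example below is illustrative only and is not the experimental task from the thesis; the `ShoppingCart`, `add_item`, and `total` names are made up. It uses Python's unittest to show the red-green rhythm: write a small failing test, then just enough code to pass it, then repeat.

```python
import unittest

# Step 1 (red): write one or two small test cases for the next feature
# before any production code exists.
class TestShoppingCart(unittest.TestCase):
    def test_empty_cart_total_is_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add_item("book", 12)
        cart.add_item("pen", 3)
        self.assertEqual(cart.total(), 15)

# Step 2 (green): write just enough production code to make the tests
# pass, then return to step 1 for the next one or two test cases.
class ShoppingCart:
    def __init__(self):
        self._prices = []

    def add_item(self, name, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

if __name__ == "__main__":
    unittest.main()
```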
- Application Based Resource Allocation Policies in MultiService Networks (2005-07-27) Stanisic, Vladica; Dr. Mladen Vouk, Committee Member; Dr. Arne Nilsson, Committee Member; Dr. Mihail Devetsikiotis, Committee Chair; Dr. J Keith Townsend, Committee Member

  Efficient and reliable bandwidth allocation is one of the most important open issues in the management of networks that aim to offer a guaranteed Quality of Service (QoS). The bandwidth allocation problem becomes more difficult in multiservice networks, where a large variety of applications, each with different requirements in terms of bandwidth, duration, delay, or information loss, use the network infrastructure simultaneously. Most previous work has analyzed bandwidth allocation policies in the context of resolving conflicts arising from the dynamics of user requests, without taking network availability, user mobility, or delivery (i.e., physical environment) conditions into account. Since static bandwidth allocation policies lack adaptive mechanisms to combat these dynamics and improve bandwidth utilization, we believe that a more flexible service model that allows variable QoS is needed. Adaptive resource management coupled with dynamic load balancing aims at decreasing the possibility of congestion and maintaining high resource utilization under transient traffic variations and node/link failures. We have formulated preemption algorithms and criteria for their optimization, studied existing algorithms, and investigated suboptimal preemption algorithms with random selection of the connections to be rerouted. We have also performed numerical and simulation comparisons of rerouting algorithms by analyzing their performance on a single link, in a dynamic setting, and in a full network environment with a heterogeneous traffic mix. In order to account for the user's application type and QoS requirements, and to quantify the user's value, we have introduced a utility-based QoS model. We have investigated network utilization, the QoS observed by customers, and revenue generation perspectives for different utility-quantified bandwidth allocation schemes. We have presented approximate analytical tools for obtaining blocking probabilities in a multi-rate, multi-class system where users of the same class can have different resource requirements. We have evaluated the blocking probabilities for the single-link case and validated our approach through simulation of such a system. We have also extended our single-link model to calculate blocking probabilities for a multihop path when the offered traffic of each source-destination pair along the path is known.
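The abstract mentions approximate analytical tools for single-link blocking probabilities in a multi-rate, multi-class system. As a general illustration of that kind of calculation, and not necessarily the thesis's own method, the sketch below implements the classical Kaufman-Roberts recursion for per-class blocking on a single link; the link capacity, bandwidth requirements, and offered loads in the example are invented values.

```python
def kaufman_roberts_blocking(capacity, classes):
    """Per-class blocking probabilities on a single link via the
    classical Kaufman-Roberts recursion.

    capacity : link capacity in bandwidth units
    classes  : list of (offered_load, bandwidth_units) pairs, where
               offered_load = arrival_rate * mean_holding_time (Erlangs)
    """
    # Unnormalized link-occupancy distribution q(j), j = 0..capacity.
    q = [0.0] * (capacity + 1)
    q[0] = 1.0
    for j in range(1, capacity + 1):
        q[j] = sum(a * b * q[j - b] for a, b in classes if j - b >= 0) / j

    norm = sum(q)
    p = [x / norm for x in q]  # normalized occupancy distribution

    # A class with requirement b is blocked when fewer than b units are free,
    # i.e. when occupancy lies in capacity - b + 1 .. capacity.
    return [sum(p[capacity - b + 1:]) for _, b in classes]


# Illustrative numbers only: a 30-unit link shared by a narrowband class
# (1 unit, 10 Erlangs) and a wideband class (4 units, 2 Erlangs).
if __name__ == "__main__":
    blocking = kaufman_roberts_blocking(30, [(10.0, 1), (2.0, 4)])
    for name, b in zip(("narrowband", "wideband"), blocking):
        print(f"{name}: blocking probability = {b:.4f}")
```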
- Measurement Based Connection Admission Control (2003-09-23) Jaising, Rahul; Dr. Arne Nilsson, Committee Chair; Dr. Mihail Sichitiu, Committee Member; Dr. Mladen Vouk, Committee Member; Dr. Zsolt Haraszti, Committee Member

  We consider the problem of using real-time measurements from network elements for connection admission control in QoS-aware networks. Our objective is to study the measurement process, determine the real-time utilization of a link, and use these measurements to decide on the admission of new flows into the network while providing statistical guarantees on the QoS ensured for the existing admitted flows. First, we survey the extensive existing literature in the field and identify the components of a measurement-based admission control system and the various factors that affect the performance of an algorithm. We use the ns-2 simulator to simulate some (though not all) of the proposed algorithms and test their performance for various arrival processes at the connection level. We use the results from these simulations to verify the performance claims of the various algorithms, and we use the performance tuning parameters to find optimal performance regions. We study the buffer dynamics at the burst level and analyze the loss caused by admitting excessive flows. Our work extends the existing literature by studying the effect of different arrival processes on the blocking probabilities of new flows and by investigating an ad-hoc mix-and-match of estimation techniques and decision processes to explore a higher performance benchmark.
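As a rough illustration of the kind of measurement-based admission decision discussed above, and not one of the specific algorithms simulated in the thesis, the sketch below pairs an exponentially weighted moving average of measured link load with a simple "measured sum" admission test: a new flow is admitted only if the measured load plus its declared rate stays below a target fraction of link capacity. The class name, smoothing weight, utilization target, and rates are hypothetical example choices.

```python
class MeasuredSumAdmissionControl:
    """Toy measurement-based admission controller for a single link.

    The measured load is smoothed with an exponentially weighted moving
    average (EWMA); a new flow is admitted only if the smoothed load plus
    its declared rate stays below a utilization target.
    """

    def __init__(self, capacity_mbps, target_utilization=0.9, ewma_weight=0.25):
        self.capacity = capacity_mbps
        self.target = target_utilization
        self.weight = ewma_weight
        self.measured_load = 0.0  # smoothed estimate of carried load (Mbps)

    def update_measurement(self, sampled_load_mbps):
        # EWMA over periodic load samples taken from the link.
        self.measured_load = (self.weight * sampled_load_mbps
                              + (1.0 - self.weight) * self.measured_load)

    def admit(self, declared_rate_mbps):
        # "Measured sum" test: admit only if the estimated load plus the
        # new flow's declared rate fits under the utilization target.
        return (self.measured_load + declared_rate_mbps
                <= self.target * self.capacity)


# Illustrative usage with made-up numbers.
if __name__ == "__main__":
    cac = MeasuredSumAdmissionControl(capacity_mbps=100.0)
    for sample in (40.0, 55.0, 60.0):   # periodic load measurements (Mbps)
        cac.update_measurement(sample)
    print("admit 10 Mbps flow?", cac.admit(10.0))
    print("admit 45 Mbps flow?", cac.admit(45.0))
```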