Appendix A: Data Tables Used for Use Cases
TABLE A.1
Complexity Level Evaluation for CP1

|         | 0-4 NEM | 5-8 NEM | 9-12 NEM | >13 NEM   |
|---------|---------|---------|----------|-----------|
| 0-1 NSR | Low     | Low     | Average  | High      |
| 2-3 NSR | Low     | Average | High     | High      |
| 4-5 NSR | Average | High    | High     | Very High |
| >5 NSR  | High    | High    | Very High | Very High |
TABLE A.2
Complexity Level Evaluation for CP2

| 0-2 NSR  | 0-5 NOA | 6-9 NOA | 10-14 NOA | >15 NOA   |
|----------|---------|---------|-----------|-----------|
| 0-4 NEM  | Low     | Low     | Average   | High      |
| 5-8 NEM  | Low     | Average | High      | High      |
| 9-12 NEM | Average | High    | High      | Very High |
| >13 NEM  | High    | High    | Very High | Very High |

(a)

| 3-4 NSR  | 0-4 NOA | 5-8 NOA | 9-13 NOA | >14 NOA   |
|----------|---------|---------|----------|-----------|
| 0-3 NEM  | Low     | Low     | Average  | High      |
| 4-7 NEM  | Low     | Average | High     | High      |
| 8-11 NEM | Average | High    | High     | Very High |
| >12 NEM  | High    | High    | Very High | Very High |

(b)

| >5 NSR   | 0-3 NOA | 4-7 NOA | 8-12 NOA | >13 NOA   |
|----------|---------|---------|----------|-----------|
| 0-2 NEM  | Low     | Low     | Average  | High      |
| 3-6 NEM  | Low     | Average | High     | High      |
| 7-10 NEM | Average | High    | High     | Very High |
| >11 NEM  | High    | High    | Very High | Very High |

(c)
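As with Table A.1, the three panels (a)-(c) can be folded into one lookup keyed by the NSR band. The sketch below assumes the band edges printed above; note that the printed panels skip NSR = 5, and the sketch folds that value into the ">5" panel, which is an assumption rather than something the source states.

```python
# Sketch of the Table A.2 lookup for CP2, assuming the printed
# NSR/NEM/NOA bands. The source panels skip NSR = 5; this sketch
# folds it into the ">5" panel, which is an assumption.

CP2_VALUES = [  # the same grid is shared by all three panels
    ["Low",     "Low",     "Average",   "High"],
    ["Low",     "Average", "High",      "High"],
    ["Average", "High",    "High",      "Very High"],
    ["High",    "High",    "Very High", "Very High"],
]

# Per-panel upper bounds for the first three NEM and NOA bands
# (panel 0 = 0-2 NSR, panel 1 = 3-4 NSR, panel 2 = >5 NSR).
NEM_EDGES = [[4, 8, 12], [3, 7, 11], [2, 6, 10]]
NOA_EDGES = [[5, 9, 14], [4, 8, 13], [3, 7, 12]]

def band(value, edges):
    for i, upper in enumerate(edges):
        if value <= upper:
            return i
    return len(edges)

def cp2_complexity(nem, nsr, noa):
    panel = band(nsr, [2, 4])          # 0-2, 3-4, >5 NSR
    row = band(nem, NEM_EDGES[panel])
    col = band(noa, NOA_EDGES[panel])
    return CP2_VALUES[row][col]

print(cp2_complexity(nem=6, nsr=3, noa=9))  # -> "High"
```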
| System Component Type | Description | Complexity: Low | Average | High | Very High |
|-----------------------|-------------|-----------------|---------|------|-----------|
| PDT | Problem Domain Type  | 3 | 6  | 10 | 15 |
| HIT | Human Interface Type | 4 | 7  | 12 | 19 |
| DMT | Data Management Type | 5 | 8  | 13 | 20 |
| TMT | Task Management Type | 4 | 6  | 9  | 13 |
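The weights above convert a component's assigned complexity level into its numeric contribution. A minimal continuation of the earlier sketches, transcribing the table directly (the dictionary layout and function name are assumptions):

```python
# Complexity weights per system component type, transcribed from the
# table above. Keys are the component type abbreviations.
WEIGHTS = {
    "PDT": {"Low": 3, "Average": 6, "High": 10, "Very High": 15},
    "HIT": {"Low": 4, "Average": 7, "High": 12, "Very High": 19},
    "DMT": {"Low": 5, "Average": 8, "High": 13, "Very High": 20},
    "TMT": {"Low": 4, "Average": 6, "High": 9,  "Very High": 13},
}

def component_weight(component_type, complexity):
    """Numeric weight for one classified component."""
    return WEIGHTS[component_type][complexity]

print(component_weight("HIT", "High"))  # -> 12
```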
TABLE A.3
Degrees of Influence of the Twenty-Four General System Characteristics

| ID  | System Characteristic      | DI | ID  | System Characteristic               | DI |
|-----|----------------------------|----|-----|-------------------------------------|----|
| C1  | Data Communication         |    | C13 | Multiple Sites                      |    |
| C2  | Distributed Functions      |    | C14 | Facilitation of Change              |    |
| C3  | Performance                |    | C15 | User Adaptivity                     |    |
| C4  | Heavily Used Configuration |    | C16 | Rapid Prototyping                   |    |
| C5  | Transaction Rate           |    | C17 | Multiuser Interactivity             |    |
| C6  | Online Data Entry          |    | C18 | Multiple Interfaces                 |    |
| C7  | End User Efficiency        |    | C19 | Management Efficiency               |    |
| C8  | Online Update              |    | C20 | Developer's Professional Competence |    |
| C9  | Complex Processing         |    | C21 | Security                            |    |
| C10 | Reusability                |    | C22 | Reliability                         |    |
| C11 | Installation Ease          |    | C23 | Maintainability                     |    |
| C12 | Operational Ease           |    | C24 | Portability                         |    |
| TDI | Total Degree of Influence (TDI) |    |     |                                |    |
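Once each characteristic has been rated, the TDI row is simply the sum of the 24 DI values. A minimal sketch follows; the 0-5 rating scale per characteristic is an assumed convention (the source table leaves the DI column blank for the reader to fill in), and the example ratings are illustrative only.

```python
# TDI is the sum of the 24 degree-of-influence ratings (C1..C24).
# The 0-5 rating scale per characteristic is an assumed convention.
di_ratings = {f"C{i}": 0 for i in range(1, 25)}  # fill in from Table A.3

di_ratings.update({"C1": 3, "C3": 5, "C9": 4})   # example ratings only
assert all(0 <= v <= 5 for v in di_ratings.values()), "DI out of range"

tdi = sum(di_ratings.values())
print(f"TDI = {tdi}")  # here: TDI = 12
```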
Appendix B: Review Questions
- 1. Among all the proposed metrics, list any five that, in your opinion, predict component development effort most precisely.
- 2. Which of these proposed metrics significantly adds to the predictive ability of COCOMO?
- 3. Can the proposed metrics be used as a basis for further remuneration options for COCOMO?
- 4. Describe the top model/technique for software cost estimation.
- 5. Carry out a case study of a safety-critical system and identify how estimating its software costs differs from estimating those of normal systems.
- 6. How do we evaluate the effectiveness of software effort estimation models?
- 7. Explain in detail the importance of “Project Methodology” as a parameter for software cost estimation.
- 8. Define scalability as far as software cost estimation is concerned.
- 9. What is the importance of the use case points method in software effort estimation?
- 10. What are the most commonly used approaches for evaluating software effort estimation models?
- 11. Is it possible to perform a software cost estimation before requirements collection?
- 12. What method can be used to measure the accuracy of prediction for an effort estimation technique in software development?
- 13. Explain the research gaps in deep learning, machine learning, and soft computing techniques in the analysis of effort estimation.
- 14. List the available simulation tools to work on soft computing techniques.
- 15. What is the state of the knowledge surrounding COTS technology implementation?
- 16. Describe the forerunners of both obstacles and designers of COTS technology implementation.
- 17. What are the issues for successful adoption and performance of COTS technology?