Software metric
In software engineering and development, a software metric is a standard of measure of the degree to which a software system or process possesses some property. Although a metric is not itself a measurement (metrics are functions, while measurements are the numbers obtained by applying them), the two terms are often used as synonyms. Since quantitative measurements are essential in all sciences, there is a continuous effort by computer science practitioners and theoreticians to bring similar approaches to software development.
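As a minimal illustration of the metric/measurement distinction, the sketch below defines two simple code metrics as functions; the metric names, the SLOC and comment-density definitions, and the sample snippet are illustrative assumptions rather than anything specified above. Applying a metric to a concrete artifact yields the measurement.

```python
# Illustrative sketch: a metric is a function mapping a software artifact to a
# number; the number returned for a concrete artifact is the measurement.

def sloc(source: str) -> int:
    """Source lines of code: non-blank lines that are not comment-only ('#')."""
    lines = [ln.strip() for ln in source.splitlines()]
    return sum(1 for ln in lines if ln and not ln.startswith("#"))

def comment_density(source: str) -> float:
    """Fraction of non-blank lines that are comment-only lines."""
    lines = [ln.strip() for ln in source.splitlines() if ln.strip()]
    if not lines:
        return 0.0
    return sum(1 for ln in lines if ln.startswith("#")) / len(lines)

example = """
# add two numbers
def add(a, b):
    return a + b
"""

print(sloc(example))             # measurement obtained by applying the SLOC metric
print(comment_density(example))  # measurement obtained by applying comment density
```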
Linux
Linux (/ˈlɪnʊks/) is a family of open-source Unix-like operating systems based on the Linux kernel, an operating system kernel first released on September 17, 1991, by Linus Torvalds. Linux is typically packaged as a Linux distribution, which includes the kernel and supporting system software and libraries, many of which are provided by the GNU Project. Many Linux distributions use the word "Linux" in their name, but the Free Software Foundation uses the name "GNU/Linux" to emphasize the use and importance of GNU software in many distributions, causing some controversy.
Software development effort estimation
In software development, effort estimation is the process of predicting the most realistic amount of effort (expressed in terms of person-hours or money) required to develop or maintain software based on incomplete, uncertain and noisy input. Effort estimates may be used as input to project plans, iteration plans, budgets, investment analyses, pricing processes and bidding rounds. Published surveys on estimation practice suggest that expert estimation is the dominant strategy when estimating software development effort.
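As a brief illustration of how an expert-judgment estimate can feed a plan or budget, the sketch below uses the common three-point (PERT) formula; it is one widely used technique rather than anything prescribed above, and the feature, hour figures, and hourly rate are hypothetical.

```python
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """Three-point (PERT) effort estimate: E = (a + 4m + b) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical expert judgments for one feature, in person-hours.
effort_hours = pert_estimate(optimistic=40, most_likely=80, pessimistic=200)
budget = effort_hours * 95  # assumed blended hourly rate, purely illustrative
print(f"expected effort: {effort_hours:.0f} person-hours, budget input: ${budget:,.0f}")
```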
Kernel (operating system)
The kernel is a computer program at the core of a computer's operating system and generally has complete control over everything in the system. It is the portion of the operating system code that is always resident in memory and facilitates interactions between hardware and software components. A full kernel controls all hardware resources (e.g. I/O, memory, cryptography) via device drivers, arbitrates conflicts between processes concerning such resources, and optimizes the utilization of common resources such as CPU and cache usage, file systems, and network sockets.
Linux kernel
The Linux kernel is a free and open-source, monolithic, modular, multitasking, Unix-like operating system kernel. It was originally written in 1991 by Linus Torvalds for his i386-based PC, and it was soon adopted as the kernel for the GNU operating system, which was written to be a free (libre) replacement for Unix. Linux is provided under the GNU General Public License version 2 only, but it contains files under other compatible licenses.
COCOMO
The Constructive Cost Model (COCOMO) is a procedural software cost estimation model developed by Barry W. Boehm. The model parameters are derived from fitting a regression formula using data from historical projects (63 projects for COCOMO 81 and 163 projects for COCOMO II). Boehm first developed the model in the late 1970s and published it in his 1981 book Software Engineering Economics as a model for estimating effort, cost, and schedule for software projects.
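For the basic COCOMO 81 model, the fitted regression form is usually quoted as effort = a·KLOC^b (person-months) and schedule = c·effort^d (months). The sketch below uses the commonly published basic-model coefficients and should be read as an illustration, not a full COCOMO II implementation (which adds cost drivers and scale factors).

```python
# Basic COCOMO 81 sketch:
#   effort   = a * KLOC**b      (person-months)
#   schedule = c * effort**d    (months)
# Coefficients are the commonly cited basic-model values per project class.
COEFFICIENTS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, mode: str = "organic"):
    a, b, c, d = COEFFICIENTS[mode]
    effort = a * kloc ** b        # person-months
    schedule = c * effort ** d    # months
    staff = effort / schedule     # average headcount
    return effort, schedule, staff

effort, schedule, staff = basic_cocomo(32, "semi-detached")
print(f"{effort:.1f} person-months over {schedule:.1f} months (~{staff:.1f} people)")
```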
Cost estimation in software engineering
Cost estimation in software engineering is typically concerned with the financial spend on the effort to develop and test the software; this can also include requirements review, maintenance, training, managing, and buying extra equipment, servers and software. Many methods have been developed for estimating software costs for a given project.
Function point
The function point is a "unit of measurement" to express the amount of business functionality an information system (as a product) provides to a user. Function points are used to compute a functional size measurement (FSM) of software. The cost (in dollars or hours) of a single unit is calculated from past projects. There are several recognized standards and public specifications for sizing software based on function points, including ISO standards such as FiSMA (ISO/IEC 29881:2010, Information technology – Systems and software engineering – FiSMA 1.1 functional size measurement method).
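A minimal sketch of turning a functional size count into a cost figure follows; it uses the widely cited IFPUG complexity weights for an unadjusted function point count, while the component counts and the cost per function point are hypothetical (a real count follows the full IFPUG/ISO counting rules and may apply a value adjustment factor).

```python
# Unadjusted function point (UFP) sketch using commonly cited IFPUG weights.
WEIGHTS = {  # component: (low, average, high)
    "external_input":          (3, 4, 6),
    "external_output":         (4, 5, 7),
    "external_inquiry":        (3, 4, 6),
    "internal_logical_file":   (7, 10, 15),
    "external_interface_file": (5, 7, 10),
}
COMPLEXITY = {"low": 0, "average": 1, "high": 2}

def unadjusted_fp(counts: dict[tuple[str, str], int]) -> int:
    """counts maps (component, complexity) -> number of occurrences."""
    return sum(n * WEIGHTS[comp][COMPLEXITY[cx]] for (comp, cx), n in counts.items())

ufp = unadjusted_fp({
    ("external_input", "average"): 12,
    ("external_output", "high"): 5,
    ("external_inquiry", "low"): 8,
    ("internal_logical_file", "average"): 4,
    ("external_interface_file", "low"): 2,
})
cost_per_fp = 800  # derived from past projects in practice; value here is illustrative
print(ufp, "UFP, estimated cost:", ufp * cost_per_fp)
```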
Putnam model
The Putnam model is an empirical software effort estimation model. The original paper by Lawrence H. Putnam published in 1978 is seen as pioneering work in the field of software process modelling. As a group, empirical models work by collecting software project data (for example, effort and size) and fitting a curve to the data. Future effort estimates are made by providing size and calculating the associated effort using the equation that fits the original data (usually with some error). Created by Lawrence Putnam, Sr., the model describes the time and effort required to complete a software project of a specified size.
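A sketch of the commonly quoted form of Putnam's software equation is below; the parameter names, units, and productivity value are assumptions for illustration, since calibrated values come from fitting an organisation's past project data.

```python
# Commonly quoted form of the Putnam software equation:
#   size = C * effort**(1/3) * t_d**(4/3)
# where size is in SLOC, effort is total effort (person-years), t_d is the
# schedule (years), and C is a productivity parameter fitted to historical
# project data. Solving for effort given size and schedule:
def putnam_effort(size_sloc: float, schedule_years: float, productivity: float) -> float:
    return (size_sloc / (productivity * schedule_years ** (4 / 3))) ** 3

# Hypothetical numbers: 100 KSLOC, 2-year schedule, productivity parameter 10,000.
effort_py = putnam_effort(100_000, 2.0, 10_000)
print(f"{effort_py:.1f} person-years")
```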
Software sizing
Software sizing or software size estimation is an activity in software engineering that is used to determine or estimate the size of a software application or component so that other software project management activities (such as estimation or tracking) can be carried out. Size is an inherent characteristic of a piece of software, just as weight is an inherent characteristic of a tangible material. Software sizing is different from software effort estimation.