Data model

A data model is an abstract model that organizes elements of data and standardizes how they relate to one another and to the properties of real-world entities. For instance, a data model may specify that the data element representing a car be composed of a number of other elements which, in turn, represent the color and size of the car and define its owner. The corresponding professional activity is generally called data modeling or, more specifically, database design.
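As an illustration, here is a minimal sketch of such a model in Python, using hypothetical Car and Owner elements; a production data model would more likely be expressed in a schema language or database DDL rather than application code.

```python
# A minimal sketch of a simple data model with hypothetical Car and Owner
# elements; the field names are assumptions chosen only for this example.
from dataclasses import dataclass


@dataclass
class Owner:
    name: str


@dataclass
class Car:
    color: str
    size: str      # e.g. "compact" or "full-size"
    owner: Owner   # relationship to another data element


car = Car(color="red", size="compact", owner=Owner(name="Alice"))
print(car)
```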
Multiprotocol Label Switching

Multiprotocol Label Switching (MPLS) is a routing technique in telecommunications networks that directs data from one node to the next based on labels rather than network addresses. Whereas network addresses identify endpoints, the labels identify established paths between endpoints. MPLS can encapsulate packets of various network protocols, hence the multiprotocol component of the name. MPLS supports a range of access technologies, including T1/E1, ATM, Frame Relay, and DSL. In an MPLS network, labels are assigned to data packets.
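The following is a minimal sketch of the label-swapping idea in Python, using a hypothetical forwarding table; real MPLS routers perform the swap in hardware and distribute labels with protocols such as LDP or RSVP-TE.

```python
# A minimal sketch of label-based forwarding. The table contents and router
# names are assumptions chosen only for this example.
FORWARDING_TABLE = {
    # incoming label: (outgoing label, next hop)
    100: (200, "router-B"),
    101: (201, "router-C"),
}


def forward(in_label: int, payload: bytes) -> tuple[int, str, bytes]:
    """Swap the label and pick the next hop without inspecting the payload."""
    out_label, next_hop = FORWARDING_TABLE[in_label]
    return out_label, next_hop, payload


print(forward(100, b"IP packet"))  # -> (200, 'router-B', b'IP packet')
```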
Data governance

Data governance is a term used on both a macro and a micro level. The former is a political concept and forms part of international relations and Internet governance; the latter is a data management concept and forms part of corporate data governance. On the macro level, data governance refers to the governing of cross-border data flows by countries, and hence is more precisely called international data governance. This new field consists of "norms, principles and rules governing various types of data."
Library classification

A library classification is a system of knowledge organization in which sources are arranged according to a classification scheme and ordered systematically. A library classification is a notational system that represents the order of topics in the classification and allows items to be stored in that order. Library classification systems group related materials together, typically arranged as a hierarchical tree structure.
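As a small illustration, the sketch below sorts items by a hypothetical Dewey-style notation, which keeps related topics adjacent in shelf order; the class numbers and titles are assumptions chosen only for the example.

```python
# A minimal sketch of shelving by classification notation; sorting by the
# notation keeps broader classes before their subdivisions and related
# topics next to one another.
items = [
    ("516.3", "Analytic geometry"),
    ("510", "Mathematics"),
    ("516", "Geometry"),
    ("004", "Computer science"),
]

for notation, title in sorted(items):
    print(notation, title)
# Prints 004, 510, 516, 516.3 in shelf order.
```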
Data integrity

Data integrity is the maintenance of, and the assurance of, data accuracy and consistency over its entire life-cycle, and is a critical aspect of the design, implementation, and usage of any system that stores, processes, or retrieves data. The term is broad in scope and may have widely different meanings depending on the specific context, even under the same general umbrella of computing. It is at times used as a proxy term for data quality, while data validation is a prerequisite for data integrity.
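One common building block is a digest check. The sketch below, assuming a simple store-and-verify flow, flags any later modification of a record; real systems combine this with constraints, replication checks, and validation rules.

```python
# A minimal sketch of an integrity check using a cryptographic digest; the
# record contents are a hypothetical example.
import hashlib


def fingerprint(data: bytes) -> str:
    """Compute a digest that stays constant only while the data is unchanged."""
    return hashlib.sha256(data).hexdigest()


record = b"customer_id=42,balance=100.00"
stored_digest = fingerprint(record)

# Later, before using the record, confirm it still matches its digest.
assert fingerprint(record) == stored_digest, "data integrity violation"
```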
Colon classification

Colon classification (CC) is a library catalogue system developed by Shiyali Ramamrita Ranganathan. It was an early faceted (or analytico-synthetic) classification system. The first edition of colon classification was published in 1933, followed by six more editions. It is especially used in libraries in India. Its name originates from its use of colons to separate facets into classes. Many other classification schemes, some of which are unrelated, also use colons and other punctuation to perform various functions.
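A minimal sketch of that notational idea follows, assuming a simplified, hypothetical class number; real Colon Classification numbers also use commas, semicolons, and other connecting symbols to distinguish the facet types.

```python
# A minimal sketch of splitting a colon-separated class number into facets;
# the sample number is a hypothetical, simplified example.
def facets(class_number: str) -> list[str]:
    """Split a colon-separated class number into its facets."""
    return class_number.split(":")


print(facets("2:3:44"))  # -> ['2', '3', '44']
```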
Iris flower data set

The Iris flower data set or Fisher's Iris data set is a multivariate data set used and made famous by the British statistician and biologist Ronald Fisher in his 1936 paper "The use of multiple measurements in taxonomic problems" as an example of linear discriminant analysis. It is sometimes called Anderson's Iris data set because Edgar Anderson collected the data to quantify the morphologic variation of Iris flowers of three related species.
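The data set remains a standard example. A minimal sketch, assuming scikit-learn is available, loads it and fits a linear discriminant analysis model in the spirit of Fisher's paper.

```python
# A minimal sketch: load the Iris data set and fit linear discriminant
# analysis to separate the three species.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)      # 150 samples, 4 measurements each
lda = LinearDiscriminantAnalysis().fit(X, y)

# Resubstitution accuracy on the three Iris species.
print(f"training accuracy: {lda.score(X, y):.3f}")
```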