Learning theory (education): Learning theory describes how students receive, process, and retain knowledge during learning. Cognitive, emotional, and environmental influences, as well as prior experience, all play a part in how understanding, or a worldview, is acquired or changed and how knowledge and skills are retained. Behaviorists look at learning as an aspect of conditioning and advocate a system of rewards and targets in education.
Feature learning: In machine learning, feature learning or representation learning is a set of techniques that allows a system to automatically discover the representations needed for feature detection or classification from raw data. This replaces manual feature engineering and allows a machine to both learn the features and use them to perform a specific task. Feature learning is motivated by the fact that machine learning tasks such as classification often require input that is mathematically and computationally convenient to process.
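As a hedged illustration, the sketch below shows one simple form of unsupervised feature learning: a low-dimensional representation is learned from raw inputs with PCA and then reused by a downstream classifier. It assumes scikit-learn and NumPy are available; the data, dimensions, and labels are synthetic placeholders.

```python
# A minimal sketch of unsupervised feature learning with PCA, assuming
# scikit-learn is available; dataset shape and labels are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))           # raw, high-dimensional inputs
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # labels for a downstream task

# Learn a 10-dimensional representation from the raw data (no labels used).
pca = PCA(n_components=10).fit(X)
Z = pca.transform(X)

# Use the learned features to perform a specific task (here, classification).
clf = LogisticRegression().fit(Z, y)
print("train accuracy on learned features:", clf.score(Z, y))
```

Deep representation learning replaces the PCA step with a learned encoder, but the pattern of learning a representation first and reusing it for the task is the same.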
Automated machine learning: Automated machine learning (AutoML) is the process of automating the tasks of applying machine learning to real-world problems. AutoML potentially includes every stage from beginning with a raw dataset to building a machine learning model ready for deployment. AutoML was proposed as an artificial intelligence-based solution to the growing challenge of applying machine learning. The high degree of automation in AutoML aims to allow non-experts to make use of machine learning models and techniques without requiring them to become experts in machine learning.
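A very reduced sketch of the AutoML idea follows: automatically searching over candidate models and hyperparameters instead of choosing them by hand. It assumes scikit-learn; a full AutoML system would also automate preprocessing, feature engineering, and ensembling, and the search space here is purely illustrative.

```python
# A simplified sketch of the core idea behind AutoML: an automated search over
# candidate models and hyperparameters. Assumes scikit-learn; real AutoML
# systems automate far more of the pipeline than this.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# The "model" step is itself a search dimension: different estimator families
# and their hyperparameters are tried automatically via cross-validation.
pipeline = Pipeline([("scale", StandardScaler()), ("model", SVC())])
search_space = [
    {"model": [SVC()], "model__C": [0.1, 1.0, 10.0]},
    {"model": [RandomForestClassifier(random_state=0)], "model__n_estimators": [50, 200]},
]

search = GridSearchCV(pipeline, search_space, cv=5).fit(X, y)
print(search.best_params_, search.best_score_)
```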
Performance appraisal: A performance appraisal, also referred to as a performance review, performance evaluation, (career) development discussion, or employee appraisal, sometimes shortened to "PA", is a periodic and systematic process whereby the job performance of an employee is documented and evaluated. This is done after employees are trained about work and settle into their jobs. Performance appraisals are a part of career development and consist of regular reviews of employee performance within organizations.
Ensemble learning: In statistics and machine learning, ensemble methods use multiple learning algorithms to obtain better predictive performance than could be obtained from any of the constituent learning algorithms alone. Unlike a statistical ensemble in statistical mechanics, which is usually infinite, a machine learning ensemble consists of only a concrete finite set of alternative models, but typically allows for much more flexible structure to exist among those alternatives.
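The sketch below illustrates a concrete, finite ensemble in the sense described above: three different learners combined by majority vote. It assumes scikit-learn; the dataset and base models are arbitrary choices made for the example.

```python
# A minimal sketch of an ensemble: several different learners are combined by
# majority vote. Assumes scikit-learn; models and data are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=500, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(random_state=0)),
        ("nb", GaussianNB()),
    ],
    voting="hard",  # a concrete, finite set of alternative models, combined by vote
)

print("ensemble CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```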
Data mining: Data mining is the process of extracting and discovering patterns in large data sets involving methods at the intersection of machine learning, statistics, and database systems. Data mining is an interdisciplinary subfield of computer science and statistics with an overall goal of extracting information (with intelligent methods) from a data set and transforming the information into a comprehensible structure for further use. Data mining is the analysis step of the "knowledge discovery in databases" process, or KDD.
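As a toy illustration of pattern discovery, the following sketch counts item pairs that frequently co-occur across transactions, a miniature version of frequent-itemset mining. The transactions and support threshold are invented for the example; real data mining operates on far larger data sets with specialised algorithms.

```python
# A toy illustration of pattern discovery: find item pairs that frequently
# co-occur in transactions. Data and threshold are made up for the example.
from collections import Counter
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"beer", "bread"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
]

min_support = 3  # a pair must appear in at least 3 transactions
pair_counts = Counter(
    pair for t in transactions for pair in combinations(sorted(t), 2)
)
frequent_pairs = {p: c for p, c in pair_counts.items() if c >= min_support}
print(frequent_pairs)  # e.g. {('bread', 'milk'): 3, ('butter', 'milk'): 3}
```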
Scale-invariant feature transform: The scale-invariant feature transform (SIFT) is a computer vision algorithm to detect, describe, and match local features in images, invented by David Lowe in 1999. Applications include object recognition, robotic mapping and navigation, 3D modeling, gesture recognition, video tracking, individual identification of wildlife, and match moving. SIFT keypoints of objects are first extracted from a set of reference images and stored in a database.
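A minimal sketch of this keypoint workflow, assuming OpenCV's SIFT implementation (cv2.SIFT_create) is available; the image file names are placeholders.

```python
# A minimal sketch of SIFT keypoint extraction and matching with OpenCV.
# Assumes opencv-python with SIFT available; the file names are placeholders.
import cv2

reference = cv2.imread("reference.jpg", cv2.IMREAD_GRAYSCALE)
query = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_ref, desc_ref = sift.detectAndCompute(reference, None)  # keypoints + descriptors
kp_q, desc_q = sift.detectAndCompute(query, None)

# Match descriptors and keep matches that pass Lowe's ratio test.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(desc_q, desc_ref, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(kp_ref)} reference keypoints, {len(good)} good matches")
```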
Algorithmic art: Algorithmic art or algorithm art is art, mostly visual art, in which the design is generated by an algorithm. Algorithmic artists are sometimes called algorists. Algorithmic art, also known as computer-generated art, is a subset of generative art (generated by an autonomous system) and is related to systems art (influenced by systems theory). Fractal art is an example of algorithmic art. For an image of reasonable size, even the simplest algorithms require too much calculation for manual execution to be practical, and they are thus executed on either a single computer or on a cluster of computers.
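As a small, self-contained example of algorithmic (fractal) art, the sketch below renders a coarse ASCII view of the Mandelbrot set. The resolution and iteration limit are arbitrary; producing images of any real size is precisely why such algorithms are run on computers rather than by hand.

```python
# A small sketch of algorithmic (fractal) art: the Mandelbrot set rendered as
# ASCII. Resolution and escape limit are arbitrary choices for illustration.
WIDTH, HEIGHT, MAX_ITER = 80, 24, 50

for row in range(HEIGHT):
    line = ""
    for col in range(WIDTH):
        # Map pixel coordinates to a point c in the complex plane.
        c = complex(-2.0 + 3.0 * col / WIDTH, -1.2 + 2.4 * row / HEIGHT)
        z, n = 0j, 0
        while abs(z) <= 2 and n < MAX_ITER:
            z = z * z + c
            n += 1
        line += "#" if n == MAX_ITER else " "  # '#' marks points inside the set
    print(line)
```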
Data warehouse: In computing, a data warehouse (DW or DWH), also known as an enterprise data warehouse (EDW), is a system used for reporting and data analysis and is considered a core component of business intelligence. Data warehouses are central repositories of integrated data from one or more disparate sources. They store current and historical data in a single place and are used to create analytical reports for workers throughout the enterprise. This is beneficial for companies as it enables them to interrogate their data, draw insights from it, and make decisions.
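A toy sketch of the central-repository idea follows, using Python's built-in sqlite3 as a stand-in for a warehouse that integrates records from two hypothetical source systems and answers an analytical query. The table, sources, and figures are invented for the example.

```python
# A toy sketch of the data-warehouse idea: integrate records from disparate
# sources into one central store and query it for a report. Uses the standard
# sqlite3 module; table names, sources, and data are purely illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")  # stands in for the central repository
conn.execute("CREATE TABLE sales (source TEXT, region TEXT, year INT, amount REAL)")

crm_rows = [("crm", "EU", 2023, 120.0), ("crm", "US", 2023, 200.0)]
erp_rows = [("erp", "EU", 2024, 150.0), ("erp", "US", 2024, 210.0)]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)", crm_rows + erp_rows)

# Analytical report over current and historical data from both sources.
for region, year, total in conn.execute(
    "SELECT region, year, SUM(amount) FROM sales GROUP BY region, year ORDER BY region, year"
):
    print(region, year, total)
```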
Random access: Random access (more precisely and more generally called direct access) is the ability to access an arbitrary element of a sequence in equal time, or any datum from a population of addressable elements roughly as easily and efficiently as any other, no matter how many elements may be in the set. In computer science it is typically contrasted with sequential access, which requires data to be retrieved in the order it was stored. For example, data might be stored notionally in a single sequence like a row, in two dimensions like rows and columns on a surface, or in multiple dimensions.
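The contrast with sequential access can be made concrete with a short sketch: a Python list offers constant-time access to an arbitrary position, while a hand-rolled linked list must be walked element by element from the start. The linked-list type here is illustrative.

```python
# A small sketch contrasting random (direct) access with sequential access:
# indexing a list reaches any element directly, while a linked list must be
# traversed from the head. The Node type is purely illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    value: int
    next: Optional["Node"] = None

values = list(range(1_000))

# Random access: jump straight to an arbitrary position in constant time.
print(values[742])

# Sequential access: build a linked list and walk it from the start.
head = None
for v in reversed(values):
    head = Node(v, head)

node, steps = head, 0
while steps < 742:
    node = node.next
    steps += 1
print(node.value)  # the same element, reached only after 742 steps
```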