Honeycomb structure
Honeycomb structures are natural or man-made structures that have the geometry of a honeycomb, minimizing the amount of material used and thereby the weight and material cost. The geometry of honeycomb structures can vary widely, but the common feature of all such structures is an array of hollow cells formed between thin vertical walls. The cells are often columnar and hexagonal in shape. A honeycomb-shaped structure provides a material with minimal density and relatively high out-of-plane compression and out-of-plane shear properties.
Computational economics
Computational economics is an interdisciplinary research discipline that involves computer science, economics, and management science. The subject encompasses computational modeling of economic systems. Some of these areas are unique to computational economics, while others augment established areas of economics by allowing robust data analytics and the solution of problems that would be arduous to research without computers and associated numerical methods.
Aggregate (composite)
Aggregate is the component of a composite material that resists compressive stress and provides bulk to the composite material. For efficient filling, aggregate should be much smaller than the finished item, but have a wide variety of sizes. For example, the particles of stone used to make concrete typically include both sand and gravel. Aggregate composites tend to be much easier to fabricate, and much more predictable in their finished properties, than fiber composites.
Agent-based computational economics
Agent-based computational economics (ACE) is the area of computational economics that studies economic processes, including whole economies, as dynamic systems of interacting agents. As such, it falls in the paradigm of complex adaptive systems. In corresponding agent-based models, the "agents" are "computational objects modeled as interacting according to rules" over space and time, not real people. The rules are formulated to model behavior and social interactions based on incentives and information.
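As a deliberately toy illustration of this agent-based style (not drawn from the text above; the exchange rule, class names, and parameters are all hypothetical), the sketch below treats an economy as a population of interacting agent objects. Aggregate outcomes, here a wealth distribution, emerge only from repeated local interactions governed by a simple behavioral rule.

```python
# Minimal agent-based sketch: agents are computational objects whose
# pairwise interactions, repeated over time, produce an emergent
# wealth distribution. All rules and numbers are illustrative.
import random

class Agent:
    def __init__(self, wealth):
        self.wealth = wealth

    def trade_with(self, other):
        # Toy behavioral rule: transfer one unit of wealth to a randomly
        # matched partner, but only if the giver can afford it.
        if self.wealth > 0:
            self.wealth -= 1
            other.wealth += 1

def simulate(num_agents=100, steps=10_000, seed=0):
    random.seed(seed)
    agents = [Agent(wealth=10) for _ in range(num_agents)]
    for _ in range(steps):
        giver, receiver = random.sample(agents, 2)
        giver.trade_with(receiver)
    return sorted(a.wealth for a in agents)

if __name__ == "__main__":
    wealth = simulate()
    print("poorest agent:", wealth[0], "richest agent:", wealth[-1])
```

Even with identical starting endowments and a symmetric rule, repeated random exchange produces a markedly unequal distribution, which is the kind of emergent, system-level result ACE models are built to study.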
Lamination
Lamination is the technique/process of manufacturing a material in multiple layers, so that the composite material achieves improved strength, stability, sound insulation, appearance, or other properties from the use of the differing materials, such as plastic. A laminate is a permanently assembled object created using heat, pressure, welding, or adhesives. Various coating machines, machine presses and calendering equipment are used. There are different lamination processes, depending primarily on the type or types of materials to be laminated.
Experimental economics
Experimental economics is the application of experimental methods to study economic questions. Data collected in experiments are used to estimate effect size, test the validity of economic theories, and illuminate market mechanisms. Economic experiments usually use cash to motivate subjects, in order to mimic real-world incentives. Experiments are used to help understand how and why markets and other exchange systems function as they do. Experimental economics has also expanded to understand institutions and the law (experimental law and economics).
Computational biology
Computational biology refers to the use of data analysis, mathematical modeling and computational simulations to understand biological systems and relationships. An intersection of computer science, biology, and big data, the field also has foundations in applied mathematics, chemistry, and genetics. It differs from biological computing, a subfield of computer engineering which uses bioengineering to build computers. Bioinformatics, the analysis of informatics processes in biological systems, began in the early 1970s.
Theoretical computer science
Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on mathematical aspects of computer science such as the theory of computation, lambda calculus, and type theory. It is difficult to circumscribe the theoretical areas precisely; the ACM's Special Interest Group on Algorithms and Computation Theory (SIGACT) provides a description of the field's scope. While logical inference and mathematical proof had existed previously, in 1931 Kurt Gödel proved with his incompleteness theorem that there are fundamental limitations on what statements can be proved or disproved.
Algebraic structure
In mathematics, an algebraic structure consists of a nonempty set A (called the underlying set, carrier set or domain), a collection of operations on A (typically binary operations such as addition and multiplication), and a finite set of identities, known as axioms, that these operations must satisfy. An algebraic structure may be based on other algebraic structures with operations and axioms involving several structures.
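As a concrete example, added here for illustration rather than taken from the text, a group is a familiar algebraic structure: an underlying set G, a single binary operation, and three identities serving as axioms.

```latex
% A group (G, \cdot): underlying set G, one binary operation \cdot,
% and a finite set of identities (axioms) the operation must satisfy.
\begin{align*}
  &\text{Associativity:} && (a \cdot b) \cdot c = a \cdot (b \cdot c)
      && \forall\, a, b, c \in G \\
  &\text{Identity:}      && \exists\, e \in G :\; e \cdot a = a \cdot e = a
      && \forall\, a \in G \\
  &\text{Inverses:}      && \forall\, a \in G \;\exists\, a^{-1} \in G :\;
      a \cdot a^{-1} = a^{-1} \cdot a = e
\end{align*}
```

Adding a second operation and further axioms (for example, distributivity) builds richer structures such as rings on top of this one, in line with the note above that structures may be based on other structures.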
Out-of-order execution
In computer engineering, out-of-order execution (or more formally dynamic execution) is a paradigm used in most high-performance central processing units to make use of instruction cycles that would otherwise be wasted. In this paradigm, a processor executes instructions in an order governed by the availability of input data and execution units, rather than by their original order in a program. In doing so, the processor can avoid being idle while waiting for the preceding instruction to complete and can, in the meantime, process the next instructions that are able to run immediately and independently.
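The sketch below is a toy, data-availability-only model of this idea (the instruction names, latencies, and single-structure scheduler are assumptions for illustration; real hardware also tracks execution-unit availability, register renaming, and in-order retirement). Each instruction issues as soon as its source registers are ready, regardless of program order, so independent work proceeds while a long-latency load is still outstanding.

```python
# Toy dynamic-scheduling sketch: issue an instruction once all of its
# source registers are ready, ignoring structural hazards and retirement.
from dataclasses import dataclass

@dataclass
class Instr:
    name: str
    reads: tuple   # source registers
    writes: str    # destination register
    latency: int   # cycles until the result becomes available

program = [
    Instr("LOAD r1 <- [mem]",  (),           "r1", 5),  # long-latency load
    Instr("ADD  r2 <- r1+r1",  ("r1",),      "r2", 1),  # depends on the load
    Instr("MUL  r3 <- r4*r5",  ("r4", "r5"), "r3", 1),  # independent
    Instr("SUB  r6 <- r4-r5",  ("r4", "r5"), "r6", 1),  # independent
]

ready_at = {"r4": 0, "r5": 0}   # registers whose values are already available
pending = list(program)
cycle = 0

while pending:
    issued = []
    for ins in pending:
        # Issue once every source register has produced its value.
        if all(r in ready_at and ready_at[r] <= cycle for r in ins.reads):
            print(f"cycle {cycle}: issue {ins.name}")
            ready_at[ins.writes] = cycle + ins.latency
            issued.append(ins)
    pending = [i for i in pending if i not in issued]
    cycle += 1
```

Running it shows MUL and SUB issuing in cycle 0 alongside the LOAD, while the dependent ADD, second in program order, waits until cycle 5: the processor stays busy instead of stalling behind the load.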