Counterfeit consumer goods
Counterfeit consumer goods (or counterfeit and fraudulent, suspect items, CFSI) are goods, often of inferior quality, made or sold under another's brand name without the brand owner's authorization. Sellers of such goods may infringe on the trademark, patent, or copyright of the brand owner by passing off their goods as made by the brand owner. Counterfeit products made up 5 to 7% of world trade in 2013, and in 2014 cost an estimated 2.5 million jobs worldwide, with up to 750,000 jobs lost in the U.S.
Counterfeit
To counterfeit means to imitate something authentic, with the intent to steal, destroy, or replace the original, for use in illegal transactions, or otherwise to deceive individuals into believing that the fake is of equal or greater value than the real thing. Counterfeit products are fakes or unauthorized replicas of the real product. Counterfeit products are often produced with the intent to take advantage of the superior value of the imitated product.
Counterfeit money
Counterfeit money is currency produced without the legal sanction of a state or government, usually in a deliberate attempt to imitate that currency and deceive its recipient. Producing or using counterfeit money is a form of fraud or forgery, and is illegal. The business of counterfeiting money is nearly as old as money itself: plated copies (known as fourrées) have been found of Lydian coins, which are thought to be among the first Western coins.
Counterfeit watch
A counterfeit watch (or replica watch) is an unauthorised copy of an authentic watch. High-end luxury watches such as Rolex, Patek Philippe and Richard Mille are frequently counterfeited and sold on city streets and online. With technological advancements, many non-luxury and inexpensive quartz watches are also commonly counterfeited. According to estimates by the Swiss Customs Service, some 30 to 40 million counterfeit watches are put into circulation each year.
Information
Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or of its abstractions. Any natural process that is not completely random, and any observable pattern in any medium, can be said to convey some amount of information. Whereas digital signals and other data use discrete signs to convey information, other phenomena and artefacts such as analogue signals, poems, pictures, music or other sounds, and currents convey information in a more continuous form.
Ethernet physical layer
The physical-layer specifications of the Ethernet family of computer network standards are published by the Institute of Electrical and Electronics Engineers (IEEE), which defines the electrical or optical properties and the transfer speed of the physical connection between a device and the network or between network devices. It is complemented by the MAC layer and the logical link layer. The Ethernet physical layer has evolved over its existence starting in 1980 and encompasses multiple physical media interfaces and several orders of magnitude of speed, from 1 Mbit/s to 400 Gbit/s.
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
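Entropy, the key measure mentioned above, quantifies the average information content of a random variable. A minimal sketch of Shannon entropy in bits (the function name `shannon_entropy` and the example distributions are illustrative, not from the original text):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(p * log2(p)) over outcomes with p > 0, measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The `p > 0` guard reflects the convention that outcomes with zero probability contribute nothing to the entropy (0 · log 0 is taken to be 0).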
Information science
Information science (also known as information studies) is an academic field which is primarily concerned with analysis, collection, classification, manipulation, storage, retrieval, movement, dissemination, and protection of information. Practitioners within and outside the field study the application and the usage of knowledge in organizations, in addition to the interaction between people, organizations, and any existing information systems, with the aim of creating, replacing, improving, or understanding the information systems.
Information overload
Information overload (also known as infobesity, infoxication, information anxiety, and information explosion) is the difficulty in understanding an issue and effectively making decisions when one has too much information (TMI) about that issue, and is generally associated with the excessive quantity of daily information. The term "information overload" was first used as early as 1962 by scholars in management and information studies, including in Bertram Gross' 1964 book, The Managing of Organizations, and was further popularized by Alvin Toffler in his bestselling 1970 book Future Shock.
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
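The definition above can be made concrete with a short sketch that computes I(X;Y) = Σ p(x,y) log₂[p(x,y) / (p(x)p(y))] from a joint distribution given as a 2-D table (the function name `mutual_information` and the example tables are illustrative assumptions):

```python
import math

def mutual_information(joint):
    """Mutual information in bits from a joint distribution p(x, y),
    given as a list of rows (x indexes rows, y indexes columns)."""
    px = [sum(row) for row in joint]          # marginal p(x)
    py = [sum(col) for col in zip(*joint)]    # marginal p(y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # zero-probability cells contribute nothing
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Independent variables share no information:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0

# Perfectly correlated fair bits share a full bit:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```

The two extremes illustrate the link to entropy noted above: for independent variables MI is zero, while for perfectly dependent variables it equals the entropy of either variable.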