Lossless compression: Lossless compression is a class of data compression that allows the original data to be perfectly reconstructed from the compressed data with no loss of information. Lossless compression is possible because most real-world data exhibits statistical redundancy. By contrast, lossy compression permits reconstruction only of an approximation of the original data, though usually with greatly improved compression rates (and therefore reduced media sizes).
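A minimal sketch of the idea in Python: run-length encoding exploits statistical redundancy (runs of repeated bytes), and the original input is reconstructed exactly. The helper names rle_encode/rle_decode are illustrative, not from any particular library.

```python
# Run-length encoding: a toy lossless compressor. Names are illustrative.

def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Encode data as (run_length, byte_value) pairs."""
    runs = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        runs.append((j - i, data[i]))
        i = j
    return runs

def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    """Reverse the encoding; reconstruction is exact (lossless)."""
    return b"".join(bytes([b]) * n for n, b in runs)

data = b"aaaaaabbbbcc"
assert rle_decode(rle_encode(data)) == data  # perfect reconstruction
```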
One-way compression function: In cryptography, a one-way compression function is a function that transforms two fixed-length inputs into a fixed-length output. The transformation is "one-way", meaning that it is difficult given a particular output to compute inputs which compress to that output. One-way compression functions are not related to conventional data compression algorithms, which instead can be inverted exactly (lossless compression) or approximately (lossy compression) to the original data.
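As an illustrative sketch, the Davies–Meyer construction builds a one-way compression function from a block cipher; the example below assumes the third-party Python `cryptography` package and uses AES-128 so that both fixed-length inputs and the output are 16 bytes.

```python
# Davies-Meyer sketch: h' = E_m(h) XOR h, where the message block m is
# used as the AES-128 key. Assumes the `cryptography` package.
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def davies_meyer(h: bytes, m: bytes) -> bytes:
    """Compress two 16-byte inputs into one 16-byte output."""
    assert len(h) == 16 and len(m) == 16
    enc = Cipher(algorithms.AES(m), modes.ECB()).encryptor()
    e = enc.update(h) + enc.finalize()
    # XOR with the chaining input makes the step hard to invert.
    return bytes(a ^ b for a, b in zip(e, h))

out = davies_meyer(b"\x00" * 16, b"sixteen byte key")
print(out.hex())
```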
Compression artifact: A compression artifact (or artefact) is a noticeable distortion of media (including images, audio, and video) caused by the application of lossy compression. Lossy data compression involves discarding some of the media's data so that it becomes small enough to be stored within the desired disk space or transmitted (streamed) within the available bandwidth (known as the data rate or bit rate). If the compressor cannot store enough data in the compressed version, the result is a loss of quality, or introduction of artifacts.
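A toy sketch of how discarding data creates artifacts: coarsely quantizing samples (a stand-in for a real lossy codec, not one) shrinks the representation, and the reconstruction error left behind is the artifact. All names here are illustrative.

```python
# Quantizing to fewer levels ~ lowering the bit rate; the error that
# survives reconstruction is the "artifact".
samples = [0.12, 0.48, 0.53, 0.91]

def quantize(x: float, levels: int) -> int:
    return round(x * (levels - 1))          # discard precision

def dequantize(q: int, levels: int) -> float:
    return q / (levels - 1)                 # approximate reconstruction

for levels in (256, 4):                     # generous vs. starved budget
    recon = [dequantize(quantize(x, levels), levels) for x in samples]
    err = max(abs(a - b) for a, b in zip(samples, recon))
    print(f"{levels} levels -> max error {err:.3f}")
```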
Collision resistance: In cryptography, collision resistance is a property of cryptographic hash functions: a hash function H is collision-resistant if it is hard to find two inputs that hash to the same output; that is, two inputs a and b where a ≠ b but H(a) = H(b). The pigeonhole principle means that any hash function with more inputs than outputs will necessarily have such collisions; the harder they are to find, the more cryptographically secure the hash function is.
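As a hedged illustration: truncating SHA-256 to 24 bits makes the pigeonhole-guaranteed collisions cheap to find by a birthday-style search (roughly 2^12 attempts on average), whereas the full 256-bit output keeps them computationally out of reach. The helper h24 is ad hoc, not a standard function.

```python
# Birthday-style collision search on a deliberately truncated hash.
import hashlib
from itertools import count

def h24(msg: bytes) -> bytes:
    return hashlib.sha256(msg).digest()[:3]   # toy 24-bit "hash"

seen = {}
for i in count():
    m = str(i).encode()
    d = h24(m)
    if d in seen and seen[d] != m:
        print(f"collision: {seen[d]!r} and {m!r} -> {d.hex()}")
        break
    seen[d] = m
```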
Collision attack: In cryptography, a collision attack on a cryptographic hash tries to find two inputs producing the same hash value, i.e. a hash collision. This is in contrast to a preimage attack, where a specific target hash value is specified. There are roughly two types of collision attacks: Classical collision attack: find two different messages m1 and m2 such that hash(m1) = hash(m2). More generally: Chosen-prefix collision attack: given two different prefixes p1 and p2, find two appendages m1 and m2 such that hash(p1 ∥ m1) = hash(p2 ∥ m2), where ∥ denotes the concatenation operation.
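A sketch of the chosen-prefix variant against the same kind of deliberately truncated hash; the prefixes p1 and p2 and the helper h24 are illustrative choices, not part of any real attack toolchain.

```python
# Chosen-prefix collision: given p1 != p2, find m1, m2 with
# h(p1 || m1) == h(p2 || m2), using a truncated 24-bit hash.
import hashlib
from itertools import count

def h24(msg: bytes) -> bytes:
    return hashlib.sha256(msg).digest()[:3]

p1, p2 = b"alice:", b"bob:"
table = {}
for i in count():
    m = str(i).encode()
    table[h24(p1 + m)] = m         # remember appendages seen under p1
    d = h24(p2 + m)
    if d in table:                 # same digest reached from p2
        m1, m2 = table[d], m
        assert h24(p1 + m1) == h24(p2 + m2)
        print(f"m1={m1!r} m2={m2!r} -> {d.hex()}")
        break
```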
Cryptographic hash function: A cryptographic hash function (CHF) is a hash algorithm (a map of an arbitrary binary string to a binary string with a fixed size of n bits) that has special properties desirable for a cryptographic application: the probability of a particular n-bit output result (hash value) for a random input string ("message") is 2^-n (as for any good hash), so the hash value can be used as a representative of the message; finding an input string that matches a given hash value (a pre-image) is infeasible, assuming all input strings are equally likely.
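A brief demonstration with Python's standard hashlib (here n = 256): the output size is fixed regardless of input length, and flipping a single input bit yields an unrelated-looking digest.

```python
import hashlib

a = hashlib.sha256(b"message").hexdigest()
b = hashlib.sha256(b"messagd").hexdigest()   # last byte differs by one bit
print(len(a) * 4, "bits")                    # 256-bit output either way
print(a)
print(b)                                     # no visible relation to a
```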
Preimage attack: In cryptography, a preimage attack on cryptographic hash functions tries to find a message that has a specific hash value. A cryptographic hash function should resist attacks on its preimage (set of possible inputs). In the context of attack, there are two types of preimage resistance: preimage resistance: for essentially all pre-specified outputs, it is computationally infeasible to find any input that hashes to that output; i.e., given y, it is difficult to find an x such that h(x) = y; and second-preimage resistance: for a specified input, it is computationally infeasible to find another input which produces the same output; i.e., given x, it is difficult to find a second input x′ ≠ x such that h(x) = h(x′).
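An illustrative brute-force preimage search: given a target output y, guess inputs until one hashes to y. It succeeds here only because the hash is truncated to 16 bits; the target string and the helper name are made up for the example.

```python
# Brute-force preimage search against a toy 16-bit hash; the full
# 256-bit output would require on the order of 2^256 guesses.
import hashlib
from itertools import count

def h16(msg: bytes) -> bytes:
    return hashlib.sha256(msg).digest()[:2]   # toy 16-bit hash

y = h16(b"secret")                            # pre-specified output
for i in count():
    x = str(i).encode()
    if h16(x) == y:
        print(f"preimage found: {x!r}")       # likely x != b"secret"
        break
```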
Block cipher: In cryptography, a block cipher is a deterministic algorithm that operates on fixed-length groups of bits, called blocks. Block ciphers are the elementary building blocks of many cryptographic protocols. They are ubiquitous in the storage and exchange of data, where such data is secured and authenticated via encryption. A block cipher applies an unvarying transformation to each block. Even a secure block cipher is suitable for the encryption of only a single block of data at a time, using a fixed key.
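A minimal sketch of the "single block, fixed key" point, assuming the third-party Python `cryptography` package: AES-128 deterministically maps one 16-byte block to one 16-byte block, and handling more than one block safely requires a mode of operation on top of this primitive.

```python
# One 16-byte block through AES with a fixed key; the decryptor is the
# deterministic inverse. Assumes the `cryptography` package.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)                       # fixed 128-bit key
block = b"exactly16bytes!!"                # one fixed-length block
enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ct = enc.update(block) + enc.finalize()
dec = Cipher(algorithms.AES(key), modes.ECB()).decryptor()
assert dec.update(ct) + dec.finalize() == block   # exact inverse
```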
Weak key: In cryptography, a weak key is a key which, when used with a specific cipher, makes the cipher behave in some undesirable way. Weak keys usually represent a very small fraction of the overall keyspace, which means that a cipher key generated at random is very unlikely to give rise to a security problem. Nevertheless, it is considered desirable for a cipher to have no weak keys. A cipher with no weak keys is said to have a flat, or linear, key space.
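A toy illustration (not a real cipher): in the hypothetical XOR cipher below, the all-zero key makes encryption the identity map, an undesirable behavior loosely analogous to DES's weak keys, for which encrypting twice returns the plaintext.

```python
# Hypothetical XOR "cipher" with an obvious weak key: the all-zero key
# leaves the plaintext unchanged.
def toy_encrypt(key: bytes, block: bytes) -> bytes:
    return bytes(k ^ b for k, b in zip(key, block))

weak = bytes(8)                        # all-zero key
pt = b"8bytes!!"
assert toy_encrypt(weak, pt) == pt     # ciphertext equals plaintext
```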
Deflate: In computing, Deflate (stylized as DEFLATE) is a lossless data compression file format that uses a combination of LZ77 and Huffman coding. It was designed by Phil Katz for version 2 of his PKZIP archiving tool. Deflate was later specified in RFC 1951 (1996). Katz also designed the original algorithm used to construct Deflate streams. This algorithm was patented and assigned to PKWARE, Inc. As stated in the RFC document, an algorithm producing Deflate files was widely thought to be implementable in a manner not covered by patents.
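Python's standard zlib module produces Deflate streams (raw Deflate when wbits is -15), which makes for a quick lossless round-trip check; the sample text is arbitrary.

```python
# Compress with raw Deflate (RFC 1951) via zlib and verify the
# lossless round trip.
import zlib

data = b"Deflate combines LZ77 back-references with Huffman coding." * 20
comp = zlib.compressobj(level=9, wbits=-15)   # wbits=-15 -> raw Deflate
raw = comp.compress(data) + comp.flush()
print(len(data), "->", len(raw), "bytes")
assert zlib.decompress(raw, wbits=-15) == data   # perfect reconstruction
```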