ping is a computer network administration software utility used to test the reachability of a host on an Internet Protocol (IP) network. It is available for virtually all operating systems that have networking capability, including most embedded network administration software.
Ping measures the round-trip time for messages sent from the originating host to a destination computer that are echoed back to the source. The name comes from active sonar terminology that sends a pulse of sound and listens for the echo to detect objects under water.
Ping operates by means of Internet Control Message Protocol (ICMP) packets. Pinging involves sending an ICMP echo request to the target host and waiting for an ICMP echo reply. The program reports errors, packet loss, and a statistical summary of the results, typically including the minimum, maximum, and mean round-trip times, and the standard deviation of the mean.
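To make the mechanism concrete, the sketch below builds a single ICMP echo request over a raw socket and times the echo reply. It is a minimal illustration rather than a reimplementation of any actual ping: the payload, identifier handling, and lack of reply matching are simplifications, and raw sockets generally require elevated privileges.

```python
import os
import socket
import struct
import time

def icmp_checksum(data: bytes) -> int:
    """RFC 1071 Internet checksum over the ICMP message."""
    if len(data) % 2:
        data += b"\x00"
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    total = (total >> 16) + (total & 0xFFFF)
    total += total >> 16
    return ~total & 0xFFFF

def ping_once(host: str, timeout: float = 1.0):
    """Send one ICMP echo request; return the round-trip time in ms,
    or None if no echo reply arrives before the timeout."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_RAW,
                         socket.getprotobyname("icmp"))
    sock.settimeout(timeout)

    ident = os.getpid() & 0xFFFF                       # identifier field
    payload = b"ping-example"                          # arbitrary payload
    header = struct.pack("!BBHHH", 8, 0, 0, ident, 1)  # type 8 = echo request
    checksum = icmp_checksum(header + payload)
    packet = struct.pack("!BBHHH", 8, 0, checksum, ident, 1) + payload

    start = time.monotonic()
    sock.sendto(packet, (socket.gethostbyname(host), 0))
    try:
        data, _ = sock.recvfrom(1024)
    except socket.timeout:
        return None
    finally:
        sock.close()

    rtt_ms = (time.monotonic() - start) * 1000
    # The reply begins with a 20-byte IPv4 header; ICMP type 0 = echo reply.
    return rtt_ms if data[20] == 0 else None
```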
The command-line options of the ping utility and its output vary between the numerous implementations. Options may include the size of the payload, count of tests, limits for the number of network hops (TTL) that probes traverse, interval between the requests and time to wait for a response. Many systems provide a companion utility ping6, for testing on Internet Protocol version 6 (IPv6) networks, which implement ICMPv6.
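For concreteness, the options named above might be passed as in the sketch below. The flag names follow Linux iputils ping and the host and values are placeholders; other implementations spell the options differently (Windows, for instance, uses -n for the request count).

```python
import subprocess

# Flags follow Linux iputils ping; names and availability vary by implementation.
result = subprocess.run(
    ["ping",
     "-c", "4",      # number of echo requests to send
     "-s", "56",     # payload size in bytes
     "-t", "64",     # IP time-to-live (maximum number of hops)
     "-i", "0.5",    # interval between requests, in seconds
     "-W", "2",      # time to wait for each reply, in seconds
     "example.com"], # placeholder destination
    capture_output=True,
    text=True,
)
print(result.stdout)

# IPv6 targets are probed with the companion ping6 or, on newer systems, ping -6.
```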
The ping utility was written by Mike Muuss in December 1983 during his employment at the Ballistic Research Laboratory, now the US Army Research Laboratory. A remark by David Mills on using ICMP echo packets for IP network diagnosis and measurements prompted Muuss to create the utility to troubleshoot network problems. The author named it after the sound that sonar makes, since its methodology is analogous to sonar's echolocation. The backronym Packet InterNet Groper for PING has been used for over 30 years, and although Muuss says that from his point of view PING was not intended as an acronym, he has acknowledged Mills' expansion of the name.
This is an introductory course on computer security and privacy. Its goal is to give students the means to reason about security and privacy problems and the tools to confront them.
The course studies the fundamental concepts of complex analysis and Laplace analysis with a view to using them to solve multidisciplinary problems in scientific engineering.
A computer network is a set of computers sharing resources located on or provided by network nodes. Computers use common communication protocols over digital interconnections to communicate with each other. These interconnections are made up of telecommunication network technologies based on physically wired, optical, and wireless radio-frequency methods that may be arranged in a variety of network topologies. The nodes of a computer network can include personal computers, servers, networking hardware, or other specialized or general-purpose hosts.
In computing, traceroute and tracert are computer network diagnostic commands for displaying possible routes (paths) and measuring transit delays of packets across an Internet Protocol (IP) network. The history of the route is recorded as the round-trip times of the packets received from each successive host (remote node) in the route (path); the sum of the mean times in each hop is a measure of the total time spent to establish the connection.
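The sketch below illustrates the usual approach: send probes with a TTL that increases by one per hop and record which router answers with an ICMP "time exceeded" message (the destination itself typically answers with "port unreachable" instead). It is a simplified example under assumptions of its own: UDP probes to an arbitrary high port, one probe per hop, and root privileges for the raw ICMP socket; it is not a reimplementation of any particular traceroute.

```python
import socket
import time

def traceroute(dest: str, max_hops: int = 30, port: int = 33434, timeout: float = 2.0):
    """Probe each hop with a UDP datagram whose TTL grows by one and print
    the address of whichever node replies (router or final destination)."""
    dest_addr = socket.gethostbyname(dest)
    for ttl in range(1, max_hops + 1):
        recv_sock = socket.socket(socket.AF_INET, socket.SOCK_RAW,
                                  socket.getprotobyname("icmp"))
        send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send_sock.setsockopt(socket.IPPROTO_IP, socket.IP_TTL, ttl)
        recv_sock.settimeout(timeout)

        start = time.monotonic()
        send_sock.sendto(b"", (dest_addr, port))
        hop = None
        try:
            _, addr = recv_sock.recvfrom(512)
            rtt_ms = (time.monotonic() - start) * 1000
            hop = addr[0]
            print(f"{ttl:2d}  {hop:15s}  {rtt_ms:.1f} ms")
        except socket.timeout:
            print(f"{ttl:2d}  *")
        finally:
            send_sock.close()
            recv_sock.close()
        if hop == dest_addr:   # reached the destination: stop probing
            break
```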
Time to live (TTL) or hop limit is a mechanism which limits the lifespan or lifetime of data in a computer or network. TTL may be implemented as a counter or timestamp attached to or embedded in the data. Once the prescribed event count or timespan has elapsed, data is discarded or revalidated. In computer networking, TTL prevents a data packet from circulating indefinitely. In computing applications, TTL is commonly used to improve the performance and manage the caching of data.
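As an example of the caching use, here is a hypothetical, minimal cache whose entries are discarded once their TTL elapses; the class name and the DNS-style entry are invented purely for illustration.

```python
import time

class TTLCache:
    """Toy cache: each entry carries an expiry time and is dropped once it passes."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires = entry
        if time.monotonic() >= expires:   # TTL elapsed: discard and treat as a miss
            del self._store[key]
            return default
        return value

cache = TTLCache(ttl_seconds=30)            # e.g. a 30-second DNS-style cache
cache.set("dns:example.com", "192.0.2.1")   # 192.0.2.1 is a documentation address
print(cache.get("dns:example.com"))         # returns the value while the entry is fresh
```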
In-network devices around the world monitor and tamper with connections for many reasons, including intrusion prevention, combating spam or phishing, and country-level censorship. Connection tampering seeks to block access to specific domain names or keywords ...
In 1948, Claude Shannon laid the foundations of information theory, which grew out of a study of the ultimate limits of source compression and of reliable communication. Since then, information theory has proved itself not only as a quest to find the ...
Proton MR spectra of the brain, especially those measured at short and intermediate echo times, contain signals from mobile macromolecules (MM). A description of the main MM is provided in this consensus paper. These broad peaks of MM underlie the narrower ...