In computing, the robustness principle is a design guideline for software, formulated by Jon Postel in the 1980 TCP specification (RFC 761):
- Be conservative in what you do, be liberal in what you accept from others (often reworded as "be conservative in what you send, be liberal in what you accept").
- In the TCP specification itself: "TCP implementations should follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others."
In other words, programs that send messages to other machines (or to other programs on the same machine) should conform completely to the specifications, but programs that receive messages should accept non-conformant input as long as the meaning is clear.
Among programmers, the principle is popularized in a form applied to producing compatible functions: be contravariant in the input type and covariant in the output type.
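The variance rule above can be sketched in TypeScript, where function-type compatibility follows exactly this pattern under `--strictFunctionTypes`. The types `Animal`, `Cat`, and `CatHandler` below are hypothetical, introduced only for illustration:

```typescript
// Hypothetical types for illustration (not from the source).
interface Animal { name: string }
interface Cat extends Animal { meows: boolean }

// The expected function type: takes a Cat, returns an Animal.
type CatHandler = (input: Cat) => Animal;

// liberalStrict is "liberal in what it accepts" (any Animal: a wider,
// contravariant input type) and "conservative in what it produces"
// (always a Cat: a narrower, covariant output type). That makes it a
// safe substitute anywhere a CatHandler is expected.
const liberalStrict = (input: Animal): Cat => ({ name: input.name, meows: true });

// Type-checks under --strictFunctionTypes: contravariant input,
// covariant output.
const handler: CatHandler = liberalStrict;

console.log(handler({ name: "Felix", meows: true }).name);
```

The assignment in the opposite direction (a function demanding a `Cat` used where any `Animal` might be passed) would be rejected, which mirrors why a conservative sender paired with a liberal receiver composes safely.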
RFC 1122 (1989) expanded on Postel's principle by recommending that programmers "assume that the network is filled with malevolent entities that will send in packets designed to have the worst possible effect". Protocols should allow for the addition of new codes for existing fields in future versions of protocols by accepting messages with unknown codes (possibly logging them). Programmers should avoid sending messages with "legal but obscure protocol features" that might expose deficiencies in receivers, and design their code "not just to survive other misbehaving hosts, but also to cooperate to limit the amount of disruption such hosts can cause to the shared communication facility".
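The RFC 1122 advice on unknown codes can be sketched as a tolerant receiver: messages carrying codes added by future protocol versions are accepted and logged rather than rejected, so older receivers keep interoperating with newer senders. The message fields, code values, and function names here are hypothetical, not taken from any RFC:

```typescript
// Hypothetical wire format: a numeric code plus a payload.
interface Message { code: number; payload: string }

// Codes this version of the protocol understands (hypothetical values).
const KNOWN_CODES = new Set([1, 2, 3]);

// A receiver in the spirit of RFC 1122: an unknown code is not a fatal
// error. The message is accepted, and the unfamiliar code is logged so
// operators can notice that a newer protocol version is in use.
function receive(msg: Message, log: string[] = []): { accepted: boolean; log: string[] } {
  if (!KNOWN_CODES.has(msg.code)) {
    log.push(`unknown code ${msg.code}: accepted for forward compatibility`);
  }
  return { accepted: true, log };
}
```

A stricter receiver that rejected unknown codes outright would force every protocol extension to be a flag-day upgrade of all deployed implementations.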
In 2001, Marshall Rose characterized several deployment problems when applying Postel's principle in the design of a new application protocol. For example, a defective implementation that sends non-conforming messages might be used only with implementations that tolerate those deviations from the specification until, possibly several years later, it is connected with a less tolerant application that rejects its messages. In such a situation, identifying the problem is often difficult, and deploying a solution can be costly. Rose therefore recommended "explicit consistency checks in a protocol ... even if they impose implementation overhead".
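An "explicit consistency check" of the kind Rose recommends might look like the following sketch: outgoing messages are validated against the specification before being sent, so a defective implementation fails immediately during development rather than years later when it first meets a strict peer. The field names and limits are hypothetical, chosen only to illustrate the idea:

```typescript
// Hypothetical wire format: a numeric code plus a payload.
interface Message { code: number; payload: string }

// An explicit consistency check in the spirit of RFC 3117: reject
// non-conforming messages at send time, accepting the implementation
// overhead in exchange for surfacing defects early. The code range and
// payload limit are invented for this example.
function validateForSend(msg: Message): void {
  if (!Number.isInteger(msg.code) || msg.code < 0 || msg.code > 255) {
    throw new Error(`non-conforming code: ${msg.code}`);
  }
  if (msg.payload.length > 512) {
    throw new Error("payload exceeds specification limit");
  }
}
```

Calling `validateForSend` on every outgoing message costs a few comparisons per send, but it prevents the silent deployment of a sender whose deviations only tolerant receivers happen to mask.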
In a 2017 Internet-Draft, Martin Thomson argues that Postel's robustness principle actually leads to a lack of robustness, including security problems.
- Postel, Jon, ed. (January 1980). Transmission Control Protocol. IETF. doi:10.17487/RFC0761. RFC 761. https://tools.ietf.org/html/rfc761. Retrieved June 9, 2014.
- Braden, R., ed. (October 1989). Requirements for Internet Hosts: Communication Layers. IETF. doi:10.17487/RFC1122. RFC 1122. https://tools.ietf.org/html/rfc1122. Retrieved June 9, 2014.
- Wilde, Erik (2012). Wilde's WWW: Technical Foundations of the World Wide Web. Springer-Verlag. p. 26. doi:10.1007/978-3-642-95855-7. ISBN 978-3-642-95855-7.
- Rose, M. (November 2001). On the Design of Application Protocols. IETF. doi:10.17487/RFC3117. RFC 3117. https://tools.ietf.org/html/rfc3117. Retrieved June 9, 2014.
- Thomson, Martin (October 2017). The Harmful Consequences of Postel's Maxim. IETF. https://tools.ietf.org/html/draft-thomson-postel-was-wrong-02. Retrieved January 15, 2018.