Web performance

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Apparition11 (talk | contribs) at 16:14, 14 June 2020 (Reverted edits by 43.251.93.66 (talk) to last version by Ahunt). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Web performance refers to the speed with which web pages are downloaded and displayed on the user's web browser. Web performance optimization (WPO), or website optimization, is the field of knowledge about increasing web performance.

Faster website download speeds have been shown to increase visitor retention and loyalty[1][2] and user satisfaction, especially for users with slow internet connections and those on mobile devices.[3] Web performance also leads to less data travelling across the web, which in turn lowers a website's power consumption and environmental impact.[4] Factors that can affect the speed of page load include browser and server caching, image optimization, and encryption (for example, SSL), all of which can affect the time it takes for pages to render.[5] The performance of a web page can be improved through techniques such as multi-layered caching, lightweight design of presentation-layer components, and asynchronous communication with server-side components.

History

In the first decade or so of the web's existence, web performance improvement was focused mainly on optimizing website code and pushing hardware limitations. According to the 2002 book Web Performance Tuning by Patrick Killelea, some of the early techniques included using simple servlets or CGI, increasing server memory, and looking for packet loss and retransmission.[6] Although these principles now comprise much of the optimized foundation of internet applications, they differ from current optimization theory in that there was much less of an attempt to improve the browser display speed.

Steve Souders coined the term "web performance optimization" in 2004.[7] At that time Souders made several predictions regarding the impact that WPO as an "emerging industry" would bring to the web, such as websites being fast by default, consolidation, web standards for performance, environmental impacts of optimization, and speed as a differentiator.[8]

One major point that Souders made in 2007 is that at least 80% of the time that it takes to download and view a website is controlled by the front-end structure. This lag time can be decreased through awareness of typical browser behavior, as well as of how HTTP works.[9]

Optimization techniques

Web performance optimization improves user experience (UX) when visiting a website and therefore is highly desired by web designers and web developers. They employ several techniques that streamline web optimization tasks to decrease web page load times. This process is known as front end optimization (FEO) or content optimization. FEO concentrates on reducing file sizes and "minimizing the number of requests needed for a given page to load."

In addition to the techniques listed below, the use of a content delivery network—a group of proxy servers spread across various locations around the globe—is an efficient delivery system that chooses a server for a specific user based on network proximity. Typically the server with the quickest response time is selected.
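The selection logic can be sketched in Python. This is an illustrative model only, not any particular CDN's implementation; the edge hostnames and latency figures are hypothetical, standing in for round-trip times a real system would measure.

```python
def pick_fastest(latencies):
    """Given measured round-trip times (in seconds) for each candidate
    edge server, select the server with the quickest response time."""
    return min(latencies, key=latencies.get)

# Hypothetical probe results for one user's network location.
probes = {
    "edge-eu.example.net": 0.042,
    "edge-us.example.net": 0.118,
    "edge-ap.example.net": 0.201,
}
best = pick_fastest(probes)  # the EU edge wins for this user
```

Real CDNs combine such latency measurements with DNS-based geolocation and server load, but the principle of routing each user to the nearest responsive edge is the same.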

The following techniques are commonly used web optimization tasks and are widely used by web developers:

Web browsers open separate Transmission Control Protocol (TCP) connections for each Hypertext Transfer Protocol (HTTP) request submitted when downloading a web page, one for each page element required. However, a browser is limited to opening only a certain number of simultaneous connections to a single host. To prevent bottlenecks, the number of individual page elements is reduced through resource consolidation, whereby smaller files (such as images) are bundled together into one file. This reduces the number of HTTP requests and the number of "round trips" required to load a web page.
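A minimal sketch of resource consolidation, assuming text assets such as CSS files (the file names and contents below are made up for illustration): several small files are merged into one payload, so the page needs a single HTTP request instead of one per file.

```python
def bundle(assets):
    """Merge small text assets (a mapping of file name -> contents) into
    one payload, annotating each part with its origin.  One bundled file
    means one HTTP request and fewer network round trips."""
    return "\n".join(f"/* {name} */\n{body}" for name, body in assets.items())

css = bundle({
    "reset.css": "body{margin:0}",
    "grid.css": ".row{display:flex}",
})
```

The same idea underlies image sprites, where many small icons are combined into one image and displayed via CSS offsets.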

Web pages are constructed from code files such as JavaScript and Hypertext Markup Language (HTML). As web pages grow in complexity, so do their code files and subsequently their load times. File compression can reduce code files by as much as 80%, thereby improving site responsiveness.
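The effect is easy to demonstrate with Python's standard gzip module, the same DEFLATE-based scheme that HTTP content encoding commonly uses. Repetitive markup, typical of generated HTML, compresses to a small fraction of its original size:

```python
import gzip

# A repetitive HTML fragment, as a stand-in for generated page markup.
html = b"<ul>" + b"".join(b"<li>item</li>" for _ in range(500)) + b"</ul>"

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
# Highly repetitive markup compresses to well under 20% of its size,
# consistent with the reductions of up to 80% cited above.
```

In practice the server compresses the response on the fly (or serves a precompressed copy) and the browser decompresses it transparently, negotiated via the Accept-Encoding and Content-Encoding headers.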

Web caching optimization reduces server load, bandwidth usage and latency. CDNs use dedicated web caching software to store copies of documents passing through their system. Subsequent requests may be fulfilled from the cache should certain conditions apply. Web caches can be located on either the client side (forward position) or the web-server side (reverse position) of a CDN. A web browser may also store web content for reuse.
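One of the conditions under which a cached copy may be reused is validation: the client presents a validator (an ETag) for its stored copy, and the server answers 304 Not Modified, with no body, when the copy is still current. The sketch below models that exchange; the hashing scheme and the `serve` helper are illustrative, not any particular server's implementation.

```python
import hashlib

def etag_for(body):
    """Derive a validator from the response body (one common scheme)."""
    return '"' + hashlib.sha1(body).hexdigest()[:16] + '"'

def serve(body, if_none_match):
    """Return (status, payload): 304 with an empty body when the client's
    cached copy is still fresh, 200 with the full body otherwise."""
    if if_none_match == etag_for(body):
        return 304, b""   # cache hit: no bytes of content retransmitted
    return 200, body      # cache miss or first visit: full transfer

page = b"<h1>hello</h1>"
status, _ = serve(page, etag_for(page))  # revalidation succeeds: 304
```

Freshness lifetimes (Cache-Control max-age) avoid even this revalidation round trip while the copy is within its lifetime.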

Code minification exploits the difference between how web developers write code and how network elements interpret it. Minification removes comments and extra spaces and shortens variable names in order to minimize code, decreasing file sizes by as much as 60%. In addition to caching and compression, lossy compression techniques (similar to those used with audio files) remove non-essential header information and lower the quality of many high-resolution images. These changes, such as reduced pixel complexity or color gradations, are transparent to the end user and do not noticeably affect perception of the image. Another technique is the replacement of raster graphics with resolution-independent vector graphics, a substitution best suited for simple geometric images.
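A crude minifier can be sketched in a few lines of Python. This toy version only strips comments and collapses whitespace, and would break string literals in real JavaScript; production minifiers additionally rename variables and rewrite syntax, which is how the larger reductions are achieved.

```python
import re

def minify(js):
    """Strip comments and collapse whitespace (a deliberately crude
    sketch; it does not respect string literals)."""
    js = re.sub(r"/\*.*?\*/", "", js, flags=re.S)  # block comments
    js = re.sub(r"//[^\n]*", "", js)               # line comments
    js = re.sub(r"\s+", " ", js)                   # collapse whitespace
    return js.strip()

src = """
/* banner */
var total = 0;  // running sum
var count = 10;
"""
out = minify(src)  # "var total = 0; var count = 10;"
```

The minified output is also more compressible, so minification and gzip compression are typically applied together.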

HTTP/1.x and HTTP/2

Since web browsers use multiple TCP connections for parallel user requests, congestion and browser monopolization of network resources may occur. Because HTTP/1 requests come with associated overhead, web performance is impacted by limited bandwidth and increased usage.

Compared to HTTP/1, HTTP/2

  • is binary instead of textual
  • is fully multiplexed instead of ordered and blocked
  • can therefore use one connection for parallelism
  • uses header compression to reduce overhead
  • allows servers to "push" responses proactively into client caches[10]
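The header-compression point can be illustrated with a simplified model of HTTP/2's HPACK scheme (this sketch mimics the indexing idea only, not the wire format): header fields already sent on a connection are referenced by a small index into a table shared by both endpoints, instead of being retransmitted in full on every request.

```python
def compress_headers(headers, dynamic_table):
    """HPACK-style sketch: a header pair already in the shared table is
    replaced by a tiny index reference; new pairs are sent literally and
    added to the table for future requests."""
    frames = []
    for pair in headers:
        if pair in dynamic_table:
            frames.append(("indexed", dynamic_table.index(pair)))
        else:
            dynamic_table.append(pair)
            frames.append(("literal", pair))
    return frames

table = []
req = [(":method", "GET"), ("user-agent", "demo/1.0")]
first = compress_headers(req, table)   # all fields sent literally
second = compress_headers(req, table)  # only index references sent
```

Since browsers repeat most headers (cookies, user agent) on every request, later requests on the same connection carry only a few bytes of indices.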

CDNs are used in tandem with HTTP/2 to better serve the end-user with web resources such as images, JavaScript files and Cascading Style Sheets (CSS) files, since a CDN's servers are usually located closer to the end-user than the website's own hosting server.[11]

References

  1. ^ "Google Adds Site Speed To Search Ranking". Retrieved 4 December 2012.
  2. ^ Sharon, Bell. "WPO | Preparing for Cyber Monday Traffic". CDNetworks. Retrieved 4 December 2012.
  3. ^ Souders, Steve. "Web First for Mobile". Retrieved 4 December 2012.
  4. ^ Bellonch, Albert. "Web performance optimization for everyone". Retrieved 4 December 2012.
  5. ^ "Why Website Speed Matters [Infographic] - LoveUMarketing". LoveUMarketing. 2018-10-06. Retrieved 2018-10-21.
  6. ^ Killelea, Patrick (2002). Web Performance Tuning. Sebastopol: O'Reilly Media. p. 480. ISBN 059600172X.
  7. ^ Frick, Tim (2016). Designing for Sustainability: A Guide to Building Greener Digital Products and Services. Boston: O'Reilly Media. p. 195. ISBN 1491935774.
  8. ^ Frick, Tim (2016). Designing for Sustainability: A Guide to Building Greener Digital Products and Services. Boston: O'Reilly Media. p. 56. ISBN 1491935774.
  9. ^ Souders, Steve (2007). High Performance Websites. Farnham: O'Reilly Media. p. 170. ISBN 0596529309.
  10. ^ "HTTP/2 Frequently Asked Questions". HTTP Working Group. Retrieved 14 April 2017.
  11. ^ "HTTP/2 – A Real-World Performance Test and Analysis". CSS Tricks. Retrieved 14 April 2017.