Consumerization of information technology

From Wikipedia, the free encyclopedia

Consumerization is the reorientation of product and service designs to focus on (and market to) the end user as an individual consumer, in contrast with an earlier era of only organization-oriented offerings (designed solely for business-to-business or business-to-government sales). Technologies whose first commercialization was at the inter-organization level thus have potential for later consumerization. The emergence of the individual consumer as the primary driver of product and service design is most commonly associated with the IT industry, as large business and government organizations dominated the early decades of computer usage and development. Thus the microcomputer revolution, in which electronic computing moved from exclusively enterprise and government use to include personal computing, is a cardinal example of consumerization. But many technology-based products, such as calculators and mobile phones, have also had their origins in business markets, and only over time did they become dominated by high-volume consumer usage, as these products commoditized and prices fell. An example of enterprise software that became consumer software is optical character recognition software, which originated with banks and postal systems (to automate cheque clearing and mail sorting) but eventually became personal productivity software.

In a different sense, the consumerization of IT is the proliferation of personally owned IT, originating in the consumer market, into the workplace for professional use (in addition to, or even instead of, company-owned IT).[1] This bring your own device trend has significantly changed corporate IT policies, as employees now often use their own laptops, netbooks, tablets, and smartphones on the hardware side, and social media, web conferencing, cloud storage, and software as a service on the software side.

Origins

Consumerization has existed for many decades; the consumerization of refrigeration, for example, took place from the 1910s through the 1950s. The consumerization of IT is believed to have been first regularly called by that term by Douglas Neal and John Taylor of the Leading Edge Forum in 2001; the first known published paper on the topic appeared from the LEF in June 2004.[2] The term is now used widely throughout the IT industry and is the topic of numerous conferences and articles. One of the first articles was a special insert in The Economist on October 8, 2011.[3] Since then, "consumerization of IT" has been used ambiguously; in an effort to structure the amorphous nature of the term, researchers have suggested taking three distinct perspectives: individual, organizational, and market.[4]

The technology behind the consumerization of computing can be said to have begun with the development of eight-bit, general-purpose microprocessors in the early 1970s, followed by the personal computer in the late 1970s and early 1980s. It is significant, however, that the great success of the IBM PC in the first half of the 1980s was driven primarily by business markets, and business preeminence continued during the late 1980s and early 1990s with the rise of the Microsoft Windows PC platform. Other technology-based products, such as calculators, fax machines, and mobile phones, followed a similar path: they originated in business markets and only over time became dominated by high-volume consumer usage as they commoditized and prices fell.

It was the growth of the World Wide Web in the mid-1990s that began to reverse this pattern. In particular, the rise of free, advertising-based services such as email and search from companies such as Hotmail and Yahoo began to establish the idea that consumer IT offerings based on a simple Internet browser were often viable alternatives to traditional business computing approaches. It has also been argued that the consumerization of IT embodies more than the diffusion of consumer IT: it represents a chance for considerable productivity gains, and "reflects how enterprises will be affected by, and can take advantage of, new technologies and models that originate and develop in the consumer space, rather than in the enterprise IT sector".[5]

Business implications

The primary impact of consumerization is that it is forcing businesses, especially large enterprises, to rethink the way they procure and manage IT equipment and services. Historically, central IT organizations controlled the great majority of IT usage within their firms, choosing or at least approving the systems and services that employees used. Consumerization enables alternative approaches: employees and departments are becoming increasingly self-sufficient in meeting their IT needs. Products have become easier to use, and cloud-based, software-as-a-service offerings now address an ever-widening range of business needs, including video conferencing, digital imaging, business collaboration, sales force support, and systems backup.

Similarly, there is increasing interest in so-called bring your own device strategies, where individual employees can choose, and often own, the computers and smartphones they use at work. The Apple iPhone and iPad have been particularly important in this regard: both products were designed for individual consumers, yet their appeal in the workplace has been great. They have demonstrated that choice, style, and entertainment are now critical computer industry dimensions that businesses cannot ignore.

Equally important, large enterprises have become increasingly dependent upon consumerized services such as search, mapping, and social media. The capabilities of firms such as Google, Facebook, and Twitter are now essential components of many firms' marketing strategies. One of the most important consumerization questions going forward is to what extent such advertising-based services will spread into major corporate applications such as email, customer relationship management (CRM), and intranets.

One of the more serious negative implications of consumerization is that security controls have been slower to be adopted in the consumer space. As a result, there is an increased risk to the information assets accessed through these less trustworthy consumerized devices. In a CSO Online article, Joan Goodchild reported a survey which found that, when asked about the greatest barriers to enabling employees to use personal devices at work, 83 percent of IT respondents cited "security concerns".[6] Chip manufacturers aim to address this shortcoming with technologies such as Intel's Trusted Execution Technology[7] and ARM's TrustZone,[8] which are designed to increase the trustworthiness of both enterprise and consumer devices.

Technology implications

In addition to the mass-market changes above, consumer markets are now changing large-scale computing as well. The giant data centers built by firms such as Google, Apple, and Amazon are far larger and generally much more efficient than the data centers used by most large enterprises. For example, Google is said to support over 300 million Gmail accounts while executing more than 1 billion searches per day.

Supporting these consumer-driven volumes requires new levels of efficiency and scale, and this is transforming many traditional data center approaches and practices. Among the major changes are reliance on low-cost commodity servers, N+1 system redundancy, and largely unmanned data center operations. The associated software innovations are equally important, in areas such as algorithms, artificial intelligence, and big data. In this sense, consumerization seems likely to transform much of the overall computing stack, from individual devices to many of the most demanding large-scale challenges.

References

  1. ^ Sebastian Köffer, Kevin Ortbach, and Björn Niehaves. "Exploring the Relationship between IT Consumerization and Job Performance: A Theoretical Framework for Future Research". Communications of the Association for Information Systems. http://aisel.aisnet.org/cgi/viewcontent.cgi?article=3823&context=cais
  2. ^ David Moschella, Doug Neal, John Taylor, and Piet Opperman. "Consumerization of Information Technology". Leading Edge Forum, 2004. http://lef.csc.com/projects/70 Archived 2014-12-28 at the Wayback Machine. Retrieved 27 February 2012.
  3. ^ Special Report: Personal Technology, "Consumerisation: The Power of Many". The Economist, 2011. http://www.economist.com/node/21530921. Retrieved 27 February 2012.
  4. ^ Jeanne G. Harris, Blake Ives, and Iris Junglas. "IT Consumerization: When Gadgets Turn into Enterprise IT Tools". MIS Quarterly Executive.
  5. ^ "Consumerization". Gartner IT Glossary. http://www.gartner.com/it-glossary/consumerization. Retrieved 21 January 2016.
  6. ^ Joan Goodchild. "Consumer device use is growing, but IT and security can't keep up". CSO Online, 2011. http://www.csoonline.com/article/686087/consumer-device-use-is-growing-but-it-and-security-can-t-keep-up?page=1. Retrieved 27 February 2012.
  7. ^ James Green and Sham Datta. "Intel® Trusted Execution Technology" (white paper). Intel. http://www.intel.com/technology/security/downloads/arch-overview.pdf. Retrieved 27 February 2012.
  8. ^ "Building a Secure System using TrustZone® Technology" (white paper). ARM. http://infocenter.arm.com/help/topic/com.arm.doc.prd29-genc-009492c/PRD29-GENC-009492C_trustzone_security_whitepaper.pdf. Retrieved 27 February 2012.