OnLive: Difference between revisions

From Wikipedia, the free encyclopedia

==Corporate information==
OnLive was incubated within Rearden LLC, a company founded by Steve Perlman. Since it was spun out as an independent company, it has also taken over control of MOVA, another Rearden start-up founded by Steve Perlman, as a wholly-owned subsidiary. MOVA is a facial creation and motion capture company whose technology has been used in films such as ''[[The Curious Case of Benjamin Button (film)|The Curious Case of Benjamin Button]]''.


OnLive's original investors include [[Warner Bros.]], [[Autodesk]] and Maverick Capital.<ref name="Rearden: About">{{cite web|url=http://www.rearden.com/about/index.html|title=Rearden: About|publisher=www.rearden.com|accessdate=2009-08-12}}</ref> A later round of financing included AT&T Media Holdings, Inc. and Lauder Partners as well as the original investors.<ref name="OnLive Raises Series C Round from AT&T, Warner Bros. and Others">{{cite web|url=http://www.techcrunch.com/2009/09/29/onlive-raises-series-c-round-from-att-warner-bros-and-others/|title=OnLive Raises Series C Round from AT&T, Warner Bros. and Others|publisher=www.techcrunch.com|accessdate=2009-09-30}}</ref>


Revision as of 18:32, 9 March 2010

OnLive
File:OnLive Logo.jpg
Manufacturer: OnLive
Type: Gaming on demand
Lifespan: US: Winter 2009-10[1]
Media: N/A (on-demand content)
CPU: OnLive (server based)
Controller input: 4 wireless, 2 USB
Online services: OnLive Games On Demand

OnLive is a gaming-on-demand platform, announced at the Game Developers Conference in 2009.[2] The service is a gaming equivalent of cloud computing: the game is synchronized, rendered, and stored on a remote server and delivered online. The service was announced to be compatible with any Windows PC running Windows XP or Windows Vista, with any Intel-based Mac running OS X, and with smartphones.[3][4][5][6] A low-end computer can be used to play any kind of game, as long as it can play video, since the game is computed on the OnLive servers. For that reason, the service is seen as a strong competitor to the console market.[7][8] Steve Perlman states that a 1.5 Mbps connection will be needed to display games in SDTV resolution (the typical output of Wii and previous-generation console titles), while 4-5 Mbps will be needed for HDTV resolution, such as that output by the Xbox 360 and PlayStation 3.[9] The average broadband connection speed in the US at the end of 2008 was 3.9 Mbps, while 25% of US broadband connections were rated faster than 5 Mbps.[10]
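The bitrate figures above can be read as simple admission thresholds. The sketch below is illustrative only (the function name and logic are ours, not OnLive's actual client behaviour); the threshold values are the ones quoted in this article.

```python
# Illustrative check of which stream tiers a given downstream speed could
# carry, using the bitrates quoted above.  Not OnLive's actual logic.
SD_MBPS = 1.5   # SDTV stream, per Perlman
HD_MBPS = 5.0   # upper bound quoted for HDTV

def supported_tiers(downstream_mbps: float) -> list[str]:
    """Return the stream tiers a connection of this speed could carry."""
    tiers = []
    if downstream_mbps >= SD_MBPS:
        tiers.append("SD")
    if downstream_mbps >= HD_MBPS:
        tiers.append("HD")
    return tiers

# The 2008 US average of 3.9 Mbps clears the SD bar but not the HD one:
print(supported_tiers(3.9))   # ['SD']
print(supported_tiers(5.0))   # ['SD', 'HD']
```

On these numbers, the average 2008 US connection could stream SD but not HD, which is why the 25% figure for connections above 5 Mbps matters.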

It was announced that Electronic Arts, Take-Two, Ubisoft, Epic Games, Atari, Codemasters, THQ, Warner Bros., 2D Boy and Eidos Interactive have signed up to have their PC games available on the service.[11] Sixteen game titles are currently available from the OnLive service.[12] The service was originally planned for a retail release in the winter of 2009,[13][1] but it has yet to be released. Steve Perlman plans to offer the service in the United Kingdom and the rest of Europe after getting it established in the United States over the coming year.[14]

Console

File:OnLiveMainMenu.jpg
OnLive main menu

OnLive will sell a console, called the "MicroConsole",[15] that can be connected to a television and directly to the OnLive service, so that it will be possible to use the service without owning a computer.[2]

Steve Perlman has also suggested that the underlying electronics and compression chip could be integrated into set-top boxes and other consumer electronics.[16]

The MicroConsole supports up to four wireless controllers and four Bluetooth headsets. It also has two USB ports for a keyboard and mouse.[17]

Architecture

The OnLive service will be hosted in five co-located North American data centers. Currently there are facilities in Santa Clara, California, and in Virginia, with additional facilities being fitted out in Texas and elsewhere.[18] OnLive has stated that users must be located within 1,000 miles (1,600 km) of one of these to receive high quality service.[19][20][21]
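The 1,000-mile limit has a physical basis: signal propagation alone consumes part of the latency budget. The calculation below is a back-of-the-envelope sketch; the two-thirds-of-c figure for light in optical fibre is a standard engineering approximation, not a number from this article.

```python
# Lower bound on round-trip delay imposed by distance alone: light in
# optical fibre travels at roughly two-thirds of c, so even ideal routing
# adds measurable latency before any encoding or last-mile delay.
C_KM_PER_S = 299_792      # speed of light in vacuum, km/s
FIBRE_FACTOR = 2 / 3      # typical refractive-index slowdown (assumption)

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time over fibre of this one-way length."""
    one_way_s = distance_km / (C_KM_PER_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000

# At the stated 1,000-mile (1,600 km) limit, physics alone costs roughly
# 16 ms of round-trip latency:
print(round(min_rtt_ms(1600), 1))
```

Real routes are longer than the straight-line distance and add switching delay, so actual network latency is always higher than this floor.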

The hardware used is a custom setup consisting of OnLive's proprietary video compression chip as well as standard PC CPU and GPU chips. For older or lower-performance games such as LEGO Batman, multiple instances can be played on each server using virtualisation technology. However, high-end games such as Crysis Warhead will require one GPU per game. Two video streams are created for each game. One (the live stream) is optimised for gameplay and real-world internet conditions, while the other (the media stream) is a full HD stream that is stored server-side and used for spectators or for gamers to record Brag Clips of their games.[16]

Executive team

  • Steve Perlman is OnLive's CEO. He is well known for QuickTime, WebTV, and other ventures.
  • Mike McGarvey is OnLive's COO and a former CEO of Eidos Interactive.
  • Tom Paquin is OnLive's executive vice president of engineering. He is best known as a key developer behind Netscape and as the founder of Mozilla.org.
  • John Spinale is OnLive's vice president of Games and Media. Spinale was previously SVP of Product Development at Eidos and Director and Executive Producer at Activision. He also founded and ran Bitmo.[22]
  • Paul V. Weinstein is OnLive's vice president of business development. He previously worked as EVP of Business Development for the open-source database company MySQL.
  • Charlie Jablonski is OnLive's vice president of operations. His career includes 16 years at NBC as Head of Engineering and Technology.

Why OnLive won't work

Not only will these datacenters be handling the gameplay, they will also be encoding the video output of the machines in real time and piping it down over IP to you at 1.5 Mbps (for SD) and 5 Mbps (for HD). OnLive says you will be getting 60fps gameplay. First of all, bear in mind that YouTube's encoding farms take a long, long time to produce their current, offline 2 Mbps 30fps HD video. OnLive is going to be doing it all in real time via a PC plug-in card, at 5 Mbps, and with surround sound too.
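The comparison above comes down to bits available per frame. A quick calculation (the bitrates and frame rates are the article's; the comparison itself is ours) shows OnLive's real-time budget is only modestly larger than YouTube's offline one:

```python
# Per-frame bit budget implied by a stream's total bitrate and frame rate.
def kbits_per_frame(mbps: float, fps: int) -> float:
    """Kilobits available to encode each frame at a given stream rate."""
    return mbps * 1000 / fps

onlive_hd = kbits_per_frame(5.0, 60)    # OnLive's claimed HD stream
youtube_hd = kbits_per_frame(2.0, 30)   # YouTube's offline-encoded HD

print(round(onlive_hd, 1))   # 83.3 kbit per frame, encoded in real time
print(round(youtube_hd, 1))  # 66.7 kbit per frame, encoded offline
```

The point of the paragraph is that OnLive must hit roughly this per-frame budget under a hard real-time deadline, where offline encoders get unlimited time.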

It sounds brilliant, but there's one rather annoying fact to consider: the nature of video compression is such that the longer the encoder has to work on the video, the better the job it will do. Conversely, the lower the latency required, the less efficient the compression can be.

More than that, OnLive overlord Steve Perlman has said that the latency introduced by the encoder is 1ms. Think about that; he's saying that the OnLive encoder runs at 1000fps. It's one of the most astonishing claims I've ever heard. It's like Ford saying that the new Fiesta's cruising speed is in excess of the speed of sound. To give some idea of the kind of leap OnLive reckons it is delivering, I consulted one of the world's leading specialists in high-end video encoding, and his response to OnLive's claims included such gems as "Bulls***" and "Hahahahaha!" along with a more measured, "I have the feeling that somebody is not telling the entire story here." This is a man whose know-how has helped YouTube make the jump to HD, and whose software is used in video compression applications around the world.
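The 1ms-to-1000fps step is simple arithmetic, worth making explicit: per-frame latency and frame throughput are reciprocals.

```python
# The arithmetic behind the claim: an encoder that spends only 1 ms per
# frame is implicitly capable of 1 / 0.001 = 1000 frames per second,
# far beyond the ~16.7 ms per frame a 60 fps pipeline actually allows.
def implied_fps(per_frame_latency_ms: float) -> float:
    """Frame rate implied by a per-frame encode latency."""
    return 1000 / per_frame_latency_ms

def frame_budget_ms(fps: int) -> float:
    """Time available per frame at a given frame rate."""
    return 1000 / fps

print(implied_fps(1))                  # 1000.0
print(round(frame_budget_ms(60), 1))   # 16.7
```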

The Insurmountable Challenge: Latency

OnLive says that it has conducted years of 'psychophysical' research to lessen the effects of internet latency. That's the key issue here, and I can't see how OnLive can fudge its way around this one. In reality, it's going to need sub-150 millisecond latency from its servers at least, and a hell of a QoS (quality of service) guarantee, for this to in any way approximate the experience you currently have at home. The latency target will probably need to be somewhat lower than that to allow for the video encoding server-side and decoding client-side, which by any measurable standard right now is going to be significant.

So, bearing in mind that OnLive is demonstrating at GDC, how is it achieving the results? It's difficult to say, but this is how I would do it. Firstly, I'd have a bank of whopper PCs behind the scenes running the games at 720p60. Each of them would be connected to a hardware H.264 encoder, which would in turn be connected via gigabit LAN to the clients. If the server-side PCs aren't on site, I'd have them at a very close-by datacenter. At the GDC demo, OnLive bosses Mike McGarvey and Steve Perlman said that the company's servers were hosted 50 miles away. If this was a true test conducted over the internet, I'm betting that there was a whopping internet connection being used, with oodles of bandwidth, even if only 5 Mbps of it was utilised.
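The sub-150ms argument is a budget argument: every pipeline stage spends part of a fixed allowance. The component values below are entirely hypothetical, chosen only to illustrate how quickly plausible stages consume the budget; none of them are measured OnLive figures.

```python
# Illustrative end-to-end latency budget.  Every value in `components`
# is an assumed, hypothetical figure -- not a measurement.
BUDGET_MS = 150  # the sub-150 ms target argued for above

components = {
    "input capture + uplink": 20,
    "network round trip":     40,
    "server-side encode":     15,
    "client-side decode":     15,
    "display refresh":        17,   # roughly one frame at 60 Hz
}

total = sum(components.values())
print(total, "ms of", BUDGET_MS, "ms budget used")
```

Even with optimistic per-stage numbers, most of the allowance is gone, which is why the article argues the encode/decode stages leave little room for ordinary internet variability.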

Perhaps this suggests an element of smoke and mirrors, but if I were OnLive and about to give a demonstration of this importance, I'd definitely be looking to control as many of the conditions as possible. The main principles are being showcased, but in a best-case scenario. The thing is, actual performance has to live up to this demo and that's where things get tricky.

Factor in thousands more users, orders of magnitude more traffic at the datacenters, and all the vagaries and unreliability of the average internet connection, and real-life performance must surely be in question. Much as we all want this to be brilliant, the fact of the matter is that even a Skype call over the internet is prone to failing badly at any given point, so the chances are that the far more ambitious OnLive is going to have its fair share of very tangible issues. Picture quality will be immensely variable and lag will remain an issue - but for the less discerning gamer, maybe - just maybe - it will work well enough.

How Could They Make It Work?

So, could this system actually live up to the claims being made for it? What sort of conditions are required to ensure optimal performance? Firstly, I don't think that the video encoding issues will be overcome, and I don't buy into this 'interactive video algorithm' geek-speak. In high-action scenes, you're going to be seeing a lot of macroblocking; it's basically inevitable. I can't imagine Burnout ever being streamed in HD to acceptable standards at 60fps without at least two to three times the bandwidth OnLive uses.

I can see 30fps video being the standard here rather than the mooted 60fps. It'll make the video quality look massively superior and reduce the load on the client decoding it, plus it will help manage latency if the number of frames being processed is halved. Then there's the fact that 90-95 per cent of console games run at 30fps anyway. It's effectively the standard, and it will lower the CPU/GPU requirements of the PCs server-side. But even then, don't think that this will result in lossless HDMI-quality video - far from it. Any game with fast-moving, colourful video is going to look very rough.

Let's give OnLive the benefit of the doubt for a moment and say that its encoder is better than the very best in compression available today. If its tech is the generational leap that Perlman and company say it is, maybe it could match that quality at 60fps. But still, blown up to full-screen, it's not going to be especially impressive.
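The 30fps argument has a quantitative core: at a fixed stream bitrate, halving the frame rate doubles the bits available to encode each frame. A minimal sketch (the 5 Mbps figure is the article's; the ratio calculation is ours):

```python
# At a fixed total bitrate, per-frame bit budget scales inversely with
# frame rate: halving fps doubles the bits each frame can spend.
def kbits_per_frame(mbps: float, fps: int) -> float:
    """Kilobits available to encode each frame at a given stream rate."""
    return mbps * 1000 / fps

at_60 = kbits_per_frame(5.0, 60)
at_30 = kbits_per_frame(5.0, 30)

print(round(at_30 / at_60, 1))   # 2.0 -- twice the budget per frame
```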

Latency. I can only see one way to make this work and guarantee the necessary quality of service, and that's to adopt an IPTV-style model. The OnLive datacenters will be licensed to ISPs, who will have them at their base of operations. Latency will be massively reduced, the connection will be far more stable, plus the datacenters with the PCs and hardware encoders can be distributed worldwide in a more effective manner. ISPs will be cut into the deal the way that retailers are now with conventional game-purchasing.

But even in this scenario, practically, I still can't see it happening. Microsoft's IPTV venture still hasn't materialised anywhere outside of the USA, so what chance does OnLive have of brokering a deal? And with ISPs complaining about the load brought about by innovations like the BBC iPlayer, why would they want to be involved with a hugely congestive venture like OnLive?

And what about computer costs? OnLive is promising state-of-the-art PCs running your game experience. The costs of creating the datacenters are going to be humongous, even factoring in the assistance of a volume manufacturer like Dell or HP. And what happens when GTA or Half-Life comes out and everyone wants to play it simultaneously? Will we have to take turns connecting to the available servers? Computer costs, bandwidth costs, development costs, publisher royalties... it's all starting to sound hugely, and prohibitively, expensive. Not surprisingly, OnLive is keeping mum about its cost structure to the end-user.

Skepticism

Soon after its announcement, many game journalists expressed skepticism about how the OnLive service would work. Their concerns mainly related to the quality of the service in real-world conditions, both in terms of the hardware required in OnLive server centers to render and compress the video and in terms of the impact of commercial broadband connections on its delivery. During GDC 2009, which was held in San Francisco, the OnLive demonstration was only 50 miles (80 km) from its Santa Clara data center, and the closed beta has only seen "hundreds of users on the system".[23] At the E3 conference, approximately 350 miles (560 km) from its data center, OnLive tested its service in a real-world environment with a regular cable modem. Reports say that OnLive performed well.[24][25]

Matt Peckham from PC World stated that it might be technically difficult to transfer the amount of data that a high-definition game would require; he mentioned that OnLive would need "deterministic broadband", that is, "a guaranteed, non-shared, uninterruptible speed", but that "broadband isn't there yet, nor are ISPs willing to offer performance guarantees". He also raised concerns about the "mod community" being unable to create and offer mods, since all the game data will be stored on the OnLive servers, and about the fact that any games bought on OnLive are not actually owned by the user: if OnLive were to go under, all the user's games would also disappear with it. Currently, no widespread trial has been made to test the service, so it is unclear whether it would work once live.[26][27]

Cevat Yerli, the CEO of Crytek, had researched a method for streaming games but concluded that Crytek's approach would not be viable until 2013 "at earliest". Yerli made it clear Crytek was not directly involved with the OnLive service, and Yerli had no personal experience using the service. Rather, Electronic Arts, the publisher of Crytek's Crysis Warhead, had partnered with OnLive and had tested and endorsed the OnLive technology. Yerli stated,

"I want to see it myself. I don't want to say it's either 'top or flop'. I hope it works for them because it could improve gamers' lives. The technology of video-based rendering is not actually a very new concept but they do some things that others didn't do before so it will be interesting to see."[28]

Reaction of console manufacturers

Steve Perlman has said the OnLive console, joystick and subscription would be cheaper than the cheapest of standard consoles.[29]

None of the console manufacturers made any official announcement about OnLive; however, Sony registered a trademark for cloud gaming called "PS Cloud" the day after OnLive was announced.[29][30] The trademark covers a broad range of possibilities, including online video games, Internet radio, electronic magazines and cloud computing, so no direct conclusions can be drawn from it.[31]

Competitors

The first company to enter this space was the California-based OTOY, which made its announcement on January 8, 2009 at the Consumer Electronics Show.[32]

Soon after OnLive was announced, another competitor, Gaikai, was announced.[33][34] Gaikai had not planned to announce its streaming browser-based Game-on-Demand service until June, but founder David Perry said it had to bring this forward when OnLive made its announcement.[35]

Playcast Media System announced a pilot launch in Israel to allow "PlayStation 3 and Xbox 360 quality games, on demand" over the HOT cable TV network, though it has not announced any business relationships with publishers or specific titles.[36]

References

  1. ^ a b "FAQ". OnLive. Retrieved 2010-02-10.
  2. ^ a b Roper, Chris. "IGN: GDC 09: OnLive Introduces The Future of Gaming". Pc.ign.com. Retrieved 2009-03-25.
  3. ^ http://g4tv.com/thefeed/blog/post/700743/OnLive-Service-Works-On-Cell-Phones-Too.html
  4. ^ http://www.gamespot.com/news/6240231.html
  5. ^ http://blog.onlive.com/2009/11/13/onlive-in-the-palm-of-your-hand/
  6. ^ Terdiman, Daniel (2009-03-19). "OnLive could threaten Xbox, PS3, and Wii | Gaming and Culture - CNET News". News.cnet.com. Retrieved 2009-03-25.
  7. ^ Ricker, Thomas (2009-03-20). "OnLive killed the game console star?". Engadget.com. Retrieved 2009-03-25.
  8. ^ "Le « cloud gaming », l'avenir du jeu vidéo ? - Actualités" (in French). ZDNet.fr. Retrieved 2009-03-25.
  9. ^ The Impact of Emerging Technologies, Preview Sessions - 1
  10. ^ "Akamai: Fourth Quarterly "State of the Internet" Report". www.akamai.com. 2009-03-30. Retrieved 2009-08-12.
  11. ^ "Current Partners". OnLive. Retrieved 2009-03-25.
  12. ^ "List of OnLive Games". OnLive. Retrieved 2009-03-29.
  13. ^ Roper, Chris. "IGN: GDC 09: OnLive Introduces The Future of Gaming". Uk.pc.ign.com. Retrieved 2009-03-25.
  14. ^ http://www.theinquirer.net/inquirer/news/1556341/onlive-spell-trouble-pc-makers
  15. ^ Kelly, Kevin. "GDC09: Rearden Studios introduces OnLive game service and 'microconsole'". Joystiq.com. Retrieved 2009-03-25.
  16. ^ a b Joystiq: GDC09 interview: OnLive founder Steve Perlman, continued
  17. ^ "MicroConsole". OnLive. Retrieved 2009-03-29.
  18. ^ Joystiq: GDC09 interview: OnLive founder Steve Perlman wants you to be skeptical
  19. ^ "Beta Testing at the Speed of Light". OnLive. 2010-01-21. Retrieved 2010-01-23.
  20. ^ "OnLive Fully Detailed in Columbia University Presentation". The Escapist. 2009-12-30. Retrieved 2010-01-23.
  21. ^ "The Process of Invention: OnLive Video Game Service". The FU Foundation School of Engineering & Applied Science (Columbia University). Retrieved 2010-01-23.
  22. ^ "John Spinale". OnLive. Retrieved 2009-04-06.
  23. ^ Joystiq: GDC09 interview: OnLive founder Steve Perlman [page 2]
  24. ^ "Impressions: online and live with OnLive [update]". www.joystiq.com. 2009-06-03. Retrieved 2009-08-12.
  25. ^ Deam, Jordan (2009-06-08). "OnLive at E3: It Works". www.escapistmagazine.com. Retrieved 2009-08-12.
  26. ^ "GDC 09: 6 Reasons OnLive Could Be a Bust". PC World. 2006-07-31. Retrieved 2009-03-25.
  27. ^ Svensson, Peter (2009-03-25). "The Associated Press: Streaming games could be bane or boon for ISPs". Google.com. Retrieved 2009-04-06.
  28. ^ "Crytek: Streaming games service viable in 2013". Gamesindustry.biz. 2009-04-02. Retrieved 2009-04-22.
  29. ^ a b Dumons, Olivier. "OnLive ou la fin annoncée des consoles de salon" (in French). LeMonde.fr. Retrieved 2009-04-07. De quoi inquiéter Sony, qui a senti là un danger non négligeable, et a immédiatement déposé (le lendemain de l'annonce) un brevet "PS Cloud" similaire à celui d'OnLive. (Enough to worry Sony, which sensed a significant danger there and immediately registered (the day after the announcement) a "PS Cloud" trademark similar to OnLive's.)
  30. ^ U.S. Trademark 77,697,735
  31. ^ O'Gara, Maureen (2009-04-08). "Sony Trademarks the Term 'PS Cloud'". SYS-CON MEDIA. Retrieved 2009-04-28.
  32. ^ Hendrickson, Mark (2009-01-08). "AMD and OTOY Working Together on Fastest Supercomputer Ever". Tech Crunch. Retrieved 2009-07-29.
  33. ^ "Gaikai". Gaikai. Retrieved 2009-03-26.
  34. ^ "GDC Exclusive: David Perry's Entry into Server-Based Gaming". GameDaily. 2009-02-26. Retrieved 2009-03-26.
  35. ^ "OnLive: Inside and Out". Gamespot. 2009-03-24. Retrieved 2009-03-27.
  36. ^ C, Alex (2009-07-21). "Play PS3 Quality Games without a PS3". TheSixthAxis. Retrieved 2009-07-29.
