OnLive - Will it work?
Moderator: Forum Moderators
-
HereComesPete
- Throbbing Cupcake

- Posts: 10249
- Joined: February 17th, 2007, 23:05
- Location: The maleboge
-
Dr. kitteny berk
- Morbo

- Posts: 19676
- Joined: December 10th, 2004, 21:53
- Contact:
-
FatherJack
- Site Owner

- Posts: 9597
- Joined: May 16th, 2005, 15:31
- Location: Coventry, UK
- Contact:
The broadband's probably the limit at the moment. While I can sustain an actual speed of around 16Mbps on a connection advertised as 'up to 20Mbps', that's rarely possible in normal usage, and I hit the limiter fairly quickly, which bumps me back down to around 1.5Mbps.
Broadband in this country could be better and cheaper, but the infrastructure isn't there right now. Most people use ADSL over copper, on connections that are sometimes little more than unshielded twisted-pair circuits, and it's only very recently that BT have considered plans to link homes with fibre.
Even that's rather old tech: NTL (now Virgin) already has a fibre infrastructure across most of the country, but it probably offers less bang per buck than gigabit Ethernet over copper in terms of line and switching gear costs. 10G Ethernet kit is, IMO, massively overpriced for what it is, though it exceeds even Fibre Channel speeds.
Part of the problem is the limits ISPs impose to stop people taking the piss - if they let everyone do what they want, their datacentres would fall over. When I order a film on my V+ box, it downloads a fuck of a lot quicker than 20Mbps - it demonstrates what the connection is capable of when they're making direct profit from what I'm doing with it.
Actually though - I think the whole model is wrong. Each subscriber connects directly to their ISP and is regulated and charged according to what they do. There's no interconnection to other users in their locality, so if I want to send a massive file to my next-door neighbour - even if we use the same ISP - we both have to go via their datacentre and are each 'charged' for the amount of data transferred. The core datacentre and all other users on it effectively take that hit, too.
File sharing has gotten itself a bad name, and ISPs seem to think it's all about stealing copyrighted works, but that's not all the technology can be used for. If there were local hubs where things like web-based traffic could be cached, so that other users in the same area could quickly download local copies of the same files others had already requested, the burden on the uplinks would be massively reduced. Most people, after all, do nothing but surf the web with their internets.
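A minimal sketch of the idea in Python, purely illustrative (the hub class and the upstream fetch are my own made-up names, not any real ISP kit): the first person in the area to ask for something pays the uplink cost, and everyone after them gets the local copy.

# Minimal sketch of a neighbourhood cache hub (illustrative only).
# The first request for a URL goes up to the wider internet; every later
# request from the same area is served from the local copy.
import urllib.request

class LocalHub:
    def __init__(self):
        self.cache = {}          # url -> bytes a neighbour already fetched
        self.uplink_bytes = 0    # traffic that actually crossed the uplink

    def fetch(self, url):
        if url in self.cache:
            return self.cache[url]                  # served locally
        data = urllib.request.urlopen(url).read()   # cache miss: go upstream
        self.cache[url] = data
        self.uplink_bytes += len(data)
        return data

hub = LocalHub()
hub.fetch("http://example.com/")   # first neighbour pays the uplink cost
hub.fetch("http://example.com/")   # second one gets the local copy for free
print(f"uplink traffic: {hub.uplink_bytes} bytes for 2 requests")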
Legitimate uses of P2P file sharing as we know it would also benefit - the latest WoW patch or Linux distro could be downloaded in seconds with no impact on other users further up the chain. It could even lead to a more community-focussed approach to the internet in general, with people hosting quickly-accessible local information and files rather than flyers, instead of the disconnected society we live in now. Relevant hubs of local information and news, expanding as you traverse the nodes outward, could be a reality; indeed, all the data people request could be served at least in part from other users, using P2P technology.
Back to OnLive. If there were Game Centres in each city, with sensible grouping of how they pump the data to customers, it could work. It wouldn't mean you could only play against people in your town, as the Game Centres would have low-latency links to each other, and it's only the position, orientation and posture of each player that needs to be transmitted between them. The big data dump is the resulting video sent to each player; with dedicated centres it wouldn't be so bad, but relying on the existing connections to deliver it spells laggy failure. And a single uber-brain computer centre pumping that video out to all connected clients across many games just doesn't seem possible.
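To put some very rough numbers on that - all the figures below are my guesses, not anything OnLive have published - the state that has to move between Game Centres is tiny next to the video that has to reach each player:

# Back-of-envelope comparison: inter-centre state traffic vs per-player video.
# Every figure here is an assumption for illustration.
state_bytes_per_update = 64          # position, orientation, posture, buttons
updates_per_second = 30
players = 16

state_bps = state_bytes_per_update * 8 * updates_per_second * players
video_bps_per_player = 5_000_000     # ~5 Mbps of compressed 720p (assumed)
video_bps_total = video_bps_per_player * players

print(f"state between centres: {state_bps / 1e6:.2f} Mbps for {players} players")
print(f"video to players:      {video_bps_total / 1e6:.0f} Mbps for {players} players")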
Cost-wise, they'd have to charge a lot. For a computer capable of pumping 720p video to 16 concurrent users, you'd need one roughly five times more powerful than a £1000 PC. That's £5000+ for every 16 users - a roughly £300-per-user startup cost. I know they want the MicroConsole to cost pence and just dumbly relay video, but those prices are well within proper console prices. The promise that it will scale to better technologies in the future will seem hollow to someone deciding between a hefty subscription now, or a potential new console purchase in the future.
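The sum behind that £300 figure, using the same guessed numbers (a £5000 server driving 16 simultaneous streams):

# Rough hardware cost per concurrent user (figures assumed, as above).
server_cost_gbp = 5 * 1000           # roughly 5x a £1000 PC
concurrent_users_per_server = 16

cost_per_user = server_cost_gbp / concurrent_users_per_server
print(f"hardware cost per concurrent user: £{cost_per_user:.0f}")   # ~£313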
Games. The console makers typically make a loss on the hardware and recoup it through the games. However, they only recoup it through games bought for their system. The list of supposed partners is telling - it's the 'sick of this developing-for-different-consoles shit' crowd. I don't think MS and Sony will abandon their hardware lightly; they will continue to make/license games exclusively for their machines, so OnLive will only ever be able to offer a subset of all the games out there.
-
ProfHawking
- Zombie

- Posts: 2101
- Joined: February 20th, 2005, 21:31
I'll believe it when I see it working.
I suppose it would be possible, but I'm imagining gaming over VLC or RDP - not good.
Teradici (or whatever they're called) seem to be able to handle "PC over IP", which I guess is what this is, but that's designed for LAN, not broadband.
Also, as above, I can't see the finances working when the VC money dries up.
-
Dr. kitteny berk
- Morbo

- Posts: 19676
- Joined: December 10th, 2004, 21:53
- Contact:
Just had a thought.
Input lag.
While the overall latency situation will probably be better than current gaming (because everyone's machines will be in datacentres), you're still gonna have, say, 80ms of lag between moving your controller and seeing stuff move (working on the basis of 40ms each way, which is about my average ping).
Remember early wireless mice? How frustrating they were in FPSs?
Yeah.
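A rough lag budget, using the 40ms one-way figure above and guessed values for everything else (the encode/decode and frame-wait numbers are assumptions, not anything OnLive have claimed):

# Rough press-to-pixels lag budget for streamed gaming (all values assumed).
one_way_network_ms = 40     # controller input travelling up to the datacentre
frame_time_ms = 17          # waiting for the next 60fps frame to be rendered
encode_ms = 15              # compressing the rendered frame for streaming
decode_ms = 10              # MicroConsole/TV decoding the incoming stream

total_ms = (one_way_network_ms + frame_time_ms + encode_ms
            + one_way_network_ms + decode_ms)
print(f"press-to-pixels: ~{total_ms} ms")   # ~122 ms, before the TV's own lag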
Dr. kitteny berk wrote:
Just had a thought.
Input lag.
While the overall latency situation will probably be better than current gaming (because everyone's machines will be in datacentres), you're still gonna have, say, 80ms of lag between moving your controller and seeing stuff move (working on the basis of 40ms each way, which is about my average ping).
Remember early wireless mice? How frustrating they were in FPSs?
Yeah.
Think about that, but in MP games: everyone's input will have to be sent from their computer to the OnLive servers, back to their PC and to everyone else in the game, and the same for everyone else, and so on and so forth. Seems a bit ridonkulous, to be honest.
-
cheeseandham
- Shambler In Drag

- Posts: 780
- Joined: March 16th, 2007, 20:22
- Location: on the sofa
- Contact:
FatherJack wrote:
Cost-wise, they'd have to charge a lot. For a computer capable of pumping 720p video to 16 concurrent users...
...and then encoding it in real time to fit down a broadband pipe (unless you fancy streaming raw video?).
So in simple terms, you need £300 of hardware for each user, plugged into a kickass PC (or video hardware), in order to shift it YouTube-style to your screen, and the company has to produce or license games for people to play, and make it all cost-effective? (Not to mention new-game demand spikes and the inevitable new-game hangover. At the moment the developers get your £40 after you've played the game for 2 hours; what happens to OnLive when a game has massive demand which drops off sharply?)
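The raw-video option really isn't on the table - a quick sum (standard 720p frame size and rate, with the compressed target being my guess at a broadband-friendly stream) shows how much real-time encoding work is involved:

# Why you can't stream raw 720p over broadband (figures for illustration).
width, height = 1280, 720
bytes_per_pixel = 3          # 24-bit colour
fps = 60

raw_bps = width * height * bytes_per_pixel * 8 * fps
target_bps = 5_000_000       # ~5 Mbps, a plausible broadband-friendly stream

print(f"raw 720p60: {raw_bps / 1e6:.0f} Mbps")        # ~1327 Mbps
print(f"compressed: {target_bps / 1e6:.0f} Mbps")
print(f"needs roughly {raw_bps / target_bps:.0f}:1 compression, in real time")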
While they're at it, may I have my flying car?
-
HereComesPete
- Throbbing Cupcake

- Posts: 10249
- Joined: February 17th, 2007, 23:05
- Location: The maleboge
-
cheeseandham
- Shambler In Drag

- Posts: 780
- Joined: March 16th, 2007, 20:22
- Location: on the sofa
- Contact: