Thank you, Lindens

Yesterday, Stefan wrote down his thoughts on Linden Lab’s troubles. I loved that post; he’s right on. Let me add to it by stating the importance of the work that many of the laid-off Lindens did, and the role of OpenSimulator from here on, at the technological level and beyond.

Things are moving into 3D, including the Web. I’ve been very excited about everything that is going on with WebGL. Sure, WebGL is still not good enough to render very rich scenes like those we find in highly immersive games, especially when the scenes aren’t optimized, as is the case with user-generated 3D content. But I have very little doubt that the needed optimizations will happen, and that soon we will have immersion in the web browser. It’s already happening. People want it, Google wants it, it will happen.

So, let’s fast-forward to the time when the Web browser can render rich 3D scenes, which, at the rate that the Google people and the Unity3D people are going at it, is probably only a couple of years away.

Technically, the client side is not the whole story. In order to develop highly immersive real-time environments with that kind of viewer, we still need a server side that can serve those 3D scenes in real time to several clients. HTTP alone won’t do; it’s too slow. All MMOs use some sort of UDP-based protocol for rapid state-change notifications. HTML5’s WebSockets are there precisely for that reason.
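To make the size argument concrete, here is a tiny sketch (in Python, with an entirely hypothetical message layout) of what a single state-change notification can look like when packed as a compact binary datagram, versus the hundreds of bytes of headers that every HTTP request drags along:

```python
import struct

# Hypothetical compact state-update message: a uint32 entity id, a uint16
# sequence number, and a float32 (x, y, z) position. This layout is
# illustrative only; it is not LLUDP or any real MMO wire format.
UPDATE_FMT = "<IH3f"  # little-endian: 4 + 2 + 12 = 18 bytes total

def pack_update(entity_id, seq, pos):
    return struct.pack(UPDATE_FMT, entity_id, seq, *pos)

def unpack_update(datagram):
    entity_id, seq, x, y, z = struct.unpack(UPDATE_FMT, datagram)
    return entity_id, seq, (x, y, z)

msg = pack_update(42, 7, (128.0, 64.0, 22.5))
print(len(msg))  # 18
```

Eighteen bytes per update is the kind of payload a UDP-based protocol (or a WebSocket binary frame) can push many times per second per client; the same notification over plain HTTP would be dominated by header overhead.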

Bottom-line: the server-side must serve more than HTTP.

There are a few of these game servers out there, the vast majority of them proprietary. Which is fine for the companies that own them. In the scenario where the browser is the 3D viewer, those companies can easily develop the necessary JavaScript components that talk to their back-end servers via WebSockets.

All these proprietary servers, however, are a bottleneck for the massification of 3D content on the Web. Just like for the Web itself, we need open source 3D scene servers. At the very least, we need open, standard protocols, so that people can focus on developing content without having to reinvent the network connectors, and the servers, from scratch all the time. (But with open standards come open implementations, so the result is the same: we need open source servers). Yes, we need an Apache for RT 3D scenes. Hmm, where did I hear that before…?

OpenSimulator, of course. When WebGL is good enough to render those rich 3D scenes, OpenSimulator will be there to serve those scenes in real-time to multiple clients. For free. This is the real importance of OpenSim, its main contribution. Another, secondary, contribution is all the research work that we have been doing in secure, portable identity with the Hypergrid. This is something that doesn’t exist on the Web, but it can be ported back. I’m not entirely sure the 2D Web needs it, but I’m pretty sure a web of virtual worlds needs it. Not all 3D environments will want to be connected in a web of peer servers; but many will.

In all of this, we must thank Linden Lab, especially Cory Ondrejka and Mark Lentczner, for making their protocol public; that made OpenSim legally possible. All other game companies are protective of their protocols, to the point that they go after anyone who tries to reverse-engineer them. That was not the case with Linden Lab, and we must thank them for that. They allowed us to focus on the server side. The fact that the client was available independently, for free, was no small thing either: it made the effort appealing to lots of people who got instant gratification of a Linden-like world on their own computers. More people means more eyes, more energy. MW and Lbsa were right on from the very beginning!

Going forward, the main challenge for OpenSim is to step back from the monstrosity of LLUDP and LLCAPs, which were designed for one very particular kind of environment, and figure out the minimum set of messaging that’s necessary to serve 3D scenes in RT to multiple clients. Minimum is good. The good thing about OpenSim is that the client protocols are plugin modules; OpenSim is not tied to any one particular client, not even LL’s (in theory; the reality of the code is a bit different… that’s why I say that stepping back from the LL protocol will be a challenge for OpenSim).
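Just to make the idea tangible, a “minimum set of messaging” might look something like the sketch below. The message names and the toy dispatcher are purely speculative on my part; they are not OpenSim’s actual plugin interface, LLUDP, or any existing protocol.

```python
from enum import Enum, auto

# A speculative "minimum message set" for serving 3D scenes in real time.
# Everything here is illustrative, not an existing wire format.
class Msg(Enum):
    JOIN = auto()            # client enters the scene
    LEAVE = auto()           # client exits
    SCENE_SNAPSHOT = auto()  # full scene graph sent on entry
    OBJECT_ADD = auto()
    OBJECT_REMOVE = auto()
    STATE_UPDATE = auto()    # high-frequency position/rotation deltas
    EVENT = auto()           # client input: clicks, touches, chat

def handle(msg, payload, scene):
    # A trivial dispatcher standing in for a pluggable client-protocol module.
    if msg is Msg.OBJECT_ADD:
        scene[payload["id"]] = payload
    elif msg is Msg.OBJECT_REMOVE:
        scene.pop(payload["id"], None)
    return scene

scene = {}
handle(Msg.OBJECT_ADD, {"id": 1, "pos": (0.0, 0.0, 0.0)}, scene)
handle(Msg.OBJECT_REMOVE, {"id": 1}, scene)
print(len(scene))  # 0
```

The point of keeping the set this small is that any client (an LL-style viewer, a WebGL page, something else entirely) only needs to speak these few message types to participate in the scene.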

I feel like we’re parting ways with a long-time, dysfunctional companion who decided to make a left turn, and I’m looking forward to what’s coming!

9 replies on “Thank you, Lindens”

  1. […] Diva did an excellent follow-up blog that’s a […]

  2. Erik N. says:

    Very interesting discussion about very interesting developments! From an educator’s perspective, I want to extend an additional thanks to OpenSim developers for developing a viable standalone platform for immersive learning environments, filling a need that no other platform has come close to providing. With your distribution, Diva, we’re on the verge of being able to develop and distribute 3D educational content easily and with limited technical know-how, which is crucial for schools with limited budgets and school IT professionals already stretched thin by the huge range of expertise and responsibilities our work requires. So I want to acknowledge that you and the rest of the OpenSim developer community have already carried this torch in a much-needed direction. The task needs some refinement to gain more adopters (not all educators dare touch a command line, for instance), but it’s doable enough to make a simulation, distribute the OAR file, and show people how to set it up in a 30-minute screencast, expecting that most people will succeed given enough interest and an idea of what they are getting into.

  3. mjhm says:

    I’m admittedly new to this virtual world work (my background is in film production CGI). However, I’m concerned that looking two years down the road and thinking that WebGL is the answer is a bad bet. If you really want to deliver rich 3D content with lots of avatars, you need to be pushing around a lot of geometry. Just a back-of-the-envelope calculation suggests that the best you can expect with, say, a 20 Mb/s channel is around 40000 verts of deforming geometry. And while 40000 verts may seem like a generous budget for MMOs, it’s teensy tiny for the kind of high-end graphics people are expecting from film and console games. On the other hand, that same 20 Mb/s could pretty comfortably transmit H.264-compressed HD video. So if you buy into the server-side rendering concept, the answer to the question of “the minimum set of messaging that’s necessary to serve [arbitrarily complex] 3D scenes in RT to multiple clients” is simply the bandwidth needed to send compressed video streams. Furthermore, you can send that same video to laptops, tablets, and phones with weak graphics capabilities. And possibly most importantly, developers can stop wasting brain cells on clever MMO cheats and work toward actually improving the visual appeal of these worlds.
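    The arithmetic behind that estimate is easy to reproduce. The sketch below uses assumed figures only (a 20 Mb/s link, 30 frames per second, per-vertex deltas as the entire payload) and shows the per-vertex budget that falls out:

```python
# Back-of-envelope bandwidth budget. All inputs are assumptions taken
# from the comment above, not measurements.
link_bps = 20_000_000                  # 20 Mb/s consumer channel
fps = 30                               # target frame rate
bytes_per_frame = link_bps / 8 / fps   # about 83 KB of payload per frame
verts = 40_000                         # deforming vertices in the scene
budget = bytes_per_frame / verts       # bytes available per vertex per frame
print(round(budget, 2))  # 2.08
```

    About two bytes per vertex per frame, which is only feasible with aggressive quantization and delta compression; uncompressed float32 positions alone would need twelve bytes per vertex.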

    Therefore IMHO if you are projecting 2+ years down the road, you had better be factoring server-side rendering into your plans. It may just be the kind of paradigm-shifting kick-in-the-pants that this field needs right now.

  4. Diva Canto says:

    @mjhm I’m using ‘WebGL’ as a representative token of technologies that will allow immersion on a web browser and that are non-proprietary. It might not be WebGL as it exists today. Server-side rendering is certainly an exciting alternative, especially when we start seeing open source solutions for it — as far as I know there aren’t any yet. That’s one kind of plugin that I can see being added to OpenSim. If someone knows the logic for doing that kind of rendering, it would be really easy to write a module for OpenSim that serves those video streams from the current scene graph.

    But independently of where the rendering happens, interactive environments send fast streams of individual events from the clients back to the server (so, in the opposite direction), and the server has to react to them. Those can’t be HTTP; they’re not video either; they’re somewhere in between.
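    One consequence is that servers for interactive environments typically coalesce these rapid-fire events rather than handling each one as a request. A minimal sketch of that pattern (the class and field names are made up for illustration):

```python
# Coalescing event queue: a common pattern for fast client->server input
# streams. Keep only the newest event per source between server ticks.
# Names and message shapes here are illustrative, not any real protocol.
class EventQueue:
    def __init__(self):
        self.latest = {}

    def push(self, source, event):
        # A newer event from the same source replaces the older one.
        self.latest[source] = event

    def drain(self):
        # Called once per server tick; returns the batch and resets.
        batch, self.latest = self.latest, {}
        return batch

q = EventQueue()
for i in range(100):  # 100 rapid-fire move events from one client
    q.push("avatar-1", {"type": "move", "seq": i})
batch = q.drain()
print(len(batch), batch["avatar-1"]["seq"])  # 1 99
```

    Keeping only the newest event per source caps the per-tick work no matter how fast clients fire input, which is part of what makes these streams different from both HTTP requests and video.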

  5. Could you comment on the possibility of using XMPP for sending fast streams of individual events from the clients back to the server?

  6. Diva Canto says:

    @Troy I’m not sure TCP connections are desirable here, at least not for some kinds of time-critical events.

  7. Breen Whitman says:

    @mjhm It’ll all be passed to the GPU, so it’s moot. A simple JavaScript statement could load, render, and animate a complex model and push it around some x-y-z axis.

    Re server-side rendering: that’s all well and good, but someone will have to have ownership of the thousands of square meters of server farms stacked to the gunnels with hardware, and make a viable business case for it. So fat clients are the way. Otherwise it will be exactly as it is now, with a Linden Lab-like company hosting, processing, and streaming the client views. Ownership remains proprietary.

    Nay: OpenSim, a hypergrid-type communication system, linking to self-hosted regions (probably a mega-region-like system available today), and communicating back to a WebGL client with a sufficient graphics card (which that person would already have in order to run their id Tech 4, Crysis, or Unreal-engined games).

  8. mjhm says:

    @Diva It’s good to know that server-side rendering is not architecturally excluded, even though it isn’t a realistic option right now.

    @Breen I have the courage of my convictions to put down $L250 that says fat clients become nearly irrelevant in 5 years.

    Here’s why:

    1. As I tried to point out, creating high-end graphics is more I/O-bound than compute-bound. Consumer internet pipes are orders of magnitude too narrow to transmit the volume of data that you need to get much beyond the caricatured look of MMO avatars, static environments, and simplistic particle effects, no matter what kind of badass GPU you have on the client. It’s a credit to the MMO developers and content creators that they can get such rich looks with so little bandwidth.
    2. The overwhelming success of laptops and, increasingly, mobile devices, together with the drive for lower prices on these devices, is going to push the industry to optimize client graphics capabilities for streaming video, not 3D rendering.
    3. On the other hand, the pervasiveness of these devices, combined with the growing acceptance of cloud-based “flat” content, is going to build a demand for, and an expectation of, cloud-based 3D graphics.

    The two developments to watch for over the next year are:
    1. How successful is “OnLive” (and other server-side game ventures: Gaikai, OTOY, Playcast, etc.)? If it is a roaring success, there will be a flood of companies hanging on their coattails. If it fails badly, well, then I guess I won’t be doubling down on my $L250.
    2. Public cloud-based GPUs. There have been rumors for nearly a year about Amazon and/or Rackspace introducing cloud-based GPU instances, and behind the scenes Nvidia and AMD are working on virtualizing GPUs. If, when, and at what price these become available will all point to what kind of business cases can be made for all graphics applications, from film production to medical imaging. This would also potentially open the door for non-proprietary virtual world hosting.

    I hate predicting the future because in a choice between outcome A and B, you invariably end up being blindsided with outcome C. But server side rendering has so much potential to open up virtual worlds — heavy particle effects, hundreds (or thousands) of avatars in a sim, ambient occlusion rendering, more realistic avatar to avatar interactions, sayonara to graphics system inconsistencies — that it would be foolish to write off. Even though there are clearly a lot of dots to connect to make it happen.

    Although fat clients rule the day today, and I wouldn’t switch careers yet, neither would I spend time crafting code for them as if they would be part of the Linux kernel. At least that’s what my $L250 says.

  9. Karen Palen says:

    @Erik (post #2) I suspect that there is a similar situation to FOSS in developing countries. Kids love to share their work and their best work deserves wide distribution.

    With a FOSS system the student or teacher simply does a dump of their setup (OS and all) and shares it with anyone they please. No proprietary system can allow this!

    In the 3D arena we have efforts such as: http://metatek.blogspot.com/2010/03/opensim-mars-simulation.html

    This package is generating a lot of excitement and is trivially shared and customized to individual needs.

Comments are closed.