Tuesday 5 April 2011

Streaming Video into Second Life/Opensim - The Easy Way

If you are considering streaming video into Second Life/Opensim (SLOS), either from a dynamic video source such as a webcam or from pre-recorded video files (this article concentrates on the latter), then you have several choices. First, the prerequisites:

  • You must be the owner, an estate manager, or a member of a group that owns the property you plan to stream into (unless you are simply supplying the stream to others who are)
  • Your video source must be in the QuickTime format, or in a format that QuickTime supports
Note: The two most common QuickTime formats are *.mov and *.mp4, but there are other supported formats, listed on the QuickTime wiki page. If the video you are attempting to stream does not play in QuickTime Player then it will not play in Opensim, so it is a good idea to download and install QuickTime Player and check that your video source plays OK. Details on the specific format used for SLOS are available here.

The choices available to would-be streamers are as follows:

  • Set up a video streaming server on a dedicated server or Virtual Private Server (VPS)
  • Lease a video streaming service
  • Do it the easy and zero-cost way
Setting up a video streaming server is not for the faint-hearted, and the streaming software can be quite expensive. There are open-source alternatives, but they can prove very daunting for the non-tech-savvy, as they often need compiling from source and are usually command-line driven.

Leasing a video streaming service, such as Streamhoster or StreamUK, can likewise prove costly, depending on the number of users who simultaneously connect to the stream. There are some low-cost or free video streaming sites, such as Ustream, but these give you much less control, and compatibility with the QuickTime format is not always assured.

The whole point of streaming is that you do not need to wait for a video file to download completely before you can start viewing it; you can start watching almost instantly. True streaming is one-to-many: if you are watching a stream and another user joins you, they start watching at the same point as you.

YouTube is the best-known example of video streaming, but it uses the Flash flv format, which cannot be used in SLOS. It is possible to access an mp4 version of YouTube videos using any of a number of browser add-ons, particularly for Firefox, that create a 'Download mp4' option on the YouTube page. Rather than downloading the mp4, you could right-click and use the Copy Link Location option to get the URL, which you can paste directly into the Media URL field of the About Land Media tab in SLOS. However, this often breaks as Google keeps changing the code for the mp4 streams on YouTube, and the URL links are now extremely long.

Do it the Easy Way
Until recently, I used the easiest method of all: I simply uploaded my mp4 files to a file hosting site and linked directly to them. This works, but it is not streaming. I have to wait a minute or two for the video to download before it starts playing, and others who join me must do the same, so we are all watching the same video but at different points. If you leave the parcel while the video is playing, then return, the video starts again from the beginning, no matter how briefly you were away.

The good news is that there is a middle way, one that is easy and has zero cost.

You can employ something called pseudo HTTP streaming, though rather than one-to-many it is many one-to-ones. This means that I can watch a stream, but when another avatar joins me, the video streams from the beginning for him rather than from the point my stream has reached. We both enjoy the advantages of streaming, however, and neither of us needs to wait for a file to download completely before watching.

How do you set up pseudo HTTP streaming?
Both mp4 and flv files can be streamed using pseudo HTTP streaming, and of course it is the mp4 format that we are interested in. Servers do not stream mp4 files by default; a server-side module is required to enable it. If you have your own server and wish to do this yourself, look here.
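The streaming module mentioned above typically works by honouring HTTP byte-range (or start-offset) requests, so the player can begin playback and seek before the download completes. As a rough illustration of what the server has to handle, here is a minimal sketch of Range-header parsing in Python (the function name and the single-range limitation are my own simplifications, not any particular server's API):

```python
def parse_range(header, file_size):
    """Parse a simple single-range HTTP Range header, e.g. 'bytes=500-999'.

    Returns an inclusive (start, end) byte pair, or None if the header
    cannot be satisfied. Multi-range requests are not handled here.
    """
    if not header.startswith("bytes="):
        return None
    spec = header[len("bytes="):]
    start_s, _, end_s = spec.partition("-")
    if start_s == "":                      # suffix form: 'bytes=-500'
        length = int(end_s)
        if length == 0:
            return None
        return (max(file_size - length, 0), file_size - 1)
    start = int(start_s)
    end = int(end_s) if end_s else file_size - 1
    if start >= file_size:
        return None
    return (start, min(end, file_size - 1))
```

A server honouring such ranges replies with status 206 Partial Content and only the requested bytes, which is what lets the player jump around inside the file.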

Fortunately, several free file hosting sites appear to be streaming-module enabled, so I will explain the workflow for using one of these sites.

Prepare your Files
You will first need to get your video files into the mp4 format. A good, free, and malware-free converter is Super (c), available from here (download link near the bottom of the page), and another is H.264, available here.

Super (c) Video Converter
As you can see, there are a number of parameters that can be set within the mp4 format. Some must be kept as standard, such as the frame rate of 29.97 frames/sec and the audio sampling frequency of 44100Hz. The others can be experimented with. As I do not need excellent video quality within my Opensim, I have gone for low settings: a 320:240 video scale size, and video and audio bitrates of 160kbps and 32kbps respectively. This translates to a typical three-minute music video file size of just under 20MB, which keeps well within the limits imposed by free file hosting sites and ensures a fast start when played. Even better would be to select the typical 3G video scale size of 176:144, which gets the file size down to 10MB; the difference in quality is almost imperceptible in SLOS.

Now, there is a trick to all this: many video studio suites and video encoders/converters produce mp4 files with their metadata located at the end of each file, and most do not inform you about this or give you any option to change it. This means that the entire file must be downloaded before playback can begin. By moving the metadata to the front of the file, however, the video can be streamed and watched almost instantly. A handy free utility is available to do just this, aptly called MetaData Mover. It works on a folder of videos rather than a single file, so create a new empty folder on your desktop and copy the files you want processed into it. Point MetaData Mover at the folder and it will process every file in it, moving the metadata from the back to the front. One file typically takes about 10 seconds to process.
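The metadata in question is the mp4 'moov' atom. A quick way to check whether a file is already "fast start" is to read the top-level atom headers and see whether moov comes before the media data (mdat). A minimal Python sketch, assuming plain 32-bit atom sizes (64-bit extended sizes and other edge cases are ignored):

```python
import struct

def top_level_atoms(data):
    """List the top-level atom (box) types in an MP4/MOV byte string.

    Each atom starts with a 4-byte big-endian size followed by a 4-byte
    type code; the size includes the 8-byte header itself.
    """
    atoms, offset = [], 0
    while offset + 8 <= len(data):
        size, kind = struct.unpack_from(">I4s", data, offset)
        if size < 8:
            break
        atoms.append(kind.decode("ascii", "replace"))
        offset += size
    return atoms

def is_fast_start(data):
    """True if 'moov' precedes 'mdat', i.e. playback can begin
    before the whole file has downloaded."""
    atoms = top_level_atoms(data)
    return ("moov" in atoms and "mdat" in atoms
            and atoms.index("moov") < atoms.index("mdat"))
```

Running is_fast_start over a file before and after MetaData Mover should show the moov atom moving from last place to the front.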

You are now ready to upload your files.

File Hosting
The first file hosting site I tried did the job perfectly, so I am sure there are many more that are streaming-module enabled. I used Fileden, which requires only the registration of a free account. Once you have uploaded a file you are given the direct URL to it. Not all file hosting sites provide this facility, so do shop around for one that suits you.

Once you have the URL to your mp4 file, just paste it into the media box of About Land and the video should start playing almost instantly. If you are using some kind of video jukebox, just load the jukebox with the URLs of your uploaded video files. Needless to say, ensure that you avoid copyright infringement.

Playing your Video Streams
Setting up a land parcel for playing video in SLOS, as mentioned previously, requires that you be a land owner, estate manager, or member of a group that has the privilege to set the Media for streaming video. This option can be accessed either by clicking the address in the title bar of the viewer, or by right-clicking the ground and selecting the About Land option, then going to the Media tab.

There are two main settings here for video. The first is the texture that will be replaced by the video when it plays, and the second is the URL for the video file. The texture can be any picture texture you like, but the same texture must be on the video screen that you intend to use. So don't use a common texture, like brick, or every house in your sim will turn into a video screen! You will also need to set up your Audio and Video settings in Preferences to enable streaming media. See Torley's video here on how to do all of this.

Have fun,

Sunday 3 April 2011

Kinect and SL/Opensim Animations: An Update

Following the previous article outlining my experiments with using the Kinect for creating animation files for Second Life and Opensim, I thought an update might be in order to summarise some of the lessons learnt, both by myself and other experimenters.

Since the last article on the 24th February, two of the key software applications used in the workflow, Brekel Kinect 3D Scanner and Bvhacker, have had further development. So what's new in the latest releases?

Brekel Kinect 3D Scanner
Two further releases have been issued since the last article:

v. 0.39
The most important change in this release for SL and Opensim users is that BVH positions are now OFF by default. Some experimenters were creating bvh files with Brekel and importing them into bvhacker, where they played OK; but once optimised by bvhacker and uploaded into Second Life, all kinds of problems were seen. As Deepgreenseas reported:
When I brought into SL, my avatar kind of floated oddly in the air, with arms straight out to the sides doing weird leg and abdomen contortions having nothing to do with what I saw in bvhacker. So, I'm at a total loss what I'm doing wrong. Any ideas?
The problem was that in the BVH Capture section of the Brekel window, there is a check box for "Write Positions". This was set ON by default, so the resulting bvh file had both translation and rotation data recorded, while bvhacker and SL/Opensim require only rotation data. In v0.39 this setting is OFF by default, so only rotation data is present in the bvh files.

Other improvements included easier installation, a more compact GUI, a warning when running below 30fps (and thus dropping frames), more tooltips, and the squashing of some bugs.

New features/improvements include:

  • knees/feet should be better now when turning around
  • Second Life format for bvh export option added *
  • hand rotations are now supported using some custom calculations 
  • added motor/LED support
    • you can tilt the Kinect up and down (see slider in the CPU/FPS dock)
    • the LED color will be set according to the skeleton calibration phase
    • please see the How To under the Help menu on how to update the driver 
This last improvement, the ability to control the Kinect movement, is a big benefit, and will save me from all those fiddly adjustments to my tripod settings.

* I have not yet tried the Second Life option when saving bvh files. I would be interested to know if it solves Twisted Pharaoh's problem below.

Bvhacker
The latest Bvhacker release is v1.8, and the new features and changes are as follows:

  • Added three new video tutorials to the help section
  • Now possible to merge joints with their parent - useful for converting skeletons with roll bones
  • Now possible to delete descendants - useful for removing multiple finger or toe joints
  • Now possible to zero out descendants - see new video tutorial for use
  • Added preferences dialog (Ctrl P)
  • Now possible to set the default file view for open and save dialogs
  • Now possible to show or hide the head mesh display
  • Added Zoom In and Zoom Out buttons to the view shortcuts
  • Removed 'Show first frame' view option

There is also a note on the file properties pane saying that the joint count includes End Sites. This confused me at first, as earlier research had shown that SL/Opensim typically uses 24 joints, yet when I counted them in bvhacker I could only see 19. This is because the 5 End Sites (one for the head and two each for the hands and feet) must also be added. End Sites can also be manipulated, to make the feet point backwards for example, but I will leave it to the more imaginative experimenters to decide how to use them.
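The 19 + 5 = 24 count can be verified mechanically: in a BVH hierarchy every named joint appears on a ROOT or JOINT line, and every End Site on an "End Site" line. A small Python sketch (the function name is my own):

```python
def count_joints(bvh_text):
    """Count joints in a BVH hierarchy, including the ROOT and End Sites.

    SL/Opensim-style skeletons typically report 24 here: 19 named
    joints plus 5 End Sites (head, both hands, both feet).
    """
    lines = [line.strip() for line in bvh_text.splitlines()]
    named = sum(1 for line in lines if line.startswith(("ROOT", "JOINT")))
    end_sites = sum(1 for line in lines if line.startswith("End Site"))
    return named + end_sites
```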

I will now provide some further info about joints/bones and some of the problems faced by differing naming conventions used in various applications.

Some sources refer to the constituent parts of a bvh file as being composed of bones. This also had me confused at first, but Dave (the author of bvhacker) explained it well:
Although the terms 'bone' and 'joint' can be used interchangeably, I prefer 'joint', as the position, rotation and translation apply to the point in space where the joint actually is. Mathematically, the 'bone' is actually just the space in-between two joints, so I personally find it easier to think in terms of joints instead of bones.
Unfortunately, SL/Opensim mainly uses the bone naming convention, so rForeArm is the bone between the wrist and elbow in the right arm.

The NITE tracking software for the Kinect and Brekel use joint names, such as RightWrist and RightElbow. The list of joints they use is:

  • Hips
  • Chest
  • Neck
  • Head
  • End Site

  • LeftCollar
  • LeftShoulder
  • LeftElbow
  • LeftWrist
  • End Site

  • RightCollar
  • RightShoulder
  • RightElbow
  • RightWrist
  • End Site

  • LeftHip
  • LeftKnee
  • LeftAnkle
  • End Site

  • RightHip
  • RightKnee
  • RightAnkle
  • End Site

The list of bones/joints used in SL/Opensim are:
  • hip
  • abdomen
  • chest
  • neck
  • head
  • End Site

  • lCollar
  • lShldr
  • lForeArm
  • lHand
  • End Site

  • rCollar
  • rShldr
  • rForeArm
  • rHand
  • End Site

  • lThigh
  • lShin
  • lFoot
  • End Site

  • rThigh
  • rShin
  • rFoot
  • End Site
As you can see, the NITE tracking software for the Kinect does not use both torso joints (chest and abdomen), just the chest, which explains the difference in the number of joints/bones: 23 versus 24.
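Reading the two lists off against each other position by position gives the renaming table that a converter effectively has to apply. A Python sketch of that mapping and a naive renamer (illustrative only; the mapping is simply the two lists above aligned, and real bvh files may need more careful handling):

```python
# Positional mapping from NITE/Brekel joint names to SL/Opensim bone
# names, read off the two lists above. The torso differs: NITE's single
# Chest spans SL's abdomen + chest, so abdomen has no NITE source joint.
NITE_TO_SL = {
    "Hips": "hip", "Chest": "chest", "Neck": "neck", "Head": "head",
    "LeftCollar": "lCollar", "LeftShoulder": "lShldr",
    "LeftElbow": "lForeArm", "LeftWrist": "lHand",
    "RightCollar": "rCollar", "RightShoulder": "rShldr",
    "RightElbow": "rForeArm", "RightWrist": "rHand",
    "LeftHip": "lThigh", "LeftKnee": "lShin", "LeftAnkle": "lFoot",
    "RightHip": "rThigh", "RightKnee": "rShin", "RightAnkle": "rFoot",
}

def rename_bvh_joints(bvh_text):
    """Rewrite NITE joint names in a BVH hierarchy to SL bone names."""
    out = []
    for line in bvh_text.splitlines():
        stripped = line.strip()
        for prefix in ("ROOT ", "JOINT "):
            if stripped.startswith(prefix):
                name = stripped[len(prefix):]
                if name in NITE_TO_SL:
                    line = line.replace(name, NITE_TO_SL[name])
        out.append(line)
    return "\n".join(out)
```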

Bvhacker is designed to do the conversion between standard bvh files in the joint-naming format and the SL format with its bone-naming. However, it has been noted that the joint names from NITE/Brekel were causing problems in Bvhacker, which renamed two different joints to the same bone name. This was found by Twisted Pharaoh in the SLuniverse forum:

BVHacker renames 2 joints lShldr and 2 others rShldr.

What you should do is rename LeftShoulder into lCollar manually, RightShoulder in rCollar then do the Attempt SL Joint naming.
To manually rename joints in Bvhacker, just load the bvh file, then in the joints pane on the left select the joint, right-click, and select Rename Joint.

Other Tips
Sarah Kline:
I discovered assuming a T pose before recording starts helped the shoulder rotation problems I had when bringing in to bvhacker.
If you have any other tips or suggestions for improving the current workflow (if you find a cure for the slight jitter, for example), then please add your comments to the article and I will provide a further update later.

Tuesday 15 March 2011

Moving Objects between Opensims

To move/copy your objects from your Opensim into another Opensim or Opensim-based world (where it is permitted) you need to do the following:

Install libomv and use its TestClient to login to your Opensim

1.      Download a copy of libopenmetaverse (libomv). The download page can be found here:

2.      Once you have downloaded and installed libomv you might want to create a shortcut on the desktop to the most frequently used tool in the toolbox, the TestClient, located in the bin folder.

3.      A full list of all the TestClient commands can be found below, but for now the simple procedure for logging in to a grid will be explained:

Start the TestClient; a DOS-like window will open. Use the login command, which has the format: login firstname lastname password [simname] [loginuri]. For example,

login Fred Flinstone yabadabadoo http://myopensim.dyndns.org:9000

If you wanted to log into a grid, but into a particular sim or region, then use:

login Fred Flinstone yabadabadoo Bedrock http://myopensim.dyndns.org:9000

i) If you do not specify a loginuri then TestClient will use the default address for SL.
ii) Be patient: it usually takes a few seconds for commands to complete, and you often have no indication that a command is working in the background until it completes.
iii) Although the TestClient says 'Type help for a command list', this will not work until after you have logged in an avatar.

4.      If the login was successful, then you should see a window like the one below, and you are now ready to start issuing commands.

TestClient Login

Exporting and Importing Objects between Opensims
To export (backup) objects in xml format to your hard-drive follow these steps:

1.      Login to your Opensim using a regular client (Hippo, Phoenix, Imprudence, etc.), and rez on the ground all the objects you wish to export. You can now either log out or stay logged in, but in the latter case you will need to log in to your Opensim via libomv using a different account.

2.      Login to your Opensim with TestClient using the procedure above. Make sure you log in to the same sim or region where the objects are rezzed. If this is not the case, you can teleport there using the goto command:

goto Bedrock/100/100/30

3.      Perform a scan for the rezzed objects using the findobjects [radius] command:

findobjects 20
(That will find all rezzed objects within a 20m radius and provide a list of them, with their UUIDs.)

4.      Copy the list of objects and paste the results into Notepad (just so you can copy/paste the UUIDs more easily, but see the note below).

5.      Objects can now be exported using the export [UUID] [filename] command, e.g.:

export 50680f90-4e1d-11e0-b8af-0800200c9a66 barstool.xml

a) To Copy/Paste in the DOS-like cmd window see this How-to:
b) The barstool.xml file will be saved in the bin folder.

6.      To import your barstool into another Opensim, login using the TestClient, then issue the import [filename] command, e.g.

import barstool.xml

a) For complex objects it can take some time for the object to be recreated. You should NOT interrupt the TestClient during this process until it informs you that the import has either completed or failed with an error.
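If you have many objects to export, the copy-into-Notepad step can be scripted: anything UUID-shaped in the findobjects listing can be pulled out and turned into export commands. A hedged Python sketch (the helper name is mine, and the exact listing format varies between libomv versions, so this just matches anything shaped like a UUID):

```python
import re

# Matches the standard 8-4-4-4-12 hex UUID layout.
UUID_RE = re.compile(
    r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-"
    r"[0-9a-fA-F]{4}-[0-9a-fA-F]{12}")

def export_commands(listing, name_prefix="object"):
    """Return one TestClient export command per UUID in the listing."""
    uuids = UUID_RE.findall(listing)
    return ["export %s %s%d.xml" % (u, name_prefix, i)
            for i, u in enumerate(uuids, 1)]
```

You can then paste the generated commands into the TestClient window one at a time.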


All Test Client Commands

Thursday 24 February 2011

Creating SL Animations using the Kinect

I have been experimenting of late with the Xbox Kinect as a cheap Mocap source to generate BioVision Hierarchy (BVH) files for upload into Second Life, and the results are pretty encouraging.

What you Need

A standalone Kinect, with power supply (this comes as standard when purchased standalone). If you buy the Kinect as part of an Xbox bundle you will need to buy an additional cable, sold separately, to connect it to the PC. I bought my Kinect from a local gaming store for €139.

Kinect, showing its USB connector and power adaptor
Brekel Kinect 3D Scanner
SensorKinect Drivers
NITE User Tracking Module

For my procedure I used a laptop running 32-bit Windows XP, although Brekel Kinect is confirmed to work with Win7 x86 & x64 as well as XP x64 & x86; no Mac/Linux version of Brekel Kinect is planned.

1. Once you have unpacked your Kinect you need to place it on a table or a tripod (which is what I did), at around 0.6m - 1.8m from the ground. You can plug it into a power socket at this stage, but do not connect the USB connector to your PC/laptop just yet.

Kinect mounted on an inexpensive camera tripod
2. Download and install the following software in this order:

Brekel Kinect 3D Scanner v0.36

OpenNI Alpha Build for Windows v1.0.0.23

PrimeSensor v5.0.0 (Modules for OpenNI)
Click the Downloads button and choose the zip file. After downloading extract the avin2-SensorKinect folder, and within this go to the bin subfolder, and run SensorKinect-Win32-5.0.0.exe

PrimeSensor v5.0.0 with correct options selected
PrimeSense NITE Beta Build for Windows v1.3.0.17
Use this key during installation: 0KOIk2JeIBYClPWVnMoRKn5cdY4=
(This is NOT a pirate key)

NITE Installation Wizard
Copy and paste the License Key from above

 Now download and install Bvhacker 

3. Once the software is installed you can  connect the Kinect USB lead. The Found New Hardware dialog should appear.
No need to connect to Windows Update
When it asks if it can connect to Windows Update, choose No, not this time. On the next dialog screen choose to Install the software automatically.

The devices that will be detected are the Kinect Motor, Xbox NUI Audio, and Kinect Camera. The drivers for the Xbox NUI Audio will not be found and its installation will fail, but ignore this; they are not needed.

Device Manager showing the new device, PrimeSensor
When the drivers have been installed you can check that the hardware and its drivers are ready to be used by Windows by going to Device Manager (Start, Settings, Control Panel, System, Hardware, Device Manager) and you should now see a new entry, PrimeSensor, and under that will be the Kinect Camera and Motor.

4. You can now launch Brekel Kinect.

Brekel Kinect GUI
Brekel may launch with a Dark skin. You can change this by unchecking it under the Window menu item. I found I had to do this, as some of the check boxes did not show up well against this almost black skin.

The only setting you now have to change is to enable NITE User Tracking at the centre bottom of the window.

The only setting to change, make sure NITE User Tracking is enabled
Once NITE User Tracking is enabled you should get a message in red on the central image that says: User1 [Looking for Psi pose]. Ensure that your full body is visible in the video windows, then adopt the Psi pose (like you are a trident, or surrendering). Once the pose is recognised it will take just a few seconds for tracking to commence (be patient, and KEEP STILL!), and the message should now go green.
Psi Pose
As soon as you are being tracked (indicated by white dots covering your body) you can start recording animations. Start with simple ones like the ubiquitous salute or hand clap. To start recording to BVH, click the Start Capture BVH button (near the top right). If you are on your own you may find that tracking is lost if you go out of the Kinect's field of view; in this case, either keep the PC/laptop within reach or have a friend start the capture. There is a four-second delay (useful if you are on your own) before recording starts. When done, click the Stop Capture to BVH button.

For full details on the capture procedure, and what can go wrong, go to the main menu, Help, How To - Capture motion to a BVH file. Additional resources may also be found on the main Brekel website.

Brekel can save BVH in two main formats: HumanIK (used by MotionBuilder and others) and Biped (used by 3DSMax). Neither can be imported directly into Second Life, as the naming convention for the root, joints, and bones is slightly different; this is where Bvhacker comes in.

To prepare the BVH file for Second Life, start Bvhacker, then File, Open, and load the BVH file. Once loaded, you can play it to see what it looks like using the Start button at the bottom of the window. If all is well, go back to the menu, Hack, and choose Attempt SL joint naming. This should go without a hitch, and you can now save the file out, ready for importing into Second Life.

Have fun,

Tuesday 5 October 2010

Is Linden Lab in Self-Destruct Mode?

Following Linden Lab's internal restructuring, which saw it lose 30% of its staff, and the recent decision to close the Teen Grid with little or no consultation, it is now about to abandon discounts for Educational and Non-Profit (EDNP) organisations from the 1st January 2011.

Will there be an exodus of a large number of EDNP organisations after the 1st January? I think so.

Currently EDNPs pay US$147.50 per month as their Maintenance Fee. All renewals after the 1st January will be invoiced at the standard rate for Private Regions, i.e. US$295, double the current rate. Here is a selection of responses to this news:
Oh dear, this is very bad news. Budgets for educators are often set well in advance and in some cases very fixed even looking across a grant period of some years. In our case I have just negotiated funds to renew some of our regions from 12 months from now, so changes on a short time scale that double prices are quite a shock. In one case I have funding fixed for 2years out too.
Ai Austin
As one of your Educational customers I am furious! Fiscal Year Budget planning and approval happens in Aug-Sept for most Universities then you make an announcement like this in Oct!! I just had my budget approved 2 weeks ago and now I gotta go beg my superiors to approve an increase to the budget despite budgets being cut on every level of the University due to the economy. All of our sims exceed the maxium capabilities of Homesteads and OpenSpaces so there's no way we could use them.
Ron Ghostaltar
Doubling the price of regions for nonprofit and educational use strikes me as a really bad idea. Do we really need a mass exodus of these important members of the Second Life community at this time?
Shirley Marquez
I'd say that about 75% of all educational or non-profit organizations will leave next year.
Yuukie Onmura
Ah well Linden you have finally made the decision that tips the balance for us.  I have been hanging on in SL as I still truly believe in the value of the educational community that has been built here but over the last few months it has become increasingly difficult to justify not moving to OpenSim.  This however will be the final straw as I see no way that my institution will be able to justify paying double the price for our two islands. Fortunately we still have until Aug 2011 but I suspect we will be long gone by then!
Arwenna Stardust
Of course, some may abandon their virtual campuses altogether rather than face a doubling of their invoices, but if there is to be an exodus, there has to be somewhere to move to. So just where is the Promised Land for EDNPs?

The obvious alternative is any of the worlds built using Opensim, the Second Life open-source clone, which will give them the same look and feel as their current regions, and allow them to use the same viewer. However, one of the prime considerations for EDNPs is to avoid having to live cheek by jowl with some of the more 'adult' activities that go on in these virtual worlds.

Fortunately, several Opensim-based grids have sprung up that cater exclusively for the EDNP communities.

Cybergrid is a German-language grid for young people aged 12-17, and consists of 5 regions with over 500 registered users. It is the homeland of the Cyberland-Jugendcommunity, netzcheckers, and netbridge. Regions cost a €120 setup fee and a €40/month maintenance fee.


The jokaydiaGrid Project has a number of aims including:
  • Exploring ways to ‘multi-grid’ – eg. creating strategies, techniques and best practices for creating presence across a range of virtual worlds, and learning how to best use each environment for and to its best advantage
  • Engaging kids in our virtual worlds adventures – the jokaydiaGrid gives us the freedom to create a PG environment which is much more viable for k-12 educational use
  • Learning about Opensimulator – we are excited to be joining the opensource virtual worlds community and look forward to both learning about and contributing to the development of opensource virtual worlds options for education
  • Developing new 3D educational resources – Leveraging off the flexibility available to us on the opensim platform for public and private delivery (without the scary pricetag!)
  • … and most importantly to Play! We aim to continue to create a community that learns, inspires, explores and shares.
Regions on the jokaydiaGrid cost a $50.00au setup fee + $25.00au per month.

ReactionGrid Inc. is a 3D world development company with offices in Orlando, FL and Boston, MA. Their focus is the educational, business, and entertainment use of 3D environments: similar to a modern video game, but oriented towards return on investment, whether that is time saved, data visualised, or collaboration sessions for training or communications.

Their clients range from Fortune 100 firms like Microsoft, IBM, Raytheon, Siemens, to government entities like the Veterans Administration to institutions like Boston College, Future University Japan, Hong Kong Polytechnic, University Autonomous Mexico and more.

ReactionGrid Inc. hosts and develops virtual worlds, and provides products that enable you to express your 3D ideas and share them with your colleagues, peers, and community. They deliver their systems virtualised on a platform they call Harmony.

They are able to deliver templated virtual worlds for specific industries and use cases, both in the cloud on their hosted servers and firewalled. They make sure your system is set up quickly and support your project needs as you grow.

Because of ReactionGrid's firewalled solutions, all the grids they host (the jokaydiaGrid above is an example) can enjoy security and privacy, and are thus ideal for hosting educational projects.

ReactionGrid has a range of prices for Regions, starting from just $75, full details here.

The goal of ScienceSim is to enable new usages in education and visualisation through the construction of persistent 3D spaces built and deployed by a federation of organizations and users.

To accomplish the goal, they propose to create a foundation with three objectives:
  • Maintain a stable distribution of the OpenSim 3D application platform
  • Document best practices for the use of OpenSim in science and education
  • Provide content and applications to support those best practices
 They propose to establish the foundation in two stages. The first is an interim stage focused on developing a stable release of the OpenSim code base. The second stage creates the full foundation structure.

ScienceSim is primarily a grid: it is mainly used by EDNPs that run their own Opensim-based world on their own PC or server and would like it connected to a grid of other like-minded worlds. Currently, ScienceSim provides its grid services free of charge.

Getting Help
EDNPs who need help migrating from Second Life to another grid may wish to consider the services of Firesabre, a company that specialises in this type of work.

Friday 16 July 2010

Haptics: The Next Big Thing for Virtual Worlds?

Some recent advances give us a clue to what might next be in store for Virtual World and 3D gaming development in the not-too-distant future, and they all centre around Haptics: tactile feedback technology that applies forces, vibrations, mild electric shocks, and/or motions to the user to simulate the sense of touch. Here are some examples of the direction haptic technology has taken recently.

The Holodeck
Fans of Star Trek will be familiar with the holodeck, depicted as an enclosed room in which objects and people are simulated by a combination of replicated matter, tractor beams, and shaped force fields onto which holographic images are projected, so that the user appears to be in a nightclub, on an alien world, on their home planet, and so on. Replicated matter, tractor beams, and shaped force fields may remain science fiction for now, but holographic images that can be touched are no longer in that realm: they are science fact.

Researchers at the University of Tokyo demonstrated the principles of Touchable Holography at the SIGGRAPH 2009 exhibition in New Orleans last August. As they said at the time:

Recently, mid-air displays are attracting a lot of attention in the fields of digital signage and home TV, and many types of holographic displays have been proposed and developed. Although we can "see" holograhpic images as if they are really floating in front of us, we cannot "touch" them, because they are nothing but light.

This project adds tactile feedback to the hovering image in 3D free space. Tactile sensation requires contact with objects, but including a stimulator in the work space dilutes the appearance of holographic images. The Airborne Ultrasound Tactile Display solves this problem by producing tactile sensation on a user's hand without any direct contact and without diluting the quality of the holographic projection.
By using a non-linear property of ultrasound, called acoustic radiation pressure, the researchers were able to replicate the sensation of touch when a user placed a hand beneath a holographic ball, and even produced the sensation of splashes when a hand was placed beneath holographic raindrops, which really did feel as though they were splashing onto the user's hand.
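For readers who like the physics, the effect can be summarised in the standard textbook form below. This is my own summary of acoustic radiation pressure, not a formula quoted from the researchers' demonstration:

```latex
% Acoustic radiation pressure exerted by an ultrasound beam on a surface:
%   P      = radiation pressure felt by the skin
%   E      = acoustic energy density of the beam
%   p      = RMS sound pressure, \rho = air density, c = speed of sound
%   \alpha = a constant between 1 and 2, depending on how much
%            of the sound the surface reflects
P \;=\; \alpha E \;=\; \alpha \, \frac{p^{2}}{\rho c^{2}}
```

Because the pressure scales with the square of the sound pressure, modulating the amplitude of the ultrasound modulates the perceived force, which is presumably how discrete sensations such as "raindrops" can be produced.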

I think I know where this technology is heading. How long before the first holographic boyfriend or girlfriend makes its debut: one year? Two?

Transparent Touch Screens
Remember those amazing touchable computer graphics on glass screens in the hit movie Avatar last year? Science fact caught up with that science fiction with indecent haste: within a month of the film's release, Intel were demonstrating such a screen at CES 2010. This amazing screen, powered by the i7 processor, was capable of rendering almost 1 million polygons in real time.

I want one! Make that 2!

Touch Screens that Touch Back
One of the leading pioneers of haptics is the Russian scientist Dr Ivan Poupyrev, currently senior researcher at Disney Research Labs. In an article carried by the BBC last week he explained that:

The basic goal of the technology we are developing at Disney is to create a perception of texture - to let people 'feel' objects on screen by stroking them with their fingers.

We do this by applying a high voltage to a transparent electrode on the glass plate - in this case people will feel a texture on the glass. By varying the frequency and amplitude of the signal we can create different sensations.

The results can recreate the feeling of paper or a textile, simulate the smoothness of glass and even the roughness of sand paper.
While the aim of the current research is more focussed on handheld mobile devices, the scope for this technology in larger PC screens has not gone unnoticed.

Another leading light in this field is Esterline Technologies, which is already offering vibration feedback technology to the medical, defense, and gaming industries, enabling display screens to give the sensation of touching back when pressed.

The advances in this industry never cease to amaze me.

Tuesday 13 July 2010

IllFonic Licenses CryEngine3 for Futuristic Arena First-Person Shooter

Denver, CO USA / Frankfurt, Germany – July 13th, 2010: Crytek GmbH (“Crytek”) and IllFonic, LLC announced today that IllFonic has licensed CryENGINE 3 for Nexuiz, their upcoming XBLA and PSN cult futuristic Arena First-Person Shooter dropping this Winter. CryENGINE®3 has allowed the IllFonic development team to achieve their vision for Nexuiz that will push the limits of what gamers can expect from an AAA digital downloadable title.

“IllFonic firmly believes in bringing the consumer the highest quality games at an affordable price through downloadable distribution channels”, said Charles Brungardt, President of IllFonic. “Switching to CryENGINE 3 has helped us stay true to our vision and build the Nexuiz arenas the way we see it without any limitations. We are incredibly proud that Nexuiz will be the first downloadable title developed on CryENGINE 3.”

"We’re delighted to have Illfonic join our community of licensees”, said Carl Jones, Director of CryENGINE Global Business Development. “It’s exciting to see a passionate group like Kedhrin and Charles’ team working with CryENGINE 3 on such a cool title. Nexuiz is going to deliver a game style that will be a blast for the console audience, matched with the best graphics possible on the consoles. We’re delighted to offer our engine to teams for XBLA and PSN titles so that gamers can enjoy the quality that CryENGINE 3 can provide, as soon as possible. Our real-time multiplatform pipeline, Live Create, is highly suited to prototyping and delivering quality for games with shorter development cycles; and you get all the benefits of the AAA features of the engine. Nexuiz is going to be a lot of fun and we’re glad Illfonic have chosen CryENGINE 3 to deliver it in style.”

"When we were strolling around GDC 2010's floor we stopped by the Crytek booth. I watched someone show off a few features of CryENGINE 3. Right then and there, I knew I had to have it. It's powerful, fast and easy to use,” said Kedhrin Gonzalez, Creative Director of IllFonic. “Crytek has been awesome to work with providing excellent support in a relationship that has really benefited us."

Nexuiz is a fast paced Arena first-person shooter with competitive game play built specifically for consoles. Featuring the innovative mutator system, players progress through the ranks opening up new mutators that allow players to alter the rules for each match. On launch, Nexuiz will feature multiplayer modes including Team Deathmatch and Capture the Flag complete with full competitive leader boards designed for social networking. New games modes, models, and maps will be available as downloadable content post launch.

Nexuiz is set in a galactic war fueled for centuries by the Kavussari and Forsellians. Over time the two races entered into treaties with the Herald Accord, a union between different cultures in the galaxy. Even though peace settled across their planets, the seething hatred between the races kept the fire of war simmering under the fragile truce. Sensing their newest members could spread war throughout the galaxy; the Herald Accord gave the Kavussari and Forsellians a choice. Pit their warriors against each other in the arena rather than on the fields of war, or face total annihilation. The Nexuiz was formed, a series of battle arenas on the home planets of the Kavussari, Forsellian and the desolate planet of Atavirta.

IllFonic will be showcasing Nexuiz using CryENGINE 3 at this year’s PAX Prime in Seattle, WA, on September 3-5.

For more information on Nexuiz, go to www.nexuiz.com or www.illfonic.com.

About IllFonic:

IllFonic, LLC, was founded in 2007 by musician Raphael Saadiq, engineer Chuck Brungardt, and game designer Kedhrin Gonzalez. IllFonic is committed to delivering AAA games digitally to consoles and PC at an affordable price. IllFonic utilizes many avenues in pop culture to cross brand its products in film, television, sports, music, and clothing. With offices in Los Angeles and Denver, IllFonic has built a team of artists, developers, producers, and musicians that believe providing fun game-play means conveying the highest level of visual awe, an immersive environment and a sick soundtrack. For more information on the company, go to www.illfonic.com.

About Crytek:

Crytek GmbH ("Crytek") is one of the world’s leading independent development studios for interactive entertainment. It is based in Frankfurt am Main (Germany) and has additional studios in Kiev (Ukraine), Budapest (Hungary), Sofia (Bulgaria), Seoul (South Korea) and Nottingham (UK). Crytek is dedicated to creating exceptionally high-quality video games for next-generation consoles and PC, powered by their proprietary cutting-edge 3D-Game-Technology, CryENGINE®. Since its foundation in 1999, Crytek has created the multi-award winning PC titles Far Cry®, Crysis® (awarded best PC Game of E3 2007 and Best Technology at the 2008 Game Developers Choice Awards) and Crysis Warhead® (awarded Best Graphics Technology at IGN Best of 2008 Awards).

Media Contact:
North and South Americas/ Asia
Tricia Gray

Europe/Australia/New Zealand
Chris Clarke
Tel: 00 44 208 6708425
Mobile: 00 44 7590 509278

Sunday 9 May 2010

Virtual Shopping

An article by Guest Contributor, Miidasu

The real life economy may be undergoing a slow, torturous recovery, but virtual world economies are thriving. Well, that’s not entirely true. Many worlds died in the last few months: There.com, Vivaty, and Metaplace. Still, Second Life announced that their virtual economy hit a high in the first quarter of 2010. IMVU is also looking at the new year optimistically, according to Tech Crunch. Even social games like Farmville are hitting the big bucks.

Virtual worlds reflect the real world in many aspects (they don't call them virtual "worlds" for no reason). In particular, user-generated virtual economies are similar to the globalized capitalist system: they rely on creative entrepreneurs to run businesses, on creators to supply, and on buyers to demand.

The question I have is: why is there demand? Granted, I am not in college anymore; don't mistake this as some sort of academic inquiry. I am just an intrigued metaverse lover. I understand the desire to create items, but purchasing items with money I could use for a real material object? What's the reason for it?

What is the appeal of virtual goods? Understandably, there are functional goods that increase performance, give more features and such. However, what entices people to purchase the aesthetic goods, like clothing, furniture, or even poses, with money they could use for real world items?

There are plenty of academic studies on why people purchase virtual goods, features, performance enhancements. Maybe I am way behind on my academic reading, but I am surprised that I have not seen the subject of accessibility included in some articles.

The Information Age is known for accessibility. I can get information in seconds, and download music and movies in minutes; in short, I can get what I want when I want it. We are a generation that craves instant gratification: we want to achieve goals now, for short-term satisfaction. Case in point: instant tea versus a fresh brew, microwaves versus cooking, movies versus books, and so on. In the same way, virtual goods are easily accessible. A massive dose of instant gratification at the tips of your fingers.

Getting around a few worlds can be difficult, but shopping for clothes and other items is increasingly easy. IMVU and Frenzoo have shops that are easy to navigate. Items can be tried on without demos, the details are listed in one place, and there is no transferring of products.

Shopping in Second Life is more difficult, especially for those who don't know how the world works. Still, Second Life is fun because it is the most interactive shopping I have experienced, at least in the three worlds I am a part of. I used to have Friday night shopping sprees with a friend across the country. We would explore shops with our avatars, try on items, and ask each other what we thought. It's almost like a real world shopping experience. That's not to say other worlds aren't interactive. IMVU shop owners are getting creative and making boutique rooms: rooms where they display their items. Standing on a node next to an item opens options to purchase or try it on. Of course, you have to find the correct boutique by searching through an endless list of rooms.

The point is that shopping in virtual worlds is more accessible than shopping in real life. I don't have to fight my way through traffic. In fact, I don't have to get up at all, and I am still achieving a goal (the goal being a sense of satisfaction). Not to mention, virtual worlds are almost always open. Even when there is little money in your wallet, virtual worlds are there for a leisurely escape, a social chit chat, or a good old exploration.

They are almost always open, 24/7, for your entertainment. Instant-gratification fix at your pleasure.

Enjoy, shopaholics.

Copy-Editor of Frenzy, a Frenzoo.com magazine

Thursday 6 May 2010

Virtual Farming: Where there's muck there's money!

Although I am taking a sabbatical from writing for a while, I just could not resist this one.

Business Insider reports that Zynga, the company behind Farmville, the popular Facebook game, has been provisionally valued at US$4 billion! When did lost kittens, lonely cows and manure get so popular?

Sunday 25 April 2010

Diary: 25th April, 2010

I am taking a break from writing and updating my blog for a little while, while I work on finishing a book. I hope to continue the blog at the end of May, or early June.

Have fun, and if in the meantime any of you would like to submit an article, on any aspect of Virtual Worlds, from whatever angle you like, then feel free to contact me via a Comment to this post, and I will be happy to publish it on Chapter & Metaverse.

Best Regards

Sunday 11 April 2010

The Birth and Adolescence of Web.Alive and the 3D Web

For some time Nortel had been pondering what the 'connectedness' of the Internet, broadband access, camera phones, voice-over-IP, instant messaging, social networking, video uploading, etc., all meant for businesses and organizations, and how this connectedness could be turned into a competitive advantage.

To answer this question Nortel commissioned IDC, the global market intelligence firm, to conduct a global study of almost 2,400 working adults in 17 countries. The study focused on quantifying the state of today's connectedness, tracking its acceptance and use across devices and applications, as well as determining the pace of its growth and its impact on the enterprise. The report was released in May 2008 under the title "The Hyperconnected: Here They Come", in which IDC stated that 16% of business users were already hyperconnected, and predicted that the figure would increase to 40% over the next five years.

The following month, in June 2008, Nortel released a demo of its virtual world in a browser, web.alive, that it had been working on for years, under the codename, Project Chainsaw. This was a step-change from the hardware-focused telecommunications giant, into the realms of software and services.

Things then began to move quite rapidly. In August 2008 Nortel acquired Diamondware in order to provide integrated 3D voice to its platform, and in January 2009 they licensed the Unreal Engine 2.5 to replace the existing engine. They hoped that talented creators of content for Epic's Unreal Engine-based games would bring their skills to web.alive.

At the same time, Lenovo, the laptop maker, announced that they would be launching an e-commerce application, an eLounge named the Lenovo Virtual Showroom, using web.alive. This was an expansion of Nortel's original marketing view of web.alive, which had focused on enterprise-level collaboration and training applications. A very nice tour of the Lenovo eLounge then appeared on Dennis Shiao's blog, It's All Virtual.

All was not well, though, at Nortel, the parent company. Despite the strength of their web.alive team, Nortel filed for bankruptcy protection in January 2009. Nortel then went into financial meltdown and, over the course of 2009, into massive sell-everything-but-the-kitchen-sink mode. However, various suitors had shown keen interest in the Enterprise Solutions unit, which included web.alive, so an auction was organised. The eventual winner was Avaya, and by the end of 2009 the sale of Nortel Enterprise Solutions to Avaya had been completed, and with it web.alive and the entire web.alive team. The reported price was $915 million.

During the meltdown, Nortel continued to develop web.alive, which was still in beta. It was a great platform: web-based, so no huge downloads were required, and available to anyone with a browser, regardless of the operating system. In November 2009 web.alive beta 2 was released, with an impressive array of features, including the ability to drag and drop documents into web.alive to make presentations.

Now that the future of web.alive has been secured, what about the future of the product itself?

Since the acquisition of its first customer, Lenovo, web.alive has acquired a few more, and the following have been identified (with the help of user wa 723, in the web.alive community forum):

My first impressions were not so good. By the time the browser plug-in had loaded and my avatar was ready to start moving around, I had started to lose interest. I could have explored the first three or four products in a regular 2D store in the time it took the 3D store to become ready to explore. I then found that wandering around a virtual store was nowhere near as focused as using a 2D store. In a 2D store I could search for all laptops under $500, get a list, and start to examine their looks and specs, something I could not do in the 3D store.

Clicking links in a 2D store is also a whole lot quicker than walking around a store (virtual or real). Of course, it is also claimed that one of the benefits of a 3D virtual store over the regular 2D website is that you can interact with salesfolk and other customers, and get great feedback. However, in every store I visited (and I visited over 80 in total, not just the web.alive-based stores) I never saw another soul. It was as much a ghost-town experience as I usually encounter in my visits to Second Life or the OS Grid. Could it be that sales staff only keep US office hours?

I also found, unlike 2D websites, I could not have more than one instance of web.alive open at the same time, for comparisons. When I tried, I got this:

Only one instance is permitted
Judging by the activity on the web.alive forum, where there are just 20 posts across all the topics to date (and the earliest post I found was dated December 16th, 2008), it does appear that web.alive is not generating the kind of interest its designers had hoped for.

If the 3D Web is ever going to supplant the 2D Web, it needs to address some key issues:
  • 3D web pages need to load as quickly as 2D web pages
  • 3D web pages must be capable of being multi-instanced
  • The ability to search across products in a 3D store must be at least as easy as in a 2D store
  • The ability to teleport instantly to any product in the search results is a must
  • Companies must think about manning their virtual stores 24/7 to cater for all time zones
I am still not convinced by the 3D web, and my lonely journey across it seems to confirm that I am not the only sceptic out there.

Thursday 8 April 2010

Vivaty: RIP

Continuing the recent trend of failing Virtual Worlds (e.g. Metaplace, Legend City Online, There.com), yet another company has bitten the virtual dust.

Vivaty have recently announced that they are closing their doors with a Shutdown Party on the 16th April.

Vivaty was yet another 3D Virtual Space spawned during the golden age of 2007-2008, whose creators firmly believed in the 3D space concept: people could inhabit 3D virtual rooms they could decorate, and invite friends in to join them for chat and other activities. Unlike the market leader, Second Life, Vivaty was web-based, and its scenes could be embedded in Facebook, blogs and other websites. The company hoped to make money by selling Vivabux, the virtual currency that users could use to buy clothes for their avatars, furniture for their virtual homes, and other virtual goods and services. However, in a very frank statement, Jay Weber, Co-founder and Chief Technical Officer of Vivaty, said:

Our business model was to earn money through Vivabux sales, but that has never come close to covering our costs. We tried for months to find a bigger partner that would support the site, but that didn’t work out.

Whether this is simply a consolidation of an over-populated market place, where just a few key players will survive, or indicative of a general decline in the market itself, is still to be seen.

Wednesday 7 April 2010

Blue Mars Roadmap: Second Stop

Today saw the launch of another significant Blue Mars release, with important updates to the Client, MyPage, and the SDK, although I will confine this article to just the Client and MyPage.

As there were so many changes to the Client the new release was a full release, Version 0.0.8237.0, rather than a patch. So, what's new?

Well, there are no new cities in this release, but there are some changes to existing cities. The Welcome Area gets better Time of Day, so it is not as dark now (though cloud shadows still drift across the land even when there are no clouds anywhere near the sun), plus some bug fixes; and Beach City has a "recent golf games" board in the Golf shop.

The main changes are these:

Private Messaging:

  • Click on a friend's name in your Friends list to open a private chat with them.
  • Currently has full logging enabled for PM.  Your chat text will be saved in the window even when you exit the Blue Mars client.
  • Handles multiple tabs of private messages which can be closed separately.
  • Click the minimize button on chat window to close window.
  • Note: The Friends list will be scrollable in the next release of the Client.
There is an immediate bug with this new functionality: if someone sends you a PM, you get no notification, so you have no prompt to respond. It is a little like having a phone with no ring function (how do you answer it if it does not ring?). I then checked MyPage to see if some test PMs sent by Friends inworld showed up in the Message Inbox. They didn't.

Unfortunately, the Friends List still has the bug that if you have a lot of friends, only around 22 of them can be displayed, as this next image shows:

so if you want to PM someone whose name is not among the first 22 displayed, you cannot (except through your MyPage). This needs fixing ASAP.

  • Mouse wheel can now be used to control the camera distance.
  • The Camera View Change button in the Menu Tray has changed. 
  • Screenshot facility (undocumented; see note two paragraphs below)
It did seem to me that the mouse wheel did not change the camera distance smoothly. Previously, if you kept clicking on the Camera icon you cycled from 10m behind the avatar, to 5m behind, to 1m behind, to First Person View. The mouse wheel now seems merely to replicate these four settings, with no interpolation between them, giving a jerky feel to the 'zoom'. I hope this is remedied in the future, and that the range is extended to much further than 10m behind the avatar. In other Virtual Worlds, such as Second Life, you can zoom the camera out as far as the Draw Distance, which can be set quite high. I usually have mine set at 512m, so zooming out gives a broad perspective of a region or City. It should be even higher for Blue Mars, given the huge sizes of Cities compared with the regions of Second Life: at least 1km, I would recommend.
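To illustrate the difference, here is a minimal Python sketch (purely hypothetical, not code from the Blue Mars client) contrasting the four fixed camera steps with the kind of per-frame eased zoom I am hoping for:

```python
def discrete_zoom(step: int) -> float:
    """Snap to one of the four fixed camera distances (metres behind the avatar)."""
    distances = [10.0, 5.0, 1.0, 0.0]   # 0.0 = First Person View
    return distances[max(0, min(step, len(distances) - 1))]

def smooth_zoom(current: float, target: float, dt: float, speed: float = 5.0) -> float:
    """Ease the camera distance a little toward the target each frame,
    giving a continuous zoom instead of a jump between fixed settings."""
    return current + (target - current) * min(1.0, speed * dt)

# Scrolling the wheel from 10m toward 1m: sixty frames of easing at
# 60 fps ends up essentially at the target, with no visible snapping.
distance = 10.0
for _ in range(60):
    distance = smooth_zoom(distance, 1.0, dt=1 / 60)
```

The constants here (step distances, easing speed) are my own assumptions; the point is simply that interpolating between the wheel's target distances would remove the jerkiness.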

I also do not like the use of a movie-camera icon for the camera, with the previous camera icon now reserved for a future Screenshot facility. In fact, I have found that the Screenshot facility is already working: just press F12 while the main window has the focus, then navigate to the My Documents\My Games\BlueMars\Screenshots folder, and your screenshot will be there. I would recommend the standard magnifying-glass icons for the camera zoom functions, and the camera icon for screenshots.

The Preferences (brought up by clicking the Spanner icon, or by pressing Escape when the main window has the focus) has a new look, and more settings:

Screen Resolution: Change window size. Default is 1280 x 720.
Rendering Quality: Change render quality.
Set to High for better quality, set to Low for better performance.
Display Name: On: Display your avatar's name overhead.
Off: Hide your avatar's name overhead (but others will still see it).

Pointer Click Sound: On: Play a sound when you click the ground.
Off: Do not play a sound when you click the ground.

Show Bubbles: On: Show chat bubbles.
Off: Turn off chat bubbles.
Text Size: small: 50%, medium: Default, large: 200%
Max Distance: Distance that the other avatars' chat bubbles will be visible.
0m: Only your avatar's chat bubbles are visible.

One bug already found with switching chat bubbles off is that it also switches off the display of avatar names above the heads of nearby avatars. The QA team are aware of this and are working on a fix.


  • All Pages
    • Added proper labels to each page on browser header and tabs
    • Loading icon now appears center.
  • Message Management (Message Page)
    • Fixed issue when adding a large number of friends when composing a message. Should now be contained in a scrollable text area.
    • Added counter for each message folder (inbox, sent, trash).
  • Friend Management (Friends Page)
    • Added background coloring
    • Added search filtering when searching for a friend on Blue Mars
    • Able to compose a message through the Friend's list
    • Scrolling through the friend's list is now smoother (only when using mouse).
I found that searching for Friends had several bugs. When I searched for the AR staffer, Summer Studios, using 'Summer' in the avatar firstname box, I got several hits with avatars using 'Summer' as their firstname, or as part of it, but I also got several hits for avatars with no sign of 'Summer' anywhere in their names. I then searched for Zoomer, who is on my Friends List, and the search for 'Zoomer' produced zero results!

But hey, it is still in beta, so these bugs are what they are expecting us to report on.

Tuesday 30 March 2010

Diary: 30th March 2010

OpenSim preparing Version 0.7
The core developers at OpenSim are busy preparing Release 0.7 of their opensource virtual world framework.

Release 0.7 will be the first to feature the recent major refactoring and re-architecting work that replaced the resource services and servers previously known as UGAIM with a single server shell, called ROBUST, which can run any combination of services within it.
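For the curious, a ROBUST instance is driven by a single ini file in which each service connector is listed against the port it should serve on. The fragment below is purely illustrative (the connector names and port are my assumptions from memory; check the OpenSim wiki for the definitive syntax and connector list):

```ini
; Robust.ini (illustrative fragment, not a complete configuration)
[Startup]
    ; Each entry is "<port>/<assembly>:<connector class>" - any
    ; combination of grid services can be loaded into one ROBUST shell.
    ServiceConnectors = "8003/OpenSim.Server.Handlers.dll:AssetServiceConnector,8003/OpenSim.Server.Handlers.dll:InventoryServiceInConnector,8003/OpenSim.Server.Handlers.dll:GridServiceConnector"

[Network]
    port = 8003
```

The appeal is that a small grid can run everything in one process, while a large grid can split the same connectors across several ROBUST instances on different machines.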

The planning for Release 0.7 has been posted on the OpenSim wiki site. Speaking of sites, OpenSim have also given us a sneak peek at their new redesigned website, which is a huge improvement, in my opinion.

More SL Creators are Testing the Blue Mars Waters
I see that Mako Magellan, that "Purveyor of apparel for princes and paupers, princesses and premises, poseurs and parcels" of Second Life fame, is to open a store in Desmond Shang's Caledonia in Blue Mars, after trialling some of his creations in Beach City. The problem of where to get my tux for some of those more formal Blue Mars occasions is now solved :)

News from Myst Online
Myst Online: Uru Live is a massively multiplayer online game (MMOG) unlike anything else. Instead of the repetitive kill/take/buy gameplay of other MMOGs, the very essence of Myst Online is to explore vast, fantastic worlds, savoring and uncovering new areas and new information at every turn. It is an amazing hybrid of MMOG and Virtual World.

MO:ULagain is the currently available reincarnation of everything in MO:UL, on a free, donation-supported server run by Cyan. It is the first step in "opening" MO:UL. Cyan is planning that the client, servers and tools of MO:UL will soon become open source, allowing fans to continue developing the game and its universe.