It’s been a whirlwind week here. Craig, Jamie Kwan and I attended the Heritage Toronto Gala Tuesday night to roll out the first public viewing of Longhouse 3.x. Jamie, my graduate research assistant this year in the Master of Digital Media program here at Ryerson University, used his architectural training to help visualize the modern interpretation of a 3D longhouse in Longhouse 2.5. It proved to be a stellar night, full of surprises from a research perspective. I want to thank Heritage Toronto for the opportunity to present our work, and a special thanks to Claire van Nierop and Ron Williamson from ASI for inviting us to be part of their presentation.
Due to some last-minute difficulties we had running the Oculus Rift DK2 on our Alienware laptop, we switched one station to a monitor setup with Xbox 360 controllers, while Craig used his HP laptop and Oculus Rift DK2 for our virtual reality experience. Both interaction platforms were well received, but the Rift was clearly the favourite among the 30-plus people who participated.
We had a wide range of ages, genders, and heritage professionals and enthusiasts try the VR experience. A non-scientific observation was that our female participants spent a considerable amount of time within the environment, experiencing and observing all of the aspects of the reimagined longhouse, while our male participants usually donned the VR headset for its “cool” factor and then ran around quickly without taking the time to notice all of the elements within the environment. As we had older guests, and didn’t know who among our potential visitors might have ocular issues when putting on the headset, we chose a seated position to ensure some stability for those who might encounter balance problems. Headphones focused the hearing into the virtual space, with a mix of forest, water, animal and burning-fire sounds depending on where you were. The controller moved the individual forward or backwards, with head movement dealing primarily with where you looked in VR space. As one visitor observed, the Oculus Rift DK2 naturally allowed the heritage professionals to look up and around, as they would normally do. One feature we didn’t have was a crouch command to allow people to inspect objects on the ground or below standing height within the gaming environment.
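The locomotion scheme described above (controller drives forward/back movement, head orientation decides the direction, plus the crouch command we still need) can be sketched in a few lines. This is a language-neutral Python illustration, not our actual Unity code; the eye heights, movement speed, and the `step` function itself are assumptions made up for the example.

```python
import math

EYE_HEIGHT = 1.6     # seated eye height in metres (assumed value)
CROUCH_HEIGHT = 0.9  # lowered eye height for the hypothetical crouch command
SPEED = 1.5          # movement speed in metres per second (assumed value)

def step(pos, yaw_deg, axis, crouched, dt):
    """Advance the viewer one frame.

    `axis` is the controller stick value in [-1, 1]; positive moves
    forward along the gaze direction. Only the head's yaw is used for
    steering, so looking down does not drive you into the floor.
    `crouched` lowers the eye height to let users inspect objects
    near the ground."""
    yaw = math.radians(yaw_deg)
    dx = math.sin(yaw) * axis * SPEED * dt
    dz = math.cos(yaw) * axis * SPEED * dt
    y = CROUCH_HEIGHT if crouched else EYE_HEIGHT
    return (pos[0] + dx, y, pos[2] + dz)
```

Decoupling steering (yaw only) from full head orientation is one common way to keep seated VR movement comfortable while still letting visitors look up and around freely.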
The video loop above is our latest test of the longhouse within Unity 5. Staging the visualization with everyday domestic items such as food and cooking utensils prompted further discussion on the potential placement and use of those items within the space. Additional constraints involved the light and how it would affect shadows and highlights within what would really be a dark environment. Lastly, Craig had added smoke from all of the fires, but we soon discovered that it filled the entire space, especially at the 4–5 ft level, with a dense fog that made it difficult to see the details in the models. We plan to provide smoke and non-smoke versions shortly to demonstrate what it would be like, which would likely be very unpleasant to function in during the long winter months.
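The way dense smoke swallows detail can be quantified with simple Beer–Lambert attenuation, the same model most real-time fog effects approximate. This is a back-of-envelope sketch, not the Unity particle system Craig used; the density value in the example is an assumption.

```python
import math

def transmittance(density, distance):
    """Beer-Lambert attenuation: the fraction of light that survives
    `distance` metres of smoke with extinction coefficient `density`
    (per metre). 1.0 means perfectly clear, 0.0 means opaque."""
    return math.exp(-density * distance)

# With an assumed extinction coefficient of 0.5/m at head height,
# an object 5 m away keeps only exp(-2.5), roughly 8% of its
# contrast, which is why the model details disappeared.
```

Even a modest per-metre density compounds quickly over the length of a longhouse, which matches what we saw: nearby objects were legible, anything further down the central corridor was lost.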
We added items such as cooking tools, pots and bowls (even with liquid in some… boiling to come later), but the placement is entirely conjectural and somewhat random. We can easily change positions, and hopefully in the next couple of iterations we will be able to pick up objects and move them elsewhere. Craig did a wonderful job replicating the bowls and spoons, and we reused previously modelled Iroquoian ceramics from the Sustainable Archaeology test in Longhouse 2.2, although we did have to vastly simplify the students’ models for the gaming environment.
One of the major issues we encountered was the complexity and detail we had been adding to the environment. A lot of thought has gone into every element, and along the way we have tried to optimize the digital assets so that real-time play would not be compromised, but it was clear from the test at Heritage Toronto that some creative “faking” will be needed to keep the frame rate up. This faking would mean using texture maps instead of geometry for things such as bark cordage and rope, relying more on pre-rendered complex images, and greatly reducing the polygon count of each object in the scene.
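One standard way to "greatly reduce the polygon count" is vertex-clustering simplification: snap vertices to a coarse grid, merge the ones that land in the same cell, and discard triangles that collapse. The sketch below is a minimal pure-Python illustration of that idea, assuming a hypothetical `cell` size; it is one possible approach, not necessarily the decimation method we will use on the longhouse assets.

```python
def cluster_decimate(vertices, triangles, cell=0.25):
    """Vertex-clustering mesh simplification (illustrative sketch).

    `vertices` is a list of (x, y, z) tuples; `triangles` is a list of
    (i, j, k) index triples. Each vertex is snapped to a grid cell
    `cell` units wide; triangles whose three corners fall into fewer
    than three distinct cells become degenerate and are dropped."""
    def key(v):
        return tuple(int(c // cell) for c in v)

    # Keep one representative vertex per occupied grid cell.
    reps = {}
    for v in vertices:
        reps.setdefault(key(v), v)
    index = {k: i for i, k in enumerate(reps)}

    new_verts = list(reps.values())
    new_tris = []
    for a, b, c in triangles:
        ka, kb, kc = key(vertices[a]), key(vertices[b]), key(vertices[c])
        if len({ka, kb, kc}) == 3:  # degenerate triangles vanish
            new_tris.append((index[ka], index[kb], index[kc]))
    return new_verts, new_tris
```

The trade-off is the one the post describes: a larger cell size collapses more geometry and runs faster in the engine, but fine detail like carved bowl rims is exactly what disappears first.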
Another observation concerned the outside bark shingles. They look bright and new, whereas in reality vast amounts of moss and other errant plant material would be growing on the sides, edges and tops of the longhouse. Some rot would have taken hold in the shingles themselves, and I suspect there would be discolouration due to weathering. We still need to add the exterior exoskeleton, which helps to stabilize and support the shingles.
This test marks a major stage in the research. We are fairly close to the final product and will likely spend the next month or so cleaning up the assets, increasing the speed of the virtual interaction, and hopefully adding the ability, at least in this version, for users to pick up objects and interact with the environment more substantially. As an artist, I crave hyperreal, fully rendered images and sequences, but practically, to allow as many people as possible to engage with the research, a gaming engine is needed, and so that hyperreal look becomes more stylized.
I would encourage our weekly readers to post comments or send questions through email. This is how we learn about new theories, methods and perspectives, which only strengthens the project’s goals. Take a spin through the rendered gaming sequence and feel free to comment!
If you are in Midland, Ontario this weekend, don’t forget to attend the Ontario Archaeological Society’s Symposium – Circles of Interaction: The Wendat and their Neighbours in the Time of Champlain!