NotebookLM – on Firestorm VR Mod
NotebookLM by Google (https://notebooklm.google.com/) is an AI-powered research and writing assistant that works with information sources that you upload or provide URLs for.
A sample NotebookLM with sources relevant to Firestorm VR Mod is at https://notebooklm.google.com/notebook/8b56ea41-fb98-4c9d-b6e9-31c5ba6ee2f0
You can listen to an audio, podcast-style chat between two people discussing the project, completely AI generated. Click the “Notebook Guide” blue link at the bottom right and you should see an Audio Deep Dive widget box in the upper right of the UI dialog, which should have a “Load” button to load a previously generated audio result [Local Copy]…
NotebookLM can be used to generate things like FAQs, contents lists, summaries, etc…
Firestorm VR Mod
Firestorm VR Mod is a version of the popular Firestorm Viewer for Second Life and OpenSimulator with modifications to provide VR capabilities for VR Head Mounted Displays (HMDs) via SteamVR. This is experimental. You can find the latest version of Firestorm VR Mod at https://github.com/humbletim/firestorm-gha/releases
Blog posts to describe the use of some versions of Firestorm VR Mod are available…
- 7.1.10 – with WebRTC support coming soon.
- 7.1.9 – https://blog.inf.ed.ac.uk/atate/2024/06/24/firestorm-vr-mod-7-1-9/
NASA Lunar Gateway in Second Life
On 15-Aug-2024 NASA 3D Resources released a 3D model of the Lunar Gateway in the glTF (.glb) format along with a near Moon environment surround. See https://nasa3d.arc.nasa.gov/detail/gateway
The glTF model can be loaded into Second Life by rezzing a cube, scaling it to 1m x 1m x 1m, rotating it X = 90°, and then, while it is selected, using Develop -> GLTF -> Open on the .glb model. You may need to move the model high in the sky (say above 1,000m) so that the ground and sea are not visible.
Creating the HDRI Environment
Take the “Low Lunar Orbit.jpg” image provided and, in a suitable graphics program (e.g. GIMP), export it as a .exr file, which provides an HDRI environment. This can currently be loaded into the Second Life viewer as a replacement environment with Develop -> Render Tests -> HDRI Preview.
HG Safari Blog – Ai Austin Interview
This post collects the information provided as the basis for a blog post by Thirza Ember on the HG Safari Blog. An interview on the OSGrid regions of RuthAndRoth, Black Rock and Space City was held on 21-Aug-2024 to add material.
https://hgsafari.blogspot.com/2024/08/austins-eye.html
Thirza Ember 16-Aug-2024: I am working on a few general interest stories on the HG Safari Blog, and I wondered if you would be willing to do an interview? It would be about your Opensim career principally, how you got into Opensim, what your main interests are etc – obviously with all the appropriate links to your blogs and so on. If you’d be kind enough to provide some bio info in here too, so I get accurate and up to date info.
Ai Austin:
Virtual worlds/OpenSim focused bio…
Austin Tate (avatar: Ai Austin – pronounced “eye”) 16-Aug-2024
In Real Life I am an educator and researcher in artificial intelligence and robotics, with a special interest in distributed collaboration and teamwork. I was Director of the AI Applications Institute (AIAI) at the University of Edinburgh and Coordinator for the Virtual University of Edinburgh (Vue). I am a Fellow of the Royal Academy of Engineering (FREng) and Fellow of the Royal Society of Edinburgh (FRSE, Scotland’s National Academy). I am now retired and Emeritus Professor at the University of Edinburgh and remain an Honorary Staff Member of the University. I am an open source developer and advocate of open educational resources and through that continue to play an active part in developments in my fields of interest.
https://www.aiai.ed.ac.uk/~bat/
As with many potentially useful educational and collaborative technologies, I had initial explorations with multi-user persistent virtual spaces going right back to the MUD/MOO days of the 1970s. As these environments became more graphically orientated, several groups at the University of Edinburgh, including my group, were using professional and hobby related virtual worlds in the early 2000s such as Forterra OLIVE, There, etc. Second Life was also being explored in its early days, around 2004. Second Life became more useful and more widely publicized in 2006 and several groups came together soon afterwards to form the Virtual University of Edinburgh (Vue) to coordinate our efforts in using these environments for teaching, research and outreach.
https://blog.inf.ed.ac.uk/atate/2011/10/22/a-brief-history-of-virtual-collaboration/
https://blog.inf.ed.ac.uk/atate/2018/04/27/virtual-worlds-technology-for-university-of-edinburgh/
https://vue.ed.ac.uk
I have had a presence in Second Life since then and in OpenSim (as both user and grid manager) since July 2007. We also make use of a wide range of platforms beyond Second Life and OpenSim. I frequently make blog posts about experiments and uses of virtual worlds tools and techniques.
https://blog.inf.ed.ac.uk/atate/
Mirror at https://aiaustin.wordpress.com/
For professional use and meetings, my avatar usually wears a green flight suit which has been with me since I first set up in Second Life. I licenced the use of the textures from the creator back then. When exploring in virtual worlds I usually have an outfit that reminds me of Strider in Lord of the Rings, though the sword is Orcrist from the Hobbit. That reminds me of a visit to Weta Workshop in New Zealand where we handled the original filming prop and I have a nicely crafted replica of that.
https://blog.inf.ed.ac.uk/atate/2019/09/02/ai-austin-mk-2/
https://blog.inf.ed.ac.uk/atate/2018/04/12/orcrist/
Vue, Openvue and OpenVCE
The Vue regions in Second Life were made available from 28th May 2007. A timeline of events is available at http://vue.ed.ac.uk/ and over the years a number of departments and units, as well as externally funded projects, contributed to the server costs for 12 years. An in-house hosted copy of the regions has been provided on OpenSim (Openvue). Security concerns mean the in-house version can only be made available within the University firewall, but an openly accessible version is still hosted on OSGrid for continuity.
Other training and simulation related project regions (such as the RGU Oil Rig for immersive training of offshore workers) are also kept on OSGrid. The OARs for the Openvue regions are available as open educational resources. Our Open Virtual Collaboration Environment (OpenVCE) region created on a project funded by the US Army Research Lab's Human Research and Engineering Directorate (HRED) as an open source resource (OAR) also continues to be available as a basis for a facility to support synchronous and asynchronous collaboration in many forms.
https://blog.inf.ed.ac.uk/atate/2017/11/30/vue-openvue/
https://blog.inf.ed.ac.uk/atate/2021/07/19/open-educational-resources-vue-and-openvce/
https://blog.inf.ed.ac.uk/atate/2022/02/27/openvce-for-opensim-2022/
https://blog.inf.ed.ac.uk/atate/2017/06/07/virtual-oil-rig-enhancing-higher-education/
I-Room
One of our projects related to virtual worlds collaboration and meeting support is the I-Room – a virtual space for intelligent interaction. We used this on a wide range of projects and for experimentation.
https://blog.inf.ed.ac.uk/atate/2011/09/15/i-room-a-virtual-space-for-intelligent-interaction/
AI Planning MOOC
Our virtual world spaces were also used to give briefings and tutorial support for our Massive Open Online Course (MOOC) in AI Planning. Over its three runs on the Coursera platform it reached 115,000 students. The resources continue to be available on our media servers, YouTube and as Open Educational Resources. The MOOC materials are also used as the basis for graduate level studies by other Universities.
https://blog.inf.ed.ac.uk/atate/2016/06/15/ai-planning-mooc-interview/
Open Source Projects, OAR Converter, Ruth2/Roth2 Mesh Avatars
I am an open source software developer and contribute, mostly via testing and the occasional code contribution, to a range of projects including OpenSim and the Second Life and Firestorm viewers. I am especially involved with the Firestorm VR Mod variant. I helped Fumi Iseki and his team in Japan create the OAR Converter to take content from OpenSim into Unity and other 3D modellers. I am one of the core team behind the Ruth2 and Roth2 open source mesh avatars.
https://blog.inf.ed.ac.uk/atate/?s=Firestorm+VR+Mod
https://blog.inf.ed.ac.uk/atate/2024/06/10/oar-converter/
https://blog.inf.ed.ac.uk/atate/2020/08/30/ruth2-v4/
https://blog.inf.ed.ac.uk/atate/2020/05/24/roth2-v2/
HG Safari Blog Questions
Thirza Ember:
I know you brought Vue to Second back in about 2007, and then into Opensim about three years later – can you tell me something about how the virtual university concept came about? What do /did you expect the students to get out of it? Can you also talk about the students who participate in Vue – what disciplines they come from, beyond IT and business studies? How other faculty have responded to the concept?
Ai Austin:
We had our first Vue region in Second Life in May 2007 and had about 10 regions a year or two later, so a mini-continent. We maintained paid-for regions on Second Life through to March 2019. The first OpenSim regions were started in July 2007, so following on quickly.
Timeline at https://vue.ed.ac.uk/ and https://vue.ed.ac.uk/openvue/
There were perhaps a dozen units and departments and many groups inside those that were actively involved in using Second Life and virtual worlds for some aspect of their teaching, research and outreach. Central units like the Library, Alumni, Disability Office and Corporate Services were involved as well as academic departments. Online graduations took place as mixed reality events linked to the main physical graduation hall. A list of those involved and some of the uses is available in a presentation made for University management and handouts linked from https://vue.ed.ac.uk/ e.g.
https://vue.ed.ac.uk/resources/presentations/vue-overview.pdf
Thirza Ember:
The move to Opensim is linked, perhaps I’m right in saying, to the collapse of SL’s support of academic sims. Do you remember the transition as traumatic? Did it involve a big learning curve, or were you already deep into the code, and interested in bringing Vue to a place with less ready-made commercial feel and more options for experimentation? Can you remember the reaction of the students to the change? What have been the advantages and disadvantages of the change in your view, from both the techie side of things, and the end user’s perspective?
Ai Austin:
No, nothing traumatic at all. Though the costs of the Second Life regions were always a funding issue and we had to work hard to maintain interest in putting so much money into the facilities. We were already using multiple virtual worlds like There and Forterra OLIVE (for professional simulation tasks) and we assumed there would be development and potentially some platforms would become unavailable or move in inappropriate directions for a University. We made sure assets could be re-used in multiple contexts. We looked at self-hosted Second Life using what was called “Second Life Enterprise” at the time, but that concept was not continued by Linden Lab. In July 2007 we started experimenting with OpenSim as a self-hosted capability as that was attractive in a University context, and we had “Openvue” as a shared facility from 2008. I would say though that Second Life was easier for staff and students than trying to set up and use OpenSim.
https://blog.inf.ed.ac.uk/atate/2017/11/30/vue-openvue/
https://blog.inf.ed.ac.uk/atate/2018/04/27/virtual-worlds-technology-for-university-of-edinburgh/
Thirza Ember:
Of course, your work is not just Vue but also open AI. You have a long and really fascinating history of blogging Opensim and SL and I guess ‘the viewers’ to give it a generic name. Your blogging work has proved to be a valuable resource both for getting news, understanding developments, and charting the progress of this platform. Can you talk a bit about how you got started in blogging, what your motivation is to do it, and the rewards and responsibilities, if that’s not too grand a term, are as time had gone by ? Are there other Opensim-related blogs or threads that you read, and can recommend?
Ai Austin:
As for blogging, I got interested in that form of shared communication back in 2009/2010. I was doing an MSc in e-Learning taught via distance education methods with our School of Education. I did that to understand the methods and tools better as I became responsible for the early development of the School of Informatics distance education and later MOOC programmes. The MSc showed a range of platforms and tools and amongst them we used a range of blogging and micro-blogging platforms. So when our School and the University provided the WordPress blog platform for staff and students I took to that right away.
Since then I use it to record many experiments, capture screenshots, describe useful tips and resources, etc. It’s another way to share, get feedback and learn from others. I use Twitter/X as one way to get pointers to relevant and interesting content. I often use a blog post I have read as a prompt to go and explore some tool or virtual world location. I make “Resources” posts in my blog to point at web sites, download links for tools and recall links to other relevant blog posts.
Back in 2000-2002 I directed a large project involving 30 organisations spread over 4 countries (CoAX) and we shared knowledge and assets between those involved. We created a simple tool to allow for push notification of relevant information to subscribers of certain parts of the content, with status information encoded in the notifications. It was a sort of Twitter system (without the scaling that involved). We switched to using Twitter with structured syntax and URL content in tweets soon after it became available in 2006. I still use Twitter/X that way, with structured information in some entries.
I have always considered open sharing of educational resources important. Our AI planning systems were right from the start (back in the 1970s) licenced in a way that allowed for re-use and sharing. That’s a way to build a community of interest, get feedback and extend capabilities in a way that can benefit all.
inZOI – Resources
inZOI is a life simulation game created by Korean developer KRAFTON where you can visualize stories of life with immersive, realistic graphics and deep, detailed simulation that creates unexpected occurrences.
inZOI on Twitter/X – @PlayinZOI
inZOI Web – https://playinzoi.com/en
Canvas
Canvas provides a platform to have experiences using characters (called ZOIs) created in the Character Creator or chosen from characters already available and created by others…
Mirrors in OpenSim
Mirrors are supported in OpenSim 0.9.3.0 dev master.
Ensure you are using a viewer and on a region that supports mirrors and that the debug setting RenderMirrors = TRUE which it is by default in most viewers.
- Rez the object to have a face which will be your mirror. Size and rotation does not matter.
- Make the face to be a mirror have a shiny face… e.g. use a reflective PBR material or blinn-phong with a simple blank texture, make Specular have glossiness factor 255, environment 255, and colour tint black.
- Rez a Box, size does not matter. rotate Z+ away from face to be the mirror.
- Make that box be a Mirror (Everything) probe… sphere or box type does not matter.
- Mirror Reflection Probe goes transparent (see it with ctrl+alt+t). But use Build -> Options -> Select Reflection Probes to be able to reselect it for editing.
- Move it into the reflective surface’s plane. The probe centre line should be just above the mirror’s surface.
- Resize the Mirror Reflection Probe as needed to cover the face of the Mirror and to adjust any edge effects.
The quality and update frequency of the Mirror can be adjusted in Preferences -> Graphics (-> Advanced settings).
Tutorials and Videos of How to Create Mirrors in Second Life
Tutorial: Creating a Simple (Prim) Mirror in Second Life – Inara Pey – Blog Post 12-Jun-2024.
Firestorm SgeoMinVR 7.1.9
This blog post provides links and resources related to Firestorm SgeoMinVR which is still undergoing testing and revision. The current standard Firestorm VR Mod can be found via this blog post.
@Sgeo on Discord created a variant of Peter Kappler’s VR Mod code which sets the VR settings automatically for the various VR HMDs rather than needing the FN keys to set them up. Hence the F5/F6/F7 settings and adjustments are removed (no vrconfig.ini is saved) compared to the standard Firestorm VR Mod. F3 provides enhanced VR HMD related information. Sgeo calls this his “minimal” VR build, close to P373R except for not needing settings to be set.
Download (via Discord P373R channel)
https://discord.com/channels/613119193734316042/613119193734316044/1260076127003869205
File: Phoenix-FirestormOS-SgeoMinVR-7-1-9-74748_Setup.exe (166.45 MB)
Caveats
- No sound at present (a later GHA automated build will hopefully fix this).
- This release needs openvr_api.dll added to the install directory (the GHA automated build will hopefully fix this). A copy can be taken from the usual Firestorm VR Mod release.
- The FOV defaults to small. Ctrl-8 should adjust that; you might need the Advanced menu open.
- The 3D view is distorted when the mouse isn’t in the middle.
Approach
Peter Kapplers VR Mod (P373R) code contains a line:
glBlitFramebuffer(bx, by, tx, ty, m_iTextureShift, 0, m_nRenderWidth + m_iTextureShift, m_nRenderHeight, GL_COLOR_BUFFER_BIT, GL_LINEAR);
Sgeo replaced that with
glBlitFramebuffer(bx, by, tx, ty, m_nRenderWidth * uMin, m_nRenderHeight * vMin, m_nRenderWidth * uMax, m_nRenderHeight * vMax, GL_COLOR_BUFFER_BIT, GL_LINEAR);
So instead of a horizontal texture shift, it’s adjusted and scaled by uMin/uMax and vMin/vMax, which are automatically calculated based on SL’s FOV and the VR projection matrix.
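As a rough illustration of where such values can come from: OpenVR exposes each eye’s raw frustum tangents via IVRSystem::GetProjectionRaw(), and mapping the viewer’s symmetric frustum onto that asymmetric one yields normalised texture bounds. The sketch below (the EyeBounds struct and computeEyeBounds name are illustrative) shows the general idea only and is not Sgeo’s actual code, which lives in his llviewerVR.cpp.

// Sketch only: derive normalised per-eye texture bounds from OpenVR's raw
// projection tangents and the viewer's tan(half-FOV). Requires the OpenVR SDK.
#include <openvr.h>

struct EyeBounds { float uMin, uMax, vMin, vMax; };

EyeBounds computeEyeBounds(vr::IVRSystem* hmd, vr::EVREye eye, float tanHalfFov)
{
    float left, right, top, bottom;                  // frustum edge tangents
    hmd->GetProjectionRaw(eye, &left, &right, &top, &bottom);

    // Map the symmetric range [-tanHalfFov, +tanHalfFov] onto [0, 1] texture
    // space; the HMD's asymmetric frustum then selects a sub-window of it.
    EyeBounds b;
    b.uMin = 0.5f + 0.5f * left   / tanHalfFov;
    b.uMax = 0.5f + 0.5f * right  / tanHalfFov;
    b.vMin = 0.5f + 0.5f * top    / tanHalfFov;
    b.vMax = 0.5f + 0.5f * bottom / tanHalfFov;
    return b;                                        // feeds the scaled glBlitFramebuffer call above
}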
At least one (small) change is made to llviewerdisplay.cpp: gVR.m_fEyeDistance to gVR.eyeDistance()
Source Code and Changed Files
https://github.com/Sgeo/p373r-sgeo-minimal/
https://github.com/Sgeo/p373r-sgeo-minimal/blob/sgeo_min_vr_7.1.9/indra/newview/llviewerVR.cpp
https://github.com/Sgeo/p373r-sgeo-minimal/blob/sgeo_min_vr_7.1.9/indra/newview/llviewerVR.h
https://github.com/Sgeo/p373r-sgeo-minimal/blob/sgeo_min_vr_7.1.9/indra/newview/llviewerdisplay.cpp
llviewerVR.cpp and llviewerVR.h should be fully replaced as expected, although Sgeo suggests that it would probably be a good idea to fix up the odd way they point to OpenVR. llviewerdisplay.cpp has one very small change from the P373R version.
Testing
Sgeo has only tested on a Quest 3 via Steam Link. It should work on other headsets and via other ways of using PC VR with a Quest.
Scalextric – Monster Trucks (C1024)
I acquired the Scalextric Monster Trucks Set No. C1024 courtesy of a nephew. This set was produced in the 1990s and uses old-style connection track, which can be joined up with recent Scalextric Sport track using a “Classic to Sport Converter Track”, so it adds nicely to the Scalextric Speed Hunters set I got earlier this year.
Roblox – Six Flags
Six Flags have joined with DigitalTwin/Coastercoin to create a theme park experience for Roblox.
- Six Flags Launches Groundbreaking Metaverse Experience on Roblox, Bridging Physical and Digital Worlds: A First in the Theme Park Industry – Businesswire Blog Post June 28, 2024
- https://www.pymnts.com/metaverse/2024/six-flags-launches-roblox-metaverse-experience-with-rewards-program/ – PYMNTS Metaverse Blog Post June 28, 2024
Fixing hop in Firestorm
I am a fan of the hop:// protocol for providing addresses in OpenSim. I like hop as it has a true URL format and can be linked on web pages. These have gradually got more useful as glitches have been fixed, including the recent change in Firestorm 7.1.9.74745 which addresses failures when a hop had multiple partial matches on a region name and an exact name match was required.
This blog post lists a few remaining issues along with the Firestorm JIRA and other links that relate to them. It would be nice to see these tidied up one day, though of course I am not expecting that all of them will be, or can be, fixed.
- Copy of the Address Bar contents for an OpenSim hop:// where a space exists in the region name does not URL-encode the address, as it (correctly) does with secondlife:// addresses. That means the copied item, when pasted, is not a valid URL and will not work in nearby chat or as a link outside the viewer. https://jira.firestormviewer.org/browse/FIRE-34020
- Paste of an address that does not have X/Y element does not link as clickable in Nearby Chat/IM. E.g. hop://hg.osgrid.org:80/RuthAndRoth/
- Map Tool Copy SLurl button does not give correct results, especially when on a grid other than the one you first logged into. It should always give the same address that shows when you click on and copy the top address bar. E.g., test between login.osgrid.org and cc.opensimulator.org:8002 in both directions.
- Clicking on a Hypergrid Landmark in Inventory and asking for “Copy SLurl” gives wrong results. Show on map also does not work.
- < and > chevron navigation buttons do not work correctly for Hypergrid.
- Negative Z values in a hop are not handled correctly. They are interpreted as Z = 0 in the address bar. In the chat tool the “-Z” value is not hyperlinked properly. Test at, for example, hop://hg.osgrid.org:80/Marineville/114/94/-19
Negative Z Issues Not Related to hop://
- Underwater blue haze effects work down to 0m but below that the lighting reverts to the same as above sea level. Test at hop://hg.osgrid.org:80/Marineville/114/94/+2 and then move down below Z = 0m.
- Objects initially placed below Z = 0m using the Build Tool, or dragged from inventory, jump to 0,0,0. The Build Tool does then work to move them to a negative Z okay. See https://jira.firestormviewer.org/browse/FIRE-33929
- The Terrain Tool only allows terrain to be lowered to 0m. Existing terrain saved as an OAR can be loaded at a negative offset with the load oar console option --displacement <0,0,-Z>.
Second Life Mobile
On 25-Jun-2024 the Second Life Mobile app, which has been under development using the Unity platform by Linden Lab for over a year, became available to Premium Second Life users as part of the Beta test rollout before opening the app to all users.
The mobile app is available for both Apple iOS and Android devices. For iOS, Linden Lab recommend a minimum of an iPhone X running iOS 16.6.1. For Android, they recommend a mid- to high-end device, comparable to Pixel 6 or later running Android 13 or later.
In my own tests, it does run on an older Apple iPad Mini 4 with iOS 15.8.2 (June-2024), but slowly.
Firestorm VR Mod 7.1.9
Download link: https://github.com/humbletim/firestorm-gha/releases/
Firestorm VR Mod 7.1.9.74745 is the first PBR (Physically Based Rendering) VR Mod viewer release.
Firestorm VR Mod is a version of the popular Firestorm Viewer for Second Life and OpenSimulator with modifications to provide VR capabilities for VR Head Mounted Displays (HMDs) via SteamVR. This is experimental. Firestorm VR Mod is now available from https://github.com/humbletim/firestorm-gha/releases
For Firestorm VR Mod community support use the Discord Discussion Channel:
P373R-WORKSHOP by p373r_kappler [Invite]
Firestorm 7.1.9 + P373R VR Mod: This build includes P373R’s VR Mod 6.3.3 changes merged into the Firestorm 7.1.9.74745 release branch. The VR Mod takes a minimalistic approach to inserting VR capabilities into the viewer so that the maintenance overhead is decreased and the potential longevity of the approach is increased. VR Controller support is not included.
The Firestorm VR Mod viewer (for Windows only) is available as an .exe “Setup” installer or as a .7z zipped file which can be unzipped to any directory and run from there without an install. If required, a free .7z unzip utility is available at https://www.7-zip.org/. Download the release for the latest version at https://github.com/humbletim/firestorm-gha/releases/
You need to install your usual VR Headset drivers and SteamVR. Firestorm VR Mod when run uses SteamVR which will launch any necessary VR headset specific underlying drivers.
Firestorm VR Mod is created using “GitHub Actions” (GHA) thanks to @humbletim and @thoys. Firestorm VR Mod version 7.* is still based on Peter Kappler’s VR Mod code changes (working since version 6.6.3 with only minor changes for the merge), but for PBR, Linden Lab (and hence core Firestorm) reworked the render buffer structures, which is one of the tight couplings into the core FS code base. The PBR viewer approach changed how the viewer finds the main screen information: the VR Mod initially relied on a direct path, but that path was moved after PBR. Where a developer would previously use “o.mScreen” they would now use “o.mRT->screen”. The GitHub Actions have been changed to automatically map the original VR Mod code to the new path, so the VR Mod code itself doesn’t need to be changed.
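As a conceptual illustration of that mapping (this is a self-contained sketch, not the actual GHA build script, and the gPipeline line in the example is representative viewer-style code rather than an exact diff):

// Sketch: the kind of textual rename the GHA build step applies to the
// VR Mod sources so pre-PBR screen references find the post-PBR location.
#include <iostream>
#include <string>

std::string mapToPBRPath(std::string line)
{
    const std::string from = ".mScreen";
    const std::string to   = ".mRT->screen";
    for (std::size_t pos = line.find(from); pos != std::string::npos;
         pos = line.find(from, pos + to.size()))
        line.replace(pos, from.size(), to);          // rewrite each occurrence in place
    return line;
}

int main()
{
    // A pre-PBR style reference from the VR Mod code is rewritten:
    std::cout << mapToPBRPath("gPipeline.mScreen.bindTarget();") << "\n";
    // Prints: gPipeline.mRT->screen.bindTarget();
}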
The build also uses the open source openal.dll audio library for sound and builds in the openvr_api.dll library for VR HMD connection.
Firestorm VR Mod has its own app and channel name, “FirestormVR”, and the install directory is changed to “FirestormOS-VR-GHA”, so the installation can exist side by side with stock Firestorm if desired. Settings and cache are shared with standard Firestorm. If you installed Firestorm VR Mod from an earlier version (up to 6.6.8) you can delete the now unused settings and cache directories: %APPDATA%\FirestormVR_x64 and %LOCALAPPDATA%\FirestormVROS_x64.
U S A G E
VR Mode instructions are available via prompts in the viewer or via information on https://gsgrid.de/firestorm-vr-mod/. In short…
- Press CTRL+TAB to load or unload the SteamVR driver. Do this each time you want to enter VR mode after starting up.
- Press TAB key to enable and disable VR mode.
- Press F5 to open the settings menu, you should see a text menu in the middle of the screen. The settings menu works only when VR mode is enabled.
- Press F6 to increase the selected value. Press F7 to decrease the selected value.
- Press F5 again to switch to the next menu entry.
- By pressing F5 on the last menu entry the menu will close and save the settings in the config file, which is located in
“C:\Users\your_user_name\AppData\Roaming\Firestorm_x64\vrconfig.ini”
and which can be edited directly. Pressing TAB for VR mode reloads the config file.
- Hold F3 to see some debug info (example here).
- Press F4 to disable and enable HMD’s direction changes. It may be better to disable the HMD’s direction interface when editing and flying with the camera. This may be subject to change in future versions.
- In the camera floater two buttons have been added to offset the HMD’s base rotation.
- Moving the mouse to the corners or the sides will shift the screen in that direction so menus can be accessed more easily.
For issues on some specific headsets you might wish to try the Firestorm VR Mod Discord Channel: P373R-WORKSHOP by p373r_kappler [Invite]. Peter Kappler also offers the following advice…
- Firestorm VR Mod works best while sitting and using mouse and keyboard.
- WindowsMixedReality users may need to press windows key + Y to unlock the mouse when the HMD is worn.
- If your VR hardware cannot maintain constant 90 FPS you could try enabling motion reprojection in your HMD. In WindowsMixedReality it can be done by uncommenting “motionReprojectionMode” : “auto”, in the config file located at “C:\Program Files (x86)\Steam\steamapps\common\MixedRealityVRDriver\resources\settings\default.vrsettings”. This will make the HMD interpolate between frames and create a smoother experience. Vive and Oculus should have similar functionalities which can be accessed from SteamVR settings. For Vive it is called Motion Smoothing.
S E T T I N G S
As usual, Ctrl+TAB initially sets up SteamVR (and HMD support as needed), TAB is used to toggle VR mode on or off, F5 lets you select and step through the various VR HMD or user specific settings for IPD, texture shift to register the left and right eye images, and focal distance to change depth perception, etc. F6/F7 are used to increment and decrement each setting selection.
Peter Kappler suggested the following process to establish suitable settings for your HMD:
- Set IPD to 0 (zero)
- Then adjust Texture Shift until image is sharp and focused
- Then adjust IPD which separates your cameras to left and right to get a good 3D effect
@Sgeo on Discord provided a tool to help in calculating the settings for Firestorm VR Mod, at least to give you a starting position to adjust to your taste…
https://sgeo.github.io/firestorm-vr-calculator/
Source of the calculator is at https://github.com/Sgeo/firestorm-vr-calculator
Hovertips
If you see a lot of hover tips showing under the mouse it could be that the debug setting “ShowHoverTips” is set to TRUE (the default) which may show something constantly under the mouse even for inert unscripted objects. You can turn that off via Debug Settings or via Preferences > User Interface > 3D World > Show Hover Tips. Via that same preferences panel, you might alternatively prefer to lengthen the delay before hover tips are activated.
Chat Bubbles
In VR Mode it may be useful to show local nearby chat in “bubbles” over each avatar’s head. This can be done via Preferences > Chat.
T R O U B L E S H O O T I N G
Misaligned VR Cursor
Note from Gaffe on Discord: Firestorm VR Mod’s VR cursor will have a small-to-extreme offset on Windows in particular if you are using Windows UI Scaling with any settings OTHER than 100%. To fix the VR cursor offset, set the Windows UI Scaling for your primary display to 100%.
Missing Menus and Buttons in Centred VR View
If you notice that the top menu bar and bottom and side button areas do not show in your headset when you are in VR Mode and your view is centred (i.e. mouse is positioned centrally in the 2D view) it may be that the VR settings you chose for Texture Shift and Texture Zoom need to be adjusted, or set to zero.
All Black HMD Display or Black Edges or Strips in HMD Display
An all black display in the VR HMD was an issue in earlier versions and is mostly resolved now… but it can still occur with some headsets. In case you encounter issues with a black HMD display… Peter Kappler suggests the following:
- Create a program-specific profile for the viewer in your graphic card settings and enable FXAA.
- Second Life only supports FXAA. Other types of Anti-aliasing can be disabled.
Firestorm VR Mod shifts the display in VR mode to an edge if the mouse or pointer is placed towards an edge or corner of the viewer window. This is to allow easier access to menus, user interface buttons and HUDs. It can be confusing though if you enter VR mode and find that part of the view is black. It is usually because the mouse is placed towards a corner or edge. Just move the mouse back to the centre of the screen and the full VR view should appear.
SteamVR Reset/Quit Screen Shows in HMD
When you switch to VR mode (after activating VR with Ctrl+TAB and using TAB), you may see a “Next Up… Firestorm” message or a SteamVR popup screen to “Reset the View” and “Quit SteamVR”. This has been observed to occur on the first run of a newly installed viewer. It can usually be dismissed with your controller if that is active, but if not the screen may continue to show the popup in VR mode in the HMD. Try another round of ctrl+TAB and TAB or if that does not work try stopping and restarting the viewer to clear this. These glitches may be more to do with legacy OpenVR + the latest SteamVR updates rather than Firestorm or VR Mod code changes.
If you have issues with some of the Function keys (F5 or other Firestorm VR Mod keys) not working… look to see if the F keys involved are mapped to active “Gestures”. You can find a list of the gestures you currently have active and the keys associated with them using the “Gestures” toolbar button… or the Comm -> Gestures menu item (Ctrl+G shortcut).
Adjust Over-the-Shoulder Camera View to Suit Yourself
Sometimes in VR mode the camera will be too high or too far back from the avatar. If so, press Esc a couple of times, then Shift+Esc a couple of times; Ctrl+9 also resets the camera to its default position. Shift + scroll mouse button moves the camera up and down. You can also set specific Debug Settings (Ctrl+Alt+Shift+S) for camera positions, e.g. “CameraOffsetRearView”. If that doesn’t help, see http://wiki.secondlife.com/wiki/Camera_Control
A D V I C E O N F R A M E R A T E
You do need to ensure you have a good frame rate to have a comfortable VR experience. The Firestorm VR Mod Viewer will not work well if the Second Life/OpenSim region you visit cannot normally be displayed in 2D with a decent frame rate. In VR mode you can assume you will get 50% or less of the frame rate that shows on the 2D normal screen. At low frame rates bad flickering or texture tearing will occur in VR mode. My suggestion is to look at the frame rate (in Firestorm it is displayed in the upper right hand corner of the viewer) and to adjust the graphics settings (especially draw distance, shadows and quality sliders) until you have around 100fps (and definitely more than 50fps) and then try VR.
You may need to disable “vsync” in Settings -> Graphics -> Hardware as if this is on (the default) the FPS is capped to the frame rate of your 2D monitor (often 60fps, meaning in VR you would get less than 30fps).
Firestorm includes an “Improve Graphics Speed” performance tool and facilities to autotune the FPS which may be helpful. See advice on FPS improvement and the new “Performance Floater” and “FPS Autotune” capabilities in Beq Janus’s Blog Post (21-Mar-2022).
To improve frame rate (FPS) you might opt to set shadows to “None”, water reflections to “None; opaque”, Mirrors to “Off” and use a reasonably low draw distance appropriate to the scene. Also close viewer UI windows and tools and detach any HUDs you are not actively using. The rendering of Linden Water, the water surface and its effects, can significantly reduce frame rates. Setting water reflections to “None; opaque” gives a big FPS boost whilst still leaving the water looking okay. In an extreme situation, and in an environment where it makes sense such as a meeting room, disabling Linden Water entirely can boost frame rates. Do that via Advanced -> Rendering Types -> Water. If the Advanced menu is not shown use Settings -> Advanced -> Show Advanced Menu or World -> Show More -> Advanced menu.
Peter Kappler also suggests: Particles… a fireplace is going to eat 20 to 30 fps! So turn them off for VR.
Tips from David Rowe for using the CtrlAltStudio VR Viewer (which is no longer maintained) may also be relevant:
- To improve your frame rate, reduce your draw distance and/or tweak other display settings such as advanced lighting model, shadows, FOV, pixel density, etc.
- Make sure you don’t have Preferences > Graphics > Rendering > Limit Framerate enabled.
- To display avatar chat above avatars use Preferences > Chat > Visuals > Show chat in bubbles above avatars.
- With floating text you may want to adjust the distance the floating text fades at so that distant text is not so annoying in VR mode: Preferences > User Interface > 3D World > Floating text fade distance.
C O N T R O L L E R S
Currently, specialised VR Controllers are not supported, but a range of game controllers and 3D navigation devices do work where supported by the normal viewer code.
Xbox One Controller
An Xbox One controller as used with the Oculus Rift (or an Xbox 360 controller) can be enabled, as usual, in Firestorm via Preferences -> Move & View -> Movement -> Joystick Configuration -> Enable Joystick.
You will probably find the controls are under or over sensitive, or some buttons and triggers don’t do what you expect. See this blog post and the image here (click for a larger version) for some suggestions as to how to amend the settings…
https://hugsalot.wordpress.com/2012/12/03/joystick-settings-for-firestorm-with-xbox-360-controller/
You might want to enter “-1” rather than axis “5” as an indication that the axis is not mapped. With the setup suggested, the “A” button toggles between the normal avatar view and “FlyCam” mode, allowing you to move the camera separately from the avatar.
3D SpaceNavigator or SpaceMouse
As with all versions of Firestorm, the viewer supports other forms of “joystick”. One is the 3Dconnexion SpaceNavigator (aka SpaceMouse), a “3D mouse” supporting both avatar motion and, by clicking the left-hand button, the separate “FlyCam” camera control.
My recommendation is to install the SpaceNavigator just by plugging it into Windows and receiving default Windows drivers for the device. I do not install any special SpaceNavigator drivers as suggested on the Second Life Wiki, some of which are incompatible with Second Life viewers.
Firestorm source is available at https://github.com/FirestormViewer/phoenix-firestorm. Look under “Commits” and select the branch for the specific Firestorm version required.
With Firestorm VR Mod Peter Kappler uses a coding approach which injects VR capabilities into the Firestorm Viewer to make the mod easier to maintain in future and for others to repeat or adapt. The source is available from his web page at https://gsgrid.de/firestorm-vr-mod/ [Local Copy].
Impressively, the source is written in a way that requires only some editing of llviewerdisplay.cpp and the addition of two files to the project. All changes are marked with #####P373R##### comments. Peter also included the openvr header and lib files you will need in the rar. For information about the rest of the files you will need, read how to compile Firestorm at https://wiki.firestormviewer.org/fs_compiling_firestorm.
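As a sketch of that injection pattern (all identifiers below are hypothetical stand-ins so the example is self-contained; the real entry points are in Kappler’s marked sections of the viewer source):

// Illustrative sketch only. The two added files (llviewerVR.h / llviewerVR.cpp)
// define a VR helper object, and llviewerdisplay.cpp calls into it inside
// sections fenced by the marker comments.
#include <iostream>

// Stand-in for what the two added files provide:
struct llviewerVR_sketch {
    bool vrEnabled = false;
    void handleFrame() { if (vrEnabled) std::cout << "hand frame to HMD\n"; }
};

// #####P373R#####
llviewerVR_sketch gVR;           // the mod adds one global VR helper like this
// #####P373R#####

void display_example()           // hypothetical stand-in for the real hook point
{
    std::cout << "existing Firestorm rendering...\n";
    // #####P373R#####
    gVR.handleFrame();           // marked call into the VR layer
    // #####P373R#####
}

int main() { gVR.vrEnabled = true; display_example(); }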
Latest version of openvr_api.dll can be obtained (Win64 version for this test version) from https://github.com/ValveSoftware/openvr/tree/master/bin/.
The GitHub Actions (GHA) source by @HumbleTim used to combine Firestorm source and Peter Kappler’s P373R VR Mod addons, make necessary adaptations and build it using Microsoft Visual Studio is available via https://github.com/humbletim/firestorm-gha.
VRLand on OSGrid is a metrics area for performance testing and to establish virtual field of view in your VR headset.
hop://hg.osgrid.org:80/RuthAndRoth/16/16/1000
More detail at: https://blog.inf.ed.ac.uk/atate/2016/07/20/vrland-a-community-and-test-region-for-virtual-reality-in-virtual-worlds/
You can also pick up a VR Headset attachment for your avatar in OpenSim on OSGrid on both the RuthAndRoth and (if available) VRLand regions, or in Second Life pick up the VR HMD on the Second Life Marketplace. The 3D models of the Oculus Rift were provided for free use by William Burke (MannyLectro) and imported to OpenSim by Michael Cerquoni (Nebadon Izumi) and to Second Life by Ai Austin.
F5 Settings for Specific VR HMDs
- Oculus Rift DK2
- Meta Quest Link app version 66.0.0.308.370. SteamVR Version 2.6.2.
- IPD = 65.0 (default)
- Focus Distance = 10.0
- Texture Shift = 0.0
- Texture Zoom = 0.0
- FOV = 100.0
- Oculus Rift CV1
- Meta Quest Link app version 66.0.0.308.370. SteamVR Version 2.6.2.
- IPD = 65.0 (default)
- Focus Distance = 10.0
- Texture Shift = 50.00
- Texture Zoom = 0.0 (others report 86-200 works)
- FOV = 100.0
- Please provide other VR HMD settings
If you see errors like “missing vcruntime.dll”, “missing msvcp.dll” or the application is not starting at all, then please download and install Visual C++ Redistributable for Visual Studio (link for 64-bit operating systems).
Frames Per Second (FPS) Testing
Frames per second with and without Mirrors enabled (Firestorm VR Mod 7.1.9.74745 for testing):
- Second Life Release 7.1.8.9375512768: 56 with mirrors, 120 without mirrors
- Firestorm 7.1.9.74745: 57 with mirrors, 135 without mirrors
- VR Mod 7.1.9 before ctrl+tab: 55 with mirrors, 130 without mirrors
- VR Mod 7.1.9 after ctrl+tab, 2D mode: 50 with mirrors, 98 without mirrors
- VR Mod 7.1.9, VR mode: 20 with mirrors, 37 without mirrors
Graphics: Ultra. Draw Distance: 256m. Environment fixed = Midday.
FOV, camera, location, 2D windows size, etc all identical.
No other avatars on region.
Server: Second Life.
Test of Firestorm VR Mod 7.1.9.74745 in @SecondLife with my usual VR Mod Graphics Settings. 2D = 185fps, VR Mode = 75 fps. Click thumbnail for larger size image to see graphics settings used.
Firestorm – OpenSim Add Grid
Firestorm 7.* provides an add grid capability accessible under the drop-down Grid List or directly via this URL…
https://phoenixviewer.com/app/fsdata/fs_grid_builder.html
This grid list and the add grid links are based on https://www.hypergridbusiness.com/statistics/active-grids/
Adding a Grid to the Grid list which Firestorm Uses
Firestorm menu Viewer -> Preferences -> OpenSim allows grids to be added just by entering their loginURI. This then asks the grid itself (which must be online at the time) for other information as needed.
Firestorm also provides a way for any grid owner or user to provide a link which when clicked will automatically add the grid to the list to be used by Firestorm for grid selection:
hop:///app/gridmanager/addgrid/http%3A%2F%2Fyour-grid.com%3A8002
Note the use of three slashes. The hop:// protocol is normally registered to launch the Firestorm viewer at the time it is installed. secondlife:// may also be used, but you may have registered that to a Second Life-only viewer. The last part of the URL is the URL-encoded grid login URL, with : (%3A), / (%2F) and spaces (%20) encoded. Note that the port (e.g. :80) is required even if the loginuri for the grid leaves it optional.
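As a small sketch of that encoding step (the loginuri below is the placeholder from the example above, and percentEncode is an illustrative helper, not part of any viewer or Firestorm API):

// Sketch: build a Firestorm "add grid" link by percent-encoding a loginuri.
#include <cctype>
#include <iomanip>
#include <iostream>
#include <sstream>
#include <string>

std::string percentEncode(const std::string& s)
{
    std::ostringstream out;
    for (unsigned char c : s) {
        if (std::isalnum(c) || c == '-' || c == '.' || c == '_' || c == '~')
            out << c;                                    // unreserved characters pass through
        else
            out << '%' << std::uppercase << std::hex << std::setw(2)
                << std::setfill('0') << int(c) << std::dec << std::nouppercase;
    }
    return out.str();
}

int main()
{
    std::string loginuri = "http://your-grid.com:8002";  // port included, as noted above
    std::cout << "hop:///app/gridmanager/addgrid/" << percentEncode(loginuri) << "\n";
    // Prints: hop:///app/gridmanager/addgrid/http%3A%2F%2Fyour-grid.com%3A8002
}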
OAR Converter
OAR Converter can take an OpenSimulator Archive (OAR) and from it create textures, meshes and terrain suitable to import into a Unity or Unreal Engine scene. This blog post covers converting an OpenSim OAR to Collada for use via Unity. It describes OAR Converter 1.7.*, an update in June 2024 that replaces the earlier version 1.0.6 from 2013-2017.
From Unity a range of virtual world or virtual reality experiences can be created. The converter has been created by Fumikazu Iseki (Avatar: @Fumi.Hax – @fumi_hax) and his colleagues at the Network Systems Laboratory of Tokyo University of Information Sciences (TUIS) in Japan with support from Austin Tate at the University of Edinburgh.
- OAR Converter from OpenSimulator Archive (OAR) to Collada files for use in Unity3D.
- Software developed by Fumi Iseki, Austin Tate, Daichi Mizumaki and Kohe Suzuki.
Download the OAR Converter for Windows latest version from https://blackjack.nsl.tuis.ac.jp/Download/Release/OARConverter/OARConvWin-1.7.9.zip
(file: OARConvWin-1.7.9.zip) [Local Copy]. Latest Version at the time of this blog post in June 2024 is v1.7.9 (at 15-Jun-2024).
This blog post is provided for convenience and using content from the original TUIS OAR Converter Japanese Web Site which should be considered definitive.
OAR Converter can run on Linux and Windows and source code is available. Full instructions for compiling and using the source code version on these platforms are available via the TUIS Wiki OAR Converter Page. For convenience a version with a Windows UI is available as a ready-to-run package.
Note that an OAR file is actually a .tar.gz (.tgz) file, i.e. a Unix TAR archive gzipped to compress it to a smaller file. Sometimes downloading an OAR file can lead to the web server or underlying file save unzipping it automatically. Make sure you are using a normal OAR file or the OAR Converter will report the error message “File Extract Error”.
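If you want to check a downloaded OAR before converting it, one simple test (a sketch; the file name region.oar is just a placeholder) is to look for the gzip magic bytes 0x1F 0x8B at the start of the file:

// Sketch: verify that an OAR on disk is still gzip-compressed (.tar.gz).
#include <cstdio>

int main()
{
    std::FILE* f = std::fopen("region.oar", "rb");   // placeholder file name
    if (!f) { std::puts("cannot open file"); return 1; }
    unsigned char magic[2] = {0, 0};
    std::fread(magic, 1, 2, f);
    std::fclose(f);
    if (magic[0] == 0x1F && magic[1] == 0x8B)
        std::puts("OK: still a gzip-compressed OAR (.tar.gz)");
    else
        std::puts("Not gzip: the archive was probably unpacked on download");
    return 0;
}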
OAR Converter with Windows UI – Quick Start
Download the OAR Converter for Windows latest version from https://blackjack.nsl.tuis.ac.jp/Download/Release/OARConverter/OARConvWin-1.7.9.zip
[Local Copy]. Latest Version at the time of this blog post is v1.7.9 (at 15-Jun-2024).
For straightforward conversions, simply follow these steps:
- Place your OpenSim OAR file in a suitable directory. Using defaults, the conversions will be for Collada (DAE) and use in Unity, with the outputs placed in separate subdirectories of this same directory with names based on the OAR file name prefixed by OAR_ and DAE_
- You can use “Tools” -> “Output Format” and “Tools” -> “Settings” to change this.
- Run the OAR Converter and using “File” -> “Open OAR File” select the OAR file you wish to convert. This will create a directory called OAR_ with the unpacked contents of the OAR file ready for conversion.
- Now select “File” -> “Convert Data” from the OAR Converter File menu. This will create a directory called DAE_ with the converted content in it.
- The DAE_ directory created will contain the DAE/Collada objects for the conversion which have colliders (are solid) and one special DAE/Collada object for the terrain (named the same as the OpenSim region name). It will also have sub-directories for all Textures and for the Phantoms (objects with no collider).
Import to Unity3D
You could follow the instructions in this YouTube video by Fumikazu Iseki. The first part of the video shows the use of the Linux version of the converter, but the part from [2:23] to [6:10] gives an example of importing the converted DAE/Collada folder contents and merging that with a Unity project/scene, including adding a Unity Standard Asset water surface. Importing the included “UnityChan” character is shown in the last part of the video.
Import to Unity – Quick Start
- Ensure you add relevant parts of the contents of the “Unity” folder in the OAR Converter distribution into your Unity project, adding Editor/SelectOARShader.cs and Editor/SetLocationByParameter.cs at least.
- In your Unity project add an empty game object at 0,0,0 and name it the same as your OpenSim region name. Under this add three empty game objects named Solids, Phantoms and Terrain also at 0,0,0.
- Drag the DAE_ folder in its entirety onto the Unity “Project” (Assets) panel.
- Select all the objects in the top level of this directory except the Textures and Phantoms sub-directories and drag them onto the “Solids” game object in the Unity “Hierarchy” panel.
- Optional: As the (large) terrain object for the region is imported it is usually split automatically by Unity into three sub-meshes of less than 64k polygons. These are all under one object named the same as your OpenSim region and will have sub-mesh names starting “GEOMETRY_”. You may wish to move this terrain object and its three parts to the “Terrain” object in the hierarchy for tidiness and ease of management.
- Select all the objects in the Phantom directory and drag them onto the “Phantoms” game object in the Unity “Hierarchy” panel.
- Look at the imported objects and if you see any that are not correctly textured (usually showing as magenta coloured items) perform an “Assets” -> “Reimport All” to fix that.
- Optional: Add a water layer at 0,0,0. [Instructions in YouTube Video][2:23] to [6:10].
- Optional: Add a ThirdPartyController/Avatar to be able to run the scene and view the contents. [Instructions in YouTube Video] after [6:10].
Project Base for OAR Converter Projects
Once you have successfully tried a conversion and understand the elements, you may wish to create a base for any future OAR conversion… which can include all the steps except for the drag and drop in of the actual DAE converted content. Do this yourself to incorporate the very latest OAR Converter content, scripts and Unity assets. You can if you wish save this as a “unitypackage” to incorporate into future Unity projects.
For convenience my own base project is available at http://www.aiai.ed.ac.uk/~ai/unity/oarconv/ (file: Unity-OAR-Converter-Base.unitypackage). This can be loaded into a “New Project” made in Unity 6000.0 onwards. I use the 3D Template with the “Built-in Render Pipeline”. It provides Mixamo Xbot and Ybot avatars along with UnityChan movement animations and avatar-linked cameras.
After that you can make a copy of the base project you created, or create a new Unity project and import the unitypackage as a base, then drag the OAR Converter-produced “DAE” directory into the Unity Project Assets area and add the DAE folder assets and those in the DAE/Phantoms folder to the Hierarchy panel. Adjust the position of the chosen avatar and attached camera to suit the region, save the project, save the scene, and you should be good to go.
Advanced Uses – Settings
OAR Converter has Tools and Settings to allow for a range of more flexible uses. As well as Collada .DAE files it can create Wavefront OBJ, FBX (forthcoming) or .STL files (suitable to drive 3D printers).
Exported objects can be shifted in the X, Y and Z (up/down) directions, perhaps for multiple side-by-side regions on a 256m x 256m grid. [Note: this can also easily be done afterwards in the Unity editor.]
Once objects are converted they can also be examined in a 3D viewer built into the tool.
Advanced Uses – Avatar Conversion
This version of OAR Converter can also help with importing OpenSim avatars into Unity. The avatar should be added into an OpenSim scene before making the OAR to make it available for the OAR Converter to use. The “Use Joints” checkbox must be ticked in Settings to allow this. The process is quite involved but instructions by Fumi are in this video: https://blackjack.nsl.tuis.ac.jp/video/PronamaChan_are_go.mp4
WebGL – Web Presentation of OAR Conversions via Unity
Once an OAR has been converted and imported to Unity, so long as you have installed the WebGL capability in Unity (it can be added if it is not yet installed) you can make a WebGL build of your project. Here are some examples: OpenVCE – the Open Virtual Collaboration Environment as it appears on OSGrid; Marineville – an underwater build as it appears on OSGrid; and TUIS – Network Systems Laboratory of Tokyo University of Information Sciences. Click the images below to open the WebGL experiences.
As glTF becomes a mesh content format accessible in Second Life and OpenSim it might be useful to be able to take OAR Conversions to Unity and export to glTF. A glTF importer and exporter is available for Unity. Details of how to add it to your Unity Project and create a glTF export are at https://blog.inf.ed.ac.uk/atate/2024/04/29/gltf-exports-from-unity/. Once converted the glTF mesh can be imported to various platforms such as Blender or Second Life (via the Develop -> Render Tests -> glTF Mesh Preview or Develop -> GLTF -> Open capabilities in recent Second Life Viewers).
Technical Papers on 2017 version
Iseki, F., Tate, A., Mizumaki, D. and Suzuki, K. (2017) OpenSimulator and Unity as a Shared Development Environment, Journal of Tokyo University of Information Sciences, Vol. 21, No. 1, pp.81-87 (2017). [PDF Format]
Iseki, F., Tate, A., Mizumaki, D. and Suzuki, K. (2017) OAR Converter: Using OpenSimulator and Unity as a Shared Development Environment for Social Virtual Reality Environments, OpenSimulator Community Conference 2017 (OSCC-2017), 9th-10th December 2017. [PDF Format] [Presentation: PDF Format]
Further Information and Resources
- OpenSim OAR Convert to Unity Scene Blog Post by Austin Tate on 30th August 2015.
- OpenSim OAR Convert to Unity Scene with Windows Interface Blog Post by Austin Tate on 24th October 2015.
- YouTube: OAR Converter Instruction video by Fumikazu Iseki [10:34]
- YouTube: OAR Converter Presentation at OpenSimulator Community Conference 2017 (OSCC17) [25:51]
OAR Converter License
OAR Converter © 2014-2016 Fumi Iseki, Austin Tate, D.Mizumaki and K.Suzuki
License (2016 11/19) – http://www.nsl.tuis.ac.jp/, All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
- Neither the name of the OAR Converter nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
- Please respect the copyright of content providers when using OAR Converter.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
OpenSim – Japan Open Grid
The Japan Open Grid (JOG) is a hypergrid enabled virtual world created by Fumikazu Iseki (Avatar: Fumi Hax – @fumi_hax) and his colleagues at the Network Systems Laboratory of Tokyo University of Information Sciences (TUIS) in Japan. Fumi is the creator of the OAR Converter and other useful OpenSim and virtual world tools.
Loginuri: http://jogrid.net:8002
Web Interface: https://www.jogrid.net/wi/
Arrival region and shopping mall: hop://jogrid.net:8002/Dejima/129/128/27
The arrival region has a range of shopping malls. Hypergrid visitors may find some free items or local currency offered to them via poster boards to purchase items.
Second Life – glTF Import Progress
Linden Lab is in the process of adding glTF 3D model support to Second Life, starting initially with PBR materials and moving on to full glTF (and .glb) scene, mesh nodes (including hierarchies), materials and animation import and export for round-trip updates between the viewer and external 3D modelling tools like Blender. As part of this effort a Release Candidate “Materials featurette” viewer is already available that lets you locally load a glTF mesh or scene (multiple meshes). The contents are visible only to yourself and not saved at present, though test viewers are already experimenting with upload of glTF meshes. @davep (Runtai Linden) has stated that the eventual aim is to be able to import anything as it shows in the Khronos glTF Sample Viewer or in Adobe Substance.
glTF Mesh Loading Method
Though the method by which glTF meshes are loaded will change, for test purposes the following procedure can be used as at 8-Jun-2024:
- Create a simple prim, such as a cube and scale it to 1.0m. The scale is used to set the scale of the mesh that will be imported. So 0.5m would be 50% scale. 1.0m would be 100% scale. You might want to also rotate it by 90 deg in the X direction as that corresponds with the usual orientation of models in Blender.
- Edit the cube to select it as the target to attach the glTF mesh.
- Use Develop -> Render Tests -> glTF Scene Preview to select a .gltf or .glb file containing your model. [This will be replaced by other methods later: e.g. in some glTF Development test versions use Develop -> GLTF -> Open, Save As, Upload]
- Rotate or move the object as necessary. Also scale as you wish.
- Make the root prim 100% transparent to hide it.
- You can select parts of the mesh using “Edit Linked” to move, rotate and scale them. Selecting the root of a part hierarchy moves the whole item (e.g. for a wing of a vehicle).
- In some glTF Development test versions, after adding a mesh to a root prim, you can use Develop -> GLTF -> Upload to upload the content to Second Life servers. This only works on regions with the GLTFUpload debug setting true. This uploads the textures and then the glTF bin (mesh) and gltf (wrapper) files. After the upload finishes (indicated by a final two L$1 uploads) you can Take or Take Copy the root prim into inventory to save it for reuse from inventory.
- You can link other prims or mesh objects to a glTF mesh if you wish (e.g. to add scripted seats to a vehicle). You can also attach the mesh to avatar locations and position it, e.g. for swords.
At present textures need to be a power of 2 on each side, though that constraint should be relaxed in due course since the Khronos glTF Sample Viewer can render non-power-of-2 textures okay.
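As a trivial aside, the constraint amounts to the usual power-of-two test on each texture dimension (a sketch, not part of any viewer code):

// 512, 1024, 2048 etc. pass; 1000 fails.
#include <iostream>

bool isPowerOfTwo(unsigned int n)
{
    return n != 0 && (n & (n - 1)) == 0;
}

int main()
{
    std::cout << isPowerOfTwo(1024) << " " << isPowerOfTwo(1000) << "\n";  // prints: 1 0
}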
A short video by @davep (Runtai Linden) shows the process [Video (MPEG4) via Discord]
Potted Plants Meshes
Sketchfab – Optimized Potted Plants by Nicholas-3D (@Nicholas01)
Suggested by @davep (Runtai Linden) as a free test mesh.
https://sketchfab.com/3d-models/optimized-potted-plants-967b4bf23fac4098993776fcfc2d3318
Ready Player Me glb Avatars
Ready Player Me (https://readyplayer.me) avatars are made available as .glb meshes, ready rigged (with Mixamo compatible skeletons) and PBR materials. They appear to load fine in the test viewer. Of course the rigging is not operational at this stage.
This avatar is available in .glb format via
https://models.readyplayer.me/6619152aaaa958d48f2143ee.glb
Supercar, Black Rock Lab Mesh and Mike Mercury Statue Meshes
I use these meshes as tests in various platforms. Supercar has 510174 vertices and 273230 triangles. Black Rock Lab Exterior has 68412 vertices and 39037 triangles. Black Rock Lab Interior has 455142 vertices and 181935 triangles. The Mike Mercury figure has 42606 vertices and 24415 triangles.
As at Second Life Release 7.1.7.8852879962 on 29-Apr-2024 Supercar, Mike Mercury figure and Black Rock Lab interior and exterior are all rendering quite well. The Black Rock Lab textures are not yet showing as many are not set to a power of two.
RGU Oil Rig
RGU Oil Rig glTF mesh loaded in Second Life Test 7.1.7.9121781241 (17-May-2024). The model has 955880 triangles and 1303822 vertices and 9342 mesh elements.
OpenSim-NGC – Resources
A variant of OpenSim created by Mike Dickson and his collaborators is “OpenSim – Next Generation Core”. The aim stated there is:
The OpenSim-NGC project is designed to be different from OpenSimulator Core in a couple of important ways. We place an emphasis on quality and security testing of our code. This is built into our development process, so the team gets continuous feedback on how we’re doing in that regard. The security focus is especially important and feeds into another “project” we’re developing. The Trusted HyperGrid is a set of policies and practices that grid owners have adopted to ensure residents and content creators are as protected as we can possibly make them through both process and technology. You can read about the History of the project here.
- GitHub – OpenSim-NGC
- GitHub Discussion Forum
- GitHub Wiki
- Discord – OpenSim-NGC [Invite – https://discord.gg/n99spnUurN]
Second Life – Duet Using Suno.ai
https://maps.secondlife.com/secondlife/Yawgoo/84/96/2992
An installation by technorabbit in Second Life: a Replicants duet between Deckard and Rachel inspired by Bladerunner.
Information from YouTube Video description: This demonstration makes use of https://suno.ai to generate the voices and backing track for a duet between Deckard and Rachel inspired by Bladerunner. Various sub-themes were combined into a poem. A chorus or two were used to rewrite the poem into a song, which was then fed into Suno.AI to generate the audio. The arm and lip movement was derived from automated closed caption tooling.
While suno.ai might have done the singing, the lyrics come from human sources; and then there is costuming, direction, post-production, scouting venues, and many more aspects than the singing. The video is shot in Second Life because 3D settings and production costs are so low — there are just so many creative and fully built out venues available.
Three Hills Grid – Pandora
There are many wonderful Na’vi/Avatar themed virtual world regions in Second Life and OpenSim. See some examples in my blog posts: https://blog.inf.ed.ac.uk/atate/?s=na’vi
A new one for my exploration is the Iknimaya region on Three Hills Grid, an OpenSim DreamGrid-based installation. Explore the region, its flora, fauna and the clans of Iknimaya. Pick up some of the free items of Na’vi apparel.
https://opensimworld.com/hop/83767
hop://three.hills.grid.outworldz.net:8002/iknimaya/107/121/23
Blender – Fix Mirrored Parts Shading Issues
I am getting shading/colour issues on glTF exports of parts in Blender models that were probably created in the far distant past as a mirror of an airplane wing or piece, transposed from one side to the other. The “broken” parts sometimes look dark or black in Blender; at other times they may be normally coloured. For some parts I was able simply to duplicate and mirror the part again in Blender 4.1.1 to replace a broken object, and that worked fine. But some parts cannot be mirrored.
I can’t see what the difference is between each side, but when I export to glTF, in the Khronos viewer (for example) the parts show as a grey colour versus the proper colour of the other side. I checked and there is no negative scale on the piece. I also flipped face normals (both ways!) but that made no difference.
I have checked a glTF export of the Blender model in the Khronos glTF Sample Viewer and in the tool’s Debug Settings the faulty part does have different “Shaded Normals”, “Geometry Normals”, “Geometry Tangents” and “Geometry Bitangents”.
I wonder if anyone has seen this and knows the trick to correct it?
Ideally I want to be able to fix the model in Blender, export it as glTF, and have that work in the Khronos glTF Sample Viewer without shading issues; then have a process clear enough to be able to repeat on other models with similar issues, and document that process here to help others in future.
X-15 Rocket Plane Test Model
This occurs on multiple Blender models, but one sample X-15 rocket plane model that exhibits this behaviour is the “X-15-1” by TaffGoch on 3D Warehouse
https://3dwarehouse.sketchup.com/model/a3a3a3d2ccc590eeaef99de91a3e555/X-15-1
Khronos glTF Sample Viewer and Debug Views
Discussions and Online Suggestions
https://forum.unity.com/threads/symmetric-models-with-mirrored-normal-maps-shader-fix.65859/
https://www.reddit.com/r/opengl/comments/sectzr/i_suspect_the_loaded_model_tangent_bitanget_and/
These posts are clearly exploring the right area, so it is likely it’s something to do with normals, tangents and bitangents! But all the suggestions I tried in there (that I can understand) have not worked, even the idea of exporting via FBX and reimporting.