Austin Tate's Blog

Individual Blog Entries - CC-BY-NC

The Nature Collective in Second Life

2024, March 1 - 11:38

Following a snapshot posted by @Wurfi on Twitter/X, I thought “The Nature Collective” virtual world, built by Emm Vintner and her team on the Gealain region in Second Life, would be worth exploring.

http://maps.secondlife.com/secondlife/Gealain/21/134/108

Kemlo and Krillie

2024, February 19 - 09:11

       

This is a blog post under development. It describes a background storyline for exploring AI tools for creators (leonardo.ai, scenario.com).

The Kemlo series is a collection of children’s science fiction and adventure novels written by E. C. Eliott and published between 1954 and 1963. More information is in this blog post. The new storyline is loosely based on the Kemlo books and takes place a few years after the adventures in the book series. This might lead one day, maybe, to a new Kemlo story and VR experience.


Situation – It is the month of May in a year in the future. Earth has an outer space presence in Earth orbit, on the Moon and beyond. International United Nations entities are now preferred to nationalistic governments and other authorities. Educational, cultural and Internet services are freely available everywhere, funded by proportional contributions from all countries. Grants support the provision of equipment to access these resources where circumstances require it. The Internet is now a safe and welcoming place since the introduction of an international requirement for strong privacy protections and open source for any major platform. A Universal Basic Income is provided to everyone, both on-world and off-world.

Belt K – An Earth orbit space habitat spread over 20km, including habitation and living spaces, educational areas, recreation and physical conditioning areas, manufacturing, solar power generation and storage, and space agriculture and hydroponics in large domes (automatically positioned to maximise crop growing quality and time). Robotics and droids are used throughout the Belt. Children born on Belt K are given names starting with “K”.

Education – Space-born children begin their education and practice simulated space operations very early. Classrooms and experience areas allow for e-Learning (enhanced learning), VR immersion and simulated field trips including holographic spaces and linked teleoperations of devices in many locations. Kids in the Belt call the facilities “sKool”. Belt residents engage in lifelong learning and training opportunities.

Space Operations Training – By the age of 11 most children have become familiar with space vehicle operations through simulation and play and can already use autonomous space scooters with confidence. At 13 children can take a basic flight operations test so that they can use the unsupervised mode on space scooters with appropriate oversight by the Belt K Operations Authority. At 15 they are allowed limited unsupervised operation of space scooters. At 18, with a pilot’s license, they can use space runabouts. Belt children usually achieve their spacecraft qualifications very soon after their relevant birthdays as they make use of e-Learning, simulators and VR ahead of time. At age 20, those wanting to use space transports professionally can obtain a Space Operations License (SOL) from the Space Transport Authority (STA) via exams and after logging flight experience. The SOL is renewable annually.

Spacecraft – Space Scooters (SS) for intra-habitat local transport and exploration, Space Runabouts (SR) for in orbit travel, Space Transports (ST) managed by the Space Transport Authority (STA) are the workhorses for orbital and Earth-Orbit operations for cargo and passengers, and Experimental Spacecraft (SX).

Space Scooters (SS) – small two to four seat personal spacecraft for travel within a Belt. Highly automated with remote supervisory capability for younger travellers.

Space Runabouts (SR) – two seat or larger spacecraft for travel around and between the Satellite Belts. Automation is used for safety. Fully autonomous versions provide a taxi service.

Robotic Assistants (RA) – a range of intelligent agents which are implemented in a distributed fashion. They can be personalised and embedded in a range of physical forms such as a wrist device or a robot. By tradition such robots are given names starting with the initial of the belt they are deployed on (e.g. “KaRA”)

K-Pad – a device with screen for communications, information, augmented reality for technical operations, education, etc. Age appropriate facilities are on the device. Updates ensure the device stays appropriate to its user for life. Strong privacy protection is enforced with locally stored data entirely private to the user and not shared off device.

Kemlo – Male, 18 years old, born 3rd March on Belt K, Sector A. Kemlo has an Open World University (OWU) Degree in Planetary Geosciences. Skilled pilot. Captain of the Space Scouts. Helps train younger children to fly and maintain space scooters. Kemlo is involved in the test programme for an experimental modular space runabout (SX-MR2). Kemlo’s robotic assistant, which he calls “Komputer”, is embedded in his Omega wrist band (a gift for his 18th birthday), which he wears with the screen under his left wrist in “driver-style”.

Krillie – Female**, 17 years old, born 11th November on Belt K, Sector A. Krillie has an Open World University (OWU) Degree in Space Construction Engineering and is currently studying for a Masters by e-Learning in AI and Robotics from the University of Edinburgh in Scotland. Krillie is the author of a diary and series of books describing life as a Space Girl which are popular with children on Earth, in the Belts and beyond. Krillie has an interest in AI-enhanced fashion. Krillie’s robotic assistant “KaRA” is embedded in her K-Pad. [** Gender change from the Kemlo books.]

Open World University (OWU) – the main provider of educational opportunities and experiences to on-world and off-world learners of all ages. OWU physical bases and computing centres are on and under sea islands named Atlantica Sea City and Pacifica Sea City run by the international United Nations (UN) Organization. OWU programmes are run for all ages and support lifelong learning. e-Learning (enhanced learning) using distance education is employed with group and social functions, VR simulated field trips and experimental labs. Advanced courses including Masters degrees are provided through OWU by specialised Educational Institutions across the world and beyond.

Offworld Heritage Sites and International Monuments

Some early space age activities on the Moon and in Earth orbit have been kept intact and preserved for future generations. The International Space Station (ISS), construction of which began in the late 1990s, and the first wheel-shaped rotating space station (often referred to as the 2001 Space Station, as a nod to the film 2001 that depicted such a station) are in orbit and can be visited externally or in detail via remotely operated VR telerobotics.

Using AI Tools to Suggest Content for Kemlo & Krillie

Leonardo.ai was used to suggest initial images for the main characters, Kemlo and Krillie. The initial attempt was useful, with a few visual glitches, e.g. around Krillie’s right eye, and spacesuit patches that were too distinct. This was improved by using a further “image-to-image” generation and Photoshop edits to Kemlo’s spacesuit arm patch.

Leonardo.ai Prompt: Kemlo and Krillie are space-born teenagers, they sit in a small two-seat personal spacecraft, travelling between two space stations, Earth orbit.
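For anyone wanting to drive this kind of generation from a script rather than the web UI, a minimal sketch along the following lines could be used. It assumes Leonardo.ai’s REST generations endpoint and an API key held in an environment variable; the endpoint path, field names and image sizes shown here are assumptions and should be checked against the current API documentation.

    # Minimal sketch: submit the Kemlo and Krillie prompt to an image generation API.
    # Endpoint, field names and sizes are assumptions -- check the Leonardo.ai docs.
    import os
    import requests

    API_URL = "https://cloud.leonardo.ai/api/rest/v1/generations"  # assumed endpoint
    API_KEY = os.environ["LEONARDO_API_KEY"]                       # personal API key

    prompt = ("Kemlo and Krillie are space-born teenagers, they sit in a small "
              "two-seat personal spacecraft, travelling between two space stations, "
              "Earth orbit.")

    payload = {
        "prompt": prompt,
        "width": 1024,       # assumed supported size
        "height": 768,
        "num_images": 4,
    }

    response = requests.post(
        API_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=60,
    )
    response.raise_for_status()
    print(response.json())   # contains the generation job id to poll for the images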

Leonardo.ai (https://leonardo.ai) for a similar prompt also generated some images that might be suitable for two-seat Space Scooters for younger children and Space Runabouts for older teenagers.

These images were then sharpened, enlarged and backgrounds removed where useful in Scenario (https://www.scenario.com/).

Leonardo.ai images of a Space Scooter and a Space Runabout, as generated from the original covers of two of the Kemlo books (paperback versions), are shown here…

The “Kemlo and the End of Time” book contains a colour illustration of SPITAR (Space Personal Investigation Training and Research Craft Number XK240) on test. Scenario.com was used to enhance a scan of this illustration…

A simple Space Runabout created in OpenSim, exported to Collada (DAE) and imported to Blender is shown here…
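If you want to repeat that OpenSim-to-Blender step yourself, the Collada import can also be scripted via Blender’s Python API; a minimal sketch (the file path is illustrative) follows.

    # Minimal sketch: import a Collada (.dae) file exported from OpenSim into Blender.
    # Run in Blender's Python console or as a script; the path is illustrative.
    import bpy

    bpy.ops.wm.collada_import(filepath="/path/to/space_runabout.dae")

    # List what was imported so it can be checked before further editing.
    for obj in bpy.context.selected_objects:
        print(obj.name, obj.type)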

Simple Unity Experience – Kemlo & Krillie Visit the 2001 Space Station



Simple OpenSim Experience – Kemlo & Krillie Visit the Spacecraft & Space Telescopes Monuments

Mirrors in Second Life

2024, February 10 - 15:28

Update: a later viewer, 7.1.4.8208322938, already makes the process easier by having the Mirror Reflection Probe’s volume (defined by its extent) set which objects have mirrored surfaces.

With the introduction of Physically-Based Rendering (PBR) glTF materials support in Second Life viewers, a “Mirror” reflection capability, of high enough quality and updating in real time enough to look like a real-life mirror, has been under development. A project viewer has been under test for a while. The viewer and region server code both need to support mirrors. As at 10th Feb 2024, a test viewer has been posted via the Discord Second Life “content-features” channel (Second_Life_Project_Featurettes 7.1.3.7848563555) and a test region on the Beta Grid at “Rumpus Region 2048” made available for testing.

The notes below reflect the design and operation of mirrors in Second Life as at 10th February 2024, but they are in a state of flux and could be altered, perhaps significantly, before the feature is finally released.

Instructions to Make a Mirror

Ensure you are using a viewer, and are on a region, that supports mirrors, and that the debug setting RenderMirrors is TRUE (the default in the current test viewer).

  1. Rez the object that will have a face acting as your mirror. Size and rotation do not matter.
  2. Give the face that will be the mirror a shiny surface… e.g. use PBR, or Blinn-Phong blank specular, 255 factor, 255 environment, colour tint black.
  3. Rez a box; size does not matter. Rotate it so Z+ points away from the face that will be the mirror.
  4. Make it into a mirror probe… sphere or box type does not matter.
  5. Move it into the reflective surface’s plane. The probe centre line must be just beneath the mirror’s surface.
  6. Shift-drag copy the PROBE! (due to a current viewer bug).
  7. The copy of the probe goes transparent (see it with Ctrl+Alt+T) (another viewer bug?).
  8. The original stays with a yellow colour if “show reflection probes” is set.
  9. Delete the original probe (the yellow one).
There is a BUG in that when a mirror reflection probe is drag-copied the original goes transparent and does not show as yellow even if you have “show reflection probes” enabled.

Note that a mirror currently shows what is given by the mirror reflection probe nearest the CAMERA… so other mirror reflection probes nearby (even if not in view) may intersect and override what you might expect to see.

Observation

With the current mechanism, this seems to be far too complicated.

Mirror Reflection Probe Interference and Priority Issues

Mirror probes currently have an “Influence Zone”, 10cm deep (apparently fixed by a built-in shader), in which they affect objects with reflective surfaces, and this zone can extend far out, so it can intersect other reflective objects unexpectedly. But this is expected to change as the mirror approach is refined.


Images from Zi Ree (Firestorm)

Dantia Gothly on Discord commented: I set it up for my reflective surface, aligned it and got it working. Then I took that reflective object, while leaving the probe where it was, and went 3000m up and the mirror still worked. So the mirror plane works across the whole region so long as it is aligned to that surface.

Geenz (one of the developers) on Discord commented: Right now how we handle the placement is WIP – eventually it’s gonna get the same falloff and such as regular probes. Just didn’t have time to get that done yet. We’re still debating and discussing the UX around this. So you’ll be able to just plop down a box or sphere probe, size it up, and get anything that intersects with it to get the mirror probe’s image.

Suggestions for Improvement

  • Provide a tick box to make the surface of an object be a mirror and then auto-place a Mirror Reflection Probe correctly positioned and rotated (Z+ outward) with respect to that surface.
  • Have the influence zone for the mirror effect defined by the size of the Reflection Probe itself (currently its size is immaterial) rather than extending well beyond the object’s mirror surface.
  • Have a way to limit the influence zone in which a Mirror Reflection Probe can show on a mirror surface to the land plot, so neighbours’ builds do not interfere.
  • Base the Reflection Probe used on the object the surface is rendered on, not the probe nearest the camera.

Update: 27th March 2024 – Improved Mirrors Viewer Candidate – 7.1.4.8428057494

Linden Lab Second Life Test Viewer 7.1.4.8428057494 on the Aditi Beta grid Rumpus Room 2024 region provides some tests of the next step in the implementation of mirrors. It includes a simpler setup of the mirror reflection probe and allows the volume of that probe to define which surfaces act as mirrors.

Update: 2nd April 2024 – Improved Mirrors Viewer Candidate – 7.1.4.8510662315

Using Linden Lab Second Life Test Viewer 7.1.4.8510662315, again on the Aditi Beta grid Rumpus Room 2024 region. I think the algorithm for selecting which mirror probe applies to a surface needs a tweak, rather than just using the nearest probe to the camera position (see the sketch after this list)…

  1. Only use probes in the field of view of the camera; or, as a poor substitute, the direction (180 degree cut-off) of the camera angle.
  2. Ignore probes that are not mirror probes too? At present it seems to pick up any old probe that happens to be nearby (even within a few metres), which could be on adjacent plots owned by another user and hence not controllable by the user for their visual intention.
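To make those two suggestions concrete, here is a rough sketch in Python of how per-surface probe selection might work under those rules. It is illustrative pseudocode rather than actual viewer code; the Probe structure and function names are invented for the example.

    # Rough sketch of the suggested probe selection, not actual viewer code.
    from dataclasses import dataclass
    import math

    @dataclass
    class Probe:
        position: tuple          # (x, y, z) in region coordinates
        is_mirror_probe: bool

    def in_front_of_camera(camera_pos, camera_dir, point):
        """True if the point lies in the camera's forward hemisphere (180 degree
        cut-off). camera_dir is assumed to be a unit vector."""
        to_point = tuple(p - c for p, c in zip(point, camera_pos))
        return sum(a * b for a, b in zip(camera_dir, to_point)) > 0.0

    def select_mirror_probe(probes, camera_pos, camera_dir):
        """Pick the nearest probe that is (a) a mirror probe and (b) in front of
        the camera, rather than simply the nearest probe of any kind."""
        candidates = [
            p for p in probes
            if p.is_mirror_probe and in_front_of_camera(camera_pos, camera_dir, p.position)
        ]
        if not candidates:
            return None
        return min(candidates, key=lambda p: math.dist(camera_pos, p.position))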

North-facing mirror that is spoiled when the camera moves a little, due to another reflection probe just behind the avatar.
South-facing mirror that works well when the camera moves further away, due to there being no other reflection probes in the area to interfere.

AIAI Training and CPD

2024, February 1 - 09:32

This blog post provides historical information about the training and Continued Professional Development (CPD) activities of the Artificial Intelligence Applications Institute (AIAI) at the University of Edinburgh between 1984 and 2020.

As part of AIAI’s technology transfer remit it ran an extensive programme of training and CPD offerings. A summary of the training offerings for 1996 from AIAI and pricing can be found on this snapshot web archive area…
https://www.aiai.ed.ac.uk/project/ftp/pub/home/iwh/cdrom/www/aiai/training.htm

Training Labs, Advanced Facilities and Support for Engineers

AIAI ran two laboratories with cutting-edge equipment in which visiting scientists, engineers and industry could carry out pilot projects – a Knowledge Representation Systems Training Lab (KRSTL) and a Parallel Architecture Lab (PAL). AIAI had a contract with the UK EPSRC to help train engineers in the use of AI techniques in their subject (AI Support for Engineers). Due to its pioneering collaborative work between industry and academia, AIAI also won an award which allowed the equipping of a training laboratory with multiple workstations and advanced systems to improve its training capabilities.

To support these programmes, AIAI had a series of Short Courses which could be offered in Edinburgh on a shared basis or delivered within a company or organisation. The courses could be tailored to meet client needs. It also offered a packaged Study Programme in AI Applications, to allow visitors to pursue a short application project under AIAI staff supervision, and a Research Programme in AI Applications to support more experienced personnel.

Short Courses

Below is a list of the courses offered on a stand-alone basis and as part of the Study and Research programmes. These short courses are no longer available. Contact the School of Informatics for study opportunities.

This archival web page has a bit more detail including Winter 2002-2003 pricing… https://www.aiai.ed.ac.uk/archive/2003-07-14/training/

Study Programme in AI Applications

From AIAI’s prospectus for 1995 (original here)…

Study Programmes are concerned with creating skilled knowledge engineers. They are aimed at organisations that have reason to believe that one or more of their problems may be addressed by application of KBS techniques and which have personnel who lack sufficient KBS skills to carry out the project effectively.

Study Programmes emphasise the production of high quality applications. Visitors build an initial system which their organisation can develop further. This gives a quick start on a project for an organisation, and relevant knowledge engineering skills for its staff. This is achieved through:

  • a closely supervised and well planned work programme;
  • attending appropriate short courses;
  • building a system alongside skilled AI practitioners;

The fundamental training strategy behind this “journeyman” scheme is the building of a fully documented knowledge based system with supervision from AIAI staff. The Study Programme usually lasts ten weeks and visitors may register on a full-time or part-time basis. Programmes start in January, April and October.

Research Programme in AI Applications

From AIAI’s prospectus for 1995 (original here)…

Research Programmes are aimed at organisations who already have skilled staff in the domain of AI but who wish to further their research into the application of the techniques to meet business requirements.

The visitor will be a well-motivated individual who can work with minimum direct supervision, and will join as part of one of AIAI’s technical groups. The AI application research will be a project chosen to fit in with, and complement, AIAI’s technical foci of:

  • planning & scheduling technology;
  • enterprise & process modelling technology;
  • corporate knowledge management technology.

For the period of the Research Programme the visitor will be allocated:

  • a desk within the AI Building at South Bridge, Edinburgh in close proximity to a large number of staff and students working in AI;
  • Computer provision via AIAI’s extensive network of workstations;
  • an extensive range of AI toolkits and languages;
  • excellent library and information services.

It is expected that, as a minimum, a joint publication describing the results will be an outcome of the application research work. Research Programmes last a minimum of three months and visitors may apply at any time.

AI Planning MOOC (2013-2015 and online afterwards)

AIAI was one of the first groups at the University of Edinburgh to engage in the development and running of Massive Open Online Courses (MOOCs) via the Coursera platform. The MSc level course was on AI Planning. Over 113,000 students took part in the three synchronous sessions offered. The materials continue to be available online via YouTube, open.ed and an AIAI web server and other Universities use the materials in their courses. See https://www.aiai.ed.ac.uk/project/plan/ooc/

Firestorm PBR Alpha Tests for VR Mod

2024, January 29 - 15:52

This is an experiment by @Sgeo to test the Firestorm VR Mod 7.1.2.72850 PBR Alpha Test version to check whether the VR Mod approach works – and the good news is that it does.



The above images are all taken using the Firestorm VR Mod 7.1.2.72850 viewer on the OpenSim OSGrid “Oil Rig” region. This is an immersive educational training region as used by the RGU Oil & Gas Centre in Aberdeen to train offshore oil rig workers prior to deployment. More information on the Oil Rig Training Experience is in this blog post.

More Detail

The Firestorm viewer currently has an alpha test version in development that incorporates changes for Physically-Based Rendering (PBR) materials support. This has changed the rendering approach used by the viewer.

The current Firestorm VR Mod approach based on SteamVR/OpenVR to allow for VR use via a wide range of VR HMDs was developed by Peter Kappler. It uses a simple mechanism in a modified Firestorm Viewer from version 6.3.3 from November 2019. (An earlier approach was in Firestorm 6.0.1.57000 from August 2019). This basic approach continues to work in Firestorm up to 6.6.8 – the last version that has been prepared by @Humbletim using GitHub Actions (GHA).

@Sgeo on the Firestorm VR Mod Discord Group (Invite Link) took the latest available Firestorm PBR alpha source code from https://github.com/FirestormViewer/phoenix-firestorm-alpha (7.1.2.72850) and added in his own variant of the VR Mod approach, which uses the same rendering mechanism. Sgeo’s approach includes code that allows for automatic setting up of the VR headset’s IPD, texture offset, and other parameters which have to be set manually (via the F5 key in VR Mode) in the Peter Kappler approach.

Download, Install and include openvr_api.dll

On 28-Jan-2024 @Sgeo provided a build of this version of Firestorm VR Mod 7.1.2.72850 for early tests and access by the P373R-WORKSHOP Discord group (see File Phoenix-FirestormOS-SgeoVR-7-1-2-72850_Setup.exe downloadable from this Viewer Download Link, Discord Group Invite Link). This version is a Windows 64 bit installer. It installs in a different directory to the normal Firestorm and Firestorm VR Mod versions so can be used alongside those. It shares the settings for normal Firestorm.

After install you need to add the openvr_api.dll library to allow the viewer to connect to SteamVR and the specific VR HMD drivers you use. Unlike the usual Firestorm VR Mod releases, this version of the viewer does not include it automatically. The latest version of openvr_api.dll can be obtained (the Win64 version for this test version) from https://github.com/ValveSoftware/openvr/tree/master/bin/.

Then you can launch the viewer and, as usual, use Ctrl+Tab to load SteamVR and the VR HMD’s drivers. After that use Tab to go into and return from VR mode. Note that I experienced a crash of the viewer a few times when doing Ctrl+Tab, but it worked most times. If it crashes, it may be better to leave SteamVR and the VR HMD drivers running and restart Firestorm VR Mod; that seems to work reliably, so it may be some sort of timing issue in the launch.

@Sgeo’s version of Firestorm VR Mod sets up the VR HMD IPD, texture offset and other parameters automatically, so with luck you should see a crisp VR image with 3D depth. As usual you can monitor on the 2D screen what is showing to each eye of the VR HMD using the SteamVR “Display VR View” capability (in its menu you can show “Both Eyes”).

UI Elements – Out of View in VR Mode

Note that currently it is difficult to see the viewer UI elements like menus, buttons and edge-mounted HUDs as they are out of the VR field of view (FOV). Normal Firestorm VR Mod using Peter Kappler’s approach allows for a shift in the area viewed as the mouse is moved to the edges or corners of the view, to bring those UI elements into the FOV. The image below shows the FOV in the VR view for an Oculus Rift DK2 and you can see it is quite limited compared to the whole viewer screen. In this picture, the UI elements and tools remained just out of view, even when looking to the extreme sides, top and bottom.

2D View Squashed and a Temporary Fix

Also, in this current test version, on return to 2D mode (Tab) the 2D image can be left squashed and not centred.

A simple fix for that is to resize the 2D screen to have the same ratio as ONE EYE of the SteamVR VR View monitor screen, e.g. try a ratio of 8×9 (width = 8 units and height = 9 units; note this is half of a 16×9 screen ratio). The earlier VR Mod approach automatically resized the 2D screen to get a similar effect. This also seems to correct the distortion of the name labels over avatars.
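As a quick worked example of that ratio (the width value here is just an illustration), the matching window height can be computed as follows:

    # Worked example of the suggested 8x9 window ratio (half of 16x9).
    # The width is illustrative; use your own 2D window width.
    width = 1280
    height = round(width * 9 / 8)                          # 8:9 means height = width * 9/8
    print(f"Resize the 2D window to {width} x {height}")   # 1280 x 1440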

Colour Changes

I do see a darkening of the colours across the whole image when switching from 2D mode (left image below) to VR Mode (right image below). The colours in 2D mode look identical to the non-VR version of Firestorm PBR Alpha. This image was taken on the Firestorm Beta Grid “Rumpus Room 4” region, which has a range of PBR test objects.


OpenSim OSGrid VRland Test Area

On OpenSim OSGrid there is a VR test area that can help establish the VR HMD Field of View (FOV, shown in blue in VR View below) etc. at hop://hg.osgrid.org:80/RuthAndRoth/16/16/1000


PBR Materials Display in OpenSim

And the point of all this is to be able to display PBR materials in Second Life and OpenSim… here is a sample image… using the latest OpenSim 0.9.3.0 Dev Master server code, which is already available on OSGrid and includes support for PBR materials.

Oculus Rift CV1

Tests of Firestorm VR Mod 7.1.2.72850 PBR Alpha Test were also made on an Oculus Rift CV1, which on Firestorm VR Mod 6.6.8 needs a texture offset of +30 and a texture zoom of 0. All looks fine. The FOV for the Oculus Rift CV1 is wider than the DK2’s, and if I really squint up, down, left and right I can JUST see at the extreme edges the innermost parts of the menu bars and edge buttons. Not enough to click them, but they are JUST there… The FOV is indicated here on the OSGrid VRLand test area.


Frames Per Second (FPS)

It is important to try to maintain a good frame rate for comfortable VR. The graphics quality settings and draw distance should be adjusted to make sure the 2D view FPS is high enough so that when you switch to VR Mode, and likely drop to less than 50% of the 2D FPS, it still works smoothly.
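As an illustrative rule of thumb based on that observation (the target VR frame rate here is just an example), the 2D frame rate to aim for can be estimated like this:

    # Illustrative estimate of the 2D FPS needed, given that VR Mode
    # is likely to run at less than 50% of the 2D frame rate.
    target_vr_fps = 45            # example comfortable VR target
    vr_fraction_of_2d = 0.5       # observed drop when switching to VR Mode
    required_2d_fps = target_vr_fps / vr_fraction_of_2d
    print(f"Aim for at least {required_2d_fps:.0f} FPS in the 2D view")   # 90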

Source Code

Firestorm PBR alpha source code:
https://github.com/FirestormViewer/phoenix-firestorm-alpha

Sgeo Firestorm PBR Alpha with SgeoVR Mod code:
https://github.com/Sgeo/phoenix-firestorm-alpha/tree/VR_Sgeo_2024

Latest version of openvr_api.dll can be obtained (Win64 version for this test version) from https://github.com/ValveSoftware/openvr/tree/master/bin/.

The LICENSE file from there should be included (renamed LICENSE-OpenVR) from https://github.com/ValveSoftware/openvr/blob/master/LICENSE

Hippiestock Festival 2024 in Second Life

2023, December 28 - 15:17


Visit the “Hippiestock” Festival in Second Life in January 2024… a lovely watercolour style region themed like the famous Woodstock Festival from 1969. The watercolour trees and leaves are a nice touch. All very trippy. Built by CK, the area will support a number of events, exhibitions and storytelling.

A21 News – Supercar

2023, December 23 - 10:04

There are a number of Supercar themed stories by Andrew Clements under the “A Gerry Anderson A21 News Story” banner. See https://www.gerryanderson.com/tag/supercar/.

Scalextric – Batmobile

2023, December 22 - 17:30

Having bought the Scalextric Legends – Jim Clark Triple Set, I could not resist getting the detailed 1960s style Batmobile model too. The Batmobile was created by George Barris (see this blog post) and based on the 1955 Lincoln Futura concept car.

https://uk.scalextric.com/products/batmobile-1966-tv-series-c4175


Batman originates all the way back to 1939, when he was first introduced in Detective Comics. Since then he has grown to become one of the most famous, if not the most famous, superheroes of all time. Batman made it from the comic book pages onto the TV screen in the 1960s when the Batman TV series first aired, starring Adam West as Batman/Bruce Wayne.

Batman and his companion Robin were crime fighters defending Gotham City, and their mode of transport was the Batmobile. This detailed Scalextric slot car of the 1960s Batmobile captures the shape of this iconic car and includes a Batman figure in the driver’s seat.

Images from Scalextric

Scalextric Legends – Jim Clark

2023, December 21 - 14:42


Having seen some of Jim Clark’s Formula 1 and Rally cars in the flesh a few times and remembering that I used Jim Clark’s F1 race car as my slot racing car in competitions and in the Scalextric club I used to run at my secondary school, back in the 1960s… I could not resist getting the Scalextric Legends Jim Clark Triple Set when it came out in November 2023.

https://uk.scalextric.com/products/legend-jim-clark-triple-pack-c4395a

Born in 1936 in Fife, Scotland, Jim Clark is today remembered as being one of, if not the, greatest natural talents to ever sit behind the wheel of a Formula One car. Born into a farming family, Jimmy began his racing career in local road rally events before graduating to circuit racing in a borrowed DKW. By 1958 he had gained the attention of the local Border Reivers team and was racing a Jaguar D-Type in national events, winning 18 of the races he entered.

In 1958 he raced a Lotus Elite at Brands Hatch, finishing second to Colin Chapman, boss of Lotus, and launching his career with the famous Norfolk brand. By 1960 he was racing for Lotus in Formula One, making his debut at the Dutch GP and winning his first Grand Prix at the 1962 Belgian race at the daunting Spa-Francorchamps. His first World Championship came in 1963 and was followed by another in 1965, both for Lotus.

His exploits, however, were not just confined to F1: he won the Indy 500, raced at Le Mans and, at the wheel of the fantastic Lotus Cortina, won numerous touring car races and a British Saloon Car Championship. Regarded by many to be the most naturally talented driver to ever grace Formula One, Jimmy was tragically killed during a Formula 2 race in Germany in 1968. Today he is remembered as a true great of the sport, a natural talent like no other, and a man incomparable on the track.

For more information on Jim Clark or for information on the Jim Clark Museum in Duns, Scotland please visit www.jimclarktrust.com

glTF – Model Viewers and Validation Tools

2023, December 20 - 16:53

This blog post lists some resources used to view and validate glTF/GLB 3D files.

Khronos Group glTF Test Viewer

https://github.khronos.org/glTF-Sample-Viewer-Release/ [GitHub Source Code]

As a test, a Blender model of Supercar (see this blog post) was exported to glTF (.glb or .gltf) and tested in the Khronos glTF Test Viewer, which is intended to act as a benchmark for such models.
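The export step can be done from the Blender UI (File > Export > glTF 2.0) or scripted via Blender’s Python API; a minimal sketch, with an illustrative file path, is shown below.

    # Minimal sketch: export the current Blender scene as a binary glTF (.glb)
    # for testing in the Khronos glTF Sample Viewer. The path is illustrative.
    import bpy

    bpy.ops.export_scene.gltf(
        filepath="/path/to/supercar.glb",
        export_format='GLB',   # single binary file; 'GLTF_SEPARATE' gives .gltf + .bin + textures
    )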

Khronos Group glTF Validator

Live drag-n-drop tool: https://github.khronos.org/glTF-Validator [GitHub Source Code]

ModelViewer.dev glTF/GLB Viewer with Khronos glTF Validator Built In

Live Drag and Drop Viewer: https://modelviewer.dev/editor [GitHub Source Code]

Again, as a test, a Blender model of Supercar (see this blog post) was exported to glTF (.glb or .gltf) and tested in the ModelViewer Tool, which can also export the model in a form which can be viewed via a web browser. Click here for the 3D view of Supercar.

Hiberworld – Resources

2023, December 13 - 16:19

Hiberworld (https://hiberworld.com) is a virtual world in which users can link Ready Player Me (https://readyplayer.me) avatars and create experiences.

Hiberworld can share equipment and clothing items across multiple virtual worlds through the Ready Player Me platform.

Sledging in Second Life 2023

2023, December 13 - 14:13


Pick up a sledge and take to the Sled Run on Winter Wonderland in Second Life (2023)…
http://maps.secondlife.com/secondlife/Winter%20Wonderland%203/163/71/92

OpenSimulator Community Conference 2023 – OSCC23

2023, December 9 - 12:00



The OpenSimulator Community Conference (OSCC) is one of the longest running virtual conference series, having started in 2013 and run annually since. This is OSCC’s 11th year and the event celebrates 16 years of OpenSimulator as the first commit was January 31, 2007. The OpenSimulator community and Avacon Inc. come together to run the event on the OpenSimulator Community Conference (OpenSimCC) grid – http://cc.opensimulator.org:8005 [LoginURI: http://cc.opensimulator.org:8002]

The main keynote presentations area uses the adjacent corners of 4 sims to provide capacity for up to around 400 attendees.

There are many other regions for avatars, shopping, exhibition booths for presenters, OpenSim community hub, music and dance venues, etc.

Links to my blog posts on earlier OpenSimulator Community Conferences.. https://blog.inf.ed.ac.uk/atate/?s=oscc

Sponsors and Crowdfunders

Day One

Day Two

Ada Radius, Ai Austin and Serie Sumei participated in a panel on the “Max” open source mesh avatar which can be morphed into female and male variants – “Maxine” and “Maxwell”. For more details see https://blog.inf.ed.ac.uk/atate/max-history/.

Second Life – Winter Wonderland 2023

2023, December 6 - 16:17

Annually Second Life hosts “Winter Wonderland” regions with ice skating, sledding, snow scenes and fireworks. This year the environment was built with Physically-Based Rendering “Materials” (PBR) to mark the release of PBR glTF compatible viewers for Second Life (and OpenSim).

http://maps.secondlife.com/secondlife/Winter%20Wonderland%204/184/77/46




Virtual World Vehicle Scripting System

2023, December 6 - 15:07

Cuga Rajal provides the Supercar vehicle Scripting System for Second Life and OpenSimulator. Current version: Supercar Plus 2.0.4, July 24, 2023.

Supercar Plus is a free LSL land vehicle (car) script compatible with OpenSim and Second Life. It supports a wide range of creative options for various car features and the runtime is low-impact on the server. By using a Notecard for settings, vehicles can be updated easily; this also helps when managing a large car collection. The full project is available at https://github.com/cuga-rajal/supercar_plus.

Many popular features are supported, such as driver’s animation, passenger seats, multiple gears/speeds with reverse, rotating wheels, headlights, horns, engine sounds, tank tread motions, and much more. A variety of add-on scripts are included with instructions.

Palia – Resources

2023, November 30 - 18:00


Palia (https://palia.com) is a World of Warcraft style questing game. This blog post provides some resources to help in initial game play.

Palia Avatars

Avatars can be tailored at start up and after a delay (a week?) some outfit items can be altered.


Palia Quests

Palia sets tasks and provides NPCs to interact with to obtain information, tasks and access to tools. Resources can be chopped or mined to allow items to be crafted. A home location can be set up with crafted items at the start of game play.

Palia Friends and Communities

You can link up with friends and create communities within Palia. A referral link is provided to all users so that they can invite friends to join.

A Few Keyboard Controls and Commands

O = Social, P = Skills, M = Map, R = Tools.
Use the mouse scroll wheel to switch between multiple different inventory bars.

Palia Map and Almanac

Find getting-started instructions in the Palia Almanac (PDF document).

Aurelia’s Region in Second Life

2023, November 29 - 21:01

Aurelia’s region in Second Life is a wonderful atmospheric build by JuicyBomb (Gorgeous Aurelia in Second Life, @gogolita on Twitter/X). Built by Sparkely Sugar, she tells me. http://maps.secondlife.com/secondlife/Aurelias/128/128/28

She added a PBR reflection probe on her skating rink to show the effects of PBR/glTF materials which are now enabled in Second Life.

360° Snapshot – Click to view Image on Flickr

Second Life Ice Skating 2023

2023, November 29 - 14:09

Flyte Ice Skating Lake Updated for 2023 by Jenna Dirval and her Team
http://maps.secondlife.com/secondlife/FlyGearZ/7/215/26

As in 2022, Flyte on the FlyGearZ region has a wonderful skating lake set in a moody landscape of snow and ice under a sky with aurora and rainbows. The lake edge has lapping water, ice slabs and boulders. Choose relaxed or trick skating for singles or couples and enjoy the scenery.





For other Winter Sports and Ice Skating experiences in Second Life see
https://blog.inf.ed.ac.uk/atate/second-life-ice-skating-2022/

Supercar Updated Model

2023, November 2 - 10:46


The original Supercar model was created in Cinema 4D by Mick Imrie with support from Austin Tate back in 1998 (see https://www.aiai.ed.ac.uk/~bat/GA/supercar-3d.html) and subsequently converted to a number of other modelling tools by people in the Gerry Anderson Model Makers Alliance (GAMMA). Via conversions it has been used in a range of ways including flight simulators, space simulators, games, virtual worlds, VR experiences, etc.

Over the last couple of years Mick Imrie has also been developing a new model with help from Austin Tate and Shane Pickering. It includes simplified geometry suitable for making various levels of detail and for having variants suited to different platforms like flight simulators and game engines as well as high fidelity mesh models for rendering. See https://blog.inf.ed.ac.uk/atate/2021/02/23/mick-imrie-supercar-take-2/

1998 Model Updated with Lessons Learned

Using some of the lessons learned while studying for the 2021-onwards update by Mick Imrie, Austin Tate has taken the original 1998 model, converted it for Blender and incorporated changes that are compatible with the original shape. It is still a model with high complexity and the basic shape is unaltered from the 1998 model. But it fixes a number of issues with flipped normals, and the fuselage and dashboard top piping is altered to a light brass colour, etc.

Supercar in Unity

Via FBX (.fbx) export from Blender, the Supercar model can be taken into Unity using the standard (built-in) render pipeline or the High Definition Render Pipeline (HDRP).
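The FBX export from Blender can likewise be scripted; a minimal sketch using Blender’s Python API is below (the file path is illustrative, and Unity’s importer handles the rest).

    # Minimal sketch: export the Supercar model from Blender to FBX for import into Unity.
    # The file path is illustrative.
    import bpy

    bpy.ops.export_scene.fbx(
        filepath="/path/to/supercar.fbx",
        apply_scale_options='FBX_SCALE_UNITS',  # helps keep object scale consistent in Unity
    )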

Khronos glTF Test Viewer

As a test, the Blender model was exported to glTF (.glb or .gltf) and tested in the Khronos glTF Test Viewer, which is intended to act as a benchmark for such models.

https://github.khronos.org/glTF-Sample-Viewer-Release/ [GitHub Code]

