The talking heads are talking about interoperability. They’re saying how you’ll be able to take your digital assets from one platform to another, seamlessly, and into infinity.
You’ll be able to give your kids your epic loot. You’ll walk seamlessly from one virtual world to the next without having to change identity. The talking heads talk A LOT about how they’re going to solve the issue and how they’re developing the tech… but it’s not ready yet!
If the talking heads would shut up for a few hours and actually use the metaverse, they’d realize the people here are already doing this. Even now, your great-grandchildren will be able to play Pokémon Neon Ultra Green using monsters from your Pokémon Home. We pass down our epic loot by creating new games, experiences, and worlds for the people of the future to explore. We create the infinite with less scarcity and mindful curation. That’s a little too fluffy, though.
In the last decade, computer people have hammered out a lot of standards that didn’t exist in Web1 or Web2 days. Just because your items can’t be accessed in two clicks doesn’t mean you can’t port your stuff from one world to the next. As long as computers keep working how they do, the assets created by the people of the metaverse can already work on multiple platforms out of the box, and with some learning and tweaking, those assets can be used anywhere else… into infinity.
To speak concretely, the VRM format has been in development since 2018. Exploring various crypto-based virtual worlds, I’m finding these teams are just NOW discovering it and scrambling to export the actual 3D models they used for their renders. That’s a much heavier weight (and wait) of content generation.
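Part of why VRM travels so well is that it’s built on glTF 2.0: a .vrm file is a standard glTF binary carrying a VRM extension. You can see that for yourself with a few lines of stock Python. This is a minimal sketch, assuming a well-formed .glb/.vrm container; it reads the binary header and reports which extensions the file declares:

```python
import json
import struct

def declared_extensions(data: bytes) -> list[str]:
    """Read a glTF binary (.glb/.vrm) and return the extensions it declares."""
    magic, _version, _length = struct.unpack_from("<4sII", data, 0)
    if magic != b"glTF":
        raise ValueError("not a glTF binary container")
    chunk_len, chunk_type = struct.unpack_from("<I4s", data, 12)
    if chunk_type != b"JSON":
        raise ValueError("first chunk should be the JSON chunk")
    gltf = json.loads(data[20:20 + chunk_len])
    # VRM 0.x avatars declare "VRM"; VRM 1.0 declares "VRMC_vrm".
    return gltf.get("extensionsUsed", [])
```

Feed it the bytes of any exported avatar and you should see "VRM" (or "VRMC_vrm") sitting next to ordinary glTF extensions, which is the whole point: the container is the same one every glTF-aware platform already reads.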
I am not a Python or Blender pro, but I know my way around a script and can sling a mean prim. Writing the code to iterate through showing and hiding the various collections in your hierarchy and rendering a 4K image? Sure. Now add exporting the mesh, adding the rig, adjusting the weights, and (lol lmao) the textures and materials… and why spend the time on these high-def, shiny images when you have no control over how your avatar is going to look in any given renderer?
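The render half really is that short. Here’s a minimal sketch, assuming your variants live in top-level collections under the scene (one per outfit, say) and that this runs inside Blender, the only place the bpy module exists; the pure helper at the top just plans which collections get hidden for each shot:

```python
def render_plan(names):
    """For each collection, list the others that must be hidden for its shot."""
    return [(shown, [n for n in names if n != shown]) for shown in names]

def batch_render(output_dir="//renders/", width=3840, height=2160):
    import bpy  # Blender's embedded Python API; only importable inside Blender

    scene = bpy.context.scene
    scene.render.resolution_x = width
    scene.render.resolution_y = height

    top = list(bpy.context.view_layer.layer_collection.children)
    for shown in top:
        for col in top:
            # Excluding a collection from the view layer hides it from the render.
            col.exclude = (col is not shown)
        scene.render.filepath = f"{output_dir}{shown.name}.png"
        bpy.ops.render.render(write_still=True)
```

Everything after this, the mesh export, rigging, weights, and materials, is where the real work lives, which is exactly the argument: renders are cheap, portable models are not.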
VRM is limited to the MToon and standard Unity shaders. MToon doesn’t have a ton of features yet, and it really doesn’t have the ability to display the kinds of textures used in a render. Granted, MToon 1.0 adds several additional features, but that’s for another post. While some platforms have support for special shader effects, it’s not feasible to think all platforms can support all shaders.
I have a feeling avatar-based NFTs will only be used by the clueless, like hexagon profile pictures. Why pick and pay for a random avatar you ‘like’ instead of making one yourself, for as much as you’re willing to spend, down to zero dollars? Oh, you want the clout that comes with having something expensive and exclusive? Then commission one of the more popular model makers. You’ll get a real product, in your digital hands, that you own forever and can edit and remix without having to pay to make changes.
Right now, you can purchase an avatar you like from Gumroad or Booth.
You can install Blender, VRoid Studio, Homonclusse, Cecil Transformation, Mixamo, Paint 3D, or whatever else, and make your own from scratch.
You can commission an artist to make custom textures and adjust the mesh.
You can upload that custom avatar to VRChat and ChilloutVR.
You can do a few tweaks in Unity to make it a VRM.
That VRM can go into The Seed Online, cluster, VRoid Hub, and a whole list of other apps.
A few more tweaks and your model can go in Vircadia/HiFi based worlds.
Do some Blender work and upload it to Second Life.
Of course, there will always be platforms where you cannot bring your own avatar and have to work with in-game assets. I don’t think that’s ever been an issue for metaverse citizens as long as there is a proper variety of assets. When a platform has varied and inclusive avatar pieces, I have a lot of fun trying to create a look that is still me. I think focusing on inclusive avatar elements is more important, especially in hairstyles.
The people already in the metaverse understand that sometimes the mechanics of a certain platform require a certain avatar and that’s okay. Metaverse people with a strong connection to their avatar have already created versions of their model in different looks: chibi, low poly, 2D, nekomimi, Minecraft…
I think the tl;dr is we already have interoperability but the talking heads haven’t spent enough time in-world to learn what the people of the metaverse can already do.