Sally Slade

Introducing the UAD

The year was 2014. The film was [redacted per NDA]. The assignment was to unify the color pipeline across Maya, Nuke, 3ds Max, Photoshop, and RV.


It sounds like a daunting task, but it wasn’t. That’s because there was an incredible, concise, human-readable file format, developed by a community of VFX artists and engineers, that did all of the heavy lifting.


Woah, it’s so easy!

I’m talking about OpenColorIO (OCIO), an open color management standard whose config file was embraced by software developers and pipeline TDs across an entire industry. Across multiple industries!


The file had a simple task: provide a bi-directional color mapping between default linear space and the chosen look of the show. With those two small transforms defined, we could infer scores of other mappings, or even chain mappings together.


This is because every color space on the show provided mappings to and from linear: linear space was the hub of all color activity! Everything passed through it, and you could get anywhere from it.


Grand Central Linear Space Station

The file itself is gorgeous, and contains mappings whose definitions look something like this:

- !<ColorSpace>
  name: sRGB
  from_reference: !<FileTransform> {src: linear_to_sRGB.spi3d, interpolation: linear}
  to_reference: !<FileTransform> {src: sRGB_to_linear.spi3d, interpolation: linear}
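
Under the hood, applications read this config through the OCIO library itself. As a rough sketch of what that looks like from Python (assuming the OCIO v2 bindings, a config saved as config.ocio, and a hypothetical show color space named show_look):

# Minimal sketch using the PyOpenColorIO (OCIO v2) bindings.
# "show_look" is a hypothetical color space standing in for the show's look.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("config.ocio")

# Ask for a transform between any two color spaces in the config; OCIO chains
# sRGB -> reference (linear) -> show_look for us using the mappings above.
processor = config.getProcessor("sRGB", "show_look")
cpu = processor.getDefaultCPUProcessor()

pixel = cpu.applyRGB([0.25, 0.5, 0.75])  # one RGB pixel, now in the show's look
print(pixel)

That one getProcessor() call is the whole point: because every space maps to and from the linear reference, any pair of spaces can be connected without anyone writing a bespoke transform.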

What’s magical about this file is that it provided a framework that developers unanimously agreed upon and adopted into their workflows. From there, larger software companies like Autodesk, Adobe, and Foundry began to tune in as well.



Yeah, So?


So, what if we had a similar solution to sharing Avatars across platforms? What if we called it a Universal Avatar Descriptor (UAD) file?

I know what you’re thinking, but please bear with me.


If this statement is at all triggering, I urge you to look at the problem statement from a higher level: I am not talking about implementing something like Pixar’s Universal Scene Description (USD) file. I am talking about implementing something like OCIO, but for Avatar Makers.


Think about it.


Every time we get our hands on a new Avatar-supported platform, what’s the first thing we do? Implement as many key traits as possible that correspond with our desired identity. Some platforms have a few of those traits, some have all of them, but not every platform has every trait.

Just me porting my virtual identity across the balkanized Web2 Metaverse.


What if we were able to frontload these Avatar platforms with our preferred traits? A tough ask among the walled gardens of Web2, but a highly feasible proposition using Wallets on Web3.


Imagine connecting your Wallet: the platform scrapes your address for an associated UAD file and pre-populates its avatar system with compatible traits.


To participate in this highly interoperable metaverse, all a developer team would need to do is implement a bi-directional definition of avatar traits between their project and the master UAD file. If the master UAD file were missing an important trait, they would simply append it and open a pull request. Subscribed developers would see the upcoming trait and add support (or not) in their own UAD mappings to increase the fidelity of incoming avatars.
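
As a rough sketch of that intake flow in Python (every name here is hypothetical: resolve_uad_url() stands in for whatever wallet or registry lookup a platform actually uses, and the personal UAD is assumed to be published as YAML listing the owner’s preferred trait values):

# Hypothetical intake flow: none of these names are real APIs.
import requests  # assumes the requests and PyYAML packages
import yaml

def resolve_uad_url(wallet_address):
    """Placeholder: look up the UAD file associated with a wallet address."""
    raise NotImplementedError

def prepopulate_avatar(wallet_address, supported_traits):
    """Pre-populate this platform's avatar system with compatible traits."""
    uad = yaml.safe_load(requests.get(resolve_uad_url(wallet_address)).text)
    avatar = {}
    for trait, value in uad.items():
        # Take only the traits this platform supports, with values it recognizes.
        if trait in supported_traits and value in supported_traits[trait]:
            avatar[trait] = value
    return avatar

Anything the platform doesn’t support simply falls away; fidelity degrades gracefully instead of breaking.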


What would it look like?


Well, the master UAD could look something like this:

eye_color:
  - brown
  - blue
  - green
  ..
hair_front:
  - none
  - bangs
  ..
hair_back:
  - bun
  - ponytail
  - pigtails
hair_texture:
  - 1a
  - 2b
  - 4c
hair_color:
  - black
  - brown
  - blonde
  ..
..

And, the bi-directional per-project UADs could look something like this:

eye_color: iris
  - brown: cocoa
  - blue: sky
  - green: sky
  ..
hair_front: bangs
  - none: false
  - bangs: true
  ..
hair_back: has_ponytail
  - bun: false
  - ponytail: true
  - pigtails: true
hair_texture: hair_curls
  - 1a: straight
  - 2b: wavy
  - 4c: coily
hair_color: haircolor
  - black: "#000000"
  - brown: "#482b2b"
  - blonde: "#f2d360"
  ..
..
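
Applying such a mapping is straightforward. Here’s a minimal sketch in Python, with the per-project mapping above mirrored as plain dictionaries (the YAML here is schematic, so it isn’t parsed directly):

# The per-project mapping above, mirrored as
# {master_trait: (local_name, {master_value: local_value})}.
PROJECT_UAD = {
    "eye_color":    ("iris",         {"brown": "cocoa", "blue": "sky", "green": "sky"}),
    "hair_front":   ("bangs",        {"none": False, "bangs": True}),
    "hair_back":    ("has_ponytail", {"bun": False, "ponytail": True, "pigtails": True}),
    "hair_texture": ("hair_curls",   {"1a": "straight", "2b": "wavy", "4c": "coily"}),
    "hair_color":   ("haircolor",    {"black": "#000000", "brown": "#482b2b", "blonde": "#f2d360"}),
}

def to_project(master_traits):
    """Translate master-UAD trait values into this project's local vocabulary."""
    local = {}
    for trait, value in master_traits.items():
        if trait in PROJECT_UAD:
            local_name, values = PROJECT_UAD[trait]
            if value in values:
                local[local_name] = values[value]
    return local

print(to_project({"eye_color": "green", "hair_back": "ponytail", "hair_color": "blonde"}))
# -> {'iris': 'sky', 'has_ponytail': True, 'haircolor': '#f2d360'}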


Give me a practical example


Say Cosmic Cowgirls is making a social space. If they want guests from Citizens of Killtopia to appear rendered in the Cosmic Cowgirl style, they will need some kind of simple solution like this. Via pre-existing bi-directional UAD mapping files to the master UAD, each project could approximate what the other’s avatars should look like.
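
To sketch how that routing could work in Python (both mapping tables below are hypothetical stand-ins for each project’s real UAD file), a guest avatar travels from one project to the other through the master UAD, exactly the way color travels through linear:

# Hypothetical fragments of each project's bi-directional UAD mapping.
KILLTOPIA_UAD = {"hair_color": ("dye",  {"black": "void",     "blonde": "neon_gold"})}
COWGIRLS_UAD  = {"hair_color": ("mane", {"black": "midnight", "blonde": "palomino"})}

def to_master(project_uad, local_traits):
    """Invert a project's mapping: local trait values back into master-UAD values."""
    master = {}
    for trait, (local_name, values) in project_uad.items():
        inverse = {v: k for k, v in values.items()}
        if local_name in local_traits and local_traits[local_name] in inverse:
            master[trait] = inverse[local_traits[local_name]]
    return master

def from_master(project_uad, master_traits):
    """The forward direction: master-UAD values into a project's local vocabulary."""
    local = {}
    for trait, (local_name, values) in project_uad.items():
        if trait in master_traits and master_traits[trait] in values:
            local[local_name] = values[master_traits[trait]]
    return local

# A Citizens of Killtopia guest walks into the Cosmic Cowgirls space:
guest = {"dye": "neon_gold"}
print(from_master(COWGIRLS_UAD, to_master(KILLTOPIA_UAD, guest)))
# -> {'mane': 'palomino'}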



Perhaps they’ll even make custom mappings between their two projects for the highest possible fidelity upon avatar creation.


They’ll then be able to create newly derived avatars using the very avatar-makers leveraged to create the original collections in the first place.


What if these projects never used a UAD-compliant avatar maker? What if they simply randomized PNG layers? Not a problem! Our solution comes with an open-source PNG-based avatar maker GUI which makes UAD mappings a straightforward task for any project member, engineer or no.


Why the focus on PNGs, when the metaverse is implicitly a 3D platform? Well, it’s easier to collect our thoughts in 2D; we are currently a one-woman team with expertise in 2D rigging.


If and when our idea catches on, it will be easy enough to comb Maya, Max, and Blender scene files for various avatar blendshapes and materials in the same way that we walk directory structures for PNGs. We’re not afraid!
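
For the PNG case, that directory walk could be as small as this sketch, assuming a hypothetical traits/<trait_name>/<value>.png layout:

# Draft a project UAD by walking a PNG layer library.
# Assumes a hypothetical traits/<trait_name>/<value>.png directory layout.
from pathlib import Path

def draft_uad(layer_root):
    uad = {}
    for trait_dir in sorted(Path(layer_root).iterdir()):
        if trait_dir.is_dir():
            uad[trait_dir.name] = sorted(p.stem for p in trait_dir.glob("*.png"))
    return uad

# e.g. draft_uad("traits") -> {"hair_back": ["bun", "pigtails", "ponytail"], ...}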



I want it now!

LayerCake will be our first open-source application for making 2D Avatars with corresponding UADs


We are currently working on two projects simultaneously that will use this workflow to talk to each other. We’ll also be creating bi-directional UAD mapping files for several avatar-centric projects we’re on friendly terms with*, to kick-start interest in our vision.


*If you or your project would like to work with us, please reach out to hello@voltaku.com!



Additional Use Cases


Beyond metaverse interoperability, a few other use cases come to mind.



Shopping


Imagine if you could tag artwork or photography with UAD mappings: you could filter entire platforms and marketplaces. I’m thinking of platforms like OpenSea, Pixiv, Society6, etc.



Socializing

You could filter embodied spaces for avatars you might be “twinning” with, and engage them in a conversation about your shared aesthetic.



Breeding


If Corgis have taught us anything, it’s that offspring will simply be tiny Corgis with new shader maps.

Likewise, if two Avatars wished to “procreate”, it’s reasonable to presume their offspring would wholly retain the underlying model of one parent, while taking on key traits from the other parent. This is easily achievable with UADs.
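
As a toy sketch of what that could mean in UAD terms (the 50/50 inheritance rule below is purely an assumption):

# Toy "breeding": keep parent A's underlying model and full trait set, then let
# a few key traits come from parent B. The 50/50 coin flip is an assumption.
import random

def breed(parent_a, parent_b, key_traits):
    child = dict(parent_a)
    for trait in key_traits:
        if trait in parent_b and random.random() < 0.5:
            child[trait] = parent_b[trait]
    return child

child = breed(
    {"eye_color": "brown", "hair_color": "black", "hair_back": "bun"},
    {"eye_color": "green", "hair_color": "blonde", "hair_back": "ponytail"},
    key_traits=["eye_color", "hair_color"],
)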


Is this a bridge too far? Woof!



Transmutation


You could create a project whose sole purpose is to see existing projects’ avatars in a new light. For example, your project may be uniquely designed to reimagine CryptoPunks as My Little Ponies! Perhaps it reimagines sandwiches as celebrities. Using arbitrary mappings, the possibilities are endless.



