Creating 'Sculpties' for Second Life

 From:  Cindy
597.1 
Hi Michael,
I'm looking for an easy way to create 'Sculpties' for Second Life. They are in beta now.
Simplified, a Sculptie is like a sphere with poles (a geosphere?), modified by a special texture
(so the sculpt-map texture _is_ the object).
The texture's (R,G,B) values are used as vertex coordinates (X,Y,Z). The order of the pixels is the order in which
the triangles are created.
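Just to make the encoding concrete, here is a minimal sketch of the idea (my own illustration, not the actual Second Life code), assuming x, y and z have already been normalized to the 0..1 range:

#include <algorithm>

struct Pixel { unsigned char r, g, b; };

// Quantize one coordinate from the 0..1 range to an 8-bit channel value.
static unsigned char Quantize( double t )
{
    t = std::min( 1.0, std::max( 0.0, t ) );  // clamp to 0..1
    return static_cast<unsigned char>( t * 255.0 + 0.5 );
}

// One mesh vertex -> one sculpt-map pixel: x,y,z become R,G,B.
static Pixel EncodeVertex( double x, double y, double z )
{
    Pixel p = { Quantize( x ), Quantize( y ), Quantize( z ) };
    return p;
}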

Some URLs with more details about this:
http://blog.secondlife.com/2007/04/27/the-advent-of-an-illustrious-age-of-sculpted-prims/
https://wiki.secondlife.com/wiki/Sculpted_Prim_Explanation
http://www.secondlife.com/ (just in case you don't know it)

You see, this kind of mapping is new; there is no 3D modeling application with built-in support for this format.
Solutions so far are: a MEL script for Maya, and a way to export from Blender (both use texture baking for this)
... and my (very poor) obj2sculpt program (not released, because it's almost useless).
I spent many hours searching for a (free) NURBS modeler that can be used for this work.
Your MoI looks perfect, because most Second Life citizens are not expert 3D artists,
so we need something easy to use.

Now, the problem is:
I export a simple NURBS object to .obj.
I can change the number of vertices in the output file, but it's trial and error (fewer polygons / more polygons).
My obj2sculpt program tries to build a sculpt map from this. It would be easy if I got a certain number of vertices,
like 32 x 32 (this would map directly to the sculpt map).
In fact, the default NURBS sphere in Blender gives me exactly that (but Blender's NURBS tools are no fun).

What can I do? I wish there were a 'magic algorithm' to rebuild the mesh at a certain 'size'.
Or should I use texture baking too?

Second Life has more than 6 million members, and MoI could be the number one program for Sculptie creation :)

Thank you for reading all this.

 From:  Frenchy Pilou (PILOU)
597.2 In reply to 597.1 
Just a detail: MoI will not be a "free" program :)
---
Pilou
Is beautiful that please without concept!
My Gallery

 From:  Cindy
597.3 In reply to 597.2 

Well, the current beta version is free.
I hope the 'final' version will be a bit cheaper than Rhino :)
Some Second Life members are discussing whether Maya is worth
buying to create Sculpties... It can't get more expensive
(and overkill) than that.


 From:  Frenchy Pilou (PILOU)
597.4 In reply to 597.3 

Sure, betas are free :)
Afterwards, it seems MoI will be priced at maybe something like Rhino / 5.
Only Michael knows this sort of thing :)

---
Pilou
Is beautiful that please without concept!
My Gallery

 From:  Michael Gibson
597.5 In reply to 597.3 
Hi Cindy,

> I hope the 'final' version will be a bit cheaper than Rhino :)

Yes, as Pilou mentions, version 1.0 will be right around 20% of the price of Rhino, so it certainly fits this aspect.

But MoI just isn't focused on restricting the mesh output to fit this particular type of constraint; it's focused on general-purpose arbitrary geometry. I'm also trying pretty hard not to add too many new things right now, since I'm really trying to wrap things up for the 1.0 release...


> What can I do? I wish there were a 'magic algorithm' to rebuild the mesh
> at a certain 'size'. Or should I use texture baking too?

Hmmm, I think I may be able to give you this "Magic Algorithm".

It would be difficult to re-build just any mesh to a certain MxN grid size, but there is one extra piece of information that MoI saves out to .obj files which I think you can leverage to do it more easily: UV coordinates at each mesh vertex.

When MoI generates a mesh for a surface, it will create 3D x,y,z point locations for each polygon vertex, but it will also create 2D u,v texture coordinates for each polygon vertex (these are the vt entries in the .obj file). Those u,v coordinates come from the NURBS surface.

So say you want to create a 16x16 node grid. What you'll do is create a UV coordinate for each of these nodes within the UV space, which in this case is between 0.0 and 1.0. So for instance you would get 2D coordinates at 0, 1/16, 2/16, 3/16, 4/16, ... through to 1.0 (strictly speaking that's 17 samples in each direction, covering 16 intervals).

You now have a UV coordinate for each grid node. The next step is to convert each one into a 3D coordinate.
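For example, generating those grid UVs is just a double loop (a quick sketch, using 16 steps as above):

const int N = 16;                       // 16 steps -> 17 nodes per direction
for ( int j = 0; j <= N; j++ )
{
    for ( int i = 0; i <= N; i++ )
    {
        double u = (double)i / N;
        double v = (double)j / N;
        // ... convert this (u,v) into a 3D point, as described next ...
    }
}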

To do the UV->3D conversion, you're going to find which triangle contains this point. Each triangle vertex has a UV coordinate (those vt points in the .obj file) in addition to the 3D coordinates. For the moment you just consider the triangle to be a 2D triangle using the UV vertices. Test each triangle to see if your UV point is inside of it, until you find the triangle that contains it.

If there is no triangle that contains it, it means you had a trimmed surface. I should mention that this algorithm only works under certain conditions:

- Your object should be made up of just one single surface, not multiple surfaces. (A cube normally has 6 surfaces in it, so that won't work - you would have to create a special cube that is made up of just one surface instead.)
- The surface should not be trimmed, meaning it should not have been run through booleans or those types of processes. That's because you want the UV space to be completely covered by the triangulation, which is only the case for untrimmed surfaces.
- Another constraint: to make things a bit easier to process, you probably want the .obj file to be saved as triangles only, instead of using N-Gons.

Anyway, once you find which triangle contains your UV point, you want to calculate the barycentric coordinates of your UV point. The barycentric coordinates basically express the point as a proportional weighting (or averaging) of the 3 triangle points. Once you know these proportions, you can apply the same proportional blending to the 3D points of the triangle to generate an equivalent 3D point inside the 3D version of the triangle. That's your 3D point for that node.
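Here is a rough sketch of that UV -> 3D lookup (my own illustration - the little Vec2/Vec3/Tri types are made up for the example, and the triangle list is assumed to come from the .obj's v and vt entries). SampleUV is what the grid loop above would call for each (u,v) node:

#include <cmath>
#include <vector>

struct Vec2 { double u, v; };
struct Vec3 { double x, y, z; };
struct Tri  { Vec2 uv[3]; Vec3 p[3]; };   // per-vertex UV and 3D position

// Barycentric coordinates (w0,w1,w2) of point q in the 2D triangle (a,b,c).
// Returns false for a degenerate (zero-area) triangle.
static bool Barycentric( const Vec2& a, const Vec2& b, const Vec2& c, const Vec2& q,
                         double& w0, double& w1, double& w2 )
{
    double det = ( b.u - a.u ) * ( c.v - a.v ) - ( c.u - a.u ) * ( b.v - a.v );
    if ( std::fabs( det ) < 1e-12 ) return false;
    w1 = ( ( q.u - a.u ) * ( c.v - a.v ) - ( c.u - a.u ) * ( q.v - a.v ) ) / det;
    w2 = ( ( b.u - a.u ) * ( q.v - a.v ) - ( q.u - a.u ) * ( b.v - a.v ) ) / det;
    w0 = 1.0 - w1 - w2;
    return true;
}

// Find the triangle whose UV footprint contains (u,v) and blend its 3D
// corners with the same barycentric weights. Returns false if no triangle
// contains the point (e.g. a trimmed surface).
static bool SampleUV( const std::vector<Tri>& tris, double u, double v, Vec3& out )
{
    Vec2 q = { u, v };
    for ( size_t t = 0; t < tris.size(); t++ )
    {
        double w0, w1, w2;
        if ( !Barycentric( tris[t].uv[0], tris[t].uv[1], tris[t].uv[2], q, w0, w1, w2 ) )
            continue;
        if ( w0 >= -1e-9 && w1 >= -1e-9 && w2 >= -1e-9 )  // inside or on an edge
        {
            out.x = w0 * tris[t].p[0].x + w1 * tris[t].p[1].x + w2 * tris[t].p[2].x;
            out.y = w0 * tris[t].p[0].y + w1 * tris[t].p[1].y + w2 * tris[t].p[2].y;
            out.z = w0 * tris[t].p[0].z + w1 * tris[t].p[1].z + w2 * tris[t].p[2].z;
            return true;
        }
    }
    return false;
}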

This will basically sample some points on existing triangles to create your regular grid. Of course if you don't use very many points it will be pretty rough and jagged in many spots...

Also, I don't think you want the .obj file that you're processing to have a very low polygon count, because that will tend to increase sampling error. You want the base mesh to be at least normal density, or maybe actually a fair bit higher density; that will make your final re-sample closer to the original surface. I mention this because it might be tempting to try to aid the final reduction by starting with a coarse mesh, but you would really just be adding additional error that way.


Anyway I think it would work. Let me know if you need any more clarification on the algorithm or more details on any of the steps.

- Michael

 From:  Michael Gibson
597.6 In reply to 597.5 
The other possibility to generate a regular grid would be to do it directly from the NURBS surface.

This would mean instead of reading the .obj file, you would use the OpenNURBS toolkit to read the NURBS surface data from the .3dm file directly, and then evaluate uv points on the NURBS surface.

The procedure I describe above is basically an approximation of evaluating the NURBS surface, using the triangulated data instead of using the surface directly.

- Michael

 From:  Cindy
597.7 In reply to 597.6 
Wow, thank you, this should be enough to get me started (my programming skills are not the best).
Using the UV data like this... I have to picture it a few times to really understand it.
Reading the data with OpenNURBS should give the best results, so I will see
if I manage to implement that.

Thanks again :)

 From:  Michael Gibson
597.8 In reply to 597.7 
The OpenNURBS library does provide the code for evaluating the 3D point location of a surface at a particular u,v NURBS parameter location. So that method would actually probably involve writing less new code on your part.

The tricky part may just be navigating through the structure of a NURBS model if you are not familiar with that.

Here is a quick overview that should jump start you.

First, you need to download the OpenNURBS toolkit; this is a library that lets you read the contents of .3dm files. It's at http://opennurbs.org

There is an example project in there called example_read that shows the basic steps to open a .3dm file and read it into memory. Basically you open a file, then define an ON_BinaryFile object that takes the file pointer, and then there is kind of a high-level helper class called ONX_Model that sort of represents the entire model file contents, which can be populated by a call to ONX_Model::Read().

ONX_Model has a member called m_object_table; this is an array that holds all the objects that were sucked out of the .3dm file. Each entry in the array is an ONX_Model_Object.

There are various types of objects, like a curve object, a brep object, etc... - you'll want to look for a brep object. I guess you'll probably expect to have just a single surface saved in the file.

ONX_Model_Object has a member m_object, which is a pointer to an ON_Object - the generic base class for the various object types.

To test if the object is a brep, you do something like this:

const ON_Brep* pBrep = ON_Brep::Cast( m_object );
if ( pBrep != NULL )
{
    // It's a brep, use it.
}

Basically that tests if the generic object is actually a brep object and then casts it to a brep pointer so now you have a brep to work with.

If you're assuming that the brep is made up of just a single surface, you can access that surface by calling

const ON_Surface* pSrf = pBrep->m_S[0]; // Grab the first surface of the brep.

Now that you have the surface, you can evaluate points on it. First you need to get the boundaries of the surface's UV space. Surfaces don't necessarily have UVs that start at 0 and go to 1; they can be over any range. So to grab the ranges:

double UMin, UMax, VMin, VMax;
pSrf->GetDomain( 0, &UMin, &UMax );
pSrf->GetDomain( 1, &VMin, &VMax );


Now to evaluate points - you'll step between those ranges. So for example if you want to evaluate 16 steps along the surface, you will go in sixteenth increments between UMin and UMax for the u coordinate, same for v. So that gives you numbers for a u,v point.


Now you can get the 3D point for this UV parameter value by calling:

ON_3dPoint pt = pSrf->PointAt( u, v );

And that's your x,y,z point for that one spot.

Note that I just wrote this up quickly and didn't test anything, so there may be typos above. But those are the basic steps.
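To show how those pieces fit together, here is one rough, untested sketch that strings the steps into a little command-line program (same caveats as above - it assumes the file holds a single untrimmed surface, and it evaluates a 17x17 grid of points on it):

#include <cstdio>
#include "opennurbs.h"

int main( int argc, char** argv )
{
    if ( argc < 2 ) return 1;

    ON::Begin();                                  // initialize the library

    FILE* fp = ON::OpenFile( argv[1], "rb" );
    if ( fp == NULL ) return 1;
    ON_BinaryFile archive( ON::read3dm, fp );

    ONX_Model model;
    bool ok = model.Read( archive );              // pull the whole file into memory
    ON::CloseFile( fp );
    if ( !ok ) return 1;

    // Find the first brep object in the file.
    const ON_Brep* pBrep = NULL;
    for ( int i = 0; i < model.m_object_table.Count() && pBrep == NULL; i++ )
        pBrep = ON_Brep::Cast( model.m_object_table[i].m_object );
    if ( pBrep == NULL || pBrep->m_S.Count() == 0 ) return 1;

    const ON_Surface* pSrf = pBrep->m_S[0];       // assume a single-surface brep

    double UMin, UMax, VMin, VMax;
    pSrf->GetDomain( 0, &UMin, &UMax );
    pSrf->GetDomain( 1, &VMin, &VMax );

    const int N = 16;                             // 16 steps -> 17 samples per side
    for ( int j = 0; j <= N; j++ )
    {
        double v = VMin + ( VMax - VMin ) * j / N;
        for ( int i = 0; i <= N; i++ )
        {
            double u = UMin + ( UMax - UMin ) * i / N;
            ON_3dPoint pt = pSrf->PointAt( u, v );
            printf( "%g %g %g\n", pt.x, pt.y, pt.z );
        }
    }

    ON::End();
    return 0;
}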

If you give it a try, I can probably help you out if you get stuck on something.

- Michael

 From:  Cindy
597.9 In reply to 597.8 

Now you have written almost the complete program for me :)
I already got the first result: I wrote the vertices to a .obj file and fed it to my old obj2sculpt program
(it's C#, so I can't simply copy and paste it into the C++ program).
The result looks good. I guess it works.
Now it's time for more testing, some cleanup, and making it easy to use.

/me bows to the Master of 3D Programming


 From:  Michael Gibson
597.10 In reply to 597.9 
Wow, that was quick, Cindy - that's really cool!

Is there any way to create a file that represents an assembly of multiple 'Sculpties' that are positioned in relation to one another?

I guess right now if you wanted to make a more complex object made out of multiple untrimmed NURBS surfaces, it would be kind of difficult because each different sculpt map that you create probably represents an object centered around its own local origin. So I guess you would have to manually reposition each surface into its proper relative place later on.

- Michael

 From:  Cindy
597.11 In reply to 597.10 
It should be possible, using LSL (the Second Life scripting language).
I still have some testing and bug-fixing to do to get the best possible result.
At the moment, the origin of my Sculpties is off center; this will be next.
My program exports only a single object now, but support for multiple
objects would be a nice feature - and not too hard to do.
I think the perfect Sculptie export would do these extra steps (a small sketch of the scaling idea follows below):
- Scale all axes to the maximum, prior to the rounding.
(for best resolution, 8 bits is not much)
- Create an LSL script to scale the object back
...and maybe align multiple objects to fit together...
Well, a programmer's work is never done. ;)
But first, I'll try to solve the off-center problem.
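Roughly, the scaling step could look something like this (just a sketch of the idea, with a made-up Vec3 type; the per-axis sizes it returns are what the LSL script would need in order to scale the object back):

#include <vector>

struct Vec3 { double x, y, z; };

// Remap the points in place so every axis spans the full 0..1 range before
// the 8-bit rounding. Returns the original per-axis sizes (the scale the
// in-world script has to restore).
static Vec3 NormalizeToUnitBox( std::vector<Vec3>& pts )
{
    Vec3 zero = { 0.0, 0.0, 0.0 };
    if ( pts.empty() ) return zero;

    Vec3 lo = pts[0], hi = pts[0];
    for ( size_t i = 1; i < pts.size(); i++ )
    {
        if ( pts[i].x < lo.x ) lo.x = pts[i].x;
        if ( pts[i].x > hi.x ) hi.x = pts[i].x;
        if ( pts[i].y < lo.y ) lo.y = pts[i].y;
        if ( pts[i].y > hi.y ) hi.y = pts[i].y;
        if ( pts[i].z < lo.z ) lo.z = pts[i].z;
        if ( pts[i].z > hi.z ) hi.z = pts[i].z;
    }
    Vec3 size = { hi.x - lo.x, hi.y - lo.y, hi.z - lo.z };
    for ( size_t i = 0; i < pts.size(); i++ )
    {
        pts[i].x = ( size.x > 0.0 ) ? ( pts[i].x - lo.x ) / size.x : 0.5;
        pts[i].y = ( size.y > 0.0 ) ? ( pts[i].y - lo.y ) / size.y : 0.5;
        pts[i].z = ( size.z > 0.0 ) ? ( pts[i].z - lo.z ) / size.z : 0.5;
    }
    return size;
}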

 From:  Michael Gibson
597.12 In reply to 597.11 
> Well, a programmer's work is never done. ;)

I know that feeling! :)

The nice thing about your stand-alone tool is that it can be used with a few different programs - MoI, Rhino, I think the Ayam modeler can save a NURBS surface in a .3dm file, and possibly some other CAD programs like SolidWorks...

One other idea - probably right now you are doing that even stepping in UV space - that's a type of "uniform" tessellation.

That can be a problem if the surface has different behavior in different areas. Like if the surface is fairly flat in one half of it, but wiggly in the other half, you're still going to place a bunch of points in the flat area where it doesn't really need the points.

A different style, "adaptive" tessellation, would do something like subdivide UV space in half by placing one point in the middle, then analyze each of the remaining halves: if one of them is pretty flat, it would not add any more points in that half, and would instead continue adding more in the bendy part.

That would be a way to try and optimize the point placement.

It's a bit tricky, but I could help you with it a bit at some point if it seems like it would improve the quality. It may not make much of a difference if people are doing mostly fairly evenly curved things. In fact one problem is if things are really evenly curved maybe the uniform way is actually a bit better spacing than adaptive...
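Just to sketch the idea (untested, and only along one direction to keep it short - it assumes the same OpenNURBS ON_Surface as in the earlier posts): split an interval only where the surface midpoint strays from the straight chord between the endpoints by more than some tolerance.

#include <vector>
#include "opennurbs.h"

// Recursively refine the u samples between u0 and u1 (at a fixed v).
// An interval is split only if the surface midpoint strays from the
// straight chord between the endpoints by more than "tolerance".
// The caller seeds u_values with u0 before the first call.
static void AdaptiveSampleU( const ON_Surface* srf, double v,
                             double u0, double u1,
                             double tolerance, int max_depth,
                             std::vector<double>& u_values )
{
    double um = 0.5 * ( u0 + u1 );
    ON_3dPoint p0 = srf->PointAt( u0, v );
    ON_3dPoint p1 = srf->PointAt( u1, v );
    ON_3dPoint pm = srf->PointAt( um, v );
    ON_3dPoint chord_mid( 0.5 * ( p0.x + p1.x ),
                          0.5 * ( p0.y + p1.y ),
                          0.5 * ( p0.z + p1.z ) );

    if ( max_depth <= 0 || pm.DistanceTo( chord_mid ) <= tolerance )
    {
        // Flat enough here - no extra points, just close off the interval.
        u_values.push_back( u1 );
        return;
    }

    // Bendy - keep the midpoint and refine each half independently.
    AdaptiveSampleU( srf, v, u0, um, tolerance, max_depth - 1, u_values );
    AdaptiveSampleU( srf, v, um, u1, tolerance, max_depth - 1, u_values );
}

The real thing would split UV rectangles in both directions with the same kind of flatness test, but this shows the basic recursion.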

- Michael