marsep, 2015-07-12 06:47:13

Modeling in Unity3D: "sculpting" objects with the Mesh class?

Hello! I have a problem in Unity3D with "sculpting", i.e. modeling objects. I want to procedurally generate, for example, a sphere, and then change its shape in the game itself: when you click on a certain area with the mouse, that area should bulge out or sink in, like when editing a terrain. I have learned how to create simple objects, from quads to more complex shapes, but I don't know how to make them change shape at runtime the way a terrain does. I have already searched everywhere; there are good examples, but they don't cover this topic. Is this solved with the standard methods of Mesh, or do I have to work with the object's geometry directly and write my own methods for raising or lowering a specific area of the object? An additional question: is there any way to get at the Terrain class sources, at least to see how it changes the terrain? Thanks in advance for any help!


1 answer
Daniil Basmanov, 2015-07-13
@marsep

There are no standard tools for modeling; you have to write everything yourself or take a plugin.
You can look at the sources with a decompiler: take dotPeek or .NET Reflector and go ahead. You won't be able to see everything, since Terrain and TerrainData are implemented in the engine's native code and their sources can only be obtained from the developers, but you can get to TerrainInspector and HeightmapPainter; they use raycasts against the terrain collider:

public bool Raycast(out Vector2 uv, out Vector3 pos)
{
    RaycastHit hit;
    Ray ray = HandleUtility.GUIPointToWorldRay(Event.current.mousePosition);
    if (this.m_Terrain.GetComponent<Collider>().Raycast(ray, out hit, float.PositiveInfinity))
    {
        uv = hit.textureCoord;
        pos = hit.point;
        return true;
    }
    uv = Vector2.zero;
    pos = Vector3.zero;
    return false;
}

As far as I understand, the terrain's vertices are not modified directly; it works through height maps: the coordinates obtained from the raycast are used to paint on the height map, and the mesh is then built from that.
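To make that concrete, here is a minimal sketch of a runtime terrain brush built on that idea, using TerrainData.GetHeights/SetHeights; the RaiseArea name and the radius/strength parameters are placeholders of mine, not Unity API:

using UnityEngine;

public static class TerrainBrush
{
    // Raise the height map around a UV coordinate obtained from a raycast
    // (for example the uv returned by the Raycast method above).
    public static void RaiseArea(Terrain terrain, Vector2 uv, int radius, float strength)
    {
        TerrainData data = terrain.terrainData;
        int res = data.heightmapResolution;
        radius = Mathf.Max(1, radius);

        // Convert the UV hit coordinate to height map indices.
        int cx = Mathf.RoundToInt(uv.x * (res - 1));
        int cy = Mathf.RoundToInt(uv.y * (res - 1));

        int xBase = Mathf.Clamp(cx - radius, 0, res - 1);
        int yBase = Mathf.Clamp(cy - radius, 0, res - 1);
        int width = Mathf.Min(2 * radius + 1, res - xBase);
        int height = Mathf.Min(2 * radius + 1, res - yBase);

        // GetHeights/SetHeights take (x, y), but the returned array is indexed [y, x].
        float[,] heights = data.GetHeights(xBase, yBase, width, height);

        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                // Simple linear falloff from the brush center.
                float dx = xBase + x - cx;
                float dy = yBase + y - cy;
                float falloff = Mathf.Clamp01(1f - Mathf.Sqrt(dx * dx + dy * dy) / radius);
                heights[y, x] = Mathf.Clamp01(heights[y, x] + strength * falloff);
            }
        }

        data.SetHeights(xBase, yBase, heights);
    }
}

A negative strength would lower the area instead of raising it.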
In short, there are several ways to implement this. To find the point of application, you can put a MeshCollider on your geometry and raycast against it, or you can raycast against the geometry analytically; in the case of a sphere that saves a lot of computation. To modify the mesh, you can keep your own height map and rebuild the mesh from it from scratch each time, without worrying about the existing geometry. Alternatively, you can write a data structure that can find neighboring vertices; then you can work with the existing mesh directly. In principle, one approach does not rule out the other; a rough sketch of the MeshCollider route follows below.
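As a minimal sketch (my own assumption of how it might look, not code from Unity or any plugin; the MeshSculpt name and the radius/strength fields are made up), a component that raycasts from the mouse and pushes nearby vertices of the clicked mesh along their normals:

using UnityEngine;

// Attach to an object that has a MeshFilter and a MeshCollider, e.g. a generated sphere.
[RequireComponent(typeof(MeshFilter), typeof(MeshCollider))]
public class MeshSculpt : MonoBehaviour
{
    public float radius = 0.5f;    // brush radius in local space
    public float strength = 0.05f; // use a negative value to pull vertices inward

    void Update()
    {
        if (!Input.GetMouseButton(0))
            return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        if (!Physics.Raycast(ray, out hit) || hit.collider.gameObject != gameObject)
            return;

        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices; // returns a copy
        Vector3[] normals = mesh.normals;
        Vector3 localHit = transform.InverseTransformPoint(hit.point);

        for (int i = 0; i < vertices.Length; i++)
        {
            float distance = Vector3.Distance(vertices[i], localHit);
            if (distance > radius)
                continue;

            // Push the vertex along its normal with a linear falloff from the hit point.
            float falloff = 1f - distance / radius;
            vertices[i] += normals[i] * strength * falloff;
        }

        mesh.vertices = vertices;
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();

        // Refresh the collider so the next raycast hits the deformed surface.
        MeshCollider meshCollider = GetComponent<MeshCollider>();
        meshCollider.sharedMesh = null;
        meshCollider.sharedMesh = mesh;
    }
}

Scanning every vertex and rebuilding the collider each frame is the brute-force version; this is exactly where the neighboring-vertices data structure mentioned above pays off.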
