I've been writing a little script to generate random asteroid/crystal sort of shapes for use in a Unity game. The basic idea is quite simple: there is a top vertex, and a bottom vertex, and then a ring of vertices around the middle, with triangles linking them to the top and bottom. (The whole thing is then wrapped in a sphere collider.)

Now, I am new to scripting meshes in Unity (i.e. I started doing this today), but it doesn't look very hard, and the code I have does work - apart from a maddening issue with some of the lower triangles, as you can see in the picture above. This has been annoying me severely. It looks like some are missing, but they aren't - they're just facing the other way. But only ever on the lower half of the shape.
It's probably something extremely obvious, but those are always the hardest things to spot. I have a feeling it is something to do with the way I am adding triangles to the mesh.triangles array, but I have no idea what. I've tried setting the normals for all vertices, both automatically and manually, and neither makes any difference; I just can't see how the normals for the triangles themselves are set.
This is my code, by the way:
function GenerateNewMesh () : boolean {
    var mesh : Mesh = GetComponent(MeshFilter).mesh;
    mesh.Clear();

    // Random vertex count: indices 0 and 1 are the apexes, the rest form the edge ring
    var vertexCount : int = Mathf.Ceil(Random.value * 10) + 8;
    var vertices = new Vector3[vertexCount];
    var normals = new Vector3[vertexCount];
    var uv = new Vector2[vertexCount];

    // Top and bottom apexes, each jittered slightly off the pole
    vertices[0] = Vector3.up + Random.insideUnitSphere * 0.25;
    uv[0] = Vector2(0.5, 0.5);
    vertices[1] = Vector3.down + Random.insideUnitSphere * 0.25;
    uv[1] = Vector2(0.5, 0.5);

    var triangles = new Array();
    var angleStep = 2 * Mathf.PI / (parseFloat(vertexCount) - 2);

    // Generate the randomly placed ring of edge vertices
    for (var f = 2; f < vertexCount; f++) {
        var angle = (f - 2) * angleStep + (angleStep * (Random.value * 0.4 - 0.2));
        uv[f] = Vector2(Mathf.Sin(angle), Mathf.Cos(angle)) * (Random.value * 0.2 + 0.8);
        vertices[f] = Vector3(uv[f].x, Random.value * 0.4 - 0.2, uv[f].y);
        // print("angle=" + angle + ", uv=" + uv[f] + ", vertex=" + vertices[f]);

        // Link this edge vertex and its neighbour to the top (0) and bottom (1) apexes
        var neighbour = f + 1;
        if (neighbour == vertexCount) neighbour = 2;
        triangles = triangles.Concat(new Array(0, f, neighbour, 1, f, neighbour));
    }

    // Point each vertex normal away from the centre of the shape
    for (f = 0; f < vertexCount; f++) {
        normals[f] = vertices[f].normalized;
    }

    mesh.vertices = vertices;
    mesh.uv = uv;
    mesh.triangles = triangles.ToBuiltin(int);
    mesh.normals = normals;
    // mesh.RecalculateNormals();
    mesh.Optimize();
    return true;
}
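From what I've read, the vertex normals only affect shading; which side of a triangle actually gets drawn is decided by the winding order of its three indices, with Unity treating clockwise (as seen from the front) as the front face. If that's right, then the bottom fan probably needs its indices in the opposite order to the top fan, since it is seen from the other side. Something like this inside the loop, though that's only a guess I haven't tested yet:

// Current line - both fans get the same winding:
// triangles = triangles.Concat(new Array(0, f, neighbour, 1, f, neighbour));

// Possible fix - keep the top fan as it is, but reverse the bottom triangle:
triangles = triangles.Concat(new Array(0, f, neighbour));    // top fan
triangles = triangles.Concat(new Array(1, neighbour, f));    // bottom fan, winding reversed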
I suppose the next step is to use scripting to interrogate meshes that I've made in Blender with faces pointing in different directions, and see what the difference in the internal representation is.
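Roughly what I have in mind is a throwaway script that dumps the triangle indices of whatever mesh it is attached to, so I can compare a face exported from Blender with one of my generated ones. Something along these lines (the DumpTriangles name and the logging layout are just placeholders):

function DumpTriangles () {
    var mesh : Mesh = GetComponent(MeshFilter).sharedMesh;
    var verts : Vector3[] = mesh.vertices;
    var tris : int[] = mesh.triangles;
    // Every consecutive group of three indices is one triangle;
    // the order of those three indices is what I want to compare.
    for (var i = 0; i < tris.Length; i += 3) {
        print("Triangle " + (i / 3) + ": " +
              tris[i] + " " + tris[i + 1] + " " + tris[i + 2] + " -> " +
              verts[tris[i]] + " " + verts[tris[i + 1]] + " " + verts[tris[i + 2]]);
    }
}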