How to use a shaderModifier to alter the color of specific triangles in a SCNGeometry

Submitted anonymously (unverified) on 2019-12-03 09:06:55

Question:

First, before I go on, I have read through: IOS8 Scenekit painting on texture with texture coordinates, which seems to suggest I'm on the right track.

I have a complex SCNGeometry representing a hexasphere. It's rendering really well, at a full 60fps on all of my test devices.

At the moment, all of the hexagons are being rendered with a single material, because, as I understand it, every SCNMaterial I add to my geometry adds another draw call, which I can't afford.

Ultimately, I want to be able to color each of the almost 10,000 hexagons individually, so adding another material for each one is not going to work.

I had been planning to limit the color range to (say) 100 colors, and then move hexagons between different geometries, each with their own colored material, but that won't work because SCNGeometry says it works with an immutable set of vertices.

So, my current thought/plan is to use a shader modifier as suggested by @rickster in the above-mentioned question to somehow modify the color of individual hexagons (or sets of 4 triangles).

The thing is, I sort of understand the Apple doco referred to, but I don't understand how to provide the shader with what I think must essentially be an array of colour information, somehow indexed so that the shader knows which triangles to give what colors.

The code I have now that creates the geometry reads as:

NSData *indiceData = [NSData dataWithBytes:oneMeshIndices
                                    length:sizeof(UInt32) * indiceIndex];
SCNGeometryElement *oneMeshElement =
    [SCNGeometryElement geometryElementWithData:indiceData
                                  primitiveType:SCNGeometryPrimitiveTypeTriangles
                                 primitiveCount:indiceIndex / 3
                                  bytesPerIndex:sizeof(UInt32)];
[oneMeshElements addObject:oneMeshElement];

SCNGeometrySource *oneMeshNormalSource =
    [SCNGeometrySource geometrySourceWithNormals:oneMeshNormals count:normalIndex];

SCNGeometrySource *oneMeshVerticeSource =
    [SCNGeometrySource geometrySourceWithVertices:oneMeshVertices count:vertexIndex];

SCNGeometry *oneMeshGeom =
    [SCNGeometry geometryWithSources:[NSArray arrayWithObjects:oneMeshVerticeSource, oneMeshNormalSource, nil]
                            elements:oneMeshElements];

SCNMaterial *mat1 = [SCNMaterial material];
mat1.diffuse.contents = [UIColor greenColor];
oneMeshGeom.materials = @[mat1];

SCNNode *node = [SCNNode nodeWithGeometry:oneMeshGeom];

If someone can shed some light on how to provide the shader with a way to color each triangle indexed by the indices in indiceData, that would be fantastic.

EDIT

I've tried looking at providing the shader with a texture as a container for color information that would be indexed by the VertexID however it seems that SceneKit doesn't make the VertexID available. My thought was to provide this texture (actually just an array of bytes, 1 per hexagon on the hexasphere), via the SCNMaterialProperty class and then, in the shader, pull out the appropriate byte, based on the vertex number. That byte would be used to index an array of fixed colors and the resultant color for each vertex would then give the desired result.
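To illustrate what that lookup would have looked like in a surface shader modifier: a sketch under my own assumptions, with a hypothetical per-vertex index `hexIndex` (which, as noted above, SceneKit does not actually expose — there is no gl_VertexID-style value available to shader modifiers) and a 1×N palette texture `paletteMap`:

```glsl
#pragma arguments
uniform sampler2D paletteMap;  // assumed 1 x N strip, one texel per colour
uniform float hexCount;        // assumed number of hexagons / palette entries

#pragma body
// Hypothetical: hexIndex would have to come from somewhere,
// and SceneKit provides no vertex ID to shader modifiers.
float hexIndex = 0.0;
// Sample the centre of the hexIndex-th texel.
vec2 uv = vec2((hexIndex + 0.5) / hexCount, 0.5);
_surface.diffuse = texture2D(paletteMap, uv);
```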

Without a VertexID, this idea won't work, unless there is some other, similarly useful piece of data...

EDIT 2

Perhaps I am stubborn. I've been trying to get this to work, and as an experiment I created an image that is basically a striped rainbow, and wrote the following shader thinking it would paint my sphere with the rainbow.

It doesn't work. The entire sphere is drawn using the colour in the top left corner of the image.

My shaderModifier code is:

#pragma arguments
sampler2D colorMap;
uniform sampler2D colorMap;

#pragma body
vec4 color = texture2D(colorMap, _surface.diffuseTexcoord);
_surface.diffuse.rgba = color;

and I apply this using the code:

SCNMaterial *mat1 = [SCNMaterial material];
mat1.locksAmbientWithDiffuse = YES;
mat1.doubleSided = YES;

mat1.shaderModifiers = @{SCNShaderModifierEntryPointSurface :
    @"#pragma arguments\nsampler2D colorMap;\nuniform sampler2D colorMap;\n#pragma body\nvec4 color = texture2D(colorMap, _surface.diffuseTexcoord);\n_surface.diffuse.rgba = color;"};

colorMap = [SCNMaterialProperty materialPropertyWithContents:[UIImage imageNamed:@"rainbow.png"]];

[mat1 setValue:colorMap forKeyPath:@"colorMap"];

I had thought that _surface.diffuseTexcoord would be appropriate, but I'm beginning to think I need to map it to a coordinate in the image, perhaps by knowing the dimensions of the image and interpolating.

But if this is the case, what units are _surface.diffuseTexcoord in? How do I know the min/max range of this so that I can map it to the image?

Once again, I'm hoping someone can steer me in the right direction if these attempts are wrong.

EDIT 3

OK, so I know I'm on the right track now. I've realised that by using _surface.normal instead of _surface.diffuseTexcoord, I can treat the normal as a latitude/longitude on my sphere and map it to an x,y in the image, and I now see the hexagons being colored based on the colorMap. However, no matter what I do (so far), the normal angles seem to be fixed in relation to the camera position, so when I move the camera to look at a different part of the sphere, the colorMap doesn't rotate with it.

Here is the latest shader code:

#pragma arguments
sampler2D colorMap;
uniform sampler2D colorMap;

#pragma body
float x = ((_surface.normal.x * 57.29577951) + 180.0) / 360.0;
float y = 1.0 - ((_surface.normal.y * 57.29577951) + 90.0) / 180.0;
vec4 color = texture2D(colorMap, vec2(x, y));
_output.color.rgba = color;

ANSWER

So I solved the problem. It turned out that there was no need for a shader to achieve my desired results.

The answer was to use a mappingChannel to provide the geometry with a set of texture coordinates for each vertex. These texture coordinates are used to pull color data from the appropriate texture (it all depends on how you set up your material).
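To make that concrete (a sketch under my own assumptions, not the poster's exact setup): with a small palette strip as the diffuse contents, nearest-neighbour filtering keeps each texel a flat colour so adjacent palette entries don't bleed into one another:

```objc
SCNMaterial *mat = [SCNMaterial material];
// Assumed: "palette.png" is e.g. a 100x1 strip, one texel per colour.
mat.diffuse.contents = [UIImage imageNamed:@"palette.png"];
// Avoid linear interpolation between neighbouring palette texels.
mat.diffuse.magnificationFilter = SCNFilterModeNearest;
mat.diffuse.minificationFilter  = SCNFilterModeNearest;
mat.diffuse.mipFilter           = SCNFilterModeNone;
oneMeshGeom.materials = @[mat];
```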

So, whilst I did manage to get a shader working, there were performance issues on older devices; using a mappingChannel was much, much better, working at 60fps on all devices now.

I did find, though, that despite the documentation saying a mapping channel is a series of CGPoint objects, that wouldn't work on 64-bit devices, because CGPoint uses doubles there instead of floats.

I needed to define my own struct:

typedef struct {
    float x;
    float y;
} MyPoint;

MyPoint oneMeshTextureCoordinates[vertexCount];
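For illustration, filling this array might look like the following, assuming a 1×paletteSize palette texture and a hypothetical hexagonIndexForVertex() helper (neither is in the original code). Every vertex of a given hexagon gets the same texcoord, pointing at the centre of that hexagon's palette texel:

```objc
static const int paletteSize = 100;  // assumed palette width in texels

for (int v = 0; v < vertexCount; v++) {
    int hex = hexagonIndexForVertex(v);  // hypothetical helper
    // Centre of texel `hex` in a single-row palette strip.
    oneMeshTextureCoordinates[v].x = ((float)hex + 0.5f) / (float)paletteSize;
    oneMeshTextureCoordinates[v].y = 0.5f;
}
```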

Having built up an array of these, one for each vertex, I then created the mappingChannel source as follows:

SCNGeometrySource *textureMappingSource =
    [SCNGeometrySource geometrySourceWithData:
        [NSData dataWithBytes:oneMeshTextureCoordinates
                       length:sizeof(MyPoint) * vertexCount]
                                     semantic:SCNGeometrySourceSemanticTexcoord
                                  vertexCount:vertexCount
                              floatComponents:YES
                          componentsPerVector:2
                            bytesPerComponent:sizeof(float)
                                   dataOffset:0
                                   dataStride:sizeof(MyPoint)];
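One way to change colours at runtime under this scheme (my own sketch, not part of the answer above) is to redraw the palette image and reassign it to the material; the geometry and its texture coordinates never change, so there is still only one draw call:

```objc
// Assumed: paletteSize entries, and `colors` is an
// NSArray<UIColor *> holding the current colour of each entry.
// Draw a 1-pixel-high strip, one column per palette entry.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(paletteSize, 1), YES, 1.0);
for (int i = 0; i < paletteSize; i++) {
    [colors[i] setFill];
    UIRectFill(CGRectMake(i, 0, 1, 1));
}
UIImage *palette = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// SceneKit picks up the new texture on the next frame.
mat1.diffuse.contents = palette;
```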