OpenGL ES SDK for Android - 6

Published by 貓尾巴 on 2018-09-30

Instanced Tessellation

This application displays a rotating solid torus surrounded by a low-polygon wireframe mesh. The torus is drawn with OpenGL ES 3.0 using an instanced tessellation technique.


The application displays a rotating solid torus surrounded by a low-polygon wireframe mesh.

To perform instanced tessellation, we need to divide the model into several patches. Each patch is densely packed with triangles, which improves the approximation of the curved surface. In the first stage of tessellation, a patch consists of vertices laid out as a square. Once passed to the shader, they are transformed into a Bezier surface according to control points stored in a uniform block. Each instance of the draw call renders the next part of the torus.
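The patch-to-surface conversion boils down to weighting 16 control points with cubic Bernstein polynomials. A minimal C++ sketch of those basis functions (the same coefficients the solid-torus vertex shader computes later) illustrates the two properties the technique relies on; the helper names are ours, not part of the SDK:

```cpp
#include <cassert>
#include <cmath>

/* Cubic Bernstein basis B0..B3 at parameter t in [0, 1] -- the same
 * coefficients the solid-torus vertex shader computes per patch vertex. */
static void bernstein(float t, float b[4])
{
    float s = 1.0f - t;
    b[0] = s * s * s;
    b[1] = 3.0f * t * s * s;
    b[2] = 3.0f * t * t * s;
    b[3] = t * t * t;
}

/* One-dimensional cubic Bezier: weighted sum of 4 control values. */
static float bezier(float t, const float p[4])
{
    float b[4];
    bernstein(t, b);
    return b[0] * p[0] + b[1] * p[1] + b[2] * p[2] + b[3] * p[3];
}
```

The weights always sum to 1 (so the surface stays inside the control hull), and the curve interpolates the first and last control points, which is what makes C1 stitching between neighbouring patches possible.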

The application instantiates two classes, which manage the solid torus model and the wireframe around it. The first class is responsible for configuring a program with shaders capable of instanced drawing, initializing data buffers, and issuing the instanced draw call. To simplify the mathematics and satisfy the condition of C1 continuity between patches, we assume the torus is constructed from 12 circles, each of which is also defined by 12 points. In this way, both the "large" and "small" circles of the torus can be divided into quadrants, from which we build Bezier surfaces that closely approximate circular arcs. For this purpose, the control points cannot be placed on the surface of the torus; they must be distorted appropriately.

The second class manages the components corresponding to the wireframe. It uses vertices placed on the torus surface and issues a simple draw call in GL_LINES mode. The dimensions of its "small circles" are slightly larger than the corresponding dimensions of the solid torus, so there is a gap between the two models.

The common elements of these two classes are placed in an abstract Torus class.

Setup Graphics

First, we need to generate the coordinates of the models we are going to render. This is implemented in the constructors of the WireframeTorus and InstancedSolidTorus classes.

wireframeTorus = new WireframeTorus(torusRadius, circleRadius + distance);
solidTorus = new InstancedSolidTorus(torusRadius, circleRadius);

Note that we want the wireframe object to be slightly larger than the solid one, which is why circleRadius is increased by distance. Thanks to this, we will see a solid object surrounded by a wireframe.

The coordinates of both models are generated in the same way, as shown below:

void TorusModel::generateVertices(float torusRadius, float circleRadius, unsigned int circlesCount, unsigned int pointsPerCircleCount, float* vertices)
{
    if (vertices == NULL)
    {
        LOGE("Cannot use null pointer while calculating torus vertices.");
        return;
    }
    /* Index variable. */
    unsigned int componentIndex = 0;
    for (unsigned int horizontalIndex = 0; horizontalIndex < circlesCount; ++horizontalIndex)
    {
        /* Angle in radians on XZ plane. */
        float xyAngle = (float) horizontalIndex * 2.0f * M_PI / circlesCount;
        for (unsigned int verticalIndex = 0; verticalIndex < pointsPerCircleCount; ++verticalIndex)
        {
            /* Angle in radians on XY plane. */
            float theta  = (float) verticalIndex * 2.0f * M_PI / pointsPerCircleCount;
            /* X coordinate. */
            vertices[componentIndex++] = (torusRadius + circleRadius * cosf(theta)) * cosf(xyAngle);
            /* Y coordinate. */
            vertices[componentIndex++] = circleRadius * sinf(theta);
            /* Z coordinate. */
            vertices[componentIndex++] = (torusRadius + circleRadius * cosf(theta)) * sinf(xyAngle);
            /* W coordinate. */
            vertices[componentIndex++] = 1.0f;
        }
    }
}
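Before uploading the generated coordinates, it can be reassuring to verify that every point actually lies on the torus, i.e. satisfies (sqrt(x² + z²) − R)² + y² = r². The sketch below reimplements the loop above in plain C++; the function name and the radii used in the check are ours, chosen only for illustration:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

/* Same construction as TorusModel::generateVertices, returning XYZW per point. */
static std::vector<float> makeTorus(float torusRadius, float circleRadius,
                                    unsigned circles, unsigned pointsPerCircle)
{
    const float kPi = 3.14159265f;
    std::vector<float> v;
    v.reserve(circles * pointsPerCircle * 4);
    for (unsigned h = 0; h < circles; ++h)
    {
        /* Angle in radians on the XZ plane. */
        float xzAngle = h * 2.0f * kPi / circles;
        for (unsigned p = 0; p < pointsPerCircle; ++p)
        {
            /* Angle in radians on the small circle. */
            float theta = p * 2.0f * kPi / pointsPerCircle;
            v.push_back((torusRadius + circleRadius * cosf(theta)) * cosf(xzAngle)); /* X */
            v.push_back(circleRadius * sinf(theta));                                 /* Y */
            v.push_back((torusRadius + circleRadius * cosf(theta)) * sinf(xzAngle)); /* Z */
            v.push_back(1.0f);                                                       /* W */
        }
    }
    return v;
}
```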

A separate program object is used to render each model. No lighting is applied to the wireframe model, nor any complex vertex transformation, which makes things much easier. Take a look at the shaders used to render the wireframe model.

Vertex shader source for the wireframe torus

/* Input vertex coordinates. */ 
in vec4 position;
/* Constant transformation matrices. */
uniform mat4 cameraMatrix;
uniform mat4 projectionMatrix;
uniform mat4 scaleMatrix;
/* Coefficients of rotation needed for configuration of rotation matrix. */
uniform vec3 rotationVector;
void main()
{
    mat4 modelViewMatrix;
    mat4 modelViewProjectionMatrix;
    
    /* Matrix rotating Model-View matrix around X axis. */
    mat4 xRotationMatrix = mat4(1.0,  0.0,                            0.0,                            0.0, 
                                0.0,  cos(radians(rotationVector.x)), sin(radians(rotationVector.x)), 0.0, 
                                0.0, -sin(radians(rotationVector.x)), cos(radians(rotationVector.x)), 0.0, 
                                0.0,  0.0,                            0.0,                            1.0);
    
    /* Matrix rotating Model-View matrix around Y axis. */
    mat4 yRotationMatrix = mat4( cos(radians(rotationVector.y)), 0.0, -sin(radians(rotationVector.y)), 0.0, 
                                 0.0,                            1.0,  0.0,                            0.0, 
                                 sin(radians(rotationVector.y)), 0.0,  cos(radians(rotationVector.y)), 0.0, 
                                 0.0,                            0.0,  0.0,                            1.0);
    
    /* Matrix rotating Model-View matrix around Z axis. */
    mat4 zRotationMatrix = mat4( cos(radians(rotationVector.z)), sin(radians(rotationVector.z)), 0.0, 0.0, 
                                -sin(radians(rotationVector.z)), cos(radians(rotationVector.z)), 0.0, 0.0, 
                                 0.0,                            0.0,                            1.0, 0.0, 
                                 0.0,                            0.0,                            0.0, 1.0);
    
    /* Model-View matrix transformations. */
    modelViewMatrix = scaleMatrix;
    modelViewMatrix = xRotationMatrix  * modelViewMatrix;
    modelViewMatrix = yRotationMatrix  * modelViewMatrix;
    modelViewMatrix = zRotationMatrix  * modelViewMatrix;
    modelViewMatrix = cameraMatrix     * modelViewMatrix;
    
    /* Configure Model-View-ProjectionMatrix. */
    modelViewProjectionMatrix = projectionMatrix * modelViewMatrix;
    
    /* Set vertex position in Model-View-Projection space. */
    gl_Position = modelViewProjectionMatrix * position;
}

Fragment shader source for the wireframe torus

precision mediump float;
uniform vec4 color;
/* Output variable. */
out vec4 fragColor;
void main()
{
    fragColor = color;
}

If we now want to render the wireframe torus on screen, it is enough to issue a glDrawElements() call with GL_LINES mode. Of course, the proper program object and vertex array object should be bound.

void WireframeTorus::draw(float* rotationVector)
{
    GLint rotationVectorLocation = GL_CHECK(glGetUniformLocation(programID, "rotationVector"));
    /* Set required elements to draw mesh torus. */
    GL_CHECK(glUseProgram(programID));
    GL_CHECK(glBindVertexArray(vaoID));
    GL_CHECK(glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indicesBufferID));
    /* Pass Model-View matrix elements to the shader. */
    GL_CHECK(glUniform3fv(rotationVectorLocation, 1, rotationVector));
    /* Draw lines described by previously determined indices. */
    GL_CHECK(glDrawElements(GL_LINES, indicesCount, GL_UNSIGNED_INT, 0));
}

Take a look at the shader objects we use for rendering the solid torus. The situation here is more complicated, as some lighting is applied and the instanced drawing technique is used to transform the vertices into Bezier surfaces.

Vertex shader source for the solid torus

/* Number of control points in one dimension for a patch. */
const uint patchDimension = 4u;
/* Total number of control points in a patch. */
const uint controlPointsPerPatchCount = patchDimension * patchDimension;
/* Number of quads in a patch. */
const uint quadsInPatchCount = (patchDimension - 1u) * (patchDimension - 1u);
/* Total number of vertices in a patch. */
const uint verticesCount = 144u;
/* Input patch vertex coordinates. */
in vec2 patchUVPosition;
/* Constant transformation matrices. */
uniform mat4 cameraMatrix;
uniform mat4 projectionMatrix;
uniform mat4 scaleMatrix;
/* Coefficients of rotation needed for configuration of rotation matrix. */
uniform vec3 rotationVector;
/* Uniform block that stores control mesh indices. */
uniform ControlPointsIndices
{
    uint indices[controlPointsPerPatchCount * verticesCount / quadsInPatchCount];
};
/* Uniform block that stores control mesh vertices. */
uniform ControlPointsVertices
{
    vec4 vertices[verticesCount];
};
/* Normal vector set in Model-View-Projection space. */
out vec3 modelViewProjectionNormalVector;
void main()
{
    const float pi = 3.14159265358979323846;
    
    mat4 modelViewMatrix;
    mat4 modelViewProjectionMatrix;
    
    /* Array storing control vertices of current patch. */
    vec4 controlVertices[controlPointsPerPatchCount];
    
    /* Initialize array of current control vertices. */
    for (uint i = 0u; i < controlPointsPerPatchCount; ++i)
    {
        controlVertices[i] = vertices[indices[uint(gl_InstanceID) * controlPointsPerPatchCount + i]];
    }
    
    /* Coefficients of Bernstein polynomials. */
    vec2 bernsteinUV0 = (1.0 - patchUVPosition) * (1.0 - patchUVPosition) * (1.0 - patchUVPosition);
    vec2 bernsteinUV1 =  3.0 * patchUVPosition  * (1.0 - patchUVPosition) * (1.0 - patchUVPosition);
    vec2 bernsteinUV2 =  3.0 * patchUVPosition  *        patchUVPosition  * (1.0 - patchUVPosition);
    vec2 bernsteinUV3 =        patchUVPosition  *        patchUVPosition  *        patchUVPosition ;
    
    /* Position of a patch vertex on Bezier surface. */
    vec3 position = bernsteinUV0.x * (bernsteinUV0.y * controlVertices[ 0].xyz + bernsteinUV1.y * controlVertices[ 1].xyz + bernsteinUV2.y * controlVertices[ 2].xyz + bernsteinUV3.y * controlVertices[ 3].xyz) +
                    bernsteinUV1.x * (bernsteinUV0.y * controlVertices[ 4].xyz + bernsteinUV1.y * controlVertices[ 5].xyz + bernsteinUV2.y * controlVertices[ 6].xyz + bernsteinUV3.y * controlVertices[ 7].xyz) +
                    bernsteinUV2.x * (bernsteinUV0.y * controlVertices[ 8].xyz + bernsteinUV1.y * controlVertices[ 9].xyz + bernsteinUV2.y * controlVertices[10].xyz + bernsteinUV3.y * controlVertices[11].xyz) +
                    bernsteinUV3.x * (bernsteinUV0.y * controlVertices[12].xyz + bernsteinUV1.y * controlVertices[13].xyz + bernsteinUV2.y * controlVertices[14].xyz + bernsteinUV3.y * controlVertices[15].xyz);
    
    /* Matrix rotating Model-View matrix around X axis. */
    mat4 xRotationMatrix = mat4(1.0,  0.0,                            0.0,                            0.0, 
                                0.0,  cos(radians(rotationVector.x)), sin(radians(rotationVector.x)), 0.0, 
                                0.0, -sin(radians(rotationVector.x)), cos(radians(rotationVector.x)), 0.0, 
                                0.0,  0.0,                            0.0,                            1.0);
                
    /* Matrix rotating Model-View matrix around Y axis. */
    mat4 yRotationMatrix = mat4( cos(radians(rotationVector.y)), 0.0, -sin(radians(rotationVector.y)), 0.0, 
                                 0.0,                            1.0,  0.0,                            0.0, 
                                 sin(radians(rotationVector.y)), 0.0,  cos(radians(rotationVector.y)), 0.0, 
                                 0.0,                            0.0,  0.0,                            1.0);
    
    /* Matrix rotating Model-View matrix around Z axis. */
    mat4 zRotationMatrix = mat4( cos(radians(rotationVector.z)), sin(radians(rotationVector.z)), 0.0, 0.0, 
                                -sin(radians(rotationVector.z)), cos(radians(rotationVector.z)), 0.0, 0.0, 
                                0.0,                             0.0,                            1.0, 0.0, 
                                0.0,                             0.0,                            0.0, 1.0);
    
    /* Model-View matrix transformations. */
    modelViewMatrix = scaleMatrix;
    modelViewMatrix = xRotationMatrix * modelViewMatrix;
    modelViewMatrix = yRotationMatrix * modelViewMatrix;
    modelViewMatrix = zRotationMatrix * modelViewMatrix;
    modelViewMatrix = cameraMatrix    * modelViewMatrix;
    /* Configure Model-View-ProjectionMatrix. */
    modelViewProjectionMatrix = projectionMatrix * modelViewMatrix;
    /* Set vertex position in Model-View-Projection space. */
    gl_Position = modelViewProjectionMatrix * vec4(position, 1.0);
    /* Angle on the "big circle" of torus. */
    float phi = (patchUVPosition.x + mod(float(gl_InstanceID), 4.0)) * pi / 2.0;
    /* Angle on the "small circle" of torus. */
    float theta = (patchUVPosition.y + mod(float(gl_InstanceID / 4), 4.0)) * pi / 2.0;
    /* Horizontal tangent to torus. */
    vec3 dBdu = vec3(-sin(phi), 0.0, cos(phi));
    /* Vertical tangent to torus. */
    vec3 dBdv = vec3(cos(phi) * (-sin(theta)), cos(theta), sin(phi) * (-sin(theta)));
    /* Calculate normal vector. */
    vec3 normalVector = normalize(cross(dBdu, dBdv));
    /* Calculate normal matrix. */
    mat3 normalMatrix = transpose(inverse(mat3x3(modelViewMatrix)));
    /* Transform normal vector to Model-View-Projection space. */
    modelViewProjectionNormalVector = normalize(normalMatrix * normalVector);
}

Fragment shader source for the solid torus

precision mediump float;
/* Input normal vector. */
in vec3 modelViewProjectionNormalVector;
/* Structure storing directional light parameters. */
struct Light
{
    vec3  lightColor;
    vec3  lightDirection;
    float ambientIntensity;
};
/* Color of the drawn torus. */
uniform vec4  color;
/* Uniform representing light parameters. */
uniform Light light;
/* Output variable. */
out vec4 fragColor;
void main()
{
    /* Calculate the value of diffuse intensity. */
    float diffuseIntensity = max(0.0, -dot(modelViewProjectionNormalVector, normalize(light.lightDirection)));
    /* Calculate the output color value considering the light. */
    fragColor = color * vec4(light.lightColor * (light.ambientIntensity + diffuseIntensity), 1.0);
}
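The diffuse term in this fragment shader is plain vector arithmetic, so it can be checked on the CPU. A small C++ sketch of the same formula; the struct and helper names are ours:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize3(Vec3 v)
{
    float len = std::sqrt(dot3(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

/* Same formula as the fragment shader: light shining straight against the
 * normal gives full intensity, light from behind the surface gives zero. */
static float diffuseIntensity(Vec3 normal, Vec3 lightDirection)
{
    return std::fmax(0.0f, -dot3(normal, normalize3(lightDirection)));
}
```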

If we now want to render the solid torus on screen, it is enough to issue a glDrawElementsInstanced() call with GL_TRIANGLES mode. Of course, the proper program object and vertex array object should be bound.

void InstancedSolidTorus::draw(float* rotationVector)
{
    /* Location of rotation vector. */
    GLint rotationVectorLocation = GL_CHECK(glGetUniformLocation(programID, "rotationVector"));
    /* Set required OpenGL ES state. */
    GL_CHECK(glUseProgram     (programID                                    ));
    GL_CHECK(glBindVertexArray(vaoID                                        ));
    GL_CHECK(glBindBuffer     (GL_ELEMENT_ARRAY_BUFFER, patchIndicesBufferID));
    if (rotationVectorLocation != -1)
    {
        /* Pass rotation parameters to the shader. */
        GL_CHECK(glUniform3fv(rotationVectorLocation, 1, rotationVector));
    }
    else
    {
        LOGE("Could not locate \"rotationVector\" uniform in program [%d]", programID);
    }
    /* Draw patchInstancesCount instances of patchTriangleIndicesCount triangles. */ 
    GL_CHECK(glDrawElementsInstanced(GL_TRIANGLES, patchTriangleIndicesCount, GL_UNSIGNED_INT, 0, patchInstancesCount));
}

Result

We want the models to rotate, which is why we need to compute new rotation angles every frame. Once the rotation vector is generated, it is used to update the vertex positions of both models: the wireframe and the solid torus.

/* Increment rotation angles. */
angleX += 0.5;
angleY += 0.5;
angleZ += 0.5;
if(angleX >= 360.0f) angleX = 0.0f;
if(angleY >= 360.0f) angleY = 0.0f;
if(angleZ >= 360.0f) angleZ = 0.0f;
float rotationVector[] = {angleX, angleY, angleZ};
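As an aside, the reset-to-zero check above works because the 0.5 step divides 360 evenly; fmod-based wrapping also handles steps that do not. A small sketch, with a helper name of our choosing:

```cpp
#include <cassert>
#include <cmath>

/* Advance an angle by delta degrees, wrapping into [0, 360). Unlike a
 * reset-to-zero comparison, fmod keeps any fractional overshoot. */
static float advanceAngle(float angle, float delta)
{
    return std::fmod(angle + delta, 360.0f);
}
```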

The computed values are then used when drawing the requested models.

wireframeTorus->draw(rotationVector);
solidTorus->draw(rotationVector);

Instancing

This sample introduces the instanced drawing technique using OpenGL ES 3.0.


Each cube is an instance of the same object.

There is only one copy of the cube's vertex data in memory, and each cube drawn is an instance of that data. This reduces the amount of memory that needs to be transferred to the GPU. By using gl_InstanceID in the shader, each cube can have a different position, rotation speed, and color. This technique can be used anywhere a scene contains repeated geometry.

Generating a Geometry

To render a cube (the most basic 3D shape), we need to generate coordinates for its vertices. This is the first step we will focus on. Please look at the picture below.


Coordinates of the cube vertices.

As shown in the picture, the cube's vertex coordinates are arranged around the point <0, 0, 0>, placing them in the range [<-1, -1, -1>, <1, 1, 1>]. This is not strictly necessary: the coordinates can be translated in any direction, and the cube can be scaled, as long as the cube remains visible on screen. If you are not sure how to do that, simply follow our suggestions. There is another reason we use coordinates arranged around the center (point <0, 0, 0>): we will generate copies of the cube, and each instance will be translated to a new position (so that the cubes move along a circular trajectory).

Generating the cube vertices is not enough to draw the cube shape. The basic OpenGL ES rendering technique is based on drawing the triangles that make up the requested shape. When describing the vertices of the cube's triangles, you should follow a consistent clockwise or counter-clockwise winding order; otherwise OpenGL ES will have trouble distinguishing front and back faces. In this example we describe the cube coordinates in clockwise (CW) order. Note that, by default, OpenGL ES treats counter-clockwise triangles as front-facing, so if face culling is enabled, the winding must be configured accordingly with glFrontFace().


The triangles that make up the cube shape.

    /* Please see header for the specification. */
    void CubeModel::getTriangleRepresentation(float** coordinatesPtrPtr,
                                              int*    numberOfCoordinatesPtr,
                                              int*    numberOfPointsPtr,
                                              float   scalingFactor)
    {
        ASSERT(coordinatesPtrPtr != NULL,
               "Cannot use null pointer while calculating coordinates");
        /* Index of an array we will put new point coordinates at. */
        int       currentIndex                    = 0;
        /* 6 faces of cube, 2 triangles for each face, 3 points of triangle, 3 coordinates for each point. */
        const int numberOfCubeTriangleCoordinates = NUMBER_OF_CUBE_FACES        *
                                                    NUMBER_OF_TRIANGLES_IN_QUAD *
                                                    NUMBER_OF_TRIANGLE_VERTICES *
                                                    NUMBER_OF_POINT_COORDINATES;
        /* Allocate memory for result array. */
        *coordinatesPtrPtr = (float*) malloc(numberOfCubeTriangleCoordinates * sizeof(float));
        /* Is allocation successful?. */
        ASSERT(*coordinatesPtrPtr != NULL,
               "Could not allocate memory for result array.");
        /* Example:
         * Coordinates for cube points:
         * A -1.0f,  1.0f,  1.0f
         * B -1.0f,  1.0f, -1.0f
         * C  1.0f,  1.0f, -1.0f
         * D  1.0f,  1.0f,  1.0f
         * E -1.0f, -1.0f,  1.0f
         * F -1.0f, -1.0f, -1.0f
         * G  1.0f, -1.0f, -1.0f
         * H  1.0f, -1.0f,  1.0f
         * Create 2 triangles for each face of the cube. Vertices are written in clockwise order.
         *       B ________ C
         *      / |     /  |
         *  A ......... D  |
         *    .   |   .    |
         *    .  F|_ _.___ |G
         *    . /     .  /
         *  E ......... H
         */
        const Vec3f pointA = {-1.0f,  1.0f,  1.0f};
        const Vec3f pointB = {-1.0f,  1.0f, -1.0f};
        const Vec3f pointC = { 1.0f,  1.0f, -1.0f};
        const Vec3f pointD = { 1.0f,  1.0f,  1.0f};
        const Vec3f pointE = {-1.0f, -1.0f,  1.0f};
        const Vec3f pointF = {-1.0f, -1.0f, -1.0f};
        const Vec3f pointG = { 1.0f, -1.0f, -1.0f};
        const Vec3f pointH = { 1.0f, -1.0f,  1.0f};
        /* Fill the array with coordinates. */
        /* Top face. */
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* Bottom face. */
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* Back face. */
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* Front face. */
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* Right face. */
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* D */
        (*coordinatesPtrPtr)[currentIndex++] = pointD.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointD.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* H */
        (*coordinatesPtrPtr)[currentIndex++] = pointH.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointH.z;
        /* C */
        (*coordinatesPtrPtr)[currentIndex++] = pointC.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointC.z;
        /* G */
        (*coordinatesPtrPtr)[currentIndex++] = pointG.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointG.z;
        /* Left face. */
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* B */
        (*coordinatesPtrPtr)[currentIndex++] = pointB.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointB.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* F */
        (*coordinatesPtrPtr)[currentIndex++] = pointF.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointF.z;
        /* A */
        (*coordinatesPtrPtr)[currentIndex++] = pointA.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointA.z;
        /* E */
        (*coordinatesPtrPtr)[currentIndex++] = pointE.x;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.y;
        (*coordinatesPtrPtr)[currentIndex++] = pointE.z;
        /* Calculate size of a cube. */
        if (scalingFactor != 1.0f)
        {
            for (int i = 0; i < numberOfCubeTriangleCoordinates; i++)
            {
                (*coordinatesPtrPtr)[i] *= scalingFactor;
            }
        }
        if (numberOfCoordinatesPtr != NULL)
        {
            *numberOfCoordinatesPtr = numberOfCubeTriangleCoordinates;
        }
        if (numberOfPointsPtr != NULL)
        {
            *numberOfPointsPtr = numberOfCubeTriangleCoordinates / NUMBER_OF_POINT_COORDINATES;
        }
    }
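As a sanity check on the sizing arithmetic in the function above: 6 faces × 2 triangles × 3 vertices × 3 components gives 108 floats, i.e. 36 points. A minimal C++ restatement (the constants mirror the ones referenced above):

```cpp
#include <cassert>

/* Array sizing used by CubeModel::getTriangleRepresentation. */
enum
{
    NUMBER_OF_CUBE_FACES        = 6,
    NUMBER_OF_TRIANGLES_IN_QUAD = 2,
    NUMBER_OF_TRIANGLE_VERTICES = 3,
    NUMBER_OF_POINT_COORDINATES = 3
};

/* 6 faces, 2 triangles per face, 3 vertices per triangle, 3 floats per vertex. */
static const int numberOfCubeTriangleCoordinates =
    NUMBER_OF_CUBE_FACES * NUMBER_OF_TRIANGLES_IN_QUAD *
    NUMBER_OF_TRIANGLE_VERTICES * NUMBER_OF_POINT_COORDINATES;
```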

Now we want to tell OpenGL ES to take the data and draw a cube.

First of all, we need a buffer object that will store the vertex coordinates of the triangles that make up the cube.

Generate the buffer objects (in the code we generate 3 buffer objects, because we will use them in subsequent steps, but at this point only one of them is needed: the cubeCoordinatesBufferObjectId buffer object):

/* Generate buffers. */
GL_CHECK(glGenBuffers(numberOfBufferObjectIds, bufferObjectIds));
cubeCoordinatesBufferObjectId  = bufferObjectIds[0];
cubeColorsBufferObjectId       = bufferObjectIds[1];
uniformBlockDataBufferObjectId = bufferObjectIds[2];

We need to call the function (already described above) to retrieve the cube's coordinate data.

/* Get triangular representation of a cube. Save data in cubeTrianglesCoordinates array. */
CubeModel::getTriangleRepresentation(&cubeTrianglesCoordinates,
                                     &numberOfCubeTriangleCoordinates,
                                     &numberOfCubeVertices,
                                      cubeSize);

The next step is to copy the retrieved data into the buffer object.

/* Buffer holding coordinates of triangles which create a cube. */
GL_CHECK(glBindBuffer(GL_ARRAY_BUFFER,
                  cubeCoordinatesBufferObjectId));
GL_CHECK(glBufferData(GL_ARRAY_BUFFER,
                  numberOfCubeTriangleCoordinates * sizeof(GLfloat),
                  cubeTrianglesCoordinates,
                  GL_STATIC_DRAW));

The next thing to do is to set up a vertex attrib array for the cube's coordinates. This function operates on the array buffer object that is currently bound. We rebind buffer objects many times in the application, which is why we need to do it once more here. However, if you are sure that the buffer object storing the vertex coordinates is currently bound to the GL_ARRAY_BUFFER target, this is not needed. Please note that all the functions below should be called for the active program object.

GL_CHECK(glBindBuffer             (GL_ARRAY_BUFFER,
                                   cubeCoordinatesBufferObjectId));
GL_CHECK(glEnableVertexAttribArray(positionLocation));
GL_CHECK(glVertexAttribPointer    (positionLocation,
                                   NUMBER_OF_POINT_COORDINATES,
                                   GL_FLOAT,
                                   GL_FALSE,
                                   0,
                                   0));

At this point you may wonder about the positionLocation variable: what does it represent? It is the attribute location retrieved from the program object we use for rendering. Once you are familiar with program objects and have initialized this value properly, you can call

glDrawArrays(GL_TRIANGLES, 0, numberOfCubeVertices);

to render a single cube on screen. Note that GL_TRIANGLES mode is used, which corresponds to the triangle representation of the cube coordinates we generated.

Program Object

To start using a program object, we must first generate its ID.

renderingProgramId = GL_CHECK(glCreateProgram());

A program object needs fragment and vertex shaders attached to it. Let us now focus on generating and setting up the shader objects.

Shader::processShader(&vertexShaderId,   VERTEX_SHADER_FILE_NAME,   GL_VERTEX_SHADER);
Shader::processShader(&fragmentShaderId, FRAGMENT_SHADER_FILE_NAME, GL_FRAGMENT_SHADER);

The basic mechanism is as follows:

  1. Create a shader object:
*shaderObjectIdPtr = GL_CHECK(glCreateShader(shaderType));
  2. Set the shader source:
GL_CHECK(glShaderSource(*shaderObjectIdPtr, 1, strings, NULL));

Note that the strings variable stores the shader source read from a file.

strings[0] = loadShader(filename);
  3. Compile the shader:
GL_CHECK(glCompileShader(*shaderObjectIdPtr));

It is always a good idea to verify that compilation succeeded by checking GL_COMPILE_STATUS (expecting GL_TRUE).

GL_CHECK(glGetShaderiv(*shaderObjectIdPtr, GL_COMPILE_STATUS, &compileStatus));

After calling these functions for both the fragment and vertex shaders, attach both to the program object,

GL_CHECK(glAttachShader(renderingProgramId, vertexShaderId));
GL_CHECK(glAttachShader(renderingProgramId, fragmentShaderId));

link the program object,

GL_CHECK(glLinkProgram(renderingProgramId));

and set the program object as active (to be used).

GL_CHECK(glUseProgram(renderingProgramId));

The shader objects we use in the application are more advanced; however, if you only want to render a single cube, defining the shaders with the code below is sufficient.

Vertex shader:

#version 300 es
in      vec4 attributePosition;
uniform vec3 cameraVector;
uniform vec4 perspectiveVector;
void main()
{
    float fieldOfView = 1.0 / tan(perspectiveVector.x * 0.5);
                      
    mat4 cameraMatrix = mat4 (1.0,            0.0,            0.0,           0.0, 
                              0.0,            1.0,            0.0,           0.0, 
                              0.0,            0.0,            1.0,           0.0, 
                              cameraVector.x, cameraVector.y, cameraVector.z, 1.0);
    mat4 perspectiveMatrix = mat4 (fieldOfView/perspectiveVector.y, 0.0,         0.0,                                                                                             0.0, 
                                   0.0,                             fieldOfView, 0.0,                                                                                             0.0, 
                                   0.0,                             0.0,        -(perspectiveVector.w + perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z),      -1.0, 
                                   0.0,                             0.0,        (-2.0 * perspectiveVector.w * perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z), 0.0);
  
    /* Return gl_Position. */
    gl_Position = perspectiveMatrix * cameraMatrix * attributePosition;
}

Fragment shader:

#version 300 es
precision mediump float;
out vec4 fragmentColour;
void main()
{
    fragmentColour = vec4(0.3, 0.2, 0.8, 1.0); //Please use any colour you want
}
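The perspective matrix built in the vertex shader above, with perspectiveVector = (fovY, aspect, near, far), maps the near plane to NDC depth -1 and the far plane to +1 after the perspective divide. A small C++ sketch of just that depth mapping (the helper name and the sample near/far values are ours):

```cpp
#include <cassert>
#include <cmath>

/* Depth mapping of the perspective matrix from the vertex shader. Returns the
 * NDC depth of a point at view-space z, for the given near and far planes. */
static float ndcDepth(float zNear, float zFar, float viewZ)
{
    /* Third and fourth columns' z entries from the shader's matrix. */
    float m22 = -(zFar + zNear) / (zFar - zNear);
    float m23 = (-2.0f * zFar * zNear) / (zFar - zNear);
    float clipZ = m22 * viewZ + m23;
    float clipW = -viewZ; /* the matrix's third row is (0, 0, -1, 0). */
    return clipZ / clipW; /* perspective divide */
}
```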

Now you can see what the positionLocation variable represents. It is the location of the attribute named attributePosition. How do we obtain an attribute location?

Once the program object is linked and active, you can call

positionLocation = GL_CHECK(glGetAttribLocation   (renderingProgramId, "attributePosition"));

Remember that it is always a good idea to check whether the retrieved value is valid. If the attribute name is not found in the shader, or the attribute is inactive (unused), -1 is returned (which is treated as an invalid attribute location).

ASSERT(positionLocation != -1,  "Could not retrieve attribute location: attributePosition");

If a valid value was returned, it can be used as in the previous steps.

Instanced Drawing

The main idea of this application is to present the instanced drawing technique. Once you are familiar with the previous sections, you are ready for it.

First of all, we have to know how many instances of the cube object we want to render.

/* Number of cubes that are drawn on a screen. */
#define NUMBER_OF_CUBES (10)

The next thing to do is to adjust the draw command so that all the cubes are rendered. Note that this time we use glDrawArraysInstanced() instead of glDrawArrays().

/* Draw cubes on a screen. */
GL_CHECK(glDrawArraysInstanced(GL_TRIANGLES,
                               0,
                               numberOfCubeVertices,
                               NUMBER_OF_CUBES));

We are now drawing NUMBER_OF_CUBES cubes on screen. The problem is that all of them are rendered at the same position, so we cannot see them all. We need to set a different position for each cube. The cubes should also have different colors, so we will use a uniform block for this purpose.

Two new things have been added to our vertex shader.

/*
 * We use uniform block in order to reduce amount of memory transfers to minimum. 
 * The uniform block uses data taken directly from a buffer object. 
 */
uniform CubesUniformBlock
{
    float startPosition[numberOfCubes];
    vec4  cubeColor[numberOfCubes];
};

where numberOfCubes is defined as

const int   numberOfCubes = 10;

Then, if we want to take an element from one of those arrays, we need to use gl_InstanceID as the index. It indicates the index of the element currently being rendered (in our case, this value comes from the range [0, NUMBER_OF_CUBES - 1]).

In the API, we need to retrieve the location of the uniform block,

uniformBlockIndex = GL_CHECK(glGetUniformBlockIndex(renderingProgramId, "CubesUniformBlock"));

verify that the returned value is valid,

ASSERT(uniformBlockIndex != GL_INVALID_INDEX, "Could not retrieve uniform block index: CubesUniformBlock");

and set the data.

/* Set binding point for uniform block. */
GL_CHECK(glUniformBlockBinding(renderingProgramId,
                               uniformBlockIndex,
                               0));
GL_CHECK(glBindBufferBase     (GL_UNIFORM_BUFFER,
                               0,
                               uniformBlockDataBufferObjectId));

The program object will use the data stored in the buffer object named uniformBlockDataBufferObjectId.

/* Buffer holding coordinates of start positions of cubes and RGBA values of colors. */
GL_CHECK(glBindBuffer(GL_ARRAY_BUFFER,
                      uniformBlockDataBufferObjectId));
GL_CHECK(glBufferData(GL_ARRAY_BUFFER,
                      sizeof(startPosition) + sizeof(cubeColors),
                      NULL,
                      GL_STATIC_DRAW));
GL_CHECK(glBufferSubData(GL_ARRAY_BUFFER,
                         0,
                         sizeof(startPosition),
                         startPosition));
GL_CHECK(glBufferSubData(GL_ARRAY_BUFFER,
                         sizeof(startPosition),
                         sizeof(cubeColors),
                         cubeColors));
void generateStartPosition()
{
    float spaceBetweenCubes = (2 * M_PI) / (NUMBER_OF_CUBES);
    /* Fill array with startPosition data. */
    for (int allCubes = 0; allCubes < NUMBER_OF_CUBES; allCubes++)
    {
        startPosition[allCubes] = allCubes * spaceBetweenCubes;
    }
}
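generateStartPosition() spreads the cubes evenly over a full circle: consecutive start angles differ by 2π / NUMBER_OF_CUBES, so the last cube plus one more step wraps around to 2π. A small check of that arithmetic:

```cpp
#include <cassert>
#include <cmath>

const int   NUMBER_OF_CUBES = 10;
const float pi              = 3.14159265358979f;

float startPosition[NUMBER_OF_CUBES];

/* Same logic as generateStartPosition() in the sample: angle i is
 * i * (2 * pi / NUMBER_OF_CUBES), giving even spacing around a circle. */
void generateStartPosition()
{
    float spaceBetweenCubes = (2.0f * pi) / NUMBER_OF_CUBES;
    for (int allCubes = 0; allCubes < NUMBER_OF_CUBES; allCubes++)
    {
        startPosition[allCubes] = allCubes * spaceBetweenCubes;
    }
}
```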
void fillCubeColorsArray()
{
    for (int allComponents = 0;
             allComponents < numberOfValuesInCubeColorsArray;
             allComponents++)
    {
        /* Get random value from [0.0, 1.0] range. */
        cubeColors[allComponents] = (float)rand() / (float)RAND_MAX;
    }
}

We want our cubes to move along a circular trajectory, to rotate at different speeds, and to have different colours.
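The shader below derives both effects from gl_InstanceID and the time uniform: each cube orbits the circle starting from its startPosition angle (offset by time / 3), and its rotation angle grows as (gl_InstanceID + 1) * 5 * time, so higher instance indices spin faster. The same arithmetic on the CPU side (function names are illustrative):

```cpp
#include <cassert>
#include <cmath>

const float radius = 20.0f;

/* Position of a cube on its circular trajectory, as computed for
 * locationOfCube in the vertex shader. */
void cubeLocation(float startAngle, float time, float* x, float* y)
{
    *x = radius * cosf(startAngle + time / 3.0f);
    *y = radius * sinf(startAngle + time / 3.0f);
}

/* Rotation angle in degrees: instance 0 rotates slowest, the last
 * instance fastest, exactly as in the shader. */
float cubeRotationDegrees(int instanceId, float time)
{
    return float(instanceId + 1) * 5.0f * time;
}
```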

Vertex shader code:

/* [Define number of cubes] */
const int   numberOfCubes = 10;
/* [Define number of cubes] */
const float pi            = 3.14159265358979323846;
const float radius        = 20.0;
in      vec4 attributeColor;
in      vec4 attributePosition;
out     vec4 vertexColor;
uniform vec3 cameraVector;
uniform vec4 perspectiveVector;
uniform float time; /* Time value used for determining positions and rotations. */
/* [Define uniform block] */
/*
 * We use uniform block in order to reduce amount of memory transfers to minimum. 
 * The uniform block uses data taken directly from a buffer object. 
 */
uniform CubesUniformBlock
{
    float startPosition[numberOfCubes];
    vec4  cubeColor[numberOfCubes];
};
/* [Define uniform block] */
void main()
{
    float fieldOfView = 1.0 / tan(perspectiveVector.x * 0.5);
    
    /* Vector data used for translation of cubes (each cube is placed on and moving around a circular curve). */
    vec3 locationOfCube = vec3(radius * cos(startPosition[gl_InstanceID] + (time/3.0)),
                               radius * sin(startPosition[gl_InstanceID] + (time/3.0)),
                               1.0);
    /* 
     * Vector data used for setting rotation of cube. Each cube has different speed of rotation,
     * first cube has the slowest rotation, the last one has the fastest. 
     */
    vec3 rotationOfCube = vec3 (float(gl_InstanceID + 1) * 5.0 * time);
    
    /* 
     * Set different random colours for each cube. 
     * There is one colour passed in per cube set for each cube (cubeColor[gl_InstanceID]).
     * There are also different colours per vertex of a cube (attributeColor).
     */
    vertexColor = attributeColor * cubeColor[gl_InstanceID];
    
    /* Create transformation matrices. */
    mat4 translationMatrix = mat4 (1.0,              0.0,              0.0,              0.0, 
                                   0.0,              1.0,              0.0,              0.0, 
                                   0.0,              0.0,              1.0,              0.0, 
                                   locationOfCube.x, locationOfCube.y, locationOfCube.z, 1.0);
                                  
    mat4 cameraMatrix = mat4 (1.0,            0.0,            0.0,            0.0, 
                              0.0,            1.0,            0.0,            0.0, 
                              0.0,            0.0,            1.0,            0.0, 
                              cameraVector.x, cameraVector.y, cameraVector.z, 1.0);
    
    mat4 xRotationMatrix = mat4 (1.0,  0.0,                                0.0,                                0.0, 
                                 0.0,  cos(pi * rotationOfCube.x / 180.0), sin(pi * rotationOfCube.x / 180.0), 0.0, 
                                 0.0, -sin(pi * rotationOfCube.x / 180.0), cos(pi * rotationOfCube.x / 180.0), 0.0, 
                                 0.0,  0.0,                                0.0,                                1.0);
                                
    mat4 yRotationMatrix = mat4 (cos(pi * rotationOfCube.y / 180.0), 0.0, -sin(pi * rotationOfCube.y / 180.0), 0.0, 
                                 0.0,                                1.0,  0.0,                                0.0, 
                                 sin(pi * rotationOfCube.y / 180.0), 0.0,  cos(pi * rotationOfCube.y / 180.0), 0.0, 
                                 0.0,                                0.0,  0.0,                                1.0);
                                
    mat4 zRotationMatrix = mat4 ( cos(pi * rotationOfCube.z / 180.0), sin(pi * rotationOfCube.z / 180.0), 0.0, 0.0, 
                                 -sin(pi * rotationOfCube.z / 180.0), cos(pi * rotationOfCube.z / 180.0), 0.0, 0.0, 
                                  0.0,                                0.0,                                1.0, 0.0, 
                                  0.0,                                0.0,                                0.0, 1.0);
                                 
    mat4 perspectiveMatrix = mat4 (fieldOfView/perspectiveVector.y, 0.0,        0.0,                                                                                              0.0, 
                                   0.0,                            fieldOfView, 0.0,                                                                                              0.0, 
                                   0.0,                            0.0,        -(perspectiveVector.w + perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z),        -1.0, 
                                   0.0,                            0.0,        (-2.0 * perspectiveVector.w * perspectiveVector.z) / (perspectiveVector.w - perspectiveVector.z), 0.0);
    /* Compute rotation. */
    mat4 tempMatrix = xRotationMatrix;
    
    tempMatrix = yRotationMatrix * tempMatrix;
    tempMatrix = zRotationMatrix * tempMatrix;
    
    /* Compute translation. */
    tempMatrix = translationMatrix * tempMatrix;
    tempMatrix = cameraMatrix      * tempMatrix;
                
    /* Compute perspective. */
    tempMatrix = perspectiveMatrix * tempMatrix;
                
    /* Return gl_Position. */
    gl_Position = tempMatrix * attributePosition;
}
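Note that the GLSL mat4 constructor is column-major: the translation components of translationMatrix occupy the fourth column, even though they appear as the bottom row in the source text. A minimal sketch of applying such a column-major translation matrix to a point (a plain C++ illustration, not part of the sample):

```cpp
#include <cassert>

/* Multiply a column-major 4x4 matrix (the layout GLSL's mat4 constructor
 * uses) by a column vector: out = m * v. Element (row, col) is stored
 * at m[col * 4 + row]. */
void mat4MulVec4(const float m[16], const float v[4], float out[4])
{
    for (int row = 0; row < 4; ++row)
    {
        out[row] = m[0 * 4 + row] * v[0] +
                   m[1 * 4 + row] * v[1] +
                   m[2 * 4 + row] * v[2] +
                   m[3 * 4 + row] * v[3];
    }
}
```

With the translation (tx, ty, tz) stored in elements 12-14 (the fourth column), multiplying a point (x, y, z, 1) yields (x + tx, y + ty, z + tz, 1), which is exactly what translationMatrix does in the shader.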

Fragment shader code:

in vec4 vertexColor;
out vec4 fragmentColour;
void main()
{
    fragmentColour = vertexColor;
}

In the API, we query the locations of all the uniforms used in the vertex shader by calling:

uniformLocation = glGetUniformLocation(renderingProgramId, "uniformName");
ASSERT(uniformLocation != -1, "Could not retrieve uniform location: uniformName");

Here, uniformName stands for the actual name of a uniform used in the shader.

Then, depending on the uniform type (float, vec3, vec4), we use different OpenGL ES calls to set the uniform's value.

The camera position and perspective vectors are constant during rendering, so it is enough to issue the calls shown below only once.

GL_CHECK(glUniform4fv(perspectiveMatrixLocation,
                      1,
                      (GLfloat*)&perspectiveVector));
GL_CHECK(glUniform3fv(cameraPositionLocation,
                      1,
                      (GLfloat*)&cameraVector));

The time value should be updated every frame, which is why the call

GL_CHECK(glUniform1f(timeLocation, time));

is issued for every frame being rendered (it is placed inside the renderFrame() function).

After completing all of the steps above, we get the result:
