Because the points renderer cannot know in advance which attributes
will be used for rendering, it is now mandatory to provide
vertex/fragment shaders for both rendering and hit detection.
The hit detection shaders should expect an `a_hitColor` attribute
and output it unchanged, so that the feature uid can be matched.
Eventually this will all be done by shader builders under the hood.
Now most attributes will be custom ones defined by the user of the renderer.
The hit detection is broken for now and still has to be fixed.
It is also no longer possible to give a texture to the renderer;
this will have to be fixed as well.
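As an illustration, such a pair of hit detection shaders could look like the sketch below; only `a_hitColor` is imposed by the renderer, and the other names (`a_position`, `v_hitColor`, `u_projectionMatrix`) are placeholders chosen for this sketch.

```ts
// Hypothetical hit detection shader pair: the vertex shader forwards the
// per-vertex `a_hitColor` and the fragment shader writes it out unchanged,
// so the pixel can be matched back to a feature uid later.
const hitVertexShader = `
  precision mediump float;
  uniform mat4 u_projectionMatrix;
  attribute vec2 a_position;
  attribute vec4 a_hitColor;
  varying vec4 v_hitColor;
  void main() {
    gl_Position = u_projectionMatrix * vec4(a_position, 0.0, 1.0);
    gl_PointSize = 4.0;
    v_hitColor = a_hitColor;
  }`;

const hitFragmentShader = `
  precision mediump float;
  varying vec4 v_hitColor;
  void main() {
    gl_FragColor = v_hitColor;
  }`;
```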
For each feature, its opacity value index is encoded on 4 bytes
in the color values, and the uid is stored in the opacity value,
allowing a much wider range of uids to be read.
For now only `forEachFeatureAtCoordinate` is implemented.
Each time the viewport is rendered, a second, similar render pass is
done using the dedicated hit detection instructions. Feature uids are
encoded in the r, g, b, a channels and can then be decoded on the fly.
Note: the `readPixels` operation takes a lot of time,
around 10-20 ms per frame.
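As a sketch of the idea (these are not the renderer's actual helpers), packing a uid into the four 8-bit channels and reading it back could look like this:

```ts
// Hypothetical helpers: pack a feature uid into r, g, b, a (8 bits each)
// and recover it from a pixel read back with readPixels.
function encodeIdInColor(id: number): [number, number, number, number] {
  return [(id >>> 24) & 0xff, (id >>> 16) & 0xff, (id >>> 8) & 0xff, id & 0xff];
}

function decodeIdFromColor(pixel: Uint8Array): number {
  // `>>> 0` forces an unsigned 32-bit result.
  return ((pixel[0] << 24) | (pixel[1] << 16) | (pixel[2] << 8) | pixel[3]) >>> 0;
}

// Example: read the pixel under the pointer from the hit render target
// and decode the uid it carries.
function readUidAtPixel(gl: WebGLRenderingContext, x: number, y: number): number {
  const data = new Uint8Array(4);
  gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, data);
  return decodeIdFromColor(data);
}
```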
Hit detection is done by rendering features with their id encoded in the
color attribute. A parallel set of render instructions and a second
vertex buffer are used specifically for that.
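A rough sketch of how such parallel instructions could be produced, with hypothetical structures and field names:

```ts
// Hypothetical sketch: for each feature the regular render instructions are
// written, and in parallel a second instruction array that also carries the
// feature's hit color, from which the dedicated hit vertex buffer is built.
interface PointFeature {
  x: number;
  y: number;
  hitColor: [number, number, number, number]; // normalized rgba encoding the uid
}

function buildInstructions(features: PointFeature[]): {
  renderInstructions: Float32Array;
  hitInstructions: Float32Array;
} {
  const render: number[] = [];
  const hit: number[] = [];
  for (const feature of features) {
    render.push(feature.x, feature.y);
    // Same base instructions plus the hit color, used as a vec4 attribute.
    hit.push(feature.x, feature.y, ...feature.hitColor);
  }
  return {
    renderInstructions: Float32Array.from(render),
    hitInstructions: Float32Array.from(hit),
  };
}
```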
The worker currently receives GENERATE_BUFFERS messages and sends back
the same kind of message, with the generated buffers attached. All
properties of the original message are kept, so that when a
GENERATE_BUFFERS message comes back to the main thread it is possible
to know what the buffers were generated from and how.
This is typically used for the `projectionTransform` matrix, and
will also be necessary when working with tiles.
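A minimal sketch of the worker side, assuming an illustrative message shape rather than the renderer's exact protocol:

```ts
// Worker-side sketch: generate the buffers from the received instructions and
// echo back every property of the original message, so the main thread can
// tell what the buffers were generated from (e.g. which projectionTransform).
const scope = self as unknown as Worker; // dedicated worker global scope

// Placeholder for the actual buffer generation (vertex expansion, indices).
function generateBuffers(instructions: Float32Array): {
  vertexBuffer: Float32Array;
  indexBuffer: Uint32Array;
} {
  return {vertexBuffer: new Float32Array(0), indexBuffer: new Uint32Array(0)};
}

scope.onmessage = (event: MessageEvent) => {
  const received = event.data;
  if (received.type !== 'GENERATE_BUFFERS') {
    return;
  }
  const instructions = new Float32Array(received.renderInstructions);
  const {vertexBuffer, indexBuffer} = generateBuffers(instructions);
  scope.postMessage(
    {
      ...received, // keep every original property (projectionTransform, ...)
      vertexBuffer: vertexBuffer.buffer,
      indexBuffer: indexBuffer.buffer,
    },
    [vertexBuffer.buffer, indexBuffer.buffer] // transfer instead of copying
  );
};
```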
The worker receives a transferable array of instructions
and sends back two transferable arrays (vertex and index buffer).
The projection transform is also sent so that when the main thread
receives the buffers from the worker it also knows which projection to
apply when rendering the geometries.
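And a matching sketch of the main thread side, again with illustrative names:

```ts
// Main-thread sketch: send the instructions as a transferable buffer and,
// when the answer comes back, rebuild typed arrays from the returned buffers
// and keep the projectionTransform that was echoed back with them.
const bufferWorker = new Worker('hypothetical-buffer-worker.js');

function requestBuffers(
  renderInstructions: Float32Array,
  projectionTransform: number[]
): void {
  bufferWorker.postMessage(
    {
      type: 'GENERATE_BUFFERS',
      renderInstructions: renderInstructions.buffer,
      projectionTransform,
    },
    [renderInstructions.buffer] // ownership moves to the worker
  );
}

bufferWorker.onmessage = (event) => {
  const {vertexBuffer, indexBuffer, projectionTransform} = event.data;
  const vertices = new Float32Array(vertexBuffer);
  const indices = new Uint32Array(indexBuffer);
  // The echoed projectionTransform tells the renderer in which projection
  // the vertex coordinates were expressed when the buffers were generated.
  useBuffers(vertices, indices, projectionTransform);
};

function useBuffers(
  vertices: Float32Array,
  indices: Uint32Array,
  projectionTransform: number[]
): void {
  // ...upload to the GPU and remember the transform for rendering...
}
```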
The `WebGLHelper` class now provides a `makeProjectionTransform` method
that updates a transform to match the projection for a given frame state.
This also means that the WebGLHelper no longer sets the projection
matrix uniform; this is now the responsibility of the renderer, as
the rendered coordinates will no longer be in world space.
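Sketched as renderer-side code, where everything except `makeProjectionTransform` is a placeholder name, this could look like:

```ts
// Renderer-side sketch: the renderer, not WebGLHelper, now derives the
// projection transform for the frame and uploads the projection matrix
// uniform, since the vertex coordinates are no longer in world space.
function prepareFrame(helper: any, frameState: any): void {
  // 2D affine transform as a flat [a, b, c, d, e, f] array, identity here.
  const projectionTransform = [1, 0, 0, 1, 0, 0];
  helper.makeProjectionTransform(frameState, projectionTransform);
  // Typically combined with the transform the buffers were generated with,
  // then expanded to a 4x4 matrix and handed to the shader.
  uploadProjectionMatrix(helper, transformToMat4(projectionTransform));
}

function transformToMat4(t: number[]): number[] {
  // Expand the 2D affine transform into a column-major 4x4 matrix.
  return [t[0], t[1], 0, 0, t[2], t[3], 0, 0, 0, 0, 1, 0, t[4], t[5], 0, 1];
}

function uploadProjectionMatrix(helper: any, matrix: number[]): void {
  // ...set the projection matrix uniform (e.g. u_projectionMatrix) through
  // the helper; the exact call is elided here...
}
```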