Alex Dixon edited this page Apr 1, 2019 · 19 revisions

The entity component system (ecs) and pmfx are the two most notable and powerful features of pmtech. Used together they provide high level, rapid, data-driven development along with fast low level performance.

Entity Component System

The ecs is a data-oriented, structure-of-arrays system for updating and rendering entities. It stores all entity components in contiguous arrays of memory and processes components sequentially to provide maximum theoretical throughput for data and instruction caches.
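The structure-of-arrays layout can be sketched as follows. The names here (soa_scene, update_positions) are illustrative only, not pmtech's actual component or function names:

```cpp
#include <cstddef>
#include <vector>

// Minimal structure-of-arrays sketch: each component type lives in its own
// contiguous array, indexed by entity, so a per-component update walks
// memory sequentially.
struct vec3 { float x, y, z; };

struct soa_scene
{
    std::vector<vec3> position; // one entry per entity
    std::vector<vec3> velocity; // one entry per entity
};

// Touches positions and velocities in order, keeping the data cache hot and
// letting the same instructions run over every entity.
void update_positions(soa_scene& scene, float dt)
{
    for (size_t i = 0; i < scene.position.size(); ++i)
    {
        scene.position[i].x += scene.velocity[i].x * dt;
        scene.position[i].y += scene.velocity[i].y * dt;
        scene.position[i].z += scene.velocity[i].z * dt;
    }
}
```

The contrast is with an array-of-structs layout, where each entity's components are interleaved and a single-component update strides over unrelated data.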

Pmfx

Pmfx is a high level shader system and data-driven renderer. It uses HLSL as the basis for its shader language and extends it with techniques, permutations and shader parameters which can be configured using json. Pmfx can be used to render ecs entities through scene view renderers, and the rendering pipeline can also be configured using json config files.

Shaders

Shaders can be written against HLSL shader model 4 and 5; GLSL and Metal shaders are automatically generated by the build script build_pmfx.py. You can build shaders for a target platform as follows:

pmbuild -shaders

Building shaders generates platform-specific shader source, runs it through the platform's shader compilation pipeline to surface errors, and generates an info file containing reflection information and other metadata.

Simple syntax differences between languages are handled via macros in _shader_macros.h. More complex differences are handled through code generation. There are some slight differences from regular HLSL to take note of when handling textures:

declare_texture_samplers
{
    texture_2d( diffuse_texture, 0 );
    texture_2d( normal_texture, 1 );
    texture_2d( specular_texture, 2 );	
    texture_3d( sdf_volume, 14 );
    texture_2d( shadowmap_texture, 15 );
};

void main()
{
    float4 col = sample_texture(diffuse_texture, tex_coord);
}

The second parameter passed to the texture declaration functions is the register to bind to. Metadata for each shader is generated during compilation so that platforms which do not support explicit register binding (such as GLSL 330) can be fixed up at load time, with the appropriate textures bound to the correct slots.
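A load-time fixup of this kind could look roughly like the following. The struct and function names are hypothetical, not pmtech's actual API, and the metadata format is assumed to have been parsed into name/register pairs:

```cpp
#include <cstdint>
#include <map>
#include <string>
#include <vector>

// Hypothetical reflection record parsed from a shader's generated info file;
// pmtech's actual metadata layout may differ.
struct texture_binding
{
    std::string name; // e.g. "shadowmap_texture"
    uint32_t    unit; // register declared in the shader source
};

// On platforms without explicit register binding (such as GLSL 330), sampler
// uniforms are looked up by name at load time and pointed at the intended
// unit (e.g. via glUniform1i). This builds the name -> unit table.
std::map<std::string, uint32_t> build_binding_table(const std::vector<texture_binding>& reflection)
{
    std::map<std::string, uint32_t> table;
    for (const texture_binding& b : reflection)
        table[b.name] = b.unit;
    return table;
}
```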

The same system is used to bind constant buffers to slots. All shader constants must be supplied via constant buffers; single OpenGL-style uniforms are not permitted.

cbuffer per_pass_view : register(b0)
{
	float4x4 vp_matrix;
	float4x4 view_matrix;
	float4x4 view_matrix_inverse;
	
	float4 camera_view_pos;
	float4 camera_view_dir;
};

The ecs and pen::renderer will bind certain built in constant buffers to designated slots which can be relied upon to avoid code duplication or repeated work.

  • cbuffer 0 = per_pass_view (view matrix / camera info for the currently rendering view).
  • cbuffer 1 = per_draw_call (object world matrix, inverse world matrix and 8 floats of custom data).
  • cbuffer 3 = per_pass_lights (list of lights and a count).
  • cbuffer 4 = per_pass_shadow (list of shadow map matrices).
  • cbuffer 5 = per_pass_shadow_distance_fields (list of signed distance field volume matrices).
  • cbuffer 7 = material_data (specified in technique constants).
  • texture 14 = signed distance field texture.
  • texture 15 = shadow map texture array.
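On the CPU side, a matching layout for the per_pass_view buffer shown earlier might look like this. This is a sketch only; pmtech's actual structs may differ:

```cpp
#include <cstddef>

// Stand-in math types following HLSL sizes: float4 is 16 bytes, float4x4 is 64.
struct float4   { float v[4]; };
struct float4x4 { float m[16]; };

// CPU-side mirror of the per_pass_view cbuffer bound to slot b0. Every member
// is a multiple of 16 bytes, so the struct packs with no implicit padding and
// can be uploaded to the constant buffer verbatim.
struct per_pass_view
{
    float4x4 vp_matrix;
    float4x4 view_matrix;
    float4x4 view_matrix_inverse;
    float4   camera_view_pos;
    float4   camera_view_dir;
};

// Three 64-byte matrices plus two 16-byte vectors = 224 bytes.
static_assert(sizeof(per_pass_view) == 224, "layout must match the cbuffer");
```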

Files can be included to share functionality; modular functions can be found in lighting.slib, skinning.slib, maths.slib and more. These provide different lighting equations, attenuation functions, skinning utilities and other helpers.
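As an illustration of the kind of utility these files contain, a typical smooth distance attenuation curve might look like the following. This is a sketch only; the actual .slib implementations may differ:

```cpp
#include <algorithm>

// Radius-based attenuation: 1.0 at the light's position, falling smoothly
// to 0.0 at the light radius, with a squared falloff curve.
float attenuate(float distance, float radius)
{
    float t = std::min(distance / radius, 1.0f);
    float f = 1.0f - t * t;
    return f * f;
}
```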

Techniques

Multiple shaders can exist within the same file. A json object specifies which vs and ps entry points to use, generating a shader technique much like hlsl/fx techniques. Techniques can also specify tweakable constants which go into cbuffer 7 and are automatically enumerated in the pmtech editor:

"forward_lit_sdf_shadow":
{
    "vs": "vs_main",
    "ps": "ps_forward_lit",

    "constants":
    {
        "albedo": { "type": "float4", "widget": "colour", "default": [1.0, 1.0, 1.0, 1.0] },
        "roughness": { "type": "float", "widget": "slider", "min": 0, "max": 1, "default": 0.5 },
        "reflectivity": { "type": "float", "widget": "slider", "min": 0, "max": 1, "default": 0.5 },
        "surface_offset": { "type": "float", "widget": "slider", "min": 0, "max": 1, "default": 0.3 }
    }
},
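These technique constants pack sequentially into cbuffer 7 (material_data). A sketch of the resulting CPU-side layout, assuming pmfx packs constants in declaration order with HLSL-style 16-byte register packing (the names mirror the json above; the exact packing rules belong to pmfx):

```cpp
#include <cstddef>

// CPU-side sketch of cbuffer 7 (material_data) for forward_lit_sdf_shadow.
struct material_data
{
    float albedo[4];      // float4 colour fills the first register
    float roughness;      // the three scalar constants share the
    float reflectivity;   // second float4 register...
    float surface_offset;
    float pad;            // ...with one float of padding at the end
};

// Two 16-byte constant registers in total.
static_assert(sizeof(material_data) == 32, "two float4 registers");
```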

Vertex stream out / transform feedback can be utilised on any shader by simply setting "stream_out" to "true". The Vertex Stream Out sample shows how vertex stream out can render a large number of skinned meshes through instancing while skinning the mesh only once per frame.

"pre_skin":
{
    "vs": "vs_main_pre_skin",
    "stream_out": "true"
},

Permutations

Shader permutations, akin to uber-shaders, can be created using a modified if statement; defines are placed into the "defines" or "permutations" key in the pmfx json block. All if: statements are converted to #if defined() before compilation. The if style is preferred over raw preprocessor directives because the indentation and nesting of complex #if / #endif blocks can be harder to follow:

if:(SDF_SHADOW)
{
    float s = sdf_shadow_trace();
    light_col *= smoothstep( 0.0, 0.1, s);
}
Permutations are declared in the technique's json block, for example:

"gbuffer":
{
    "vs": "vs_main",
    "ps": "ps_gbuffer",
        
    "permutations":
    {
        "SKINNED": [31, [0,1]],
        "INSTANCED": [30, [0,1]],
        "UV_SCALE": [1, [0,1]]
    },
        
    "inherit_constants": ["forward_lit"]
},

Permutation combinations are automatically generated, each named after the shader with an appended number generated from the set bitmask. In the above example "SKINNED" is bit 31 and can be on or off; permutation options can also have multiple states, such as a "QUALITY_LEVEL" ranging from 0 to 5.
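The id scheme for the gbuffer example above can be sketched as follows: each option's value is shifted to its declared bit and the results are OR'd together. This is an illustration, not build_pmfx.py's exact algorithm:

```cpp
#include <cstdint>

// "SKINNED": [31, [0,1]] means the value (0 or 1) occupies bit 31;
// "INSTANCED" occupies bit 30 and "UV_SCALE" occupies bit 1.
uint32_t permutation_id(uint32_t skinned, uint32_t instanced, uint32_t uv_scale)
{
    return (skinned << 31) | (instanced << 30) | (uv_scale << 1);
}
```

So the skinned, instanced variant of "gbuffer" would carry a number with bits 31 and 30 set.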

Renderer

The pmfx renderer can be configured using json config files to define render pipeline state and make draw calls through a mechanism called a view. A view specifies a render function to be dispatched into a render target with a specified pipeline state. Note that it is possible to use a relaxed json variant without quotes around keys; for this reason pmfx render configs have a .jsn extension.

Render States

Render states can be created and re-used by name:

blend_states:
{
    additive:
    {
        blend_enable: true,
        src_blend   : one,
        dest_blend  : one
    },
},

sampler_states:
{
    wrap_linear:
    {
        filter : linear,
        address: wrap
    },
},

depth_stencil_states:
{
    default:
    {
        depth_enable: true,
        depth_write : true,
        depth_func  : "less"
    },
},

raster_states:
{
    front_face_cull:
    {
        cull_mode: front,
        fill_mode: solid
    },
},

render_targets:
{
    shadow_map:
    {
        size  : [2048, 2048],
        format: d24s8
    }
}

Render states can also be obtained and set in code like the following:

u32 sampler_state = pmfx::get_render_state(PEN_HASH("wrap_linear"), pmfx::RS_SAMPLER);
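PEN_HASH turns the state name into a numeric id for the lookup. As an illustration only, a simple FNV-1a string hash shows the idea; pmtech's actual hash function may use a different scheme:

```cpp
#include <cstdint>

// FNV-1a: a common choice for hashing short identifier strings into
// stable 32-bit ids that can be compared instead of the strings themselves.
uint32_t hash_name(const char* s)
{
    uint32_t h = 2166136261u;
    for (; *s; ++s)
    {
        h ^= (uint8_t)*s;
        h *= 16777619u;
    }
    return h;
}
```

Hashing at compile time (or once at startup) means render state lookups avoid string comparisons on the hot path.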

Views

Render states are collected into views; views can inherit one another by using inherit: view_name.

main_view:
{
    target: [main_colour, main_depth],
    clear_colour : [0.0, 0.0, 0.0, 1.0],
    clear_depth : 1.0,
    colour_write_mask : 0xf,
    blend_state : disabled,
    viewport : [0.0, 0.0, 1.0, 1.0],
    raster_state : default,
    depth_stencil_state : default,
    scene : main_scene,
    camera : model_viewer_camera,
    scene_views : ["ces_render_scene"],
    render_flags : ["forward_lit"]

    sampler_bindings :
    [
        { texture: shadow_map, unit : 15, state : wrap_linear, shader : ps },
    ],
},

At startup, render functions and cameras need to be registered with pmfx so they can be accessed in the config files. Ecs scenes have render functions which draw geometry entities and also draw light volumes for deferred rendering. Scene views will set up render state for each pass, but render functions are free to modify render state themselves.

Views are collected together into view sets, and a single view set is enabled at a time. A view set for a game or program may look like this:

  • render shadow maps
  • render main scene into gbuffer
  • render light volumes and apply shadow maps
  • render debug lines

You can set up multiple view sets and switch between them in real time. This can be useful, for example, to switch between a game mode and an edit mode, or a game might switch between forward and deferred rendering depending on the environment.

Post process sets and chains

Post processes are treated as views and can be chained together inside a post process set to create complex post processing effects. Post process sets consist of a chain of post processes and a set of parameters.

  • A chain is a set of post processes.
  • A post process is a set of views.
  • Post process parameters are .pmfx shader constants.

For example bloom would consist of:

  • High pass filter
  • Downsample
  • Blur Horizontal
  • Blur Vertical
  • Repeat for wider blooms
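A bloom set along these lines might be expressed in a config roughly as follows. The names here are illustrative, not the exact pmfx post process view names:

```jsn
post_process_sets:
{
    bloom:
    {
        chain: [bloom_high_pass, bloom_downsample, bloom_blur_h, bloom_blur_v]
    }
}
```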

Users do not need to worry about "ping-ponging" between different read and write buffers. Simply reading and writing a render target in a config file is enough; the pmfx system will work out render target dependencies and allocate the minimum number of auxiliary targets required for the effects.
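The idea behind the dependency analysis can be sketched for the simplest case: in a linear chain where each pass reads the previous pass's output, two auxiliary targets suffice no matter how long the chain is. This is a simplified model of what pmfx derives from the config, not its actual implementation:

```cpp
#include <utility>
#include <vector>

// For a linear post process chain, alternate between two auxiliary targets
// (0 and 1): each pass reads what the previous pass wrote and writes the
// other target, so no pass ever reads and writes the same target.
// Returns a (read, write) target index pair per pass.
std::vector<std::pair<int, int>> assign_targets(int num_passes)
{
    std::vector<std::pair<int, int>> assignment;
    for (int i = 0; i < num_passes; ++i)
        assignment.push_back({i % 2, (i + 1) % 2});
    return assignment;
}
```

Branching chains (a pass that reads outputs from several earlier passes) need more targets, which is exactly the bookkeeping the config analysis takes off the user's hands.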

A UI is automatically created to make creating and editing post processes simple; it can save out the resulting .jsn config file with post process sets, chains and parameters all set up and ready to be loaded.
