Högni and Amy's game project

Thought in deed

Archive for the ‘VoidEngine’ Category

Standard material done

For all practical purposes, I consider the work I’ve done on exporting the standard material type from ShaderFX to an Ogre3D-readable format complete.

It now correctly exports all the subnodes and lights, and all the calculations appear correct, including advanced functions like projection mapping based on vertex positions, alphas based on vertex colors, ambient occlusion, diffuse, normal mapping and more.

I’m now going to proceed to create an exporter for the Glow and Subsurface scattering materials, as those are the ones I see us needing in the near future.

Even though this has taken a lot of time to implement, I think it will save us untold hours of work in the future, even if I leave it standing at only the standard material.

Unfortunately I’ve not gotten any further response from Lumonix on this matter, so I don’t know if this work will ever reach the public. I can’t release it on my own, as it is proprietary software that I’ve made extensions to…

I’ll update on this status when I can.

Written by Högni Gylfason

06.10.09 at 12:43

Ogre3D and ShaderFX… Living in (almost) perfect harmony

There is some progress on the exporter for Ogre in ShaderFX. I’ve almost finished the StandardMat support. There are still some niggles, like the light projection not rotating with the object, but I think I can figure that out.

Planet in Max with SFX Panel

This is the material as displayed in Max in CgFX mode. The normal is scaled and UV-rotated with a time input, and the diffuse texture is plugged into the ambient socket through a modulator, as well as into the diffuse socket.

Planet in OgreMax Viewport

This is the same object in the OgreMax viewport in Max. Looks pretty neat. I also tested it in the OgreMax viewer to check the UV rotation, and it works like a charm with the time input. Below are the shader settings in the OgreMax material editor panes.

Diffuse shader settings

Ambient shader settings

I’m quite happy with it so far. I’m hoping to do the Glow material next, not because it’s next in order of general importance, but because I need it 🙂

Written by Högni Gylfason

26.09.09 at 13:21

Ogre3D and ShaderFX

I’ve been using Lumonix’s excellent ShaderFX plugin for 3ds Max to generate shaders, and then altering the generated shaders to comply with the Ogre3D material framework. To say the least, this has not been without problems. The main problem is that Ogre3D is designed to be very open as far as the content pipeline is concerned. The material structure is very flexible, but at the same time it suffers for it: it cannot use the technique blocks native to shader programs.

A shader program usually follows a set pattern. At the top are definitions of constants and “tweakables”. These are the inputs to the shader program, such as colours, values for specular and gloss levels, texture samplers and more. After these follow definitions of the matrix structures the shader requires to work properly, such as the world matrix, the world-view-projection matrix, the view matrix and many others. Thereafter come definitions of the input and output structs for the vertex and fragment shaders. Then comes the real meat: the vertex and fragment shader entry points themselves and their internals. Last come the technique definitions, where everything is woven together.
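The layout described above can be sketched as a minimal CgFX-style skeleton. All the names, values and semantics here are illustrative, not taken from any ShaderFX output:

```hlsl
// Tweakables: the artist-facing inputs
float4 diffuseColour = {1, 1, 1, 1};
float specularLevel = 0.5;
texture diffuseMap;
sampler2D diffuseSampler = sampler_state { Texture = <diffuseMap>; };

// Matrices supplied by the host application
float4x4 wvp : WorldViewProjection;
float4x4 world : World;

// Input/output structs for the vertex and fragment shaders
struct VsIn  { float4 pos : POSITION; float2 uv : TEXCOORD0; };
struct VsOut { float4 pos : POSITION; float2 uv : TEXCOORD0; };

// The entry points themselves
VsOut mainVS(VsIn v)
{
    VsOut o;
    o.pos = mul(v.pos, wvp);
    o.uv  = v.uv;
    return o;
}

float4 mainPS(VsOut i) : COLOR
{
    return tex2D(diffuseSampler, i.uv) * diffuseColour;
}

// The technique definition weaving it all together
technique Main
{
    pass p0
    {
        VertexShader = compile vs_2_0 mainVS();
        PixelShader  = compile ps_2_0 mainPS();
    }
}
```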

Most 3D engines read the techniques and apply materials to the objects in the 3D representation. Ogre goes about this circuitously at best. It does not read the techniques at all, but leaves it to the programmer to know the entry point names and then declare shader definitions that are in turn used in material scripts. To add to this, most shaders exported from any shader authoring software do not compile properly with Ogre without alteration; for example, the operand order of matrix multiplication is reversed when multiplying a matrix with a column vector. The shader declarations need to include the entry point name, the target profile (vs_2_0, ps_3_0, etc.) and the source file for the shader proper. Then you define the bindings from the world matrices to the shader inputs, e.g. wvp -> world view projection matrix. These, in turn, are read by entirely different files, called material files, where you point to the declarations you made in the previous files, called program files.
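To make the two-file arrangement concrete, here is roughly what the declarations look like in Ogre’s script format. The names (Planet_VS, planet.hlsl and so on) are made up for the example:

```
// planet.program — declares the entry points, targets and bindings
vertex_program Planet_VS hlsl
{
    source planet.hlsl
    entry_point mainVS
    target vs_2_0

    default_params
    {
        // wvp -> world view projection matrix
        param_named_auto wvp worldviewproj_matrix
    }
}

fragment_program Planet_PS hlsl
{
    source planet.hlsl
    entry_point mainPS
    target ps_2_0
}

// planet.material — points back at the declarations above
material Planet
{
    technique
    {
        pass
        {
            vertex_program_ref Planet_VS { }
            fragment_program_ref Planet_PS { }
        }
    }
}
```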

This process is laborious to say the least, so I’ve been trying to find ways to automate some of the steps. The biggest hurdle at first was editing the shader files to conform to Ogre’s format. If you’ve ever seen the contents of a shader file, you know what I mean when I say that it is not pretty. Second was writing the shader definition (.program) files, and third was including them in the materials. I wrote a short Python script that cleaned up the shader file for consumption by Ogre, but it seemed a cumbersome way to do things, parsing the code of the shader and then outputting it to a new file. After trying this with a few different shaders, I decided to trash the project and try something else.
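For flavour, the kind of mechanical rewrite such a cleanup pass has to do can be sketched like this. The two transforms shown are assumptions standing in for the real list, not the actual script:

```python
import re

def ogreify(src: str) -> str:
    """Illustrative cleanup pass: rewrite an exported shader for Ogre.

    Both transforms below are assumed examples of the kind of edit needed,
    not the real script's behaviour.
    """
    # Swap the operand order of mul() so matrix-times-vector reads the way
    # the target compilation expects (the reversal mentioned above).
    src = re.sub(r"mul\(\s*(\w+)\s*,\s*(\w+)\s*\)", r"mul(\2, \1)", src)
    # Drop technique blocks: Ogre never reads them; the .program file
    # carries the entry-point information instead.
    src = re.sub(r"technique\s+\w+\s*\{.*?\n\}", "", src, flags=re.DOTALL)
    return src
```

Even a pass this small shows why text-munging shader source felt cumbersome: every new node type ShaderFX emits can need another regex.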

I started looking into how ShaderFX is constructed. Without going into too much detail (proprietary software, etc.), I managed to build a half-assed export extension that exports Ogre-compatible shaders. Now I just need some support from Lumonix to extend the extension so that it generates the necessary material and program files. So far Lumonix have responded well to this project, so I hope they will be able to answer the questions I have raised with them.

Hopefully to be continued.

Written by Högni Gylfason

24.09.09 at 13:04

Client Server Physics Integrated

Well, I finally managed to integrate the hkpCharacterRigidBody into the client-server networking in a satisfactory manner. After weeks of trying to get it to work by sending the inputs across the network, I finally gave up and have it running separately. The client has its own physics handler integrated in its message loop, while the server runs completely independently in a separate code stack and thread. There is no communication between the two aside from the client interface being able to start and stop the server thread using a message pump through the scripting engine.
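The shape of that arrangement can be toy-sketched as follows. All the names are made up, and a message queue stands in for the scripting-engine message pump:

```python
import queue
import threading
import time

# Control channel standing in for the scripting engine's message pump:
# the client never calls into the server's physics directly.
control = queue.Queue()

def server_physics_loop():
    """Server-side physics stepping independently in its own thread."""
    while True:
        try:
            if control.get_nowait() == "stop":
                break
        except queue.Empty:
            pass
        # ... step the server-side physics world at a fixed rate ...
        time.sleep(1.0 / 60.0)

server = threading.Thread(target=server_physics_loop)
server.start()       # stands in for the client's "start server" message
control.put("stop")  # stands in for the "stop server" message
server.join()
```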

What I do is run a separate replica of each static geometry object on the client. When the server starts, it creates the world and then proceeds to transmit all the relevant static geometry data to the clients. It doesn’t transmit the pathfinding phantoms, nor does it transmit any physical properties for dynamic entities; only position and velocity updates are sent for those.

The client in turn takes input for the character controller the player owns and transmits the position and velocity updates to the server. The server then takes the position, verifies its validity and keyframes it into place. After watching the effects in the Havok debugger, it seems to work very well. I’m not sure how to affect the player with dynamic server-side objects, as the player character is a keyframed object and therefore has infinite mass, but I figure I can solve that later as long as the baseline is set. The memory footprint for the replicated objects doesn’t seem too significant, so I think this will do quite well.
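The validity check before keyframing can be toy-sketched like this. The speed limit and the distance test are assumptions of mine about what a plausibility check could look like, not what the engine actually does:

```python
# Assumed speed cap for the check; not a value from the engine.
MAX_SPEED = 10.0  # metres per second a character may legitimately cover

def accept_update(last_pos, new_pos, dt):
    """Accept the client's reported position only if it is reachable
    from the last accepted position within the elapsed time."""
    dx, dy, dz = (n - l for n, l in zip(new_pos, last_pos))
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    return dist <= MAX_SPEED * dt

# An accepted position would then be keyframed onto the server-side
# rigid body; a rejected one would snap the client back.
```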

After this long battle I’m looking forward to continuing to implement the remaining engine features.

Written by Högni Gylfason

19.07.09 at 14:30