What I did today

Remember when I posted that I figured out bump mapping? Yeah, I was wrong. The sphere was normal-mapped correctly only when the light was at a certain position.
I looked up the problem on Google and struggled for three days to get the thing working. Basically I needed to convert the light & eye positions to texture space, then do the calculations there.
To do that I needed to calculate the TBN (tangent, binormal, normal) 3x3 matrix and multiply lightPos & eyePos by it.
I had a lot of problems along the way, like having some sort of “gradient” on the faces that were rendered.

I was trying to pull off some sick math to remove it, but the next day I deleted the code so I could start over.
I looked at the code for a bit, then spontaneously wrapped the tangent vector in a normalize() call. It worked. HALLELUJAH!
If anyone has problems with normal mapping in the future, PM me; I think I can point you in the right direction :slight_smile:
Here is the awesome article that made everything clear: http://http.developer.nvidia.com/CgTutorial/cg_tutorial_chapter08.html
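
For reference, here's roughly the shape of the fix in plain Java (a sketch with made-up names, not my actual code): normalize the basis vectors first, and then moving a vector into tangent space is just three dot products.

// Sketch: per-vertex tangent basis -> tangent (texture) space transform.
// tangent, bitangent, normal are the object-space basis vectors for the vertex;
// v is an object-space vector such as vertex-to-light or vertex-to-eye.
static float[] toTangentSpace(float[] tangent, float[] bitangent, float[] normal, float[] v) {
    float[] t = normalize(tangent);   // the missing normalize() that caused my gradient artifact
    float[] b = normalize(bitangent);
    float[] n = normalize(normal);
    // T, B, N form the rows of the TBN matrix, so multiplying by it is three dot products.
    return new float[] { dot(t, v), dot(b, v), dot(n, v) };
}

static float[] normalize(float[] v) {
    float len = (float) Math.sqrt(dot(v, v));
    return new float[] { v[0] / len, v[1] / len, v[2] / len };
}

static float dot(float[] a, float[] b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}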

Might want to normalize them before sending them to the shader; the fewer GPU instructions the better. :wink:

Unless the instructions are done in parallel, which is why shaders and GPUs exist.

You need to renormalize it in the fragment shader anyway.

The math seems a lot more intuitive if you transform the normal-map normal to world/view space and do the lighting there. It also makes the normal mapping code a lot less intrusive, and the cost doesn't depend on the number of lights. It's also the way to go with deferred rendering.
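
Something like this is what I mean, sketched in plain Java for clarity (in a real renderer it's a couple of lines in the fragment shader; names are illustrative):

// Push the sampled normal out into world space once, instead of dragging every light into tangent space.
// t, b, n are the normalized world-space tangent basis vectors for the fragment;
// mapNormal is the normal fetched from the normal map, remapped from [0,1] to [-1,1].
static float[] tangentToWorld(float[] t, float[] b, float[] n, float[] mapNormal) {
    // Here T, B, N are the columns of the TBN matrix, so the result is a weighted sum of the basis vectors.
    return new float[] {
        t[0] * mapNormal[0] + b[0] * mapNormal[1] + n[0] * mapNormal[2],
        t[1] * mapNormal[0] + b[1] * mapNormal[1] + n[1] * mapNormal[2],
        t[2] * mapNormal[0] + b[2] * mapNormal[1] + n[2] * mapNormal[2],
    };
}

After that single transform you can light the fragment with any number of lights in world space, which is also exactly what a deferred G-buffer wants.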

You had to normalize it because it was a directional vector. I think. Directional vectors’ length must always be one. Right? Yeah…?

Decided to mess with CLion. It’s a bit underdeveloped, but I love it so far. I even decided to make a little “fluid” physics thing with Box2D and SFML.

https://www.youtube.com/watch?v=_op8RZI9ozE

In other news, I realized that having an unreleased game in a public GitHub repo is a bad idea and switched to Bitbucket for its free private repos.

So, I was messing around in some of my old projects when I decided to open up my 3D world again. I played around a bit, and I realized how much better it would look with larger, less repetitive land masses. So I just increased the persistence and the number of octaves. However, that made each tile a bit too jagged for my liking, so I decided to implement a smoothing algorithm. I think at the moment it does more harm than good, but at least it’s trying. :wink: The next step would be to get cross-chunk smoothing working (which should be trivial) and do some tuning; but it’s late, I’m tired, and I have to be up early tomorrow… so I’m going to call it a night! Some screenshots for proof of my ramblings…

Before:

After:

Edit: It’s crazy that I haven’t worked on this for over six months!
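
For anyone curious, the kind of smoothing pass I mean is basically just averaging each tile's height with its neighbours. A rough sketch (made-up names, not my actual chunk code):

// One smoothing pass over a chunk's heightmap: each interior tile becomes the
// average of itself and its 4 neighbours. Edge tiles are left alone, which is
// exactly why cross-chunk smoothing is still on the to-do list.
static float[][] smooth(float[][] heights) {
    int w = heights.length, h = heights[0].length;
    float[][] out = new float[w][h];
    for (int x = 0; x < w; x++) {
        for (int y = 0; y < h; y++) {
            if (x == 0 || y == 0 || x == w - 1 || y == h - 1) {
                out[x][y] = heights[x][y];
            } else {
                out[x][y] = (heights[x][y]
                        + heights[x - 1][y] + heights[x + 1][y]
                        + heights[x][y - 1] + heights[x][y + 1]) / 5f;
            }
        }
    }
    return out;
}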

Took another crack at Battlefield and got an internal hack working in C++. Basically I wrote a program to inject a DLL I wrote into the memory address space of the game, so with the proper SDK offsets I could access things as if I were programming against the real source. Very fun. I can even call virtuals! I got engine debug rendering working, but then decided to simply hook into DirectX instead because it suited my needs.

Renewed my temporary permit so I can take my driving test in 10 days and get my driver’s license! I’ve also continued to learn more about the libraries and tools I need to know for my internship, including JUnit and Tomcat. Not the most interesting stuff, but also kind of fun to learn all the little support libraries that you can use.

I’ve also discovered I do NOT like front end web development. It’s just not fun to me at all!

Today I tried to compile Intel’s own Adaptive Order-Independent Transparency algorithm on an Intel graphics card and immediately regretted my decision when the entire driver crashed during shader linking. I think this is pretty much the ultimate proof of what kind of state Intel drivers are in.

What I Did Today Since RPC Launched:

Troubleshot many, many Intel HD Graphics systems. It always seems like if RPC has a problem initializing LWJGL/OpenGL, the user is on Intel Graphics. Completely hit and miss too, because I have 3 laptops and a Mac Mini all with different generations of Intel HD chipsets and they all work just fine. :confused:

No shit. Every time we release a new major build I have to go through my shaders and code to Intel-proof it. I don’t know what I expected this time to be honest. I don’t think I’ve ever managed to compile a GLSL 400+ shader at all on Intel in the first place. Intel + OpenGL = ButIPoopFromThereException.

Speaking of Intel, I just ran across this: http://www.joshbarczak.com/blog/?p=667

and this nice visual breakdown of rendering in Deus Ex: http://www.adriancourreges.com/blog/2015/03/10/deus-ex-human-revolution-graphics-study/

I used to have a bit of code in my games that went something like this:


if (vendor.indexOf("Intel") != -1) { return false; } // Fuck Intel

Cas :slight_smile:

Achieved a 100-day streak on GitHub.


http://i.imgur.com/4kOI9kO.png

:-* :slight_smile: :o

Wanted to play my game on my phone and thought it would be a fun challenge, so I’ve begun porting it to Android.

Finished a GLFW-based input manager to replace my old JInput one. Spent a lot of hours fiddling with the character class, and learned to hate Bullet’s triangle meshes all over again in the process. At least the code is a lot tidier now.
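
In case anyone wants to do something similar with LWJGL 3's GLFW bindings, one way to do it is a key callback feeding a key-state table; a simplified sketch (not my actual class, LWJGL 3 assumed):

import static org.lwjgl.glfw.GLFW.*;

// Minimal GLFW-backed keyboard state; illustrative only.
public final class Keyboard {
    private static final boolean[] keys = new boolean[GLFW_KEY_LAST + 1];

    // Call once after the GLFW window has been created.
    public static void install(long window) {
        glfwSetKeyCallback(window, (win, key, scancode, action, mods) -> {
            if (key >= 0 && key <= GLFW_KEY_LAST) {
                keys[key] = action != GLFW_RELEASE; // true while pressed or repeating
            }
        });
    }

    public static boolean isDown(int key) {
        return keys[key];
    }
}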

Finally learning to draw vector graphics: this is what I drew in 30 minutes in GIMP. I’ve started my journey to get away from blocky pixel art as much as I can.

It still needs to improve, and next up is learning to animate them using Blender. Yes, you read that right: Blender can also be used to create 2D animations.

Have you considered using Spine?
It’s not difficult to learn, and it gives you beautiful 2D animation. The only “problem” is that it is not free. However, I think it’s well worth it for being able to use it seamlessly with libgdx.
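
To give an idea, loading and playing a skeleton with the spine-libgdx runtime is only a handful of lines; roughly like this, from memory (file names and the "walk" animation are made up, and the exact API may differ between runtime versions):

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.g2d.PolygonSpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureAtlas;
import com.esotericsoftware.spine.*;

// Rough sketch of spine-libgdx usage.
public class SpineSketch {
    TextureAtlas atlas;
    Skeleton skeleton;
    AnimationState state;
    PolygonSpriteBatch batch = new PolygonSpriteBatch();
    SkeletonRenderer renderer = new SkeletonRenderer();

    void load() {
        atlas = new TextureAtlas(Gdx.files.internal("hero.atlas"));
        SkeletonJson json = new SkeletonJson(atlas);
        SkeletonData data = json.readSkeletonData(Gdx.files.internal("hero.json"));
        skeleton = new Skeleton(data);
        state = new AnimationState(new AnimationStateData(data));
        state.setAnimation(0, "walk", true); // track 0, looping
    }

    void render(float delta) {
        state.update(delta);
        state.apply(skeleton);            // pose the skeleton from the animation state
        skeleton.updateWorldTransform();
        batch.begin();
        renderer.draw(batch, skeleton);
        batch.end();
    }
}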