How well do you understand the programing behind video games?

Started by Legend, Jul 22, 2017, 11:18 PM



Legend

Eh thought this could be a more open and fun version of the Unity thread.


I'll start by saying I don't fully understand deferred rendering. I understand the general concept but I don't get how the math actually works out for calculating lighting.

VizionEck uses forward rendering.
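The general concept, sketched in Python for illustration (the G-buffer layout, names, and the simple Lambert-only lighting here are all made up for the sketch, not any engine's actual pipeline): the geometry pass stores albedo, normal, and position per pixel, and then each light is accumulated per pixel from only that stored data.

```python
# Rough sketch of the deferred lighting pass: for one G-buffer pixel,
# loop over the lights and accumulate a simple Lambertian term.
# Illustrative only -- real engines work in shaders with attenuation
# curves, shadows, specular terms, etc.

def shade_pixel(albedo, normal, position, lights):
    """Accumulate diffuse lighting for one pixel from its G-buffer data."""
    r = g = b = 0.0
    for light in lights:
        # Vector from the surface point to the light.
        lx = light["pos"][0] - position[0]
        ly = light["pos"][1] - position[1]
        lz = light["pos"][2] - position[2]
        dist = (lx * lx + ly * ly + lz * lz) ** 0.5
        lx, ly, lz = lx / dist, ly / dist, lz / dist
        # Lambert term: how directly the surface faces the light.
        ndotl = max(0.0, normal[0] * lx + normal[1] * ly + normal[2] * lz)
        # Simple inverse-square falloff.
        atten = light["intensity"] / (dist * dist)
        r += albedo[0] * light["color"][0] * ndotl * atten
        g += albedo[1] * light["color"][1] * ndotl * atten
        b += albedo[2] * light["color"][2] * ndotl * atten
    return (r, g, b)
```

The point of deferring is that this loop runs once per screen pixel per light, instead of once per rasterized fragment per light, so overdrawn geometry doesn't multiply the lighting cost.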

darkknightkryta

After programming an 8x8 LED matrix, I have a better understanding of how GPUs draw.

the-pi-guy

Very little actual stuff.

But I know how programming works and stuff. From that I can usually pick up other concepts. Also I have ideas on how stuff could work.
But I have yet to really put in the time to learn it. I have a few great books though.

I guess I know AI and stuff, so there's that. :D

Legend

The other day I was doing some multithreading and couldn't use Unity's function for getting a random point inside a sphere, so I made my own.


It picks a random point inside a cube and if it's not within the sphere, it tries again.   8)


Unity's function must work the same way since I was getting identical performance. Calculating it with proper math took ~4 times longer on average.
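Both approaches are easy to sketch in Python (illustration only; no idea what Unity actually does internally, and its real version is C#):

```python
import math
import random

def random_in_sphere_rejection(radius=1.0):
    """Pick a random point in the bounding cube; retry until it lands
    inside the sphere. Succeeds ~52% of the time per try (sphere volume
    over cube volume = pi/6), so ~1.9 tries on average."""
    while True:
        x = random.uniform(-radius, radius)
        y = random.uniform(-radius, radius)
        z = random.uniform(-radius, radius)
        if x * x + y * y + z * z <= radius * radius:
            return (x, y, z)

def random_in_sphere_direct(radius=1.0):
    """The 'proper math' version: a uniform random direction (normalized
    Gaussians) scaled by a cube-root-distributed radius, which gives
    uniform density over the volume."""
    while True:
        x, y, z = random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)
        norm = math.sqrt(x * x + y * y + z * z)
        if norm > 0.0:  # guard against the (vanishingly rare) zero vector
            break
    r = radius * random.random() ** (1.0 / 3.0)
    return (x * r / norm, y * r / norm, z * r / norm)
```

The direct version costs three Gaussians, a square root, and a cube root every call, which is plausibly why the cheap retry loop wins on average.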

After programming an 8x8 LED matrix, I have a better understanding of how GPUs draw.
I was kinda slow to grasp that GPUs render the full scene that's being looked at, even if there's an object blocking the view.

Xevross

I have literally no idea about any of it

Legend

I fell asleep on the couch for 2 hours and had a nightmare that I was sleeping instead of working.

Now that's game development.

darkknightkryta

The other day I was doing some multithreading and couldn't use Unity's function for getting a random point inside a sphere, so I made my own.


It picks a random point inside a cube and if it's not within the sphere, it tries again.   8)


Unity's function must work the same way since I was getting identical performance. Calculating it with proper math took ~4 times longer on average.
I was kinda slow to grasp that GPUs render the full scene that's being looked at, even if there's an object blocking the view.
Well they don't draw what's off screen; the CPU just keeps track of it, and there's polygon culling.  But I mean the way a pixel is lit, how the GPU has to send electronic data to the TV, and how the TV has to interpret that data and run electricity through the right wires (embedded).  Go look at the pinout of a matrix display: that's only 8x8.  Now imagine a larger framebuffer.  Though thinking back to how 8-bit graphics were drawn, that's a bit closer to what I'm thinking.  I had my students program Pong on a matrix display.  I think every graphics programmer should have to make Pong on a matrix display to appreciate what graphics cards do.
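The row-scan idea in a toy Python sketch (the framebuffer contents and the pin comments are just made up for illustration; a real matrix is driven from a microcontroller loop):

```python
# Toy row-scan of an 8x8 framebuffer, the way a multiplexed LED matrix
# is driven: enable one row at a time and set the column lines from
# that row's bits, fast enough that persistence of vision fills in
# the full image.

FRAMEBUFFER = [
    0b00111100,  # each int is one row, one bit per column
    0b01000010,
    0b10100101,
    0b10000001,
    0b10100101,
    0b10011001,
    0b01000010,
    0b00111100,
]

def scan_frame(framebuffer):
    """Return the (row, col) pairs that would be lit, in scan order."""
    lit = []
    for row, bits in enumerate(framebuffer):
        # In hardware: drive this row's pin, then set the 8 column pins.
        for col in range(8):
            if bits & (1 << (7 - col)):
                lit.append((row, col))
        # ...then release the row and move to the next one.
    return lit
```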

Legend

Well they don't draw what's off screen; the CPU just keeps track of it, and there's polygon culling.  But I mean the way a pixel is lit, how the GPU has to send electronic data to the TV, and how the TV has to interpret that data and run electricity through the right wires (embedded).  Go look at the pinout of a matrix display: that's only 8x8.  Now imagine a larger framebuffer.  Though thinking back to how 8-bit graphics were drawn, that's a bit closer to what I'm thinking.  I had my students program Pong on a matrix display.  I think every graphics programmer should have to make Pong on a matrix display to appreciate what graphics cards do.
Yeah, backface culling, frustum culling, and occlusion culling decrease the amount of overdraw, but it's always there to some extent.

That's cool that you were doing something so low level as that.
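The backface part of that is just a dot product per triangle; a minimal Python sketch (illustrative, not engine code):

```python
def is_backfacing(normal, view_dir):
    """Backface culling test: a triangle whose normal points the same
    way as the view ray (from camera toward the surface) faces away
    from the camera and can be skipped before rasterization."""
    dot = sum(n * v for n, v in zip(normal, view_dir))
    return dot >= 0.0
```

GPUs do the equivalent test in screen space from the triangle's winding order, but the geometric idea is the same.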



Oh, another graphics thing: I love dithering for transparency. I used to hate it, but at 1080p and 4K it can work great. Normally transparent objects are a huge pain for games to render, and they're done in a separate pass. Dithering renders them as solid but only draws a portion of their pixels. With a bit of blur and a high enough resolution, it can look almost as good.

Almost every recent LOD system uses this because it's so cheap.
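The screen-door trick in a minimal Python sketch (the 4x4 Bayer matrix is the standard one; everything else here is made up for illustration):

```python
# Dithered ("screen-door") transparency: instead of alpha blending,
# each pixel of a transparent object is either fully drawn or fully
# skipped, depending on its alpha vs a repeating threshold pattern.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def draw_pixel(x, y, alpha):
    """True if this pixel of the dithered object should be drawn."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return alpha > threshold

def coverage(alpha, width=64, height=64):
    """Fraction of pixels drawn over a region: approximates alpha."""
    drawn = sum(draw_pixel(x, y, alpha)
                for y in range(height) for x in range(width))
    return drawn / (width * height)
```

Because every drawn pixel is opaque, the object goes through the normal depth-tested opaque path, which is why LOD cross-fades can use it so cheaply.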

darkknightkryta

Yeah, backface culling, frustum culling, and occlusion culling decrease the amount of overdraw, but it's always there to some extent.

That's cool that you were doing something so low level as that.



Oh, another graphics thing: I love dithering for transparency. I used to hate it, but at 1080p and 4K it can work great. Normally transparent objects are a huge pain for games to render, and they're done in a separate pass. Dithering renders them as solid but only draws a portion of their pixels. With a bit of blur and a high enough resolution, it can look almost as good.

Almost every recent LOD system uses this because it's so cheap.
Yeah the PSX had good dithering.  It can save some space with textures too, but oh well.

Legend

Jul 23, 2017, 07:08 PM Last Edit: Jul 23, 2017, 11:37 PM by darkknightkryta
Yeah the PSX had good dithering.  It can save some space with textures too, but oh well.
Dithering for color is cool too. A dev on GAF is building a 3D renderer around it.
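The same threshold trick works for color depth; a minimal Python sketch (illustrative only) that pushes an 8-bit channel down to 1 bit, so flat grays become checker patterns instead of snapping to solid black or white:

```python
# Ordered dithering for color quantization with a 2x2 Bayer matrix.

BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither_channel(value, x, y):
    """Quantize one 0-255 channel value to 0 or 255 at pixel (x, y),
    using a position-dependent threshold instead of a fixed 128."""
    threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4.0 * 255.0
    return 255 if value > threshold else 0
```

A mid-gray input lights up two of every four pixels, so at a distance (or after the PSX's composite-video blur) it still reads as gray.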




darkknightkryta

Dithering for color is cool too. A dev on GAF is building a 3D renderer around it.




Yeah, it always boggled my mind how Capcom screwed up their 90s PSX ports so badly when they could have saved so much space with dithering.

the-pi-guy

Jul 24, 2017, 12:09 AM Last Edit: Jul 24, 2017, 03:58 AM by darkknightkryta
I like how DKK edited your post originally.    ;D

I have literally no idea about any of it
"The only thing I know about game programming is that it's spelled wrong in the title".


I have mentioned this a few times before.   ::)
But might as well mention it in this topic. 

I have a couple of textbooks that go into graphics. 
Another textbook specifically goes into game programming.  It's a huge book, and I'm not exactly sure what it covers in detail, but it's incredibly massive and I'm sure it goes into an amazing number of things.

I also mentioned I have a game engine textbook. That one also goes into a huge amount of things. 
Both books seem to cover slightly different things, but they are two of the largest books I have, so they should each cover an insane amount of material. 

I'm planning to wait until I'm more confident in Japanese first though. 

With that said, I am also going to take a class on graphics in about 2 years. A short description:
Quote
An introduction to the mathematics, data structures, and algorithms used to create both 2D and
3D graphical output. 2D topics include viewing transformation, clipping, scan conversion,
geometric transformations, hierarchical modeling and animation. 3D topics include projections,
viewing systems, back face culling, polygon clipping, wireframe images, visible surface
algorithms, Phong reflection model, Gouraud and Phong shading techniques, color dithering,
color quantization, ray tracing and Bezier patches.
I'm pretty excited, even though it is a ways away.

Touch screens suck.  P.S. I'm doing this one on purpose :P


the-pi-guy

Eh thought this could be a more open and fun version of the Unity thread.


I'll start by saying I don't fully understand deferred rendering. I understand the general concept but I don't get how the math actually works out for calculating lighting.

VizionEck uses forward rendering.
I'm guessing you have looked at this?  

Deferred Rendering in Killzone 2

Legend

I'm guessing you have looked at this?  

Deferred Rendering in Killzone 2
No I haven't seen that.

Was fun seeing most of those slides and understanding them  8)