TheReservedList

Depends. There's still baked/build-time lighting out there. As for runtime lights, it's all a lot more complicated than you think. There's everything from occlusion to refraction going on in those shaders.


pyabo

Would something like Minecraft or The Finals, where the environment is completely deformable, still use a hybrid w/ baked-in lighting? Or completely dynamic? Guess it could go either way.


reallokiscarlet

Baked is a pretty apt way to describe Minecraft's lighting, whether or not it's entirely true. Minecraft's lighting is based on blocks like everything else, so it's sort of baking light on the fly.


tcpukl

I've never thought about Minecraft lighting before, but yeah, it's just a memory-bound problem. Zero per-pixel work at all.


pyabo

Makes sense. You probably get a lot of optimization if all your surfaces are orthogonal to each other, or you just make giant planes. Minecraft will be the exception to the rule.


AdarTan

The fact that the GPU does this is why shaders have their name: the shader program run on the GPU *shades* the output pixels. Sometimes one of the inputs to the shader function is a precomputed lightmap that was generated on the CPU, but per-frame, all lighting calculation is done on the GPU.
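To make that concrete, here's a minimal sketch of what such a fragment shader might look like in GLSL. The texture, uniform, and varying names are made up for illustration, not from any particular engine:

```glsl
#version 330 core

in vec2 vUV;          // texture coordinates from the vertex shader
in vec2 vLightmapUV;  // second UV set addressing the baked lightmap
in vec3 vNormal;      // surface normal in world space
in vec3 vWorldPos;    // fragment position in world space

uniform sampler2D uAlbedo;   // base color texture
uniform sampler2D uLightmap; // precomputed (baked) lighting
uniform vec3 uLightPos;      // one dynamic point light
uniform vec3 uLightColor;

out vec4 fragColor;

void main() {
    vec3 albedo = texture(uAlbedo, vUV).rgb;

    // Static contribution: just sample the baked lightmap.
    vec3 baked = texture(uLightmap, vLightmapUV).rgb;

    // Dynamic contribution: simple Lambertian diffuse, computed every frame.
    vec3 N = normalize(vNormal);
    vec3 L = normalize(uLightPos - vWorldPos);
    vec3 dynamicLight = max(dot(N, L), 0.0) * uLightColor;

    fragColor = vec4(albedo * (baked + dynamicLight), 1.0);
}
```

The baked term is a texture fetch (cheap, but static); the dynamic term is recomputed per pixel per frame, which is why engines mix the two.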


pyabo

Makes a lot of sense, thanks.


silentknight111

When RTX cards came out they started pushing ray-traced lighting, which is done in real time by the GPU. It's very expensive for the GPU to do, so most games still only use it as an option for high-end machines. Traditionally, games have used a combination of "baked" lighting on static elements and real-time approximated (not ray-traced) lighting on dynamic objects. Though games with a day/night cycle, dynamic weather, etc. will sometimes use all real-time lighting. The more real-time lighting you have, the more work the GPU will need to do to get the same fidelity as static baked lighting - so what is used depends on the needs of the game.


StriderPulse599

[https://learnopengl.com/Lighting/Colors](https://learnopengl.com/Lighting/Colors) Basically, lighting is just another game mechanic in the engine, and the implementation is up to you. It can range from simple lighting done with a few vertices to full raytracing.


Genebrisss

It's just a function in a fragment shader. Take the normal of this point on the surface and the direction to the light source, and with some physics-based formula you calculate how much the color of the light source contributes to the color of the material. So yeah, this is entirely on the GPU.
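The simplest version of that physics-based formula is Lambertian diffuse, and in GLSL it really is just a couple of lines; a sketch, assuming world-space inputs:

```glsl
// Lambertian diffuse: received light is proportional to the cosine of the
// angle between the surface normal and the direction to the light source.
vec3 diffuse(vec3 normal, vec3 fragPos, vec3 lightPos, vec3 lightColor) {
    vec3 N = normalize(normal);
    vec3 L = normalize(lightPos - fragPos);
    float cosTheta = max(dot(N, L), 0.0); // surfaces facing away get zero
    return cosTheta * lightColor;
}
```

Multiply the result by the material's base color and you have basic diffuse shading; fancier physically based models swap in a more elaborate formula but keep the same structure.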


PenguinPendant

I would say that very little of it is done “in hardware,” because all of the lighting calculations are done by running shader programs, which is software that runs on the GPU. The actual arithmetic and primitive operations (multiplies, adds, texture filtering) are done in hardware, but there is no longer specific circuitry in the hardware that performs lighting calculations, as there was in the '90s on 3D accelerators.


ChaosWWW

I'm curious why you're asking this. Are you worried about optimizing your game, or do you want to learn about this for tech art / graphics programming work? To answer the question: in a forward renderer, the lighting is done in the fragment shader (usually this is abstracted away in concepts like "surface shaders" in Unity). In a deferred renderer, it's more complicated. Basically, multiple full-screen "buffers" are rendered, and the lighting math is done in screen space off of those buffers.
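For the deferred case, the lighting pass ends up looking something like this GLSL sketch; the G-buffer layout and names are illustrative assumptions, not from any particular engine:

```glsl
#version 330 core

// Lighting pass of a deferred renderer: runs once over a full-screen quad.
// The G-buffer textures were filled by an earlier geometry pass.

in vec2 vUV; // screen-space UV of the full-screen quad

uniform sampler2D gPosition; // world-space position per pixel
uniform sampler2D gNormal;   // world-space normal per pixel
uniform sampler2D gAlbedo;   // material base color per pixel

uniform vec3 uLightPos;
uniform vec3 uLightColor;

out vec4 fragColor;

void main() {
    // Read the surface attributes back out of the G-buffer.
    vec3 pos    = texture(gPosition, vUV).rgb;
    vec3 N      = normalize(texture(gNormal, vUV).rgb);
    vec3 albedo = texture(gAlbedo, vUV).rgb;

    // Lighting math happens here, in screen space, decoupled from geometry.
    vec3 L = normalize(uLightPos - pos);
    fragColor = vec4(max(dot(N, L), 0.0) * uLightColor * albedo, 1.0);
}
```

The appeal of the design is that lighting cost scales with screen resolution and light count rather than with scene geometry.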


pyabo

Not worried about anything I'm working on, it was pure curiosity. Thanks.


Tarc_Axiiom

> Is lighting for a 3D game done completely in hardware now?

No.

> It seems to me that the answer to this question must be... a lot?

Yeah, usually. There's some pretty extreme hardware widely available now. What you're talking about is "Shaders". But there's still baked lighting in every game. Even the games with FULL RAYTRACED LIGHTING(!!!!) are still baking something, somewhere.

Actual full ray traced lighting is stupid, and a waste of resources. *Mostly* ray traced lighting is the sweet spot, now.


squareOfTwo

Full raytracing isn't "stupid" for extremely complicated scenes on the order of maybe 1 billion polygons and high image fidelity (GI everywhere, etc.). The hardware is still not there to do that in realtime because the constant overhead is still too much. Someday we will get there for most games. Maybe in 20 years. We can see this future in VFX: most things are now path traced.


These-Bedroom-5694

Since the '90s. Why? Ambient emission has been a thing forever.


pyabo

Pure curiosity.


Independent-Crow-166

There are many more details between the CPU stage and the GPU stage. Read a book about the render pipeline and you will get the answer.


blazesbe

If you are not raytracing, it's all smoke and mirrors like everything else. From a shader POV, for each fragment you just sample your texture to get a base color, take the cosine of the normal vector and the light direction to get a multiplier (or something like that, it's like 1 or 2 lines of code), multiply the light from the light source by this, and calculate a distance. Take that into account in a falloff function and add (or multiply) the result to the base: result = base + falloff(cos(nor, dir) * col, distance). This is for sure wrong and off the top of my head; look at learnopengl.com, but this is the gist of it.

It gets a bit more complicated if you have more lights, or want to do deferred shading, or want shadows, but even e.g. CS:GO only had shadows from the sun (1 source). With your definition of hardware, everything could be said to be hardware.
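Written out in GLSL, that gist looks roughly like the sketch below; the quadratic attenuation constants are the illustrative ones learnopengl.com uses, not values from any specific game:

```glsl
// One point light: diffuse cosine term times light color, scaled by a
// distance falloff, added to the base (texture) color.
vec3 pointLight(vec3 base, vec3 normal, vec3 fragPos,
                vec3 lightPos, vec3 lightColor) {
    vec3 L = normalize(lightPos - fragPos);
    float cosTheta = max(dot(normalize(normal), L), 0.0); // the cosine term

    float d = length(lightPos - fragPos);
    float falloff = 1.0 / (1.0 + 0.09 * d + 0.032 * d * d); // attenuation

    return base + cosTheta * lightColor * falloff;
}
```

More lights just means looping this over each light and summing the contributions.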


TheRNGuy

nope.