Not only is the lighting improved, but Digital Foundry pointed out that new light points were also added to FINAL FANTASY VII REMAKE INTERGRADE. Can you elaborate as to why these were included?

Ikeda: Since we had more headroom in our resource budget, we were able to add more light assets in various locations. The purpose here was to add more visual information to some of the flat areas, which helps emphasize the hustle and bustle of Midgar.

FINAL FANTASY VII REMAKE INTERGRADE features upgraded screen-space reflections (SSR), which are really noticeable on water reflections. Can you talk about the work that went into this enhancement?

Ikeda: The SSR we used in FINAL FANTASY VII REMAKE INTERGRADE traces rays not just from glossy surfaces, but also from rough surfaces, which generally tend to be omitted. Doing so accounts for occlusion more accurately with respect to the scene geometry, so you can see things like the light of a foot lamp bouncing off a surface, or the hilt of Cloud’s sword reflected in its guard.
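The idea of tracing from rough surfaces can be sketched by perturbing the mirror reflection direction before the ray march. A production pass would importance-sample a microfacet distribution such as GGX; the simple cone jitter below only illustrates the principle, and all names are illustrative, not from the actual renderer:

```python
import math
import random

def reflect(v, n):
    # r = v - 2 (v·n) n: mirror the incoming direction v about unit normal n
    d = sum(a * b for a, b in zip(v, n))
    return tuple(a - 2.0 * d * b for a, b in zip(v, n))

def normalize(v):
    length = math.sqrt(sum(a * a for a in v))
    return tuple(a / length for a in v)

def rough_reflection(view, normal, roughness, rng):
    """Mirror reflection perturbed inside a cone that widens with
    roughness -- a simple stand-in for the microfacet importance
    sampling a real SSR pass would use for rough surfaces."""
    r = reflect(view, normal)
    if roughness <= 0.0:
        return r  # perfectly glossy: trace the exact mirror direction
    # Random offset scaled by roughness, then renormalized so the
    # jittered direction is still a valid unit trace direction.
    jitter = tuple(rng.uniform(-1.0, 1.0) * roughness for _ in range(3))
    return normalize(tuple(a + b for a, b in zip(r, jitter)))
```

With roughness 0 this degenerates to ordinary glossy SSR; as roughness grows, the traced rays spread out and the resulting reflections blur, at the cost of tracing rays that many implementations would simply skip.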

There were times when aliasing from ambient light passing through could be seen on the eyes and the nose, which was a bit unsightly, but that has been dramatically reduced. The normals obtained via normal mapping often point in directions that face away from the line of sight. When such a normal is used in SSR, the reflection direction points into the surface, so the trace always returns an error. This time, we also stored the normals of the polygon surface in the buffer, and used them to correct the error pixels when the reflections were created.
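The correction described above can be sketched as a fallback: if the reflection built from the normal-mapped shading normal points back into the surface, rebuild it from the stored polygon-surface (geometric) normal. This is a minimal sketch; function and argument names are illustrative:

```python
def reflect(v, n):
    # r = v - 2 (v·n) n: mirror the incoming direction v about unit normal n
    d = sum(a * b for a, b in zip(v, n))
    return tuple(a - 2.0 * d * b for a, b in zip(v, n))

def ssr_reflection_dir(view, shading_normal, geometric_normal):
    """Choose the SSR trace direction, correcting error pixels where the
    normal-mapped shading normal yields a reflection that points into
    the surface and would always fail to trace."""
    r = reflect(view, shading_normal)
    # A usable reflection must leave the surface: r · geometric_normal > 0.
    # Otherwise, fall back to the geometric normal stored in the buffer.
    if sum(a * b for a, b in zip(r, geometric_normal)) <= 0.0:
        r = reflect(view, geometric_normal)
    return r
```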

In typical SSR, the previous frame’s image is referenced, so whenever something like a camera cut happens, you can see a delay before the reflections catch up. For this title, whenever that happens, we prioritize rendering the still-incomplete current frame and use it as an alternative reference, which addresses the popping. Additionally, in a typical implementation, the referenced previous frame may already have fog rendered into it, so that fog ends up applied a second time when the frame is sampled for SSR.
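The history-selection behavior described above can be sketched as follows, assuming a per-frame flag that marks camera cuts; class and method names are illustrative, not the engine’s actual API:

```python
class SSRHistory:
    """Minimal sketch of choosing the color buffer SSR samples from.

    Normally SSR samples last frame's lit image; on a camera cut that
    history no longer matches the new view, so the (still incomplete)
    current frame is used instead of showing stale reflections."""

    def __init__(self):
        self.history = None  # previous frame's color buffer

    def source_for_ssr(self, current_frame, camera_cut):
        # Fall back to the current frame when the history is invalid.
        src = current_frame if (camera_cut or self.history is None) else self.history
        self.history = current_frame  # becomes next frame's history
        return src
```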

During development, the volumetric fog left a very strong impression in the image, and this error was exacerbated on water surfaces that reflected the distant landscape. So we stored two images, one with fog and one without, allowing the fog density to be reconstructed appropriately for the environment based on the distance reached by the reflection ray, thus achieving a much more natural reflection.
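The two-image approach can be sketched as sampling the fog-free color and then re-applying fog from the reflection ray’s travel distance, so fog is never applied twice. This sketch assumes a simple exponential fog model and an illustrative fog color; the names are not from the actual renderer:

```python
import math

def ssr_sample(color_no_fog, fog_density, ray_distance):
    """Reconstruct fog on a reflected sample from the fog-free image.

    Sampling an already-fogged frame would apply fog twice, so the
    fog-free color is stored separately and fog is re-applied using
    the distance the reflection ray travelled."""
    fog_color = (0.5, 0.6, 0.7)  # assumed environment fog color
    t = math.exp(-fog_density * ray_distance)  # transmittance along the ray
    # Blend toward the fog color as transmittance falls off with distance.
    return tuple(c * t + f * (1.0 - t) for c, f in zip(color_no_fog, fog_color))
```

A short reflection ray leaves the sample nearly untouched, while a ray that reaches the distant landscape is pushed toward the fog color, which matches how fog appears on directly visible geometry at the same distance.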

Source: Unreal Engine Blog