Baldur's Gate 3 PC – DF Tech Review – Graphics Analysis + Optimised Settings



A massive, runaway hit, Baldur’s Gate 3 is an exceptional PC release and a brilliant game. In this video review, Alex Battaglia discusses what appeals to him about the game, examines Larian’s proprietary engine technology and, of course, delivers optimised settings that maximise performance while giving up very little from the fully maxed-out experience.

To get access to the free-camera photo mode mod used in this video, visit Frans Bouma’s Patreon: https://www.patreon.com/Otis_Inf

Subscribe for more Digital Foundry: http://bit.ly/DFSubscribe

Join the DF Patreon to support the team more directly and to get access to everything we do via pristine quality downloads: https://bit.ly/3jEGjvx

Want some DF-branded tee-shirts, mugs, hoodies or pullovers? Check out our store: https://bit.ly/2BqRTt0

For commercial enquiries, please contact [email protected]

00:00:00 Introduction
00:01:07 Gameplay and Graphics
00:03:09 Should you use Vulkan or DX11?
00:09:16 User Experience and Menu Options
00:12:18 Alex’s Optimised Settings
00:19:28 Conclusion


24 thoughts on “Baldur's Gate 3 PC – DF Tech Review – Graphics Analysis + Optimised Settings”

  1. I feel like since the final months of the PS4 and Xbox One generation, pretty much all games are gorgeous now. We're looking at graphics that would've destroyed expectations less than 8 years ago and saying "eh, it's serviceable but nothing special", and I'm thinking "we literally live in the graphical future I could only dream about my entire life". But it's just OK now because there's no (way overrated) ray tracing, AI-powered deep-learning super resolution or real-time image reconstruction to turn an artistically gorgeous game into a photorealistic, real-life simulator with no real creative vision or artistic soul outside of "make graffix reel lul".

  2. This game ran well even on an extremely dated machine… and from a hard drive.
    Sandy Bridge 2500K + RX 580 😂 An SSD will reduce the odd visual issues caused by slow asset loading, but it's totally playable even from a hard disk.

  3. What the hell? BG3 on launch (actually, as of Hotfix #1 – I never played the original launch version) was infinitely more broken than Elden Ring (which I played on launch).

  4. Great video! Playing at 1440p Ultra on a 7900 XT here with a 5800X3D. Comparing with your findings: about 12 seconds from clicking PLAY to seeing the main screen with the sea port. DX11 is definitely better than Vulkan in this title – glass-smooth and rock-solid frametimes. As for TAA in my gameplay? At the exact same place by the river, the splashes are NOT ghosting at all. Not even a little. TAA looks crisp in the splashes, with no ghosting. Is that because I have FidelityFX Sharpening enabled in the options? Or because I'm at 1440p and not using FSR? FidelityFX is an AMD-only setting. Curious whether TAA looks different on AMD cards? It's been two generations since I last used Nvidia as a daily driver, TBH.

  5. Would have been interesting if you'd talked more about how much of a CPU hog this game is. My ageing CPU leaves my GPU half-asleep at times, while on the Steam Deck the framerate fluctuations are harsh depending on where you are. It's all down to how demanding the background simulation is on the CPU – graphics settings barely help at all.

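For anyone who wants to verify that kind of CPU limit themselves, here is a minimal sketch, not from the video, that logs CPU and GPU utilisation side by side while you play. It assumes an Nvidia GPU with nvidia-smi on the PATH and the psutil package installed; a GPU sitting well below full load while one CPU core is pegged is the classic CPU-limited signature.

```python
# Minimal sketch: log CPU vs GPU utilisation to spot a CPU-limited game.
# ASSUMPTIONS: an Nvidia GPU with nvidia-smi on PATH, and psutil installed
# (pip install psutil). AMD/Intel GPUs would need a different query tool.
import subprocess

import psutil


def gpu_utilisation() -> int:
    """Return current GPU load in percent, queried via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])


if __name__ == "__main__":
    while True:
        # cpu_percent(interval=1.0) blocks for one second and samples usage.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(f"GPU {gpu_utilisation():3d}% | "
              f"busiest core {max(per_core):5.1f}% | "
              f"avg CPU {sum(per_core) / len(per_core):5.1f}%")
```

Run it in a second window while playing: if the GPU column hovers around 50–60% while one core sits near 100%, lowering graphics settings will not raise the framerate, which matches the behaviour described above.
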
  6. I think the game is beautiful. Not over the top, but still very nice. The voice work is god-tier. The game is very immersive; I'm clamouring to see what happens next, and hoping my choices are good. It's rare that you expect a game to blow your mind and it exceeds those expectations. It's blown my whole fucking body. If ever a game deserved a perfect score, it's BG3. I haven't felt like this since FF7 on PlayStation. I am utterly enthralled.

  7. I played half the game on DX11; now I can't even make it to the menu using it, and had to swap to Vulkan to finish the game. I still can't play on DX11: it gets to 100% on the load screen, then freezes before the main menu, and I have to sign out or restart the computer. Pretty annoying. 12900K with a 3090 Ti.

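On that note, one workaround when one renderer hangs at the load screen is to start the renderer-specific executable directly, bypassing the launcher. Below is a minimal sketch; the install path and the executable names (bg3.exe for Vulkan, bg3_dx11.exe for DirectX 11) are assumptions based on the commonly reported Steam install layout, so verify both against your own system.

```python
# Minimal sketch: launch Baldur's Gate 3 with a specific renderer by running
# the renderer-specific executable directly, bypassing the launcher.
# ASSUMPTIONS: a default Steam install path, and a bin/ folder containing
# "bg3.exe" (Vulkan) and "bg3_dx11.exe" (DX11) - check these on your machine.
import subprocess
import sys
from pathlib import Path

GAME_BIN = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Baldurs Gate 3\bin")
RENDERERS = {"vulkan": "bg3.exe", "dx11": "bg3_dx11.exe"}


def launch(renderer: str) -> None:
    exe = GAME_BIN / RENDERERS[renderer]
    if not exe.exists():
        sys.exit(f"Not found: {exe} - adjust GAME_BIN for your install.")
    # Popen rather than run(): start the game and return immediately.
    subprocess.Popen([str(exe)], cwd=GAME_BIN)


if __name__ == "__main__":
    launch(sys.argv[1] if len(sys.argv) > 1 else "vulkan")
```

Usage would be `python launch_bg3.py dx11` or `python launch_bg3.py vulkan` (the script name is hypothetical). If DX11 consistently freezes at 100% as described above, sticking with the Vulkan binary is the practical fix.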
