Based on this, I think it’s a combination of factors. Besides the periodic runtime checks to verify your hardware fingerprint, it also does on-the-fly decryption of some values, and apparently does weird things with the stack. The decryption could have a big impact on memory consumption and performance, depending on the amount of data.
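To make the "periodic runtime checks" concrete, here is a toy sketch of what hardware-fingerprint verification could look like in principle. This is purely illustrative and not Denuvo's actual scheme: the identifier sources, hashing, and check function are all assumptions for the example.

```python
import hashlib
import platform
import uuid

def hardware_fingerprint() -> str:
    # Toy fingerprint built from a few machine identifiers; a real DRM
    # would sample far more sources (CPU features, disk serials, etc.)
    # and obfuscate how it combines them.
    raw = f"{platform.machine()}|{platform.processor()}|{uuid.getnode()}"
    return hashlib.sha256(raw.encode()).hexdigest()

# Captured once at activation time (assumption for this sketch).
EXPECTED = hardware_fingerprint()

def periodic_check() -> bool:
    # Called on a timer or woven into game code paths to confirm the
    # licence is still running on the machine it was activated on.
    return hardware_fingerprint() == EXPECTED
```

The cost comes from how often something like `periodic_check()` runs and how expensive the fingerprint gathering is, which is exactly the knob the integration controls.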
That post says the performance impact in Hogwarts Legacy is negligible (even if their technique for measuring is imperfect), but I suspect that Denuvo is configurable by the dev. Maybe Capcom raised the slider up to 11, but WB kept it lower?
Yeah, how bad Denuvo is performance-wise definitely depends on the specific integration. There have been titles where Denuvo just tanks the performance because it checks a ton of values per second, and others where it really doesn't matter. I'd say Hogwarts Legacy falls into the second category, where the checks have been intelligently tucked into loading sequences and scene transitions; Capcom's games usually fall into the first.
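The difference between the two categories above can be sketched with a toy benchmark: decrypting a protected value on every access (an aggressive integration) versus decrypting it once during a loading sequence and reading the plaintext afterwards. The XOR "cipher" and the constants are stand-ins, not anything Denuvo actually uses.

```python
import time

KEY = 0x5A  # toy key; real schemes are vastly stronger and obfuscated
ENCRYPTED = bytes(b ^ KEY for b in b"some protected constant")

def decrypt(blob: bytes) -> bytes:
    # Toy XOR transform standing in for the DRM's on-the-fly decryption.
    return bytes(b ^ KEY for b in blob)

def per_access(n: int) -> float:
    # Category 1: pay the decryption cost on every single read.
    t0 = time.perf_counter()
    for _ in range(n):
        _ = decrypt(ENCRYPTED)
    return time.perf_counter() - t0

def once_at_load(n: int) -> float:
    # Category 2: decrypt once (e.g. during a loading screen),
    # then read the cached plaintext on the hot path.
    t0 = time.perf_counter()
    plain = decrypt(ENCRYPTED)
    for _ in range(n):
        _ = plain
    return time.perf_counter() - t0

if __name__ == "__main__":
    n = 100_000
    print(f"per-access: {per_access(n):.4f}s, once-at-load: {once_at_load(n):.4f}s")
```

Even with this trivial cipher, the per-access variant does n full decryptions where the other does one, which is why the same protection can be invisible in one game and a frame-time problem in another.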