GPU Benchmark for Spider-Man Remastered

Spider-Man Remastered is a former console exclusive that was remastered for the PlayStation 5 and is now available on PC, bringing the original game with updated textures, performance improvements, and more.

It's sold on Steam and the Epic Games Store as a standalone title, and with it gamers get the best version of the game yet: unlocked frame rates, FSR 2.0 and DLSS upscaling, ray-traced reflections, and ultrawide monitor support.

The developers recommend a GeForce GTX 1060 6GB or Radeon RX 580 8GB for a good 1080p 60 fps experience, with an RTX 3070 or 6800 XT required for excellent ray tracing performance. We'll put those claims to the test, and much more, having dug out 43 current and previous generation GPUs for this testing.

Spider-Man Remastered has been benchmarked at 1080p, 1440p, and 4K using the medium and very high quality presets, as well as a ray tracing configuration based on the high quality preset. Our test scene was just as demanding as swinging between buildings, so it worked well for benchmarking.

Our test system pairs a Ryzen 7 5800X3D with 32GB of DDR4-3200 CL14 memory. We used the GeForce Game Ready 516.94 and Adrenalin Edition 22.8.1 drivers, both of which include performance improvements for this game.
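
Throughout this article we'll quote two numbers for each GPU: the average frame rate and the 1% lows. For anyone curious how those metrics are typically derived from a capture, here's a minimal sketch, assuming a PresentMon-style frame time CSV (the file name and the exact 1% low definition are assumptions, as capture tools differ slightly).

```python
# Minimal sketch: deriving this article's two metrics -- average fps and 1% lows --
# from a frame time log. Assumes a PresentMon-style CSV with an "MsBetweenPresents"
# column; the file name below is hypothetical.
import csv

def benchmark_metrics(csv_path: str) -> tuple[float, float]:
    with open(csv_path, newline="") as f:
        frame_times_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

    fps = [1000.0 / ms for ms in frame_times_ms]   # per-frame instantaneous fps
    avg_fps = sum(fps) / len(fps)

    # "1% low" here is the average fps of the slowest 1% of frames --
    # one common definition; tools vary slightly in how they compute this.
    slowest = sorted(fps)[: max(1, len(fps) // 100)]
    low_1pct = sum(slowest) / len(slowest)
    return avg_fps, low_1pct

if __name__ == "__main__":
    avg, low = benchmark_metrics("spiderman_pass1.csv")
    print(f"Average: {avg:.1f} fps, 1% low: {low:.1f} fps")
```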

Benchmarks

We'll start with the 1080p medium quality data and gradually work our way up. Intel isn't in a position to optimize for newly released titles just yet; in fact, they'd probably appreciate it if game developers stopped releasing games for the next five years.

The Arc A380 sits at the bottom with a barely playable 33 fps on average, and remember, we're using the medium quality preset here at 1080p. The GTX 1630 is almost as bad with just 47 fps on average, somehow making the RX 6400 and its 56 fps look good. The 6500 XT cracked the 60 fps barrier, but the 1% lows weren't great, so the true entry point is the GTX 1650 or the good old RX 5500 XT.

The Radeon 5500 XT is nearly 20% faster than the 6500 XT, though there isn't much more to say about it here. As usual, the GTX 1650 Super looks like one of the best budget graphics cards, embarrassing not only the much newer 6500 XT but also its predecessor, the vanilla 1650.

The GTX 1080 and Vega 56 make for an interesting comparison, with both delivering just over 100 fps.

The RTX 3050 was able to beat the RX 6600, which is a great result for the GeForce given it's usually slower than the Radeon despite costing more. It was also interesting to see the 5700 XT beat the GTX 1080 Ti, although it was 10% slower than the RTX 2060 Super, which is a disappointing result for the RDNA GPU.

The high-end RDNA2 parts can't show their full potential here, with the RX 6800, 6800 XT, 6900 XT, and 6950 XT all hitting a system bottleneck at just shy of 160 fps.

The RTX 3060 Ti was good for 171 frames per second, with many Turing and Ampere GPUs capable of delivering between 171 and 178 frames per second.

Jumping up to 1440p, the A380 and GTX 1630 drop to around 30 fps on average, which is unplayable territory in our opinion. The RX 580 and 4GB 5500 XT are the bare minimum here, which isn't bad given we're using the medium quality preset.

The 5700 XT was able to match the 2060 Super and 3060, which is a better result for the popular RDNA GPU.

RDNA2 was unable to hold its ground against Ampere. For example, the 6800 XT was 10% slower than the 10GB RTX 3080, despite the GeForce GPU hitting a CPU bottleneck at 172 fps. That also means the majority of mid-range to high-end Ampere GPUs were frame capped by the CPU.

If you want to play at 4K with medium quality settings and still get 60 frames per second, you will require a decent GPU. The Radeon RX 5700 XT achieved 64 frames per second, which matches the RTX 3060 perfectly.

The RTX 3070 Ti or 6800 XT will provide 100 frames per second. RDNA2 stacks up a bit better here, but the RTX 3090 Ti was still a whopping 27% faster than the 6950 XT.

Moving on to the very high quality preset, we'll start with the 1080p data. If you have an RX 6400 or RX 6500 XT, you probably won't get to see how Spider-Man Remastered looks maxed out... 3 and 6 fps on average. The GTX 1630 and Intel Arc A380 were both significantly better.

With the GTX 1060 6GB, GTX 1650 Super, and RX 5500 XT, the game becomes quite playable at around 60 fps, which is somewhat surprising.

The very high quality preset allows Vega 56 to pull away from the GTX 1080 by a 9% margin. The 5700 XT is up there with the RTX 3060, 2070, and 2060 Super.

The RDNA2 GPUs don't fare as well here. The 6800 XT, 6900 XT, and 6950 XT all hit a brick wall at around 140 fps, which leaves the older RTX 2080 Ti well ahead and puts them only about on par with Ampere's RTX 3060 Ti.

The RTX 3090 Ti and RTX 3090 are 24% faster than AMD's finest offerings.

Both the RX 6400 and 6500 XT were basically broken when we increased the resolution to 1440p, and the GTX 1630 and A380 weren't much better. The game didn't approach playable performance until we reached the GTX 1650 Super, and even then the result was unsatisfactory.

The GTX 1070, 1660 Ti, or 1660 Super are the minimum requirement for playable performance. The GTX 1080 was noticeably smoother with 57 fps on average, followed by Vega 56 at 60 fps. Then we have a group of seven GPUs spanning the RX 5700 to the RTX 2070, including several current-gen models such as the Radeon 6600 XT, 6650 XT, and GeForce RTX 3060.

The RTX 2070 Super, 6700 XT, and RTX 2080 take performance up another notch, with the 6750 XT and RTX 3060 Ti also competitive at this level.

The RTX 3080 10GB delivered 28% more frames than the 6800 XT at the high-end. For some reason Radeon GPUs appear to be CPU bound at just shy of 130 fps, whereas Nvidia GPUs were able to push up to 170 fps.

The very high quality 4K results aren't exactly surprising. The Pascal flagship GTX 1080 Ti was good for just 47 fps on average, while the RTX 2080 Ti broke the 60 fps barrier with ease.

The most affordable Radeon GPU to crack 60 fps was the RX 6800, which matched the RTX 3070 and 3070 Ti.

The GeForce RTX 3080 was 22% faster than the 6800 XT, while AMD's 6950 XT was able to match the original 10GB 3080. Again, where Radeon GPUs maxed out at around 90 fps, the flagship GeForce pressed on to deliver over 100 fps, hitting 112 fps to come out 24% faster.
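
Since we lean heavily on relative figures like "22% faster" and "24% faster", here's a quick sketch of how those percentages fall out of the raw averages, using this paragraph's figures (112 fps versus roughly 90 fps).

```python
# Quick check of the relative-performance math used throughout this article,
# using the figures above (112 fps vs. roughly 90 fps).
def percent_faster(a_fps: float, b_fps: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (a_fps / b_fps - 1.0) * 100.0

print(f"{percent_faster(112, 90):.0f}% faster")  # -> 24% faster (GeForce vs. Radeon)
# Note the asymmetry: the slower card is "slower" by a smaller percentage,
# because the baseline changes.
print(f"{(1 - 90 / 112) * 100:.0f}% slower")     # -> 20% slower (Radeon vs. GeForce)
```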

Ray Tracing Benchmarks

We also looked at ray tracing performance using the high quality preset, with all RT effects set to high (not very high) and the object range left at its default value of 6.

At 1080p, performance is reduced by around 30% for high-end GeForce GPUs and 25% for high-end Radeon GPUs when compared to the "very high" results we just examined.

Ignoring the high-end for a moment, we see that entry-level GPUs such as the Arc A380, RX 6400, and 6500 XT claim to support ray tracing (and technically do), but they can't deliver usable performance with it enabled.

The RTX 3050 was capable of 53 frames per second, which is definitely playable, and the game looked superb. For a 60 fps experience, the Radeon 6600, 6600 XT, or 6650 XT will be required, and it's interesting to see all three GPUs delivering basically identical performance, suggesting that the ray tracing portion of the pipeline is hampering RDNA2.

Even Turing (RTX 20) handles ray tracing better than RDNA2, and this allowed the RTX 2060 to beat the 6650 XT, even though the Radeon GPU was 11% faster using the very high quality preset with RT effects disabled.

With ray tracing enabled, the 2060 Super and RTX 3060 pumped out 92 fps, while the 6700 XT delivered an impressive 97 fps. Most high-end GeForce GPUs ran into a system bottleneck at 1080p, as the RTX 3070 was only 2 fps behind the RTX 3090 and 3090 Ti.

For those who want to see the ray traced effects at 1440p, you may be surprised to learn that an RTX 2060 Super, RTX 2070, or RTX 3060 can manage it, while AMD users will need the 6700 XT or 6750 XT, although both of those models were faster than the GeForce GPUs just mentioned.

The RTX 3060 Ti and RX 6800 both delivered solid frame rates, with the Radeon GPUs topping out at around 100 fps, while the RTX 3080 and faster GPUs all hit a system limit of 120 fps.

At 4K, you'll want an RTX 3080 Ti, RTX 3070, or RX 6800 for around 60 fps. The original RTX 3080 failed to match the 6950 XT, but at least here it was within a few frames.

With great power comes great responsibility.

After spending the better part of a week benchmarking Spider-Man Remastered across more than 1,000 passes, we now have a pretty good grasp of what it takes to run this game. The good news is that gamers who want 60 fps can get there on fairly modest hardware, especially if you're willing to dial down the quality settings a bit.

For example, a Radeon 6500 XT will just about do the job, which is more than can be said for the GTX 1630 or A380. The 6500 XT's 1% lows were still a bit sketchy though, so let's call the GTX 1650 the minimum requirement here, followed by the Radeon RX 580, 5500 XT, GTX 1080, or RTX 3050.

Spider-Man Remastered is one of the best examples of ray tracing effects so far. This is fantastic news, because although the performance drop remains significant, the requirements for 60 fps at 1080p aren't extreme: the 2060 Super, RTX 3060, or 6700 XT will get you there, and then some in the case of the 6700 XT.

Unless you play at 4K with the quality settings maxed out, you're almost always going to be limited by something other than your graphics card in this title. Even then, the RTX 3080 Ti was still good for over 80 fps, with 1% lows well above 60 fps.

However, it appears that CPU performance is the main limiting factor here. For all of our testing the Ryzen 7 5800X3D was ideal, though we often saw usage peak at around 70%, which is unusually high for a game. That said, the game ran perfectly smoothly, much smoother than many of the games we test, particularly those ported to PC.

For those of you pairing a slower CPU with a high-end GPU, there's a good chance you'll run into a CPU-imposed performance limit at resolutions below 4K, and without the appropriate level of testing we can't comment on how smoothly the game runs with lesser CPUs.

If you're lowering visual quality settings in the hope of boosting frame rates and it's not working, there's a good chance your CPU is to blame, and if that's the case you might as well raise the settings to max out your GPU, or at least get closer to maxing it out.
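
Here's a rough sketch of that diagnostic: benchmark the same scene at two quality presets and check whether the average frame rate actually moves. The function name, 5% threshold, and sample figures are hypothetical, purely to illustrate the heuristic.

```python
# A rough version of the diagnostic described above: benchmark the same scene at
# two quality presets and see whether the frame rate actually moves. The function
# name, 5% threshold, and sample figures are all hypothetical.
def likely_cpu_bound(fps_high_quality: float, fps_low_quality: float,
                     threshold: float = 0.05) -> bool:
    # If dropping settings buys you less than ~5% more fps, the GPU probably
    # wasn't the limiting factor in the first place.
    return (fps_low_quality / fps_high_quality - 1.0) < threshold

print(likely_cpu_bound(118, 121))  # True  -> settings barely matter, likely CPU-bound
print(likely_cpu_bound(62, 104))   # False -> big gain from lower settings, GPU-bound
```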

If time permits, we'd like to do a CPU benchmark for this game; we're interested to see how budget CPUs like the Core i3-12100 perform. When it comes to RAM usage, it appears you'll get by fairly well with 16GB, but 32GB is a nice luxury, especially for gamers who like to multitask.
