May 31, 2011 22:31 GMT

When I first began playing The Witcher 2, which we reviewed at length here on Softpedia, I was not playing on a very good machine. I was forced to bring the graphics quality slider back to Medium from the start, and even then the game struggled to keep the frame rate up during character-heavy sequences, like walking straight through Flotsam.

The Witcher 2 still looked very good and the art was still impressive, especially during cutscenes, and I never felt that the game would be improved very much by an upgrade.

I have to make it clear that I have never been too interested in video game graphics and derive much more enjoyment from good mechanics and a well-built world.

My old system had a Core 2 Duo processor running at 2.20 GHz on an Asus motherboard, backed up by 4 GB of RAM and an Nvidia GeForce 8800 GTX video card.

Here's how the game looked in a clip captured when I was running it on the old hardware:

But I had a look at my colleague's PC and saw what The Witcher 2 could show players with top-of-the-line hardware and Ultra settings, so I decided to go for an upgrade.

I am now running an Intel Core i5-2400 processor at 3.10 GHz with a GeForce GTX 560 / 1024 MB video card.

The difference is striking and you can see it in the video below:

I don't say this often, but The Witcher 2 is certainly the type of video game experience that can push a gamer to upgrade his system, which is a bit strange considering that we are talking about a role-playing game from a Polish studio, built on a new engine, and not about Crysis 2 or something like RAGE from id Software.