The Budget Gaming PC thread (Page 114)

  • THFourteen 26 Jul 2013 08:09:28 47,937 posts
    Seen 2 hours ago
    Registered 13 years ago
    You don't actually need a PCIe 3.0 slot to run a PCIe 3.0 card, do you?
  • Deleted user 26 July 2013 08:11:58
    You don't NEED one, no, but from reading up, Nvidia cards do lose a few per cent when on PCIe 2.0. AMD cards, on the other hand, aren't affected as much.
  • Rodpad 26 Jul 2013 08:24:12 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Why are you getting an i7? Hyperthreading gives no extra performance in games. You are wasting £100. An i5-3570K would have given you identical performance.

    Plus you are only losing maybe 2fps by using a 16x pcie 2.0 slot as mentioned above.

    All in all you've just spent £200 for around a 5fps boost at best over sticking with your existing Sandy Bridge setup.
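For a rough sense of why the PCIe penalty is so small, compare the raw per-direction bandwidth of the two slot generations. These are back-of-envelope numbers from the published line rates and encodings, not benchmarks; real games rarely saturate even the slower figure.

```python
# Back-of-envelope PCIe bandwidth per direction for an x16 slot.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (80% efficiency).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~98.5% efficiency).

def lane_gbps(transfers_gt, payload_bits, total_bits):
    """Usable gigabytes/sec per lane after line-code overhead."""
    return transfers_gt * payload_bits / total_bits / 8

pcie2_x16 = 16 * lane_gbps(5, 8, 10)      # 8.0 GB/s
pcie3_x16 = 16 * lane_gbps(8, 128, 130)   # ~15.75 GB/s
print(pcie2_x16, round(pcie3_x16, 2))     # 8.0 15.75
```

Nearly double the headroom on paper, yet only a couple of fps in practice, because the card isn't streaming over the bus every frame.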

  • Deleted user 26 July 2013 08:36:25
    I totally agree with you Roddles, for today's games it is a bit overboard, but hey, at least it's not a Titan :)

    I don't tend to do things by halves, and I figured the 3770K is the best CPU my motherboard will ever take, so I may as well get the best, since Haswell and all future CPUs would need a new motherboard. When the next gen rolls around I wouldn't consider a new mobo anyway, as I already have everything I want on it: USB 3, SATA 3 etc. I realise I won't see much performance change, as I'm coming from 570 SLI as it is, but personally it irritates me if I'm not getting the best I can from my hardware, so having a 770 that isn't used to its fullest potential would annoy me.

    As for 2GB vs 4GB, well, that's the one I'm not sure on, but DF says the more VRAM the better, so what the hell.
  • Phattso 26 Jul 2013 08:40:55 22,915 posts
    Seen 59 minutes ago
    Registered 14 years ago
    With the next gen consoles being 8-core, I don't think it's entirely insane to think that the eight logical cores of the i7 are going to be more future proof than an i5 for next-gen ports.

    For sure the Intel parts will be clocked at more than twice the frequency of the next-gen console parts, and they're better cores to start with, but factoring in closed-system optimisations on the consoles, the extra cores may be warranted.

    I'm having the same tossup myself at the moment, and haven't settled on one or the other.

    What's narking me is that I'm probably going with a Haswell setup, but the gains over Ivy Bridge are negligible. Pisser.
  • Rodpad 26 Jul 2013 08:52:24 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Personally, I think that by the time games take advantage of 8 cores, the current crop of CPUs won't cut the mustard anyway.

    As I mentioned in another post, a 4GB card is pointless, since only something like Metro 2033 in super-duper ultra mode at 4K resolution will push you past 2GB of GPU memory usage, and you'll be running that at about 10fps.

    Neither is future-proofing; both will be out of date by the time games require it.
  • Phattso 26 Jul 2013 09:10:37 22,915 posts
    Seen 59 minutes ago
    Registered 14 years ago
    That's supposition though - neither you nor I know how it's going to pan out. So yeah, it is down to personal opinion, as you say. Sharzam (and others like me) are betting the other way.

    Just because games today don't make use of more than 2GB of VRAM doesn't mean that will continue to be the case. Design patterns will emerge that game developers embrace as a result of the next-gen consoles, I'm sure. Techniques and tactics that rely on that much VRAM are very likely to appear, and if you don't have that much RAM you're going to suffer a performance penalty and shonky port workarounds.

    I'm fairly sure that an i7 Haswell clocked at 4.2GHz is probably going to be fairly future proofed. I'd be stunned if we saw a sea change in PC game tech vs. the next gen consoles. You know the drill - for ports it'll be the same tech possibly with some extra bells and whistles. The odd stalwart PC exclusive might buck the trend, but that's about it.
  • Deleted user 26 July 2013 10:02:26
    When I got my first 570, people said 1GB was plenty, and then games like BF3 came out and showed 1GB was not enough. If I had bought the 2.5GB version I would have been safer in today's games, although I don't struggle too much thanks to SLI.

    Hence choosing 4GB over 2GB: it's only a bit more, so it's worth the help over the next two years. I don't think anyone is right or wrong at the moment, simply because we don't know yet. I simply wanted to go back to a single card due to heat and noise, and the 770 seemed like a sensible option. All the other things spiralled from that, to ensure I didn't downgrade performance.

    Anyway, this is now getting a bit away from 'budget hardware' talk. I only posted here to mention that if someone wanted a 570 or an i5, I might be able to do a sensible price.

    Edited by Sharzam at 10:09:17 26-07-2013
  • Deleted user 26 July 2013 10:26:59
    There are worse ways to spend the extra £100, but I can't think of any right now.
  • Phattso 26 Jul 2013 10:30:33 22,915 posts
    Seen 59 minutes ago
    Registered 14 years ago
    I guess there are two ways: buy the cheapest Haswell i5 you can get and a moderate graphics card that takes care of today's high end games. Then in 18 months bin them both and step up to whatever is needed to meet the needs of more modern engines.

    Alternatively just stump up the cash for best of breed right now.

    If Sharzam were talking about some piece of shit graphics card then maybe I'd agree he should put another £100 there rather than into the CPU. But the 770 is pretty good, so he's covered there I reckons.
  • Rodpad 26 Jul 2013 10:40:21 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Indeed the 770 is the best high end card around. The difference in price between the 770 and 780 doesn't justify the tiny performance increase.
  • b0rk 26 Jul 2013 10:52:27 7,654 posts
    Seen 7 minutes ago
    Registered 12 years ago
    The performance increase, or at least the gap in technical specification, between the 770 and 780 is quite significant, and the disparity is much more pronounced than from the 670 to the 680. Having said that, the 770 is absolutely the best bang-for-buck card around at the moment, with only its 2GB of VRAM being a potential problem if you're trying to future-proof against the PS4 (the X1 is technically irrelevant).
  • crashVoodoo 26 Jul 2013 11:35:41 5,737 posts
    Seen 2 hours ago
    Registered 16 years ago
    Core i5-3570K 3.40GHz reduced at Aria
    £134.99 ...
  • Deleted user 26 July 2013 11:46:44
    I don't see any value in future-proofing. Much better (and much cheaper) to wait for the future to arrive and upgrade when you have to.
  • Deleted user 26 July 2013 11:52:01
    crashVoodoo wrote:
    Core i5-3570K 3.40GHz reduced at Aria
    £134.99 ...
    Basket Total £134.99
    Approx. Delivery £8.95
    VAT £28.79
    Total £172.73
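Those basket numbers check out, assuming the then-standard UK VAT rate of 20% applied to the CPU price plus delivery. A quick sanity check (the breakdown below is an assumption about how the retailer calculates it, not from their site):

```python
# Sanity check of the quoted Aria basket. Assumes UK VAT at 20%
# is charged on the ex-VAT CPU price plus delivery.
cpu_ex_vat = 134.99
delivery = 8.95
subtotal = cpu_ex_vat + delivery          # 143.94 ex VAT
vat = round(subtotal * 0.20, 2)           # 28.79
total = round(subtotal + vat, 2)          # 172.73
print(f"VAT {vat:.2f}, total {total:.2f}")
```

So nearly £38 of that "£134.99" bargain is delivery and tax.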
  • b0rk 26 Jul 2013 11:52:26 7,654 posts
    Seen 7 minutes ago
    Registered 12 years ago
    Agreed, but for anyone wanting to invest in a GPU now that will last a few years before it becomes a problem playing the latest console port, it's worth getting something that at least isn't inferior to, or merely on par with, what's inside the PS4.
  • Rodpad 26 Jul 2013 11:54:44 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Bremenacht wrote:
    I don't see any value in future-proofing. Much better (and much cheaper) to wait for the future to arrive and upgrade when you have to.
    So much this.

    Buying into those early DX10 and DX11 cards really worked out. Hyperthreading is a fad for gaming. We're only just now seeing quad core results.

    Current mid-range will always trump the mid-range from two years ago. You really can't future-proof.

    Edited by Roddles at 11:55:23 26-07-2013
  • Rodpad 26 Jul 2013 11:59:53 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Once again, 4GB is pointless at 1080p and will be for the duration of the card's practical lifespan.
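Some illustrative arithmetic behind that claim: the render targets themselves are tiny at 1080p, so it's texture assets, not screen resolution, that would push a card past 2GB. The target counts below are made-up examples, not figures from any specific game:

```python
# Illustrative only: size of one 32-bit RGBA render target at a given
# resolution. The point is that framebuffers are small; texture assets
# are what actually fill a 2GB vs 4GB card.

def target_mb(width, height, bytes_per_pixel=4):
    """Size of one render target in mebibytes."""
    return width * height * bytes_per_pixel / 1024 ** 2

fb_1080p = target_mb(1920, 1080)   # ~7.9 MB
fb_4k = target_mb(3840, 2160)      # ~31.6 MB

# Even a hypothetical deferred renderer with five full-screen targets,
# double-buffered, stays under 100 MB at 1080p:
print(round(fb_1080p, 1), round(fb_4k, 1), round(fb_1080p * 5 * 2, 1))
```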
  • Rodpad 26 Jul 2013 12:07:22 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Got a source? All I can find is a quote which says the opposite (this refers to total system memory, which includes video memory):

    The demo of Killzone: Shadow Fall shown during PlayStation 4's announcement only used 4GB of memory, not the full 8GB GDDR5 RAM available to developers in the final hardware. Guerrilla's decision to stick with 4GB suggests that the developer may not have been aware of Sony's decision to include 8GB in final retail hardware, something that CEO Stewart Gilray said had been kept secret from third-party developers until the console's announcement.
  • Rodpad 26 Jul 2013 12:09:54 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Do you mean this?

    That's allocated memory, not memory actually in use. The clue is in the suspiciously round 3GB figure.
  • Rodpad 26 Jul 2013 12:15:42 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    mazty wrote:
    @Roddles You can't future-proof if you don't have a bottomless wallet, but you can be smart about what you buy. For example, the HD 7950 is expected to be on par with the PS4, but the GTX 760 OC'd (with good cooling) will cost the same and kick out much more power.
    Likewise, don't bother with Haswell, stick with Z77 over Z87, don't bother with RAM faster than 1600MHz etc.
    I agree with all of that. I don't agree with spending more for 1600MHz memory with tight timings, or for huge pools of memory on GPUs that won't get used, for exactly the reason in your point.

    Edited by Roddles at 12:15:57 26-07-2013
  • Rodpad 26 Jul 2013 12:21:13 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    Given that most games on PC don't use more than 2GB of system memory, there's no reason for them not to reserve 3GB for video memory.

    Right now it is futile to spend the extra money on more video memory on your GPU. In three years' time, maybe not, at which point your GTX 770 with 4GB of memory is going to be a bit of a slog anyway, and ergo, not future-proofed.

    Pay for what's good now, sell in 3 years, buy what's good then.
  • Rodpad 26 Jul 2013 12:54:25 2,964 posts
    Seen 7 hours ago
    Registered 8 years ago
    The 8800GT came out in 2006.

    If the 8800GT had 2GB of video memory do you think people would still be using it today?

    The 760 is not going to have the grunt to play games at settings that require more than 2GB in three years' time, let alone seven.

    Edited by Roddles at 12:55:18 26-07-2013
  • Deleted user 26 July 2013 12:57:35
    For £20, assuming that in all other ways the cards are equal (same bundled games etc.), then maybe.

    However, his general point is correct. Why not spend less when you need to spend it, rather than spending more now just to be 'ready' for when that extra memory is utilised?

    There's also that little thing that some devs do to grab headlines by claiming to do 'the most' of something in order to drive interest. Claiming to use 'the most' memory for graphics sounds just such a thing. Is the game any good? Who cares! It fills more of my GPU buffer than other games, and that's what matters!

    Reminds me of Sony bragging about BD capacity and filling it with extra crap to justify the bragging.
  • Deleted user 26 July 2013 13:01:20
    Diminishing returns have to kick in at some stage. "Just add an extra x in case of y happening in z years" isn't the best theory to subscribe to.