The GPU Blues: Looking to see if the 6570 is enough or if I need more.

Anthony N - Well-known member - Joined: Jun 13, 2012 - Posts: 619 - Location: Michigan, USA
OK, so as I said, I'm gonna be playing The Sims 3 quite a bit.

I'm really going back and forth on whether the 6570 card will do.

It has:
Memory Interface: 128-bit
Memory Type: DDR3
Stream Processors: 480

A long time ago I had a 4850 over AGP, and it ran at about 48% to 62% load at any time during gameplay, with pretty much everything on max.

I guess the big question here is will the 6570 be enough or will I need something more?

Also, is there much of a difference between 128-bit cards and 256-bit cards?

And between DDR3 memory and GDDR3 / GDDR5?
 
Unfortunately, you did not tell us how much memory that card has, or anything else about your system.

See The Sims 3 System Requirements.

Also, is there much of a difference between 128-bit cards and 256-bit cards?

And between DDR3 memory and GDDR3 / GDDR5?
256-bit cards have the potential to provide better performance because the bus is twice as wide. For all intents and purposes, DDR3 = GDDR3. The G just means the memory modules are physically configured for mounting on graphics solutions (cards or motherboards), while the standard DDR3 modules (the actual chips, or ICs) are physically configured for mounting on memory sticks.

GDDR5 is only found on graphics cards and (on paper, anyway) offers 2x the bandwidth of GDDR3. I say on paper because it is most likely you will not see 2x the bandwidth as other factors come into play. However, GDDR5 is about 30% more efficient than GDDR3, and so it consumes less power and generates less heat for the same, or slightly better (though maybe not noticeable - but on paper), performance.
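To put rough numbers on the bus-width and memory-type points, here is a quick back-of-the-envelope sketch. The effective clock rates below are illustrative assumptions, not the specs of any particular card, so treat the results as ballpark theoretical peaks only:

```python
# Theoretical peak memory bandwidth = (bus width in bytes) x (effective data rate).
# The effective clock figures below are illustrative assumptions, not real card specs.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mt_s: float) -> float:
    """Return the theoretical peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8            # a 128-bit bus moves 16 bytes per transfer
    return bytes_per_transfer * effective_clock_mt_s * 1e6 / 1e9

print(peak_bandwidth_gb_s(128, 1800))   # 128-bit (G)DDR3 @ 1800 MT/s -> ~28.8 GB/s
print(peak_bandwidth_gb_s(128, 4000))   # 128-bit GDDR5  @ 4000 MT/s -> ~64.0 GB/s
print(peak_bandwidth_gb_s(256, 1800))   # 256-bit GDDR3  @ 1800 MT/s -> ~57.6 GB/s
```

On paper, doubling the bus width or moving to GDDR5's higher effective rate lands you in roughly the same place; real-world gains are smaller because other factors come into play.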

If you have a choice, go GDDR5. However, I would NOT discount a good GDDR3 card if the rest of the specs and price are right.
 
Whoops, my bad.

And I don't pay attention to those; they are the minimum requirements to make the game "playable".

I'm looking beyond "playable". The reason I am picking on the GPU is because the GPU is what seems to suffer most.

The CPU I picked is an Intel 2.9GHz dual-core. I might switch to an AMD and get two more cores. However, The Sims 3 does not behave like a multi-threaded game, unlike its older brother The Sims 2.

The GPU I picked is a 1GB card.
 
And I don't pay attention to those; they are the minimum requirements to make the game "playable".
I realize that. I'm just saying that with that information, you have a baseline minimum for that game.

I'm looking beyond "playable".
Of course. But the game makers know most players cannot afford a $400 card (or 2! or 3!!!), so they design games to have acceptable "game play" on lesser systems. The main "action" is pretty much the same, but the background resolutions, side activities, and objects may be limited. That said, since games are so graphics intensive, it is pretty much a rule: the more money, the better the game play.

My picked CPU is an Intel 2.9GHz dual-core. I might switch to an AMD and get two more cores.
Not without picking a whole new motherboard. AMD and Intel CPUs are not interchangeable. But nevertheless, you don't need to switch to AMD to get more cores. That said, if you already have the motherboard, you may be able to upgrade to a better Intel CPU. But note that if you get a different motherboard, you most likely will have to buy a new Windows license, as most are OEM licenses tied to the "original equipment" they were purchased for or with - and that generally breaks down to the motherboard as the heart and soul of any computer.

Personally, if I were a gamer, I would not get a card with less than 2GB of RAM on board - if my budget allowed. And as someone who cannot live without multiple monitors, 2GB is nice too.

Just don't make the mistake of cutting the budget on your PSU. It is critical to get a quality 80 Plus certified PSU with ample power and from a reputable maker. Would you buy a new Porsche and fill up at the corner Tobacco and Wine Hut?
 
I know they are not interchangeable.

Geez Digerati, I mean I know you post general information so that anyone that reads the thread has an understanding.

-- On a second note, I'm sorry I seem snappy, I forgot you probably don't know how much IT knowledge I have.

And this is a new build, so Windows is of no concern to me. Just the hardware.

The only reason I thought about AMD was CPU cost. I know if I switch CPU manufacturers it opens a whole new ballgame.

I'd never cheap out on my PSU. That's where the majority of the allotted cost goes.

I generally use this order in terms of how much money I'll dish out (not a set-in-stone order or price bracket):
  1. PSU
  2. CPU
  3. Motherboard
  4. GPU
  5. RAM
  6. Storage
  7. Case
  8. Additional Cooling
  9. Add-on Cards and accessories

This is the netbook replacement. So it doesn't have to win the Indy 500, but I want it to be able to take what I throw at it for a decent dollar. I knew $650 wasn't gonna work, so I suckered in for dual monitors and I'm looking for other improvements. Right now I'm attacking the GPU and CPU.

With my current income, I can work things out. But for the most part, I'm not worried. I know when it gets down to the 'nose against the grindstone' I'll start watching my toes and what I order. But for now this is a blueprint. A plan, an idea of where I think I should head.
 
Geez Digerati, I mean I know you post general information so that anyone that reads the thread has an understanding. -- On a second note, I'm sorry I seem snappy, I forgot you probably don't know how much IT knowledge I have.
No, I am sorry I am not the mind reader you want me to be. No, I don't know your experience, but we also don't have much else to go on - such as your budget, the goals of the computer, or what you already bring to the table in terms of parts on hand.

And this is a new build, so Windows is of no concern to me.
And because it is a new build, ensuring you don't unintentionally do something illegal that might get you in trouble (like trying to use an OEM license that came with or was purchased for a different computer/motherboard on this new build), is a concern.

The only reason I thought about AMD was for cpu cost
While AMDs generally do have the advantage when it comes to cost, the cost of the CPU alone is not a good enough reason to choose AMD. The CPU is but one component in your computer, and once you factor in the motherboard, graphics, PSU, case, Windows license, RAM, drives, monitor, keyboard, mouse, and UPS, the difference in cost between an AMD system and an Intel system becomes minimal.
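A rough sketch of that arithmetic with hypothetical, round-number prices (none of these are real quotes): a CPU price gap that looks big on its own shrinks to a few percent of the whole build.

```python
# Hypothetical round-number prices, purely to illustrate the percentages involved.
shared_parts = {
    "motherboard": 120, "graphics": 150, "psu": 90, "case": 70, "windows": 100,
    "ram": 60, "drives": 90, "monitor": 150, "keyboard_mouse": 40, "ups": 80,
}
cpu_intel, cpu_amd = 200, 150   # assumed CPU prices with a $50 gap

total_intel = cpu_intel + sum(shared_parts.values())
total_amd = cpu_amd + sum(shared_parts.values())

print(f"CPU-only price gap:     {100 * (cpu_intel - cpu_amd) / cpu_intel:.0f}%")          # 25%
print(f"Whole-system price gap: {100 * (total_intel - total_amd) / total_intel:.1f}%")    # ~4.3%
```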

While the choice of PSU should be a top priority, it should also be your last purchase decision - done once you know the power requirements of the other components.
Right now I'm attacking the GPU and CPU.
That's fine, but note if I have to juggle the budget with compromises, having lots of RAM is more important than getting a bigger CPU. It does not even need to be the fastest RAM, just lots of it. I recommend 8GB for dual channel motherboards and 6GB for triple channel as the "sweet spots" for RAM. Less and performance takes a hit; more and the improvement is minimal at best, for most users.

And of course, you need 64-bit Windows to handle that amount of RAM.
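A minimal sketch of the reasoning, under the assumption that the sweet spots come from giving every memory channel an identical stick (common stick sizes assumed), plus the 32-bit address-space limit that makes 64-bit Windows necessary:

```python
# Matched-channel configurations (assumed reasoning behind the 8 GB / 6 GB sweet spots).
common_stick_sizes_gb = [1, 2, 4]

def matched_kits(channels: int, target_gb: int):
    """Stick sizes that fill every channel identically and hit the target capacity."""
    return [f"{channels} x {size} GB" for size in common_stick_sizes_gb
            if size * channels == target_gb]

print(matched_kits(2, 8))   # dual channel, 8 GB   -> ['2 x 4 GB']
print(matched_kits(3, 6))   # triple channel, 6 GB -> ['3 x 2 GB']

# And why 64-bit Windows is needed: a 32-bit address space tops out at 4 GiB total.
print(2**32 / 2**30, "GiB addressable with 32-bit addresses")   # 4.0 GiB
```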

Also, proper selection of a good case should eliminate the need for additional cooling.
 
All new parts, nothing existing.

Right now because of the monitor + shipping I'm right around the $800 mark.

As I read, a CPU is not just a CPU, a motherboard is not just a motherboard, a GPU is not just a GPU... etc., etc.

I'd like to stay in the $900 to $1000 price mark.

I've ordered the case; it's on its way.

Now it's down to the other components.

From past experience with The Sims 3, it played beautifully on an Intel P4 2.4GHz single-core, 3.5GB of RAM, and a 4850 Radeon over AGP @ 4x.

Well, now we're several years later, and I'm new to PCIe, other than knowing that its bandwidth trumps that of AGP roughly ten-fold.

So now I'm getting to experience newer chipsets and CPUs; now it's time to up the drive and kick some booty.

So let's do this thing!





P.S. I have about 9 years of self-taught IT experience, all from solving problems on my own or by trial and error.
 
As I read, a CPU is not just a CPU, a motherboard is not just a motherboard, a GPU is not just a GPU... etc., etc.
I don't know what that means. Actually, a CPU is pretty much just a CPU, as is the GPU. They are totally stupid, unable to do anything unless instructed to do something. The motherboard, on the other hand, is much more than just a motherboard. It is an integrated circuit board that also contains the network interface, sound, and often the graphics solution too. It contains its own mini operating system (the BIOS), I/O, and storage (CMOS).

From past experience with The Sims 3, it played beautifully on an Intel P4 2.4GHz single-core, 3.5GB of RAM, and a 4850 Radeon over AGP @ 4x.
You are trying to compare new apples to prehistoric oranges. P4 single cores and AGP are ancient history, especially AGP 4x, and neither should be used to judge how anything will run on modern hardware from either CPU or GPU makers.

P.S. I have about 9 years of self-taught IT experience, all from solving problems on my own or by trial and error.
And that's great, but it scares me to death too. Not necessarily you in particular, but all too often I have seen self-taught users doing hardware maintenance who have no clue and did not take the time to learn the basics of electronics or their safe handling. They put themselves or others into situations that are downright dangerous. Too often folks think that because they can swap out RAM or assemble a few parts in a case to build a computer, they are suddenly "technicians" and computer experts - when they have never heard of Ohm's Law or ESD, or don't understand the fact that anything that plugs into the wall can kill.

"Normal" people would never dream of opening the back of their TVs, monitors, or home theater audio receivers and poking around inside with highly conductive meter probes, but you see the ill-informed doing this, or advising this practice, with computers or, worse, power supplies. So for the record, I think it's great you are doing your homework by coming to places like Sysnative to research and verify. But I caution you to never assume you really know what you are doing without verifying first - if the safety of the equipment, you, or your data is a factor.

For the record, I got into computers professionally when I stood inside one in the mid 70s. It took up the whole floor of an ADC NORAD "blockhouse" and my job was to figure out how to interface my SLFCS (survivable low frequency communications system) radios to it. I say "my" - it was the Air Force's, but they sent me to school on it, put me in charge of it, and I had to train and certify the other radio techs before they could think about touching it. And also for the record, it took me over two years formal classroom, on-the-job, and certification training before I could call myself a technician.

And what has remained constant over the last 40+ years of working secure IS/IT hardware support is that things change, and the more you learn, the more you realize there is MUCH more to learn. And there's always someone nearby who knows a heck of a lot more than you do. :grin1:
 
As I read, a CPU is not just a CPU, a motherboard is not just a motherboard, a GPU is not just a GPU... etc., etc.
I don't know what that means. Actually, a CPU is pretty much just a CPU, as is the GPU. They are totally stupid, unable to do anything unless instructed to do something. The motherboard, on the other hand, is much more than just a motherboard. It is an integrated circuit board that also contains the network interface, sound, and often the graphics solution too. It contains its own mini operating system (the BIOS), I/O, and storage (CMOS).

You and I are on different pages with the CPU and GPU. I have a different outlook on those two. The motherboard I've always looked up to because of what it does.

From past experience with The Sims 3, it played beautifully on an Intel P4 2.4GHz single-core, 3.5GB of RAM, and a 4850 Radeon over AGP @ 4x.
You are trying to compare new apples to prehistoric oranges. P4 single cores and AGP are ancient history, especially AGP 4x, and neither should be used to judge how anything will run on modern hardware from either CPU or GPU makers.

I know that. What I'm trying to say is I'm trying to relive the past... just with faster, newer stuff.

P.S. I have about 9 years of self-taught IT experience, all from solving problems on my own or by trial and error.
And that's great, but it scares me to death too. Not necessarily you in particular, but all too often I have seen self-taught users doing hardware maintenance who have no clue and did not take the time to learn the basics of electronics or their safe handling. They put themselves or others into situations that are downright dangerous. Too often folks think that because they can swap out RAM or assemble a few parts in a case to build a computer, they are suddenly "technicians" and computer experts - when they have never heard of Ohm's Law or ESD, or don't understand the fact that anything that plugs into the wall can kill.

"Normal" people would never dream of opening the back of their TVs, monitors, or home theater audio receivers and poking around inside with highly conductive meter probes, but you see the ill-informed doing this, or advising this practice, with computers or, worse, power supplies. So for the record, I think it's great you are doing your homework by coming to places like Sysnative to research and verify. But I caution you to never assume you really know what you are doing without verifying first - if the safety of the equipment, you, or your data is a factor.

For the record, I got into computers professionally when I stood inside one in the mid 70s. It took up the whole floor of an ADC NORAD "blockhouse" and my job was to figure out how to interface my SLFCS (survivable low frequency communications system) radios to it. I say "my" - it was the Air Force's, but they sent me to school on it, put me in charge of it, and I had to train and certify the other radio techs before they could think about touching it. And also for the record, it took me over two years formal classroom, on-the-job, and certification training before I could call myself a technician.

And what has remained constant over the last 40+ years of working secure IS/IT hardware support is that things change, and the more you learn, the more you realize there is MUCH more to learn. And there's always someone nearby who knows a heck of a lot more than you do. :grin1:

Oh heck, I never take myself for a "know-it-all"; that's why I'm here: to further expand my knowledge with others' knowledge.

I'm extremely careful during the build process, cleaning, teardown, or anytime I'm working with electronics.
 
You and I are on different pages with the CPU and GPU. I have a different outlook on those two.
Different outlook based on what? Got a link to something to substantiate that? Because I don't know what page you could be on. CPUs and GPUs are just ICs, "microchips", little more than a bunch of transistors mounted on a tiny circuit board. And all a transistor knows is two things, high (1) or low (0), and it only knows one of those at a time!

The processors really know how to do next to nothing at this point - they're in a total "wait state" - and will keep knowing next to nothing until some program feeds them new data to crunch. And that does not happen until their respective chipsets and BIOSes are booted up, bus communications are established, and data is sent to them to crunch. And that is all done through the main boards (the motherboard for the CPU, and the graphics card for cards - or the graphics region of the motherboard for integrated graphics).

If you know something different, then please provide some supporting evidence because if that is not how it works I think a few [million] text books and microcircuit course curriculums need to change ASAP!

Make no mistake, today's CPUs and GPUs are remarkable examples of advanced design and manufacturing technologies. But beyond that, what they do is not remarkable at all, or anything new. They do one thing - they add two 32-bit or 64-bit numbers together, really fast. Really really fast! And then add two more. And two more - several billion times a second. Different numbers, but the same task over and over again. Not really hard.

As a hardware guy, the biggest fact that makes CPUs and GPUs so remarkable to me is the sheer quantity of transistors they pack on these dies - and consequently, how tiny each transistor must be. The first transistor I touched was in 1971 in radio maintenance tech school. It (as in 1 transistor) was 1/2 inch across. Today's 6-core i7 CPU has 2.27 billion transistors in the same space, and NVIDIA recently introduced a GPU with more than 7 billion (7,100,000,000) transistors on one chip!
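To make the "high or low, added together really fast" idea concrete, here is a minimal sketch of fixed-width binary addition built from nothing but single-bit decisions (a ripple-carry adder). It is not how any real ALU is laid out; it just illustrates how simple the core operation is:

```python
def full_adder(a: int, b: int, carry_in: int):
    """Add three single bits (each 0 or 1); return (sum_bit, carry_out)."""
    total = a + b + carry_in
    return total & 1, total >> 1

def add_fixed_width(x: int, y: int, width: int = 32) -> int:
    """Add two unsigned integers bit by bit, wrapping at 'width' bits like hardware does."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result   # the carry out of the top bit is dropped, so overflow wraps around

print(add_fixed_width(1234567, 7654321))   # 8888888
print(add_fixed_width(0xFFFFFFFF, 1))      # 0 -- wraps at 32 bits
```

A real CPU does this (and not much more, conceptually) in hardware, billions of times a second, which is exactly the point.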

I'm trying to say is I'm trying to relive the past... just with faster, newer stuff.
Well that is a mistake if you are judging current hardware (or their makers) on such legacy technologies as the AGP interface and single core CPUs, as it appeared by your comments you were. It is a mistake (technically speaking) to say you want AMD because your 10 year old P4 could not do this or that, just as it is a mistake to say NVIDIA is better because your old ATI gave you problems. Or to say you won't buy a new Ford today because they built the 73 Pinto. ;)

Nothing wrong with being loyal to a brand as long as it is for the right reasons. Both Intel and AMD make excellent, reliable CPUs and both NVIDIA and AMD make excellent, reliable graphics solutions.
 
*cracks fingers*
They do one thing - they add two 32-bit or 64-bit numbers together, really fast. Really really fast!
That is what I find so amazing about them. Just the sheer speed. We're at 4GHz stock clocks; in a few months we'll be breaking 5GHz, then 6, then 7. Then so fast you'd swear you're living it live.

As a hardware guy, the biggest fact that makes CPUs and GPUs so remarkable to me is the sheer quantity of transistors they pack on these dies - and consequently, how tiny each transistor must be. The first transistor I touched was in 1971 in radio maintenance tech school. It (as in 1 transistor) was 1/2 inch across. Today's 6-core i7 CPU has 2.27 billion transistors in the same space, and NVIDIA recently introduced a GPU with more than 7 billion (7,100,000,000) transistors on one chip!
That too. And we're just gonna keep getting smaller and smaller, faster and faster.

Well that is a mistake if you are judging current hardware (or their makers) on such legacy technologies as the AGP interface and single core CPUs, as it appeared by your comments you were. It is a mistake (technically speaking) to say you want AMD because your 10 year old P4 could not do this or that, just as it is a mistake to say NVIDIA is better because your old ATI gave you problems. Or to say you won't buy a new Ford today because they built the 73 Pinto. ;)

As I said before, I'm open to trying other manufacturers.

I've had the Athlon Thunderbird, Athlon XP Palomino, Celeron D, P3 Katmai & Coppermine, P4 and now I have Intel Atom and AMD Athlon II X2.

For GPUs I've been with ATI / AMD for add-on cards; however, I've only worked with onboard Nvidia chips.

Technically, I'd like to understand more about the parts themselves, such as the GPU, the CPU, the motherboard, etc.

And instruction order, such as when one clicks Windows Explorer, then a folder, then a file. Or, more in-depth, a video game, where you have files and textures that have to be loaded into RAM and then distributed all over the freaking place.




Edit:

On a second note, my uncle (one of them) told me a long time ago that there used to be no such thing as Windows (I grew up on Win95). You could say the look on my face was priceless. He said that everything used to take place through DOS.

My face is still priceless every time my mom pulls out the typewriter for the business tax season.

It surprises me that our digital playground, which manages everything, hinges on some very important low-level processes.
 
That is what I find so amazing about them. Just the sheer speed.
Well, okay, but what allows the CPU to run at those speeds? The motherboard, via the bus clock.
 
Ok, now that I did not know. I thought the CPU was the one that set the speeds and the FSB was the link between the CPU and the motherboard.
 
Keep hanging around with Digerati, Anthony. He's been doing this for a lot of years. (You don't have to agree with him or what he suggests may not be what you have in mind but it will likely pay off in the long run if you listen to what he has to say.)
 
Ok, now that I did not know. I thought the CPU was the one that set the speeds and the FSB was the link between the CPU and the motherboard.
Wrong on the first part, right on the second.

The bus is the highway for communications between the major components on the motherboard (CPU, RAM, Graphics, etc). While the engine in your car is able to run at 100mph, your car's speed will be limited by the capabilities of the transmission, fuel delivery, and road conditions, but ultimately (for illustrative purposes), the speed your car runs at is set by the speed limit of the highway.

The CPU sets the "maximum" speed it can run at, but ultimately it can only run as fast as the bus can deliver new chunks of data, so the bus speed establishes the communications "bandwidth", or speed of data transfer, between the CPU and the board.

How do you change the CPU speed? By changing the bus speed and multiplier - both motherboard settings, not CPU settings. That is why when you want to overclock to boost speeds, or underclock to reduce noise and heat, you change the motherboard timings, not the CPU's.
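A minimal sketch of that relationship, with illustrative numbers rather than any specific CPU's settings: the core clock is simply the board's base (bus) clock times a multiplier, which is why overclocking and underclocking are done from the motherboard's settings.

```python
def core_clock_mhz(base_clock_mhz: float, multiplier: float) -> float:
    """Core clock = base (bus) clock x multiplier, both of which are board-level settings."""
    return base_clock_mhz * multiplier

print(core_clock_mhz(100, 29))   # 2900 MHz -- e.g. a 2.9 GHz part at stock settings
print(core_clock_mhz(103, 29))   # 2987 MHz -- mild bus-clock overclock
print(core_clock_mhz(100, 32))   # 3200 MHz -- raised multiplier (on an unlocked chip)
```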
 
