
AMD Users Only


DarthKasadaRhur


In terms of GPU it all depends on what you are after. Top tier right now, the 295X2 in some cases outperforms the Titans and is way ahead on price, so that would be your best bet for a single-card top end. Next would have to be the GTX 980 and 970 for price vs performance, with the 970 on par with an R9 290X in performance but a much better value, and then you have the option of SLI, I think with the 980s, and still come in at a better price than the 295X2 (it may have been the 970 that made it more affordable yet more powerful, I forget).

Below that you would have to give the edge to AMD still, with the R9 290, 280X, 280, etc., depending on your price point and what you can afford/want for a pure performance vs price equation. Pretty tight market now, really, and I would imagine those 290X's will drop again pretty soon too.

 

I agree. Looking at upgrading to an R9 very soon. Thank you for your input.


Hey all!

 

Not sure if there is a support forum on this website strictly for users of AMD products, but if there isn't I want to start one.

 

Over the summer I built a new gaming rig that incorporates, among other components, an AMD FX-6130 processor and a Sapphire Radeon R7 240 graphics card with 2 GB of video RAM.

 

So far it works sweet when I play SWTOR. And if anyone knows a way to make it work sweeter, please post your thoughts.

 

Cheers!!!:rak_03:

I hate to be the bearer of bad news, but those specs are pretty terrible. I'm talking lowest of the lows.

Anybody saying AMD processors are "better" than Intel's is just trolling. In no measurable way are they "better".

 

I am, however, a complete fanboy of their GPUs. I have always considered that, bang for buck, they cannot be beaten.

Price/performance ratio for heavily multi-threaded workloads. That's how they are better. The FX 8000 series has been a viable option since its release and still is today, even though it's had a long lifespan for a CPU. That's impressive engineering considering AMD's yearly budget is more like Intel's monthly budget for R&D alone.
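To show what "heavily multi-threaded" means in practice, here's a rough Java sketch that just splits a stand-in workload across every core the machine reports (the workload itself is filler; an FX 8000-series chip would report 8 cores):

```java
import java.util.concurrent.*;

public class MultiCoreSketch {
    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors(); // 8 on an FX 8000-series
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        CompletionService<Long> tasks = new ExecutorCompletionService<>(pool);

        // Stand-in workload: each core sums its own chunk of numbers.
        long chunk = 100_000_000L;
        for (int c = 0; c < cores; c++) {
            final long start = c * chunk;
            tasks.submit(() -> {
                long sum = 0;
                for (long i = start; i < start + chunk; i++) sum += i;
                return sum;
            });
        }

        long total = 0;
        for (int c = 0; c < cores; c++) total += tasks.take().get();
        System.out.println("Total: " + total + " using " + cores + " threads");
        pool.shutdown();
    }
}
```

The more of your day-to-day work looks like that loop, the more the extra cheap cores pay off.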


How clueless are most other people in this thread when it comes to hardware?

 

It really bothers me when my fellow techies make comments like this. It's just rude. You could simply have said you disagreed with most of them without insulting the entire group with a blanket statement like this.

 

---end of reply to quote--

 

Anyone who builds computers and is capable of putting ego aside for a few minutes knows that there is no "best" piece of hardware for everyone. Every single user is unique and has unique needs from their rig, so the entire concept of "best" is highly subjective from the start because I don't need the same hardware as the next guy or the guy who came before me.

 

AMD needs to step up their R&D in my opinion. They've fallen way behind Intel on die size and power consumption, almost to the point where the power draw of AMD chips negates the savings an AMD chip offers over Intel. The cost of running an AMD chip on a 24/7 basis over a couple of years is almost equal to the difference in price between that same AMD chip and a comparable i3 or i5. There is still enough room to save some money by choosing AMD, since the Intel chip also has some cost associated with running it 24/7 over the same period, but every year we see Intel chips becoming more and more efficient while AMD chips continue to ship with the same specs as last year's chip, just at slightly higher clock speeds. If AMD doesn't make some serious changes to their architecture, they will find themselves pushed out of the market if current trends at both companies continue.
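To put very rough numbers on that (these figures are assumptions for illustration, not measurements): say the AMD chip pulls about 60 W more under load than a comparable i3/i5, runs 24/7 for two years, and electricity costs $0.12 per kWh. A quick back-of-the-envelope in Java:

```java
// Back-of-the-envelope sketch with assumed numbers (not measurements).
public class RunningCostSketch {
    public static void main(String[] args) {
        double extraWatts = 60.0;                       // assumed extra draw vs a comparable i3/i5
        double hours = 24 * 365 * 2;                    // two years, running 24/7
        double extraKwh = extraWatts * hours / 1000.0;  // ~1,051 kWh
        double pricePerKwh = 0.12;                      // assumed electricity price, $/kWh
        double extraCost = extraKwh * pricePerKwh;      // ~$126
        System.out.printf("Extra energy: %.0f kWh, extra cost: $%.2f%n", extraKwh, extraCost);
    }
}
```

That comes out to roughly the scale of the price gap the paragraph above is talking about.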


I hate to be the bearer of bad news, but those specs are pretty terrible. I'm talking lowest of the lows.

 

 

Oh, I beg to differ, my friend. I typically get 75+ fps when playing this game. Often that figure approaches 95 fps, and that, mind you, is with no overclocking. SWTOR is an MMO, which, correct me if I am wrong, means that no one needs absolutely super powerful hardware to play it.


It really bothers me when my fellow techies make comments like this. It's just rude. You could simply have said you disagreed with most of them without insulting the entire group with a blanket statement like this.

 

---end of reply to quote--

 

Anyone who builds computers and is capable of putting ego aside for a few minutes knows that there is no "best" piece of hardware for everyone. Every single user is unique and has unique needs from their rig, so the entire concept of "best" is highly subjective from the start because I don't need the same hardware as the next guy or the guy who came before me.

 

AMD needs to step up their R&D in my opinion. They've fallen way behind Intel on die size and power consumption, almost to the point where the power draw of AMD chips negates the savings an AMD chip offers over Intel. The cost of running an AMD chip on a 24/7 basis over a couple of years is almost equal to the difference in price between that same AMD chip and a comparable i3 or i5. There is still enough room to save some money by choosing AMD, since the Intel chip also has some cost associated with running it 24/7 over the same period, but every year we see Intel chips becoming more and more efficient while AMD chips continue to ship with the same specs as last year's chip, just at slightly higher clock speeds. If AMD doesn't make some serious changes to their architecture, they will find themselves pushed out of the market if current trends at both companies continue.

 

This.

 

Power consumption is one reason I replaced my Sandy Bridge with Haswell. Not the only reason, but it helped.

 

Regarding what system works best, as you say, that depends on the user and what they actually need. Frankly, most people don't "need" more than a decent dual core CPU and built in graphics, since most people don't actually play games like SWTOR.

 

For those who do, a decent $150 video card is all most people really need, unless they have a high-res monitor.

 

Is it nice to have more? Yes, it is... But it isn't a "need".


Oh, I beg to differ, my friend. I typically get 75+ fps when playing this game. Often that figure approaches 95 fps, and that, mind you, is with no overclocking. SWTOR is an MMO, which, correct me if I am wrong, means that no one needs absolutely super powerful hardware to play it.

 

What resolution and graphics settings are you using?

 

I assume you have no reason to lie, so I'll just believe you, but I find that hard to imagine.


It really bothers me when my fellow techies make comments like this. It's just rude. You could simply have said you disagreed with most of them without insulting the entire group with a blanket statement like this.

 

---end of reply to quote--

 

Anyone who builds computers and is capable of putting ego aside for a few minutes knows that there is no "best" piece of hardware for everyone. Every single user is unique and has unique needs from their rig, so the entire concept of "best" is highly subjective from the start because I don't need the same hardware as the next guy or the guy who came before me.

 

AMD needs to step up their R&D in my opinion. They've fallen way behind Intel on die size and power consumption, almost to the point where the power draw of AMD chips negates the savings an AMD chip offers over Intel. The cost of running an AMD chip on a 24/7 basis over a couple of years is almost equal to the difference in price between that same AMD chip and a comparable i3 or i5. There is still enough room to save some money by choosing AMD, since the Intel chip also has some cost associated with running it 24/7 over the same period, but every year we see Intel chips becoming more and more efficient while AMD chips continue to ship with the same specs as last year's chip, just at slightly higher clock speeds. If AMD doesn't make some serious changes to their architecture, they will find themselves pushed out of the market if current trends at both companies continue.

 

Thank you for your feedback. I will keep this in mind when I build my next rig. Most likely in a year or so.


How clueless are most other people in this thread when it comes to hardware?

Well, for me, I have been building PCs for over 20 years, since back when an HDD controller was an add-in card.

I worked as an engineer for Intel for over a decade, including as a processor designer and a PC chipset architect.

The first 32-bit processor that I was architect of made first silicon back in the mid-1980's.

 

So don't be smugly dissing the cluefulness of "other people." You have no idea who they might be.


Well, for me, I have been building PCs for over 20 years, since back when an HDD controller was an add-in card.

I worked as an engineer for Intel for over a decade, including as a processor designer and a PC chipset architect.

The first 32-bit processor that I was architect of made first silicon back in the mid-1980's.

 

So don't be smugly dissing the cluefulness of "other people." You have no idea who they might be.

 

My experience is likewise old... Wrote my first program in BASIC on the Apple II a very long time ago. :)

 

That being said, such knowledge is almost useless today when it comes down to picking parts to build gaming rigs in 2014. The basic Computer Science stuff still applies, but knowledge of HDD add-in controllers and how to low-level format is rather pointless in 2014.

 

I don't miss the headaches of manual IRQ selection, having cards that only work in this or that slot, and the fun of serial and parallel ports.

 

Frankly, it is all much nicer today in so many ways...

 

I imagine that 30 years from now we could have the same conversation again. :)


I admit to having fallen a bit behind since I left the industry (I used to do purchasing and inventory, along with anything else that needed doing that day, for a retail/repair PC business), but I always laugh when someone starts calling other people "idiots" and "ignorant" because they aren't up on the projected performance of cards that don't release until next quarter.

My experience is likewise old... Wrote my first program in BASIC on the Apple II a very long time ago. :)

 

That being said, such knowledge is almost useless today when it comes down to picking parts to build gaming rigs in 2014. The basic Computer Science stuff still applies, but knowledge of HDD add-in controllers and how to low-level format is rather pointless in 2014.

 

I don't miss the headaches of manual IRQ selection, having cards that only work in this or that slot, and the fun of serial and parallel ports.

 

Frankly, it is all much nicer today in so many ways...

 

I imagine that 30 years from now we could have the same conversation again. :)

 

Honestly, as a programmer, I miss serial ports being standard on desktops. It made integrating devices like scanners and barcode readers into in-house developed software tools a lot easier than today's USB devices.

 

I'm an odd bird in the industry though. The bulk of my education was centered around .NET programming in Visual Studio 2005 and 2008, but I've written more code in VB6 at my job than anything else. The .NET apps I've worked on were fairly easy to use with the USB devices, but I had to write my own class for VB6 to use them when the computers around the office stopped having serial ports.
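For anyone curious, here's roughly what the serial side of that looked like, sketched in Java rather than VB6. It assumes the open-source jSerialComm library for the port handling, and the port name, baud rate, and line-terminated scan format are just illustrative guesses for a typical scanner:

```java
import com.fazecast.jSerialComm.SerialPort;
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class BarcodeReaderSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical port name; on Windows this would be something like "COM3".
        SerialPort port = SerialPort.getCommPort("COM3");
        port.setComPortParameters(9600, 8, 1, SerialPort.NO_PARITY); // assumed scanner settings
        port.setComPortTimeouts(SerialPort.TIMEOUT_READ_BLOCKING, 2000, 0);
        if (!port.openPort()) {
            System.err.println("Could not open serial port");
            return;
        }
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(port.getInputStream()))) {
            // Assumption: the scanner is configured to terminate each scan with CR/LF.
            String scan = in.readLine();
            System.out.println("Scanned: " + scan);
        } finally {
            port.closePort();
        }
    }
}
```

The appeal of serial was exactly this: open the port, read lines, done. No driver-specific USB plumbing.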


My experience is likewise old... That being said, such knowledge is almost useless today when it comes down to picking parts to build gaming rigs in 2014. :)

Perhaps, but not much has changed, really, in PCs since I left Intel a mere few years ago. The Core processors are still using an evolution of the Pentium III (which was derived from the Pentium Pro, which was derived from the i960MX, which was derived from the BiiN dataflow processor) for their cores. Mass storage is still a bottleneck. GPUs are still being pushed to their limits by increasing monitor pixel counts and increasingly realistic games. Getting heat off the chip is still the main limitation on how much performance a CPU can deliver.

 

Still, I'm constantly amazed at what you can put together for a grand these days.


Well, for me, I have been building PCs for over 20 years, since back when an HDD controller was an add-in card.

I worked as an engineer for Intel for over a decade, including as a processor designer and a PC chipset architect.

The first 32-bit processor that I was architect of made first silicon back in the mid-1980's.

 

So don't be smugly dissing the cluefulness of "other people." You have no idea who they might be.

 

I was an Applied Mathematics major as an undergraduate. Minored in Computer Science, but mostly focused on the software side. Programming in Java, C and C++ was enough for me. LOL!


My experience is likewise old... Wrote my first program in BASIC on the Apple II a very long time ago. :)

 

That being said, such knowledge is almost useless today when it comes down to picking parts to build gaming rigs in 2014. The basic Computer Science stuff still applies, but knowledge of HDD add-in controllers and how to low-level format is rather pointless in 2014.

 

I don't miss the headaches of manual IRQ selection, having cards that only work in this or that slot, and the fun of serial and parallel ports.

 

Frankly, it is all much nicer today in so many ways...

 

I imagine that 30 years from now we could have the same conversation again. :)

 

Let's have the same conversation in another 30 years. By then, my son will be on a board like this!


Honestly, as a programmer, I miss serial ports being standard on desktops. It made integrating devices like scanners and barcode readers into in-house developed software tools a lot easier than today's USB devices.

 

I'm an odd bird in the industry though. The bulk of my education was centered around .NET programming in Visual Studio 2005 and 2008, but I've written more code in VB6 at my job than anything else. The .NET apps I've worked on were fairly easy to use with the USB devices, but I had to write my own class for VB6 to use them when the computers around the office stopped having serial ports.

 

Do you have any experience with Java?


It really bothers me when my fellow techies make comments like this. Its just rude. You could simply have said you disagreed with most of them without insulting the entire group with a blanket statement like this.

 

When I read a thread and most people posting in it are stating things as "fact" rather than opinion, and are for the most part wrong to begin with, then yes, I will happily call them clueless, because they are ... clueless?

 

So don't be smugly dissing the cluefulness of "other people." You have no idea who they might be.

 

Was said PC-building expertise a requirement for reading? If not, then I will direct you to the word "most" I used in my statement (for it to be a blanket statement, as claimed above, I believe I would have had to replace the word "most" with the word "the"). Combine that with the bit you conveniently left out of your quotation of me, where I agreed with your point on SSDs and said "This is about the only factual statement in this thread", implying I am most likely not talking about you as you would seem to believe, since you've justified YOUR expertise with PCs and not other users', and you end up at the net result that I was clearly not directing my use of the word "most" towards you. ;)

 

I don't miss the headaches of manual IRQ selection, having cards that only work in this or that slot, and the fun of serial and parallel ports.

 

I personally used to enjoy that level of customisation. It's more or less the main reason I partake in overclocking: not so much for the performance gains but more so for the minor challenge of getting it all working nicely and stable. Not that I go too far out of my way with aftermarket cooling, though building good liquid cooling loops looks like fun and can make for some pretty interesting-looking PCs too.


Do you have any experience with Java?

 

I have a lot of experience with the Swing library in Java, but I haven't used any of it in about 5 or 6 years and would probably have to refer back to the API docs for everything if I had to do any Java coding.

 

C# and Visual Basic are my bread & butter now. I have a love-hate relationship with Visual Basic though. I only use it because my employer requires it.


I have a lot of experience with the Swing library in Java, but I haven't used any of it in about 5 or 6 years and would probably have to refer back to the API docs for everything if I had to do any Java coding.

 

C# and Visual Basic are my bread & butter now. I have a love-hate relationship with Visual Basic though. I only use it because my employer requires it.

 

I've heard that about Visual Basic. I do some coding in Java from time to time for personal projects but that's about it. Thought about becoming a software engineer at one point. Then I spoke with a friend who is a software engineer and he said the job is somewhat overrated.


I've heard that about Visual Basic. I do some coding in Java from time to time for personal projects but that's about it. Thought about becoming a software engineer at one point. Then I spoke with a friend who is a software engineer and he said the job is somewhat overrated.

 

Yeah, most programmers have been shying away from VB since the turn of the century. It's not a bad language, and there really isn't much difference between it and C#. If you can code in one, you can code in the other. The primary difference is that VB is verbose and keyword-based, while C# is symbol-based: in layman's terms, you end most logical structures in VB by typing End (End If, End Sub, and so on), while in C# you wrap the block of code in curly braces ({ }). The API and environment are almost identical once you learn the different syntax for each language.
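To illustrate (the curly-brace style is shown in Java here, which for this purpose looks nearly identical to C#; a rough VB-style equivalent is sketched in the comments):

```java
public class SyntaxContrast {
    public static void main(String[] args) {
        int total = 0;
        // C#/Java style: each block is wrapped in curly braces.
        for (int i = 1; i <= 10; i++) {
            if (i % 2 == 0) {
                total += i;
            }
        }
        System.out.println("Sum of evens 1-10: " + total); // prints 30

        // The VB equivalent of the loop above would read roughly:
        //   For i As Integer = 1 To 10
        //       If i Mod 2 = 0 Then
        //           total += i
        //       End If
        //   Next
        // i.e. keywords like "End If" and "Next" close each block
        // instead of a closing brace.
    }
}
```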

 

I would have loved to have stuck with Java and moved into a Java developer career path, but I live in a city where the industry is lagging and everyone is still stuck on VB and VB.NET. Where I live, the only company that employs Java developers is Kroger, and they want you to spend 2 years as an intern paying union dues out of your own pocket while not even drawing a paycheck, just to get your foot in the door.


Yeah, most programmers have been shying away from VB since the turn of the century. It's not a bad language, and there really isn't much difference between it and C#. If you can code in one, you can code in the other. The primary difference is that VB is verbose and keyword-based, while C# is symbol-based: in layman's terms, you end most logical structures in VB by typing End (End If, End Sub, and so on), while in C# you wrap the block of code in curly braces ({ }). The API and environment are almost identical once you learn the different syntax for each language.

 

I would have loved to have stuck with Java and moved into a Java developer career path, but I live in a city where the industry is lagging and everyone is still stuck on VB and VB.NET. Where I live, the only company that employs Java developers is Kroger, and they want you to spend 2 years as an intern paying union dues out of your own pocket while not even drawing a paycheck, just to get your foot in the door.

 

That's terrible! Paying the union dues wouldn't bother me. Not getting a paycheck would result in an automatic adios!!!


Let me know what kind of performance you get in SWTOR when you do (if you do) play the game using the AMD A8-7600 APU.

After copying the SWTOR folder to the HTPC and running it, it defaulted to low graphics settings at 800x600. After changing that to 1920x1080, and sitting in my SH, I was getting close to 50-60 fps (using ctrl-shift-f). But, when I went to the fleet, the fps dropped to 34-36. From that, I'm guessing that it wouldn't be very playable in most of PvE, but it should be fine for crafting, light questing, etc.

I still intend to do some more with it, but I didn't copy the Documents\SWTOR folder over, and that appears to be where my GUI setups are - EDIT: or maybe not - plus I still need to set up the controls, etc.

 

Btw, I'm quite happy with that performance. It's about what I'd expected.

