
I take it that NVIDIA users are enjoying this game more than ATI / AMD users?


JepFareborn


Mars II isn't a real production card; it's a 1,000-unit showoff card. AMD could easily make a quad 6990 and blow it away, and a quad 580 would drain all the power in the US.

 

AMD had the fastest single-GPU card 18 months ago, the 5870. They've had the fastest card overall since 2009, first with the 5970 and then the 6990.

 

The sad truth comes to light when you consider that AMD's dual-GPU cards consume less power than Nvidia's single-GPU cards and are usually only half the (chip) size.

 

Nvidia is rubbish and has been for ages. Now the 7970 dominates, the 7950 will be out in two weeks, followed soon by the monster 7990, and Nvidia won't have a new card for months. AMD will soon have four of the five fastest cards (7990, 6990, 7970 and 7950), with Nvidia holding only the 590 in the top five. It's embarrassing.

 

Thanks for making me laugh. Nvidia takes a huge steaming pile of *flowers and happy thoughts* on the joke that ATI is.

 

580 > any single-GPU AMD card till now (as always, they put out their new cards earlier; it's nothing new, they'll just have a small advantage for a while)

 

and 590 > any dual-GPU card from AMD.

 

Try again.

Edited by Skeelol


Pretty sure Nvidia users enjoy all games more than ATI users; after all, Nvidia has always been and will always be better. Same with Intel over AMD. Just go Intel and Nvidia and you'll never be disappointed.

 

That's such BS, though. I have gone ATI and have had the same amount of problems I did with Nvidia. It's all the luck of the draw, IMO. With BF3 my 5870 works flawlessly and gives performance I shouldn't even be able to get.

SWTOR performs alright except in high-density areas, where I get hitching (random FPS drops),

which is 100% the game engine.

Another problem I get is degraded NPC quality when not in a cinematic. If you look closely you can see the difference.


I gave up on AMD; I'd rather pay a little extra for the NVIDIA cards. I also suggest anyone with an AMD CPU go over to an i5 or i7 if you have the cash. My uber "Phenom II 960 6-core CPU" runs worse than an i5 from Intel.

 

I actually have to overclock my CPU for this game to run smoother. Also, anyone saying that the GPU is more important for SWTOR is horribly wrong: this game requires more CPU power than GPU power.

Edited by siegedeluxe

HD 6970; the only time I drop below 30 FPS is on the Imperial station. So long as I'm above 30 I don't care how many FPS I get; it's not like the eye can see more than that.

 

You know that's not really true, right? Your brain can tell the difference, unless you have some rare disorder. How important that difference is to a person, well, that's subjective.

Edited by Daemonjax

Running Nvidia and have been for years, along with AMD. I can't deal with ATI anymore; Nvidia just does it better, and I can't stand Intel with its "Hyper-Threading" garbage.

 

Anyway, now that my personal opinions, which mean absolutely nothing to anyone but me, are out of the way: I'm on an AMD/Nvidia combo rig that I built about 3-4 months ago. To say it is two to three times the optimal rig for this game would be an understatement, and the game still doesn't run or look very good.

 

They need to address these issues and quit having people reinstall 500 times and redo their systems. The problem is in their game, not your rig, 99% of the time.


Thanks for making me laugh. Nvidia takes a huge steaming pile of *flowers and happy thoughts* on the joke that ATI is.

 

580 > any single-GPU AMD card till now (as always, they put out their new cards earlier; it's nothing new, they'll just have a small advantage for a while)

 

and 590 > any dual-GPU card from AMD.

 

Try again.

 

Every single review and tech site online now would disagree with you.

 

As of a month ago you could run a 6990 tri-fired with a 6970 against pretty much any Nvidia triple-SLI setup possible (tri-fire because it's about the GPUs), and in some instances you would get an 80% increase in performance.

 

The 6990 is a lot better than the 590 according to every review and test available, and it runs on less power too. The 6970 is a better card than the 580 according to most mags and online reviewers, including when you CrossFire them against each other.


Well, movies are only filmed at 24 frames per second and they don't look jerky or choppy.

 

 

 

Pause a movie while something is moving quickly in a scene.

 

Take a screenshot of a video game while something is moving quickly in a scene.

 

Compare the images, and tell me what you see that's different.

 

 

I'll save you the trouble. The frames from the movie will have motion blur. That blur tricks your brain into perceiving smooth, continuous motion even at only 24 frames per second; in essence, your brain fills in the gaps. You don't get that kind of motion blur in video games (yet), so you need a higher FPS for scenes with motion to look realistic.
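If it helps, here's a rough Python sketch of the idea (purely illustrative; the object speed, shutter angle, and sample count are made-up numbers, not anything from SWTOR or a real renderer):

```python
# Illustrative sketch only: a 24 fps film frame integrates motion over the
# time the shutter is open, while a game renders one crisp instant.

FRAME_RATE = 24.0        # frames per second
SHUTTER_ANGLE = 180.0    # typical film shutter: open for half of each frame
SUBSAMPLES = 8           # sub-frame samples used to approximate the smear


def object_position(t):
    """Horizontal position (px) of an object moving at 2000 px/s."""
    return 2000.0 * t


def game_frame(frame_index):
    """A game samples the scene at one instant: a single crisp position."""
    t = frame_index / FRAME_RATE
    return [object_position(t)]


def film_frame(frame_index):
    """A film frame averages the open-shutter interval: a smeared streak."""
    t0 = frame_index / FRAME_RATE
    exposure = (SHUTTER_ANGLE / 360.0) / FRAME_RATE  # shutter-open time (s)
    return [object_position(t0 + exposure * i / (SUBSAMPLES - 1))
            for i in range(SUBSAMPLES)]


print("game frame 0:", game_frame(0))
print("film frame 0:", film_frame(0))
# The film frame spans roughly 42 px of travel, so your brain reads the
# motion between frames; the game frame is a single point, which is why the
# same 24 fps looks choppy in a game but smooth in a movie.
```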

Edited by Daemonjax

Pause a movie while something is moving quickly in a scene.

 

Take a screenshot of a video game while something is moving quickly in a scene.

 

Compare the images, and tell me what you see that's different.

 

 

I'll save you the trouble. The frames from the movie will have motion blur. That blur tricks your brain into perceiving smooth, continuous motion even at only 24 frames per second; in essence, your brain fills in the gaps. You don't get that kind of motion blur in video games (yet), so you need a higher FPS for scenes with motion to look realistic.

 

Ahhh that's interesting, thanks for the info!


Thanks for making me laugh. Nvidia takes a huge steaming pile of *flowers and happy thoughts* on the joke that ATI is.

 

580 > any single gpu AMD card til now (as always they put out their new cards earlier, it's nothing new, they'll just have a small advantage for a while)

 

and 590 > any dualgpu from AMD.

 

Try again.

 

http://www.hardocp.com/article/2011/03/24/asus_geforce_gtx_590_video_card_review/9

 

http://hexus.net/tech/reviews/graphics/29724-asus-nvidia-geforce-gtx-590-graphics-card-reviewed-rated/?page=5

 

http://www.legitreviews.com/article/1576/11/

 

http://techgage.com/article/amd_radeon_hd_6990_dual-gpu_graphics_card_review/12

 

http://hothardware.com/Reviews/NVIDIA-GeForce-GTX-590-Dual-GF110s-One-PCB/?page=5


Pretty sure Nvidia users enjoy all games more than ATI users; after all, Nvidia has always been and will always be better. Same with Intel over AMD. Just go Intel and Nvidia and you'll never be disappointed.

 

Except that the balance of power is never stable. One side always tackles the other off its throne, and the masters must attempt to restore the balance. It has not been long since the AMDire had the upper hand, but then the Intel council came up with the C2D and took it back. Same in the GFX department: ATI used to dominate before the 6600/6800 range.


I've been reading a lot of threads about horrible frame rate issues on decent gaming rigs and whether the game engine is to blame. I've got a BioStar ATI Radeon HD 4670 and a Sparkle NVIDIA GeForce 9600 GT. Obviously my Nvidia card gave me way smoother performance, but the ATI card is supposed to be superior. So I take it that any upcoming driver release will not resolve anything, only a game engine tweak patch?

 

No, I enjoy it a lot, getting 58-118 FPS with an ATI 5970.


I've been reading a lot of threads about horrible frame rate issues on decent gaming rigs and whether the game engine is to blame. I've got a BioStar ATI Radeon HD 4670 and a Sparkle NVIDIA GeForce 9600 GT. Obviously my Nvidia card gave me way smoother performance, but the ATI card is supposed to be superior. So I take it that any upcoming driver release will not resolve anything, only a game engine tweak patch?

 

ATI has technically superior cards. Unfortunately, for all that they are technically superior, the same results can be had a few hundred dollars cheaper from an nVidia card.

 

I've always found it to be hilarious. :D


I stopped using NVIDIA a looooooooooooong time ago. In the past they've been caught a few times (somewhere around the early 2000s) using trickery in their so-called "magic" drivers to get extra FPS.

 

They would have the card render some of the frames in a set badly so that it could push out more frames, and then they could come out and say, "Oh look! We're faster than card X now!"

 

I just don't trust them anymore. Not saying ATI is perfect either, but they haven't been busted for that.


Haven't referred to them as ATI for a few years; I just call it AMD.

 

I build my computers for gaming for the most part and look for performance more than anything else; I don't care for really flashy-looking cases or excessive PSUs.

 

Since this is gaming-focused, I acknowledge most games these days at 1920x1080 are GPU-limited, not CPU-limited. That is why, in a rig built around a solid high-end graphics card, a cheap Phenom or i5 750 will do nearly as well as a more expensive Sandy Bridge. That is also why I always buy AMD CPUs: they're cheaper. You pay a premium for Intel CPUs if you're focusing on gaming or just general media and multitasking; brand recognition is all it is, really. On top of that, Intel-based motherboards have been significantly pricier than AMD boards. So lately I have been building systems based on AMD's CPUs, and with the money saved on a cheaper CPU, I just get a better graphics card.

 

As for graphics cards, it's really been a toss-up between AMD and Nvidia for years. I was never invested in either camp.

 

I acknowledge AMD cards tend to have a few more bugs when games launch, but they've improved by leaps and bounds, and most of these issues are now resolved within a few weeks of release, if not earlier.


I love how you all act like there's such a huge difference between, say, 90 FPS and 50. Bottom line: the human eyes and brain cannot process more than 40-45 frames per second anyway, so all that power above and beyond that point is pretty much wasted energy.

 

Dude, this is not true; I don't even get where people got this from. It depends on how the frames are rendered. The way movies and games are rendered is very different.

 

I can easily tell the difference between 45 and 60 FPS in SWTOR. I can no longer tell the difference past ~80 FPS.
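For what it's worth, here's a quick back-of-the-envelope calculation (just arithmetic, not benchmark data) showing why the jump from 45 to 60 FPS is obvious while anything past ~80 is hard to notice: the time saved per frame shrinks fast as FPS climbs.

```python
# Frame-time arithmetic: the gap between frame rates is easier to judge in
# milliseconds per frame than in raw FPS.

def frame_time_ms(fps):
    """Milliseconds spent displaying each frame at a given frame rate."""
    return 1000.0 / fps

for low, high in [(30, 45), (45, 60), (60, 80), (80, 100)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low:>3} -> {high:>3} fps: each frame arrives {saved:.1f} ms sooner")
```

Going from 45 to 60 shaves off more than twice as many milliseconds per frame as going from 80 to 100.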

