
SuspectDown

Members
  • Posts: 286
  • Joined

Everything posted by SuspectDown

  1. I try to balance it by playing mostly when my wife isn't home, and on work days by limiting my PC time to 1-2 hours a day.
  2. Your 980 Ti won't even be fully utilized. SWTOR with a 970 is good up to 5K, maybe slightly more; GPU-wise, your 980 Ti would be good for even higher than 5K. I'd turn v-sync off for SWTOR. Yes, above 60 FPS (or above whatever your refresh rate is) it stops screen tearing at the expense of slight input lag, but more importantly, in a game like this where the FPS fluctuates rapidly, v-sync will not display the actual FPS your system is capable of rendering below your refresh rate. For example, if you have v-sync on and your refresh rate is 60 Hz: above 60 FPS you get 60 FPS; 30-59 FPS becomes 30 FPS; 20-29 FPS becomes 20 FPS; 15-19 FPS becomes 15 FPS. So if your FPS dips to even 59, you'll actually be running at 30 FPS with v-sync on. Higher-resolution textures probably won't hurt performance because this game is still CPU bound.
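The quantization described above can be sketched in a few lines of Python. This is a minimal model of classic double-buffered v-sync (no triple buffering), where the GPU can only flip on a refresh boundary, so the displayed rate is the refresh rate divided by a whole number of refresh intervals; the function name and numbers here are illustrative, not from any game's code.

```python
import math

def vsync_fps(rendered_fps: float, refresh_hz: int = 60) -> float:
    """Displayed FPS under double-buffered v-sync: the frame flip waits
    for the next refresh, so the effective rate is refresh_hz divided by
    the number of whole refresh intervals each frame takes."""
    intervals = max(1, math.ceil(refresh_hz / rendered_fps))
    return refresh_hz / intervals

print(vsync_fps(75))  # 60.0 -> capped at the refresh rate
print(vsync_fps(59))  # 30.0 -> one missed refresh halves the displayed rate
print(vsync_fps(25))  # 20.0
print(vsync_fps(16))  # 15.0
```

Note how a dip from 60 to 59 rendered FPS immediately halves the displayed rate to 30, which is exactly why v-sync hides what the system can actually render.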
  3. F2P should be used as something akin to a trial. If you don't pay toward the development and upkeep of this game, why should you have a voice?
  4. lol, yep, so true. Pretty much every place not named in-n-out.
  5. Actually, higher resolution character models won't see a huge hit, the game is CPU bound by animations, scripts, etc.
  6. It's impossible to compare those two screenshots and say without a shadow of a doubt that the Outlander is wearing a lower-res armor model. One is a closeup shot, the other is not. Squinting at the Outlander model, specifically the belt, it is not low res.
  7. No, I am not. The armor detail isn't much different, and as for the faces, one has more texture detail because he is older. I'm not missing the point; you're just making a mountain out of a molehill.
  8. The outlander is just a generic character, you can customize him/her to look how you want.
  9. The game doesn't use four cores; Windows is just distributing the load between multiple cores. Also, the reason you specifically see an increase in FPS when giving the system access to more cores is AMD's FX architecture. Unlike Intel, where every core has an FPU, the FX line uses a module layout: the CPU has 4 modules, with each module containing two integer cores that share one FPU. That is why, in single-threaded applications, Intel dominates AMD even when clocked significantly lower.
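The scheduler behavior described in point 9 can be sketched with some simple arithmetic: one fully busy thread bounced between cores by the OS shows a little activity on every core, but the aggregate utilization still tops out at one core's worth. This helper is purely illustrative (the name is mine, not from any tool).

```python
def aggregate_utilization(busy_threads: int, cores: int) -> float:
    """Overall CPU % when the given number of fully-busy threads run on
    a machine with the given core count. One thread migrated across
    4 cores still does only one core's worth of work."""
    return min(busy_threads, cores) / cores * 100

print(aggregate_utilization(1, 4))  # 25.0 -> load looks 'spread', still one thread
print(aggregate_utilization(4, 4))  # 100.0 -> genuinely multi-threaded
```

Seeing ~25% total on a quad-core with activity on every core is the classic signature of a single-threaded workload, not proof the game uses four cores.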
  10. +1. Many top GPUs with excellent non-reference cooling vent little or none of their exhaust out of the case.
  11. You said P95 is the go-to stress test, and I said Linpack causes more stress. Yes, the time dedicated to stress testing is another discussion. What does it prove exactly, you ask? My screenshots prove that Linpack w/ AVX causes the CPU to draw more power, produce more heat, and stress the system harder (as evidenced by the VDROOP). The screenshots prove that. How can you say P95 is more stressful when it doesn't cause the same VDROOP? < Can you answer that? This whole thread has turned into off-topic tangents because you keep attacking what I say, yet I provide legitimate proof and you just keep speaking in rhetoric.
  12. You just keep repeating the same question, and I already answered it. Yes, you can make multiple posts saying the same thing, but I've already covered that topic.
  13. Oh, and I'll be happy to replicate the tests with longer stress times, different hardware, etc. You say the word.
  14. I figure you aren't going to shut up about Linpack w/ AVX vs. P95, so I did some small runs for you: Linpack w/ AVX vs. P95 w/ AVX Small FFTs vs. P95 w/ AVX Large FFTs. I ran P95 right after Linpack so there wasn't any bias, and reset the data between runs. Yes, Linpack was a few degrees hotter. Yes, Linpack used a couple more watts. Most importantly, YES, Linpack stressed the system the most, as evidenced by the largest VDROOP... Linx: http://i.imgur.com/xiTamLW.png P95 Small FFT: http://i.imgur.com/JVyGApC.png P95 Large FFT: http://i.imgur.com/4DK7xMM.png You can argue I manipulated the temperature of my WC loop (I didn't; all runs were at my 24/7 silent settings), you can argue I manipulated the ambient temperature (I didn't; all tests ran within ±1 degree Fahrenheit), you can argue a few extra watts don't matter (whatever), but what you CANNOT argue is the VDROOP difference.
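For readers unfamiliar with the VDROOP argument being made here: droop is simply the drop from the idle/set vcore to the voltage measured under load, and a heavier load draws more current and droops further. A minimal sketch of that comparison, using hypothetical voltage readings for illustration (not the values from the screenshots above):

```python
def vdroop_mv(idle_v: float, load_v: float) -> float:
    """Voltage droop under load, in millivolts: the drop from the
    idle/set vcore to the loaded vcore. More droop implies more load
    current, i.e. a heavier stress on the power delivery."""
    return round((idle_v - load_v) * 1000, 1)

# Hypothetical readings for illustration only:
print(vdroop_mv(1.250, 1.232))  # 18.0 mV under a lighter load
print(vdroop_mv(1.250, 1.220))  # 30.0 mV under a heavier load
```

The post's claim is that the workload producing the larger droop is, by definition, pulling more current through the VRM, which is why it treats droop as direct evidence of stress.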
  15. I've already addressed that topic before; you must have skipped over those posts. As for Linx WITH AVX vs. P95, you obviously have your outdated opinion and nothing is going to change that. Yes, P95 was the go-to stress test back in the day... Not anymore. Using Intel's Linpack (OCCT, Linx, etc.) and making sure AVX extensions are enabled is the best way to ensure stability nowadays. P95 hasn't stressed or heated up the CPU as well as Linpack since Sandy Bridge came out. I've come to this conclusion from empirical evidence overclocking Intel CPUs that are AVX compatible. P95 stable doesn't mean Linpack w/ AVX stable.
  16. Linx is more effective than Prime; this is fact. Linx was also the first to add AVX extensions, something P95 eventually followed through on. I've had overclocks do fine on P95 and lock up within 60 seconds using Linx w/ AVX. It seems every thread that comes up about PC performance, you want to go round and round. I've been on EOCF and XS for over 10 years and have recorded many impressive overclocks with cooling ranging, like I said, from air to phase change. Look me up on those forums: Urbanfox. < 10+ years of internet history for my resume; what's yours?
  17. Join an overclocking forum like EOCF, XS, or OCN. You'll gain valuable knowledge that this forum can't teach you and get a nice overclock.
  18. Well, my experience ranges from air to phase change cooling (which I highly doubt more than 2-3 people on this forum, if that, have). I've hit significant overclocks on many platforms and am known on both EOCF and XS. Yes, you can get a stable OC in 15-30 minutes. The days of having to run Prime95 for 8+ hours are over. 30 minutes of Linx w/ AVX instructions is all you need, and I have built several overclocked high-end systems with that regimen to prove it.
  19. You're missing the point... With top GPUs today, SWTOR is NOT GPU limited at 4K; it's still CPU limited... I don't know how to explain that any differently for you. This game will NOT fully utilize even a 970 at 2560x1440...
  20. It's not turbo boost then; it's the motherboard overclocking the CPU. Regardless, his argument is that buying a new CPU is easier than just overclocking the one you have. I fail to see how that is "easier", especially since the 4790K is only available on 1150 systems, leaving 1155 platform users with nothing... except overclocking.
  21. Yes, some motherboards will auto-overclock the CPU... But that still isn't stock... You're talking semantics now.
  22. Yes, 4K uses more VRAM... but SWTOR isn't VRAM intensive anyway... At 2560x1440, SWTOR doesn't even fully utilize one 970; it's still CPU bound with a 4.5 GHz Ivy Bridge CPU. So he can raise the resolution all he wants; it won't be significantly detrimental until his GPU is maxed out.
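Some rough arithmetic backs up the "4K uses more VRAM, but not that much" point: the render targets themselves are small compared to a card's total VRAM, since textures and geometry dominate real usage. A back-of-the-envelope sketch, assuming a 32-bit (RGBA8) swap chain with two buffers; the function is illustrative, not a measurement of SWTOR.

```python
def framebuffer_mib(width: int, height: int,
                    bytes_per_pixel: int = 4, buffers: int = 2) -> float:
    """Approximate VRAM for the swap chain alone:
    width x height x bytes-per-pixel per buffer, in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(round(framebuffer_mib(2560, 1440), 1))  # 28.1 MiB at 1440p
print(round(framebuffer_mib(3840, 2160), 1))  # 63.3 MiB at 4K
```

Even doubling the pixel count from 1440p to 4K adds only a few tens of MiB for the framebuffers themselves; on a 4 GB card, that alone won't be the limiting factor.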
  23. What cards? There is no SWTOR SLI profile... Not that SLI matters anyway, since SWTOR is CPU bound, not GPU bound. The game doesn't even fully utilize one of my 970s until 5K.
  24. Buy level 53 gear on the GTN, then level 56 gear on the GTN, then use basic comms for Yavin/Ziost gear vendors at level 60.
  25. By bandwidth I assume you mean VRAM usage, of which SWTOR uses very little.