
SuspectDown

Members
  • Posts

    286
  • Joined

Posts posted by SuspectDown

  1. Yesterday I came to the odd realization that this is my life currently: work, SWTOR, and marriage squeezed in there somewhere.

     

    I literally thought to myself "this is my entire day everyday". How do you guys balance your days around the game?

     

    I try to balance it by playing mostly when my wife isn't home, or on work days by limiting the time I spend on the PC to 1-2 hours a day.

  2. The game looks kinda dated in my opinion. It's much newer than WoW but does look worse.

    I love the game, but really would like it to test my 980Ti on 4k. It has to make me want to buy a second card :D

     

    Your 980 Ti won't even be fully utilized. SWTOR with a 970 is good up to 5K, maybe slightly more; GPU-wise, your 980 Ti would be good for even higher than 5K.

     

    Well, even with updates, the main issue is poor optimisation.

     

    I have a decent system, I think: i7-5820K, 16 GB DDR4, 1x GTX 980, and Windows 10 nowadays, but on Windows 8.1 it ran exactly the same IMHO.

     

    In warzones the very steady 60 FPS will drop to 40 FPS (vsync on).

     

    Ridiculous tbh.

    I can't even imagine how people with lesser or older systems manage.

     

    I'd turn v-sync off for SWTOR. Yes, it stops screen tearing above 60 FPS (or whatever your refresh rate is) at the expense of slight input lag, but more importantly, in a game like this where the FPS fluctuates rapidly, vsync will keep your displayed FPS below what your system is actually capable of rendering under your refresh rate.

     

    For example, if you have vsync on and your refresh rate is 60Hz:

     

    Above 60 FPS: 60 FPS

    30-59 FPS: 30 FPS

    20-29 FPS: 20 FPS

    15-19 FPS: 15 FPS

     

    So if your FPS dips to even 59 FPS, you'll actually be running at 30 FPS with vsync on.
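    The snapping behaviour in that table can be sketched in a few lines (Python; `vsync_fps` is a hypothetical helper name, and this assumes classic double-buffered vsync with no triple buffering or adaptive sync):

```python
import math

def vsync_fps(render_fps, refresh=60):
    """Effective displayed FPS with double-buffered vsync: each frame is
    held on screen for a whole number of refresh intervals, so output
    snaps down to an integer divisor of the refresh rate (60, 30, 20, 15...)."""
    if render_fps >= refresh:
        return refresh
    intervals = math.ceil(refresh / render_fps)  # refresh ticks each frame is held
    return refresh / intervals

print(vsync_fps(75))  # 60
print(vsync_fps(59))  # 30.0
print(vsync_fps(25))  # 20.0
```

    So a system rendering 59 FPS displays 30, exactly as described above.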

     

    This game already performs like crap at times; that's all we need, more demanding graphics.

     

    Higher resolution textures probably won't hurt performance, because this game is still CPU bound.

  3. Nope, the resolution of the textures is higher. They've come out and said everything post KOTFE will have updated graphics. And that armor detail is MUCH higher...mostly the resolution of the textures

     

    It's impossible to compare those two screenshots and say without a shadow of a doubt that the Outlander is wearing a lower-res armor model. One is a closeup shot, the other is not. Squinting at the Outlander model, specifically the belt, it is not low res.

  4. The game does use up to 4 cores, it just doesn't use them fully for some reason.

    I did an experiment where I gave SWTOR access (via Task Manager) to first one core, then 2, 3, 4, and finally all 8. Using 1, 2, or 3 cores made a big difference in FPS; the 4th core was still noticeable, and the difference between 4 and 8 cores was well within the margin of error (0-5 FPS).

    The whole time, SWTOR only peaked at about 40% CPU usage in split-second spikes and normally sat around 20-25%, which looked like it was using only 2 cores.

     

    Testing build: AMD FX-9590 (8x 4.7 GHz), 24 GB (3x 8 GB) DDR3-1600 Corsair XMS3 RAM, MSI GTX 980 Ti GAMING 6G.

    SWTOR and Windows 7 are stored on separate SSDs, and the CPU stayed under 50°C the whole time thanks to watercooling.

     

    I have no idea what is actually going on there...

     

    TL;DR: SWTOR supports 4 cores in a kind of weird way.

     

    The game doesn't really use four cores; Windows is just distributing the load between multiple cores. Also, the reason you especially see an increase in FPS when giving the game access to more cores is AMD's FX architecture. Unlike Intel, where every core has its own FPU, the FX line uses a module layout: the CPU has 4 modules, each with two integer cores sharing one FPU. That is why, in single-threaded applications, Intel dominates AMD even when clocked significantly lower.
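    The diminishing returns in that core-count experiment are what Amdahl's law predicts when only part of the frame work parallelizes. A quick sketch (Python; the 75% parallel fraction is a made-up illustrative figure, not measured from SWTOR):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup on n cores when only a fraction of
    the work can run in parallel; the serial remainder caps the gains."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical 75% parallel fraction: big gains up to ~4 cores, then a plateau.
for n in (1, 2, 4, 8):
    print(n, round(amdahl_speedup(0.75, n), 2))
```

    With that assumed fraction, going from 4 to 8 cores adds only ~0.6x more speedup, which would look exactly like "within the margin of error" in an FPS test.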

  5. It's worth noting that video cards that exhaust into the case aren't actually bad things. They're only bad when you're ignorant of that fact and the ramifications.

     

    +1. Many top GPUs with excellent non-reference cooling don't primarily (or at all) vent their exhaust out of the case.

  6. What exactly does that prove?

     

    You said P95 is the go-to stress test and I said Linpack causes more stress. Yes, the time dedicated to stress testing is another discussion. What does it prove exactly, you say? My screenshots prove that Linpack w/ AVX causes the CPU to use more power, produce more heat, and stresses the system harder (as evidenced by the VDROOP). The screenshots prove that.

     

    How can you say P95 is more stressful when it doesn't cause the same VDROOP? Can you answer that? This whole thread has turned into off-topic tangents because you keep attacking what I say, yet I provide legitimate proof and you just keep speaking in rhetoric.

  7. That's the second time you've ignored the point about the ease of overclocking vs. buying, so we'll just assume you gave that point up then. ;)

     

    You just keep repeating the same question, and I already touched on it. Yes, you can make multiple posts saying the same thing, but I already addressed that topic.

  8. Interestingly, since my last post I tried to hunt down any information that would actually prove your point. I really can't find anything; almost everything is to the contrary, in that you shouldn't rely solely on a LinX stress test to prove stability. Perhaps you can provide some evidence that turns your opinion into fact?

     

    Also, Prime95 has used AVX for over a year now, or are you only picking and choosing which information is too old?

     

    Linpack w/ AVX stable doesn't mean it's Prime95 stable either (oh, I'd better add in "w/ AVX" since you seem to love writing that). ;)

     

    Oh, and I'll be happy to replicate the tests with longer stress times, different hardware, etc. You say the word ;)

  9. No, that is purely your opinion..

     

    I figure you aren't going to shut up about Linpack w/ AVX vs. P95, so I did some small runs for you: Linpack w/ AVX vs. P95 w/ AVX Small FFTs vs. P95 w/ AVX Large FFTs. I ran P95 right after Linpack as well so there wasn't any bias, and I reset the data between runs.

     

    Yes, Linpack was a few degrees hotter. Yes, Linpack used a couple more watts. Most importantly, YES, Linpack stressed the system the most, as evidenced by the largest VDROOP...

     

    LinX: http://i.imgur.com/xiTamLW.png

    P95 Small FFT: http://i.imgur.com/JVyGApC.png

    P95 Large FFT: http://i.imgur.com/4DK7xMM.png

     

    You can argue I manipulated the temperature of my WC loop (I didn't; all runs were at my 24/7 silent settings), you can argue I manipulated the ambient temperature (I didn't; all tests ran within ±1 °F), you can argue a few extra watts don't matter (whatever), but what you CANNOT argue is the VDROOP difference.

  10. Maybe in your reply you can also address the original point you made of how overclocking is supposedly easier than buying? Still waiting for your reply on that one before you went off on this tangent.

     

    I've already addressed that topic before, you must have skipped over those posts.

     

    As for LinX w/ AVX vs. P95, you obviously have your outdated opinion and nothing is going to change that. Yes, P95 was the go-to stress test back in the day... not anymore. Using Intel's Linpack (OCCT, LinX, etc.) with AVX extensions enabled is the best way to ensure stability nowadays. P95 hasn't stressed or heated up the CPU as well as Linpack since Sandy Bridge came out. I've come to this conclusion from empirical evidence overclocking AVX-compatible Intel CPUs. P95 stable doesn't mean it's Linpack w/ AVX stable.

  11. From what I've known, to have an overclock that is considered truly stable you really need to run the Prime test. Whilst LinX is great for stress testing temperatures, I wouldn't personally rely on a 30-minute run of it enough to call my overclock completely stable.

     

    Many people would run both, though personally, if I'm going to run a Prime test anyway, I'm fairly confident it will find anything wrong that LinX runs would have. The same cannot be said about running LinX over Prime.

     

    You could argue each to their own, but if you are proclaiming the death of Prime testing, or calling it a thing of the past, you are just ignorant.

     

    LinX is more effective than Prime; this is fact. LinX was also the first to add AVX extensions, something P95 eventually followed through on. I've had overclocks do fine on P95 and lock up within 60 seconds using LinX w/ AVX.

     

    It seems every thread that comes up about PC performance, you want to go round and round. I've been on EOCF and XS for over 10 years and have recorded many impressive overclocks with cooling ranging, like I said, from air to phase change. Look me up on those forums: Urbanfox. 10+ years of internet history for my resume; what's yours?

  12. Yeah, the stock clock on this new one is actually lower than my previous one by 0.2 GHz - I made a mistake and thought the stock clock would be higher than 3.5 GHz; I obviously didn't do all my homework. But since it's already bought and installed, I might as well try to overclock it. I read before buying the CPU that it could be overclocked to as high as 4.6 GHz without causing system instability (maybe that's why I confused its stock speed).

     

    Join an overclocking forum like EOCF, XS, or OCN. You'll gain valuable knowledge that this forum can't teach you and get a nice overclock.

  13. If you think you can judge your overclock as "stable" in 15-30 minutes, then you don't know as much about overclocking as you think you do, and it's no wonder you think it's "easy".

     

    Well, my experience ranges from air to phase-change cooling (which I highly doubt more than 2-3 people on this forum have, if that). I've hit significant overclocks on many platforms and am known on both EOCF and XS.

     

    Yes, you can get a stable OC in 15-30 minutes. The days of having to run Prime95 for 8+ hours are over. 30 minutes of Linx w/ AVX instructions is all you need and I have built several high end systems overclocked with that regimen to prove it.

  14. A full HD 1080p image is 1080 rows high and 1920 columns wide. A 4K image approximately doubles both of those numbers, which requires 4x the number of pixels. That will load the video card somewhat; it will require more resources even if it is just sending a bar code, much less SWTOR, and it will affect your performance. It will most likely be a great trade-off. It does not matter what SWTOR loads; the video card has to drive the monitor. Try setting your setup to 1080p and it will most likely fly like a beast. You will probably see drivers coming out in the near future that improve things even more.

     

    You're missing the point... With today's top GPUs, SWTOR is NOT GPU limited at 4K; it's still CPU limited... I don't know how to explain that any differently for you. This game will NOT fully utilize even a 970 at 2560x1440...
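    For reference, the "4x the pixels" arithmetic from the quoted post checks out (Python; `pixel_count` is just an illustrative helper):

```python
def pixel_count(width, height):
    """Total pixels the GPU must render per frame at a given resolution."""
    return width * height

full_hd = pixel_count(1920, 1080)  # 2,073,600 pixels
uhd_4k = pixel_count(3840, 2160)   # 8,294,400 pixels
print(uhd_4k / full_hd)            # 4.0
```

    Whether that 4x pixel load actually matters depends on where the bottleneck is, which is the disagreement here: more pixels only cost FPS once the GPU, not the CPU, is the limiting factor.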

  15. It's still a valid point and an explanation as to why you and Menace seem to disagree on the effects of Turbo Boost.

     

    It's not Turbo Boost then; it's the motherboard overclocking the CPU. Regardless, his argument is that buying a new CPU is easier than just overclocking the one you have. I fail to see how that is "easier", especially since the 4790K is only available on LGA 1150 systems, leaving LGA 1155 platform users nothing... except to overclock.

  16. That depends on your setup. Some motherboards will put all 4 cores at the Turbo Speed when XMP or Multicore Enhancement is enabled and several of these motherboards have these settings enabled by default.

     

    Yes, some motherboards will auto-overclock the CPU... But that still isn't stock... You're talking semantics now.

  17. Look at the amount of data in a 4K signal, all of which goes through your video card, and compare that with the much smaller data packet for lower resolutions. What do you think will happen? 4K has a much larger video signal budget than 1080p; it is going to slow things down. The beefy video cards will help, but only so much. It doesn't matter what SWTOR uses; a 4K signal is much larger than 1080p.

     

    Yes, 4K uses more VRAM... but this game isn't VRAM intensive anyway... At 2560x1440, SWTOR doesn't even fully utilize one 970; it's still CPU bound with a 4.5 GHz Ivy Bridge CPU. So he can up the resolution all he wants; it won't be significantly detrimental until his GPU is maxed out.

  18. Then why do I notice a significant improvement in performance when I enable SLI?

     

    What cards? There is no SWTOR SLI profile... Not that SLI matters anyway, since SWTOR is CPU bound, not GPU bound. The game doesn't even fully utilize one of my 970s until 5K.

  19. Hi, I'm currently level 53 on my highest character and wearing comms gear (for the mid-to-late 40s). Should I get myself some gear at 55? If so, what vendor/gear should I get, or should I try to hold out till 60?

     

    Buy level 53 gear on the GTN, then level 56 gear on the GTN, then use basic comms for Yavin/Ziost gear vendors at level 60.

  20. Look at the bandwidth requirements for 4K, then go look at the 1080p requirements. Go set your display to something other than 4K and watch what it does for your FPS. There is a cost to 4K.

     

    By bandwidth I assume you mean VRAM usage, of which SWTOR uses very little.
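    If "bandwidth" instead meant the raw display-link signal, that back-of-the-envelope figure is easy to compute (Python; `link_rate_gbps` is an illustrative helper that ignores blanking intervals and assumes 24-bit colour):

```python
def link_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video signal rate in Gbit/s (no blanking overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(round(link_rate_gbps(1920, 1080, 60), 2))  # 2.99
print(round(link_rate_gbps(3840, 2160, 60), 2))  # 11.94
```

    The 4K signal is indeed 4x larger, but driving the link is the display controller's job, not the shader cores', so it doesn't by itself explain in-game FPS drops.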
