SWTOR On GTX 680 "video inside"


BizzzIMO


The main problem with fps I experience is when I have more than 10 people on the screen, either in WZs or in raids, doing things like moving around or using their abilities. In 8-man it's relatively smooth (although it's 20 fps ... yeah, beggars can't be choosers), but in 16-man the frames are just stuck at a frikkin' 10 fps or even less. Bleh ... in about a month I'm getting an i5 2500K and a Z77 Gigabyte mobo. If the game doesn't drastically improve, I'll just have to give up, it's as simple as that, and hope that my guild won't be converting to 16-man anytime soon. *sigh* I certainly don't have the money to buy an overpriced and needless GTX 680 and i7 for gaming purposes, and certainly not for JUST THIS GAME.

 

Yeah, you definitely don't need to spend a fortune. I spent $300 on a new board (Biostar A880GZ), CPU (AMD FX-8120), and 8 GB of RAM, reused the old 8800GT from my old tower, and I could run SWTOR nearly maxed out and still get decent fps (40-50). If you plan on picking up an i5 and a new board you should see huge improvements, as long as you have at least 4 GB of RAM and a decent graphics card.


Well, you did say I wasn't getting 240 fps on a 240 Hz TV. :cool:

I did say that, because that's not computer-rendered 240 fps.

Some quasi-TV-calculated voodoo is not rendered 240 fps.

 

If that were the case, I could use a GPU that does 10 fps and let the TV 'render' the rest to smooth it out.

Unfortunately, it doesn't work that way.

I think the lad you are replying to believes refresh rate generates frame rate, from the look of his posts. If only I could get hold of one of these magic framerate-generating monitors.

All I said was my TV can do 240 fps without screen tearing. It adds frames to make the picture more fluid.

 

My point was you can see a difference: when the TV does the upscale for movies it looks fake, but when it does it in a video game it looks smoother.

Refresh rate doesn't generate frames. It refreshes the screen at high speed, which means that if your system can run at 240 fps and your monitor can refresh at 240 Hz, you won't get screen tearing. But it does not ADD or GENERATE frames; only your GPU can do that.

Motion interpolation is used in various display devices such as HDTVs and video players, aimed at alleviating the video artifacts introduced by framerate conversions in fixed-framerate displays such as LCD TVs. Films are recorded at a frame rate of 24 frames per second (frame/s) and television is typically filmed at 25, 50, 30 or 60 frames per second (the first two being PAL, the other two NTSC). Normally, when a fixed-framerate display such as an LCD screen is used to display a video source whose framerate is less than that of the screen, frames are simply duplicated as necessary until the timing of the video matches that of the screen, which may introduce a visual artifact known as judder, perceived as "jumpiness" in the picture. Motion interpolation intends to remedy this by generating intermediate frames that make animation more fluid.

 

While common, not all 120 Hz HDTVs include a motion interpolation feature. Also, anti-judder technology is not the same as motion-blur-reducing technology, but it is frequently lumped together with it.[1]

 

The commercial name given to motion interpolation technology varies across manufacturers, as does its implementation.

 

SOAP OPERA EFFECT

 

The "video" look is a byproduct of the perceived increase in framerate due to the interpolation and is commonly referred to as the "Soap Opera Effect" after the way those shows looked, having been shot on cheaper 30 fps video instead of regular broadcast equipment or film.[16] Some complain that the effect ruins the theatrical look of cinematic movies.[10] Others appreciate motion interpolation, as it reduces the motion blur produced by camera pans and shaky cameras and thus yields better clarity of such images. For this reason, almost all manufacturers have built in an option to turn the feature off. The soap opera effect can also be known as "Judder adjustment" or "Judder Removal".[17] This "video look" is created deliberately by the VidFIRE technique to restore archive television programs that only survive as film telerecordings.[18]

My TV is magic. Leprechauns hide in my couch, a genie is in my remote.

 

I still get 240 fps. :cool:

Edited by Vulgarr
emphasis on the bolded

It repeats the same picture (frame) in the sequence until it matches the Hz of the TV or monitor.
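That repetition is straightforward to sketch. Here is a minimal Python illustration (assuming plain nearest-frame duplication with vsync and no motion interpolation; the function and variable names are made up for this sketch) of how 24 fps source frames get mapped onto a 60 Hz display:

```python
# Toy sketch: a display repeating source frames until the timing
# matches its refresh rate (no interpolation, vsync assumed).
from collections import Counter

def scanout_schedule(source_fps, display_hz):
    """For each refresh in one second, the index of the source frame shown."""
    return [int(i * source_fps / display_hz) for i in range(display_hz)]

schedule = scanout_schedule(24, 60)
counts = Counter(schedule)                    # refreshes spent on each frame
cadence = [counts[f] for f in sorted(counts)]
print(cadence[:8])  # [3, 2, 3, 2, 3, 2, 3, 2] -- the uneven 3:2 cadence
print(len(counts))  # 24 -- duplication repeats images, it adds none
```

Note how the hold time alternates 3 refreshes, 2 refreshes, 3, 2: that uneven cadence is exactly the "judder" that motion interpolation tries to smooth over, and either way the viewer still sees only 24 distinct images per second.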

Actually, the human eye can only perceive up to 80 fps, or a frequency of 80 Hz, and that is for people with excellent eyesight. The reason most CRTs used to refresh at 50-60 Hz was that this was deemed adequate for the human eye, and most electrical supply was rated at that.

 

Furthermore, all the human eye needs for a full-motion image is 24 fps. Frame rates above 60 fps are lost on most human beings. So there is no tangible difference (considering the limits of the human eye) between 50-60 fps and 5000-6000 fps, except the technical prowess of the machine rendering the image.

 

Your eyes will not be able to discern the difference. If you can, you need to submit yourself for tests; you might be the next step in human evolution. =)

 

You are so wrong with everything you've posted that I'm not sure where to start correcting it. It's amazing how many people are posting misinformation while attempting to marginalize a legitimate problem. Obviously you fail to grasp the difference between a computer display and a movie screen, based on your 24 fps remark. I suggest you research the subject before posting on it. It is well studied that the human visual system can and does recognize frame rates much higher than 60. Just because you don't notice a difference between 24 fps and 60 doesn't mean there isn't one. To many of us, it's glaringly apparent even without the in-game counter or FRAPS.

These "experts" in this thread aren't as "experty" as they thought they were.

People do realize that a 50-60 framerate is quite fine, right? Actually, it's better than getting 100 frames per second; there's not much reason for a video card to work harder than it should.

 

Your monitor's refresh rate is probably 60 Hz, and the human eye will not notice any stuttering motion at this rate most of the time. Really, anything less than 30 is when things start to go bad, at least when there is a lot of movement.

11 is louder than 10. Haven't you watched Spinal Tap? Sheesh... :rolleyes:

More recently, refresh rates of 120 Hz, 240 Hz, and even 480 Hz are advertised. To achieve these refresh rates, televisions do not have super-powered processors; rather, they have software engineered to interpolate motion. Motion interpolation software takes information from frame A and frame B, and averages the brightness and color coordinates in each frame to create a third frame between the two.

 

The television creates new pictures and inserts them between frames to add smoothness to your picture.

 

source: http://www.televisioninfo.com/News/Cleaning-up-the-Soap-Opera-Effect-Motion-Interpolation-and-why-480Hz-looks-terrible-on-your-new-TV.htm
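The frame-averaging described in that article can be sketched in a few lines. This is only a toy illustration (a real TV uses motion-compensated interpolation, not a plain per-pixel average; `interpolate` and the sample pixel values here are made up):

```python
# Toy sketch of "averaging frame A and frame B" to create a frame
# between the two. Frames are flat lists of pixel brightness values.

def interpolate(frame_a, frame_b, t=0.5):
    """Linearly blend two frames at position t (0 = frame A, 1 = frame B)."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

frame_a = [0, 100, 200]   # pixel brightness values in frame A
frame_b = [50, 100, 100]  # pixel brightness values in frame B
middle = interpolate(frame_a, frame_b)
print(middle)  # [25.0, 100.0, 150.0] -- the inserted in-between frame
```

Each interpolated pixel sits halfway between the two source pixels; real implementations also estimate motion vectors so moving objects land in the right place instead of ghosting.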

 

I was given three wishes. My first wish was for an MMO that plays like crap no matter the hardware.

 

My second wish was for fanboys to lie and spread misinformation about the MMO and things they think they know everything about.

 

My third wish was for my magical 50-inch 3D LED 240 Hz TV that magically makes frames, to run at 240 fps.

 

That was from the leprechaun that lives in my couch. I still haven't used up the genie that lives in my remote; saving him for Guild Wars 2.


The GTX 680 is a new card with a new architecture. These kinds of performance surprises often happen until a couple of updated driver releases come out.

I would agree with this if Precision X weren't reading a 400 GPU clock speed. It goes to its max potential in games like Battlefield 3, but never gets close to maxed out in TOR.

