
Monitor framerate lag: PC vs Console


posted on Nov, 24 2013 @ 06:55 PM
What is the frame lag for gaming consoles compared to PC games?

How much lag is noticeable to the player?



posted on Nov, 24 2013 @ 07:06 PM
reply to post by John_Rodger_Cornman
 


The PC always wins. There are already video cards out that are better than the one inside the new consoles.



posted on Nov, 24 2013 @ 07:12 PM
You would think that with consoles there would be no lag. The good thing about designing games for consoles is that you know the limits of the system you are making the game for. With PCs there is a better chance of lag, because the person making the game doesn't know the exact system it's going to be running on.

Usually you will notice lag if the frame rate drops below 30 fps.
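
To put rough numbers on that (a quick Python sketch; the 30 fps line is a rule of thumb, not a hard threshold):

```python
# Rough frame-time math: one frame lasts 1000 ms / fps.
for fps in (20, 30, 60, 120):
    print(f"{fps:>3} fps -> {1000.0 / fps:.1f} ms per frame")
# 30 fps leaves ~33 ms per frame, 60 fps ~16.7 ms; below 30 the gaps
# between frames get long enough for most people to notice.
```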



posted on Nov, 24 2013 @ 07:26 PM
FPS is a little more important than some realise.
It depends on how the developers set it all up, but here's an example of why high fps can matter: a couple of years ago, people with high fps found they could jump higher than those with low fps!
People will say anything above 25 to 30 frames is fine because the human eye can't detect the difference beyond that (persistence of vision), but try telling that to serious gamers and they'll quickly tell you it's bull, and they're right.
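
How can fps change how high you jump? Here's a toy illustration (a generic per-frame integrator of my own, not any particular engine; real engines differ in the details, but the direction of the effect shows up even here):

```python
# Toy per-frame jump simulation (semi-implicit Euler): the integration error
# depends on the step size, so the peak height changes slightly with frame rate.
def peak_height(fps, v0=5.0, g=9.8):
    dt = 1.0 / fps
    y, vy, peak = 0.0, v0, 0.0
    while vy > 0 or y > 0:
        vy -= g * dt          # apply gravity for this frame
        y += vy * dt          # then move
        peak = max(peak, y)
    return peak

for fps in (30, 60, 125):
    print(f"{fps:>3} fps -> peak {peak_height(fps):.4f} m")
# Higher fps integrates the arc more finely, so the peak creeps upward.
```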

PCs not only have higher frame rates, they also have higher resolutions. A good PC with a good graphics card will put a console to shame.



posted on Nov, 25 2013 @ 09:11 AM
reply to post by John_Rodger_Cornman
 


*This opinion is only about the lag issue between the two and nothing else.*

I've played many multiplatform games... I gotta say the console is less laggy, because games are modelled around it, so the compatibility is high.

I played an MMO back in the day on both PS2 and PC (my PC was much more advanced than the PS2, which was at least 4 years old at the time), and I saw that the reaction time in crowded areas, or when claiming monsters, was much faster on the PS2 than on the PC.

A PC has lots of other software and issues that might cause lag, unless it's a PC with only games and no RAM-eating software running.

To each his own.



posted on Dec, 8 2013 @ 08:58 PM
reply to post by luciddream
 


What I am talking about is the display lag to the television or monitor, not the framerate difference between consoles and PC.


Example: PC / VGA card --> computer monitor

compared to...

Console / integrated VGA card --> television



posted on Dec, 15 2013 @ 03:57 PM

VoidHawk
FPS is a little more important than some realise.
It depends on how the developers set it all up, but here's an example of why high fps can matter: a couple of years ago, people with high fps found they could jump higher than those with low fps!
People will say anything above 25 to 30 frames is fine because the human eye can't detect the difference beyond that (persistence of vision), but try telling that to serious gamers and they'll quickly tell you it's bull, and they're right.

PCs not only have higher frame rates, they also have higher resolutions. A good PC with a good graphics card will put a console to shame.


Never understood the "people can't see anything above 25-30 fps" thing... I can, and always have. Anything below 50 and I start to see the difference; anything below 40 and it starts to annoy me. The difference to my eyes, and from a gameplay perspective, between 50+ and 30-40 is staggeringly noticeable (it's like watching one of those very early black-and-white films where the projector was hand cranked: the speed and fluidity of the image looks inconsistent and things start to jump around).

Don't forget things like screen tearing, where the fps is higher than the refresh rate of the monitor and you get frames where the top half of the screen is the previous/current frame and the bottom half is the next frame. People seem to hate screen tearing; me, I don't care at all, and in some ways I actually like seeing it, since it means what I'm playing is running at the best fps I can get. It's hardly off-putting, since you only really notice it when the view is changing rapidly, such as turning around or looking up very quickly... and in those situations you aren't able to focus properly on what's on screen anyway (although that's probably why people notice it more then). This is what vsync is for: it synchronises frame output to the monitor's refresh, which removes tearing but can throttle your FPS unnaturally, causing your performance to drop when it doesn't have to.
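
For anyone curious what that switch looks like in code, here's a minimal sketch using the Python GLFW bindings (my choice of library purely for illustration; only the swap interval value matters here):

```python
# Minimal GLFW loop: swap_interval(1) waits for the display refresh (vsync on),
# swap_interval(0) swaps immediately (uncapped fps, tearing possible).
import glfw

glfw.init()
window = glfw.create_window(640, 480, "vsync demo", None, None)
glfw.make_context_current(window)

glfw.swap_interval(1)    # vsync on; change to 0 to allow tearing

while not glfw.window_should_close(window):
    # ... draw the frame here ...
    glfw.swap_buffers(window)   # with interval 1 this blocks until the next refresh
    glfw.poll_events()

glfw.terminate()
```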

For me, if the fps is a constant 60 everything is fine, or at the very least equal to your monitor's refresh rate; fps higher than that value is pointless, since you aren't going to see the difference; your monitor can't display it anyway. Which is why I laugh at people saying "I get 100+ fps in X". Sure, as a number saying your GPU/CPU can push the graphics engine that fast it's fine, but to say your play experience is better because of it is false. The only real benefit is headroom: with 100 fps on a 60 Hz monitor you have a 40 fps buffer for frame-rate spikes, so the fps would have to drop by 40+ before you noticed anything but buttery smooth visuals.
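
Capping the frame rate yourself is trivial too; a rough sleep-based pacer (just a sketch assuming a 60 Hz target, real engines time frames more carefully):

```python
# Simple sleep-based frame cap: no point rendering faster than the display
# can show, so pace the loop to the assumed 60 Hz refresh.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def run(frames=300):
    for _ in range(frames):
        start = time.perf_counter()
        # ... simulate and render one frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # give the leftover time back

run()
```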

The problem with consoles is that for a very long time they've been tied to TV resolutions, which until this decade meant standard-definition CRT sets of only a few hundred visible lines (yes, your old cathode-ray TV really was that low-resolution). With LCD TVs, HD and digital now the norm, higher resolutions matching the bottom end of PCs are possible for consoles. The thing is, console GPUs, from my understanding, are nowhere near the level of a PC's (the same way a laptop GPU is often inferior to a desktop's), so for cost and manufacturing reasons the GPU in a console is weaker than a PC's, yet they are trying to push "console standard" resolutions... and it just doesn't work very well. A lot of console games are still 720p (that is, 720 pixels of vertical resolution), which works well enough for current consoles, but on a modern LCD with a larger vertical resolution, such as a 1080p HD TV, the TV has to upscale the image... which is just ugly.

As an aside, something I found out recently is that many console first-person shooters, like COD or anything that needs quick, precise targeting, have fuzzy aiming built in: the game registers hits you'd otherwise miss if you played it on PC. They do this to get around the limitations of a console controller compared to a PC's mouse-and-keyboard setup. Why would I want to play a game on a system that 'adjusts' my gameplay to make up for its shortcomings?
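
On the upscaling point, the numbers show why it can't be clean (quick sketch, assuming a 1280x720 render shown on a 1920x1080 panel):

```python
# Scale factor when a 1280x720 render is shown on a 1920x1080 panel.
src_w, src_h = 1280, 720
dst_w, dst_h = 1920, 1080
print(dst_w / src_w, dst_h / src_h)  # 1.5 x 1.5: non-integer, so pixels get blended/blurred
```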

One thing I'd never experienced until now is micro-stutter. I recently got Skyrim (yeah, I'm a few years behind everyone else)... and boy is that game taxing my patience. It's fun, but damn if it doesn't have the most demanding pre-setup to get looking good and working right of any game I've ever played on a PC. I get regular stutters where the game keeps playing normally (judging by the sound) but the frames pause for a quarter of a second in some situations (i.e. anything with a humanoid model being preloaded into the game, or visible on screen for a while initially), creating a visual 'stutter' that makes playing, especially in combat, VERY hard (like watching a slide show with a quarter-second interval; the thing is, it's not like very low FPS, which I can handle or avoid)... Supposedly it's because the FPS is higher than the refresh rate of the monitor, which is bollocks in my opinion, since I've never had the issue with max+ FPS in other games; but supposedly it's a fairly new issue with a lot of modern games... supposedly... Given that I get it even in situations where my fps is well below the 30s, I think the official explanation is wrong.

So anyway... consoles are inferior to PCs, and always have been. They've bridged the gap somewhat in the last few years, but they just can't catch up, and by design they never really will (the recent resolution-wars debate is a good example). Modern consoles are trying to be PCs, and in doing so they have stopped being consoles (and lost the good things that used to entail), while also trying to be PCs and media centres... and they fail at all of that.

Consoles are/were about ease-of-use gaming... yet consoles in general are becoming more and more like PC gaming, in that you have to install the damn games, patch them, etc. etc. Ultimately, why not just buy a PC with a decent GPU (heck, the current generation of GPUs is fairly cheap and runs pretty much everything at the moment perfectly, and will do so well enough for at least the next 3 or so years)? It's cheaper in the long run and can be used for other things.

So yeah, sorry for all that... a PC with a half-decent GPU (i.e. any released in the last few years) is, and always will be, better than a console. The last console I owned (and still do) is a PS2, and I've played and/or owned just about every console ever produced since the Atari... wouldn't touch any of the modern ones personally.


John_Rodger_Cornman
reply to post by luciddream


What I am talking about is the display lag to the television or monitor, not the framerate difference between consoles and PC.


Example: PC / VGA card --> computer monitor

compared to...

Console / integrated VGA card --> television


Ummmm... you're talking about thousandths of a second, if that; it's an electrical signal running up a cable, and you aren't going to get any sort of lag from that part of the setup unless your TV is a couple of hundred miles from where you are sitting. That's like asking what the lag is between turning on a light with the wall switch versus pushing the button on a torch. Makes no sense to ask...
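
For scale, a rough back-of-envelope on the cable run alone (assuming a couple of metres of copper and a typical propagation speed):

```python
# Back-of-envelope: signal travel time down a couple of metres of cable.
cable_length_m = 2.0
signal_speed = 0.7 * 3.0e8        # rough propagation speed in copper, ~0.7c
delay_s = cable_length_m / signal_speed
print(f"{delay_s * 1e9:.1f} ns")  # ~9.5 ns, invisible next to a 16.7 ms frame
```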




posted on Dec, 16 2013 @ 11:10 PM
reply to post by BigfootNZ
 

Hi BigFootNZ

I agree with everything you said except


For me, if the fps is a constant 60 everything is fine, or at the very least equal to your monitor's refresh rate; fps higher than that value is pointless, since you aren't going to see the difference; your monitor can't display it anyway.

The reply I made to the OP was to make the point that higher fps (even higher than the refresh rate) can make a difference. It's down to the way the developers structure their code. This was proven to me by my brother, who's a serious gamer: he stood by a wall and rocket-jumped, then increased the fps and repeated the jump, and he was able to jump higher! He increased the fps even more and jumped higher still!

But yes, other than that I agree with the rest of your post; myself, I turn off vsync and limit fps to 60.

How far back do you go? I started on Quake (the original), then Q2, Q3, Unreal Tournament, the Half-Lifes, etc. Don't do much now, but I often watch my brother, who's seriously into it.



