Max FPS

Posts: 13 | Likes: 1
Just wanted to say - idk if you're already aware of this, but I have a GTX 970, an i5-4690K and 8GB of G.Skill RAM. The problem is that when I set my FPS to 0 (= uncapped), it starts to lag like shit - and oh yes, I mean the internet lags, it's like a new lag script, ya. :D
 

ent

Movie Battles II Team
Posts: 848 | Likes: 390
That usually happens on servers with high ping (>100) when your FPS is too high. Unfortunately I can't say what exactly causes it right now, but here is the solution.
You don't actually need more than 100 FPS for this game, since we send up to 100 user commands (moving, attacking etc.) per second to the server (cl_maxpackets 100), and a user command is generated on each client frame.
Cap to 100 - that's what I suggest.
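If it helps, this is roughly how that cap looks in an autoexec.cfg - a minimal sketch, assuming the standard Q3-engine cvar names:

seta com_maxfps "100"      // cap the client to 100 rendered frames (and thus user commands) per second
seta cl_maxpackets "100"   // send up to 100 command packets per second to the server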
 
Posts: 13 | Likes: 1
That usually happens on servers with high ping (>100) when your FPS is too high. Unfortunately I can't say what exactly causes it right now, but here is the solution.
You don't actually need more than 100 FPS for this game, since we send up to 100 user commands (moving, attacking etc.) per second to the server (cl_maxpackets 100), and a user command is generated on each client frame.
Cap to 100 - that's what I suggest.

Yeah, I already knew that, but thanks :D
 

Cat Lady

Movie Battles II Team Retired
Posts: 412 | Likes: 237
You don't actually need more than 100 FPS for this game, since we send up to 100 user commands (moving, attacking etc.) per second to the server.

You actually don't need more FPS than your monitor's refresh rate, because you don't see those frames anyway - you're just burning electricity and graphics card lifetime for no reason. Considering that most people play on 60Hz monitors, I'm always amused by the "FPS race".

The best thing you can do is lock your frame rate to your monitor's refresh rate (yes, vsync) *without* allowing the graphics card to pre-render any frames (= no input lag - easily achievable with nVidia under Linux; I have no idea how it works on windoze, but I have *heard* rumours that this poor excuse for an operating system actually has problems with it and many times enforces pre-rendered frames despite being told not to).

This requires your hardware to be able to match or exceed the monitor's refresh rate *all the time* (if you ever drop below the refresh rate, you automatically drop to half of it, which in the case of a 60Hz monitor means 30fps, and the sudden drop just sucks, noticeably), but it ensures you get output as fluid as your monitor allows without the slightest bit of tearing.

Alternatively, you can opt for a monitor with G-Sync/FreeSync, which allows all of the above plus pre-rendered frames (still without input lag) - but personally I would wait it out until nVidia and AMD stop this silly standards war and the industry adopts one or the other. For now it resembles the Blu-ray/HD-DVD war from almost a decade back a bit too much, which afterwards left the people who had invested in HD-DVD with useless hardware.
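If anyone wants to try that here, a minimal config sketch of the in-game half, assuming the usual Q3-engine cvar names (the "no pre-rendered frames" part is a graphics driver setting, not a cvar):

seta r_swapInterval "1"   // vsync: present frames in step with the monitor's refresh
seta com_maxfps "60"      // example for a 60Hz monitor - match your own refresh rate
// the pre-render limit lives in the driver control panel (e.g. nVidia's "maximum pre-rendered frames")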

/Cat Lady
 

ent

Movie Battles II Team
Posts: 848 | Likes: 390
You actually don't need more FPS than your monitor's refresh rate, because you don't see those frames anyway - you're just burning electricity and graphics card lifetime for no reason. Considering that most people play on 60Hz monitors, I'm always amused by the "FPS race".

The best thing you can do is lock your frame rate to your monitor's refresh rate (yes, vsync) *without* allowing the graphics card to pre-render any frames (= no input lag - easily achievable with nVidia under Linux; I have no idea how it works on windoze, but I have *heard* rumours that this poor excuse for an operating system actually has problems with it and many times enforces pre-rendered frames despite being told not to).

This requires your hardware to be able to match or exceed the monitor's refresh rate *all the time* (if you ever drop below the refresh rate, you automatically drop to half of it, which in the case of a 60Hz monitor means 30fps, and the sudden drop just sucks, noticeably), but it ensures you get output as fluid as your monitor allows without the slightest bit of tearing.

Alternatively, you can opt for a monitor with G-Sync/FreeSync, which allows all of the above plus pre-rendered frames (still without input lag) - but personally I would wait it out until nVidia and AMD stop this silly standards war and the industry adopts one or the other. For now it resembles the Blu-ray/HD-DVD war from almost a decade back a bit too much, which afterwards left the people who had invested in HD-DVD with useless hardware.

/Cat Lady
I also wanted to note that the topic starter can cap his FPS to his monitor's refresh rate (actually the best cap is 77fps, but that's already off-topic), but for the best experience he should still have up to 100 frames for smooth gameplay, because, as I explained above, he then generates many more user commands. This means the server receives more data from him, and therefore more data stays in sync between him, the server and the other clients.
Limiting FPS in this game is more about the input (keyboard, mouse) than about the visual feedback.
 

Cat Lady

Movie Battles II Team Retired
Posts: 412 | Likes: 237
Hm, isn't the input (keyboard, mouse movements) sent to the server without any correlation to what FPS the renderer produces, at least in Q3-engine derivatives? In my limited testing I haven't seen worse sync even with a 10 FPS cap - wasn't it all the rate-and-friends settings on the client that affected it, rather than FPS? (Of course, that doesn't apply to situations where the whole computer slows down because it can't handle the game, but that's a different topic.)
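For reference, these are the client cvars usually meant by "rate & friends" - the values here are only common Q3-era examples, not MB2-specific recommendations:

seta rate "25000"          // max bytes per second the server may send this client
seta snaps "40"            // snapshots requested per second (the server caps this at its sv_fps)
seta cl_maxpackets "100"   // user command packets sent to the server per second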

/Cat Lady
 

ent

Movie Battles II Team
Posts: 848 | Likes: 390
It's the id Tech 3 (game engine) design: the more FPS you have, the smoother you play.
 
Posts: 13 | Likes: 1
Well yeah, I can't play this with the 1k+ FPS I'm getting lol - I capped it to 125 or whatever the max was, I don't remember :D
 

Lessen

pew pew
Movie Battles II Team
Posts: 1,251 | Likes: 995
I just wanna necro the fuck out of this thread to ask a question relevant to it:

Can MB2 be played on a 144Hz screen, at 144fps? Or does going over a certain breakpoint (such as 120 or 125) invariably cause some kind of wacky malfunction?

(I'm definitely gonna get a 144Hz screen soon 'cause I heard the difference between 60Hz and 144Hz is legitimately dramatic once you've adjusted to 144Hz, so I wanna experience the intense smoothness of 144. But I love MB2, so I wanna know if MB2 can handle intense smoovness.)
 

SK5

Moderator
Internal Beta Team
EU Official Server Admin
Posts: 392 | Likes: 555
I just wanna necro the F**k out of this thread to ask a question relevant to it:

Can MB2 be played on a 144Hz screen, at 144fps? Or does going over a certain breakpoint (such as 120 or 125) invariably cause some kind of wacky malfunction?

(I'm definitely gonna get a 144Hz screen soon 'cause I heard the difference between 60Hz and 144Hz is legitimately dramatic once you've adjusted to 144Hz, so I wanna experience the intense smoothness of 144. But I love MB2, so I wanna know if MB2 can handle intense smoovness.)

Yeah, it can handle it perfectly well. When you go over 300 fps it starts freaking out, so just cap it at 250 or something and you should be good to go.
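In the console or autoexec that's the same cvar as earlier in the thread - e.g., as a hedged example for a 144Hz screen:

seta com_maxfps "144"   // match the screen's refresh rate; per the above, anything up to ~250 should behave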
 

AaronAaron

Donator
Posts: 421 | Likes: 799
(I'm definitely gonna get a 144Hz screen soon 'cause I heard the difference between 60Hz and 144Hz is legitimately dramatic once you've adjusted to 144Hz, so I wanna experience the intense smoothness of 144. But I love MB2, so I wanna know if MB2 can handle intense smoovness.)
You're gonna love it
 

eezstreet

Movie Battles II Team Retired
Posts: 242 | Likes: 299
There are physics-related bugs tied to FPS, but they're mostly irrelevant in MB2.
At 333fps you get lower per-frame gravity, which can result in slightly higher jumps and higher air acceleration. 142 and 90 are commonly used for low jumps and/or bunnyhopping. But in MB2 you really won't care about this as much.
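For anyone curious where those magic numbers come from: the engine steps the player physics in whole milliseconds per frame, so the frame time gets rounded - as a rough back-of-the-envelope (assuming the standard Q3 behaviour), 1000/125 = 8 ms, 1000/142 ≈ 7 ms, 1000/333 ≈ 3 ms and 1000/90 ≈ 11 ms. Gravity gets integrated in those integer steps, and the different rounding error at each cap is what nudges jump height and air acceleration slightly up or down.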
 

Tempest

Gameplay Design
Movie Battles II Team
Posts: 731 | Likes: 1,104
I play at 200 FPS and the only real difference I notice is that saber trails get smaller/basically vanish upwards of 125.
 