
#1 2017-03-12 02:18:26

original username
Member
Registered: 2017-03-12
Posts: 14

How to improve Minecraft performance?

I previously used Lubuntu on my 2011 Lenovo G560 laptop, but Minecraft was extraordinarily laggy. When I asked on Reddit, someone recommended trying Arch Linux, so I installed it, and Minecraft now runs better, but the FPS is still low (~20 FPS in a normal Survival mode world). Not unplayable, but still kind of low. I'm 99% sure this is an issue with the graphics chipset or its drivers, because the laptop fan sounds like it's running at about medium speed and not under too much strain. I'm using the latest Arch Linux with the XFCE desktop, the CPU frequency monitor thingy in the menu bar (though it won't let me set it to performance mode), the OpenJDK Java 8 JRE, and Minecraft 1.11.2, on a dual-core Pentium P6200 CPU with 4GB of RAM and the Intel integrated graphics chipset.
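
For reference, I understand setting the governor by hand should look something like this (assuming the cpupower package is installed; I haven't verified it on this machine):

# pacman -S cpupower
# cpupower frequency-set -g performance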

Last edited by original username (2017-03-12 10:44:07)

Offline

#2 2017-03-12 11:53:04

Mr.Elendig
#archlinux@freenode channel op
From: The intertubes
Registered: 2004-11-07
Posts: 4,092

Re: How to improve Minecraft performance?

Buy better hardware or use one of the non-java forks of minecraft.


Evil #archlinux@libera.chat channel op and general support dude.
. files on github, Screenshots, Random pics and the rest

Offline

#3 2017-03-12 15:49:28

seth
Member
Registered: 2012-09-03
Posts: 51,149

Re: How to improve Minecraft performance?

Looking at various minecraft performance reports, it seems to rely heavily on texture fillrate (which, given the basic design, means it must be some *really* crappy GL code), and that will be the bottleneck for you, since the IGP does not provide much of it.
Unless you can put a dedicated GPU into the notebook, there's not much to be done (aside from what Mr.Elendig said) ... :-(

Offline

#4 2017-03-12 15:58:10

Awebb
Member
Registered: 2010-05-06
Posts: 6,286

Re: How to improve Minecraft performance?

seth wrote:

Looking at various minecraft performance reports, it seems to rely heavily on texture fillrate (which, given the basic design, means it must be some *really* crappy GL code), and that will be the bottleneck for you, since the IGP does not provide much of it.
Unless you can put a dedicated GPU into the notebook, there's not much to be done (aside from what Mr.Elendig said) ... :-(

A quick search for the laptop model reveals a built-in Nvidia GeForce G310M. Turning that thing on would bring OP from pug speed to at least duck speed.
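
If it's really in there, the usual route on Arch these days would be Bumblebee (see the wiki page), then launching the game through optirun - a rough sketch, with the launcher jar path made up:

optirun java -jar ~/Minecraft.jar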

Mr.Elendig wrote:

Buy better hardware or use one of the non-java forks of minecraft.

... like the Windows 10 Store exclusive, or the PlayStation, Xbox and Android versions, with no proper mod support, a mutilated interface and in-game purchases?

Offline

#5 2017-03-12 16:42:33

seth
Member
Registered: 2012-09-03
Posts: 51,149

Re: How to improve Minecraft performance?

Sure? cnet only lists HD graphics.
@original username, please post the output of "lspci | grep VGA"
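
On hybrid-graphics laptops the discrete chip often shows up as a "3D controller" instead of VGA, so this variant may catch what the plain VGA grep misses:

lspci | grep -E "VGA|3D"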

The real question is why something that looks like software-rendered Quake from 1996 has this kind of HW demand ...

Offline

#6 2017-03-12 19:56:53

Awebb
Member
Registered: 2010-05-06
Posts: 6,286

Re: How to improve Minecraft performance?

The Lenovo site said so. Could be the usual Lenovo clusterfuck.

seth wrote:

The real question is why something that looks like software quake from 1996 has this kind of HW demands ...

The real question would be: Why are we having this conversation?

Either: That's like asking how somebody running Openbox could possibly do something that brings all CPU cores to 100%.
Or: Your topic-relevant information level and Minecraft have something in common: both are poorly optimized.

The truth is not in between but in a circumposition around the two.

Offline

#7 2017-03-12 20:42:16

seth
Member
Registered: 2012-09-03
Posts: 51,149

Re: How to improve Minecraft performance?

The topic is "How to improve Minecraft performance?"

In order to do that, you need to know why it performs so unexpectedly badly.
You might then be able to, e.g., shrink the texture files to a reasonable size, or alter a shader, or enforce GL_NEAREST filtering - for what we read is that the limitation is the GPU, or rather certain aspects of it. I would not bet on replacing Java helping here; "minecraft performs crap" complaints are all over the internet, and if there were a bug in some JRE branch, it would have popped up by now. My assumption would be that the pixelated look is produced with hi-res textures in order to mitigate the linear interpolation "smear". This is not a matter of "you're not John Carmack" missing optimization - something must be fundamentally wrong if what I saw has such high GPU demands. (No, I'm not going to pay $20 just to figure out myself why it's slow.)
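
As a rough sketch of the texture-shrinking idea - assuming ImageMagick is installed, and with a made-up pack name; crude, since it also halves GUI textures:

cd ~/.minecraft/resourcepacks
unzip SomePack.zip -d SomePack
find SomePack -type f -name '*.png' -exec mogrify -resize 50% {} +

Minecraft picks up the unzipped folder as a resource pack as well.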

PS: "make -j 16" maybe - not to far fetched for an openbox user, is it?

Offline

#8 2017-03-12 21:44:02

original username
Member
Registered: 2017-03-12
Posts: 14

Re: How to improve Minecraft performance?

@seth

00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 02)

Offline

#9 2017-03-12 22:17:12

ZSmith
Member
Registered: 2017-02-25
Posts: 16

Re: How to improve Minecraft performance?

If you're only looking to improve performance in Minecraft, you should try the OptiFine mod. I've seen it bring dramatic performance boosts on low-end systems. If, however, it turns out that you have a dedicated GPU like some have suggested, you should focus on getting that working. Check out the wiki for a list of options for configuring that.
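
Installing it is normally just a matter of running the downloaded jar, which acts as its own installer and adds a profile to the launcher (the version in the filename below is a placeholder):

java -jar OptiFine_1.11.2_HD_U_XX.jar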

Offline

#10 2017-03-12 23:38:57

Awebb
Member
Registered: 2010-05-06
Posts: 6,286

Re: How to improve Minecraft performance?

This isn't really going anywhere, seth. There are many things going wrong with Minecraft, but you must understand that a silly Quake comparison summons silly replies. It boils down to the unusual circumstance that this game requires the CPU to do things unrelated to shiny graphics, which makes it hard for an ancient on-chip GPU to deal with the already complicated situation around garbage collection in Java/LWJGL. The Android version, one of the C++ ports, runs fine even on a potato phone as old as OP's laptop. The reason you think it looks ancient is the 16x16 textures on all the blocks. There are really pretty high-resolution texture packs, and the engine supports a lot of transparency for tree leaves and whatnot. On top of that, MC generates everything on the fly; there are no scripted, hand-modeled areas and no manual level tweaks like in other 3D games.

The limitation isn't the GPU per se, but the integrated GPU. In my experience - given a sane set of graphics settings - the majority of frame drops in Minecraft happen under heavy single-threaded CPU stress and sometimes RAM swapping, all while the GPU bores itself to sleep.
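
If you want to check whether swapping is part of the picture, watching memory while playing is enough - plain procps tools, nothing exotic:

free -h
vmstat 1

Sustained non-zero values in vmstat's si/so columns during play mean the 4GB aren't covering the system plus the JVM heap.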

If you are interested, read on. Over the years I have managed several Minecraft servers of different sizes (from a local two-player session to a big cooperative server with around a thousand players), and I have observed that it's not the GPU as such, but specifically integrated graphics, that causes the trouble in MC. I will try to explain why graphics only matter if you try to add effects, and why you still get low frame rates on a high-end machine with all settings turned down.

A general problem is the average Minecraft player, who cannot tell the difference between vanilla Minecraft and a heavily modded version. They will complain about Minecraft running badly, while it's their Refined Storage mod that slows down the entire simulation, because they decided to build a lot of always-required components into usually-unloaded chunks.

A chunk, FYI, is a unit in Minecraft, 16x16x256 blocks in size (with a block's side length being roughly a meter). A lot of the simulation runs per chunk, for every loaded chunk. The chunks themselves are divided into display chunks of 16x16x16 blocks, stored as OpenGL display lists, and every time a block changes, the respective display chunk has to be rebuilt. The default chunk loading radius is 10 and the loaded area is a square, so we're talking about 21x21 = 441 world chunks (= 7056 display chunks/OGL display lists) being active, each world chunk consisting of 16 display chunks stacked on top of each other.

A tick in MC is 50ms long, and checking all 28901376 loaded blocks twenty times per second is obviously out, so they came up with a randomiser (geometric distribution, worth the look-up, IMO) that picks only a portion of those blocks per tick to receive an action, not counting the blocks that request a tick because they do something interesting (moving fluid, a block with a redstone signal, a mob spawner, or even a complex piece of machinery from a mod). The game allows a total of 65536 block updates per tick, a whopping 1310720 updates per second.
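
The arithmetic, for anyone who wants to check it in a shell:

echo $(( 21*21 ))          # 441 world chunks loaded
echo $(( 441*16 ))         # 7056 display chunks / display lists
echo $(( 441*16*16*256 ))  # 28901376 loaded blocks
echo $(( 65536*20 ))       # 1310720 block updates per second, worst case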

Now, in a multiplayer environment, if the server can't keep up, you'll find yourself set back a few seconds in the client (framerate untouched), while seeing this in the log:

[Server thread/WARN]: Can't keep up! Did the system time change, or is the server overloaded? Running 4047ms behind, skipping 81 tick(s)

If a client in a multiplayer session sees an FPS drop, it's usually due to one of two reasons:

1. The player insists on turning on a combination of entity shadows, high particle settings (a killer for low- to mid-range GPUs, even in highly optimized games), smooth lighting, volumetric clouds and an insane chunk rendering distance. Those are all features not seen in classic Quake.
2. The server is absolutely fine with its performance, but the client isn't capable of processing the simulation fast enough. Since the server won't wait for the client, the client has to deal with that on its own.

Lag in a single-player session happens for the same reasons. In a single-player game you will not be "set back", because the simulation does not skip ticks it discovers it must have missed a couple of seconds later; instead the game notices on roughly every third frame (assuming a 60 fps vsync target and one tick every 50 ms, i.e. 20 Hz) that the simulation has fallen behind, and grinds to a halt until it has something new to display. If this happens on a CPU that has to share a significant portion of its resources with an integrated graphics chip, we get an endless cycle of falling behind and waiting.

However, whatever is wrong with MC, our bickering and dong comparison isn't really helping OP, so let's stick with what will:

- Find out whether the laptop really has the G310M the Lenovo page claims. We're both sufficiently curious by now, aren't we?
- Try OptiFine (https://optifine.net) and play with its settings.
- Reduce all settings to the minimum and slowly increase the chunk rendering distance until small lags appear, then slowly start adding effects. I recommend leaving everything on fast or off, ditching the clouds and - by all means - not turning on entity shadows.
- If you use a texture pack, stick to a low-resolution one (16x16 or smaller - the default textures are already 16x16).
- Turn off VSYNC in MC as well as in your GFX drivers, and hard-cap the FPS in MC to 30 or 60 to avoid tearing; the relevant options.txt lines are sketched below.
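
From memory, the corresponding lines in ~/.minecraft/options.txt look roughly like this - key names vary between versions, so treat them as an illustration rather than gospel (particles:2 should be the "minimal" setting):

renderDistance:8
maxFps:60
enableVsync:false
particles:2
entityShadows:false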

Offline
