Compression Ratio

bumblebee

Found this and thought it was a good read...enjoy

Is a low compression engine better for forced induction than high compression?

Depends on how much boost you're putting into the engine. The big issue here is managing the amount of internal pressure within the cylinders: making sure it's not going to damage the engine whilst still making the most power possible. Too much pressure can cause catastrophic failure where you literally blow the head off the engine - that's why Top Fuel drag cars have big straps to hold the superchargers down in case they get blown off.

The higher the compression ratio, the more natural torque an engine produces. Adding forced induction increases the effective compression of an engine: although you have the same compression ratio, air and fuel are entering the cylinder already at a higher pressure. This increase in pressure translates into a bigger bang at ignition and greater pressure from the expanding exhaust gases - resulting in more power.

Dropping the compression ratio allows a higher induction pressure to be used, meaning a greater volume of fuel and air can be squeezed into the cylinder. This results in a big increase in torque and power - as long as that volume is being delivered. When the turbocharger or supercharger is not delivering the full volume - when it's 'off boost' - the engine is relying on a lower amount (and pressure) of air coming in, which results in less power. This breathless lack of power is often mistakenly referred to as lag.

A low compression engine with big induction pressure will perform very poorly 'off boost' (i.e. when the turbo/supercharger is not delivering), and will very rapidly build power as it comes 'on boost'. In extreme cases this can literally be like flicking a switch from no power to instant full power - and a car that will be quite a handful to drive hard. Depending on the induction device, this 'boost threshold' can be quite high in the engine rev range.

A higher compression engine with low induction pressure will perform much better 'off boost' because it still has its own natural compression to generate power; it will generally not have a big jump in power, and as the induction device is generally smaller, its boost threshold will be much lower.

A low compression, big boost engine will make an insane amount of top end power, but be very wheezy and powerless down low, whereas the same sized engine with higher compression and lower boost will be very torquey low down, but won't make as much top end power.
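To put a rough number on "a greater volume of fuel and air can be squeezed into the cylinder", here is a quick Python sketch (my own illustration, not from the original article; it assumes ideal-gas behaviour, a 14.7 psi sea-level atmosphere and no change in intake temperature, so intercooling and heat soak are ignored):

# Trapped air mass at a fixed cylinder volume and temperature scales with
# absolute manifold pressure (ideal gas approximation).

ATM_PSI = 14.7  # assumed sea-level atmospheric pressure, psi

def relative_air_mass(boost_psi):
    """Air (and fuel) per cycle relative to a naturally aspirated fill."""
    return (boost_psi + ATM_PSI) / ATM_PSI

for boost in (0, 7, 15, 23):
    print(f"{boost:>2} psi boost -> {relative_air_mass(boost):.2f}x the air per cycle")

Under those assumptions, 15 psi of boost traps roughly twice the air of a naturally aspirated fill, which is the "effective compression" increase described above.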
 

bumblebee

Part 2

What's better, low compression and more boost or high compression and less boost?

There are certainly reasons to try to raise compression ratio, namely when off-boost performance matters, like on a street car, or when using a very small displacement motor. But when talking purely about on-boost power potential, raising compression just doesn't make any sense. People have tested the power effects of raising compression for decades, and the most optimistic results are about 3% more power per additional point of compression (going from 9:1 to 10:1, for example). All combinations will be limited by detonation at some boost and timing threshold, regardless of the fuel used.

Decreasing the compression allows you to run more boost, which introduces more oxygen into the cylinder. Raising the boost from 14psi to 15psi (just a 1psi increase) adds an additional 3.4% of oxygen. So right there, you are already past the break-even mark of losing a point of compression. And obviously, lowering the compression a full point allows you to run much more than 1 additional psi of boost. In other words, you always pick up more power by adding boost and lowering compression, because power potential is based primarily on your ability to burn fuel, and that is directly proportional to the amount of oxygen that you have in the cylinder. Raising compression doesn't change the amount of oxygen/fuel in the cylinder; it just squeezes it a bit more.

So the big question becomes: how much boost do we gain for X amount of compression? The best method we have found is to calculate the effective compression ratio (ECR) with boost. The problem is that most people use an incorrect formula which says that 14.7psi of boost on an 8.5:1 motor gives a 17:1 ECR. So how in the world do people get away with this combination on pump gas? You can't even idle down the street on pump gas on a true 17:1 compression motor. Here's the real formula to use:

sqrt((boost+14.7)/14.7) * CR = ECR

sqrt = square root

boost = psi of boost

CR = static compression ratio of the motor

ECR = effective compression ratio
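For anyone who wants to plug in their own numbers, here is the same formula as a small Python sketch (just an illustration; the 14.7 figure is assumed to be sea-level atmospheric pressure in psi):

import math

ATM_PSI = 14.7  # assumed sea-level atmospheric pressure, psi

def effective_compression_ratio(boost_psi, static_cr):
    """ECR per the formula above: sqrt((boost + 14.7) / 14.7) * CR."""
    return math.sqrt((boost_psi + ATM_PSI) / ATM_PSI) * static_cr

print(round(effective_compression_ratio(14.7, 8.5), 1))  # ~12.0
print(round(effective_compression_ratio(23.0, 7.5), 1))  # also ~12.0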

So our above example gives an ECR of 12.0:1. This makes perfect sense, because 12:1 is considered to be the max safe limit with aluminum heads on pump gas, and 15psi is about as much boost as you can safely run before you at least start losing a significant amount of timing to knock. Of course every motor is different, and no formula is going to be perfect for all combinations, but this one is vastly better than the standard formula (which leaves out the square root).

So now we can target a certain ECR, say 12.0:1. We see that at 8.5:1 CR we can run 14.7psi of boost, but at 7.5:1 we can run 23psi of boost (and still maintain the 12.0:1 ECR). We only gave up 1 point of compression (3% of max power) and yet we gained 28% more oxygen (28% more power potential). Suddenly it's quite obvious why Top Fuel runs 5:1 compression - that's where all the power is! 8.5:1 turns out to be a real good all-around number for on and off boost performance. Many "performance" NA motors are only 9.0:1, so we're not far off of that, and yet we're low enough to run 30+ psi without problems (provided that a proper fuel is used).

Example: "I've got a 500+ CID motor and I'm looking to make 900hp. Can I use a GT42? I've heard they can make 900hp." Nope! There's nothing wrong with the GT42, it will definitely make 900hp, just not in this scenario. Here's why: 900hp represents a fairly constant amount of air/fuel mixture, regardless of whether it's being made by a small motor at high boost (e.g. 183ci at 32psi) or a large motor at low boost (e.g. 502ci at 10psi).

The first problem is that most compressors are only able to reach their maximum airflow when they are running at high boost levels. For example, a GT42 is able to flow about 94lbs/min of air at 32psi of boost, but it can only flow around 64lbs/min of air at 10psi. People are often quick to assume that high boost means high heat and therefore decreased efficiency, but in reality it takes higher boost levels to put most turbos into their "sweet spot". In this particular example, the turbo is capable of almost 50% more HP at high boost levels than it is at low boost levels.

The other problem is related to backpressure. If the exhaust system (headers, turbine, downpipe, etc.) is the same between both motors, the backpressure will be roughly the same. Let's say the backpressure measures 48psi between the motor and turbine. The big motor will run into a bottleneck because there is 48psi in the exhaust and only 10psi in the intake (a 4.8:1 ratio). This keeps the cylinder from scavenging/filling fully and therefore limits power. The small motor, on the other hand, has 32psi of boost (only a 1.5:1 ratio) to push against the backpressure, so it is able to be much more efficient under these conditions.

The bottom line is, as your motor size increases, your boost level will go down (in order to achieve the same power level). In such a case you will need to maximize the flow potential of your compressor and minimize the restriction of your exhaust system (including the turbine) in order to reach your power goals.
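As a quick sanity check on the ECR numbers above, here is the same formula turned around to solve for the boost you can run at a target ECR (again my own sketch, assuming a 14.7 psi atmosphere and that oxygen scales with absolute manifold pressure at fixed displacement and temperature):

ATM_PSI = 14.7  # assumed sea-level atmospheric pressure, psi

def max_boost_for_ecr(target_ecr, static_cr):
    """Invert the ECR formula: boost = 14.7 * (ECR / CR)^2 - 14.7."""
    return ATM_PSI * (target_ecr / static_cr) ** 2 - ATM_PSI

TARGET_ECR = 12.0  # the pump-gas limit used above
for cr in (8.5, 7.5):
    print(f"{cr}:1 static -> about {max_boost_for_ecr(TARGET_ECR, cr):.1f} psi at {TARGET_ECR}:1 ECR")

# Oxygen gained by dropping a point of compression (ratio of absolute
# manifold pressures at the two boost levels):
gain = (max_boost_for_ecr(TARGET_ECR, 7.5) + ATM_PSI) / (max_boost_for_ecr(TARGET_ECR, 8.5) + ATM_PSI) - 1
print(f"~{gain:.0%} more oxygen")

Under those assumptions it reproduces the figures in the post: roughly 14.7psi at 8.5:1, roughly 23psi at 7.5:1, and about 28% more oxygen for the one point of compression given up.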
 