Hello gents and ladies.
Lately something about the torque->load and load->torque conversion has been bugging me: specifically, how for a given cylinder filling (load), torque increases with RPM. I've noticed this behavior in two completely different ECUs, both in the 2.7t ME7.1 and in my 3.0 Ford running something called GreenOak.
For example, the stock M-box 2.7 KFMIOP, with units being load in %, RPM, and relative torque in %. This is of course a turbocharged engine, so many of you will be familiar with the limitations of its x-axis:
And then the stock Ford 3.0 file (AU7A-14C204-XJ), with units being load in %, RPM, and absolute torque in newton-meters. This is a naturally aspirated engine, so it will theoretically only ever see maybe 0.95 load:
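For anyone following along without a tuning tool open, here's a minimal sketch of how an ECU-style 2D map lookup works, with completely made-up breakpoints and cell values (not the real KFMIOP or Ford data), just to show the load+RPM -> torque mechanics:

import numpy as np

# Hypothetical axis breakpoints and cell values -- placeholders, not real calibration data.
load_axis = np.array([20.0, 40.0, 60.0, 80.0, 100.0])       # load in %
rpm_axis  = np.array([1000, 2000, 3000, 4000, 5000, 6000])  # RPM
# torque_map[i, j] = torque at load_axis[i] and rpm_axis[j]; note the values
# creep up with RPM at constant load, like the stock maps do.
torque_map = np.array([
    [15, 16, 17, 18, 18, 19],
    [30, 32, 34, 35, 36, 37],
    [45, 48, 50, 52, 53, 54],
    [60, 63, 66, 68, 69, 70],
    [75, 79, 82, 84, 86, 87],
], dtype=float)

def lookup_torque(load, rpm):
    # Clamp to the table edges, like most ECU lookups do.
    load = min(max(load, load_axis[0]), load_axis[-1])
    rpm = min(max(rpm, rpm_axis[0]), rpm_axis[-1])
    # Find the lower breakpoint on each axis.
    i = min(max(np.searchsorted(load_axis, load, side="right") - 1, 0), len(load_axis) - 2)
    j = min(max(np.searchsorted(rpm_axis, rpm, side="right") - 1, 0), len(rpm_axis) - 2)
    # Bilinear interpolation between the four surrounding cells.
    tx = (load - load_axis[i]) / (load_axis[i + 1] - load_axis[i])
    ty = (rpm - rpm_axis[j]) / (rpm_axis[j + 1] - rpm_axis[j])
    return (torque_map[i, j] * (1 - tx) * (1 - ty)
            + torque_map[i + 1, j] * tx * (1 - ty)
            + torque_map[i, j + 1] * (1 - tx) * ty
            + torque_map[i + 1, j + 1] * tx * ty)

print(lookup_torque(50.0, 2500))  # lands between the four surrounding cells, ~41 here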
As you can see, for a given load, torque increases with RPM. Does anybody know why this behavior exists? My theory is that, given the same amount of air and fuel (and presumably the same amount of energy released), there are fewer thermal losses to the piston/block/cylinder head at higher RPM, since there is less time for that heat transfer to happen, but I have nothing to back this up.
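To put rough numbers on that (purely back-of-envelope, and resting on my assumption that the heat flux to the walls doesn't change much with speed): the time each cycle gets for heat transfer scales as 1/RPM, so the per-cycle wall loss would scale the same way.

# Time per 4-stroke cycle (two crank revolutions) at a few speeds.
# If wall heat flux were roughly constant, per-cycle heat loss would scale with this time.
for rpm in (1500, 3000, 6000):
    cycle_time_ms = 2 * 60.0 / rpm * 1000.0
    print(f"{rpm} RPM: {cycle_time_ms:.0f} ms per cycle")
# 80 ms at 1500, 40 ms at 3000, 20 ms at 6000 -- the same charge gets half the
# time to dump heat into the piston/block/head every time RPM doubles.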
If my theory holds, then of course these tables would need to reflect the bore, stroke, materials, running temperature, and fuel composition to be completely accurate. In practice, though, the only thing that's crucial to the running of the engine is that your load+RPM->torque and torque+RPM->load tables are inverses of each other.
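On that inversion point, one quick sanity check is to run a round trip through both tables at every breakpoint and make sure you land back on the load you started with. A minimal sketch, assuming a torque_from_load lookup like the one above plus a hypothetical load_from_torque built from the inverse table:

def check_inverse(load_points, rpm_points, torque_from_load, load_from_torque, tol=0.5):
    # Round-trip load -> torque -> load at each breakpoint; flag anything off by more than tol % load.
    ok = True
    for rpm in rpm_points:
        for load in load_points:
            tq = torque_from_load(load, rpm)
            back = load_from_torque(tq, rpm)
            if abs(back - load) > tol:
                ok = False
                print(f"Mismatch at {rpm} RPM: {load:.1f}% load -> {tq:.1f} -> {back:.1f}%")
    return ok

# e.g. check_inverse(load_axis, rpm_axis, lookup_torque, lookup_load)
# where lookup_load is the torque+RPM -> load lookup from the inverse table.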
Would love to hear some thoughts as to why our OEM calibrators made these tables like this.