I'm looking over a document that describes various techniques to improve the performance of Lua scripts, and I'm shocked that such tricks would be required. (Although I'm quoting Lua, I've seen similar hacks in JavaScript.)
Why would this optimization be required:
For instance, the code

    for i = 1, 1000000 do
      local x = math.sin(i)
    end

runs 30% slower than this one:

    local sin = math.sin
    for i = 1, 1000000 do
      local x = sin(i)
    end
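For what it's worth, the claim is easy to check with a crude os.clock() harness like the one below. This is my own sketch, not something from the document; the 30% figure is the document's, and the exact gap will vary by Lua version and machine:

```lua
-- Rough timing harness: compare a global math.sin lookup per iteration
-- against a cached local. N and the timing approach are my own choices.
local N = 1000000

local t0 = os.clock()
for i = 1, N do local x = math.sin(i) end
local global_time = os.clock() - t0

local sin = math.sin
local t1 = os.clock()
for i = 1, N do local x = sin(i) end
local local_time = os.clock() - t1

print(string.format("global: %.3fs  local: %.3fs", global_time, local_time))
```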
They're re-declaring the sin function locally.
Why would this be helpful? It's the job of the compiler to do that anyway. Why is the programmer having to do the compiler's job?
I've seen similar things in JavaScript, so obviously there must be a very good reason why the compiler or interpreter isn't doing this itself. What is it?
I see it repeatedly in the Lua environment I'm fiddling with; people re-declaring global functions as locals:
local strfind = strfind
local strlen = strlen
local gsub = gsub
local pairs = pairs
local ipairs = ipairs
local type = type
local tinsert = tinsert
local tremove = tremove
local unpack = unpack
local max = max
local min = min
local floor = floor
local ceil = ceil
local loadstring = loadstring
local tostring = tostring
local setmetatable = setmetatable
local getmetatable = getmetatable
local format = format
local sin = math.sin
What is going on here that people have to do the work of the compiler? Is the compiler confused about how to find format? Why is this an issue that a programmer has to deal with? Why wasn't this taken care of back in 1993?
I also seem to have hit a logical paradox:

- Optimization should not be done without profiling.
- Lua has no ability to be profiled.
- Therefore, Lua should not be optimized.
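(On the second premise, I'll note that stock Lua does ship a debug library, and a crude call-counting profiler can at least be sketched with debug.sethook. This is my own sketch, not a standard tool, and the hook overhead makes it a blunt instrument:)

```lua
-- Crude call counter via debug.sethook; "counts" and the report loop
-- are names/structure I made up, not any standard Lua profiling API.
local counts = {}

local function hook()
  -- Inside a hook, stack level 2 is the function whose call fired the event.
  local info = debug.getinfo(2, "Sn")
  if info then
    local key = (info.name or "?") .. " @ " .. info.short_src
    counts[key] = (counts[key] or 0) + 1
  end
end

debug.sethook(hook, "c")  -- fire on every function call

-- ... code under test goes here ...
for i = 1, 1000 do local x = math.sin(i) end

debug.sethook()           -- remove the hook

for name, n in pairs(counts) do
  print(n, name)
end
```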