
When I was learning how to use premake, I remember reading a wiki page or perhaps a forum post somewhere (I wish I could find the original link) suggesting that project files generated by your premake scripts may ultimately be run on different machines than the one you're running premake on. So I took this idea to heart and designed premake scripts accordingly, replacing the existing autotools/VS/Xcode project files in an open-source project I contribute to. This project uses a variety of third-party libraries, some mandatory and some optional.

What I started to discover, through both my own experience and feedback from other developers, is that it's pretty tough to generate generic project files (gmake files especially) that will work on other machines, particularly when it comes to finding the locations of system libraries to link against. You also completely give up the ability to auto-detect the state of the build machine and enable/disable optional build settings accordingly. And instead of reporting problems like missing dependencies in a user-friendly format at configuration time, you have to rely on cryptic compiler errors to tell users that something is missing.
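For what it's worth, this is the kind of generation-time detection that only works when the machine running premake is the machine doing the build, since the result gets baked into the generated project files. A minimal sketch of a premake5.lua doing such detection (the library and the `HAVE_ZLIB` macro are just illustrative assumptions, not from any particular project):

```lua
-- Hypothetical premake5.lua: probe for an optional library at
-- generation time and toggle a feature flag accordingly.
workspace "MyApp"
   configurations { "Debug", "Release" }

project "MyApp"
   kind "ConsoleApp"
   language "C++"
   files { "src/**.cpp" }

   -- os.findlib searches the standard system library paths on the
   -- machine running premake, not on the eventual build machine.
   local zlibDir = os.findlib("z")
   if zlibDir then
      defines { "HAVE_ZLIB" }
      libdirs { zlibDir }
      links { "z" }
   else
      print("zlib not found; compression support disabled")
   end
```

If the generated gmake files are then copied to a machine where zlib lives somewhere else (or doesn't exist), the baked-in paths and defines are simply wrong, which is exactly the problem described above.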

My question is for those who have experience using premake in a production environment: is it a reasonable goal to transfer premake-generated project files to other machines and still have them work, or should you design your premake scripts around the assumption that users will run premake locally, given how diverse build environments are?


1 Answer


For simple or self-contained projects, certainly—the official Premake releases ship with pre-built project files, for example. But for more complex projects it generally makes more sense to just ship the Premake scripts (i.e. premake5.lua) and ask developers to download and run Premake locally to generate the final project files, for the reasons you specified.
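One concrete upside of having developers run Premake locally is that the script can fail fast with a readable message when a mandatory dependency is missing, rather than surfacing as a compiler or linker error much later. A hypothetical fragment (the library name is just an example):

```lua
-- Hypothetical premake5.lua fragment: verify a mandatory dependency
-- at generation time and stop with a friendly message if absent.
if not os.findlib("ssl") then
   error("OpenSSL was not found on this machine. " ..
         "Install it and re-run premake.")
end
```

This gives you back the configure-step diagnostics that the question laments losing, at the cost of requiring each developer to have Premake installed.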