33 votes

In standard PHP or other source-code-based projects, we can easily keep all of the code in SVN, and each developer can check out their own copy and collaborate on the same code.

When developing a Drupal site, however, much of the work is in "setup". Besides the theme and modules, you don't really have any "source code". How do you run multiple instances of the same site so that developers can all work at the same time, yet share their work?

Example Scenario:

We launch an initial version of a Drupal site with content type "X" created. We also initially launch a view on the site that lists all the nodes of type "X" in chronological order. The client starts using the site, adding content, menu items, etc.

The next release is planned to add a user search ability to that view. The setup for that is contained in the database, though. We can copy the production database down to our development version to get the latest data while we work on changing the view. During that time, however, the client can still be updating the site, putting our dev database out of sync. When we are ready to push the new view to production, is there an easier way to do it than manually repeating the setup steps on the production install?

hmm can you clarify a bit? are you talking about setup like settings on certain modules basically? - Owen
really good question, thanks. - sepehr

7 Answers

12 votes

I think a good strategy here is to use the Install Profile API. With the Install Profile API you can do most of the things you would otherwise do through the Drupal admin tools. Most core settings forms simply set variables in the variables table. To sensibly version your non-content database contents, i.e. the configuration, it is wise to use update functions.
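
For example, a change that would normally be made through a core settings form can be captured in an update function; a minimal sketch, with a purely illustrative module name and variables:

// Hypothetical module "mymodule" (illustrative only): record simple variable
// changes in an update function so they ship with the code instead of being
// clicked into each environment by hand.
function mymodule_update_6001() {
  variable_set('site_slogan', 'Our new slogan');
  variable_set('user_register', 0); // only administrators may create accounts
  return array();
}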

On my site we have one module, "ec", that does very little apart from having its ec.install file contain update functions, e.g. ec_update_6001().

Your main install function can then take care of actually running those updates on any new install you make, to bring the module up to date:

function ec_install() {
  $ret = array();
  $num = 0;
  // Run every ec_update_60NN() function that exists, in order, so that a
  // fresh install ends up with the same configuration as an updated site.
  while (1) {
    $version = 6000 + $num;
    $funcname = 'ec_update_' . $version;
    if (function_exists($funcname)) {
      $ret[] = $funcname();
      $num++;
    }
    else {
      break;
    }
  }
  return $ret;
}

A couple of sample update functions from our actual file follow:

// Create editor role and set permissions for comment module
function ec_update_6000() {
  install_include(array('user'));
  $editor_rid = install_add_role('editor');
  install_add_permissions(DRUPAL_ANONYMOUS_RID, array('access comments'));
  install_add_permissions(DRUPAL_AUTHENTICATED_RID, array('access comments', 'post comments', 'post comments without approval'));
  install_add_permissions($editor_rid, array('administer comments', 'administer nodes'));
  return array();
}

// Enable the pirc theme.
function ec_update_6001() {
  install_include(array('system'));
  // TODO: line below is not working due to a bug in Install Profile API. See http://drupal.org/node/316789.
  install_enable_theme('pirc');
  return array();
}

// Add the content types for article and mtblog
function ec_update_6002() {
  install_include(array('node'));
  $props = array(
    'description' => 'Historical Movable Type blog entries',
  );
  install_create_content_type('mtblog', 'MT Blog entry', $props);
  $props = array(
    'description' => 'Article',
  );
  install_create_content_type('article', 'Article', $props);
  return array();
}

Effectively, this mostly solves the versioning problem with databases and Drupal code. We use it extensively. It allows us to promote new code that changes database configuration without having to reimport the database or make changes by hand on the live site. It also means we can properly test releases without fear of hidden database changes.

Finally, CCK and Views support this approach. See this code snippet:

// Enable CCK modules, add CCK types for Articles in prep for first stage of migration,
// enable body for article, enable migration modules.
function ec_update_6023() {
  $ret = array();
  drupal_install_modules(array('content', 'content_copy', 'text', 'number', 'optionwidgets'));
  install_include(array('content', 'content_copy'));
  install_content_copy_import_from_file(drupal_get_path('module', 'ec') . '/' . 'article.type', 'article');
  $sql = "UPDATE {node_type} SET body_label='Body', has_body=1
  WHERE type = 'article'";
  $ret[] = update_sql($sql);
  return $ret;
}

11 votes

I wrote an article on painless Drupal revision control with CVS and Subversion best practices a while ago.

Unfortunately there is still the issue of source controlling the database, as you've pointed out. There are a few suggested methods, which I mention in an additional post.

7 votes

Getting Drupal settings out of the database and into code has been moving forward in leaps and bounds. Two modules that really help in this realm are:

Features - Allows you to gather together entities such as content types, taxonomy, views, even feeds, and export them to code. We are using this very successfully and it has made it possible to share these changes between developers (a sketch of what its exported output looks like follows below).

Strongarm - Allows for the storage and export of variables using the above module. I've done some testing with this module but we are not using it, simply because we didn't really need the functionality.

These solve the biggest issues with keeping the site setup in the database. They are not perfect, however; we've found modules that were not supported, or were supported incorrectly.
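
To give a sense of what Features produces, here is a rough sketch of the kind of code a Drupal 6 feature export generates for a content type (the module and type names are illustrative, and the exact output varies with the Features version):

// Sketch of a generated mymodule.features.inc (Drupal 6 Features export).
// Implementation of hook_node_info(): the content type now lives in code,
// so it can be committed, diffed and deployed like any other source file.
function mymodule_node_info() {
  $items = array(
    'article' => array(
      'name' => t('Article'),
      'module' => 'features',
      'description' => t('A simple article content type.'),
      'has_title' => '1',
      'title_label' => t('Title'),
      'has_body' => '1',
      'body_label' => t('Body'),
    ),
  );
  return $items;
}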

1 vote

You could save yourself some of the pain of configuring and working with SVN as described in Nick's article if you use the svn:externals property. This will keep your local version of Drupal up-to-date with the specified Drupal branch automatically, and you can use exactly the same mechanism for your modules. Additionally, because SVN will read the externals definitions from a file, you can put these under version control too!

I don't think CVS has an equivalent feature. However, it is quite easy to write a simple script that will automatically install a Drupal module, taking just a URL (I've done this for keeping my own Drupal site up to date).
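
For what it's worth, such a script can stay very small; a rough sketch (not my actual script), where the URL and paths are placeholders and tar plus allow_url_fopen are assumed to be available:

// Fetch a module tarball by URL and unpack it into sites/all/modules.
$url     = 'http://ftp.drupal.org/files/projects/views-6.x-2.8.tar.gz'; // placeholder
$modules = '/var/www/site/sites/all/modules';                           // placeholder
$tarball = '/tmp/' . basename($url);

copy($url, $tarball); // requires allow_url_fopen
shell_exec('tar -xzf ' . escapeshellarg($tarball) . ' -C ' . escapeshellarg($modules));
echo 'Unpacked ' . basename($url) . " into $modules\n";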

As far as versioning the database is concerned, this is a much trickier problem to solve. I would suggest exporting a "stock" Drupal database to an SQL file and placing that under version control. Each developer would have their own local private database server to use. You could then provide a script that would revert a specified database to your stock version contained in the SQL file.
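
Such a revert script can also be kept very simple; a minimal sketch assuming MySQL, with the database name, credentials and dump path as placeholders:

// Reset a local development database to the versioned "stock" dump.
$db   = 'drupal_dev';                  // placeholder
$user = 'devuser';                     // placeholder
$pass = 'devpass';                     // placeholder
$dump = '/path/to/checkout/stock.sql'; // placeholder

$mysql = 'mysql -u ' . escapeshellarg($user) . ' -p' . escapeshellarg($pass);
// Recreate the database, then load the stock dump into it.
shell_exec($mysql . ' -e ' . escapeshellarg("DROP DATABASE IF EXISTS $db; CREATE DATABASE $db"));
shell_exec($mysql . ' ' . escapeshellarg($db) . ' < ' . escapeshellarg($dump));
echo "Reverted $db to the stock dump.\n";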

As an example of how this problem is solved in other ways, I'll describe the situation at work. I work on a web application; it doesn't use a database, so it doesn't suffer from those problems. Our way of getting around the repeated setup of sites is to rebuild from source control and to provide a program that automates deployment of the sites. The program is used by our customers too, as their way of creating sites.

1 vote

Some modules, such as CCK and Views, allow exporting and importing their setup data as text. You can keep these textual representations under source control.

1 vote

Unfortunately, there just isn't a good, simple solution here. The problem is an unfortunate side effect of the architecture of not just Drupal, but of all framework-type CMSes where applications are defined as much through configuration (i.e. data stored in the db) as through source code. Neither of the two options for managing configuration data is great.

The first is what you are doing: define a single db as canonical (i.e. the production db), have developers work locally with a snapshot of the production db, and "merge" new config info into the production db by manually configuring it through the production site's admin interface. In the case of well-defined subsystems, such as Views, you may be able to take advantage of import/export features developed to ease just this kind of configuration migration.

The second option, i.e. the automation I think you are looking for, is difficult but could be worth it, or even required, for large projects with the budget for complex project automation: dive deeply into the system/module db structure and develop custom scripting to merge new configuration data at the table/record level into the production db, say as part of a nightly "build" of the latest db. I'm afraid there just isn't any in-between solution.

In terms of version control for historical tracking of the configuration data, a module like backup_migrate allows you to perform automated SQL dumps of the db. You can choose which tables are dumped by defining a backup "profile", and could create one that leaves the large content, logging and caching tables (e.g. node, cache_content, watchdog) out of the dump, so you are left with a more manageable chunk for versioning. Some simple scripting on the server or elsewhere could then grab the latest dump and add it to your repository.
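
As an illustration of that last point, a small script along these lines could run from cron; the dump location, target path and commit message are placeholders, and it assumes the dump file is already under version control:

// Copy the newest backup_migrate dump into an SVN working copy and commit it.
$dumps = glob('/var/www/site/sites/default/files/backup_migrate/scheduled/*.mysql'); // placeholder path
rsort($dumps); // backup_migrate timestamps its filenames, so the newest sorts first
$latest = $dumps[0];
$target = '/path/to/working-copy/config-dump.sql'; // placeholder

copy($latest, $target);
shell_exec('svn commit -m "Nightly configuration dump" ' . escapeshellarg($target));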

0 votes

Drupal now has support for exportable configuration, which allows you to move most of a site's configuration to code. Exportables are supported for configuration variables, views, content types, fields, input formats, etc., with the help of the Features module.

You can also manage initial, non-exportable configuration, and subsequent configuration changes, through a central controller profile or module. Use it to enable modules, create users, etc.

See The Development -> Staging -> Production Workflow Problem in Drupal and the Code driven development: using Features effectively in Drupal 6 and 7 presentation.