2 votes

I've recently built a reasonably complex database application for a university, with almost 200 tables. Some tables (e.g. Publications) can hold 30 or more fields, with around 10 one-to-one FK relations and up to 2 or 3 many-to-many relations (via cross-reference tables). I use integer IDs throughout, and normalisation has been key at every step. AJAX is minimal and most pages are standard CRUD forms/processes.

I used Symfony 1.4 with Doctrine ORM 1.2, MySQL, and PHP.

While the benefits of using an MVC and an ORM have been enormous for development time and ease of maintenance, we've been having problems with speed: when more than a few users are logged in and active at any one time, the application slows considerably (up to 20 seconds to save or edit a record).

We're currently in discussions with our SysAdmin, who says we should have more than enough power. Yet with 6 or more users active, the 4 CPUs of our virtual server are saturated and requests start queuing, while memory usage stays low (no leaks).

Of course, we're considering multi-threading our MySQL usage (if this would help), refining our code (though much of it is generated by the MVC), and refining our cache usage (this could be better, though the majority of each screen is login-specific and dynamic). We've installed APC and extra memory, defragmented the database, tried unsetting all recordsets (though I understand the ORM now does this automatically), and triggered manual garbage collection.
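For reference, here is roughly what I mean by unsetting recordsets and manual garbage collection: a simplified sketch against our Publication model using Doctrine 1.2's free() (the query itself is made up for illustration).

    <?php
    // Batch-process records while explicitly freeing each one (Doctrine 1.2).
    $publications = Doctrine_Core::getTable('Publication')
        ->createQuery('p')
        ->limit(500)
        ->execute();

    foreach ($publications as $publication) {
        // ... process the record ...
        $publication->free(true); // break circular references held by the record
    }
    $publications->free(true);    // free the collection itself
    gc_collect_cycles();          // PHP >= 5.3: reclaim the freed cycles now

    // For read-only listings, array hydration avoids building objects at all
    // and tends to help far more than manual freeing:
    $rows = Doctrine_Core::getTable('Publication')
        ->createQuery('p')
        ->execute(array(), Doctrine_Core::HYDRATE_ARRAY);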

But the question I'm asking is whether MySQL, PHP, and the Symfony MVC were actually a poor choice for developing an application of this size in the first place. If so, what do people normally use or recommend for a web-based database interface application of this size and complexity?

Sorry, perhaps I meant multi-channeling our MySQL database. – user982220
So did you determine which process eats up the resources? It's pointless to optimize without seeing what tears down the memory and CPU. Also, what guarantee do you have that your queries are optimal? The number of tables is nothing; what matters is the size of the data and how it's indexed, which determines the size of the B-trees and the memory consumption (assuming you're using InnoDB with a large enough innodb_buffer_pool_size). So answering your question is difficult, and anyone can just guess what's wrong. It's too early to blame Doctrine or PHP; check which process eats up the resources first. – N.B.
If you are otherwise happy with your system, I'd look to optimize it before chucking it. There are many factors. My first thought would be that you might not have the proper indexes for the queries you are running. Turn on your slow query log, examine the queries you find there, and use the EXPLAIN command to suss out whether the slow queries are performing unindexed scans. Add indexes as necessary (a sketch follows these comments). – AllInOne
Dear N.B. and AllInOne, I have indexed all FK relations; there aren't any queries that are unindexed. Regarding slow queries, yes, you're right, but the majority of them are built by Symfony. I do apply table_methods on most list queries with root alias extensions, and admittedly I do tend to use the algebraic syntax for SQL joins (is that REALLY bad??). – user982220
My question, however, was really more general than asking for a pinpoint diagnosis. I've not built a system this large on my own before, and I was wondering if there are better or more recommendable technologies out there for this sort of thing. Perhaps not. – user982220
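To make AllInOne's suggestion concrete, here is a hedged sketch of enabling the slow query log and reading a query plan through the current Doctrine connection. Table and column names are made up; the SET GLOBAL statements need MySQL 5.1+ and the SUPER privilege (they can go in my.cnf instead).

    <?php
    // Enable the slow query log at runtime and log anything slower than 1s.
    $conn = Doctrine_Manager::connection();
    $conn->execute("SET GLOBAL slow_query_log = 'ON'");
    $conn->execute("SET GLOBAL long_query_time = 1");

    // Then take a query from the log and ask the optimizer how it executes it.
    // In the output, type = ALL (full table scan) or an empty 'key' column
    // usually points to a missing index.
    $plan = $conn->fetchAll(
        'EXPLAIN SELECT p.* FROM publication p WHERE p.author_id = ?',
        array(42)
    );
    print_r($plan);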

2 Answers

3 votes

I don't have any experience with Symfony or Doctrine. However, there are certainly larger sites than yours built on these projects. DailyMotion, for instance, uses both, and it serves a heck of a lot more than a few simultaneous users.

Likewise, PHP and MySQL are used on sites as large as Wikipedia, so scalability should not be a problem. (BTW, MySQL should be multithreaded by default—one thread per connection.)

Of course, reasonable performance is relative. If you're trying to run a site serving 200 concurrent requests off a beige box sitting in a closet, then you may need to optimize your code a lot more than a Fortune 500 company with their own data centers.

The obvious thing to do is actually profile your application and see where the bottleneck is. Profiling MySQL queries is pretty simple, and there are also assorted PHP profiling tools like Xdebug. From there you can figure out whether you should switch frameworks, ORMs (or ditch the ORM altogether), or databases, refactor some code, or just invest in more processing power.
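For the query side, Doctrine 1.2 ships with a connection profiler that times every statement it runs; a minimal sketch (Xdebug covers the PHP side separately):

    <?php
    // Attach Doctrine 1.2's built-in profiler to the connection, exercise a
    // slow page, then dump per-query timings.
    $profiler = new Doctrine_Connection_Profiler();
    $conn = Doctrine_Manager::connection();
    $conn->setListener($profiler);

    // ... run the slow action here ...

    foreach ($profiler as $event) {
        printf("%-10s %.4fs  %s\n",
            $event->getName(),        // e.g. 'execute' or 'prepare'
            $event->getElapsedSecs(), // wall-clock time for this event
            $event->getQuery());      // the SQL that was run
    }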

1 vote

The best way to speed up a complex database operation is to not call it at all :).

So analyze which parts of your application you can cache.
Symfony has a pretty good caching system (even better in Symfony2) that can be applied quite granularly.
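On the data side, Doctrine 1.2 also offers a result cache that can sit on top of the APC you already have installed; a minimal sketch, with Publication standing in for one of your read-heavy models:

    <?php
    // Point Doctrine 1.2's result cache at APC; cached queries skip MySQL
    // entirely until the TTL expires.
    $cacheDriver = new Doctrine_Cache_Apc();
    Doctrine_Manager::getInstance()
        ->setAttribute(Doctrine_Core::ATTR_RESULT_CACHE, $cacheDriver);

    // Cache this listing for 5 minutes.
    $publications = Doctrine_Query::create()
        ->from('Publication p')
        ->useResultCache(true, 300)
        ->execute();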

On the database side, another option is to use views or nested sets to store aggregated data.
Try to find the parts of your schema where this is appropriate; a sketch of the view idea follows.
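As a hedged illustration (table and column names are made up), a view centralises an expensive aggregate in one place. Note that a plain MySQL view still recomputes the GROUP BY on each read; for true precomputation you would maintain a summary table instead.

    <?php
    $conn = Doctrine_Manager::connection();

    // Define the aggregate once, in the database.
    $conn->execute(
        'CREATE OR REPLACE VIEW publication_counts AS
         SELECT author_id, COUNT(*) AS publication_count
         FROM publication
         GROUP BY author_id'
    );

    // Every consumer now reads the same definition.
    $counts = $conn->fetchAll(
        'SELECT publication_count FROM publication_counts WHERE author_id = ?',
        array(42)
    );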