I’m currently refactoring some legacy third-party PHP code, and as the old saying goes, the real metric is WTFs per minute.
So, just to entertain any readers, how about:
- Writing pagination links for a search form, but if there are more than 20 pages of results, adding 20 onto whatever the maximum page count is – so you get 20 invalid links at the end of the pagination list (clicking on them shows no results). I suppose it at least makes it look like there are lots of results.
- if (isset($_GET['foo']) == 0) … (wouldn’t if (!isset($_GET['foo'])) be easier to read?).
- Presumably not knowing what a while (…) { … } loop is, and always writing something like $row = mysql_fetch_assoc($x); do { … } while ($row = mysql_fetch_assoc($x)); – which also runs the loop body once even when the query returns no rows (a saner version is sketched below, after this list).
- Always including mysql_free_result($foo) after every query… why bother, when PHP frees the result automatically at the end of the script anyway?
- Always having an //END IF comment, even if the if(..) { } statement is only 3 lines long.
- A write_out_the_header() function consisting of a switch statement nearly 2900 lines long, which is just responsible for setting things like the <title> and some meta tags for every page on the site.
- When paginating results, writing out the ‘jump’ URL differently for even-numbered page links (starting with a &, instead of a ?). Some numbers are more even/equal than others, I guess (a consistent approach is also sketched below).
- Executing a separate query on every iteration of a loop, rather than doing a simple JOIN in the first place (also sketched below).
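
For the curious, here’s roughly what the saner fetch loop looks like. This is only a minimal sketch using the same legacy mysql_* API as the original code; the query, table and column names are made up.

```php
<?php
// A plain while loop: the body only runs when a row actually came back,
// whereas the do/while version above executes once even for an empty result.
// Query, table and column names here are hypothetical.
$result = mysql_query("SELECT id, name FROM widgets");
while ($row = mysql_fetch_assoc($result)) {
    echo $row['name'], "\n";
}
```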
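
Pagination links don’t need any odd/even special-casing either, if the query string is built the same way every time – a sketch, assuming a hypothetical search.php endpoint and parameter names:

```php
<?php
// Hypothetical values standing in for the real search term and page count.
$searchTerm = 'widgets';
$maxPage    = 7;

for ($page = 1; $page <= $maxPage; $page++) {
    // '?' once, '&' handled by http_build_query – no special cases per page number.
    $link = 'search.php?' . http_build_query(array('q' => $searchTerm, 'page' => $page));
    echo '<a href="' . htmlspecialchars($link) . '">' . $page . '</a> ';
}
```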
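
And the query-per-iteration pattern usually collapses into a single JOIN. Another rough sketch, with invented table and column names:

```php
<?php
// One query with a JOIN instead of one extra query per row of the outer loop.
// Table and column names are made up for illustration.
$sql = "SELECT o.id, o.placed_at, c.name
          FROM orders o
          JOIN customers c ON c.id = o.customer_id";
$result = mysql_query($sql);
while ($row = mysql_fetch_assoc($result)) {
    echo $row['id'] . ' - ' . $row['name'] . "\n";
}
```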
And don’t get me started on the lack of error checking…
Sounds like code written by someone straight out of university.
Unfortunately it wasn’t… :-/
The perils of technical debt.