As you may know, PHP 5.2.0 will feature a very capable filtering extension that can be used to easily validate your input via a number of rules, which you can find here. What I am interested in hearing is whether there are any other common types of data collected by PHP forms that would be worthwhile adding filters for to the extension. My own suggestions would be phone (US/EU formats) and postal/zip code validators. So let's hear what you have to say ;-) Brief disclaimer: Consider this an RFC of sorts; the suggestions, even if widely supported, may not get integrated, and any additions will need the implicit agreement of all the filter extension developers before being added.
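For those who haven't played with the extension yet, validation is essentially a one-liner. A minimal sketch using the filter API as shipped in 5.2 (the sample values are just illustrations):

```php
<?php
// Validate an e-mail address: returns the value on success, false on failure.
$email = filter_var('user@example.com', FILTER_VALIDATE_EMAIL);
var_dump($email);

// Validate an integer within a range, e.g. a TCP port number.
$port = filter_var('8080', FILTER_VALIDATE_INT,
    array('options' => array('min_range' => 1, 'max_range' => 65535)));
var_dump($port); // int(8080)

// Input from superglobals can be filtered directly without touching $_GET:
// $id = filter_input(INPUT_GET, 'id', FILTER_VALIDATE_INT);
```

A phone or postal-code filter would presumably slot into the same `FILTER_VALIDATE_*` family, which is what makes the one-call interface so convenient.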

Unless you've been living under a rock you probably know that Firefox 2.0 was released today. Although, it seems that someone on Mozilla's team is definitely living in a cavern, since the official site is still linking to Firefox 1.5. From a developer perspective Firefox 2.0 introduces a number of interesting features, which are explained in detail on the Firefox 2 for developers site. The thing that attracted my attention was the support for the OpenSearch standard pioneered by A9 (Amazon), something IE7 also supports. The nature of this feature allows you to "push" your own site's search into the browser's search list for the searchbox, thereby providing a neat and consistent way for the user to find content. This is surprisingly simple to do, as you can tell from the excerpts taken from FUDforum code (yes, the next release will have support for this feature) which you can find at the bottom. The other very handy addition to Firefox (something Safari had for quite some time) is the integr...
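Since the FUDforum excerpts themselves are not reproduced in this excerpt, here is a minimal sketch of the key piece: an OpenSearch description document (the site name and URLs below are placeholders, not FUDforum's actual values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- opensearch.xml: tells the browser how to query the site's search. -->
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>My Site</ShortName>
  <Description>Search My Site</Description>
  <Url type="text/html" template="http://example.com/search?q={searchTerms}"/>
</OpenSearchDescription>
```

The browser discovers this file via a `<link rel="search" type="application/opensearchdescription+xml" ...>` tag in the page head, at which point Firefox 2 and IE7 offer to add the site to the searchbox.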

There are many instances where you may want to see what kind of PHP settings other people are using, and what better source of this information than the phpinfo() page. The problem with finding a reliable pool of such pages is that a basic search often contains many blog, forum, bugs.php.net and similar entries, which are copy & paste outputs from users. This may be fine in some instances, but what if you just want the real phpinfo() pages? The answer is surprisingly simple. To get the data you simply need to search for an element always present on the phpinfo() page, such as the "Zend Scripting Language Engine" string, and then for a user-agent containing the indexing bot of your favorite search engine. Among the data displayed by the phpinfo() page is a header containing the browser-provided User-Agent field, which is always populated by respectable crawlers such as the ones used by Google and Yahoo. The presence of this value guarantees that the page shown will be an actual page, rather than a copy in...
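Put together, the trick is just two quoted strings that should only co-occur on a phpinfo() page that was actually served to the crawler. A query along these lines illustrates the idea (the exact operators and the crawler's user-agent string vary by search engine, so treat this as a sketch rather than a recipe):

```
"Zend Scripting Language Engine" "Googlebot"
```

The first string pins down the phpinfo() output itself; the second matches the User-Agent value the page echoed back when the bot fetched it.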

The 5.2.0 release is turning out to be quite an adventure; we can't seem to get the bloody thing out. Hopefully RC6 will be the last release candidate, but given that I've said that about the last 3 RCs, who knows... That said, the delays were not entirely unproductive: each time more bugs were fixed and the language was generally made better, so it is not all bad. The release snapshot is available here: http://downloads.php.net/ilia/php-5.2.0RC6.tar.bz2 (md5: 5a146c08f85d8535c76fe6219281a06e) and win32 binaries will be made available shortly by Edin. As always, I'd like to ask everyone to give this release a try, to make sure no regressions were introduced and that your applications still work with this release. If no major issues are uncovered, maybe, just maybe, 5.2.0 in a week.

Search engines have for a long time been a good helper for people trying to find sensitive information or vulnerabilities on the web. When you have a few billion documents indexed, it is inevitable that some things that should remain private inadvertently end up in public directories and get indexed; then it's just a matter of writing a sufficiently creative search query to find that data. There are even sites that aggregate "interesting" search queries designed to quickly locate sensitive data, such as the Google Hacking Database from "Johnny", which has queries to find everything from old vulnerable software to credit card numbers, etc... There have also been attempts to identify things like SQL injection and XSS by locating sites collecting common forms of input and then checking to see if said input is not validated. A good example of this can be found on Michael Sutton's blog; he used Google to generate statistics identifying the frequency of SQL injections. But this approach does not really show you the...