Google continues to update its crawling and indexing capabilities. Most recently, Google improved its indexing of Flash and rolled out a more robust indexing infrastructure called Caffeine. With the increased use of JavaScript and AJAX, more and more pages now require POST requests to retrieve some or all of their content. If a crawler never issues those POST requests, that content never makes it into the index, and searchers may be deprived of the most comprehensive and relevant results.
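To illustrate the pattern, here is a hypothetical AJAX page whose article body is only retrievable by a POST request fired automatically on load; the endpoint and payload are made up for the example. A crawler that fetches only the HTML sees an empty shell.

```typescript
// Hypothetical AJAX page: the served HTML is an empty shell, and the real
// article body only exists after this POST completes. A crawler that never
// replays the POST indexes a page with no content.
async function loadArticle(articleId: string): Promise<void> {
  const response = await fetch("/api/article", {       // endpoint is illustrative
    method: "POST",                                     // the server does not accept GET here
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id: articleId }),
  });
  const data: { title: string; body: string } = await response.json();

  document.querySelector("#title")!.textContent = data.title;
  document.querySelector("#body")!.innerHTML = data.body;
}

// Fired automatically on page load, not in response to a user action.
window.addEventListener("DOMContentLoaded", () => loadArticle("42"));
```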
As a general rule, GET is the recommended method for fetching data for a page, and it remains Google's preferred method for crawling. As an experiment, Google has started rewriting some POST requests as GET requests. In most cases this works fine, but in others the server expects the parameters in the request body and returns noticeably different content when they arrive as a query string instead. Developers often justify using POST on the grounds that it can carry far more data than a GET query string.
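The sketch below, using a made-up endpoint, shows why the rewrite can fail: if the server only reads parameters from the request body, the same parameters sent in the URL produce a different response.

```typescript
// Sketch of the GET-rewrite experiment against a made-up endpoint. The page
// issues a POST with parameters in the body; rewriting it as a GET moves
// those parameters into the query string.
const params = { category: "shoes", page: "2" };

// Original request as the page issues it: parameters travel in the body.
const asPost = fetch("/api/listing", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams(params).toString(),
});

// Rewritten request: same parameters, now in the URL. If the server only
// reads the body (e.g. PHP's $_POST), this returns different, often empty,
// content -- the marked difference described above. Query strings are also
// length-limited, which is why developers reach for POST with large payloads.
const asGet = fetch("/api/listing?" + new URLSearchParams(params).toString(), {
  method: "GET",
});
```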
Googlebot's POST crawling is limited to requests that the page itself triggers automatically, so in effect it mimics what a typical user's browser would do. Googlebot performs a POST request only when doing so is believed to be safe and appropriate, and this fine-tuning of the process will no doubt improve over time.
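Google has not published the exact criteria it uses; the sketch below only illustrates the kind of conservative checks a crawler might apply before replaying an automatic POST (same origin, no credentials, not triggered by a user action). The ObservedRequest type and the helper are assumptions for the example, not Googlebot's actual logic.

```typescript
// Illustrative only: a conservative filter a crawler might apply before
// replaying a POST that the page triggered on its own. These are NOT
// Google's published criteria.
interface ObservedRequest {
  method: string;
  url: URL;
  triggeredByUserAction: boolean;  // e.g. a form submit or button click
  hasCredentials: boolean;         // cookies or auth headers attached
}

function isSafeToReplay(req: ObservedRequest, pageOrigin: string): boolean {
  return (
    req.method === "POST" &&
    req.url.origin === pageOrigin &&   // stay on the same site
    !req.triggeredByUserAction &&      // only requests a browser would fire just by loading the page
    !req.hasCredentials                // skip anything that looks account-specific
  );
}

// Example: an XHR fired on DOMContentLoaded to fetch the article body.
const automaticXhr: ObservedRequest = {
  method: "POST",
  url: new URL("https://example.com/api/article"),
  triggeredByUserAction: false,
  hasCredentials: false,
};
console.log(isSafeToReplay(automaticXhr, "https://example.com")); // true
```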