It has been about a month since the P2 update. I have about 6-8 search terms that I watch, and in the time since the update I have watched the SERPs on those terms jump all over the place on a cycle of a little over 36 hours. On some of the more heavily congested terms I have seen the cycle take 72+ hours. Then the SERP from the previous period returns, with maybe only one or two changes in position in the report. Over this same time, I have seen links that had disappeared completely from the SE report come back, even though the website no longer exists and now returns a 404.

This observation suggests to me that the updates and realignments are using a bubble sort in conjunction with a merge sort. I have no idea why, because the running times are so slow: O(n^2) for the bubble sort and O(n log n) for the merge sort, so the combined time is dominated by the O(n^2) term. For large data sets, there are better sorting algorithms.

There also seem to be other weighting variables that I have not been able to determine. Some seem to correspond with domain authority, and others seem to correspond more closely with page authority. There also seems to be an element of raw link counts used in SERP placement that, for the links I can trace back, does not correspond with good/bad link quality.

These observations suggest to me that Google is using a sorting algorithm and historical link data covering at least two years in an attempt to determine which sites may have achieved their positions through spam links and other disapproved methods, versus what Google considers appropriate/natural link building. This seems to be based on link velocity, i.e. how fast links are built over some time period.

Has anyone else observed this? Does anyone have any thoughts or ideas that may help support or disprove this observation?
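To put the complexity point in concrete terms, here is a rough Python sketch (purely illustrative; it has nothing to do with Google's actual implementation, which none of us can see) that counts the comparisons each sort performs on the same input. It shows why, if both sorts really were combined, the bubble sort's O(n^2) cost would dominate:

```python
import random


def bubble_sort_comparisons(items):
    """Plain bubble sort; returns (sorted list, comparison count)."""
    a = list(items)
    comps = 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            comps += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comps


def merge_sort_comparisons(items):
    """Top-down merge sort; returns (sorted list, comparison count)."""
    if len(items) <= 1:
        return list(items), 0
    mid = len(items) // 2
    left, cl = merge_sort_comparisons(items[:mid])
    right, cr = merge_sort_comparisons(items[mid:])
    merged, comps = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comps


# n = 2,000 items: bubble does n(n-1)/2 comparisons (~2 million),
# merge does at most about n*log2(n) (~22,000).
data = random.sample(range(10_000), 2_000)
sorted_b, bc = bubble_sort_comparisons(data)
sorted_m, mc = merge_sort_comparisons(data)
print(bc, mc)
```

At n = 2,000 the gap is already roughly two orders of magnitude, which is why running a bubble sort over large SERP data sets would be an odd engineering choice.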