Google has officially announced a recent change to its search algorithm intended to identify the original sources of content and reduce search engine spam, much of which results from misused "fair use" provisions. Under fair use and redistribution, bits of original content are scattered all over the web; occasionally they get reassembled in similar forms or copied outright, and thanks to some ranking advantage, those copies have ended up receiving the credit and the traffic for the duplicated content. In past cases where one website had the right to use a portion of another's pages, or to build a new work on them, and that website happened to be a PageRank mega giant, the more obscure of the two lost out — you know what happens. Many reputable sources report huge decreases in traffic.
For search engine optimization, it boils down to Google's claim that results will favor authority based on authorship rather than popularity. Popularity is the other cause for concern mentioned above: even if you are not successful in drawing traffic to your page naturally, the change means no one else gets to take what you did and try it for themselves; in the past, when they did, the original website's reputation could be damaged by the factors Google used to pick a "relevant" result from identical or similar sources. Whether this will make the internet a better place is not nearly as obvious as the truth will eventually be. Low-quality pages with original information may not present it in a way that serves the user, and that is where one would still expect things like page rank and traffic rank to decide what lands on the popular SERPs alongside "fair use" material.
Who ultimately profits from this? To quote Matt Cutts of Google, this change "primarily affects sites that copy others’ content and sites with low levels of original content". If all goes according to writers' expectations, newcomers will appear in very tight niches, and more players could join the SEO game as copy-and-paste "pretenders" are left holding websites that are ticking time bombs. New subjects and pawns for link-building schemes will be created, whether they know it or not. Without knowing how many webpages sit in the "sandbox" (excluded from results), the optimization-minded can watch the result count reported for a given search — "About 41,500 results (0.18 seconds)" — as a rough tally of the total pages matched. If determining originality is a primary function of the new algorithm change, that total should shift as excluded sites are restored and spammy reprint outlets go away.
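Tracking that "About N results" figure over time is easy to do by hand, but a small script makes the comparison repeatable. The sketch below is a minimal, hypothetical example: it assumes you have copied the SERP status line for the same query on two different dates, parses out the estimated count, and reports the change. Negative numbers would suggest pages being dropped from the results for that query.

```python
import re


def parse_result_count(status_line: str) -> int:
    """Extract the estimated result count from a SERP status line
    such as 'About 41,500 results (0.18 seconds)'."""
    match = re.search(r"About ([\d,]+) results", status_line)
    if not match:
        raise ValueError(f"no result count found in: {status_line!r}")
    return int(match.group(1).replace(",", ""))


def count_change(before: str, after: str) -> int:
    """Difference in the estimated count between two snapshots of the
    same query; a negative value suggests pages were excluded."""
    return parse_result_count(after) - parse_result_count(before)


# Hypothetical snapshots of the same query, taken a week apart.
snapshot_old = "About 41,500 results (0.18 seconds)"
snapshot_new = "About 39,200 results (0.21 seconds)"
print(count_change(snapshot_old, snapshot_new))  # prints -2300
```

The count Google displays is an estimate, not an exact index size, so the script is only useful for spotting large swings, not small ones.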