I would like to open with the words of the other genius I unabashedly worship, Frank Herbert:
"Once men turned their thinking over to machines in hope that this would set them free. But that only permitted other men with machines to enslave them."
These words are spoken by the Reverend Mother Gaius Helen Mohiam in Herbert's science fiction masterpiece, Dune (1965). When I first read them, I was struck by the idiosyncrasy of the thinking they implied - neither avoiding nor fetishizing machines, but treating them directly as what they are: fulcrums. No matter how elaborate the process, the ultimate kernel that determines the function of a machine is the human will whose resources have been directed to creating it. The revelation that Goldman Sachs has essentially been burglarizing the stock market with powerful computers capable of responding to market conditions faster than ordinary traders immediately put me in mind of this quote.
Something that may be obvious to some, but probably has not occurred to the vast majority of us: since the Goldman Sachs trading code story broke, I would call it close to a certainty that every wealthy foreign government and large corporation on the planet began (if they were not already in the midst of) planning their own gargantuan server farm projects to exploit this general principle. It may already have been known in the abstract, but suddenly here was blatant proof that an organization could achieve insane profit margins simply by having incrementally faster computers than everyone else.
Now, I'm simplifying here - there is also the matter of the code, which determined what specific market decisions were made, but in essence Goldman Sachs's advantage came from the fact that its system made the same decisions as other professionals at a faster rate. If it could execute the same decision 1.5 seconds faster than others, that could mean thousands to millions of dollars in extra profit on a single transaction. Aggregated over a large number of transactions, the extra profit from this seemingly marginal advantage can add up to more than the entire annual economy of a middle-income country. We see the concept of machines as fulcrums powerfully demonstrated.
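To make the aggregation concrete, here is a back-of-the-envelope sketch in Python. Every number in it - the per-trade edge, the trade count - is invented purely for illustration, not drawn from Goldman Sachs's actual figures:

```python
# Hypothetical illustration: how a small per-trade speed advantage
# compounds across a year of high-frequency trading.
# All inputs below are made-up numbers, chosen only to show the scale.

def yearly_edge(extra_profit_per_trade: float,
                trades_per_day: int,
                trading_days: int = 252) -> float:
    """Aggregate a per-trade advantage over a trading year
    (252 is the usual count of U.S. trading days)."""
    return extra_profit_per_trade * trades_per_day * trading_days

# Suppose reacting 1.5 seconds faster nets an average of just $100
# extra per trade, on 50,000 automated trades per day:
total = yearly_edge(100.0, 50_000)
print(f"${total:,.0f}")  # → $1,260,000,000
```

A $100 edge looks trivial next to any single trade, yet under these assumed volumes it aggregates to over a billion dollars a year - which is the whole point about marginal advantage at machine speed.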
So now, unless wolves no longer lick their teeth at the smell of venison, the computational arms race has begun. There will be attempts to regulate markets in such a way as to mitigate the instabilities caused by this acceleration of existing day-trader patterns into the millisecond realm, but since governments and their constituent power blocs will likely be among the strongest players, it is unlikely that there will be any serious attempt to stop traders from obtaining advantage from speed.
Eventually, a handful of concerns - probably a hodgepodge of legitimate and covert interests - would tend to accumulate the vast majority of wealth through continuous "skimming" of market fluctuations, and both the extent and the arrogance of their interference in political affairs would tend to increase over time until they were effectively functioning as governments themselves. Organizations that already wield government power would likely come to function extralegally, since they would no longer depend on normal tax revenue and legislative appropriations to exist and operate.
While this is, quite necessarily, broad speculation, we can now see a logical and direct path from an otherwise marginal technological advantage to literal world domination. This is where the rubber of Singularity fantasy meets the road of human reality: The increasing speed and (artificial) intelligence with which organizations who wield such resources can manipulate and control the global economy is no threat to those organizations themselves - programs that don't work as intended simply fall by the wayside, replaced by those that do. Software and computational evolution is not empowering machines, it is empowering owners of machines, and rapidly steepening the curve of economic (and thereby political) dominance each marginal increment of technological advantage affords.
Daily Kos itself is an example of this. It wasn't just our broad-based fundraising, but also our sheer speed that made the Obama campaign so agile and capable. We debunked in hours lies that the McCain campaign had spent a week strategizing, and disseminated counterattacks far and wide through an efficient distribution network while they were still operating under the assumption that their bullshit had succeeded. Our example offers the hope that, no matter how powerful the aforementioned organizations become through cynical applications of technology, a network of human brains operating in parallel will still contain within it the aggregate capacity to overwhelm them.
Frank Herbert presented this concept obliquely in his Dune universe through retrospective references to historical events, chiefly the "Butlerian Jihad." Essentially, in the distant past of the universe in which the Dune novels take place - which would still be far in our future - people lived under the tyrannical control of machine entities that had evolved from, and (based on the Mohiam quote above) likely still served in some measure as, tools of oligarchic human interests. These machines probably functioned by modeling human behavior, closely predicting the responses of populations and thereby controlling them. (I say "probably" because I don't regard Brian Herbert's prequels as canon, because, quite frankly, they're abominable trash that doesn't even rise to the level of credible, let alone worthy.)
Anyway, what apparently happened is that people evolved within these machine-controlled societies, and over time individuals came along whose capabilities were not, in themselves, remarkable enough to be easily extirpated, but which, when fed into the overall system of the society, created feedback loops beyond the machines' ability to effectively model. A half-assed, dumbed-down version of this kind of thinking is evident in "The Matrix Reloaded," where the character known as The Architect refers to his attempts to control "systemic anomalies," but I'm much less interested in cyberpunk, so you'll just have to pursue that yourself if that's your deal. The chief character with such abilities is Serena Butler (fleshed out, alas, only in those same prequels), whose name the overthrow of the machines in the Dune universe bears: the Butlerian Jihad.
All of which leads to a fascinating conclusion: technology's ability to control humans has the potential to exert evolutionary pressure that ultimately defeats that control. But this is not as linear a progression as it seems, since there is no necessity that it occur in only one cycle, or that the cycles repeat identically rather than feeding back into one another. We might even imagine such a recursive process converging on some kind of Singularity scenario, although I doubt very much that would happen in the lifetime of anyone alive today.
Rather, I think we personally are going to see the technology arms race become rather desperate for our side (the people), and we're not even really going to be sure what's going on as it's happening. Still, I would say it is at least a decade or two before we really start to lose control of information. In the meantime, we might begin considering, as an intellectual exercise, how the world (and people like us in particular) might deal with immensely powerful, hugely rich organizations able to disseminate both economic instructions and false information faster than anyone else is even physically capable of responding.