OMG, you ARE an Idiot!
Software Development jobs in SF Bay area
Monday, March 24, 2014
The public cloud is the way of the future: Urs Hölzle, Google technical infrastructure
Urs Hölzle oversaw the creation of the world’s largest computer. It’s a machine that spans the globe — from The Dalles, Oregon to Hamina, Finland to Quilicura, Chile — and you use it every day. It’s called Google.
Google's Data Center Engineering team builds and operates the most efficient, global-scale data centers in the world.
In 1999, Hölzle was a computer science professor at the University of California, Santa Barbara, when Larry Page and Sergey Brin asked him to help rethink the hardware and software underpinning the Google search engine. At the time, Google Search ran on about a hundred computer servers in a single Northern California data center, and over the next 15 years, alongside some of the brightest minds in computer science, Hölzle would transform this tiny collection of machines into a global network of data centers that operate very much like a single system, allowing Google to run a vast empire of web services, from Google Search to Gmail to Google Maps to Google Apps.
With an air of extreme confidence — not to mention a diamond stud earring — the Swiss native heads Google’s technical infrastructure team, known as “TI” inside the company. When he discusses other Google teams, Hölzle calls them his “customers.” The Search team is a customer, and so are the Gmail and Google Maps teams. He and his TI engineers provide the infrastructure — the global computer — that these teams use in delivering their web and mobile services to millions upon millions of people. He is, in short, the man most responsible for ensuring that these Google services run as efficiently as they do.
But this past January, Hölzle sent a thunderclap of a corporate memo across the company, laying out a new direction for both his team and the entire Google empire. In the months to come, he wrote, he and his team would be giving a little less attention to internal “customers” like Google Search and Gmail, so that they could concentrate on serving a new kind of customer outside the company. They were preparing a major expansion of the company’s cloud computing services — services that let outside businesses or software developers run their own software atop Google’s global infrastructure. “We will spend the majority of our development efforts on this New World,” wrote Hölzle. “Every developer will want to live in this world…and it’s our job to build it.”
Urs Hölzle details Google's adoption of the new, open-source infrastructure technology OpenFlow.
Google has long offered cloud services that let outsiders build websites and other applications without buying, installing, or operating their own computer hardware. It unveiled a service called Google App Engine in 2008, and in 2012, it followed with a sister service, Google Compute Engine. But in this market — a market that represents the future of computing — Google has always trailed Jeff Bezos and Amazon, who pioneered the idea. And for years, Hölzle and company treated cloud computing as a sideline. But he now says Google is intent on turning this into an enormous business, a business whose revenues could even surpass what the company pulls in from online advertising.
That may seem like a stretch. Online advertising has made Google one of the richest companies on earth. But his point is that the potential market for cloud computing is far greater than what’s available from advertising, even in the rapidly growing world of smartphone and tablet ads. “It has become clear that the public cloud is the way of the future,” Hölzle says during an interview at Google headquarters, looking more like a session musician than a computer scientist, with his earring, his closely cropped beard, and uplift of salt-and-pepper hair. “One day, this could be bigger than ads. Certainly, in terms of market potential, it is.”
Google Mountain View (Global HQ)
To be sure, Hölzle is looking years down the road. And he admits that both Google and the larger cloud computing market have a long way to go. But many others say that cloud computing is poised to expand in enormous ways, slowly eating away at the $600-billion information-technology market that spans all the hardware and software the world’s businesses use to run their operations. According to James Staten, an analyst with tech research outfit Forrester, cloud computing will account for about 15 percent of this IT market by 2020 — that’s $40 billion — and much like Google, Amazon believes cloud services could become its biggest business.
At the moment, Amazon still dominates the cloud computing world, after inventing the market with services like the Elastic Compute Cloud, a way of instantly running software, and the Simple Storage Service, a means of storing large amounts of data. And other rivals abound, including Microsoft, with its Windows Azure service, and Rackspace, with its Rackspace Cloud. But Hölzle believes that Google’s vast infrastructure gives the company an edge — in terms of speed, efficiency, price, and more. “We’ve done much of the hard lifting already for our internal needs,” his memo read. “We have a latent advantage in this business.”
This isn’t far from the truth. In serving up Google Search and Gmail and the rest of its own web products, Google operates a much larger online infrastructure than even Amazon or Microsoft — a 2010 study said that Google’s global network was larger than all but one of the companies that provide the backbone for the entire internet — and it’s widely acknowledged that this massive computing system is the world’s most technically advanced. Over the last fifteen years, as the Google empire expanded to unprecedented sizes, Hölzle and his team, including engineers such as Jeff Dean and Sanjay Ghemawat and Luiz Barroso, designed entirely new kinds of data center hardware and software in order to keep up with this growth, and the rest of the computing world is now striving to imitate these technologies.
The question is whether Google can translate this into a different kind of success. History has shown that the best technology doesn’t always win the day. Marketing can play a role, but so can timing — or just plain luck. “Google has the best infrastructure. They have the best engineers. They have the best software. I firmly believe that,” says Mike Miller, a founder of a cloud computing company called Cloudant, which is now owned by Google rival IBM. “But the trick is that’s not always enough. Think back to Betamax versus VHS.”
‘The World’s Biggest Cloud’
Google’s new push begins tomorrow morning. At an event in San Francisco, the company will unveil several changes to its portfolio of cloud services, while providing a look “behind the scenes of the world’s biggest cloud.” In a rare public appearance, Hölzle will give the keynote, laying out his vision for the future of the market.
Google Data Center Locations
Like other cloud giants, Google and Hölzle aim to make life easier for anyone who’s building a new website or a new mobile app, storing or processing large amounts of data, or just trying to see if some code will run. Rather than setting up their own infrastructure, businesses and developers can just open up a web browser and run their software on Google’s network. Many are already doing this. According to Google, its cloud services now run 4.75 million active applications, including names like SnapChat and Pulse. Google App Engine alone handles 28 billion online requests a day, or about 10 times more than Wikipedia, one of the largest sites on the web. But the process should be far easier than it is today, Hölzle says, and Google has already taken a big step in this direction.
Today, if you build a website on Google App Engine, the service will automatically expand the site across more and more machines as more and more people visit it. The problem is that you can’t just hoist any software onto the service. You have to build your site in a certain way, using specific languages, software libraries, and frameworks. The company’s other service, Google Compute Engine, addresses that issue, giving you raw virtual machines, or VMs, where you can run anything you like. But the onus is on you to manage these VMs — to spin up more, for instance, as you need them. This is a tradeoff you see on most cloud services. But, Hölzle tells us, Google has now bridged the gap. “What you really want is to mix and match,” he says, indicating the company will unveil a service that combines the advantages of App Engine and Compute Engine.
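To make that tradeoff concrete, here is a minimal sketch of what "building your site in a certain way" looked like on the App Engine side in 2014: the standard Python runtime expected a WSGI application written against a supported framework such as webapp2, and scaling was handled for you. The handler below follows Google's published hello-world pattern; the greeting text and route are purely illustrative.

import webapp2

class MainPage(webapp2.RequestHandler):
    def get(self):
        # App Engine spins up more instances of this handler automatically
        # as traffic grows; the developer never provisions or manages a VM.
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.write('Hello from App Engine')

# The runtime serves whatever WSGI application object app.yaml points at.
app = webapp2.WSGIApplication([('/', MainPage)], debug=True)

On Compute Engine, by contrast, the same page could be served by any stack you like, but provisioning the VM, installing the runtime, and adding machines under load would all be your job.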
He also says that cloud computing is too expensive. Indeed, many small businesses and developers complain there are situations where cloud services are far more expensive than buying and operating your own gear. It would seem that companies like Amazon charge an inexplicably high premium for their services — at least in certain cases. But Hölzle believes that because Google is operating at such an enormous scale, it can help solve this problem as well. Just as a massive retailer like WalMart can drive down the cost of toothpaste, Google can drive down the cost of compute cycles. “For cloud success, you not only have to be technically good,” he says, hinting that the company will also significantly drop prices at tomorrow’s event. “You have to economically beat the alternative.”
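As a rough illustration of what "economically beat the alternative" means for a small shop, here is a hypothetical back-of-envelope comparison. Every number in it is an assumption invented for the example, not a published Google or Amazon price.

HOURS_PER_MONTH = 730  # average hours in a month

# Renting: assumed on-demand price for one mid-sized virtual machine.
cloud_vm_per_hour = 0.10          # USD per hour, hypothetical
cloud_monthly = cloud_vm_per_hour * HOURS_PER_MONTH

# Owning: assumed server price amortized over three years, plus power/colo.
server_purchase = 3000.0          # USD, hypothetical
amortization_months = 36
power_and_hosting_monthly = 40.0  # USD per month, hypothetical
owned_monthly = server_purchase / amortization_months + power_and_hosting_monthly

print("cloud VM:     $%.2f per month" % cloud_monthly)
print("owned server: $%.2f per month" % owned_monthly)

Under these made-up numbers the owned box comes out cheaper per month, which is exactly the complaint Hölzle is responding to; the cloud's case rests on utilization, staffing, and scale economies that a sketch like this leaves out.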
That’s certainly true. And like Google’s efforts to dovetail the worlds of App Engine and Compute Engine, a price drop is certainly welcome. But it remains to be seen how far such changes can take the company. Forrester analyst Staten believes that Google remains a long way from taking control of the market. “Google is still playing catchup,” he says. On the whole, he explains, Google’s services aren’t as mature as offerings from the likes of Amazon and Microsoft, meaning they don’t offer quite as many tools for rapidly building and running software. That said, he does think Hölzle and company will contend with the Amazons and the Microsofts and the IBMs in the long term, as they all fight for a slice of that $600-billion pie.
The Snowden Problem
The added rub is that these rivals are fighting more than just each other. Many of the world’s businesses are still reluctant to run their applications or store their data in the cloud, fearing it will compromise their privacy and their security. This is a particular worry now that ex-government contractor Edward Snowden has exposed the NSA’s efforts to eavesdrop on Google and other big-name web companies. What’s more, in many parts of the world, government regulations prohibit businesses from storing data outside of local borders — something that can’t always be accommodated in the cloud, where data can flow freely from place to place. But in typical Google fashion, Hölzle believes that the world will evolve to the point where people realize their data is actually safer on Google and where government regulations bend for global cloud services.
He acknowledges that regulatory issues are largely outside of Google’s control, but he says security is another matter. In the future, he explains, most security experts will realize that your data is actually more secure if you trust it to Google and other cloud services than if you don’t. “Cloud based systems are often a step ahead because they evolve more rapidly,” he says, before boasting of Google’s track record in particular. “If you look at the long history of NSA disclosures, you actually see in these documents that we are harder to crack than most companies.”
Others take a slightly different view of how this market will evolve. Lucas Carlson, a cloud computing veteran who now oversees services at internet service provider CenturyLink, believes there will always be businesses that insist on running their software in-house for security, regulatory, and other reasons. “I think there will always, forevermore, be a balance between the public and the private,” he says. But he also believes that public cloud services like Google Compute Engine and his own CenturyLink Cloud will continue to expand their reach, and according to Forrester’s James Staten, this is already happening at an unprecedented clip. Despite security and regulatory concerns, cloud services are doubling their share of the IT market each year.
As this market grows, the bigger question is what role Google will play. It certainly has the technology and the talent to close the gap on Amazon. And now it has the ambition as well. Urs Hölzle makes some bold statements about how far these ambitions can take the company. But they shouldn’t be discounted. Google is many things — a search company, an ad company, a map company, a phone company, a company that dabbles in everything from computerized eyewear to self-driving cars — but more than anything else, it’s a company that has mastered the art of global computing. No one would be surprised if this now becomes its biggest business.
Cade Metz
Cade Metz is the editor of Wired Enterprise
Please e-mail him: cade_metz at wired.com.
Follow @cademetz on Twitter.
Pioneer Stereo Receiver Model SX-850
This Pioneer SX-850 vintage receiver sounds great! It is in very good condition, with an excellent phono stage and a tuner that picks up everything. Clean, high power: tested at 87 watts RMS per side at less than 0.05% distortion. These Pioneer receivers are really well built in Japan. The Pioneer AM/FM stereo receiver model SX-850 was introduced in 1976 and manufactured until 1977, rated at 85 watts per channel RMS with a price tag of $500.00 back then. Features include separate bass and treble controls with turnover switches (200 Hz and 400 Hz for bass, 5 kHz and 25 kHz for treble), a tone on/off switch, FM MPX filter and muting on/off switches, low and high filters, two tape inputs/outputs, a duplicate switch, an adaptor switch that lets you add an equalizer, two phono inputs and one aux input (the phono two selector doubles as the mic selector), stereo/mono mode, loudness on/off, a mic input, a muting switch, and both signal and tuning meters. This receiver was built to last and perform for many years. I have repaired several of these. I will try to find the service manual; it's around here somewhere. These sound great. You will never see a receiver built like this again.
Pioneer Stereo Receiver Model SX-850
Specifications:
Power Output: 85 watts per channel RMS into 8 ohms.
Harmonic Distortion: Less than 0.05%
Intermodulation Distortion: Less than 0.03%
FM Sensitivity: 1.8uV
FM Capture Ratio: 1.0 dB
FM Harmonic Distortion (Mono): Less than 0.15%
FM Harmonic Distortion (Stereo): Less than 0.3%
Dimensions: 20.75 x 6.75 x 16.25 inches
Weight: 42.00 lbs
Tuesday, March 18, 2014
NSA’s Facebook Malware Denial
Compare the NSA’s Facebook Malware Denial to its Own Secret Documents
A top-secret NSA presentation reveals how the agency used Facebook to hack into targeted computers for surveillance. On Wednesday, Glenn Greenwald and I revealed new details about the National Security Agency’s efforts to radically expand its ability to hack into computers and networks across the world. The story has received a lot of attention, and one detail in particular has sparked controversy: specifically, that the NSA secretly pretended to be a Facebook server in order to covertly infect targets with malware “implants” used for surveillance.
This revelation apparently infuriated Facebook founder Mark Zuckerberg so much that he got on the phone to President Barack Obama to complain about it. “I’ve been so confused and frustrated by the repeated reports of the behavior of the US government,” Zuckerberg wrote in a blog post Thursday. “When our engineers work tirelessly to improve security, we imagine we’re protecting you against criminals, not our own government.”
That wasn't all. Wired ran a piece saying that the NSA’s widespread use of its malware tools “acts as implicit permission to others, both nation-state and criminal.” Slate noted that the NSA’s hacking platform appears to be “becoming a bit more like the un-targeted dragnets everyone has been so upset about.” Meanwhile, Ars Technica wrote that the surveillance technology we exposed “poses a risk to the entire Internet.”
In response, the NSA has attempted to quell the backlash by putting out a public statement dismissing what it called “inaccurate” media reports. The agency denied that it was “impersonating U.S. social media or other websites” and said that it had not “infected millions of computers around the world with malware.” The statement follows a trend that has repeatedly been seen in the aftermath of major disclosures from documents turned over by NSA whistleblower Edward Snowden, in which the NSA or one of its implicated allies issues a carefully worded non-denial denial that on the face of it seems to refute an allegation but on closer inspection does not refute it at all.
Prior to publishing our story, we asked the NSA to explain its use of Facebook to deploy malware as part of a top-secret initiative codenamed QUANTUMHAND. The NSA declined to answer all of our questions or offer context for the documents. We went into meticulous detail in our report, which went through a rigorous fact-checking process because of the gravity of the revelations. What we reported, accurately, was that the Snowden files showed how the agency had in some cases “masqueraded as a fake Facebook server, using the social media site as a launching pad to infect a target’s computer and exfiltrate files from a hard drive.” The source for that detail was not plucked from thin air; it was rooted in multiple documents that refer to the technique in action, including the internal NSA animation that we published.
A particular short excerpt from one of the classified documents, however, has taken on new significance due to the NSA’s statement. The excerpt is worth drawing attention to here because of the clarity of the language it uses about the Facebook tactic and the light it shines on the NSA’s denial. Referencing the NSA’s Quantum malware initiative, the document, dated April 2011, explains how the NSA “pretends” to be Facebook servers to deploy its surveillance “implants” on targets’ computers:
It is difficult to square the NSA secretly saying that it “pretends to be the Facebook server” while publicly claiming that it “does not use its technical capabilities to impersonate U.S. company websites.” Is the agency making a devious and unstated distinction in its denial between “websites” and “servers”? Was it deliberate that the agency used the present tense “does not” in its denial as opposed to the past tense “did not”? Has the Facebook QUANTUMHAND technique been shut down since our report? Either way, the language used in the NSA’s public statement seems highly misleading – which is why several tech writers have rightly treated it with skepticism.
The same is true of the NSA’s denial that it has not “infected millions of computers around the world with malware” as part of its hacking efforts. Our report never actually accused the NSA of having achieved that milestone. Again, we reported exactly what the NSA’s own documents say: that the NSA is working to “aggressively scale” its computer hacking missions and has built a system called TURBINE that it explicitly states will “allow the current implant network to scale to large size (millions of implants).” Only a decade ago, the number of implants deployed by the NSA was in the hundreds, according to the Snowden files. But the agency now reportedly manages a network of between 85,000 and 100,000 implants in computer systems worldwide – and, if TURBINE’s capabilities and the NSA’s own documents are anything to go by, it is intent on substantially increasing those numbers.
The rapid proliferation of these hacking techniques in the past decade, under cover of intense secrecy, is extraordinary and unprecedented. The NSA insists in its denial that its hacking efforts are not “indiscriminate.” Yet how the agency defines “indiscriminate” in this context remains unclear. The Intercept asked the NSA to clarify some of these issues for this post. Does the agency deny that it has used the QUANTUMHAND method to pretend to be a Facebook server in order to deploy malware implants? How does the NSA distinguish “indiscriminate” from “discriminate”? In what specific legal, policy, and operational context does the implants system function? The agency declined to answer all of these questions. Instead, spokeswoman Vanee’ Vines said that the NSA stood by its original statement, adding only that “unauthorized and selective publication” of the documents “may lead to incorrect assumptions.”
The NSA’s outgoing chief has claimed that the agency supports increased transparency in the wake of the Snowden leaks – but its response to the latest disclosures illustrates that it is failing to live up to that commitment. If the NSA truly wants to gain citizens’ trust, it should rethink its slippery public relations strategy. A good first step would be to stop issuing dubious denials that seem to sit so starkly at odds with what its officials were saying in secret when they thought nobody would ever learn about what they were doing.
By Ryan Gallagher, 15 Mar 2014
Friday, March 14, 2014
NSA denies infecting millions of PCs
Summary: The US National Security Agency (NSA) has denied claims that it conducts indiscriminate hacking and says it doesn’t impersonate US social media or websites.
"Recent media reports that allege NSA has infected millions of computers around the world with malware, and that NSA is impersonating US social media or other websites, are inaccurate," the NSA said in a statement to media yesterday.
The statement followed reports based on classified NSA documents from whistleblower Edward Snowden that revealed the existence of Turbine, an NSA system that allowed the agency to perform automated control of malware implants "by groups instead of individually".
The Turbine capabilities appeared around 2009, marking a departure from the agency's old approach, in which manually deployed implants were reserved for targets that couldn’t be monitored through traditional wiretaps, according to the report on Tuesday by The Intercept.
According to the report, Turbine was built to compensate for the human limitations around hacking at scale. Turbine became part of the NSA's elite hacking squad, the Tailored Access Operations unit, enabling it to conduct "industrial-scale exploitation" and manage "millions of implants".
The Intercept's report did not allege that the NSA actually used the system to infect millions of people's computers, and pointed to previous reports based on Snowden documents that put the number of implants deployed by the agency at between 85,000 and 100,000. And while Turbine may make it capable of attacking users by group rather than individually, the NSA has denied it operates indiscriminate cyber attacks. It also appears to have denied a claim that it had spoofed a Facebook server to phish its targets.
"NSA's authorities require that its foreign intelligence operations support valid national security requirements, protect the legitimate privacy interests of all persons, and be as tailored as feasible. NSA does not use its technical capabilities to impersonate U.S. company websites. Nor does NSA target any user of global Internet services without appropriate legal authority. Reports of indiscriminate computer exploitation operations are simply false," it said. "NSA uses its technical capabilities only to support lawful and appropriate foreign intelligence operations, all of which must be carried out in strict accordance with its authorities. Technical capability must be understood within the legal, policy, and operational context within which that capability must be employed." The nominee to head up the NSA US Navy vice admiral Michael S Rogers earlier this week outlined how the agency handles zero day flaws in software and devices, which are one of the key assets it uses to exploit computers. According to Rogers, the NSA's default position is to disclose software vulnerabilities to vendors of the affected product. But that position stands in contrast the $25m it spent on acquiring zero day flaws from third-party security firms, which could otherwise have sold or reported them to the vendor.
Thursday, March 13, 2014
The Snowden Files: Story of the year?
The Edward Snowden files: the revelations about the extent of NSA surveillance have been a real eye opener for some. Here is the Edward Snowden timeline from one of my top news web sites, slashdot.org: