On Thursday, some Windows Vista users began finding Service Pack 1 in Windows Update, even though the upgrade isn't supposed to be available broadly until the middle of March.
Microsoft acknowledged the error. "Yesterday, a build of SP1 was posted to Windows Update and it was inadvertently made available to a broad group. The build was intended only for our more technically advanced testers, and was meant to only be offered to those with a specific registry key set on their PC," Microsoft said in a statement. It also reiterated plans to make SP1 broadly available in mid-March.
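Microsoft has not published the name of that registry setting, so the snippet below is only a minimal sketch, with a hypothetical key path and value name, of how a per-machine flag like the one described could be checked using Python's standard winreg module.

```python
# Minimal sketch: checking a per-machine registry flag of the kind Microsoft
# describes. The key path and value name are HYPOTHETICAL placeholders;
# Microsoft has not published the actual setting used to gate the SP1 offer.
import winreg

HYPOTHETICAL_KEY = r"SOFTWARE\Contoso\VistaSP1Beta"   # placeholder path
HYPOTHETICAL_VALUE = "OfferSP1"                        # placeholder value name

def sp1_flag_is_set() -> bool:
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, HYPOTHETICAL_KEY) as key:
            value, _type = winreg.QueryValueEx(key, HYPOTHETICAL_VALUE)
            return value == 1
    except FileNotFoundError:
        # Key or value absent: this PC would not be offered the test build.
        return False

if __name__ == "__main__":
    print("SP1 test flag set:", sp1_flag_is_set())
```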
Some customers on a Windows Vista forum reported that they successfully downloaded SP1 from Windows Update, but most others said that the download didn't work for them.
The accidental posting to Windows Update follows another recent issue with an update designed as a prerequisite for downloading SP1. Some users, after trying to install the update, got stuck in a reboot cycle. Earlier this week, Microsoft posted a fix for that problem.
Microsoft issued a second refresh of SP1 to beta users in late January, raising hopes that the final version would be out within a couple of weeks. The company had long said that SP1 would come out in the first quarter.
The final broad release of SP1 could boost Vista sales, particularly among enterprise users, because some companies have said that they are waiting for SP1 before upgrading to Vista.
Sunday, February 24, 2008
EA offers $2 billion for Grand Theft Auto publisher
Take-Two Interactive Software, publisher of the popular Grand Theft Auto series of games, has received and rejected a US$2 billion acquisition bid from Electronic Arts but left the door open to a possible acquisition later.
The EA bid, which wasn't made public until shortly before Take-Two announced its rejection Sunday, offered $26 cash per share for Take-Two. At the time the bid was made on Feb. 19, the price represented a 64 percent premium on Take-Two's Feb. 15 closing price of $15.83. It is currently a 49 percent premium on Take-Two's Friday closing price.
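The premium figures are easy to recheck from the numbers given; the short calculation below recomputes the 64 percent premium from the Feb. 15 close and backs out the Friday closing price implied by the 49 percent figure, which the article does not state.

```python
# Recomputing the premiums quoted in the story.
offer = 26.00          # EA's cash offer per share
feb15_close = 15.83    # Take-Two's Feb. 15 closing price

premium_feb15 = (offer / feb15_close - 1) * 100
print(f"Premium over Feb. 15 close: {premium_feb15:.0f}%")   # ~64%

# The Friday closing price is not given in the article; a 49 percent
# premium implies it was roughly:
implied_friday_close = offer / 1.49
print(f"Implied Friday close: ${implied_friday_close:.2f}")   # ~$17.45
```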
In its rejection the board of Take-Two said it judged the bid to be "inadequate in multiple respects."
"Electronic Arts' proposal provides insufficient value to our shareholders and comes at absolutely the wrong time given the crucial initiatives underway at the company," Take-Two Chairman Strauss Zelnick said in a statement.
Take-Two is scheduled to release the latest installment in the popular Grand Theft Auto series, "Grand Theft Auto IV," on April 29. The release of "GTA IV" was slated for October last year, but was delayed in order to give the development team more time for certain game elements. The series has sold more than 65 million copies to date, and the company said that it wants to hold off on talks with EA until after that game hits the market. It therefore proposed to start talks on April 30.
EA had originally told Take-Two the offer was subject to Take-Two agreeing to start talks by Feb. 22, but it noted Sunday that it would hold the offer open "for the present time" in the hope that discussions can begin.
In an open letter to investors CEO John Riccitiello wrote EA believes its offer is a good one for Take-Two shareholders. He said Take-Two's future is uncertain and that "there is a strong likelihood that the company will be sold in the not-too-distant future."
"So, that's it. We've made a proposal to buy Take-Two. Our preference is to make this a friendly transaction and I'm hopeful we can achieve that. We've sent this proposal in the genuine belief that combining EA and Take-Two would be good for the people who make games and good for the people who play them," he wrote.
Goolag makes Google Hacking a snap
The hacking group Cult of the Dead Cow has released a tool that should make Google hacking a little easier for novices.
Called Goolag, the open-source software lets hackers use the Google search engine to scan Web sites for vulnerabilities.
This is something that hackers have been doing for years, but it can be tricky work -- involving custom scripts and tools that sift through the mountain of data available via Google.
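To illustrate the idea, the sketch below assembles a few classic "Google dork" search strings of the kind Johnny Long has catalogued; the target domain and the specific query patterns are made-up examples, and the script only prints search URLs rather than automating queries.

```python
# A minimal sketch of "Google hacking": composing search queries that surface
# files a site probably never meant to expose. The domain and query patterns
# here are illustrative examples only.
from urllib.parse import quote_plus

TARGET = "example.com"   # hypothetical site being audited

DORKS = [
    'site:{d} filetype:sql "insert into"',              # leaked database dumps
    'site:{d} intitle:"index of" "parent directory"',   # open directory listings
    'site:{d} filetype:log inurl:error',                # exposed error logs
    'site:{d} ext:bak | ext:old | ext:swp',             # stray backup/editor files
]

for pattern in DORKS:
    query = pattern.format(d=TARGET)
    print(f"https://www.google.com/search?q={quote_plus(query)}")
```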
The Cult of the Dead Cow is best known for creating the Back Orifice software 10 years ago, which could be used to remotely control a Windows machine.
Like Back Orifice, the software could be used by both legitimate security professionals and criminals. Goolag comes with an easy-to-use graphical interface. It is based on techniques developed by Computer Sciences Corp. researcher Johnny Long, a well-known computer hacker who has spent years documenting the way that Google's search engine can be used to uncover security vulnerabilities in the Web sites it indexes.
In a statement, The Cult of the Dead Cow said that the software is "one more tool for Web site owners to patch up their online properties."
"It's no big secret that the Web is the platform," the statement said. "And this platform pretty much sucks from a security perspective."
There are already free Web vulnerability search tools available -- such as the Wikto scanning software -- but the Cult of the Dead Cow's notoriety will probably help make Goolag popular, security experts said Friday.
"I don't think it's particularly new, but maybe it makes [Google hacking] more accessible," said Robert Hansen, CEO of Sectheory.com and author of the Ha.ckers.org Web security blog.
"It is interesting because it could theoretically represent a lower burden of entry for the novice Google hacker," he added.
Amichai Shulman, chief technology officer with security vendor Imperva, agreed that there are still far too many security vulnerabilities on Web sites. "Maybe the headlines that this release is getting will serve as a wake-up call for application owners," he said.
Microsoft letter hopeful, vague on Yahoo deal
In a letter to employees, Microsoft put an upbeat spin on its attempt to take over Yahoo.
While noting that no acquisition agreement is in place, Kevin Johnson, president of Microsoft's platforms and services division, wrote that he expects such a transaction to close in the second half of this year. "If and when Yahoo! agrees to proceed with the proposed transaction, we will go through the process to receive regulatory approval, and expect that this transaction will close in the 2nd half of calendar year 2008," he wrote.
Microsoft made its US$44.6 billion offer for Yahoo on Feb. 1. More than a week later, Yahoo rejected the bid as too low. Microsoft maintains that the offer is fair.
Johnson addressed some of the most pressing questions surrounding the potential acquisition in the letter, which Microsoft distributed to the media, but answered few of them definitively.
Acknowledging that there would likely be overlap in terms of staffing, he also noted that Microsoft has hired more than 20,000 people since 2005. "We have no shortage of business and technical opportunities, and we need great people to focus on them," he said. Microsoft would retain locations in both Silicon Valley and Redmond if the deal went through, he said.
He didn't shed any more light on the fate of either company's brands. "It is premature to say which aspects of the brands and technologies we would use in our combined offerings," he said.
Johnson also revealed little about how Microsoft would handle Yahoo's wide use of open-source software in its systems, an issue that some industry watchers have wondered about. Yahoo often uses open-source software in its back-end systems, while Microsoft prefers its own proprietary software. In the past, after acquisitions, Microsoft has sometimes migrated systems to its own software and in other cases maintained the existing software, Johnson said. "Yahoo! has made significant investments in both its skills and technologies, so we would work closely with Yahoo! engineers to make pragmatic platform and integration methodology decisions as appropriate, prioritizing above all how those decisions would impact customers," he said.
Johnson indicated that the process of integrating the companies would be critical to a combination's success. He pointed to recent Microsoft acquisitions, including aQuantive and Tellme, as examples of successful integrations.
Earlier this week, The New York Times reported that Microsoft planned to soon launch a proxy fight to replace Yahoo's board and force the takeover in a hostile bid. Neither company confirmed that report.
Johnson reiterated Microsoft's belief that a combination of the two companies would create a "more compelling alternative in search and online advertising," something that major media companies are looking for, he said.
Motorola finds new counter for shrinking pile of beans
Motorola President and CEO Greg Brown added another piece to the company's new management team on Friday with the announcement that Paul Liska will become executive vice president and chief financial officer.
Liska, who has been a partner in several private equity firms and played financial and general executive roles in transportation, publishing and retail companies, will take over Motorola's finances on March 1. Tom Meredith, who has been acting CFO since last year, will remain on Motorola's board and help Liska with the transition, the company said in a statement. It praised Meredith for cost-cutting efforts.
Motorola's last permanent CFO, David Devonshire, resigned last March. The company had run into rough waters after it failed to come up with a popular successor to the slim Razr clamshell phone. Former President and CEO Ed Zander handed those two jobs over to Brown in November, though he remains chairman until the next Motorola shareholder meeting in May.
Since Brown took Zander's place, Chief Technology Officer Padmasree Warrior has also left, and the company has said it might spin off its handset business.
Motorola has fallen behind both Nokia and Samsung in the hotly contested mobile-phone market, but its handset division still brought in US$4.8 billion of the company's US$9.6 billion revenue in the fourth quarter of last year. The company as a whole saw revenue fall from $11.8 billion a year earlier and earnings per share drop to $0.04 from $0.25.
Developers: OpenSocial OK, but needs tuning
Google's OpenSocial initiative to simplify the creation and adaptation of applications for social-networking sites pursues a valuable goal, but its technology platform needs further improvement.
That's the consensus from several developers who have been testing the OpenSocial APIs (application programming interfaces) and the OpenSocial implementations, or "containers," of participating Web sites.
However, the technical bumps they have encountered, while annoying and frustrating, haven't prompted them to give up on OpenSocial. Instead, the developers remain hopeful that the project, announced almost four months ago, will continue to mature.
Chris McCormick, a games industry contractor based in Australia, has encountered "a few rough edges" when working with OpenSocial, especially bugs in the partner sites' containers, but is "pretty satisfied" with the project.
"The API is intelligently designed and seems to cover all bases quite comprehensively. It should be possible to do some really fun stuff with it," McCormick said via e-mail.
Meanwhile, Aakash Bapna, an information sciences student in Bangalore, has also run into technical issues. "Bugs, bugs and lots of bugs. There are lots of issues with OpenSocial specs as they are launched. You can't tell when your smoothly working application can break," he said via e-mail.
For Bapna, a big hole is the unavailability of the server-side REST (Representational State Transfer) API, which will allow applications to tap servers, something that Thiago Santos, a Brazilian developer of an upcoming application called Partyeah, also misses.
Like McCormick, Santos also has encountered many bugs in partner site containers. Santos would also like Google to do a better job of communicating changes and updates to OpenSocial components. Still, he's confident OpenSocial will get over its growing pains eventually. "I have no doubt that [OpenSocial's promise] will be fulfilled," Santos said via e-mail.
That promise is to establish a standard application-development platform for social applications so developers don't have to remake an application for each social-networking site. While Facebook hasn't signed up for OpenSocial, other big social-networking sites have, like MySpace, Bebo and LinkedIn, as well as major enterprise software players like Oracle and Salesforce.com, which see the emergence of social features within business applications.
With OpenSocial, developers will be able to build the core portions of social applications and then adapt them if necessary, with, they hope, minor tweaks and changes for specific sites.
"It's not 'write once, run everywhere.' It's more 'learn once and write everywhere.' You learn the OpenSocial model once. For most applications there will be a core of code that's common to all platforms," said Patrick Chanezon, developer advocate at Google.
Then it's likely that participating Web sites will make available to developers additional extensions in their OpenSocial containers, allowing developers to take advantage of specific features in their sites that aren't included in the standard, Chanezon said.
Developers don't seem worried that OpenSocial will splinter if partner sites add too many proprietary functions to their containers. "I think it should be reasonably easy to write apps that run on all social-networking sites that support OpenSocial without much modification," McCormick said. "The core of OpenSocial contains the most important parts of the social-networking experience ... Anything which does end up adding something drastically new and wonderful will more than likely become part of the standard anyway."
Regarding the technical bumps, Google was clear that the first version of the OpenSocial APIs, labeled 0.5, was far from final, and that it was putting it out in the market in order to get feedback from developers. Now, with version 0.7, Google says that developers can create production applications. Moreover, OpenSocial's technology will continue to improve. "If it turns out this round of OpenSocial provides good applications and we want to get to stellar applications, we'll enhance it," said David Glazer, an engineering director at Google.
The server-side REST API is also coming, but Google and its partners need to agree on the exact way it will be done, Chanezon said. "It will be super-useful for mobile applications," he said. Mobile phones whose browsers aren't powerful enough to run the OpenSocial JavaScript APIs will take advantage of this REST API to get needed data from a server.
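Because the REST interface had not been finalized when this was written, the fragment below is only a rough sketch of the sort of server-side call it is meant to enable; the container host, user ID and endpoint path are all hypothetical.

```python
# Rough sketch of the kind of server-side REST call the coming API is meant to
# enable: a device with a weak browser asks a container's server for a user's
# profile data instead of running the JavaScript APIs locally. The endpoint
# URL below is HYPOTHETICAL; the spec had not been finalized at press time.
import json
import urllib.request

CONTAINER = "https://social.example.com"   # placeholder container host
USER_ID = "12345"                          # placeholder user ID

def fetch_profile(container: str, user_id: str) -> dict:
    url = f"{container}/people/{user_id}/@self"   # draft-style path, not final
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    profile = fetch_profile(CONTAINER, USER_ID)
    print(profile.get("displayName"))
```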
Google is also working on a security technology for OpenSocial applications called Caja, which the company calls an open-source JavaScript "sanitizer" that aims to provide a security layer to prevent the spread of phishing scams, spam and malware via applications.
Also in the works is Shindig, an open-source reference implementation of OpenSocial overseen by The Apache Software Foundation, whose purpose is to let Web site operators implement an OpenSocial container in a matter of hours.
Meanwhile, Google's social-networking site Orkut will soon make OpenSocial applications available to its end-users, as will some of the other participating sites. "That's what we're looking forward to: opening the doors and watching the party get started," Glazer said.
AOL's Userplane, a maker of Web-based communication applications, has been involved in the OpenSocial effort and is eager to see it continue to evolve, said Userplane CEO Michael Jones. "As application developers, we're excited about reducing the code we have to write, so I love the concept behind OpenSocial," Jones said.
"Although it has some uncertainties, I feel we're seeing an initiative that can have a great role in the future," Santos said.
Friday, February 22, 2008
17 arrested in Canadian hacking bust
Quebec provincial police conducted raids on Wednesday, breaking up a hacking ring that police say is responsible for an estimated CDN$45 million (US$44.3 million) in damage to computer systems.
The hackers installed remote-controlled "botnet" software on victims' computers in order to run phishing and spamming operations, said Capt. Frederick Gaudreau, of the Surete du Quebec, in a videotaped press conference posted to the police agency's Web site. "The hackers managed to install botnets on the victims' computers, which permitted them to control at a distance the victims' computers," he said. "These said computers were then used to attack Web sites in order to steal victims' data."
If convicted of computer hacking charges, the accused could face 10 years in prison, he said.
Although the hackers operated from about a dozen towns all over Quebec, their botnet network was international in scope, infecting 39,000 computers in Poland, 28,000 in Brazil, and 26,000 in Mexico -- the top three countries affected by the group. In all, they hacked into more than 100,000 computers in 100 countries.
The accused range in age from 17 to 26, but police did not release their names. Three of them are minors, Gaudreau said.
This is the first time that Canadian authorities have dismantled such a network, he added. The investigation was done in collaboration with the Royal Canadian Mounted Police.
Europe makes moves towards Internet censorship
A debate over the use of Internet filtering is heating up in Europe, with privacy advocates and carriers going head to head with authorities.
In Finland programmer Matti Nikki is under investigation for publishing a secret list of domains that authorities had allegedly censored in an effort to stop the spread of child pornography. Nikki published his list to prove the system was being abused, and was himself censored as a result. The Finnish Chancellor of Justice has received a complaint about police handling of the matter.
The authorities distribute their list to the country's twenty largest Internet service providers, which then block access to the sites. The rest of Finland's 200 ISPs haven't implemented the technology, so protection is far from complete.
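In outline, each participating ISP checks requested hostnames against the police-supplied list; the fragment below is a simplified sketch of such a lookup, with placeholder domains, while real deployments usually sit at the DNS or proxy layer.

```python
# Simplified sketch of list-based domain blocking of the kind described above.
# Real ISP deployments typically hook this into DNS resolvers or HTTP proxies;
# the list contents here are placeholders, not the actual police list.
BLOCKLIST = {
    "blocked-example-1.test",
    "blocked-example-2.test",
}

def is_blocked(hostname: str) -> bool:
    host = hostname.lower().rstrip(".")
    # Block the listed domain and any of its subdomains.
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

for name in ("blocked-example-1.test", "www.blocked-example-1.test", "example.org"):
    print(name, "->", "BLOCKED" if is_blocked(name) else "allowed")
```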
The problem with filtering is that it is a very blunt tool, according to Swedish Internet activist Oscar Swartz.
"I have seen the list Nikki published and it includes links to sites with regular pornography, so they shouldn't be censored," said Swartz.
The Finnish police force is aware of the problems with filtering.
"The technology we currently use works well with sites that only include child pornography. To filter sites with a mixture of content we need to use other technologies as well," said Lars Henriksson, chief superintendent at the National Bureau of Investigation.
Finland isn't the only country where the temperature is rising. Danish authorities recently decided to block file-sharing site The Pirate Bay, after pressure from the International Federation of the Phonographic Industry (IFPI). ISP Tele2 decided to fight the court order. It is so far the only ISP that has been ordered to shut off access to The Pirate Bay, but IFPI has plans to expand the blocking.
Other organizations are starting to show an interest in the use of filtering, including mobile network operators. They are banding together to combat the distribution of child pornography.
"We are here to tackle a very disturbing and damaging phenomenon," said Craig Ehrlich, chairman of the GSM Association, a group of mobile network operators, launching the initiative at a conference in Barcelona last week.
The use of emotive issues to justify the introduction or extension of censorship worries some.
"It's easy to ignore the negative aspects of filtering and censorship when talking about something so universally disliked as child pornography," said Swartz.
But state censorship proposals don't stop there: the European Union's Justice and Security Commissioner Franco Frattini called last September for ISPs to block access to Web sites hosting information about bomb-making, and U.K. Home Secretary Jacqui Smith said in January that she wanted action taken against sites that encouraged terrorism, including social networking sites.
Such actions could have wider consequences: "If the E.U. starts to filter sites related to piracy, terrorism and child pornography, it will have some serious effects on the freedom to communicate," said Swartz.
White spaces group: Device testing on track
A wireless broadband device tested by the U.S. Federal Communications Commission for interference with television and wireless microphone signals has not failed, as a broadcasting group claimed last week, members of the White Spaces Coalition said Thursday.
The National Association of Broadcasters (NAB) on Feb. 11 said a prototype device submitted by Microsoft lost power during tests being run by the FCC. The power failure comes after another white spaces device malfunctioned in tests run by the FCC last year.
But Ed Thomas, a tech advisor to the White Spaces Coalition and a former chief of the FCC's Office of Engineering and Technology, said Thursday that while the device's power supply failed after many hours of continuous testing, the power failure did not cause the device to interfere with television signals.
Thomas, during a press briefing, said the NAB was engaged in "rhetoric" designed to complicate the FCC's device testing. "Let this be based on science, not politics," Thomas said of the ongoing testing at the FCC. "Let the facts prevail."
The White Spaces Coalition, including Microsoft, Philips, Dell and Google, is asking the FCC to allow wireless devices to operate in the so-called white spaces of the television spectrum, space allocated for television signals but vacant. The coalition wants the white spaces opened up to give consumers more wireless broadband options, and the white spaces devices would be targeted at longer-range broadband than traditional Wi-Fi.
If the FCC approves the devices this year, commercial white spaces wireless devices could be available as soon as late 2009.
The FCC's in-house testing of four devices will continue for a couple more weeks, then the agency will conduct field tests for up to eight weeks. A second white spaces device has experienced no power failure problems, Thomas said.
But television broadcasters have opposed the coalition, saying it's likely that the wireless devices will interfere with TV signals. The NAB has suggested the FCC should focus instead on a successful transition of TV stations to digital broadcasts, required by February 2009.
White spaces devices are "not ready for prime time," said Dennis Wharton, the NAB's executive vice president.
Wharton responded to Thomas' assertion that the Microsoft device did not interfere with TV signals.
"The devices they've tested haven't performed the way they were expected to perform," Wharton added. "That, in our view, constitutes a failure."
Open APIs may help Microsoft repair reputation
If Microsoft executes effectively on its new interoperability promises, it could repair its tarnished reputation in the technology industry and help the company get out of its own way to compete more effectively with Google.
At first glance, Microsoft's news on Thursday that it would provide access to documentation for its major software products, including Windows Vista, Office 2007 and Exchange Server 2007, appeared to be a way to appease the European Commission in its ongoing antitrust case. It also seemed an acknowledgment that Microsoft can't ignore the open-source community's impact on its business and prominence in the industry any longer.
"[The news] validates and places a Microsoft acknowledgment that the open models that have emerged -- which Microsoft has denied almost as vociferously as tobacco companies have fought the idea that smoking causes cancer -- are a perfectly reasonable way to go," said Nick Selby, a senior analyst and research director at The 451 Group.
Still, many remain skeptical: providing easier access to APIs (application programming interfaces), and vowing to allow developers to build open-source implementations on those APIs without interfering, doesn't mean Microsoft is a friend to open source, or that the company will change how it does business. Already, open-source companies like Red Hat are adopting a wait-and-see approach to the news -- and rightfully so, as Microsoft has cloaked its own business interests in interoperability announcements before. For example, last year Microsoft struck a so-called interoperability pact with Linux vendor Novell, while at the same time saying the company would go after people who violated more than 200 patents Microsoft says it holds for technologies in Linux.
But Thursday's news could, if played correctly, repair the long-held notion in the industry that Microsoft is a proprietary bully that buries anyone who jumps in its sandbox. By making a companywide commitment to being more transparent about its technology and friendly to open-source developers and companies that build interoperable technology, Microsoft proves it realizes it can no longer embrace proprietary principles -- and expect the entire industry to go along with it.
"This is the new Microsoft," said Chris Swenson, an analyst at NPD Group. "They really are changing." However, he acknowledged that because of Microsoft's previous business practices and reputation, it's highly likely that "no one is going to give them credit for it."
Still, people should keep an open mind about Microsoft's extension of a new olive branch to open source, he said. If critics take a few steps back, they'll see that Microsoft's decision did not happen overnight.
Microsoft's new attitude is the result of many years of antitrust tussling, berating at the hands of the open-standards community and product-interoperability challenges that have inspired the company to change its ways in order to stay relevant, analysts said. Under increased global pressure, the company has been slowly coming around to the idea of open source -- through key initiatives like the Open Specification Promise -- over the past few years.
Mike Gilpin, an analyst with Forrester, suggested that many of Microsoft's recent executive changes also represent a shift in mind-set to a more open policy, and noted the rise of executives such as Bill Hilf, general manager of platform strategy and a former IBM Linux specialist, as part of this attitude adjustment.
"I wouldn't be surprised if there wasn't a relationship between the two things," he said. "This does come from the top. I think in the way this is being communicated inside of Microsoft, it places a lot of requirements on developers and product managers to behave in a certain way -- and if they don't do that, they'll be in a lot of trouble with [Chairman] Bill [Gates] and [CEO] Steve [Ballmer]."
Gilpin acknowledged that he has always been skeptical of Microsoft's intentions toward being more open and transparent, but in the past two years, he said the company "has really changed its stripes around interoperability."
In a blog post on Thursday, Hilf himself noted that Microsoft's new commitment has evolved over time, though he called the changes to Microsoft's strategy "broad-reaching" and said they "go above and beyond any prior incremental changes in Microsoft's DNA."
These changes are not only happening because of market forces that have given rise to the success of open source, but also because Microsoft has suffered from its own proprietary legacy. Aside from its embroilment in lengthy and costly antitrust cases both in the U.S. and overseas, a lack of support for open standards and interfaces also have hurt the adoption of its technology. By being more open, the company could also be more successful in areas where it has struggled, like the Internet, analysts said.
For example, when Microsoft created a new version of its Internet Explorer browser, IE 7, to keep up with the latest Internet standards -- and to compete with Mozilla's Firefox browser -- many people who'd built sites to work with previous versions of IE found they no longer worked because they had been designed to support Microsoft's proprietary technologies. In trying to do the right thing and support more open and generally supported technologies, Microsoft found that its own proprietary software got in the way of its best intentions.
In fact, the changing business models on the Internet that have made Google so successful are another example of where Microsoft could have benefited if it had embraced open standards and more technological transparency sooner, Selby said. Google right away gave developers access to APIs to create a community around its Web-based products and services -- and used this fact to criticize Microsoft, he said.
Microsoft's decision to be more open takes a bit of the wind out of the sails of that argument, he added. "It's a simple way to do the right thing and also manage a poke in Google's eye," Selby said.
Providing more open access to technologies also could give Microsoft leverage if it is indeed successful in its bid to purchase Yahoo, which recently said it would open up more APIs to developers in its own pursuit of Google.
Hard drive encryption has Achilles heel
If you think that encrypting your laptop's hard drive will keep your data safe from prying eyes, you may want to think again, according to researchers at Princeton University.
They've discovered a way to steal the hard drive encryption key used by products such as Windows Vista's BitLocker or Apple's FileVault. With that key, hackers could get access to all of the data stored on an encrypted hard drive.
That's because of a physical property of the computer's memory chips. Data in these DRAM (dynamic RAM) chips disappears when the computer is turned off, but it turns out that this doesn't happen right away, according to Alex Halderman, a Princeton graduate student who worked on the paper.
In fact, it can take minutes before that data disappears, giving hackers a way to sniff out encryption keys.
For the attack to work, the computer would first have to be running or in standby mode. It wouldn't work against a computer that had been shut off for a few minutes because the data in DRAM would have disappeared by then.
The attacker simply turns the computer off for a second or two and then reboots the system from a portable hard disk, which includes software that can examine the contents of the memory chips. This gives an attacker a way around the operating system protection that keeps the encryption keys hidden in memory.
"This enables a whole new class of attacks against security products like disk encryption systems that have depended on the operating system to protect their private keys," Halderman said. "An attacker could steal someone's laptop where they were using disk encryption and reboot the machine ... and then capture what was in memory before the power was cut."
Some computers wipe the memory when they boot up, but even these systems can be vulnerable, Halderman said. Researchers found that if they cooled down the memory chips by spraying canned air on them, they could slow down the rate at which memory disappeared. Cooling chips down to about -58 degrees Fahrenheit (-50 degrees Celsius) gave researchers time to power down the computer and then install the memory in another PC that would boot without wiping out the data. "By cooling the chips we were able to recover data perfectly after 10 minutes or more," Halderman said.
Led by Princeton University, the team included researchers from the Electronic Frontier Foundation and Wind River Systems.
U.S. states have enacted a series of tough data disclosure laws over the past five years which force companies to notify residents whenever they lose sensitive information. Under these laws, a missing laptop can cost a company millions of dollars as well as public embarrassment as it is forced to track down and notify those whose data was lost.
However, many state laws, such as California's SB 1386, make an exception for encrypted PCs. So if a company or government agency loses an encrypted laptop containing sensitive data, it is not compelled to notify those affected.
The team's research may spur legislators to rethink that approach, Halderman said. "Maybe that law is placing too much faith in disk encryption technologies," he said. "It may be that we're not hearing about thefts of encrypted machines where that data could still be at risk."
Laws like SB 1386 treat encryption as if it's a "magic spell" and ignore the fact that there's such a thing as bad encryption, said encryption expert Bruce Schneier, who is chief technology officer with BT Counterpane.
The underlying problem is that if someone gains access to your machine, it is very difficult to protect the data on your hard drive, Schneier said. "That's an extremely hard problem for a lot of reasons, and this is one example of that."
Hardware-based encryption would probably reduce the risk, Halderman said, but he agreed that "it's a difficult problem."
Hard-drive makers Seagate and Hitachi both offer hardware-based disk encryption options with their hard drives, although these options come with a premium price tag.
EMC buys Pi to round out cloud computing unit
Storage giant EMC continues its push into consumer territory: Its latest move is to acquire Pi, a company whose software and services will help users keep track of their personal data.
Seattle-based Pi develops software and online services that let users control how they find, access, share and protect their photos, videos and music. The data can be stored online or locally.
The company name stands for personal information, not the number 3.14.
The rapidly growing amount of personal data is what prompted EMC to open its wallet, according to CEO Joe Tucci. It's a cash transaction, but EMC won't disclose the amount.
Pi hasn't actually launched any products or services yet; they are still in beta testing, according to EMC.
EMC sees Pi not only as part of its consumer push, but also as an element of its cloud computing strategy, which one analyst calls the next big thing in storage.
"Cloud computing is the next storage hype. It's all about moving storage, back up, and even clock cycles to the net," said Per Sedihn, chief technology officer at Swedish storage integrator Proact.
EMC expects to complete the deal during the first quarter, at which point Pi and its 100 employees will join EMC's newly minted Cloud Infrastructure and Services Division. That division already includes Mozy, an online backup service, and Fortress, a platform for cloud-based services. Pi founder and CEO Paul Maritz, a former Microsoft executive, will join EMC's executive management team as president and general manager of the division.
EMC is far from the only company interested in the area. Amazon launched its Simple Storage Service (S3) two years ago; it provides data storage through a Web-services interface.
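As an illustration of what such a Web-services storage interface looks like from a developer's perspective, here is a minimal sketch using boto3, a later Python client for S3; the bucket name, object key and file names are placeholders, and credentials are assumed to be configured in the environment.

```python
# Minimal sketch of storing and retrieving an object over S3's Web-services
# interface using boto3 (a client that postdates this article).
import boto3

s3 = boto3.client("s3")

# Upload a local file as an object.
with open("photo.jpg", "rb") as f:
    s3.put_object(Bucket="example-bucket", Key="backups/photo.jpg", Body=f)

# Download it again.
obj = s3.get_object(Bucket="example-bucket", Key="backups/photo.jpg")
with open("photo-copy.jpg", "wb") as out:
    out.write(obj["Body"].read())
```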
Proact's Sedihn also likes Nirvanix, a company that counts Intel among its investors. "They have a very nice user interface," said Sedihn, adding that Google is also waiting in the wings.
"I think cloud services will mainly be used by consumers and smaller companies. But I also expect larger companies to build their internal infrastructure with this model," said Sedihn.
Thursday, February 21, 2008
RIM gains despite outages
Consumers and enterprise workers are flocking to the BlackBerry despite recent embarrassing glitches that have shut down service for hours on a few occasions.
Research In Motion on Thursday boosted its forecast for subscriber account additions in its fiscal fourth quarter, which ends March 1. Back in December, RIM predicted 1.82 million new accounts, but it now expects that number to be 15 percent to 20 percent higher. That would mean a total of about 14 million subscriber accounts at the end of the quarter. Final results will be revealed April 2. The company's revenue and profit forecast hasn't changed.
The Waterloo, Ontario, company raised its forecast during a difficult month. Last week, BlackBerry users in North America lost the mobile e-mail and data service for about three hours in an incident RIM blamed on recent upgrades to an internal routing system. Then, some North American users reported the service down on Wednesday morning this week. RIM said scheduled maintenance slowed down delivery of some customers' e-mail. (Another outage in late January was caused by the AT&T Wireless network.)
BlackBerry has gotten black eyes before, notably an outage last April in North America that lasted overnight. In 2006, users lived for several months in fear of a service shutdown in the U.S. sought by NTP, which sued RIM alleging patent infringement. RIM eventually settled the suit in March 2006, agreeing to pay more than US$600 million.
The problems shone a spotlight on RIM's reliance on a proprietary architecture and the fact that all messages have to go through its network operations center (NOC). These factors could make RIM vulnerable to a single point of failure, some analysts said. But in reality, BlackBerry devices probably aren't any less dependable than mobile e-mail systems from other vendors, such as Microsoft, Palm and Nokia's Intellisync, they said.
BlackBerry service can be managed through a BlackBerry Enterprise Server within an organization, but RIM is now making a push for consumers with its BlackBerry Internet Service, which can be ordered from a carrier.
In the fourth quarter of last year, RIM had a leading 41 percent share of the U.S. smartphone market and more than doubled its worldwide share to 11.4 percent, according to research company Canalys. Users like the security of RIM's system, its support for IBM Lotus Notes in addition to Microsoft Exchange and the fact that RIM's NOC handles the connections to all mobile operators that carry BlackBerry devices, analysts said.
Document format battle takes shape ahead of meeting
Microsoft faces a tough battle starting Monday at a meeting in Geneva that will influence how widely the company's latest document format will be used in the future.
Representatives of national standards bodies worldwide will attend the ballot resolution meeting (BRM) held by the International Organization for Standardization (ISO). They'll be focused on revising the specifications for Microsoft's Office Open XML (OOXML), which the company hopes will become an ISO standard.
Although OOXML has already been approved by an industry standards body, Ecma International, the ISO designation is key, since governments look to the ISO when choosing technical standards.
OOXML failed to become an ISO standard during a vote last September, but it has another chance if enough countries can agree on the revisions. Those countries will then have one month to vote on the new specification after the BRM.
But Microsoft faces stiff opposition from companies and industry groups behind OpenDocument Format (ODF), which was approved by the ISO in 2006 as a standard. Those opponents contend that having more than one document standard makes software purchasing decisions harder for organizations.
In fact, those opponents are staging their own conference in the same venue in Geneva as the ISO meeting.
OpenForum Europe, an organization supporting ODF and open standards, has invited prominent OOXML critics and advocates of open standards to speak. They include Vint Cerf, vice president and chief Internet evangelist at Google, and Håkon Wium Lie, chief technology officer of Opera, the Oslo-based browser developer.
Neither the timing nor the choice of venue was a coincidence, said Graham Taylor, chief executive of OpenForum Europe. The organization has also scheduled its sessions so they don't conflict with the BRM, allowing delegates to attend.
The shrewd timing is clearly aimed at sinking OOXML, which critics say is overly complex and favors Microsoft in intricate, technical ways, even though the specification is open.
"We think there are a much wider set of issues that need to be considered by the national bodies when they come to make their vote," Taylor said.
Microsoft believes there is room for more than one standard. "We do not fundamentally believe that you have a uniform single view of technology ... in order to have interoperability," said Jason Matusow, senior director of interoperability, on Wednesday during a company event with journalists in London.
Microsoft also cites several projects under way to create translators that convert documents from OOXML to ODF, and vice versa. However, Microsoft argues that the features of OOXML, a version of which is now used in Office 2007, are richer than ODF's.
The meeting of the two sides at one venue has led some to speculate about heightened tension around what's already been an acrimonious debate. But Taylor said Microsoft representatives will attend OpenForum Europe sessions, and that there won't be any "heckling."
Taylor said he has assured the BRM conveners there will be no trouble. Press and observers can attend OpenForum Europe sessions, but the BRM is open only to official delegates from the 87 countries participating.
After the BRM is over, countries will look at the revisions to OOXML and then cast a vote. To become an ISO standard, a specification must win the support of two-thirds of national standards bodies that participated in work on the proposal, known as P-members. It also must receive the support of three-quarters of all voting members.
During the September vote, OOXML fell short, winning support from only 53 percent of voting P-members, below the two-thirds needed. Among all voting members, OOXML received 74 percent, 1 percentage point shy of the mark.
This time around, countries are allowed to change their votes, adding another element of uncertainty around OOXML's fate. If the format is not approved, it means Microsoft might be forced to rethink its strategy around document formats if it wants government IT contracts.
Either way, the sheer dominance of Microsoft's Office suite means some version of OOXML will be used for years to come. The company said its partners are already using it in their own applications, but ODF supporters counter that no vendor has come close to fully implementing the 6,000-page specification.
One of Microsoft's partners is Fractal Edge, a U.K. company that makes software for building visual representations of complex financial data, which it calls "fractal maps." But displaying the fractal maps in older Excel versions required sending an additional configuration file to make the map compatible with Microsoft's binary file format, said Gervase Clifton-Bligh, vice president of product strategy.
The company has written an add-in for Excel 2007 to display the maps. OOXML container files can easily hold additional elements such as graphics -- or map configuration files.
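That flexibility comes from the packaging: an OOXML document is an ordinary ZIP archive whose parts can be listed and extended with standard tools. Here is a minimal sketch in Python; the file names are placeholders, and a real add-in would also register the new part in the package's content types and relationships.

```python
# OOXML files (.docx, .xlsx, .pptx) are ZIP packages, so extra parts such as
# graphics or configuration files can be carried inside them. File names here
# are placeholders; registering the part in [Content_Types].xml and the
# package relationships is left out of this sketch.
import zipfile

# List the parts inside an existing workbook.
with zipfile.ZipFile("report.xlsx") as pkg:
    for name in pkg.namelist():
        print(name)

# Append a custom configuration part alongside the standard ones.
with zipfile.ZipFile("report.xlsx", "a") as pkg:
    pkg.write("map-config.xml", arcname="customParts/map-config.xml")
```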
Whether OOXML is a standard won't make a huge difference to the company's business, since 100 percent of its customers use Excel, Clifton-Bligh said. But if other companies store their data in Open XML -- even if they are using a different spreadsheet program -- it would be easier to move that data into Excel, he said.
"We won't make an add-in for every spreadsheet," Clifton-Bligh said.
The British Library isn't taking a stand on whether OOXML should become an ISO standard or not, said Richard Boulderstone, director of e-Strategy.
The library is facing the long-term problem of how to continue to make its digital collection available. Universal agreement and implementation of a standard is most helpful, Boulderstone said. Also important is how a standard is built into products.
"You can create any kind of standard but there's always going to be different implementations," he said, adding that those characteristics can affect how a document is archived and viewed in the future.
EU drafts guidelines for RFID technologies
The European Commission has sketched out guidelines designed to help get RFID (radio frequency identification) technologies up and running in the European Union, but stopped short of proposing formal legislation in the area.
The Commission said Thursday that it has drawn up a draft text that aims to help the makers of RFID technology, as well as potential users, introduce the technology without harming privacy rights.
The Commission recommends that producers of RFID chips conduct a privacy assessment before marketing their wares, while industries that plan to use the chips should sign up to a code of conduct outlining how the chips should be used. Industries using RFID technology should agree on a symbol to attach to the goods that carry the chips to alert customers to their presence, the Commission proposed. It also suggests that the chips should deactivate automatically at the point of purchase.
RFID chips used with perishable items such as milk could alert consumers if products go bad, but such a service should be optional, said Commission spokesman Martin Selmayr.
"You should be able to decide whether to allow your milk carton to communicate with your fridge, for example," he said at a news conference.
The Commission has opened an eight-week consultation, which ends April 25, with interested parties, including industry, consumer and privacy groups. It hopes to adopt the recommendations in the summer.
Ensuring that the potentially invasive technology respects people's right to privacy is essential if it is to take off, Selmayr said.
"The new technology will only take off in a sound environment where data protection is safeguarded," he said.
RFID could revolutionize logistics operations by allowing companies to trace their goods from the factory to the shop shelf.
Three kinds of RFID chips are currently in use in Europe:
-- Passive RFID tags do not need a power supply of their own; the tiny voltage induced by the radio frequency signal emitted by the reader is enough to activate their circuit and send a short burst of digital information in response. Typically, this information is a unique identification number that points to an entry in a database (a minimal lookup sketch follows this list).
-- Semi-passive RFID tags have built-in batteries and do not require energy induced from the reader to power the microchip. This allows them to function with much lower signal power levels and over greater distances than passive tags. They are, however, considerably more expensive.
-- Active RFID tags have an on-board power supply of their own, usually a battery. This allows more complex circuits to be powered and more functionality.
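To make the passive-tag model concrete, here is a purely illustrative sketch of a reader resolving a tag's identification number to a back-end database record, and of that record being deactivated at the point of purchase as the Commission recommends; all identifiers and records are invented.

```python
# Purely illustrative sketch of the passive-tag model described above: the tag
# carries only an ID; a back-end database maps that ID to a record, and the
# record is flagged as deactivated ("killed") at the point of sale. All IDs
# and product records are invented for this example.
from dataclasses import dataclass

@dataclass
class TagRecord:
    product: str
    active: bool = True

registry = {
    "E200-3411-B802": TagRecord("1L milk carton"),
    "E200-3411-B803": TagRecord("Wool sweater"),
}

def read_tag(tag_id: str) -> str:
    record = registry.get(tag_id)
    if record is None or not record.active:
        return "no response"          # unknown or deactivated tag
    return record.product

def deactivate_at_checkout(tag_id: str) -> None:
    if tag_id in registry:
        registry[tag_id].active = False

print(read_tag("E200-3411-B802"))     # "1L milk carton"
deactivate_at_checkout("E200-3411-B802")
print(read_tag("E200-3411-B802"))     # "no response"
```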
Six hundred million RFID tags, almost all passive, were sold in the E.U. in 2006, according to Commission research. That number is predicted to rise to around 300 billion by 2016, the Commission said.
More information on the RFID issue can be found at the Commission's Web site.