Thursday, February 21, 2008

RIM gains despite outages

Consumers and enterprise workers are flocking to the BlackBerry despite recent embarrassing glitches that have shut down service for hours on a few occasions.
Research In Motion on Thursday boosted its forecast for subscriber account additions in its fiscal fourth quarter ending March 1. Back in December, RIM predicted 1.82 million new accounts, but now it expects that number to be higher by 15 percent to 20 percent. That will mean a total of about 14 million subscriber accounts at the end of the quarter. Final results will be revealed April 2. The company's revenue and profit forecast hasn't changed.

The Waterloo, Ontario, company raised its forecast during a difficult month. Last week, BlackBerry users in North America lost the mobile e-mail and data service for about three hours in an incident RIM blamed on recent upgrades to an internal routing system. Then, some North American users reported the service down on Wednesday morning this week. RIM said scheduled maintenance slowed down delivery of some customers' e-mail. (Another outage in late January was caused by the AT&T Wireless network.)

BlackBerry has gotten black eyes before, namely during an April outage in North America that lasted overnight. In 2006, users lived for several months in fear of a service shutdown in the U.S. sought by NTP, which sued RIM alleging patent infringement. RIM eventually settled the suit in March 2006, agreeing to pay more than US$600 million.

The problems shone a spotlight on RIM's reliance on a proprietary architecture and the fact that all messages have to go through its network operations center (NOC). These factors could make RIM vulnerable to a single point of failure, some analysts said. But in reality, BlackBerry devices probably aren't any less dependable than mobile e-mail systems from other vendors, such as Microsoft, Palm and Nokia's Intellisync, they said.

BlackBerry service can be managed through a BlackBerry Enterprise Server within an organization, but RIM is now making a push for consumers with its BlackBerry Internet Service, which can be ordered from a carrier.

In the fourth quarter of last year, RIM had a leading 41 percent share of the U.S. smartphone market and more than doubled its worldwide share to 11.4 percent, according to research company Canalys. Users like the security of RIM's system, its support for IBM Lotus Notes in addition to Microsoft Exchange and the fact that RIM's NOC handles the connections to all mobile operators that carry BlackBerry devices, analysts said.

Document format battle takes shape ahead of meeting

Microsoft faces a tough battle starting Monday at a meeting in Geneva that will influence how widely the company's latest document format will be used in the future.
Representatives of national standards bodies worldwide will attend the ballot resolution meeting (BRM) held by the International Organization for Standardization (ISO). They'll be focused on revising the specifications for Microsoft's Office Open XML (OOXML), which the company hopes will become an ISO standard.

Although OOXML has already been approved by an industry standards body, Ecma International, the ISO designation is key, since governments look to the ISO when choosing technical standards.

OOXML failed to become an ISO standard during a vote last September, but it has another chance if enough countries can agree on the revisions. Those countries will then have one month to vote on the new specification after the BRM.

But Microsoft faces stiff opposition from companies and industry groups behind OpenDocument Format (ODF), which was approved by the ISO in 2006 as a standard. Those opponents contend that having more than one document standard makes software purchasing decisions harder for organizations.

In fact, those opponents are staging their own conference in the same venue in Geneva as the ISO meeting.

OpenForum Europe, an organization supporting ODF and open standards, has invited prominent OOXML critics and advocates of open standards to speak. They include Vint Cerf, vice president and chief Internet evangelist at Google, and Hakon Wium Lie, chief technology officer of Opera, the Oslo-based browser developer.

Neither the timing nor the choice of venue was a coincidence, said Graham Taylor, chief executive of OpenForum Europe. The organization has also scheduled its sessions so they don't conflict with the BRM, allowing delegates to attend.

The shrewd timing is clearly aimed at sinking OOXML, which critics say is overly complex and favors Microsoft in intricate, technical ways, even though the specification is open.

"We think there are a much wider set of issues that need to be considered by the national bodies when they come to make their vote," Taylor said.

Microsoft believes there is room for more than one standard. "We do not fundamentally believe that you have a uniform single view of technology ... in order to have interoperability," said Jason Matusow, senior director of interoperability, on Wednesday during a company event with journalists in London.

Microsoft also cites several projects under way to create translators that convert documents between OOXML and ODF. However, Microsoft argues that OOXML, a version of which is now used in Office 2007, offers richer features than ODF.

The meeting of the two sides at one venue has led some to speculate about heightened tension around what's already been an acrimonious debate. But Taylor said Microsoft representatives will attend OpenForum Europe sessions, and that there won't be any "heckling."

Taylor said he has assured the BRM conveners there will be no trouble. Press and observers can attend OpenForum Europe sessions, but the BRM is open only to official delegates from the 87 countries participating.

After the BRM is over, countries will look at the revisions to OOXML and then cast a vote. To become an ISO standard, a specification must win the support of two-thirds of national standards bodies that participated in work on the proposal, known as P-members. It also must receive the support of three-quarters of all voting members.

During the September vote, OOXML failed, winning support from only 53 percent of voting P-members, below the 67 percent needed. Among all voting members, it received 74 percent approval, 1 percentage point shy of the mark.
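
As a rough illustration of how those two thresholds interact, here is a small check in Python; the ballot counts are hypothetical, chosen only to reproduce the percentages cited above, and abstentions are left out of both ratios.

# Illustrative check of the two ISO fast-track approval thresholds described above.
# The ballot counts are hypothetical, chosen only to match the September percentages.

def fast_track_passes(p_yes, p_no, all_yes, all_no):
    p_support = p_yes / (p_yes + p_no)              # needs at least two-thirds
    overall_support = all_yes / (all_yes + all_no)  # needs at least three-quarters
    return p_support >= 2 / 3 and overall_support >= 3 / 4

# Roughly 53 percent P-member support and 74 percent overall support,
# as in the September 2007 ballot:
print(fast_track_passes(p_yes=17, p_no=15, all_yes=51, all_no=18))   # False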

This time around, countries are allowed to change their votes, adding another element of uncertainty around OOXML's fate. If the format is not approved, it means Microsoft might be forced to rethink its strategy around document formats if it wants government IT contracts.

Either way, the sheer dominance of Microsoft's Office suite means some version of OOXML will be used for years to come. The company said its partners are already using it in their own applications, but ODF supporters counter no vendor has come close to fully implementing the 6,000-page specification.

One of Microsoft's partners is Fractal Edge, a U.K. company that makes software that builds visual representations of complex financial data, which it calls "fractal maps." But displaying the fractal maps in older Excel versions required sending an additional configuration file for the map to be compatible with Microsoft's binary file format, said Gervase Clifton-Bligh, vice president of product strategy.

The company has written an add-in for Excel 2007 to display the maps. OOXML container files can easily hold additional elements such as graphics -- or map configuration files.
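
That flexibility comes from OOXML's packaging: an Office Open XML document is an ordinary ZIP container of parts. Here is a minimal sketch of that idea in Python; the file name, the stand-in workbook part and the map configuration part are invented for illustration, and a conformant producer would also register the new part's content type, which is skipped here.

# Minimal sketch: an OOXML document is a ZIP container of parts, so extra
# elements such as images or a vendor's configuration file can travel
# alongside the standard XML parts. File and part names here are invented.
import zipfile

# create a toy package standing in for a real .xlsx (real ones carry
# [Content_Types].xml, xl/workbook.xml and so on)
with zipfile.ZipFile("report.xlsx", "w") as package:
    package.writestr("xl/workbook.xml", "<workbook/>")

# add a custom configuration part alongside the spreadsheet data
with zipfile.ZipFile("report.xlsx", "a") as package:
    package.writestr("customXml/fractal_map_config.xml", "<mapConfig/>")

with zipfile.ZipFile("report.xlsx") as package:
    print(package.namelist())   # ['xl/workbook.xml', 'customXml/fractal_map_config.xml']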

Whether OOXML is a standard won't make a huge difference in the company's business, since 100 percent of its customers use Excel, Clifton-Bligh said. But if other companies store their data in Open XML -- even if they are using a different spreadsheet program -- it would be easier to move their data into Excel, he said.

"We won't make an add-in for every spreadsheet," Clifton-Bligh said.

The British Library isn't taking a stand on whether OOXML should become an ISO standard or not, said Richard Boulderstone, director of e-Strategy.

The library is facing the long-term problem of how to continue to make its digital collection available. Universal agreement on, and implementation of, a single standard would be most helpful, Boulderstone said. Also important is how a standard is built into products.

"You can create any kind of standard but there's always going to be different implementations," he said, adding that those characteristics can affect how a document is archived and viewed in the future.

EU drafts guidelines for RFID technologies

The European Commission has sketched out guidelines designed to help get RFID (radio frequency identification) technologies up and running in the European Union, but stopped short of proposing formal legislation in the area.
The Commission said Thursday that it has drawn up a draft text that aims to help the makers of RFID technology, as well as potential users, introduce the technology without harming privacy rights.

The Commission recommends that producers of RFID chips conduct a privacy assessment before marketing their wares, while industries that plan to use the chips should sign up to a code of conduct outlining how the chips should be used. Industries using RFID technology should agree on a symbol to attach to the goods that carry the chips to alert customers to their presence, the Commission proposed. It also suggests that the chips should deactivate automatically at the point of purchase.

RFID chips used with perishable items such as milk could alert consumers if products go bad, but such a service should be optional, said Commission spokesman Martin Selmayr.

"You should be able to decide whether to allow your milk carton to communicate with your fridge, for example," he said at a news conference.

The Commission has opened an eight-week consultation, which ends April 25, with interested parties, including industry, consumer and privacy groups. It hopes to adopt the recommendations in the summer.

Ensuring that the potentially invasive technology respects people's right to privacy is essential if it is to take off, Selmayr said.

"The new technology will only take off in a sound environment where data protection is safeguarded," he said.

RFID could revolutionize logistics operations by allowing companies to trace their goods from the factory to the shop shelf.

Three kinds of RFID chips are currently in use in Europe:

-- Passive RFID tags do not need a power supply of their own; the minute voltage induced by the radio frequency signal emitted by the reader is sufficient to activate their circuit and send out short digital information streams in response. Typically, this information includes a unique identification number that points to an entry in a database (see the sketch after this list).

-- Semi-passive RFID tags have built-in batteries and do not require energy induced from the reader to power the microchip. This allows them to function with much lower signal power levels and over greater distances than passive tags. They are, however, considerably more expensive.

-- Active RFID tags have an on-board power-supply, usually a battery, of their own. This allows for more complex circuits to be powered and for more functionality.
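
Here is a minimal sketch of the passive-tag model described in the first item: the tag carries only an ID, and the reader resolves it against a back-end database. The tag IDs, product records and the point-of-sale "kill" step are all invented for illustration.

# Minimal sketch of the passive-tag model: the tag carries only a unique ID,
# and the reader resolves that ID against a back-end database.
# Tag IDs, product records and the deactivation step are invented here.

PRODUCT_DB = {
    "E200-3411-B802-0117": {"item": "1L milk carton", "best_before": "2008-03-02"},
    "E200-3411-B802-0242": {"item": "DVD player", "best_before": None},
}

def read_tag(tag_id):
    """Simulate a reader looking up the record a tag's ID points to."""
    record = PRODUCT_DB.get(tag_id)
    return {"tag": tag_id, **record} if record else {"tag": tag_id, "status": "unknown"}

def deactivate(tag_id):
    """Model the automatic deactivation at the point of purchase."""
    PRODUCT_DB.pop(tag_id, None)

print(read_tag("E200-3411-B802-0117"))
deactivate("E200-3411-B802-0117")
print(read_tag("E200-3411-B802-0117"))   # now reports an unknown tag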

Six hundred million RFID tags, almost all of them passive, were sold in the E.U. in 2006, according to Commission research. That is predicted to rise to around 300 billion by 2016, the Commission said.

More information on the RFID issue can be found at the Commission's Web site.

Microsoft gives open source a big hug

In a major turnaround for Microsoft, the company Thursday promised "greater transparency" in its development and business practices, outlining a new strategy to provide more access to APIs and previously proprietary protocols for some of its major software products, including Windows and Office.
The move, inspired by the ongoing antitrust case against Microsoft in the European Union, shows the company finally acknowledging the significant impact open source and open standards have had on the industry and the company's own business. It also should mean the end of Microsoft's patent threats against Linux and interoperability concerns surrounding Office 2007 file formats.

During a news conference with top executives Thursday, Microsoft said it is implementing four new interoperability principles and actions across its business products to ensure open connections, promote data portability, enhance support for industry standards, and foster more open engagement with customers and the industry, including open-source communities.

These steps are "important" and represent "significant change in how we share information about our products and technologies," Microsoft CEO Steve Ballmer said in a statement. "For the past 33 years, we have shared a lot of information with hundreds of thousands of partners around the world and helped build the industry, but today's announcement represents a significant expansion toward even greater transparency."

Under increased global pressure, Microsoft has limped toward a more open development policy for some time with strategies like the Open Specification Promise, which it published in September 2006 as a pledge that it would not take any patent-enforcement action against those who use certain technology APIs (application programming interfaces). The company launched an open-source Web site last year, a move that was notable for one of the first official uses of the term "open source" by the company. Microsoft previously would release APIs and code to developers and other companies through something it called the Shared Source Initiative rather than specifically calling its policy open source.

However, at the same time as it appeared to be more open, Microsoft continued to make bold claims and threats against technologies like Linux that it said violated many patents the company holds. While the open-source community mostly scoffed at Microsoft's claims, some companies -- including Novell -- signed deals with the vendor to indemnify customers against patent claims and promote interoperability with Microsoft software.

Microsoft also continued to promote the proprietary file format it designed as the default for Office 2007 -- Office Open XML (OOXML) -- over another file format, ODF (Open Document Format for XML), which has already been approved as a global technology standard by the International Organization for Standardization (ISO).

Microsoft submitted the OOXML specification to another standards body, Ecma International, in November 2005 in an effort to have it fast-tracked through the ISO. However, approval by the ISO has been stalled and the process riddled with complaints that Microsoft is not acting in the transparent way typical of an international standards process.

The announcements on Thursday don't affect the company's continued efforts to standardize OOXML, Ballmer said during the press conference.

Thursday's news includes broad, royalty-free publishing of APIs and the establishment of an Open Source Interoperability Initiative to provide ongoing resources and documentation to the community, and marks more commitment than the notoriously proprietary software maker has ever shown toward embracing open standards and open source.

Microsoft plans to publish on its Web site documentation for APIs and communications protocols that are used by what it calls its "high-volume products." Microsoft includes Windows Vista (including the .NET Framework), Windows Server 2008, SQL Server 2008, Office 2007, Exchange Server 2007 and Office SharePoint Server 2007 -- as well as their future versions -- under this umbrella. Microsoft will not require developers to license or pay royalties for this information, the company said.

To get the ball rolling, Microsoft on Thursday will publish on its Microsoft Developer Network Web site more than 30,000 pages of documentation for Windows client and server protocols that were previously available only under a trade-secret license through the Microsoft Work Group Server Protocol Program and the Microsoft Communication Protocol Program. Microsoft will publish protocol documentation for the other high-volume products in upcoming months, the company said.

Microsoft also is providing a covenant not to sue open-source developers for development or non-commercial distribution of implementations of these protocols -- a huge move for any Linux or open-source developers that may have feared litigation from Microsoft. The company said Thursday that developers will be able to use the documentation for free to develop products. However, companies that want to commercially distribute implementations of the protocols still must obtain a patent license from Microsoft, it said.

On the OOXML front, Microsoft promised Thursday to design new APIs for its Word, Excel and PowerPoint applications so developers can plug in additional document formats and enable users to set these formats as their default for saving documents. While there are add-on technologies that can translate between OOXML -- the default file format in Office 2007 -- and other file formats, Microsoft has not included the ability to set other file formats as default in the product suite.

Microsoft said Thursday it will use a new Open Source Interoperability Initiative to provide resources, facilities and events to the community, including labs, technical content and opportunities for ongoing cooperative development. Microsoft also is seeking an ongoing dialogue with customers, developers and open-source communities through an online Interoperability Forum. And Microsoft will launch a Document Interoperability Initiative to address the issue of data exchange between widely deployed formats, the company said.

The announcement reflects a change in the market in the past couple of decades, said Ray Ozzie, Microsoft's chief software architect, during a question and answer session at the press conference. "When Microsoft entered the game in the mid-80s, people focused on using the PC. They tended to use a small number of programs," he said. Today, people use many more applications and they expect data from one program to be available in other products, he said. The changes Microsoft is making adapt to that change in the market, he said.

Still, Ballmer cautioned that end-users shouldn't expect to see much change for some time. "Any opening up doesn't happen overnight," he said during the Q&A session. "I think it will be more like years than days" before end-users notice the effects of Thursday's announcements, he said.

Microsoft finds it hard to predict what kinds of new products might become available to users because of this change. "One thing the Net has shown is that sometimes, constraints around standards can be quite liberating to developers," said Ozzie. "Many times, new services pop out of nowhere once a standard is there and once interoperability principles are established, because we can't think of the different potential uses of customer data and how to interface with products."

Ballmer said he doesn't expect the licensing changes to affect Microsoft's bottom line. "The amount of trade secrets licensing fees we forgo will be minimal," he said. The licensing changes are risky, he acknowledged, but the potential benefit for third parties to add value around Microsoft offerings balances the risk, he said.

While Thursday's announcements are related to Microsoft's legal problems in Europe, Ballmer argued that the changes were more driven by the market. "The announcement today is driven by what we're hearing from industry participants," he said.

Microsoft's Interoperability Executive Customer (IEC) Council will oversee the new principles and initiatives to help keep the company honest. The IEC Council is an advisory board established in 2006, composed mainly of chief information and technology officers from more than 40 companies and government institutions worldwide.

More information about the news can be found on Microsoft's interoperability Web site.

Cisco tries to turn cities green

Industry needs to team up with cities to battle climate change, Cisco Systems Chairman and CEO John Chambers told local government leaders on Wednesday.
Saying his views had changed from just five or six years ago, the head of the world's largest network builder cozied up to officials from municipalities around the world at the Connected Urban Development Global Conference in San Francisco.

"It is hugely important to have supportive government," Chambers said.

The conference, co-hosted by Cisco and the city and county of San Francisco, focused on what cities can do to reduce greenhouse gas emissions and encourage their residents to do the same. As an example of what might help, the city unveiled a "green" bus equipped with Wi-Fi and with screens that can tell riders where they are, when they'll reach their destination and how much they're reducing their greenhouse gas emissions by taking the bus. That will encourage them to ride more often, the city said. Officials from Seoul also discussed traffic-reduction initiatives at the conference, and Amsterdam representatives talked about an efficiency standard for data centers.

Cities should play the key role in tackling climate change because they consume 75 percent of the world's energy and produce 80 percent of its emissions, Chambers said.

Rather than coming up with solutions one by one, pioneering cities should work with each other and private industry to create a "replicable blueprint" for making urban centers friendlier to the environment, Chambers said. Cisco's Connected Urban Development initiative will start with a few cities, including the three represented at the conference, and deliver knowledge and best practices to many more cities over time, he said. He called for cities to tap into social-networking technology -- which Cisco has been rapidly adding to its portfolio -- to bring together parties that traditionally haven't worked together.

Though it wasn't on display at the conference, Cisco's Telepresence high-definition virtual meeting technology played a key role in Chambers' speech. Cisco has used Telepresence units for 75,000 meetings since its debut just over a year ago, and in the process has slashed travel, helping to cut Cisco's annual greenhouse gas emissions per employee by 10 percent, Chambers said. One air trip produces the same emissions as 98 Telepresence sessions, he said. Meanwhile, Chambers said, he was able to slash the company's budget by US$150 million thanks to the new technology.

"Corporate social responsibility is just plain good for business," Chambers said.

Cisco isn't just pushing green technology to save money and the Earth. San Francisco's green bus, created by Cisco, is equipped with a Cisco router to link onboard Wi-Fi with outdoor 3G (third-generation) mobile data. Cisco also envisions the bus using its IPICS (Internet Protocol Interoperability and Collaboration System) technology, which unifies many public-sector radio technologies through an IP network.

And networking has a big role to play in environmental efforts, another Cisco executive said. For example, IP (Internet Protocol) networks can transmit power consumption data and information from remote sensors around a building, said Laura Ipsen, senior vice president of global policy and government affairs at Cisco.

"If it's connected, it can be more green," Ipsen said.

Google to manage health records for Cleveland Clinic

Google will test a new online medical record service with a hospital group in Cleveland, Ohio, allowing patients to control who gets to see their health information. The two organizations hope the trial will lead to the creation of a national system for sharing electronic medical records.
The Cleveland Clinic already operates its own electronic personal health record system, eCleveland Clinic MyChart, holding the records of 100,000 patients. It will invite between 1,500 and 10,000 of them to participate in the trial with Google, it said Wednesday.

Participants in the trial will be able to exchange data about their prescriptions, conditions and allergies between their Cleveland Clinic record and a "secure Google profile" in a live clinical delivery setting, the hospital group said.

With the data accessible in this way, patients will be able to share it with different doctors, service providers and pharmacies, according to the hospital group.

Google will release further details of its role in the trial later Thursday, said company spokeswoman Amy Fisher.

She was not immediately able to say whether the Google profile that will be used in the Cleveland trial is the same as the Google Account used to access Gmail, iGoogle and other personalized services the company offers. Google usually uses the term "Google Profile" to refer to a kind of online visiting card showing information about a Google Account holder.

For the Cleveland Clinic, one of the attractions of working with Google is that it can offer the record sharing service at no cost to the user or to the clinic, it said.

Google funds its personalized Web services by displaying targeted advertising based on what it knows about its users, raising questions about whether sensitive personal information could end up in the hands of marketers.

Privacy is bound to be a concern for potential users of the system for other reasons: health information is sensitive because of the effects that certain conditions can have on job prospects or insurance rates. For those reasons the storage, transmission and use of such information is tightly regulated in the U.S. by the Health Insurance Portability and Accountability Act (HIPAA).

In its privacy policy, Google already promises not to share sensitive personal information with third parties without prior consent, defining sensitive personal information as "information we know to be related to confidential medical information, racial or ethnic origins, political or religious beliefs or sexuality and tied to personal information."

Roy Fielding quits OpenSolaris project

A high-profile figure in Sun Microsystems' OpenSolaris community has quit, accusing Sun of retaining too much control over the open-source counterpart to its Solaris operating system.
Roy Fielding, co-founder of the Apache HTTP Server Project and a key contributor to the Hypertext Transfer Protocol (HTTP), announced his resignation last week in a message on the community's discussion forum.

"Sun didn't just make vague statements to me about OpenSolaris; they made promises about it being an open development project. That's the only way they could get someone like me to provide free labor for their benefit," he wrote in the message dated Feb. 14.

That hasn't happened, Fielding argued.

"Sun agreed that 'OpenSolaris' would be governed by the community and yet has refused, in every step along the way, to cede any real control over the software produced or the way it is produced, and continues to make private decisions every day that are later promoted as decisions for this thing we call OpenSolaris," he wrote.

OpenSolaris consists of an open-source code base, tools and a community of developers, not an end-user-ready distribution. Various distributions employing OpenSolaris have been released.

Sun should adopt the governing style of MySQL, the open-source database company Sun is acquiring, Fielding wrote. "That company doesn't pretend to let their community participate in decisions, and yet they still manage to satisfy most of their users. ... There's nothing particularly wrong with that choice -- it is a perfectly valid open source model for corporations that don't need active community participation."

Fielding did not respond to a request for additional comment.

Terri Molini, an OpenSolaris advocacy contributor at Sun, responded to Fielding in a prepared statement Wednesday.

"As a consultant, Roy was extremely helpful during the inception of the OpenSolaris community as well as an original member of the [OpenSolaris Governing Board]. His involvement with the community and his contributions in the creation of the governance model were invaluable. Sun wishes him well in all of his endeavors," she wrote.

"Open source technologies have many stakeholders and Sun is working with as many communities as possible to create an open source distribution of OpenSolaris," she continued. "We recognize that we will not be able to please everyone as we move through this process and in some cases, we'll have to agree to disagree on some points."

One close observer of OpenSolaris had a mixed reaction to Fielding's decision.

"There's a controversy here, there's a flare-up, it looks bad, but the commentary itself is not unprecedented in the open-source community," said Stephen O'Grady, an analyst with Redmonk.

Given Fielding's status as a technologist, his departure will have some effect but won't cripple the project, O'Grady said. "Certainly it's a [public relations] hit. ... But this isn't Linus Torvalds leaving Linux."

Despite Yahoo drama, Microsoft forges ahead with search

Microsoft is pushing ahead with plans to expand its enterprise and Internet search offerings even as its quest to purchase Yahoo remains uncertain and could get uglier.
On Wednesday, Fast Search and Transfer (FAST), an enterprise search company that Microsoft said it would acquire in January, added personalization to its existing search product and divulged more details about how Microsoft will use the technology.

Microsoft is also prepping a new release of its Live Search online offering, code-named Rome, which was revealed in an SEC document filed as part of the Yahoo bid but which Microsoft declined to discuss in more detail.

The company's entire search strategy will likely be disrupted if it's successful in its pursuit of Yahoo. Microsoft seems intent on closing the deal whether Yahoo's board of directors likes it or not; Microsoft is reportedly going to mount a proxy fight for Yahoo in which it would attempt to remove Yahoo's board members, who rejected Microsoft's US$44.6 billion buyout offer.

"There are a lot of moving pieces here," said Chris Swenson, an analyst with NPD Group, about Microsoft's recent moves in the search market.

Despite this, Microsoft's plan for FAST, the acquisition of which should be finalized in April or May, seems to be taking shape.

On Wednesday, FAST's Zia Zaman, executive vice president of global marketing for the Oslo, Norway-based company, unveiled a new category of software that FAST is developing called interaction management, and a new product under that umbrella called the FAST Personalization Framework. The framework adds the type of personalization technology found on consumer Web sites, such as Amazon.com, that suggests products and services based on previous choices.

Zaman said this kind of personalization, which FAST acquired through its purchase of San Francisco-based AgentArts, could be a powerful tool for enterprise search. A pharmaceutical researcher developing a new drug, for example, could find valuable information if a search tool delivers not only directly relevant results, but other results related to that information, he said.

"There is a lot of cool information within the four walls" that researchers may not even know they're looking for until it appears before them, he said. And FAST's search technology also searches outside the firewall to find the most relevant information, which also can be personalized through the new framework, Zaman added.

Once the Microsoft acquisition closes, FAST will act as a wholly owned subsidiary of Microsoft and maintain much of its current operations, he said. The company will report to the SharePoint team, which is part of the business unit that also includes Microsoft Office. SharePoint is the portal component of Microsoft's collaboration strategy for Office; it also links in to Microsoft's back-end CRM and ERP applications and is part of the company's larger business-intelligence strategy.

SharePoint already lets workers search across company projects hosted through the product. FAST's technology is expected to enhance and expand the search capabilities in SharePoint, Zaman said.

For online users, Microsoft offered a new version of Live Search last September and recently updated its search algorithm to improve its ability to crawl Web sites after the company identified an indexing problem. According to the SEC filing, Microsoft is already at work on another major update code-named Rome, and it will move "full steam ahead" no matter what happens with Yahoo, according to comments Steve Ballmer made during an employee meeting about the Yahoo bid.

"We've got to drive ahead with our Rome release of search, we've got to get in the market, destination search, Windows Live Wave 3, etc.," Ballmer said, according to the filing.

Despite the SEC filing, Microsoft won't acknowledge Rome's existence. Through an e-mail from its public relations firm, the company said it won't comment on "speculation about internal code names."

NPD's Swenson said that if the Yahoo deal doesn't go through, Microsoft could choose to use the FAST acquisition to bolster not only its enterprise search offerings but also its Internet search. If the deal does go through, it will become more complicated to decide which search technology will be applied where, he said.

Despite the downside of having to combine so many search technologies, buying Yahoo would offer Microsoft a host of new opportunities to expand both its Internet and enterprise search efforts, Swenson added.

Zaman said Microsoft and FAST are still ironing out exactly where -- outside of SharePoint and a standalone enterprise search offering -- FAST's technology will be put to use. Enhancing Live Search is a possibility, he said, but there are no specific plans for that yet.

Welcome to the age of localized malware

The program is nasty. It deletes pictures and movies from your hard drive and then it teases you: "Even though Mr. Kaneko was found guilty, you are still using Winny. I really hate such people," taunts an animated woman on your screen.
Welcome to the age of localized malware.

Over the past two years virus writers have increasingly targeted their malicious programs to users in different regions of the globe, creating programs that are specially designed to infect users in countries like Japan, Brazil, China or Germany.

Take the taunting Trojan, which goes after users of the Winny file-sharing program. (Winny creator Isamu Kaneko was convicted of abetting copyright violations in late 2006.) Winny is incredibly popular in Japan but virtually unknown outside the country. Still, it has been the target of several malware programs, according to Dave Marcus, security research and communications manager for McAfee Avert Labs. "Japan has some really unique factors that we just don't see anywhere else," he said. "There are a couple of malware writers in Japan who don't like people who illegally share content."

Previously, attackers would write programs that would affect the largest possible number of users, but that's no longer necessarily the case, Marcus said. "What we've noticed over the last couple of years is that a growing amount of malware is localized."

McAfee believes that there are a few reasons behind this shift. For one thing, writers no longer want the worldwide attention and law enforcement action that was garnered by outbreaks such as Sasser and Netsky.

And with users becoming more wary, hackers have to be crafty with their attacks -- creating more targeted malware that victims are unlikely to have seen before. Another factor is that criminals are increasingly targeting their attacks to regions that have weak cybercrime enforcement, McAfee believes.

Regional attacks also cater to regional tastes. Online banking is widely used in Brazil, so much of the malware there tries to steal banking usernames and passwords. In China, online gaming is so popular that Chinese World of Warcraft password stealers are now the second-largest class of malware tracked by McAfee, Marcus said.

These regional attacks are part of an explosion of viruses and Trojan programs that is making life more difficult for companies like McAfee that track and intercept malware. In 2006, the company identified 53,537 unique pieces of malware, according to data set to be published Thursday in Sage, McAfee's semi-annual magazine devoted to security issues.

Last year that number jumped to 131,862, nearly two and a half times as many, and it could double again this year. By the end of 2008, McAfee expects to be identifying about 750 pieces of malware per day.
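
A quick check of the arithmetic behind those figures:

# Quick check of the malware-count figures cited above.
count_2006 = 53_537
count_2007 = 131_862

print(f"2007 vs. 2006: {count_2007 / count_2006:.2f}x")      # about 2.46x as many samples

projected_2008 = count_2007 * 2                              # "could double again this year"
print(f"~{projected_2008 / 365:.0f} new samples per day")    # ~722, in line with ~750 a day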

Brain-controlled gaming system falls into 'demo hell'

It wasn't supposed to happen like this.

Emotiv Systems threw a press conference in San Francisco this week to show the latest enhancements to its futuristic gaming system, which lets players control objects on the screen using only their thoughts.
When it works it can be impressive. The system is based on a "neuroheadset" fitted with about a dozen sensors that read tiny electrical impulses that are emitted by the brain when a person thinks. It learns to recognize the impulses and interpret thoughts like "up," "down" and "rotate" and translate them onto the screen.

On Tuesday evening the lights dimmed in a packed auditorium at the Sony Metreon theater and for a few moments everything went fine. An employee donned the headset and made facial expressions -- smiles, winks -- that were reflected on the face of an animated robot on a large screen. The employee rotated a three-dimensional cube on the screen and moved it forward.

Then he tried to make it disappear. Then he tried again. Feet shuffled uncomfortably in the auditorium. "Can you make it disappear?" CEO Nam Do asked hopefully. Someone in the audience coughed. "Shall we move on?" Do asked. "I think we'll move on to the next thing." Then the cube disappeared. The auditorium erupted into applause, either from excitement or relief.

Then came the demonstration of an actual game. Zachary Drake, Emotiv's game developer, built up the crowd's expectations. "This," he said scornfully, brandishing a wireless game controller, "this is a wonderful thing, and it does some things really well ... but lifting an object with your mind just leaves this thing behind."

That's when things went really wrong. The neuroheadset didn't work, so Drake had to use the wireless controller he had scorned moments earlier to navigate through the game. (The controller is also part of Emotiv's system, it turns out, and supplements use of the headset.) However, the controller didn't work very well either.

"Can you please switch off any wireless transmitters you may be using because right now we can't even get the wireless controller to work," Do asked the audience. But it was too late.

"Welcome to demo hell, folks," Drake said.

It was an unfortunate night for Emotiv, which has demonstrated the system successfully at the Consumer Electronics Show in January and at other venues. It worked fine during set-up on Tuesday afternoon, Do said.

He said later that the demonstration had been disrupted by the wireless audio-visual equipment used by lighting and sound crews at the event, which operates on the same 2.4GHz frequency as its game system. The AV equipment uses a high-power, frequency-hopping, spread-spectrum technology not found in consumer devices or home wireless set-ups, a spokeswoman added.

Emotiv's system doesn't actually "read" a person's thoughts. Instead it looks at patterns of electrical impulses generated by the various parts of the brain and figures out what they correspond to based on patterns it has learned. It can also gauge a person's mood, according to Emotiv, and adjust the difficulty of a game when a player gets frustrated.
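
Emotiv has not disclosed its algorithms; as a toy illustration of matching a new signal against patterns learned during training, here is a nearest-centroid classifier over made-up feature vectors standing in for features extracted from multi-channel EEG signals.

# Toy nearest-centroid classifier: match a new sample against patterns learned
# during training. Not Emotiv's actual method; the feature vectors are made up.
import math

def centroid(samples):
    return [sum(column) / len(column) for column in zip(*samples)]

training = {                      # a few labeled feature vectors per intended command
    "push":   [[0.9, 0.1, 0.3], [1.0, 0.2, 0.2]],
    "lift":   [[0.2, 0.8, 0.4], [0.1, 0.9, 0.5]],
    "rotate": [[0.4, 0.3, 0.9], [0.5, 0.4, 1.0]],
}
centroids = {label: centroid(samples) for label, samples in training.items()}

def classify(features):
    """Return the learned command whose centroid is closest to the new sample."""
    return min(centroids, key=lambda label: math.dist(centroids[label], features))

print(classify([0.15, 0.85, 0.45]))   # -> 'lift'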

The system will go on sale in time for this year's holiday shopping season for US$299, Do said. The company said it will work with all PC games and console platforms.

The underlying technology is known as non-invasive electroencephalography (EEG) and has been around for many years. An Austrian company called g.tec showed a system at Cebit last year designed mostly for scientific and medical use. It features an ungainly rubber cap with wires protruding from it, and could be used to type words or move a cursor on a screen.

Emotiv's contribution is what it calls "the world's first consumer neuroheadset," which uses its own wireless sensor technology. It has also released a software development kit that it hopes developers will use to build applications. The headset could be used with instant messaging software, for example, to express emotions without needing emoticons, Do said.

"You should come try it at our booth at the Game Developers Conference," he told the audience Tuesday. "Then you'll see it really works and we're not lying."

Flash memory prices may plummet, analysts say

Prices of NAND flash memory could plummet this year because of weak demand and an oversupply in the market, analysts said on Wednesday.

If concerns about the U.S. economy deepen, consumers may reduce spending on phones, MP3 players and the other devices that use NAND flash, weakening demand for the chips and depressing prices, said Nam Hyung Kim, director and chief memory analyst for iSuppli. He predicted that prices could fall by as much as 55 percent this year.

Up to 90 percent of NAND flash is sold as storage for MP3 players and cell phones, or as cards such as the MicroSD that are slotted into digital cameras and other devices.

The reduced price for flash could lead to cheaper products for consumers. Apple already dropped the price of its 1G-byte iPod Shuffle this week, to US$49 from $79, partly because of the falling prices of flash memory, said Shaw Wu, an analyst with American Technology Research.

"Flash pricing definitely has an impact in terms of giving Apple the ammunition to be able to lower price points," Wu said. At the same time, flash pricing was down a lot in the fourth quarter of 2007, so Apple could have dropped its prices sooner, though it would have earned it slimmer margins on its iPods, Wu said.

The concerns about the economy and consumer spending prompted iSuppli to slash its revenue forecast Wednesday for the flash industry this year. It now expects global NAND revenue to climb between 7 percent and 9 percent, down from its earlier projection of 27 percent, Kim said. Global NAND revenue in 2007 was US$13.9 billion, up 12.5 percent from $12.36 billion in 2006, he said.
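
A quick check of those revenue figures:

# Quick check of the iSuppli revenue figures cited above (billions of US$).
revenue_2006 = 12.36
revenue_2007 = 13.9

growth_2007 = (revenue_2007 - revenue_2006) / revenue_2006
print(f"2007 growth: {growth_2007:.1%}")                     # ~12.5 percent, as stated

for growth in (0.07, 0.09):                                  # revised 2008 outlook
    print(f"2008 at {growth:.0%} growth: ${revenue_2007 * (1 + growth):.1f} billion")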

This year's revenue growth will also be affected by a reduction in NAND flash orders by device makers, Kim said. Apple, one of the largest buyers of NAND flash, ordered $1.2 billion worth of flash memory last year, but has slashed its order forecast for 2008, according to Kim.

Pricing in the NAND business is cyclical, and this year's declines would not be unique. A 55 percent drop would actually be slightly smaller than last year's, when prices fell about 60 percent, Kim said. Last year, however, Apple's iPhone helped shore up the market when it was released mid-year. Apple isn't expected to have a new killer product this year that will bolster the market in the same way.

The killer application that will drive NAND flash sales during 2008 will be mobile phones, said Joseph Unsworth, principal analyst at Gartner. Of the 1.2 billion mobile phones expected to be sold this year, about 650 million will have flash card slots, Unsworth said. The sweet spot in NAND flash remains 1G-byte and 2G-byte capacities, as many users see no need for more, Unsworth said.

The expected drop in demand isn't leading NAND manufacturers to scale back production, Unsworth said. For fear of losing customers and market share, companies are adding production capacity and flooding the market with NAND, he said.

Excess inventory and increased capital spending on new factories may ultimately bite into suppliers' earnings, Kim said. Companies including Samsung, Toshiba, Intel and Micron are investing in new fabs this year, willing to bite the bullet now in the hope that a recovery in the U.S. economy and consumer spending will bring larger revenue growth in 2009 and 2010. NAND flash is a mature market with lots of upside in the long term, Kim said.

Samsung took 42.1 percent of the NAND flash market last year with US$5.86 billion in revenue, a 4.4 percent year-over-year increase, according to data from iSuppli. Toshiba was in second place with a 27.2 percent share and $3.88 billion in revenue, up 20.3 percent year-over-year. Hynix was third with $2.38 billion in revenue, up 8.8 percent. Micron and Intel, in fourth and fifth place, posted very strong growth of 139.2 percent and 269.6 percent, respectively.

Lessig considers running for Congress

Lawrence Lessig, the cyberlaw author and advocate for free software and online civil liberties, is considering a run for the U.S. Congress, he announced on his blog Wednesday.
Lessig, author of books such as "Free Culture" and "Code 2.0," would run for the open House of Representatives seat in California created by the death of Representative Tom Lantos, a Democrat, earlier this month. A "draft Lessig" movement has popped up online since Lantos died.

Lessig said he plans to make the decision about whether to run by about March 1. "This is a very difficult decision," he wrote on his blog. "Thank you to everyone who has tried to help -- both through very strong words of encouragement and very, very strong words to dissuade."

Lessig, a self-described progressive, would run as part of his Change Congress campaign. The Stanford University law professor announced in January that he would shift his focus to political corruption and away from free software and free culture.

He called on lawmakers to stop accepting money from political action committees and lobbyists, and to stop adding so-called earmarks for special projects in appropriation legislation. Politicians need to change "how Washington works" and to end a culture of corruption that's based on political contributions, he said in a video at Lessig08.org.

"You know about this corruption in Washington, a corruption that doesn't come from evil people, a corruption that comes from good people working in a bad system," he said in the video. Progressives should work to change the way money influences decisions in Washington, he said, "not because this is, in some sense, the most important problem, but because it is the first problem that has to be solved if we're going to address these more fundamental problems later."

During a Lessig speech at Stanford in January, one audience member challenged him to "do something" about the problems in Washington, he said. Lessig is considering a run for Congress "with lots of fear and uncertainty," he said.

Already in the race for Lantos' seat is Jackie Speier, a former Democratic state senator in California. A primary election in the heavily Democratic Silicon Valley district is scheduled for April 8.

Some visitors to Lessig's blog expressed support for his candidacy, but one said Speier would be a strong candidate as well. "I think that your anti-corruption movement, and your effort to reform Congress, would be more likely to succeed from the outside, and could be damaged by a partisan campaign in which you oppose a good candidate," someone wrote.

Others repeated calls for him to run. "My only hope for the future of the Internet and our digital life in general is that we start electing candidates from this new generation, who think differently about issues like digital freedom and copyright," one person wrote. "Lessig certainly is one of those candidates."

Lessig served as a special master in the U.S. government's antitrust case against Microsoft. He's the founder of Creative Commons, which offers copyright holders options for licensing their work beyond "all rights reserved."

Lessig has served on the boards of the Free Software Foundation, the Electronic Frontier Foundation, the Public Library of Science, and Public Knowledge.

BlackBerry network down again on Wednesday

BlackBerry users in North America were complaining of service problems again on Wednesday morning.

Users of the BlackBerry outage newsgroup began reporting problems at around 6 a.m. on the East Coast of the U.S. related to scheduled maintenance on Research In Motion's network. The issue appeared to get progressively worse, initially affecting about half of users in the Americas but eventually affecting all users, according to users of the newsgroup.

One user posted a note that he said came from his AT&T representative. The note called it a national BlackBerry outage that could affect e-mail delivery for up to half of new BlackBerry users.

At noon on the East Coast, one user reported that AT&T said the problem was fixed but that e-mail messages might still be delayed until the backup of messages was sent out.

Some users of the newsgroup said that they were able to receive messages all morning, although they were often delayed.

RIM did not immediately comment on the issue.

Messages sent to and from BlackBerry devices pass through network operations centers that are operated by RIM. When RIM's network goes down, so does service to end-users.

The company had an outage just last week that lasted for about three hours and left users unable to send or receive messages or access the Internet. That outage followed one that occurred in January, but that one was due to a problem with AT&T's network, so it affected BlackBerry users as well as iPhone and other AT&T mobile data customers.

RIM's most notorious network problem happened last year, with an outage that lasted overnight for many users in North America.

ODF standard editor calls for cooperation with OOXML

The teams developing the OpenDocument Format (ODF) and Office Open XML (OOXML) standards should work together, evolving the two in parallel, the editor of the ODF standard said Tuesday in an open letter to the standards-setting community.
The Microsoft-sponsored OOXML document format is just days away from a critical meeting that will influence whether the International Organization for Standardization (ISO) will adopt it as a standard, as its rival ODF was adopted in May 2006. Relations between supporters of the two formats are, for the most part, combative rather than cordial.

Patrick Durusau, ISO project editor for ODF (known there as ISO/IEC 26300), argues in the letter that supporters of the two formats would accomplish more if they allowed the formats to co-evolve. Durusau pointedly avoided both ODF and OOXML for the letter itself, distributing it instead as a PDF, a format that was itself adopted as an ISO standard in December.

"If we had a co-evolutionary environment, one where the proponents of OpenXML and OpenDocument,
their respective organizations, national bodies and others [sic] interested groups could meet to discuss the
future of those proposals, the future revisions of both would likely be quite different," Durusau wrote.

"Peaceful co-evolution will mean better standards at lower costs in a more timely fashion," he said.

ODF, based on a file format used by Sun Microsystems' StarOffice application and the open-source productivity suite OpenOffice.org, won the support of OASIS, the Organization for the Advancement of Structured Information Standards, which fast-tracked it through the ISO standardization process.

Microsoft offered its OOXML file format to Ecma International, which adopted it as one of its standards before proposing it to ISO for the same fast-track process.

An ISO committee rejected OOXML at a first vote last year. Since then, Ecma has been responding to the objections expressed in that vote, and the ISO committee will consider the revisions it proposed at a so-called ballot resolution meeting next week. There are 1,100 comments to process during the five days of the meeting. National standards bodies will then have one month to decide whether to change their vote in light of the changes made.

ISO accepts recommendations for fast-tracking from industry standards bodies. Specifications that take this route can become standards much more quickly and with less debate than by the regular route.

The two groups need a neutral forum where they can meet and learn from each other, Durusau suggested.

He expects his calls for cooperation to raise eyebrows, given that he is editor of the ODF standard at OASIS and project editor for it at ISO.

"Why am I advocating such cooperation? The answer is fairly simple. The difficulties we face today are the result of not talking to each other in the past and we can't change the past. But we can decide to act differently today," he said.

"Creating such an environment is going to take time and effort," Durusau said, offering a series of concrete steps the two groups can take, including talking to people working on the "other" standard, working toward a common meeting place and sponsoring conferences on multiple standards.

He suggested more companies should follow Novell's example: It participates in both the TC 45 committee working on Office Open XML and the ODF Technical Committee.

Microsoft yanks Vista SP1 update causing endless reboots

Responding to reports of endlessly rebooting PCs that flooded support newsgroups last week, Microsoft Corp. said on Tuesday it had pulled an update designed to prep Windows Vista for Service Pack 1.
Although the update -- actually a pair of prerequisite files that modify Vista's install components -- has been temporarily pulled from Windows Update, Microsoft has not yet produced a fix for users whose machines either won't boot or reboot constantly.

"Immediately after receiving reports of this error, we made the decision to temporarily suspend automatic distribution of the update to avoid further customer impact while we investigate possible causes," said Nick White, a Vista program manager, in a post to the company's blog Tuesday afternoon.

White downplayed the problem. "So far, we've been able to determine that this problem only affects a small number of customers in unique circumstances. We are working to identify possible solutions and will make the update available again shortly after we address the issue."

According to White, Update 937287 was the cause of the problem. In a support document, Microsoft describes that update as one for Vista's installation software, "the component that handles the installation and the removal of software updates, language packs, optional Windows features and service packs." Along with a companion update pushed to users starting Feb. 12 and another that was offered to machines running Vista Ultimate and Vista Business in January, the guilty update is required before Vista can be upgraded to Service Pack 1 (SP1).

Shortly after the two prerequisites hit Windows Update last week, users began reporting problems on Microsoft's support newsgroups. Most said that the update hung as the message "Configuring Updates Step 3 of 3 -- 0% Complete" appeared on the screen. When users rebooted hoping to clear the error, their PCs went into an endless cycle of reboots. A smaller number of users said that their computers refused to boot normally.

Some users have been able to regain control by booting from a Vista install DVD and selecting the "Restore from a previous restore point" option.

What's it doing in there?

It's uncertain whether Microsoft knows exactly why Update 937287 is hammering PCs. Even after White posted the company statement to the Vista blog, Darrell Gorter, a Microsoft employee, was asking users to send him system logs. "I still need more log files for the investigations that we are doing," Gorter said in a message on the support newsgroup. Late last week, Gorter made a similar request on the same message board.

Also unclear is the actual extent of the problem. Although White called the number "small," the traffic on the Vista SP1 newsgroup is heavy. One thread had been viewed more than 35,500 times by late Tuesday.

But the problem is not new. Computerworld has found messages describing the endless reboot problem dated Dec. 13, one day after Microsoft first offered a Vista SP1 release candidate to the general public. That build of SP1 also required the prerequisite updates, including 937287.

Microsoft was not available for comment Tuesday night to answer questions about whether, and if so how, the snafu will impact its plans to start offering SP1 to most users next month. Currently, only beta testers, Volume Licensing customers, and subscribers to TechNet Plus and Microsoft Developer Network have been able to download legal versions of the service pack.

That will change in mid-March when SP1 is set to land on Windows Update as an optional update, and again in mid-April when Microsoft said it would start installing SP1 automatically on most PCs running Vista.

Microsoft reveals details of new small-business OS

Microsoft on Wednesday revealed details of the next version of its Windows OS for small businesses and formally introduced a new product line aimed at small and mid-size businesses.
Microsoft Windows Small Business Server (SBS) 2008, formerly code-named "Cougar," is one of two software bundles in Microsoft's new Windows Essential Server Solutions line; the line also includes Windows Essential Business Server 2008, formerly code-named "Centro" and aimed at mid-sized companies. Both products are based on the same code as Windows Server 2008, the next version of Microsoft's enterprise server OS.

The products in the Essential line bundle a server OS with other software that Microsoft deems necessary to running a business -- such as its Exchange Server messaging software and security products -- to provide what Microsoft describes as an all-in-one, easy-to-install software stack for companies that may have only a small IT support staff.

SBS 2008 is aimed at companies with up to 50 PCs and includes one-year trial subscriptions to Microsoft Forefront Security for Exchange Server Small Business Edition and Windows Live OneCare for Server.

The software also provides integration with Microsoft's Web-based service, Microsoft Office Live Small Business, to help companies set up and manage Web sites and Web-based collaboration workspaces for employees. Support for Windows Mobile devices, so employees can access business information and e-mail remotely, also is bundled in.

According to Microsoft, it designed SBS 2008 for simplified deployment, set-up and administration from one management console that administrators can access remotely. The software also comes in a premium edition for companies that need more heavy lifting from their business software.

SBS 2008 will be demonstrated on hardware from Dell at Microsoft's Feb. 27 event in Los Angeles, at which the company will highlight a triptych of releases -- Windows Server 2008, Visual Studio 2008 and SQL Server 2008. Both SBS 2008 and Windows Essential Business Server 2008 are scheduled to be available in the second half of 2008.

Windows Essential Business Server 2008, which Microsoft has previously discussed, also is intended to make it simpler for businesses with limited IT management resources to install and control critical software tools. The product is aimed at businesses with 25 to 250 PCs and is currently in beta.

Like SBS 2008, Windows Essential Business Server 2008 also has a single management console for administrators. However, unlike SBS 2008, third parties can integrate their products into the console so they can be managed from it as well. In fact, Microsoft has already said that Symantec, Citrix, CA, Trend Micro, FullArmor, McAfee and Quest are among the companies that will integrate products with the software.

"Based on our conversations with customers and partners, we felt the mid-market IT is a much different customer than a small-business owner, so we wanted to respect that in the way we designed the management UI for each product," said Steven VanRoekel, senior director in the server and tools division at Microsoft.

More information about SBS 2008 and Windows Essential Business Server 2008 can be found on Microsoft's Web site.

Security issues scuttle Bain/Huawei bid for 3Com

A deal for Bain Capital Partners and China's Huawei Technologies to buy 3Com is on hold because the companies were unable to come to agreement with the U.S. Committee on Foreign Investment in the United States (CFIUS) about security concerns.
The three companies have withdrawn their joint filing with CFIUS, although they remain committed to continued discussions, they announced Wednesday.

The proposed US$2.2 billion deal, announced in September, raised security concerns because of networking giant Huawei's close ties with the Chinese government. Under the proposal, Bain would have taken an 83.5 percent stake in 3Com, with Huawei holding the remaining piece. CFIUS, part of the U.S. Department of the Treasury, had been investigating whether the investment by Huawei posed a risk to U.S. national security since Bain voluntarily submitted the deal for review in October.

"We are very disappointed that we were unable to reach a mitigation agreement with CFIUS for this transaction," Edgar Masri, president and CEO of 3Com, said in a statement. "While we work closely with Bain Capital Partners and Huawei to construct alternatives that would address CFIUS' concerns, we will continue to execute our strategy to build a global networking leader."

Among the critics of the deal was U.S. Representative Thaddeus McCotter, a Michigan Republican. Huawei's stake in 3Com, which markets intrusion detection systems, would "gravely compromise" U.S. national security, he said in a House floor speech in October. The U.S. Department of Defense uses 3Com intrusion detection products, and Chinese hackers have targeted the agency, McCotter said.

The companies had argued that Bain Capital, based in Boston, would have a controlling interest in 3Com. "Bain Capital will be able to make all operational decisions for the company, to set budgets, to spend money, to make investments, and to hire and fire personnel," 3Com said in an October filing with the U.S. Securities and Exchange Commission. "Huawei will not have any control over the operation of the business."

The companies could still put together a new proposal and take it back to CFIUS, said Christopher Wall, a partner in the international trade practice at the Pillsbury Winthrop Shaw Pittman law firm who is based in Washington, D.C. With cases like the 3Com one, there are often concerns about technology transfer to other countries, but Bain, Huawei and 3Com could come back with a proposal to limit tech transfer, said Wall. Neither Wall nor his firm is involved in this case.

The companies could also work out a deal with less Huawei involvement, he added.

Wall wasn't surprised about the security concerns surrounding the deal, given that the U.S. Department of Defense uses 3Com networking products, and U.S. officials have accused Chinese hackers of attacking U.S. government sites. "People are obviously going to be on high alert when it comes to China and computer security issues," he said. "I can easily see why people would be very concerned about any Chinese role in a transaction involving that kind of technology."

Top technology companies form gaming alliance

Some of the top technology companies, including Intel, Microsoft, Dell and Advanced Micro Devices, joined forces Tuesday to form the PC Gaming Alliance, which will try to promote the PC as a gaming platform.
The alliance will bring hardware makers, software companies and game publishers under one roof to "accelerate innovation, improve the gaming experience for consumers and serve as a collective source of market information and expertise on PC gaming," the alliance said in a statement.

The companies will work together on challenges facing the PC gaming industry, including piracy and the establishment of hardware requirements for PC games, the alliance said. PCGA also hopes to accelerate growth of the PC gaming industry and standardize the development of gaming PCs and software by developing and promoting guidelines.

The alliance comes at a time when PC video game sales are falling. PC game sales in the U.S. were US$910.7 million in 2007, down from $970 million in 2006, according to research from NPD Techworld. Those sales were dwarfed by sales of software for video game consoles such as Sony's PlayStation and Nintendo's Wii, which totaled $6.6 billion.

Unit shipments of PC game software totaled 36.4 million in 2007, compared to video game software unit shipments of 153.9 million, according to NPD.

The U.S. gaming industry already has the Entertainment Software Association, which represents vendors that publish games for both computers and consoles. About 90 percent of the $7.4 billion in PC and console gaming software revenue in 2006 came from ESA members, giving the association a dominant presence.

Other PCGA members include Acer, Epic, Nvidia and Razer USA.

The announcement comes during the Game Developers Conference, which is being held in San Francisco. During the show PCGA member Intel launched a new gaming platform formerly code-named "Skulltrail." The Intel Dual Socket Extreme Desktop Platform includes two quad-core microprocessors, for a total of eight processing engines, and supports graphics cards from ATI or Nvidia.

Microsoft: HD DVD demise won't hurt Xbox 360

Toshiba's announcement that it will end production of HD DVD players and recorders will not affect the Xbox 360, even though Microsoft offers an optional stand-alone HD DVD drive for the game console, Microsoft said in a statement Tuesday.
"We do not believe Toshiba's announcement about HD DVD will have any material impact on the Xbox 360 platform or our position in the marketplace," Microsoft said. The gaming function of the console is its main attraction, the company added.

The company did not say whether it will stop manufacturing the HD DVD drive for the Xbox 360.

Earlier Tuesday, Toshiba formally announced an end to its support of HD DVD, a high-definition optical disc format meant to replace DVDs. Toshiba was the developer and main backer of HD DVD, and its capitulation handed victory to rival format Blu-ray Disc, which has been championed by Sony.

Toshiba said its decision came after careful analysis of the long-term impact of continuing the format war, and said a swift decision was called for to help the high-definition market develop.

For game consoles, high-definition discs are growing in importance. Sony put Blu-ray Disc drives in the PlayStation 3 as a way to spread the popularity of the high-definition discs. PlayStation 3 owners can play HD movies on the drives as well as advanced games. Blu-ray discs can hold far more data than conventional DVDs and give players high-definition graphics in games.

At a recent game show in Taipei, Sony showed off a display of over 100 game titles created using Blu-ray Disc. A nearby Xbox 360 booth displayed the HD DVD logo and showed movies and just a few games made for the format.

Recent changes in the market prompted Toshiba's decision to phase out HD DVD. Early this year, Warner Bros. said it would stop issuing movies on HD DVD in the coming months and rely exclusively on Blu-ray Disc. The Hollywood studio was one of three major studios remaining in the HD DVD camp, and its defection created widespread belief that the battle between HD DVD and Blu-ray Disc was now over.

More recently, major U.S. retail chain Wal-Mart announced it would phase out the sale of HD DVD products, moving to exclusivity with Blu-ray Disc. Electronics retailer Best Buy also said it would back Blu-ray Disc, but it did not say it would stop offering HD DVD.

DoS attack prevents access to WordPress.com blogs

The WordPress.com blog-hosting service suffered a denial-of-service (DoS) attack that began Saturday and was still preventing users from logging in or posting to their blogs on Tuesday.
Matt Mullenweg, spokesman for Automattic, confirmed that the service experienced a DoS attack with spikes of up to 6 gigabits per second of incoming traffic, which was making some blogs inaccessible for about five to 15 minutes on Tuesday. Though service had mostly been restored, Automattic, which maintains WordPress.com, was still working on returning service to normal levels on Tuesday afternoon, he said.

"Obviously that [is not good] and is pretty unusual for our service," he said in an e-mail. "All our people who can are working on the issue."

However, an employee at a New York-based company that has blogs hosted by WordPress.com suggested that some users were experiencing outages for longer than 15 minutes. The source, who asked not to be identified, said on Tuesday afternoon that users there were unable to log in to their blogs and post comments for "most of the day." The blogs could still be viewed publicly, however.

"It's starting to come back to life now, slowly," said the source on Tuesday afternoon.

WordPress.com users were notified via e-mail about the DoS attack. In the e-mail, the service provider said that the attack was affecting user log-in and causing some forums to be offline.

Mullenweg said that the main WordPress.com page was down longer than some blogs because "we sacrificed it in order to keep blogs and our users up." The home page and the rest of the site were back up and running on Tuesday, however.

He also provided a link to a graph of the traffic spikes to WordPress.com, where the service's traffic is displayed as a brown line.

A DoS attack is an attempt to make a Web site or service unavailable to intended users by flooding the service or site with incoming data requests, such as e-mails. Motives for DoS attacks vary, but perpetrators mostly target companies with high-profile, highly trafficked Web sites.

Joris Evers, a spokesman for security research and software company McAfee, said DoS attacks are still fairly common, although they have tapered off in recent years because technology has been developed that can head off such attacks before they affect service.
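
As a rough illustration of the kind of technology Evers describes -- filtering a flood before it reaches the application -- the sketch below shows a generic per-client token-bucket rate limiter in Python. It is a simplified, hypothetical example, not based on anything Automattic or McAfee has disclosed about their defenses; the rate and burst values are arbitrary.

    # Hypothetical sketch: a per-client token-bucket rate limiter of the kind
    # an edge server can use to shed a flood of requests before it reaches
    # the application. Rates and thresholds here are arbitrary examples.
    import time
    from collections import defaultdict

    class TokenBucket:
        def __init__(self, rate_per_sec=5.0, burst=20):
            self.rate = rate_per_sec                      # tokens refilled per second
            self.burst = burst                            # maximum bucket size
            self.tokens = defaultdict(lambda: float(burst))
            self.last_seen = defaultdict(time.monotonic)  # first access = "now"

        def allow(self, client_ip):
            now = time.monotonic()
            elapsed = now - self.last_seen[client_ip]
            self.last_seen[client_ip] = now
            # Refill this client's bucket, capped at the burst size.
            self.tokens[client_ip] = min(self.burst,
                                         self.tokens[client_ip] + elapsed * self.rate)
            if self.tokens[client_ip] >= 1:
                self.tokens[client_ip] -= 1
                return True    # serve the request
            return False       # drop it (or answer with HTTP 429)

    limiter = TokenBucket()
    for _ in range(50):
        if not limiter.allow("203.0.113.7"):
            pass  # excess requests from this client are rejected

In practice this sort of filtering is typically done at the network edge, often in dedicated appliances, which is how floods can be absorbed before users notice.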

Though he had not heard specifically of the WordPress attack, Evers said that it's possible the attack was mounted by someone "who was upset about something that was written on a WordPress blog, and they decided to take action against that."

HP reports strong results on PC, enterprise sales

Hewlett-Packard reported solid financial results for its fiscal first quarter, driven by growth in PCs and enterprise hardware. The results prompted HP to raise its forecast for the year ahead.
Revenue for the quarter, which ended Jan. 31, was $28.5 billion, up 13 percent from a year earlier, HP announced Tuesday. Pro forma net income was $2.3 billion, or $0.86 per share, up from $1.8 billion, or $0.65 per share, a year earlier.

The figures beat the expectations of financial analysts, who had forecast revenue of $27.6 billion and pro forma earnings per share of $0.81, according to Thomson Financial. The pro forma figure excludes one-time items; using generally accepted accounting principles, which include them, HP's profit was $2.1 billion, or $0.80 per share.

HP's Personal Systems Group, which produces its laptop and desktop PCs, grew its revenue 24 percent from the same period a year earlier, to $10.8 billion, with unit shipments up 27 percent. Notebook sales climbed fastest, up 37 percent, while desktop sales climbed 15 percent.

The division had already been doing well. HP extended its lead over Dell in PC sales last year, according to figures from Gartner. HP ended the year with 18.2 percent of the market, compared with 14.3 percent for Dell. The PC market overall grew 13.4 percent.

HP may find it hard to sustain that growth rate, in part because it has to make comparisons with increasingly successful quarters in the year before, CEO Mark Hurd said on a conference call. Still, Hurd said, "when you look at 24 percent growth, I think that's pretty darned strong."

HP's imaging and printing group performed slightly less well, with revenue climbing 4 percent to $7.3 billion. Printer unit sales declined by 1 percent from a year earlier, thanks to weakness in the consumer market. Revenue from supplies, which includes HP's profitable ink business, climbed 6 percent, however.

Revenue from the servers and storage group climbed 9 percent to $4.8 billion. Sales of blades and industry-standard servers were strong, while HP's PA-RISC and Alpha chip businesses continued to shrink. Services revenue climbed 11 percent year-over-year to $4.4 billion, while software sales climbed 11 percent to $666 million, HP said.

Hurd said he was pleased with the results overall. He attributed them to successful cost-cutting efforts, the addition of 2,000 new HP sales staff in the past year, and a diverse product portfolio.

HP generates an increasing amount of its business overseas, he said. Its biggest market continued to be Europe, the Middle East and Africa, where revenue grew 15 percent to $12.3 billion. Asia-Pacific revenue climbed 22 percent to $4.9 billion. Growth in the Americas was a sluggish 8 percent, generating $11.2 billion.

"We generated 69 percent of our revenue outside the U.S., with emerging markets driving significant growth," Hurd said.

HP is about halfway through a project to consolidate its own applications and IT systems, said Chief Financial Officer Cathie Lesjak. The goal is to cut $1 billion per year in IT expenses starting in fiscal 2009. HP is also trying to sell the same process to customers.

"It starts with business process changes, then we do application modernization, then that allows us to consolidate infrastructure and close data centers," he said.

So far the company has consolidated 6,000 applications worldwide down to about 3,000, he said. HP used to have 75 customer service applications, for example -- one for each of its major countries -- and it now has just one. That allows the company to make a change to the application once and push it out worldwide, which makes HP "more nimble," he said.

One slight cloud on the horizon may be component prices. "We're thinking about the fact that memory will be a bit tougher than in the past couple of quarters," Lesjak said. "LCD prices as well have started to tick up a bit."

HP also has a significant shortage of sales people, Hurd said. "We have 144,000 resellers and partners, but at the end of the day ... we are dramatically undercovered, and we're not off by 10 or 20 percent, we're off by more than that," he said.

The company now expects second-quarter revenue of $27.7 billion to $27.9 billion, and pro forma earnings per share of $0.83 or $0.84. That's above what analysts polled by Thomson had been estimating: pro forma profit of $0.82 and revenue of $27.4 billion.

Investors applauded the results. HP's shares were trading 5 percent higher in after-hours trading, at $46.12 at the time of this report. They ended the regular day's trading at $43.95, level with Friday's close. (Monday was a national holiday in the U.S.)

Yahoo protects employees in case of Microsoft takeover

Yahoo has introduced two new severance plans that will protect its employees if Microsoft's unsolicited takeover bid is successful, it said in filings with the U.S. Securities and Exchange Commission (SEC) on Tuesday.
The newly filed plans make all full-time employees eligible for severance pay equal to their base salary for four to 24 months, depending on the employee's job level. Health and dental coverage is also included.

The maximum protection of 24 months' salary will be offered to CEO Jerry Yang, Chief Financial Officer Blake Jorgensen, and certain other executives still employed by the company and named in the SEC proxy filing for Yahoo's 2007 annual general meeting. That list includes former CFO Susan Decker, now president of the company, and Executive Vice President, General Counsel and Secretary Michael Callahan. Others listed in the proxy filing have already left the company, including former Chairman Terry Semel, former Chief Operating Officer Daniel Rosensweig and former Chief Technology Officer Farzad Nazem.

The benefits take effect if an employee's contract is terminated without cause by Microsoft -- or another acquirer -- or if the employee leaves for good reason within two years of a change of ownership.

The severance plans are designed to help retain employees, help maintain a stable work environment and provide certain economic benefits to the employees in the event their employment is terminated, according to Yahoo.

Yahoo's filing came the same day that news reports suggested Microsoft is preparing a campaign to win shareholder support for changes to Yahoo's board.

Microsoft looking for ways to converge Windows Mobile, Zune

Microsoft appears to be looking for new ways to tie Windows Mobile phones and Zune media players together, although a Zune phone remains unlikely.
Over the weekend, Microsoft developer "Mel" asked an open question on the Windows Mobile blog: "What are some ways the Zune player and a Windows Mobile device can work better together?"

Since then, over 50 commenters have suggested ways that Microsoft might converge the two devices. The most common idea is to essentially replace the Windows Media Player on Windows Mobile devices with Zune software.

"I proposed that WMP should be fazed out in favor of a combined WMP/Zune player which will synch with both Windows Media AND Zune, instead of having to have two separate apps with two different libraries for each device," wrote one commenter using the name Colin Walker.

Peter Henning, another commenter, also suggested making just one media player that works on both devices. "Currently you are just making our lives much more difficult with this parallel development and incompatibilities," he wrote.

A single media player would solve some of the problems that other users complained about in synching music between a Windows Mobile phone and a Zune. Another commenter going by the name Charlie Quidnunc noted that he has to create new playlists once he transfers music from Zune to his phone because Windows Media Player can't read Zune playlists.
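
For illustration only, the sketch below shows the sort of one-off conversion users were resorting to. It rests on an assumption not confirmed in the blog thread -- that a Zune playlist (.zpl) and a Windows Media Player playlist (.wpl) are both simple SMIL-style XML files listing media entries by source path -- and DRM-protected tracks would not carry over in any case.

    # Hypothetical sketch: copy track references from a Zune .zpl playlist
    # into a Windows Media Player .wpl playlist. Assumes both formats are
    # SMIL-style XML with <media src="..."/> entries, which may not hold
    # for every version; file names below are examples.
    import xml.etree.ElementTree as ET

    def zpl_to_wpl(zpl_path, wpl_path, title="Converted playlist"):
        sources = [m.get("src")
                   for m in ET.parse(zpl_path).iter("media")
                   if m.get("src")]

        smil = ET.Element("smil")
        head = ET.SubElement(smil, "head")
        ET.SubElement(head, "title").text = title
        seq = ET.SubElement(ET.SubElement(smil, "body"), "seq")
        for src in sources:
            ET.SubElement(seq, "media", {"src": src})

        with open(wpl_path, "wb") as out:
            out.write(b'<?wpl version="1.0"?>\r\n')
            out.write(ET.tostring(smil))

    # zpl_to_wpl("MyMix.zpl", "MyMix.wpl")

A single player that understood both libraries, as the commenters suggest, would make this kind of workaround unnecessary.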

Another complained that he can't transfer music that he downloaded under his Zune subscription plan to his Windows Mobile device because of DRM (digital rights management) restrictions.

Offering Zune software on Windows Mobile phones could be one simple way for Microsoft to converge the two, said Michael Gartenberg, a research director with Jupiter Research. "There are any number of ways that Microsoft could go about Zune integration. We might see a Zune application for Windows Mobile devices."

But what we most likely won't see is a Zune phone, despite many Zune phone rumors. "On one hand the Zune is a closed proprietary system not built around a partner ecosystem," Gartenberg noted. "On the other hand the phone business is built on a partner ecosystem." Microsoft develops the Windows Mobile software but hardware makers build the phones. By contrast, Microsoft develops the hardware and software for the Zune.

If Microsoft started making a Zune phone, it would compete with its phone hardware partners. "It's the same reason we don't see a Microsoft branded PC," he said.

On the Windows Mobile blog, "Mel" emphasized that he wasn't looking for more suggestions for a Zune phone. "I'm not referring to an imaginary 'Zune phone,' and I'm certainly not hinting or speculating about a converged device," he wrote.

Building a better music playing experience into Windows Mobile will be important for Microsoft, which is increasingly trying to make Windows Mobile phones appeal to consumers and not just business users. "For the most part, Windows Mobile has ignored consumers," Gartenberg said.

Microsoft recently announced plans to buy Danger, the developer of mobile phone software that runs the youth-oriented Sidekick device from T-Mobile. Microsoft has also made some executive changes in the Windows Mobile group designed to better focus on consumers.