Tuesday, December 4, 2007

IT departments biggest source of data leaks, says research

IT personnel are to blame in 30% of all data breach incidents, according to research from security services firm Orthus.

The research monitored the ways users accessed, processed, stored and transmitted information, including financial records and intellectual property.

Orthus monitored more than 100,000 hours of user activity over the last year through its data leakage audit service. The software, installed on endpoints, servers and terminal servers, records how sensitive information is removed from the corporate infrastructure, providing time and date stamped visual evidence of these data leaks.

Orthus found that IT departments were responsible for 30% of incidents. The customer service department is also a common offender, responsible for 22% of the incidents identified.

"The research proves the rule: the higher level of access privileges -- the greater the propensity for abuse," said Richard Hollis, managing director of Orthus. "Companies need to address the insider as the primary threat to their business. Until this is done, no real security can be achieved."

CDT urges changes to wiretapping legislation

The Center for Democracy and Technology (CDT) has urged the U.S. Congress to make changes to a bill that would extend a controversial wiretapping program.

The CDT, a group that focuses on online civil liberties, called for the U.S. Senate to pass a substitute to the FISA Amendments Act, likely to be debated on the Senate floor later this week.

The legislation, as approved by the Senate Intelligence Committee, would reauthorize warrantless wiretapping of some U.S. residents' telephone and electronic communications in the name of protecting the U.S. against terrorists. One of the most controversial provisions would give telecom carriers immunity from civil lawsuit judgments for assisting the government's wiretapping efforts, but CDT officials said Tuesday that there are other important issues raised by the legislation, including the role of the U.S. FISA (Foreign Intelligence Surveillance Act) court in overseeing the wiretapping program.

The Senate Intelligence Committee version of the bill, which was put together with help from President George Bush's administration, offers "no meaningful protection" to U.S. residents and limits the involvement of the FISA court in approving wiretapping, CDT said. Several civil liberties groups have called the wiretapping program illegal because it spies on U.S. residents communicating with overseas suspects without court approval.

"The biggest issue is, what's the role of the court in protecting the privacy of communications?" said Greg Nojeim, CDT's senior counsel, said during a Tuesday press conference. "Two years from now, or four years from now, or six years from now, when the bill sunsets, we won't be looking at immunity as the big issue. What will be the big issue is, how did this surveillance affect Americans? Were innocent people's communications routinely intercepted?"

Bush, in the past week, has repeatedly pressured Congress to pass a bill extending the wiretap authorizations. A FISA bill passed in August expires Feb. 1, "while the threat from our terrorist enemies does not," he said Saturday during a national radio address.

"Congress must take action now to keep the intelligence gaps closed -- and make certain our national security professionals do not lose a critical tool for keeping America safe," Bush said. "As part of these efforts, Congress also needs to provide meaningful liability protection to those companies now facing multibillion dollar lawsuits only because they are believed to have assisted in the efforts to defend our Nation following the 9/11 attacks."

Since the warrantless wiretapping program at the U.S. National Security Agency came to light in late 2005, civil liberties groups have filed several lawsuits against telecom carriers that allegedly assisted the NSA. About 40 cases are consolidated before the U.S. District Court for the Northern District of California and the lawsuits are moving forward there.

"Our view is the litigation should go forward as is," Nojeim said. "That makes the most sense to us, because there ought to be some accountability."

Congress has debated several versions of immunity. The CDT would prefer that Congress cap awards, but allow the lawsuits to go forward. However, the Senate Intelligence bill would require that courts dismiss civil lawsuits against carriers that assisted the NSA.

The CDT would prefer a substitute amendment from the Senate Judiciary Committee that's likely to come before the Senate during debate on the bill. That bill would give the FISA court more oversight of the wiretap orders, would prohibit the bulk collection of international communications and would sunset the bill in four years instead of six, as in the Senate Intelligence version.

Even better is a House of Representatives bill, the Restore Act, which would allow ongoing FISA court supervision of the wiretapping program, and would require prior court approval of wiretaps in most cases, CDT said. The House narrowly passed the Restore Act Nov. 15.

SAP puts a Web 2.0 spin on CRM upgrade

SAP on Tuesday announced an update to its customer relationship management software with a Web 2.0-style interface that could help to increase usage rates among workers.

Companies often report that usage levels for their business software are lower than they would like, with salespeople managing accounts in Microsoft Outlook instead of their more expensive CRM software, for example. SAP hopes to address that with CRM 2007, an update to its CRM (customer relationship management) product that will be widely available early next year.

People accustomed to easy-to-use Web applications in their personal lives are starting to expect the same ease of use in their business software, said Stefan Haenisch, SAP's vice president of CRM product management.

"We're trying to bridge the gap between a cool, user-driven Web application, and an enterprise software application," he said.

SAP competes primarily with Oracle in the market for broad CRM suites, which include tools for managing sales, marketing and customer service. Other rivals include Salesforce.com, Chordiant Software and Infor.

Oracle probably has the broadest set of CRM capabilities, thanks to its acquisitions of Siebel and PeopleSoft, said Vuk Trifkovic, an analyst with Datamonitor in the U.K. "But I don't think that reflects badly on SAP, they have good tools with a lot of features, and they're a natural for anyone in the SAP ecosystem," he said.

CRM 2007 has a portal-like interface that workers can customize with information from within the CRM system, such as reports, or from external sources, such as publicly available newsfeeds and maps. They can change the color and "theme" of the interface by clicking through different designs, or skins.

The idea is to make the software more appealing to work with, but also to provide information that might increase productivity. A salesperson might add a feed showing news about companies he plans to visit that week, Haenisch said.

The software also looks different inside. The content is laid out in task windows that users can drag and drop to rearrange. The interface is built on SAP's NetWeaver and uses AJAX (asynchronous JavaScript and XML), a popular interface technology on the Web.

There are also new CRM tools, including a pipeline management tool that can run "what if" scenarios on upcoming deals. A salesperson can view quarterly sales in a bar chart, and then move deals from one quarter to the next, or push expected targets up or down, to see the effect on the quarterly totals, Haenisch said.

CRM 2007 will also include telephony software that uses Internet Protocol, a technology SAP acquired when it bought Wicom Communications last May. The software lets companies set up a virtual call center that could include workers in remote locations, without having to invest in specialized telephony hardware, Haenisch said.

There is also an updated trade promotions management tool, which can help marketing departments manage hundreds of concurrent programs with retail stores.

The base pricing for the software hasn't changed, Haenisch said, but customers may have to pay for the new features, like the telephony software, depending on the type of SAP license they have.

SAP's last big CRM update was CRM 2005, two years ago. Some of the new features in CRM 2007 were offered in interim releases this year, but SAP expects most customers to adopt them with CRM 2007. The software will be rolled out gradually starting this month, with widespread availability scheduled for the second quarter next year, Haenisch said.

The global CRM market is growing quickly, according to Datamonitor, pushed along by organizations that recognize the benefits of creating a positive experience for their customers. The analyst company expects worldwide CRM sales to hit US$6.6 billion in 2012, up from $3.6 billion in 2006, with a compound growth rate of 10.5 percent per year.
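
As a quick arithmetic check (a back-of-the-envelope calculation, not a figure from the Datamonitor report), the growth implied by those two data points can be computed directly:

```python
# Back-of-the-envelope check of the Datamonitor CRM figures quoted above.
start, end = 3.6, 6.6        # worldwide CRM sales, US$ billions, 2006 and 2012
years = 2012 - 2006

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # ~10.6%, consistent with the roughly 10.5% cited
```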

Oracle clarifies VMware support plans -- sort of

Oracle Corp. is attempting to clarify its support plan for non-Oracle virtual servers to dispel confusion caused by conflicting statements from executives during its OpenWorld user conference in San Francisco last month.

After the Oracle VM virtual server was unveiled during the conference, CEO Larry Ellison said that the company would "essentially" continue providing support for Oracle software running on rival VMware Inc.'s virtual machines.

Ellison appeared to contradict earlier comments by Ed Screven, chief corporate development architect at Oracle, who said that the vendor would not offer support for such systems.

In an e-mailed response to Computerworld, Oracle contended that there is "no change" in its support policy for customers running Oracle applications on VMware. The statement asserted that such users have never been guaranteed full support.

"Oracle has not certified any Oracle software on VMware virtualized environments," the company said. Oracle said it will fix problems in non-Oracle virtualized environments only if they are unrelated to the virtualization platform.

VMware contended last week that its customers running Oracle software needn't worry about the database vendor's support policy. "Oracle has been responsive [to] and supportive of customers who are running Oracle products in VMware environments," said Parag Patel, vice president for alliances at VMware, in an e-mail to Computerworld last week.

"We haven't seen many referrals from Oracle (even though Oracle's official policy mentions sending referrals to VMware), which seems to indicate that Oracle is engaging with our mutual customers," Patel wrote.

Gordon Haff, an analyst at Illuminata Inc., noted that Oracle tends to work with its customers despite such support policies.

"Like Microsoft, Oracle doesn't especially like to play in other children's sandboxes, but in practice, it does what it has to for important customers -- even if it does so reluctantly," he said. "This isn't exactly nice behavior. But it's hard to argue that it's hurt them to any significant degree."

DMTF standardizes virtual server management

The Distributed Management Task Force last week released a set of standard profiles the industry group says will help IT professionals manage virtual servers.

The DMTF System Virtualization, Partitioning and Clustering (SVPC) work group developed five profiles that are all based on the DMTF's Common Information Model (CIM) standard, an open standard for interoperable exchange of management information. According to the DMTF, basing standards on CIM helps IT managers more easily incorporate new standards and specifications into existing management tools.

"With the ever-increasing adoption of virtualization, DMTF aims to simplify and provide ease-of-use for the virtual environment by creating an industry standard for system virtualization management," says Winston Bumpus, DMTF president. (Follow the latest wares in our constantly updating Server Management Buyer's Guide.)

The profiles, available at the DMTF's Web site, provide the capabilities to discover and inventory virtual computer systems; manage the life cycle of virtual systems; create, modify and delete virtual resources; and monitor virtual systems for health and performance.

For instance, the System Virtualization Profile provides a service for manipulating virtual computer systems and their resources, while the Virtual System Profile defines basic control operations for activating, deactivating, pausing or suspending a virtual system, according to the DMTF. Other profiles include the Generic Device Resource Virtualization Profile, the Resource Allocation Profile and the Allocation Capabilities profile.
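
As a rough illustration of how CIM-based profiles like these are consumed, the sketch below uses the open-source pywbem client to list virtual systems from a WBEM-capable management broker. The host, credentials and namespace are placeholders, and the exact classes and namespaces exposed depend on each vendor's implementation of the SVPC profiles:

```python
# Minimal sketch: querying a CIM/WBEM broker that implements the DMTF
# virtualization profiles. Host, credentials and the "root/virt" namespace are
# placeholders; vendors may expose the SVPC classes differently.
import pywbem

conn = pywbem.WBEMConnection(
    "https://hypervisor.example.com:5989",
    ("admin", "secret"),
    default_namespace="root/virt",
)

# In the Virtual System Profile, each virtual machine is modeled as a
# CIM_ComputerSystem instance with standard properties such as ElementName
# and EnabledState.
for vm in conn.EnumerateInstances("CIM_ComputerSystem"):
    print(vm["ElementName"], vm["EnabledState"])
```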

Virtualization software makers and management vendors participated in the development of the profiles and are working today to incorporate the standard into their products.

"End users and software vendors have been clear that they need to be able to leverage standards and avoid proprietary formats and licensing that lock them to a single vendor or platform. VMware's participation in and contribution to the DMTF SVPC work group reinforces our commitment to open, industry standards," said Stephen Herrod, vice president of technology development at VMware, in a statement.

IBM said it is putting the technology into its virtual management products. "The DMTF virtualization model brings a critical level of standardization to the data center, simplifying and extending the management of physical and virtual resources in heterogeneous environments. IBM is actively implementing these draft standards in IBM's Systems Director Virtualization Manager," said Rebecca Austen, director, IBM System Software, in a press release.

Next week, the work group will host a members-only "plugfest" in Santa Clara to test interoperability of products based on the SVPC specifications. A white paper detailing the CIM system virtualization model is also available from the DMTF.

F-Secure: Malware samples doubled in one year

Finnish security vendor F-Secure has collected twice as many malicious software samples this year as it did over the previous 20 years combined, a trend that highlights the growing danger of malicious software on the Internet.

Over the 20 years through the end of 2006, F-Secure counted a total of 250,000 samples, said Mikko Hypponen, F-Secure's chief research officer. This year alone, it has counted another 250,000, he said.

Statistics on malware from antivirus companies can vary since the data is often derived from what their customers experience while using their software, and it depends on how widely that software is used.

But other security vendors have also noted the flood of new malware on the Internet over the last few years. Symantec said earlier this year that it detected 212,101 new malicious code threats between January and June, an increase of 185 percent over the same period a year prior.

The astounding increase shows that hackers "are generating large numbers of different [malware] variants on purpose to make the lives of antivirus vendors more difficult," Hypponen said.

A variant is a piece of malware that has a unique look but belongs to a known family of malware, sharing common code and functions. Hackers use techniques such as obfuscation, which jumbles up code and makes it hard to determine what the program is, and encryption, to trick security programs.

"Genuine innovation appears to be on the decline and is currently being replaced with volume and mass-produced kit malware," according to F-Secure's report, which covers the second half of 2007.

Higher numbers of malware samples put more pressure on vendors to ensure they have fine-tuned products. To handle the surge, F-Secure has hired more security analysts as well as continued to develop automated tools to evaluate malicious software, Hypponen said.

Any new malware must first undergo an analysis. Then most security software vendors create a signature, or indicator, that allows their software to detect the malware.

Automation makes the task of analyzing malware somewhat easier, but "in the end, a human makes the decision where we add detection [signatures]," Hypponen said.
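
To see why the volume of variants is such a burden, consider the crudest possible signature: a hash of a known-bad file. The toy sketch below is not any vendor's real engine, but it shows that changing even one byte of a sample yields a new hash, so every repacked or re-obfuscated variant forces analysts to ship another signature:

```python
# Toy illustration of hash-based malware signatures and why variants defeat
# them. Real products use far richer signatures plus heuristics and behavioral
# analysis; this is only a sketch.
import hashlib

KNOWN_BAD = {
    # SHA-256 digests of previously analyzed samples would go here; this entry
    # is just a placeholder (it is the digest of an empty file).
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_malware(path: str) -> bool:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in KNOWN_BAD

# A variant that differs by a single byte produces a completely different
# digest, so it sails past this check until an analyst or automated system
# examines it and adds a new signature.
```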

Survey: virtualization in two-thirds of enterprises by '09

More than a third of enterprise IT shops have implemented x86 server virtualization, and nearly two-thirds expect to do so by 2009, Forrester Research finds in a new survey.

IT departments already using virtualization have virtualized 24% of servers, and that number is expected to grow to 45% by 2009.

Vendors need to get busy upgrading virtualization products, because many enterprises have been using the technology for two years or more and are ready to expand usage, Forrester reports.

"BMC Software, IBM Tivoli, HP Software, and Microsoft must repackage their offerings to create immediate tactical value by adding or buying tools for virtualization environment tasks, such as converting between physical and virtual servers and rapidly updating virtual server configurations," Forrester states.

The Forrester report -- "x86 virtualization adopters hit the tipping point" -- was released Friday and is based on a survey of 275 enterprise server decision-makers.

Previous Forrester research actually showed higher adoption of server virtualization, with 50% of IT shops using the technology in production and pilots in 2006.

Estimates tend to be "all over the map," and IT executives are sometimes too optimistic about predictions of future use, says report lead author Frank Gillett. But the survey results "show the power and popularity of the idea ... and demonstrates there is significant intent to increase usage."

The latest report finds that 37% of IT departments have virtualized servers already, and another 13% plan to do so by July 2008. An additional 15% think they will virtualize x86 servers by 2009.

As enterprises gain a couple of years' experience with virtualization, they will move from tactical, experimental approaches to strategic IT infrastructure initiatives that might involve upgrading servers, storage, networks and systems management.

But virtualization isn't close to being universally adopted throughout enterprises, Gillett says. IT executives typically aren't using the technology for critical applications, or platforms like grid computing and supercomputing, he says.
"Virtualization is working its way [up] from things where people are less uptight about performance," he says.

Virtualization is primarily about sharing machines and portability, but these may not be compelling reasons to virtualize critical workloads, according to Gillett. Machine sharing isn't that necessary if a machine is already busy, and portability might not be compelling when there are few other servers a workload can be moved to.

Nokia lays plan for more Internet services

Nokia unveiled an ambitious plan on Tuesday to move beyond cell phones and deeper into the world of Internet services, where it will compete more directly with Google, Apple and Microsoft.

The plan centers on its Web site at Ovi.com, which Nokia will market as a "personal dashboard" where users can share photos with friends, buy music and access third-party services like Yahoo's Flickr photo site.

The idea is to offer a single location where people can manage the content, services and contacts they accumulate when surfing the Internet on their phones and PCs, said Anssi Vanjoki, general manager of Nokia's multimedia group, at the company's Nokia World conference in Amsterdam.

Ovi.com will offer a single sign-on for the services, so people don't have to remember numerous log-ins and passwords on the Web, Vanjoki said. Nokia is also developing Ovi desktop software for organizing content offline.

Nokia began talking about Ovi in August, and one part of the service, an updated version of Nokia's mobile gaming platform, N-Gage Arena, is going live this month, Vanjoki said. The service worked in the past only with Nokia's N-Gage mobile game consoles, but the company said it will soon work with other devices too.

The games service is only the start. Nokia has said that an online music store will follow, and on Tuesday it provided more details of other services it will offer. They include mapping services, a video store and a photo service that allows users to upload photos from a phone and link them to maps, much as Google allows with its Picasa service today.

"Ovi will enable people to access social networks, communities and content. It's the foundation from which we'll expand Nokia in new directions," said Olli-Pekka Kallasvuo, Nokia's president and CEO.

Nokia holds more than a third of the world's mobile phone market, and it hopes that Internet-enabled devices like its N95 will become the primary way people access the Web in the future. At a time when the average price of cell phones is falling, online services could help it build new business.

It faces several challenges, including turning Ovi into a brand that can compete with established online companies like Google and Facebook. Kallasvuo acknowledged the challenges while answering questions after his speech, which was webcast.

"In addition to being a device company we have to become more like an Internet company as well, and combine the two worlds," he said. "That's a great challenge, but at the same time a great opportunity."

Nokia will also need more Internet-enabled phones in the market. It estimates that 3 billion people worldwide have a mobile phone, but only 300 million have advanced multimedia handsets, and only about 200 million of those are from Nokia. The devices also need to be easier to use, Vanjoki said. "A lot of improvement needs to take place," he said.

Ovi.com is being tested internally and will be rolled out for public beta next year, when the desktop software will also be released, Vanjoki said. The company demonstrated the software, which has snazzy interface elements, like a tool for organizing videos, photos and other files that makes them appear to be floating in three-dimensional space.

The service is likely to include an online storage component to make it easier to share files online. "We haven't yet announced the media-sharing service, but that will be part of the Ovi.com sales offering," said Nokia spokesman Kari Tuutti.

Access to Ovi.com and the desktop software will be free, Tuutti said. The software will be delivered on a CD with Nokia phones and offered for download over the Web.

Ovi is the Finnish word for "door," and the name is intended to imply that Nokia opens doors to the Web.

Dell pressures suppliers to cut emissions

Dell has become the first IT company to sign up to the Carbon Disclosure Project (CDP) plan to report on supply chain carbon emissions.

The Carbon Disclosure Project is a not-for-profit organization founded to obtain, on behalf of investors, full carbon footprint disclosure from the FT500 group of the world's largest companies, with the aim of reducing greenhouse gas (GHG) emissions.

It produces annual reports providing a fair and accurate way to compare suppliers and their carbon footprints. The CDP is currently inviting institutional investors to become signatories to the sixth Carbon Disclosure Project for 2008. It is the collective pressure from these investors, representing US$41 trillion of investment funds, which encourages suppliers to reveal carbon footprints to the CDP.

The CDP has recognized that disclosing companies have supply chain contributors which also cause GHG emissions. Although the CDP has its own standard reporting format, there was previously no standard format for supply chain companies' emissions.

The CDP's Supply Chain Leadership Collaboration (SCLC) project has produced a worldwide standard for supply chain businesses to report their emissions. By signing up for the SCLC, Dell has put its suppliers on notice that it wants them to report their emissions to Dell in this format.

It also means that the supply chain companies have joined the CDP by proxy and their emissions may even become public.

Furthermore it provides Dell with the means to compare and contrast its suppliers on GHG emissions and direct business to the low emitters. Thus a virtuous circle is produced in which Dell suppliers will compete to lower their emissions and so help to lower Dell's own emissions.

The entry of Dell to the SCLC may well prompt other IT suppliers in the CDP's ranks, such as IBM, HP and Sun, to follow suit; such is the CDP's hope.

Paul Dickinson, its CEO, said: "Dell is the first IT company to join the collaboration and we hope others will follow their lead. The supply chain is often responsible for a large part of a company's emissions, so in working with Dell to help measure these emissions, CDP hopes to help them achieve their own carbon reduction goals."

Adobe upgrades Flash to high definition

Adobe on Tuesday slashed the price of its Flash Media Server and made the system compatible with additional video codecs, potentially opening the floodgates for more video content to be made available online.

Adobe announced that its latest Adobe Flash Media Server 3 family of products will ship in January, and made the latest version of the client software, Adobe Flash Player 9 Update 3, available with immediate effect.

Adobe Flash Player 9 Update 3, previously code-named Moviestar, now includes support for the H.264 video standard -- the same standard deployed in Blu-ray and HD DVD high-definition video players -- as well as HE-AAC audio capabilities.

Since H.264 and HE-AAC are open industry standards and already integrated into existing authoring and publishing workflows, content producers can use their existing H.264 material for playback in Adobe Flash Player.

The latest update to Flash Player also features hardware accelerated, multi-core enhanced, full-screen video playback for high-resolution viewing across major operating systems and browsers.

Adobe Flash Media Server 3 offers streaming media and real-time communication capabilities to a variety of computer platforms. New codecs include support for the industry standard H.264 (potentially meaning Flash will be made available for the iPhone and Apple TV, which also both support H.264) and High Efficiency AAC (HE-AAC) audio support.

To meet the demand for a secure way to distribute media assets, Adobe Flash Media Server 3 also offers content owners increased protection for streaming high-quality video and allows for the delivery of interactive media applications that work consistently across multiple browsers and operating systems.

The new Flash Media Server range comprises Adobe Flash Media Streaming Server 3, for live and on-demand video streaming, and Adobe Flash Media Interactive Server 3, for customized, scalable video streaming services plus multi-way social media applications.

"Adobe's award-winning Flash technology is driving the shift from traditional media consumption to engaging interactive experiences," said John Loiacono, senior vice president of Creative Solutions at Adobe. "By offering the Flash Media Server 3 product line coupled with new pricing options, Adobe is reducing the barrier to entry for content owners who want to deliver streaming video or real-time communications online."

Both the live streaming and on-demand video streaming products deliver new features for media publishers, including: nearly double the number of streams per server; support for industry standard codecs; upgrades to Adobe's patented protocol for delivering protected content; and enhanced live video support for news, concerts, sporting events and social media services. In addition, Adobe Flash Media Server 3 supports both pre-recorded and live streaming to Adobe Flash Lite 3, which ensures the same video experience on mobile devices.

"With a complete end-to-end workflow, Adobe's video solutions have transformed our creation to delivery model," said Erik Huggers, BBC future media and technology group controller. "With advancements in performance and protection, Flash Media Server 3 provides the BBC with flexible new ways to deliver streaming media on as many platforms as possible. We're delighted to collaborate with Adobe and deliver more secure instant-on programming to our audience, via BBC iPlayer."

"Adobe Flash technology has been a major force behind Internet TV as we know it, bringing viewers instant-on, engaging experiences," said Kevin Lynch, senior vice president and chief software architect for Adobe. "The inclusion of industry standard H.264 support ... brings new HD capabilities to millions of Flash developers and a new generation of viewers who are turning to the web as the place to find their favorite shows and video content."

Adobe Flash Media Server 3 is expected to be available in January 2008 and will ship with special pre-built services making it easier to stream Flash Player compatible video. Adobe Flash Media Interactive Server 3 will be offered at $4,500. For single-server deployments, Adobe Flash Media Streaming Server 3 will be available at $995.

Adobe Flash Player 9 Update 3 is available now for free download.

Court lets T-Mobile sell locked iPhones in Germany

T-Mobile Germany need not sell an unlocked version of Apple's iPhone, a court in Hamburg ruled Tuesday.

The decision leaves the German operator free to sell the phone for €399 ($585) including tax, tied to its network and with a two-year service contract, just as it proposed at the phone's German launch on Nov. 9.

Rival Vodafone filed suit against T-Mobile on Nov. 19, alleging that the sale of locked phones tied to a two-year contract breached German consumer protection laws.

More importantly for Vodafone, T-Mobile's exclusive deal with Apple, combined with the locking of the phones, meant that Vodafone stood to lose customers attracted by the device.

In response to Vodafone's suit, the court temporarily ordered T-Mobile to sell an unlocked version of the phone, with no restrictive contract, while it decided the case. T-Mobile sold the unlocked iPhones without a service contract for €999 while waiting for the court's final ruling.

Now the court has thrown out Vodafone's complaint, and T-Mobile is once again selling only locked phones, T-Mobile said Tuesday.

T-Mobile offers three service contracts for the iPhone, with monthly costs of €49 for 100 voice minutes and 40 text messages, €69 for 200 minutes and 150 messages, or €89 for 1000 minutes and 300 messages. There is no charge for cellular data traffic, access to T-Mobile's Wi-Fi hotspot network or use of the Visual Voicemail service. Unlike their U.S. counterparts, European mobile phone users do not pay for incoming calls. All the contracts run for a minimum of 24 months, after which customers can ask T-Mobile to unlock their iPhone for free, the company said.
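
Using only the prices quoted above, the minimum cost of ownership over the 24-month term works out roughly as follows (a hypothetical comparison that ignores calls beyond the bundled allowances and any other fees):

```python
# Rough 24-month cost comparison using the T-Mobile prices quoted above.
handset = 399            # euros, locked iPhone sold with a contract
term_months = 24
plans = {
    "100 minutes / 40 texts": 49,
    "200 minutes / 150 texts": 69,
    "1000 minutes / 300 texts": 89,
}

for name, monthly in plans.items():
    print(f"{name}: EUR {handset + monthly * term_months}")

# Output: EUR 1575, EUR 2055 and EUR 2535 respectively, versus EUR 999 for the
# unlocked, contract-free iPhone T-Mobile sold while the interim order was in force.
```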

Vodafone said it will analyze the ruling before deciding what action to take.

Customers wanting an unlocked iPhone can still cross the border to France, where France Télécom subsidiary Orange sells phones without a contract for €649, plus a €100 unlocking charge. (The charge is waived if the customer waits for six months from the purchase date.) The phones sold by Orange can be configured to present menu options in French, German, English or Italian.

Motorola CTO leaves company

Motorola's chief technology officer has left the company, just days after Motorola announced it will replace CEO Ed Zander.

The struggling mobile-phone maker confirmed on Monday that Padmasree Warrior, who was executive vice president and chief technology officer at Motorola, has left the company. Many references to her on the Motorola Web site have already been removed.

Motorola has CTOs for each of its businesses, including mobile devices, enterprise mobility solutions, and home and networks mobility, and they will continue to be in charge of commercialization of product development, Jennifer Erickson, a Motorola spokeswoman, said in an e-mail. Rich Nottenburg, Motorola's chief strategy officer, will become responsible for Motorola's overall technology leadership, she said.

Erickson did not explain why Warrior departed, but said the move was in line with a plan outlined several months ago. "This is the final step in redefining the CTO responsibilities and is entirely consistent with the direction we outlined several months ago," she said.

Other components of that plan included a realignment of Motorola's software group, which was aimed at ensuring that the company's engineering and technology specialization directly supports its businesses, she said.

Although it's hard to know if Warrior's departure is linked to Zander's, it's a bit surprising, said Chris Silva, an analyst with Forrester Research. He would have expected, and still expects, some changes in product marketing leadership and possibly within internal business functions at Motorola, rather than among operational leaders like Warrior, he said.

These types of changes he expects would support a shift at Motorola away from the consumer handset business and toward enterprise networks and the mobilization of the enterprise, he said.

One version of Warrior's biography on Motorola's Web site says she was called "sharp as a Razr" by the Chicago Sun-Times. She was responsible for Motorola's US$4.1 billion research and development investment and 26,000 engineers.

The change follows the announcement on Friday that Greg Brown, formerly president and chief operating officer at Motorola, would take over for Zander as CEO at the end of the year.

Motorola, despite its widely recognized brand, has struggled recently with declining revenue, profit and market share. Last week, Gartner reported that Motorola's share of the mobile phone market dropped to 13 percent, down from 21 percent last year. Gartner also said Motorola lost its position as number two among phone makers to Samsung. The company had major success with its Razr phone but has since failed to match that success with newer models.

Still, the company has valuable assets, particularly in the enterprise market, such as technology from its acquisitions of Symbol Technologies and Good Technology. Motorola should be able to leverage those to turn around its fortunes, Silva said.

700MHz filing deadline: What's next?

Companies wishing to bid in the upcoming 700MHz auctions at the U.S. Federal Communications Commission were largely silent about their plans Monday, the deadline for submitting bid applications.

Google on Friday announced it plans to bid on the spectrum, often called "beach front" property because it can carry wireless broadband signals three to four times farther than some other spectrum bands. An AT&T spokesman on Monday confirmed the company's earlier statements saying it intends to bid.

A Verizon Wireless spokeswoman declined to comment on the company's bidding plans. Verizon in September had filed a lawsuit against the FCC for its so-called open-access requirements on about a third of the 62MHz of spectrum to be auctioned starting in late January. But last week the company announced it would open up its existing network to outside wireless devices and applications starting in 2008. So Verizon's objections to the FCC's similar open-access rules seem to have subsided.

Sprint Nextel does not plan to participate, a spokesman said. "Sprint has all the spectrum it needs to meet its strategic business needs," spokesman Scott Sloat said.

Startup Frontline Wireless, made up of wireless industry and government veterans, has also indicated it plans to bid in the auctions. There could be dozens of other bidders, including regional wireless carriers and broadband providers.

What happens now?

The FCC plans to make the names of the auction applicants public by Dec. 28. For one of the first times, the FCC is using an anonymous bidding process, meaning it will not disclose which blocks of spectrum applicants intend to bid on.

The auctions begin on Jan. 24, but they could last several weeks. Auctions go on as long as bidders keep bidding; the FCC's last major auctions, its advanced wireless services auctions in 2006, lasted about five weeks. If reserve prices aren't met on parts of spectrum, the FCC will re-auction those bands.

The auction is conducted electronically with numerous rounds per day, with time frames for rounds growing shorter as bidding activity heats up.

Why is this auction important?

The 700MHz auctions represent the last large chunk of spectrum available for the FCC to auction in the foreseeable future. In addition, the spectrum, now used to carry over-the-air television signals, can be used to carry long-range wireless broadband traffic. Many people, including FCC Chairman Kevin Martin, have said the auction represents a golden opportunity to create a nationwide broadband network in competition with the providers of cable modem, DSL (Digital Subscriber Line) and fiber-based services.

Some consumer groups have called the auctions the "last, best hope" for a third pipe that competes with cable operators such as Comcast and DSL and fiber-based providers such as AT&T and Verizon Communications.

While many observers see the spectrum as optimal for wireless broadband, some carriers may use it for traditional wireless voice traffic as well. Some plans for the spectrum will likely include networks that merge traditional wireless voice with high-speed data services. Google seemed to be headed in that direction when it launched an open-development handset coalition in early November.

In addition, about 20MHz of spectrum will go toward a nationwide voice and data network for public safety agencies, including police and fire departments. The U.S. Congress set aside about half of that spectrum for a public safety umbrella group, and the other half will be auctioned, with the winning bidder required to build a nationwide network that serves commercial and public-safety needs.

Several lawmakers and public-safety officials pushed hard for the spectrum after communication problems during the Sept. 11, 2001, terrorist attacks on the U.S. and later disasters. Public-safety agencies, using a wide variety of devices on different bands of spectrum, weren't able to communicate with each other.

The FCC didn't require that bidders build specific types of networks, although a voice and data network is envisioned for the public-safety spectrum. And customers taking advantage of the open-access rules on about a third of the spectrum are likely to connect a variety of devices to the network. Beyond that, the FCC has imposed geographic or population-based build-out requirements on much of the spectrum.

What's being auctioned?

For sale is 62MHz of spectrum in the 700MHz band. In late 2005, after a decade of debate, Congress passed a law requiring U.S. TV stations to move to all-digital broadcasts and abandon analog spectrum between channels 52 and 69. The deadline for TV stations to end broadcasts in the 700MHz band is February 2009.

The spectrum is broken up into five blocks. The C block, 22MHz of spectrum subject to the open-access rules, is broken up into 12 regional licenses across the U.S. A bidder can win one or more of those regional licenses.

The A block is 12MHz, broken up into 176 smaller regions called economic areas, as is the 6MHz E block. The 12MHz B block is broken up into 734 local areas called cellular market areas. Again, bidders can win multiple regional or local licenses.

Finally, 10MHz of spectrum in the D block, paired with about 10MHz set aside for public safety, is a nationwide license.
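
Pulling those block descriptions together (using only the figures above), a quick tally confirms the blocks account for the full 62MHz on offer:

```python
# Summary of the 700MHz blocks described above; sizes in MHz, license counts as stated.
blocks = {
    "A": (12, "176 economic-area licenses"),
    "B": (12, "734 cellular-market-area licenses"),
    "C": (22, "12 regional licenses, subject to the open-access rules"),
    "D": (10, "1 nationwide license, paired with public-safety spectrum"),
    "E": (6,  "176 economic-area licenses"),
}

total = sum(mhz for mhz, _ in blocks.values())
print(f"Total spectrum across blocks: {total}MHz")  # 62MHz, matching the figure above
```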

Congress has budgeted for the auctions to raise at least $10 billion, but many observers expect them to raise much more. The FCC set the reserve price for the C block of spectrum at $4.6 billion.

Microsoft to beef up anti-piracy checks in Vista SP1

Microsoft will change the user experience of its automatic anti-piracy checks in Windows Vista and also make it harder for hackers to bypass the system in the first service pack for the OS due out early next year.

Once Windows Vista Service Pack 1 (SP1) is installed on a PC, that computer will no longer go into limited functionality mode if a user or administrator fails to activate Vista on that system in 30 days or if the system fails Microsoft's Windows Genuine Advantage (WGA) validation, which checks to see if a version of Vista is pirated or counterfeit. In Vista, WGA is called the Software Protection Program feature.

In limited functionality mode, a computer will shut down after 60 minutes and then allow only browser use. Now, instead of going into that mode, a version of Vista that has not been activated in 30 days will start up with a black screen and a dialogue box that gives users the choice of activating Vista now or later, said Alex Kochis, a group product manager at Microsoft.

If users choose to activate now, the screen prompts will lead them through the proper activation system. If users choose to activate later, all the usual functions of Windows will start up, but with a black screen in the background instead of whatever customized background screen a user had set for the system.

Then, after 60 minutes of use, a balloon dialogue box will appear on the screen reminding the user to activate Vista. It also will reset the background to black even if a user had replaced the black screen with a customized view.

The experience will be similar for machines that fail the WGA validation, except that users will be reminded that their copy of Vista is not valid and that they need to purchase a valid copy of the OS.
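
Summarized as a decision flow, the behavior described above looks roughly like the sketch below. This is purely an illustrative model reconstructed from the description, not Microsoft's code:

```python
# Illustrative model of the activation/validation behavior described above;
# not Microsoft's implementation.
def startup_experience(activated: bool, genuine: bool, days_since_install: int,
                       has_sp1: bool) -> str:
    if activated and genuine:
        return "normal desktop"
    if not has_sp1:
        # Pre-SP1: limited functionality mode -- shut down after 60 minutes,
        # browser use only.
        return "limited functionality mode"
    if not genuine:
        # SP1: full functionality, black background, hourly reminder to buy a
        # valid copy of the OS.
        return "black background + hourly purchase reminder"
    if days_since_install > 30:
        # SP1: full functionality, black background, hourly activation reminder.
        return "black background + hourly activation reminder"
    return "normal desktop (still within the 30-day activation grace period)"
```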

Kochis said it was feedback from business and enterprise customers that inspired Microsoft to make the changes to the user experience. Many of these customers have been waiting until SP1 to upgrade to Vista, which means Microsoft has gotten their feedback on the Software Protection Program only recently. SP1 is expected to be available in the first calendar quarter of 2008.

Business and enterprise customers were concerned about the idea that desktop computers in their organizations would cease to function in the usual way if a machine were not activated or validated properly, Kochis said.

"In some cases, it was a simple reaction to this concept, as in 'We don't like this,'" he said. The complexity of getting a large number of users up and running again on Vista was also a concern.

In addition to these user-experience changes, in SP1 Microsoft also will include code to combat two of the most common hacker workarounds to the WGA system -- OEM Bios and Grace Timer exploits -- and their variants, Kochis said.

Many customers were unhappy with the way Vista's Software Protection Program and the compulsory WGA checks for XP worked, as there were initially bugs in the systems that would deem valid versions of the OS invalid. Hackers came up with ways to bypass the system not only for nefarious purposes, but also for users who were frustrated by system errors.

The OEM Bios exploit bypasses the check by mimicking what Windows looks like during a normal installation by an OEM, thus fooling the anti-piracy check by appearing to be a genuine copy of Windows. The Grace Timer exploit allows a hacker to modify the 30-day activation system so an indefinite number of days or years can be set as the time the user has to validate Vista.

Microsoft also is building a feature into SP1 that can find new hacks in counterfeit systems and send out updates to Windows to stop new exploits before they can be used, Kochis said.

The pirating of Windows has been a perennial problem for Microsoft, particularly in developing countries where Windows is too expensive for many people to purchase. Microsoft and agencies that track piracy, such as the Business Software Alliance, claim piracy costs software vendors billions of dollars a year in revenue.

Microsoft began coming down hard on software piracy two years ago when it introduced WGA, which initially required users to validate their copies of Windows if they wanted to use Microsoft's update services. That program expanded into the automatic validation built directly into Vista. Many have criticized Microsoft's anti-piracy tactics not only for failing to work properly, but also for being generally intrusive, since they communicate directly with a user's PC and send information back to Microsoft.

However, Kochis said on Monday that Microsoft's anti-piracy checks and other efforts to combat piracy -- including lawsuits brought against alleged counterfeiters -- are working. He said the rate of piracy for Vista to date is half the rate it was for XP during the same stage of its release cycle.

Real-time Linux launched by Red Hat

Red Hat has launched a real-time version of Linux, aimed at applications needing predictable responses, from process control to financial market traders.

Red Hat Enterprise MRG (Messaging, Realtime, Grid) includes features such as high-speed inter-application messaging based on the Advanced Message Queuing Protocol (AMQP), whose performance Red Hat reckons it has improved 100-fold. At the U.K. launch, Red Hat vice president Scott Crenshaw could not confirm how this was achieved; representatives said that details would be made available on the company's blog.
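
To give a feel for the kind of AMQP messaging involved, here is a minimal publish-and-fetch sketch. It uses the open-source pika client against a generic AMQP broker rather than MRG's own Qpid-based libraries, and the broker host and queue name are placeholders:

```python
# Minimal AMQP publish/fetch sketch using the pika client against a generic
# broker; MRG Messaging itself is based on Apache Qpid and ships its own
# client libraries. Broker host and queue name are placeholders.
import pika

conn = pika.BlockingConnection(pika.ConnectionParameters("broker.example.com"))
channel = conn.channel()
channel.queue_declare(queue="trades")

# Publish a message to the queue via the default exchange.
channel.basic_publish(exchange="", routing_key="trades",
                      body=b"BUY 100 ACME @ 12.34")

# Pull the message back off the queue.
method, properties, body = channel.basic_get(queue="trades", auto_ack=True)
print(body)

conn.close()
```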

MRG also allows users to "steal" unused desktop CPU cycles, manage distributed workloads, schedule tasks across both local and remote grids, and use cloud capacity from Amazon EC2. The distributed computing capabilities stem from Red Hat's collaboration with the University of Wisconsin and its high-throughput computing project, Condor. The code for this portion of the system is open source under an OSI-approved license.

While MRG can be run on Java, Solaris and .Net platforms, the company said that best performance will be obtained when running on Red Hat Enterprise Linux (RHEL).

According to Crenshaw, the first customer for the system was the U.S. Navy. "They approached us a few years ago for a system that could run whole ships, weapons control, the lot," he said.

"As a working group member of AMQP, Cisco has been collaborating with Red Hat for over 18 months in low-latency optimization of AMQP and MRG Messaging open middleware across Infiniband compute fabrics," said Cisco marketing director Bill Erdman.

According to Red Hat, the messaging and grid capabilities can be deployed in multiple environments and are optimized for use with its real-time Linux.

Crenshaw said that pricing had yet to be announced.