Tuesday, February 26, 2008

Dell's new gaming desktop allows more hardware tweaks

Dell on Monday announced a quad-core gaming desktop that lets gamers not only overclock the CPU but also monitor and tweak the performance of components such as fans and power supplies.
With Dell's XPS 630, an entry-level gaming desktop, gamers can ramp up system performance through the Enthusiast System Architecture (ESA), a communication platform that manipulates and monitors the performance of components like graphics cards and fans.

ESA is a hardware and software platform developed by a group of PC makers and component manufacturers that allows components to communicate and exchange performance information, said Bryan Del Rizzo, a spokesman at Nvidia, which supplies a graphics card for the Dell desktop. That information is then provided to users, who can control the components through the software, Del Rizzo said.

For example, users can set the performance of components like the fan and power supply to work harder when the graphics processing unit is rendering 3D graphics. If system components fail, the interface maintains a log for troubleshooting.
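
Nvidia has published the ESA specification, but the outline below is only a rough sketch of the monitor-and-react loop described above; the function names, thresholds and wiring are invented for illustration, not taken from the ESA spec or Dell's software.

```python
# Hypothetical sketch of an ESA-style monitoring loop. ESA itself defines device
# classes and a transport for this; nothing here is taken from the real spec.
import time

def fan_duty_for(gpu_load_percent):
    """Map GPU load to a fan duty cycle: quiet at idle, full speed for heavy 3D work."""
    if gpu_load_percent > 80:
        return 100
    if gpu_load_percent > 50:
        return 70
    return 40

def monitor_once(read_gpu_load, set_fan_duty, event_log):
    load = read_gpu_load()                        # telemetry reported by the GPU
    duty = fan_duty_for(load)
    set_fan_duty(duty)                            # push the new setting to the fan controller
    event_log.append((time.time(), load, duty))   # keep a log for troubleshooting

# Wire it up with fake components to show the flow:
log = []
monitor_once(lambda: 92, lambda duty: None, log)
print(log)   # e.g. [(1204000000.0, 92, 100)]
```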

The platform operates independently of the CPU and deals with how system components interact with one another, Del Rizzo said.

Dell is the first PC maker to support the ESA protocol, which works well with Dell's LightFX technology, Del Rizzo said. Many enthusiast-class gaming PCs have neon lighting on their chassis, and LightFX technology allows developers to sync the on-screen environment with light levels on PCs. Lights on a PC could flash to increase tension in a shooter game, for example.

Nvidia, Dell, HP, Alienware and Falcon Northwest are among the companies that support the ESA platform, which is built for the Windows Vista OS.

Dell's adoption lends credibility to the platform, said Jon Peddie, president of Jon Peddie Research. The technology is still brand-new and could attract attention, though it may be a while before PC makers and component manufacturers are on the same page, Peddie said.

"[ESA] is an interesting idea, it's going to be interesting to a small community of gaming enthusiasts," Peddie said. Even though the gaming community is small, it's influential in adopting new technologies, Peddie said.

At the starting price of US$1,249, the XPS 630 is powered by Intel's Core 2 Quad Q6600 processor running at 2.4GHz, 2G bytes of DDR2 memory, Nvidia's GeForce 8800 GT graphics card with 512M bytes of video memory, a 320G-byte hard drive and a DVD-RW drive. It runs the Windows Vista Home Premium OS.

The system also supports ATI's Crossfire technology, which allows multiple cards to work together to scale multimedia performance, according to AMD.

YouTube outage underscores big Internet problem

Sunday's inadvertent disruption of Google's YouTube video service underscores a flaw in the Internet's design that could some day lead to a serious security problem, according to networking experts.
The issue lies in the way Internet service providers (ISPs) share Border Gateway Protocol (BGP) routing information. BGP is the standard protocol routers use to find computers on the Internet, but there is a great deal of BGP routing data to keep track of, so to simplify things, ISPs share this information with one another.

And that can cause problems when one ISP shares bad data with the rest of the Internet.

That's what happened with YouTube this weekend, according to sources familiar with the situation. BGP data intended to block access to YouTube within Pakistan was accidentally broadcast to other service providers, causing a widespread YouTube outage.

The chain of events that led to YouTube's partial black-out was kicked off Friday when the Pakistan Telecommunication Authority (PTA) ordered the country's ISPs to block access to YouTube because of an alleged anti-Islamic video that was hosted on the site.

According to published reports, the clip was from a film made by Geert Wilders, a Dutch politician who has been critical of Islam. Wilders hopes to air a 15-minute anti-Islam film, called Fitna, on Dutch television in March.

ISPs in Pakistan were able to block YouTube by creating BGP data that redirected routers looking for YouTube.com's servers to nonexistent network destinations. But that data was accidentally shared with Hong Kong's PCCW, which in turn shared it with other ISPs throughout the Internet.

In San Francisco, David Ulevitch first noticed the problem Saturday morning. "I was trying to watch cats falling off roofs... and I couldn't get to YouTube," he said. Ulevitch, who runs an Internet infrastructure company called OpenDNS, was soon able to connect with engineers at Google, who also experienced similar problems, he said. "They were like, 'Holy crap, we can't get to YouTube either.'"

Because the BGP announcements coming out of Pakistan offered very specific routes to what they claimed were YouTube's Internet servers, routers treated them as more authoritative than YouTube's own, broader announcements.
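
That preference for more-specific routes is ordinary longest-prefix-match behavior, which the short sketch below illustrates. The prefixes are documentation-style placeholders, not YouTube's actual address space.

```python
# Longest-prefix-match in miniature: given two overlapping announcements, a router
# picks the more specific one. Prefixes are placeholders chosen for the example.
import ipaddress

routes = {
    ipaddress.ip_network("203.0.112.0/22"): "the site's own, broader announcement",
    ipaddress.ip_network("203.0.113.0/24"): "the bogus, more-specific announcement",
}

def best_route(dest_ip):
    dest = ipaddress.ip_address(dest_ip)
    candidates = [net for net in routes if dest in net]
    return max(candidates, key=lambda net: net.prefixlen)   # most specific match wins

chosen = best_route("203.0.113.10")
print(chosen, "->", routes[chosen])   # the /24 wins, so traffic follows the bad route
```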

Larger service providers typically validate BGP data from their customers to make sure that the routing information is accurate, but in this case, PCCW apparently did not do that, according to Ulevitch. When the Pakistani ISP sent the bad data, PCCW ended up sharing it with other ISPs around the globe.
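
That validation usually amounts to prefix filtering: accepting from a customer only the address space it is registered to originate. The sketch below shows the idea in miniature; the "registry" contents are invented for the example.

```python
# Accept a customer's announcement only if it falls inside address space the
# customer is registered to originate.
import ipaddress

registered_for_customer = [ipaddress.ip_network("198.51.100.0/24")]

def accept_announcement(prefix):
    net = ipaddress.ip_network(prefix)
    return any(net.subnet_of(allowed) for allowed in registered_for_customer)

print(accept_announcement("198.51.100.0/24"))   # True: the customer's own space
print(accept_announcement("203.0.113.0/24"))    # False: someone else's space, reject it
```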

This kind of accidental denial of service attack has happened before. In early 2006, for example, New York's Con Edison caused data intended for a number of networks to be misrouted following a similar mistake.

There wasn't anything that Google could have done to prevent the problem, said Danny McPherson, chief research officer with Arbor Networks. "They can't keep someone on the Internet from announcing their address space," he said. "It's a huge vulnerability."

By intentionally propagating bad BGP data, an attacker could knock a Web site off the Internet or even redirect visitors' traffic to a malicious server, security experts said.

Although there hasn't been a high-profile example of criminals misusing BGP to knock a Web site offline intentionally, spammers have misused the protocol to cover their tracks.

If criminals were able to send BGP information to a larger service provider that didn't properly check its BGP data, they could cause serious problems, McPherson said. "The reality is that if you wanted to cause global instability, you simply compromise one of those people who have access to a BGP-speaking router," he said.

Making BGP data more reliable isn't easy, either. Secure versions of BGP have been developed, but adopting them would take a major effort, and until there is widespread concern, the current system is likely to persist.

Two parties were to blame for the YouTube fiasco, said a networking engineer familiar with the situation, who asked not to be identified. First, the Pakistani ISP should never have forwarded the bad BGP routing data to PCCW. Second, PCCW should have checked that the ISP was talking about its own address space before accepting the information. "One of the dirty secrets about the Internet is a lot of it is still a handshake deal," he said.

Lessig decides not to run for Congress

Cyberlaw author and professor Lawrence Lessig has decided not to run for the U.S. Congress after briefly flirting with the idea, he announced on his blog Monday.
Lessig said last week he was considering a run for the House of Representatives seat in Silicon Valley vacated upon the death of Representative Tom Lantos, a Democrat, earlier this month. Lessig, an advocate for free software and online civil liberties, had considered a campaign after a "draft Lessig" movement launched online.

But Lessig wrote Monday that a run for Congress would not help his Change Congress initiative, which he launched this month. After consulting with a pollster, Lessig decided there was also "no possible way" to achieve the name recognition he needed to run against Jackie Speier, a popular former Democratic state senator in California, in the primary election scheduled for April 8, he said in a video on his blog.

"Certainly, we would lose this race, and not just lose in a tight contest, but lose in a big way," Lessig said.

Losing big, he said, would not inspire others to join the Change Congress movement, which focuses on getting lawmakers to stop taking money from political action committees and lobbyists and to stop adding so-called earmarks for special projects to appropriations legislation.

Lessig expressed "regret that this movement, this challenge to change Congress, doesn't have, here, an early and easy victory."

Lessig thanked people who had already sent money, offered to volunteer or weighed in on whether he should run, "especially the many friends who in the harshest way told me it would be a mistake."

Several posters on Lessig's blog applauded him for deciding not to run. "Ultimately, I think it is a good call," wrote one. "You ... have always had your eye on the bigger picture, on the long-term struggle and I am glad we have you fighting on our side. Looking forward to participating in the Change Congress movement."

Lessig, a Stanford University professor, is author of books such as "Free Culture" and "Code 2.0." He has served on the boards of the Free Software Foundation, the Electronic Frontier Foundation, the Public Library of Science, and Public Knowledge.

FCC hearing weighs net neutrality

Advocates on both sides of the net neutrality debate descended on Harvard Law School Monday for a U.S. Federal Communications Commission hearing that multiple speakers suggested was crucial to the Internet's future.
Members of the FCC, along with industry representatives, legal scholars and pro-neutrality advocates spoke at the hearing, which drew an overflow crowd.

"The Internet is as much mine and yours as it is AT&T's and Comcast's," said U.S. Representative Edward Markey, a Democrat from Massachusetts.

Markey has filed a bill along with U.S. Representative Charles Pickering, a Republican from Mississippi, in support of net neutrality, the idea that network providers shouldn't discriminate against Web sites or various types of traffic. The FCC is investigating complaints that Comcast has interfered with P-to-P (peer-to-peer) traffic associated with file-sharing sites.

"Network operators are making choices right now that will determine how Americans communicate, now and in the future," said FCC Commissioner Michael J. Copps. "I am not saying that any or all of these practices are unlawful. I am saying that choices like these, when you add them all together, are going to determine what kind of Internet we have in the future."

Other FCC members echoed Copps.

"Respect for the free flow of information was bred into our country from its founding," said Commissioner Jonathan S. Adelstein. "We must preserve the open and neutral character of the Internet, which has been its hallmark from the very beginning. It is clear consumers don't want the Internet to be a another version of old media dominated by a number of giants."

Gilles BianRosa, CEO of Vuze, a video service that uses P-to-P technology, said that while his company competes with Comcast in the delivery of content, the latter company holds an unfair advantage. "What we have here is a horse race, and Comcast owns the racetrack. I agree the market should decide which services win ... but there is no market without basic ground rules and transparency. ... We believe corporate assurances of good faith are not enough."

Marvin Ammori, chief counsel for Free Press, an advocacy group backing the net neutrality effort, also described Monday's discussion in sweeping terms.

"This hearing is not about technical details of managing networks, it's about the future of online TV and the Internet," Ammori said. "By targeting P-to-P, Comcast is disrupting investment and innovation in its online competition."

But David Cohen, executive vice president of Comcast, was as vigorous in defending his company's practices as its critics were in lambasting it.

"Comcast does not block any Web site, application or protocol, including P-to-P. Period," he said.

The company only "manages" protocols such as P-to-P during limited periods of heavy traffic; does so in limited geographic areas; only manages uploads, not downloads; and merely delays, not totally blocks requests for uploads, he said.

"It's true that to maximize our customer's Internet experience, we do manage our network. But don't let the rhetoric scare you. There's nothing wrong with it," he said. "Every network must be managed. Our customers want us to manage network congestion so they can do what they want, when they want, at reasonable speeds."

Tom Tauke, executive vice president for public affairs, policy and communications at Verizon, noted that his company's investment in fiber-optic networks has resulted in exponential growth in the size of data pipes to homes, but said network management is still necessary. "As capacity grows so do the applications and services. This is a good thing, but you still have to have reasonable network practices," he said.

Later Monday, the hearing continued with a second panel, this one stocked with an array of technologists from the academic and commercial sectors.

"Some kind of network management is critical. ... the question is how to do that in an open manner," said Daniel Weitzner, director of the Massachusetts Institute of Technology's Decentralized Information Group.

At the same time, the Web's days as a primarily client-server environment are over, he suggested: "The profile of the way people use the Internet today is peer-to-peer, and we have to deal with it. But I think it poses a challenge way beyond whether we all get our BitTorrent or not. What's really at stake is everyone's ability to speak with everyone else."

David Clark, a senior research scientist at MIT, predicted the debate could ultimately be settled by moving away from current pricing plans -- which see ISPs (Internet service providers) charge varying rates for a broadband line's speed, but not for the amount of content downloaded -- to a cost schedule based more on data volume.
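
Clark's suggestion is easiest to see with a little arithmetic; the prices and usage figures below are invented purely to contrast flat, speed-tiered billing with volume-based billing.

```python
# Invented numbers, real contrast: a flat speed-tier bill versus a usage-based one.
def flat_rate_bill(monthly_fee=40.0):
    return monthly_fee                          # same bill regardless of volume

def metered_bill(gb_downloaded, base_fee=20.0, per_gb=0.50):
    return base_fee + per_gb * gb_downloaded

for gb in (5, 50, 250):
    print(f"{gb:>4} GB: flat ${flat_rate_bill():.2f} vs metered ${metered_bill(gb):.2f}")
# Light users pay less under metering; heavy P-to-P users pay substantially more.
```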

Eric Klinker, CTO of BitTorrent, said it is wrong for the industry to view his company as a force "endlessly consuming bandwidth."

In fact, he argued, BitTorrent has actually solved a problem: "How do we effectively move large files on the Internet?" He listed off a series of public and private-sector organizations, ranging from film studios to NASA, which employ P-to-P file sharing to deliver large files.

Efforts to thwart P-to-P traffic "would stamp out in its infancy the most promising technology we have to deliver a world of near-infinite content," he said, adding that the U.S. falls far behind other nations in terms of its Internet infrastructure: "Geopolitically, we might think of ourselves as a superpower, but when measured against network power we're a Third World country at best."

Novell bids $205 million for virtualization management firm

Novell Monday announced it had signed a definitive agreement to acquire for US$205 million PlateSpin, a maker of tools to help companies adopt, extend and manage server virtualization in the data center.
The deal, expected to close by April 30, will equip Novell with technology to help customers better manage and optimize heterogeneous physical and virtual servers in next-generation data centers, the company said on a conference call.

PlateSpin, founded in 2003, generated more than $20 million in revenue in 2007 with technology that decouples software from hardware and allows server workloads to be streamed over a network from any source to any destination. Such capabilities, Novell executives say, will enable Novell to help its customers better manage workloads and optimize data-center resources in any environment.

"It is a cornerstone of Novell corporate strategy to help our customers work in a mixed environment," said Ron Hovsepian, president and CEO of Novell.

The PlateSpin deal is Novell's second in two weeks. Last week, Novell announced the acquisition of open source collaboration vendor SiteScape. Novell officials are undecided on whether they will make any of PlateSpin's technology open source.

The pending acquisition will allow PlateSpin to expand its reach into larger companies and couple its technology with Novell's Zenworks management tools to "become a powerhouse," says Stephen Pollack, founder and CEO of PlateSpin, which is headquartered in Toronto.

This acquisition ideally would help Novell accomplish a goal toward which many management and virtualization vendors are working: heterogeneous management of multivendor virtualization platforms. VMware last fall acquired Dunes Technologies to bring more management capabilities in-house, and Microsoft is expected to couple its virtualization wares with management capabilities.

"It's going to be a big year for virtualization, and there is an awful lot of money floating around the big players in the management market," such as BMC Software, CA, EMC, HP and IBM, says Rich Ptak, principal and founder of market research firm Ptak, Noel and Associates. "The demand for platform-agnostic, heterogeneous virtual management is going to be there, and Microsoft is going to put a lot of pressure on the virtualization market because it is beefing up its management capabilities. Successful vendors have to make virtualization accessible and make management a critical function that is easy to implement."

Following the close of the deal, some 200 PlateSpin employees are expected to join the Systems and Resource Management business unit at Novell. PlateSpin products, such as PlateSpin Forge, PowerConvert and PowerRecon, will continue for now with existing branding, as will Novell's Zenworks line. Going forward, however, Novell will consider product integration and branding changes.

The deal was approved by the boards of both companies and must undergo customary regulatory approvals. That is expected to happen during Novell's second 2008 fiscal quarter, which ends April 30. Novell declined to comment further on specific financials of the deal because the company has an earnings call scheduled for Thursday, Feb. 28.

Microsoft to measure 'engagement' with online ads

Microsoft has introduced a reporting tool for its online advertising platform that it claims measures the impact of online ad campaigns rather than merely counting clicks.
The tool, called Engagement ROI, has been integrated into Microsoft's Atlas Media Console, which is software used for booking and managing online advertising campaigns. A senior Microsoft executive will officially announce the tool on Monday at the Interactive Advertising Bureau's annual meeting in Phoenix.

The Media Console was developed by Atlas, a company that was owned by aQuantive, which Microsoft bought for US$6 billion in May 2007. It marked Microsoft's biggest acquisition in the online advertising space, following Google's announcement in April 2007 of its intention to acquire a rival online advertising firm, DoubleClick.

Google has a huge lead over Microsoft in online advertising. To counter Google's dominance, Microsoft made an unsolicited offer earlier this month to buy Yahoo, but the acquisition has been so far resisted by Yahoo's board. Microsoft wants access to Yahoo's engineers as well as the company's online advertising technology.

Engagement ROI is in a beta release. Microsoft said the tool will undergo testing with national advertising agencies such as McKinney, Mindshare Interaction, World Vision and Neo@Ogilvy.

Engagement ROI looks at several aspects of an online ad, such as how recently it has been displayed, its size and its format, and then determines how successful the message is in influencing a purchase. Microsoft calls the concept "engagement mapping."
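
Microsoft has not published the formula behind Engagement ROI, so the sketch below only illustrates the general idea of "engagement mapping": spreading credit for a purchase across several ad exposures, weighted by assumed factors such as recency, creative size and format, rather than crediting only the last click. The weights and fields are invented.

```python
# Illustrative multi-exposure attribution; weights and field names are assumptions.
from datetime import datetime, timedelta

FORMAT_WEIGHT = {"video": 1.5, "rich_media": 1.2, "banner": 1.0}   # assumed weights

def exposure_score(exposure, conversion_time):
    days_ago = (conversion_time - exposure["time"]).days
    recency = 1.0 / (1 + days_ago)                 # more recent exposures earn more credit
    size = exposure["area_px"] / 100_000           # larger creatives earn more credit
    return recency * size * FORMAT_WEIGHT[exposure["format"]]

def attribute(exposures, conversion_time):
    scores = [exposure_score(e, conversion_time) for e in exposures]
    total = sum(scores)
    return [s / total for s in scores]             # fractional credit per exposure

now = datetime(2008, 2, 25)
ads = [
    {"time": now - timedelta(days=10), "area_px": 72_000,  "format": "banner"},
    {"time": now - timedelta(days=1),  "area_px": 480_000, "format": "video"},
]
print(attribute(ads, now))   # the recent, large video ad gets most of the credit
```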

Microsoft said the most commonly used metric in the industry, counting clicks, is a poor way to measure the effectiveness of an ad campaign. Generally, the success of an ad is judged by how frequently it is viewed or clicked.

Microsoft expects to begin receiving results on how successful the tool is by the end of June.

New Windows Server will lead march to 64-bit OS

The launch of a new family of Windows server products this week will kick-start a broad shift among customers to 64-bit versions of Microsoft's server software, analysts and customers said.
Microsoft CEO Steve Ballmer is due to launch two major product upgrades at an event Wednesday in Los Angeles -- the Windows Server 2008 OS, which is due for release next week, and its SQL Server 2008 database, expected in the third quarter after delays. He's also expected to discuss Visual Studio 2008, which shipped in November.

Like their predecessors, the new products will be offered in both 32- and 64-bit editions. But several factors this time will prompt more customers to choose the 64-bit versions, including the broad availability of 64-bit x86 server hardware and the trend toward consolidating and virtualizing server workloads to reduce power consumption and improve efficiency.

The shift will happen gradually, since most customers are not expected to deploy the products widely until next year. But it will mark a significant maturation for Microsoft's server products, which long were seen as an also-ran in the datacenter beside 64-bit Unix OSes from companies such as Sun and HP. It should also mean better performance for Microsoft customers.

"This will absolutely tip the scales in terms of more 64-bit deployments moving forward," John Enck, a vice president and research analyst with Gartner, said of the new products. The move will be driven by a desire among customers to get the most out of their 64-bit server hardware, he said, which means using a 64-bit OS.

The difference lies in the amount of physical memory the software can address. A 32-bit OS can address only 4G bytes of main memory without having to use technology tricks that diminish performance gains. A 64-bit OS can address far more memory -- up to 2T bytes in the case of Windows Server 2008, according to Microsoft.
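
Those figures follow directly from the address widths, as the quick arithmetic below shows; the 2T-byte ceiling is the limit Microsoft cites for Windows Server 2008, not an inherent limit of 64-bit hardware.

```python
# Address-width arithmetic behind the 4G-byte and 2T-byte figures.
print(2**32)               # 4,294,967,296 bytes: the ceiling of a 32-bit address space
print(2**32 // 2**30)      # 4 G bytes
print(2 * 2**40 // 2**30)  # 2,048 G bytes = 2T bytes, the Windows Server 2008 limit cited by Microsoft
print(2**64 // 2**40)      # 16,777,216 T bytes: what a full 64-bit address space could reach
```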

That will boost the performance of some applications because they will be able to pull data quickly from main memory, instead of having to retrieve it from disk, which is slower. The gains should be evident for databases and for Microsoft's Exchange Server, although line-of-business applications will see less benefit, Enck said.

Customers may also be driven to 64 bits by concerns about the future. Microsoft has said this will be the last big upgrade to Windows Server offered in both 32- and 64-bit editions, and some expect the same to be true for SQL Server. Exchange Server 2007, released in November, already is available only in 64 bits. Customers would be wise to start preparing now for a move that soon will be forced upon them anyway, analysts said.

Also propelling the move is the trend toward server consolidation. One option is virtualization, which allows multiple OSes and application loads to run on a single physical machine, and server virtualization requires the memory capacity of a powerful, 64-bit server.

"Anybody doing a deployment today would be foolish not to at least consider when and where a 64-bit OS would be a good fit," said Al Gillen, a research vice president with IDC. "It's really about future-proofing your IT environment, giving yourself the ability to support the workloads that you'll have on these servers before they are retired in five years' time."

IDC has called the lack of adoption of 64-bit Windows Server "one of the biggest missed opportunities among today's customer base." It notes that the 64-bit products will be priced the same as their 32-bit counterparts and argues that the transition is relatively easy for customers.

The 64-bit processors from Intel and AMD have maintained the x86 architecture from the 32-bit world. That means 32-bit applications can still run on 64-bit servers, and that "the majority of existing 32-bit applications will run aboard 64-bit Windows Server without modification and, most frequently, with improved performance," IDC said.

Customers will need to update low-level system tools such as security products, antivirus tools and some system-management products, which interact directly with the Windows Server kernel, IDC said.

At the end of 2007, the research company estimates, only about 10 percent of Windows Server customers were using the 64-bit edition of Windows Server 2003. It expects that figure to approach 50 percent by the end of 2010, driven by Windows Server 2008. For new licenses sold in 2010, close to 75 percent will be for a 64-bit version of Windows Server, IDC said.

Microsoft is keen to promote the transition to 64 bits. It will give its customers better performance and help Microsoft catch up with the Unix world, where powerful, but more expensive, servers from Sun, IBM and HP have long been based on 64-bit OSes. Microsoft believes the products launching this week will mark "a big turning point" toward the use of 64-bit Windows software, said Ward Ralston, a Microsoft senior technical product manager.

Microsoft also is pressuring ISVs (independent software vendors) to get their software 64-bit ready, after a lack of preparedness held back the transition when Windows Server 2003 was released. ISVs aren't required to have a native 64-bit edition of their software to receive a Certified for Windows Server 2008 logo, but they will need to ensure their software can run on the 64-bit OS.

Some expect Microsoft's virtualization technology, Hyper-V, to be a factor. The hypervisor will be offered free with the 64-bit edition of Windows Server 2008. Andrew Brust, head of new technology for the IT consulting company Twentysix New York, a Microsoft partner, said that Hyper-V "once it ships, is going to be huge. The virtualization space is ripe for some new competition."

Brian Randell, a senior consultant with another Microsoft partner, MCW Technologies in Los Angeles, said Hyper-V will be a major impetus for the move to 64 bits. "It demands that you have that kind of processor environment available," he said.

However, others pointed to Hyper-V's immaturity. It was originally planned to ship with Windows Server 2008 but has been delayed for up to six months. Even then it will be Microsoft's first attempt at virtualization, noted Michael Cherry, an analyst at Directions on Microsoft.

"I think there's too much emphasis on virtualization with this release," he said. Hyper-V may eventually play a significant role for Microsoft, but the company first needs to develop the required tools for managing a complex virtualized environment. After the hypervisor is released, he said, Microsoft will also need to update its Virtual Machine Manager product.

Still, Cherry is upbeat about the new products, particularly Windows Server 2008. The redesign of the OS to allow customers to install only the functions they need for particular tasks, or roles, will provide security and maintenance advantages, he said. He also pointed to the new Internet Information Server, which gives more options for running and controlling applications remotely, and a significant update to Terminal Services, which will make it easier to run line-of-business applications on a server and make them appear to the end-user as if they were running locally.

Twentysix New York's Brust said the new server products are "rock solid on 64-bit, and so too are the currently shipping versions of SQL Server, SharePoint and other server applications."

"When SQL Server 2008 ships, it will be the third version of the product to offer 64-bit support," Brust said. "Let’s face it, it’s time to move off 32-bit."

Critical VMware bug lets attackers zap 'real' Windows

A critical vulnerability in VMware Inc.'s virtualization software for Windows lets attackers escape the "guest" operating system and modify or add files to the underlying "host" operating system, the company has acknowledged.
As of Sunday, there was no patch available for the flaw, which affects VMware's Windows client virtualization programs, including Workstation, Player and ACE. The company's virtual machine software for Windows servers and for Mac- and Linux-based hosts is not at risk.

The bug was reported by Core Security Technologies, maker of the CORE IMPACT penetration-testing framework, VMware said in a security alert issued last Friday. "Exploitation of this vulnerability allows attackers to break out of an isolated guest system to compromise the underlying host system that controls it," Core Security said.

According to VMware, the bug is in the shared-folder feature of its Windows client-based virtualization software. Shared folders let users access certain files -- typically documents and other application-generated files -- from the host operating system and any virtual machine on that physical system.

"On Windows hosts, if you have configured a VMware host-to-guest shared folder, it is possible for a program running in the guest to gain access to the host's complete file system and create or modify executable files in sensitive locations," confirmed VMware.

VMware has not posted a fix; instead, it told users to disable shared folders.
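
For administrators with many guests, a small script can flag which virtual machines have the feature turned on. This is only a sketch: the .vmx key pattern it looks for (sharedFolder0.enabled and similar entries) is an assumption about how Workstation records the setting, so verify it against your own configuration files before relying on it.

```python
# Flag .vmx files that appear to have shared folders enabled. The key names
# searched for are assumptions, not taken from VMware documentation.
from pathlib import Path

def vms_with_shared_folders(root):
    flagged = []
    for vmx in Path(root).rglob("*.vmx"):
        text = vmx.read_text(errors="ignore").lower()
        if "sharedfolder" in text and '.enabled = "true"' in text:
            flagged.append(vmx)
    return flagged

for vmx in vms_with_shared_folders(r"C:\Virtual Machines"):
    print("Shared folders appear enabled in:", vmx)   # disable them in the VM's settings
```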

The Palo Alto, Calif.-based company also made it clear that the vulnerability isn't present in its server line of virtual machine software; VMware Server and ESX Server do not use shared folders. Newer versions of VMware's Windows client virtualization tools also disable shared folders by default, the company added. Users must manually turn on the feature to be vulnerable.

A similar bug was reported by VeriSign Inc.'s iDefense Labs to VMware in March 2007. VMware patched it about a month later.

Friday's alert, however, was the second security-related notice posted by VMware in two days. On Thursday, VMware patched its ESX Server line to quash five bugs that could be used to slip past security restrictions, launch denial-of-service attacks or compromise virtualized systems.

The increased reliance on virtual machines, particularly on enterprise servers, has come with its own set of security problems, researchers and IT administrators have noted previously. Sunday, an analyst at the SANS Institute's Internet Storm Center (ISC) extended that warning to desktop virtualization users, particularly security professionals.

"We make an extensive use of virtualization technologies for multiple purposes: malware analysis, incident response, forensics, security testing, training, etc., and we typically use the client versions of the products," said Raul Siles in a post to the ISC blog. "It is time to disable the shared-folder capabilities."

Hackers ramp up Facebook, MySpace attacks

Hackers are actively exploiting an Internet Explorer plug-in that's widely used by Facebook Inc. and MySpace.com members with a multi-attack kit, a security company warned Friday.
The exploit directed at Aurigma Inc.'s Image Uploader, an ActiveX control used by Facebook, MySpace and other social networking sites to allow members to upload photos to their profiles, is just one of five in a new hacker tool kit being used by several Chinese attack sites, said Symantec Corp.

Attacks begin when users receive spam or an instant message with an embedded link, said Darren Kemp, the Symantec analyst who wrote the advisory. The link takes users to a bogus MySpace log-in page, which tries to steal members' credentials while it silently probes their computers for vulnerabilities in Image Uploader, Apple Inc.'s QuickTime, Windows and Yahoo Music Jukebox.

Although the Windows and QuickTime bugs were patched eight and 13 months ago, respectively, the Uploader and Yahoo vulnerabilities were made public and fixed only within the past few weeks. Kemp noted the hackers' fast reaction times. "[This demonstrates] how quickly attackers are leveraging new vulnerabilities," said Kemp. "It is unlikely that attackers will stop trying to leverage this vulnerability any time soon."

The Aurigma bug was disclosed at the end of January by researcher Elazar Broad. Shortly after that, a spokeswoman for Facebook and MySpace said that the social networking sites were alerting members to the danger. New bugs cropped up a week later, however, forcing Aurigma to patch the ActiveX control again. Not until Feb. 13 did the company declare "Image Uploader is safe again!"

Yahoo Inc. plugged a pair of holes in Music Jukebox on Feb. 6, two days after Broad published attack code for both.

Symantec has been tracking attacks against the Aurigma vulnerabilities for most of the month. More than three weeks ago, for example, another of its analysts reported seeing evidence of a new multi-exploit hacker tool kit -- presumably the same one analyzed by Kemp -- that included an Image Uploader attack.

Exploits against ActiveX controls are nothing unusual; scores of bugs in the Microsoft-made technology were uncovered and exploited in 2007, according to Symantec. It counted 210 ActiveX vulnerabilities in the first half of last year alone, a prime factor in making IE a popular attack target.

In fact, after the Uploader and Yahoo Music Jukebox vulnerabilities were disclosed, the U.S. Computer Emergency Readiness Team (US-CERT), which is part of the U.S. Department of Homeland Security, recommended that IE users disable ActiveX.
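
Short of disabling ActiveX wholesale, Windows also supports a per-control "kill bit" that stops IE from loading one specific control. The registry location and the 0x400 Compatibility Flags value below are Microsoft's documented mechanism, but the CLSID shown is a placeholder, not Image Uploader's real class identifier; treat this as a sketch to run with administrative rights, not a vetted remediation.

```python
# Set the ActiveX "kill bit" for a given CLSID so Internet Explorer refuses to load it.
# Requires administrative rights; the CLSID below is a placeholder.
import winreg

def set_kill_bit(clsid):
    key_path = rf"SOFTWARE\Microsoft\Internet Explorer\ActiveX Compatibility\{clsid}"
    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        winreg.SetValueEx(key, "Compatibility Flags", 0, winreg.REG_DWORD, 0x400)

set_kill_bit("{00000000-0000-0000-0000-000000000000}")  # placeholder, not the real control's CLSID
```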

Kemp, however, saw the social networking angle as just as important. "Given the growing popularity of social networking sites like MySpace and Facebook, attacks leveraging vulnerabilities in their client-side components are not surprising," he wrote in the warning.

Symantec urged users to update the Image Uploader ActiveX control to Version 4.5.57.1.

YouTube blames Pakistani ISP for global site outage

Many users around the world could not access the YouTube site for about two hours on Sunday. The company blamed the outage on erroneous routing information introduced by a Pakistani Internet service provider. Pakistani authorities ordered ISPs there to block the site on Friday.
Traffic to YouTube was misrouted for around two hours, rendering the site inaccessible for many users around the world, YouTube said on Monday.

"We have determined that the source of these events was a network in Pakistan," the company said, adding that it is still investigating the problem to prevent it from happening again.

The Pakistan Telecommunication Authority (PTA) ordered the country's ISPs to block users' access to YouTube on Friday because of an inflammatory anti-Islamic video on the site, Wahaj us Siraj, convener of the Association of Pakistan Internet Service Providers, said in a telephone interview on Monday.

If the video is provocative, it is better for it to be removed than to provoke unrest in Pakistan, said Siraj, who added that he did not know the contents of the video.

Access to YouTube is still blocked in Pakistan while the ISPs work with the PTA to narrow its order to block a single URL (Uniform Resource Locator) pointing to the video, Siraj said. He expects the PTA to make an order to that effect later on Monday.

Steven Schwankert in Beijing contributed to this report.

Researchers dream up mobile chameleon device

Imagine tapping out text messages on a device the size of an index card and as flat as a piece of paper, then folding it in thirds to hold it to your ear and make a phone call. Refold it in a slightly different shape and wrap it around your wrist, where it becomes a watch and also communicates with an ear bud that lets you talk hands free.
Nokia researchers, along with researchers at the University of Cambridge in England, have created an animated video describing such a vision for mobile devices, which could come in the future through nanotechnology developments.

The animation shows practical applications for several specific types of work that the scientists are developing based on their nanotechnology research, said Tapani Ryhanen, the head of multimedia devices research at Nokia Research Center. The concept video was created at the prodding of New York's Museum of Modern Art, which is opening an exhibit Sunday called "Design and the Elastic Mind," he said.

In another segment of the video, the user flaps the paper-thin device in front of an apple. Tiny particles fly off the apple, landing on the device, which quickly analyzes them. It then flashes a warning signal, recommending that the user wash the apple before eating it.

That's one of the most interesting potential uses that Ryhanen sees. "Personally, I'm mostly interested about the bigger issue of how we can make our mobile devices more intelligent and so they can sense something from the environment," he said. One day, a device like the one in the video could sense harmful elements in the air. With potentially millions of such devices communicating globally, they might be able to warn people about a disease that could spread into a pandemic, identifying dangerous areas around the world, he said.

The device in the animation is covered in minuscule "grass" that can absorb solar energy to power it. It's also "superhydrophobic," making it incredibly dirt-repellent. The animated woman in the video, sitting at an outdoor café, accidentally drops a bit of honey on the device, and the drop slides off without leaving anything behind.

Just before she walks away, she places the device on top of her brightly colored purse and snaps a photo. When she folds the device around her wrist, she sets a new wallpaper and the entire surface of the device displays the same pattern as her purse.

Currently, the researchers have developed "bits and pieces" of the technologies envisioned in the concept "but we are not yet at the level that we could integrate those things together into a device that we're showing in this animation," Ryhanen said. Some features of the device could start appearing in commercial products as soon as seven years from now, Nokia said.

Around 18 Nokia researchers and 25 University of Cambridge researchers have been working together for about a year at the university's West Cambridge site.

The concept animation video is expected to be available for viewing on Nokia's site on Monday. Nothing about the concept, called Morph, will be on exhibit at the museum, but it will feature in the exhibition catalog and on MoMA's Web site, Nokia said.

SAP ships 'enhancements' for ERP

SAP is expected on Monday to ship a third "enhancement package" to its ERP (enterprise resource planning) application, with new features focusing both on core functionality, such as financials and procurement, and functionality aimed at verticals like the retail and manufacturing industries.
The release also has more than 50 "enterprise services bundles." These are sets of existing SAP ERP service interfaces, packaged in various ways to address specific business processes such as order-to-cash, the company said.

The vendor doesn't charge existing customers for enhancement packages, which stem from a strategy shift it announced in 2006. Instead of issuing major ERP platform releases every 12 to 18 months, it is parceling out incremental updates to the current core platform, SAP ERP 6.0. According to the company, 4,000 implementations of SAP ERP 6.0 have gone live since January 2007.

Given the "historically painful" process of implementing an ERP, it is wise for SAP to move in this direction, said Marc Songini, an analyst with Nucleus Research, by e-mail on Friday.

SAP's rival, Oracle, is also basing its Fusion strategy around pain-free upgrades, making it a competitive play as well, he said.

The move is also a good way for SAP to preserve its installed base, he added. "If you have a choice between the agony of re-implementing SAP or turning to a new vendor such as Oracle or Lawson, you might be tempted to jump ship. But if you're already on SAP, know it warts and all, and want to keep that investment, and SAP is making it easy to get new features without a rip and replace, you won't be as tempted."

Microsoft kills off HD DVD drive for Xbox 360

Microsoft will stop making external HD DVD drives for its Xbox 360 game console, but won't say whether it will offer a Blu-ray Disc drive instead.
The company will continue to provide warranty and product support for existing HD DVD players, it said.

The Xbox 360 has a standard DVD drive built in: support for high-definition content came only with an add-on. Sony's Playstation 3 console, however, has a Blu-ray Disc drive built in, which helped grow support for the rival high-definition format.

Microsoft's announcement comes barely a week after HD DVD's main backer, Toshiba, said it will stop making the drives in the face of declining support for its high-definition format from retailers and studios. HD DVD's other supporters included Microsoft, Intel, HP and Universal Studios. Blu-ray also had the support of Panasonic and Samsung.

Warner Bros., which initially supported HD DVD, said early this year it would switch to Blu-ray Disc, a decision widely seen as a mortal blow to the format. Retailer Wal-Mart also recently said it would no longer sell HD DVDs.

A Microsoft spokesperson said Monday morning that the company is taking the long-term view that support for specific high-definition drives is less important as people increasingly look to download movies and content from the Internet.

Microsoft's Xbox Live Marketplace lets people download content to their Xbox or PC from major studios such as Paramount Studios and Warner Bros., with recent titles such as "Ocean's Thirteen."

That movie, which costs £19.99 (US$39.26) to download from the site, lets a user keep one copy on their PC and one copy on their mobile device. The movie is encoded in Microsoft's Windows Media Player format.