Changing LINKS
informatics news: January 2009

Thursday, January 29, 2009

Oracle subpoenaed nearly 100 TomorrowNow customers, filing shows

Oracle has issued 102 subpoenas to 99 former customers of TomorrowNow, the now-shuttered SAP subsidiary at the heart of Oracle's lawsuit against SAP, according to a court document filed this week.

Forty-nine of the 99 customers have subsequently supplied some 77,012 documents, which SAP is reviewing in order to "appropriately designate any confidential or highly confidential information that they may contain," according to the filing by SAP in U.S. District Court, Northern District of California.

It was not clear Thursday what type of information the documents contain. None of the customers in question were named in the filing.

An Oracle spokeswoman declined comment Thursday. An SAP spokeswoman did not immediately return a call for comment.

Oracle filed a lawsuit against SAP in March 2007, charging that workers at TomorrowNow, a provider of third-party support for Oracle's PeopleSoft, JD Edwards and Siebel applications, illegally downloaded material from Oracle's support systems and used it to court Oracle customers.

SAP has said that TomorrowNow workers were authorized to download materials from Oracle's site on behalf of TomorrowNow customers, but acknowledged some "inappropriate downloads" had occurred. SAP has also said that the downloaded Oracle materials remained within TomorrowNow's systems, and it has denied Oracle's allegations of a wider pattern of wrongdoing.

SAP referred to the TomorrowNow customer subpoenas in one of a series of recent filings related to discovery in the sprawling suit.

"It's a key point to any resolution whether Oracle's bottom line was damaged by former customers turning to TomorrowNow for their [application] support needs," 451 Group analyst China Martens said via e-mail. "Oracle seems to be struggling to prove that point, hence the resorting to customers. We question the wisdom of that move, which seems overly intrusive to end users."

The case's next settlement conference is scheduled for Feb. 23, and a trial date has been set for February 2010.

IDG News Service
computerworld

By Chris Kanaracus

Windows 7: More Flavors Than Ever?

Windows Me may not have had much going for it, but it had one claim to fame: it was the last major release of Windows to come in a single edition, or SKU.

In the ensuing decade, every major release of desktop Windows has come in a wide -- too wide, say many -- variety of flavors.

By one count, Windows XP and Vista each came in eight separate editions, if you include the two Windows Media Player-free versions mandated by the European Union for anti-monopoly reasons.

Even Windows 2000, often romanticized for its small footprint, came in four versions.

This increase in Windows editions has bewildered many consumers and led even ardent Windows fans to make dark jokes.

"I wonder whether Windows 7 will have 700 SKUs or if [Microsoft] will streamline that," Andrew Brust, a technology consultant and Microsoft MVP, has said on his Twitter page.

Paul Thurrott, a well-known Windows blogger, said, "It is laughable. It's such a brazen play on their part to juice people for as much money as they can get."

This MBA textbook-style attempt to maximize revenue by divvying up features by customer segment is actually hurting Microsoft, said Rob Enderle, an independent analyst.

He said Microsoft's decision to strip Active Directory features from consumer versions of Vista meant that workers running Macs at home or on personal laptops have an easier time hooking up to their corporate network than many Vista users.

That is helping Apple gain the foothold in the enterprise it has long been denied, Enderle said.

"In effect, this screwy SKU thing has given Apple an advantage in enterprises that Microsoft has taken away from itself and probably will be one of the primary things slowing Windows 7 adoption" should it come in multiple editions, he said.
Will Windows 7 Continue the 'SKU Inflation'?

How many editions will Windows 7 come in? A recent beta release of Windows 7 lists five versions during the installation process:

Starter Edition, a stripped-down version, available since XP, for customers in developing countries running underpowered hardware;

Home Basic, the controversial low-end consumer flavor introduced with Vista that Microsoft apparently debated whether or not to release;

Home Premium, also introduced with Vista;

Ultimate, introduced with Vista, the loaded-with-goodies version aimed at hard-core hobbyists;

Business, introduced with Vista as the replacement to Professional for corporate use.

A Microsoft spokeswoman confirmed the five version names in the Windows 7 beta, but said they were only "preliminary."

"We will continue to take customer feedback from the beta test period into account as we refine the SKU set for Windows 7 and will share more information when we are further along the development path," the spokeswoman said in an e-mail.

Meanwhile, CNET UK reported that Microsoft plans to make a single version of Windows 7 just for netbooks.

There is evidence, via a Microsoft job posting, that Microsoft plans to release a Small Business version of Windows 7, as it once planned but abandoned for Vista, as well as an Enterprise edition, which already exists with Vista. There would also be two additional 'N' versions of Windows 7 for customers in the EU, which has signalled recently it may even demand Microsoft bundle rival browsers with Windows. In all, Windows 7 could therefore have as many as 10 editions.

Windows blogger Thurrott disagrees, arguing strongly that Microsoft will cut down on the proliferation in editions that hit an apex with XP and Vista. He notes that the public Windows 7 beta includes the locale-specific themes that, in XP and Vista, were only available in the Starter edition.

The public beta, which is of Windows 7 Ultimate, appears able to run on low-end hardware like netbooks, obviating the need to create a separate SKU for it, Thurrott said.

He said that he has also heard reports that Microsoft plans to cut the "useless" Home Basic, that the Business edition will eventually be renamed Professional and include Media Center features, and that an Enterprise edition would be eliminated and its features, such as desktop virtualization, offered as add-ons to interested corporate customers.

Thurrott believes Microsoft's best strategy is to release Windows 7 in just three versions (not including the EU-mandated ones): Home, Professional and Ultimate.

"Gosh, I really do hope so. If there were just three versions, no one would make fun of it," he said. "Five or seven versions, that's just crazy town."

Thurrott also thinks Microsoft should cut the price on all of its versions, as well as let customers install Windows on multiple PCs or virtual machines, as Apple does with Mac OS X.

He said he was hopeful for a reduction in editions because Steven Sinofsky, the Microsoft VP in charge of Windows 7's development, is "a simplicity maven."

Enderle, who hammered Microsoft's version strategy with Vista, especially its decision to release Vista Home Basic, has a more quixotic hope.

"I think there should be one version of Windows which allows the OEMs [PC makers] more flexibility with regard to creating unique user experiences without breaking compatibility, and restores the ability of users to drive OS upgrades in the companies where they work," Enderle said in an e-mail.

"I'm not aware of another instance where a user-focused technology is specifically altered so a user can't bring it into their workplace," he said.

Eric Lai, Computerworld

Google unveils tools that can show if your ISP is giving you what you paid for

Want to know if you're actually getting what you're paying your Internet service provider for?

If you are, join the club. The problem is that it has been far from easy to get a handle on how your service provider deals with various kinds of traffic. That may become an easier job now that Google Inc. is launching what it calls Measurement Lab (M-Lab), an open system that researchers and consumers can use to access its new Internet performance measurement tools.

"Researchers are already developing tools that allow users to, among other things, measure the speed of their connections, run diagnostics, and attempt to discern if their ISP is blocking or throttling particular applications," said Vint Cerf, Google's chief Internet evangelist, and Stephen Stuart, Google's principal engineer, in a blog post. "These tools generate and send some data back and forth between the user's computer and a server elsewhere on the Internet. Unfortunately, researchers lack widely-distributed servers with ample connectivity. This poses a barrier to the accuracy and scalability of these tools."

To tackle the problem, Google announced late on Wednesday that it will host the tools on 37 servers in the U.S. and Europe. The tools are designed to help users try to figure out what might be impairing their broadband speed, as well as find out if BitTorrent is being blocked or throttled by their Internet service providers.
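
The basic idea behind such a speed check is straightforward: move a known amount of data between your machine and a well-connected server and time the transfer. The short Python sketch below illustrates that principle only; the URL is a placeholder you would have to supply, and the real M-Lab tools use purpose-built servers and far more careful methodology.

    import time
    import urllib.request

    def estimate_download_mbps(url, max_bytes=5_000_000):
        """Download up to max_bytes from url and estimate throughput in Mbit/s."""
        start = time.time()
        received = 0
        with urllib.request.urlopen(url) as response:
            while received < max_bytes:
                chunk = response.read(64 * 1024)   # read in 64 KB pieces
                if not chunk:                      # server sent less than max_bytes
                    break
                received += len(chunk)
        elapsed = time.time() - start
        return (received * 8) / (elapsed * 1_000_000)

    # Hypothetical usage -- point it at any large file on a fast server:
    # print(round(estimate_download_mbps("http://example.com/100MB.bin"), 1), "Mbit/s")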

"Seems like the intention behind this is to give consumers a way to keep tabs on their provider and make sure that they're getting what they're paying for in terms of speed," said Dan Olds, an analyst at Gabriel Consulting Group Inc. "Also, with these tools, consumers will supposedly be able to tell if particular high-bandwidth applications, like BitTorrent, are being constrained by their ISP. So if an ISP is limiting video downloads, for example, consumers can use the Google tool, figure it out and start a huge outcry, putting pressure on the ISP to stop."

Just last month, an analyst with ties to the telecommunications industry released a report calling Google a bandwidth hog. Scott Cleland, president of Precursor LLC, a research firm bankrolled by telecommunications heavyweights such as AT&T Inc. and Verizon Communications Inc., reported that Google uses 21 times more bandwidth than it pays for.

Google was quick to fire back. Richard Whitt, Google's Washington telecommunications and media counsel, noted in a blog post that Cleland is "not exactly a neutral party." Whitt also claimed that the analyst had made methodological and factual errors.

Olds noted that the new measurement platform is another salvo in the war between content providers such as Google and network providers.

"This Google tool is a way for consumers to keep their providers honest, but it also serves to further Google's interest in keeping bandwidth limits at bay," he added. "This gives consumers a way to pressure ISPs who try to limit individual bandwidth -- like secretly limiting your account because you download 10 movies a day and giving me my full bandwidth because I don't. There's only so much bandwidth to go around. They can either build more capacity, which they hate to do, or try to put limits on the extreme high users."

Stuart and Cerf noted in their blog post that all the data collected via M-Lab will be made publicly available for other researchers to build on. Google is working on the project with the New America Foundation's Open Technology Institute, the PlanetLab Consortium and academic researchers, according to the company.

"No matter your views on Net neutrality and ISP network management practices, everyone can agree that Internet users deserve to be well-informed about what they're getting when they sign up for broadband, and good data is the bedrock of sound policy," wrote Cerf and Stuart. "Transparency has always been crucial to the success of the Internet, and by advancing network research in this area, M-Lab aims to help sustain a healthy, innovative Internet."

By Sharon Gaudin
computerworld

Microsoft delivers Vista SP2 RC to testers, reports say

Microsoft Corp. has delivered a preliminary release candidate for Windows Vista Service Pack 2 (SP2) to testers and is again on track to offer another public preview next month, according to several reports on the Web.

Just last week, a Malaysian Web site, TechARP, claimed that Vista SP2 had been pushed back a month. Yesterday, however, TechARP, which has accurately predicted Windows delivery dates in the past, revised its estimate, saying that Microsoft had "brought forward their release schedule" and would be issuing an "escrow" build no later than Friday.

Yesterday, reports surfaced that testers had been told by Microsoft that the escrow build of Vista SP2's release candidate was available for downloading. ZDNet blogger Mary-Jo Foley, for example, cited a section of the e-mail notification, which told testers that the company was not interested in feature feedback, but only reports on "SP2 regressions and confirmation of fixes we've made."

An "escrow" build is a version on which development has stopped but that is handed to developers and testers, who are asked to shake out the code one final time to make sure there are no show-stopping bugs.

TechARP's revised timetable claims that Microsoft will deliver a full-fledged release candidate to the public during the week of Feb. 16-20, not in March as the site said last week. That will be followed by a release-to-manufacturing (RTM) build sometime in the first half of the second calendar quarter -- in other words, before mid-May.

Previously, TechARP had said Vista SP2 would reach RTM -- a milestone at which the service pack is officially finished, and sent to computer makers and duplicators for retail copies -- as late as June.

Vista SP2 will be released for download from the Web at an undetermined date after Microsoft slaps the RTM label on the service pack. In the past, Microsoft has waited to post service packs anywhere from just two weeks after RTM to more than six weeks after.

But with the first public beta of Windows 7, the follow-up to Vista, already in users' hands, some have dismissed Vista SP2 as irrelevant.

"Who cares now with Windows 7?" asked a user identified as Luis Mazza on a message thread discussing Vista SP2 at the Windows enthusiast Web site, Neowin.net.

"I could care less as I just got rid of Vista and I'm now only running 7 beta," added "smooth3006" on the same thread.

One analyst, however, disagreed.

"Service packs always matter," said Michael Cherry, an analyst at Directions on Microsoft, a research firm. "Because service packs make it more efficient to update PCs, they increase the chances that people do deploy fixes and patches."

Microsoft has previously declined to comment on TechARP's Vista SP2 schedule, and has instead reiterated its general timetable for delivering Windows Vista SP2 sometime in the second quarter of 2009.

By Gregg Keizer
Computerworld

Is Your ISP Throttling Your Internet Connection?

Think your Internet Service Provider (ISP) is messing with your connection performance? Now you can find out, with Google's new online tools that will diagnose your network connection. Here's a quick walkthrough on how to make the best of them.

Google's broadband test tools are located at Measurementlab.net. On that page, you'll see a first icon that says "Users: Test Your Internet Connection". Click that, and you'll be taken to a page with three tests available and two more listed as coming soon. Of the three available tests, however, only one is fully automated and easy to use.

Glasnost, second on the list, will check whether your ISP is slowing down (as Comcast has done) or blocking Peer2Peer (P2P) downloads from software such as BitTorrent. P2P apps are commonly used for downloading illegal copies of software and media content like movies and music, but they are also used for legitimate purposes, such as distributing large software packages to many users at once.

To use the measurement tool, you will be redirected to the Glasnost site. You'll need the latest version of Java installed, and you should stop any large downloads that you may have running before you begin the test. If you're on a Mac, a popup message will prompt you to trust the site's Java applet.

When you're ready to start, you can choose whether you want to run a full test (approximately 7 minutes long) or a simple test (4 minutes long). When I tried to test my connection, Glasnost's measurement servers were overloaded and an alternative server was offered, but that was overloaded as well. After a short while I was able to run the test.

In the tests of my connection (my provider is Vodafone At Home, in the UK) all results indicated that BitTorrent traffic is not blocked or throttled. But I'm looking forward to hearing from you in the comments how your ISP performed in Glasnost's diagnostics. Meanwhile, make sure you keep an eye on the other tests that will be available soon from Measurementlab.net.

pcworld
Daniel Ionescu


Wednesday, January 28, 2009

Groups Push for Health IT Privacy Safeguards

U.S. lawmakers need to make sure privacy safeguards are in place before pushing electronic health records on the public, senators and witnesses at a hearing said.

Health IT improvements are needed to improve the quality and efficiency of health care in the U.S., but patients might be wary of electronic health records without strong privacy safeguards built in, Senator Patrick Leahy, a Vermont Democrat, said during a Senate Judiciary Committee hearing Tuesday.

"If you don't have adequate safeguards to protect privacy, many Americans aren't going to seek medical treatment," Leahy said. "Health-care providers who think there's a privacy risk ... are going to see that as inconsistent with their professional obligations, and they won't want to participate."

A US$825 billion economic stimulus package, called the American Recovery and Reinvestment Act, includes $20 billion targeted toward health IT efforts. The bill, which could come before the full House this week, establishes an Office of the National Coordinator for Health Information Technology, with the duty of driving health IT standards.

The bill would also create a national health IT research center and a series of health IT extension centers to help health-care providers and patients adopt electronic health records (EHRs). It would fund grant programs to help health-care providers adopt health IT and EHRs.

The bill would also provide EHR incentive payments to health-care professionals, with a $15,000 payment if EHRs are adopted in the first year and declining payments after that. Hospitals would also receive incentive payments.

The bill includes several privacy provisions. It extends privacy requirements to business associates of health-care providers, and it requires the U.S. Department of Health and Human Services to put out annual guidance on the most effective privacy safeguards. The bill also requires health-care providers to notify customers of any security breaches.

Microsoft, which has its own EHR product, sees the necessity of building in security, said Michael Stokes, principal program manager in the company's Health Solutions Group. "Health data is often considered more sensitive than other personally identifiable information," he said. "If health data is stolen or lost, it is not simply a matter of recovering financial assets. It can impact an individual's employment, ability to receive health care and social standing."

The stimulus package is a great opportunity to push health IT and health reform, said Deven McGraw, director of the Health Privacy Project at the Center for Democracy and Technology. "It will help us create the information superhighway for health, that will improve health-care quality and engage more consumers in their care," she said to the Judiciary Committee.

While U.S. residents support a push toward greater use of IT in health care, they remain concerned about potential privacy problems, McGraw added. "Building trust in these systems is absolutely critical to realizing the benefits of this technology," she added.

Lawmakers need to balance privacy with the benefits that health IT can provide, said David Merritt, project director at the Center for Health Transformation and the Gingrich Group. "Privacy cannot be compromised, but neither can we compromise progress in pulling our health-care system out of the technological Stone Age," Merritt said. "We need to find the right balance between privacy at all costs and progress at any cost."

But privacy and progress don't have to conflict, McGraw said. Privacy is "not an obstacle; in fact, the opposite is true," she said. "Enhanced privacy and security developed in health IT will bolster consumer trust."

Whatever privacy and security safeguards are put in place, they will have to change as the industry develops, added James Hester, director of the Vermont Health Care Reform Commission. "The balance point is not static; it will evolve," he said. "We fully expect that the implementation of the initial privacy policies in the growing set of pilot health IT initiatives will teach us important lessons in the next couple of years."

Grant Gross, IDG News Service
pcworld

Windows 7 to be 'thoroughly' tested by antitrust regulators

Changes in the OS generate new documentation to check compliance with '02 agreement

Technical advisers to the antitrust regulators who monitor Microsoft Corp.'s compliance with a 2002 antitrust settlement will test Windows 7 "more thoroughly" than earlier versions of the operating system were tested, according to a recently-released status report filed with the federal judge watching over the company.

The three-member panel of computer experts that works for state antitrust officials has had a copy of Windows 7 since at least last March, but in December 2008, Microsoft delivered additional documentation to the technical committee.

In the report, submitted last Wednesday to U.S. District Judge Colleen Kollar-Kotelly, antitrust officials with the Department of Justice, 17 states and the District of Columbia said that Microsoft had given notice that "changes to the protocols in Windows 7" required 30 new and 87 revised technical documents.

Microsoft has been under a microscope since it struck a deal in 2002 that required the company to document communication protocols so that other developers, competitors included, could craft software that works smoothly with Windows clients and servers. The decree also set up the technical committee and forced Microsoft and state and federal antitrust officials to deliver regular reports to Kollar-Kotelly.

The newest report spelled out changes the committee, dubbed "TC" by the court, will make to test Windows 7, the successor to Vista.

"In light of the number of new documents that need to be reviewed, the TC is going to shift its focus to direct review of the documents by the TC's engineers as the most efficient method of identifying issues with the documentation," read the status report. "The revised strategy will enable the TC to review the new Windows 7 and system documents more thoroughly than it would otherwise, which is particularly desirable given the significance of these new documents to the project as a whole."

Originally, the consent decree Microsoft signed was to expire in November 2007. Several states objected, however, and after months of legal back-and-forth, Kollar-Kotelly in January 2008 extended her oversight by another two years, to Nov. 12, 2009.

Microsoft is also facing renewed scrutiny from the European Union, which two weeks ago filed preliminary charges against the company, accusing it of violating antitrust laws since 1996 by bundling the Internet Explorer browser with Windows.

Computerworld
By Gregg Keizer

Tokyo Electric Plans Big Solar Plant

Tokyo Electric Power is increasing its investment in solar energy with plans to build a 10-megawatt solar plant near Tokyo.

The facility will be constructed in Yamanashi prefecture, which is just to the west of Tokyo, and partial operations will start in the financial year from April 2011. Details of the price of the project and supplier of the solar panels were not announced.

Several Japanese companies are working to become leaders in the solar-power-generation field, and contracts from domestic utilities like TEPCO are likely to provide a big push toward their goal of securing a sizable portion of the global market.

Last year Sharp said it would install a 9-megawatt generating facility on the roof of a new factory it is building in west Japan. The facility will be operated with Kansai Electric Power and will expand to eventually reach 18 megawatts -- enough to provide about 5 percent of the power used by the factory, which will produce solar panels.

Other Japanese electronics companies in the solar business include Sanyo Electric and Toshiba.

Martyn Williams, IDG News Service

pcworld

AT&T Revenue Up, but Net Income Down for Quarter

AT&T reported net income of US$2.4 billion for the fourth quarter of 2008, down from $3.1 billion in the previous year, with declining voice revenue offsetting gains from iPhone adoption and increased broadband profits.

AT&T posted operating revenue of $31.1 billion for the fourth quarter, up from $30.3 billion in the fourth quarter of 2007. The company narrowly missed analyst expectations, however, with adjusted earnings per share of $0.64, while analysts expected $0.65, according to Thomson Reuters.

"Despite the economic environment, we grew revenues in 2008, and I expect 2009 will be another year of overall revenue growth and solid progress for our company," Randall Stephenson, AT&T chairman and CEO, said in a statement.

For the year, AT&T reported revenue of $124 billion, up 4.3 percent, and net income of $12.9 billion, up 7.7 percent over 2007.

AT&T's traditional voice services posted revenue of $8.8 billion in the fourth quarter, down from $9.8 billion in the fourth quarter of 2007. Operating expenses also increased by about $1.3 billion.

Those numbers were partially offset by AT&T's mobile phone division. The company reported a net gain of 2.1 million wireless customers in the fourth quarter, with 1.9 million new Apple iPhone customers. About 40 percent of iPhone customers are new to AT&T, the company said.

AT&T had 77 million mobile phone customers at the end of the year.

AT&T Wireless posted revenue of $11.5 billion, up from $10.2 billion in the fourth quarter of 2007, not including handset and accessory sales. AT&T's wireless data revenue grew 51.2 percent to $3.1 billion. AT&T Wireless customers sent nearly 80 billion text messages during the fourth quarter, more than double the number from the fourth quarter of 2007.

The company also reported growth in its fiber-based U-verse television and broadband numbers. AT&T's U-verse TV service had a net gain of 264,000 customers in the fourth quarter, up from 232,000 in the third quarter.

The company's wireline data revenue grew by 14.2 percent from the fourth quarter of 2007.

AT&T's total broadband connections, which include wireline subscribers and wireless customers with 3G LaptopConnect cards, increased by 357,000 in the fourth quarter to reach 16.3 million in service, up 1.5 million or 10.3 percent over the past year.

Grant Gross, IDG News Service

pcworld

Gmail Goes Offline

If you live in Gmail, but don’t always have a broadband connection available, today should be a happy day for you. Google is rolling out a new system for letting Gmail users access their accounts offline. Google will cache your messages on your system using Google Gears. You’ll be able to open your browser to Gmail.com, see your inbox, read and label messages and even write replies without a Net connection. Your messages will send once your system reconnects to the Web.

The system is beta (of course) and accessible through Gmail Labs. But it won’t be immediately available to everyone – Google is parceling out access as it experiments with the new feature. I don’t have access to the new feature yet, so I’ve still got lots of questions. But Google’s post makes it sound like the experience will be almost indistinguishable from using Gmail normally.

“Gmail uses Gears to download a local cache of your mail. As long as you're connected to the network, that cache is synchronized with Gmail's servers. When you lose your connection, Gmail automatically switches to offline mode, and uses the data stored on your computer's hard drive instead of the information sent across the network. You can read messages, star and label them, and do all of the things you're used to doing while reading your webmail online. Any messages you send while offline will be placed in your outbox and automatically sent the next time Gmail detects a connection,” Gmail Engineer Andy Palay wrote.

There will also be a “flaky connection mode” that’s supposed to give you the best of both worlds. It’ll assume that you’re disconnected and use the local cache to store your data, but whenever your connection is working, it’ll sync with Google’s servers in the background.
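
For readers who want a feel for the pattern being described, here is a minimal Python sketch of an offline outbox: writes always land in a local store first and get pushed to the server whenever a connection is available. It is purely illustrative -- this is not the Gears API, and the connection.deliver() call is a hypothetical stand-in for whatever actually sends the message.

    import sqlite3

    class OfflineOutbox:
        """Toy model of the offline/'flaky connection' pattern: queue locally,
        sync whenever a connection happens to be available. Not the Gears API."""

        def __init__(self, path="outbox.db"):
            self.db = sqlite3.connect(path)
            self.db.execute(
                "CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, message TEXT)")

        def send(self, message, connection=None):
            # Always queue locally first, so nothing is lost while offline.
            self.db.execute("INSERT INTO outbox (message) VALUES (?)", (message,))
            self.db.commit()
            if connection is not None:
                self.flush(connection)

        def flush(self, connection):
            # Push anything queued while offline; drop a row only after a successful send.
            rows = list(self.db.execute("SELECT id, message FROM outbox"))
            for row_id, message in rows:
                if connection.deliver(message):     # hypothetical transport object
                    self.db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
            self.db.commit()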

This all sounds pretty good, but here are my questions:

How much will Gmail cache? Just my inbox, my entire 6.2GB mail file or something in between? I work pretty hard to keep my inbox clear, so I hope that it’ll cache more than just my inbox.

How extensively can you search? The biggest reason I use Gmail is that I can find a message I got two years ago in just a few seconds with the right search terms. But depending on just how much gets cached, your search capability could be severely limited.

Will you work the same way in Gmail whether you’re offline or online? That’s certainly the way Google makes it sound. If so, that’ll be a big step forward from other attempts to bring webmail offline. You’ve long been able to access your Gmail account through a client like Mozilla Thunderbird. But that doesn’t give you all the Gmail functionality like labels. Yahoo Mail has offered offline access since last summer using Zimbra Desktop. But that also involves using a client on your desktop. For offline access to Windows Live Hotmail, Microsoft suggests using their Mail client software.

Is Gears up to the challenge? Google launched this system for creating offline access to Web apps nearly two years ago. For a long time, the only apps that used it were Google Reader and the to-do list Remember the Milk -- an indication that developing for and implementing Gears wasn’t quite as simple as Google would have you believe. In fact, bringing Gmail offline was an obvious use of Gears that has taken 21 months to come to fruition.

(Google also says it's readying an offline version of Google Calendar, also presumably using Gears, though the company didn't specifically say that. Offline Calendar will initially be available only for users of Google Apps Standard Edition and there's no firm release date.)

Is Gears ready now? We’ll see soon. I’ll check back in once I get a chance to play with Gmail offline. And let me know what questions you have. I’ll do my best to answer them. In the meantime, you can watch this relatively lame video from your friends at Google.

pcworld
Edward N. Albro

Coming soon: Full-disk encryption for all computer drives

Drive makers settle on a single encryption standard

The world's six largest computer drive makers today published the final specifications for a single, full-disk encryption standard that can be used across all hard disk drives, solid-state drives (SSD) and encryption key management applications. Once enabled, any disk that uses the specification will be locked without a password -- and the password will be needed even before a computer boots.

The three Trusted Computing Group (TCG) specifications cover storage devices in consumer laptops and desktop computers as well as enterprise-class drives used in servers and disk storage arrays.

"This represents interoperability commitments from every disk drive maker on the planet," said Robert Thibadeau, chief technologist at Seagate Technology and chairman of the TCG. "We're protecting data at rest. When a USB drive is unplugged, or when a laptop is powered down, or when an administrator pulls a drive from a server, it can't be brought back up and read without first giving a cryptographically-strong password. If you don't have that, it's a brick. You can't even sell it on eBay."

By using a single, full-disk encryption specification, all drive manufacturers can bake security into their products' firmware, lowering the cost of production and increasing the efficiency of the security technology.

For enterprises rolling out security across PCs, laptops and servers, standardized hardware encryption translates into minimum security configuration at installation, along with higher performance with low overhead. The specifications enable support for strong access control and, once set at the management level, the encryption cannot be turned off by end-users.

Whenever an operating system or application writes data to a self-encrypting drive, there is no bottleneck created by software, which would have to interrupt the I/O stream and convert the data "so there's no slowdown," Thibadeau said.

"Also, the encryption machinery uses no power. When it reads data from the drive, it displays it to the user in the clear. It's completely transparent to the user," he said.

The TCG includes Fujitsu, Hitachi GST, Seagate Technology, Samsung, Toshiba, Western Digital, Wave Systems, LSI Corp., ULink Technology and IBM.

"In five years time, you can imagine any drive coming off the production line will be encrypted, and there will be virtually no cost for it," said Jon Oltsik, an analyst at Enterprise Strategy Group.

Here are the three specifications:

* The Opal specification, which outlines minimum requirements for storage devices used in PCs and laptops.

* The Enterprise Security Subsystem Class Specification, which is aimed at drives in data centers and high-volume applications, where typically there is a minimum security configuration at installation.

* The Storage Interface Interactions Specification, which specifies how the TCG's existing Storage Core Specification and the other specifications interact with other standards for storage interfaces and connections. For example, the specification supports a number of transports, including ATA parallel and serial, SCSI SAS, Fibre Channel and ATAPI.

Several of the drive manufacturers, including Seagate, Fujitsu and Hitachi, already support the standard on some of their drives. Hitachi, for instance, is shipping its internal Travelstar 5K500.B laptop drives with full-disk encryption.

Several encryption management software vendors, including Wave Systems, WinMagic Inc. and CryptoMill Technologies, have also announced product certification for the standard.

Brian Berger, a marketing manager with Wave Systems and chair of the TCG marketing work group, said the specifications call for the use of the Advanced Encryption Standard (AES). Vendors are free to choose either AES 128-bit or AES 256-bit keys depending on the level of security they want. Neither has been broken.

"Things like key manageability and patch management become things of the past," he said. "You don't have to worry about what version of encryption software is running or what [encryption appliance] your system's plugged into. When encrypted drives are under management, users can't turn off encryption, so there's no chance of users losing machines with valuable data on them after having turned off encryption."

The effort to create the encryption specifications, which began six years ago, focused on full-disk encryption, which protects data on a computer by encrypting all of the information on the computer's hard drive regardless of what partition it's on. In order to gain access to the information, users would first have to supply a password, which, in turn, would be used to unlock a key used to decrypt the data.

"You can use these [enabled] drives to childproof your laptop because it operates outside of Windows. Windows hasn't even booted yet. Your kid can't crack it unless [he] has the password. You can leave the laptop at home and rest assured a 14-year-old can't get on it," Thibadeau said.

IT departments will also be able to repurpose drives using the encryption standard by cryptographically erasing them with a few keystrokes. Cryptographic erasure changes the cryptographic key, thus making data permanently inaccessible.
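
To make the password-unlocks-a-key flow and cryptographic erasure concrete, here is a small software sketch in Python using the third-party cryptography package (my choice for illustration -- the TCG drives do all of this in firmware, not in software like this). A random data key encrypts the contents, a password-derived key wraps that data key, and discarding or replacing the data key renders the old ciphertext permanently unreadable.

    import base64
    import os
    from cryptography.fernet import Fernet, InvalidToken
    from cryptography.hazmat.primitives.hashes import SHA256
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def key_from_password(password: bytes, salt: bytes) -> bytes:
        # Derive a wrapping key from the user's password.
        kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=200_000)
        return base64.urlsafe_b64encode(kdf.derive(password))

    salt = os.urandom(16)
    data_key = Fernet.generate_key()                        # key the "drive" encrypts with
    wrapped_key = Fernet(key_from_password(b"passphrase", salt)).encrypt(data_key)

    ciphertext = Fernet(data_key).encrypt(b"sensitive file contents")

    # Normal unlock: password -> wrapping key -> data key -> plaintext.
    unlocked = Fernet(key_from_password(b"passphrase", salt)).decrypt(wrapped_key)
    assert Fernet(unlocked).decrypt(ciphertext) == b"sensitive file contents"

    # "Cryptographic erase": replace the data key and forget the old one.
    # Without it, the existing ciphertext can never be read again.
    data_key = Fernet.generate_key()
    try:
        Fernet(data_key).decrypt(ciphertext)
    except InvalidToken:
        print("old data is permanently unreadable")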

"The specific way in which encryption is done inside the drive doesn't matter for interoperability," said Jorge Campello, senior manager of architecture and electronics at Hitachi. "What matters is how they drives are configured and how access control is configured. So any drive, in conforming to these standards, will have the same interface commands."

By Lucas Mearian
computerworld

Monday, January 26, 2009

AMD working on extra-low-power chip for its Shanghai line

The new Opteron processor could be aimed at cloud computing centers

Advanced Micro Devices Inc. (AMD) is adding a new, low-power Opteron processor to its Shanghai line roadmap, and it plans to release the chip in the second quarter of the year.

AMD said the need for lower-power chips is being driven, in part, by the rise of cloud computing centers, the massive data centers being built by the likes of Microsoft and Google to deliver cloud services.

AMD isn't saying just yet how far it can cut power usage in the upcoming 45nm, quad-core chip below that of its current low-power chip, a 55-watt processor released today. AMD revealed its plans for the fourth chip as part of the release of two new processors.

In November, AMD released the first in its line of 45nm, quad-core Opterons, a 75-watt version with speeds of up to 2.7GHz that makes up the bulk of sales in the Shanghai line.

Today, it followed that processor with two more: A 105-watt, 2.8GHz chip designed for high performance and large database users, and the 55-watt version, which offers speeds of up to 2.3GHz. That latter processor may appeal to cloud-centric facilities and large hosting data centers.

These chips succeed similar offerings in the 65nm Barcelona line of processors.

The decision to add the fourth chip to the line follows AMD's announcement last fall that Microsoft would use its Opteron chip for its cloud computing initiative, the company's Windows Azure Compute Service.

AMD isn't alone in looking to diversify its chip line to address escalating heating and cooling problems in data centers. Rival Intel, for instance, last year released a 50-watt Xeon chip.

In all, AMD is releasing nine chips with varying capabilities and prices: two in the 105-watt line, priced at $1,165 and $2,649, and seven in the 55-watt line with speeds ranging from 2.1GHz to 2.3GHz. Prices for the 55-watt chips range from $316 to $1,514.

AMD said the new chips are available immediately on servers from Hewlett-Packard Co. and Rackable Systems Inc. Servers using these chips will be available from other vendors this quarter, including Dell Inc., IBM and Sun Microsystems Inc.

At one time, a low-power chip with lower clock speeds would have gotten mostly niche adoption. But this niche is growing. Cloud computing is really a high-growth segment, said Steve Demski, a senior product manager at AMD. As a result, the chip maker is expecting demand for low-power chips to increase with the rise of cloud computing as a platform.

Lower power, combined with a virtualization-optimized processor, fits cloud-computing environments looking to save as much on energy consumption as possible, said Charles King, an analyst at Pund-IT Inc. in Hayward, Calif.

By Patrick Thibodeau
Computerworld

Qimonda's Woes Will Zap Gamers, PC and Server Users

The bankruptcy filing by Germany's Qimonda AG last week marks a first for a major technology company amid the current global economic downturn, and it likely won't be the last.

Qimonda's problems are already rippling across the globe and could take down others, as well as cause a sharp rise in DRAM prices for users. Qimonda's problems could also zap computer gamers and businesses in the short term because the company supplies DRAM or specialty chips to around a quarter of each market.

Qimonda had hoped to use bankruptcy as a way to reorganize its business, but the prospects of such a comeback are quickly dimming. For one thing, a chip supplier Qimonda counted on, Inotera Memories, has said it won't provide DRAM to Qimonda anymore.

"This means that Qimonda will lose close to 35 percent of its capacity overnight, which will hamper the company's ability to support existing clients and to continue normal operations," said Andrew Norwood, DRAM industry analyst at Gartner, in a report released Monday.

"As for Qimonda, an insolvency administrator has been appointed, and the administrator will be looking to rescue as much money from the company as possible -- either though restructuring or, more likely, a fire sale of assets," he added.

The first place users will feel Qimonda's bankruptcy filing is in the price of DRAM and its possible reduced use in PCs, laptops and other hardware. The low price of DRAM over the past year has prompted many PC vendors to add more DRAM than normal to desktops and laptops, but a sharp rise in chip prices could halt that trend. PC sales are already waning and the last thing vendors want to do is raise prices. Reducing the amount of DRAM to the least needed per PC will keep system prices stable.

The price of 1Gb DDR2 (double data rate, second generation) chips that run at 667MHz could rise to between US$1.20 and $1.50 in the near term as the DRAM market absorbs the Qimonda shock, according to DRAMeXchange Technology, a clearinghouse for the chips.

The price of the chips on Friday averaged US$0.85 on DRAMeXchange, indicating a rise of as much as 76 percent should they reach $1.50 each.
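
Just to make the quoted figure concrete, the roughly 76 percent follows directly from those two spot prices:

    low, high = 0.85, 1.50
    print(round((high - low) / low * 100))   # -> 76 (percent rise if prices reach $1.50)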

Some analysts say a price rise could be short lived because even if Qimonda shut down immediately the chip glut would still exist and several companies -- or even the German insolvency administrator handling Qimonda's case -- could dump inventories. But the bankruptcy filing and near term changes in business, such as Inotera Memories ceasing to supply Qimonda with 35 percent of its output, will disrupt the supply chain.

Qimonda's woes could far more significantly affect the graphics and computer server markets, according to Nam Hyung Kim, memory industry researcher at iSuppli.

Qimonda accounted for 26 percent of global shipments of graphics DRAM and as much as 20 percent of the DRAM for computer servers, he said. Companies that use Qimonda's chips will have to prepare for a possible shutdown by finding new suppliers for the chips.

Qimonda filed for bankruptcy protection in Germany last Friday after a €325 million (US$422.5 million) financing package from the German state of Saxony, a Portuguese financial institution and Qimonda's parent company, Infineon Technologies, could not be completed in time.

The company's troubles were partly caused by a DRAM glut that started over a year and a half ago, while the global economic downturn pushed the company over the edge.

Overly optimistic projections of PC demand, partly based on hopes that Microsoft Windows Vista would be a hit, and easy lending terms from financiers prompted companies to build massive new DRAM factories. In the end, Vista didn't stimulate PC demand as much as hoped, and the new factories have become a problem for makers because of the oversupply they have created.

As DRAM prices sank, industry revenue dried up.

DRAM marked a second year of revenue declines last year due to the chip glut. Industry revenue fell 19.8 percent in 2008 to $25.2 billion, down from $31.5 billion in 2007, according to preliminary estimates by iSuppli. The market researcher predicts the sector will suffer a 4.3 percent decline this year.

Qimonda isn't alone in its financial troubles.

ProMOS Technologies of Taiwan faces a US$330 million bond repayment in mid-February and the company doesn't have cash on hand to cover the debt. ProMOS has petitioned the Taiwan government for a loan and sought new partnerships but so far has not found any takers.

It's hard to tell what might happen next, but a spike in DRAM demand is highly unlikely now that the holiday season and the Lunar New Year, celebrated in China and much of the rest of Asia, have passed.

"If Qimonda exits the DRAM business, their competitors are likely to prosper while their partners will suffer," says Jim Handy, analyst at Objective Analysis. "Should the company remain intact, they are still more likely to be taken over than not. In any event, we anticipate that, by the end of 2009, there will exist at least one less DRAM maker."

Dan Nystedt, IDG News Service

Top Internet Security Suites: Paying for Protection

Using security software is more important than ever. Our tests of the latest all-in-one security suites show that good protection can shut down the nastiest viruses, spyware, and adware.

In the early days of computer viruses, you could get by with careful surfing--and without antivirus protection. Now, crooks love nothing more than to discover a nasty zero-day security flaw for which there's no defense, and then to infiltrate otherwise benign and popular Web sites with hidden, malicious programming made to attack that security flaw. While relatively uncommon, such tactics can catch even the most careful surfer. Like it or not, you need security tools.

To help you select the best security for your computer, PC World put nine comprehensive suites--from Avira, BitDefender, F-Secure, Kaspersky, McAfee, Panda, Symantec, Trend Micro, and Webroot--through the wringer. We poked and prodded, surfed and scanned until one contender came out on top.

Our all-around winner this year was Norton Internet Security 2009. Once again Symantec's suite did a fine job combining strong performance with smooth design, starting with a top-tier overall malware detection rate just shy of 99 percent. It sports an attractive and well-laid-out interface, and useful new features in this year's version include "pulse" automatic updates, which send new malware-detection signatures out to your PC every 5 to 15 minutes.

Symantec also incorporated cloud computing into the suite this year, with on-the-spot online checks to supplement scans that used to occur entirely on your machine. The Norton Insight feature, which compares a new program on your PC against a reputation database of programs that other Norton users have, is meant primarily to improve the suite's performance by preventing it from scanning known safe applications.

Harnessing the immediacy of the Internet is a trend this year. The F-Secure, McAfee, and Panda packages all now use similar online checks to attempt to detect new malware more quickly, without having to wait for a scheduled signature update; the approach has the potential to boost overall detection rates.

With new features, strong performance and pleasing design, Norton Internet Security deserves its top spot--but that doesn't mean it's for everyone. Norton can't perform backups, for instance, while four other suites in this group offer the feature. Also, though Norton's detection rate is very good, it isn't the best: Avira's Premium Security Suite took top honors again this year in identifying both known and unknown malware. What's more, you'll have to pay for the best. At $70 for three PCs, Norton was the second-most-costly suite we tested; only Kaspersky Internet Security 2009 cost more, at $80 for three users.

If you want the best all-around security, buy the Norton suite. But for particular needs, you might have better choices--the less-expensive BitDefender (our number two pick), say, or maybe Avira, the top virus detector (with backups)--so be sure to read all our reviews before you commit.
How We Tested the Suites

To evaluate the suites, PC World once again partnered with AV-Test.org. This German organization pitted each suite against its "zoo" of 654,914 backdoor programs, bots, worms, Trojan horses, and password stealers, as well as against 46,246 adware samples. Each suite was allowed to connect to the Internet to use online checks, where available.

The group's rootkit tests looked at each package's ability to detect and clean up both active and inactive rootkits--stealth malware designed to hide criminal software on your PC. AV-Test also assessed scan speed and each suite's ability to clean up a malware infection, and the group's heuristic and behavioral detection tests determined how well a suite could identify new and unknown malware for which it didn't yet have a signature. The heuristic tests used two- and four-week-old signature files with each suite to simulate encounters with unknown malware, while the in-depth behavior tests examined how well each suite could identify malware based solely on how it acted on a PC.

After AV-Test evaluated and scrutinized each suite's innards, we tested its interface and design. We determined whether it smoothly handled alert pop-ups or phishing-site blocks, or whether its actions left us scratching our heads. We combed through all the settings to see whether they were at appropriate defaults, and also whether advanced users could easily change them.

Where security applications are concerned, however, performance and effectiveness outweigh design, so the bulk of our scoring depended on how well a suite detected and disinfected malware, along with how fast it scanned. We also considered price, support, and features in the final rankings.
Internet Security Suites: Read Our Reviews

1. Symantec Norton Internet Security 2009
2. BitDefender Internet Security 2009
3. Panda Internet Security 2009
4. McAfee Internet Security Suite 2009
5. Avira Premium Security Suite 8.2
6. Kaspersky Internet Security 2009
7. F-Secure Internet Security 2009
8. Webroot Internet Security Essentials
9. Trend Micro Internet Security Pro 2009

Erik Larkin, PC World

Saturday, January 24, 2009

Protecting Against the Rampant Conficker Worm

Businesses worldwide are under attack from a highly infectious computer worm that has infected almost 9 million PCs, according to antivirus company F-Secure.

That number has more than tripled over the last four days alone, says F-Secure, leaping from 2.4 million to 8.9 million infected PCs. Once a machine is infected, the worm can download and install additional malware from attacker-controlled Web sites, according to the company. Since that could mean anything from a password stealer to remote control software, a Conficker-infected PC is essentially under the complete control of the attackers.

According to the Internet Storm Center, which tracks virus infections and Internet attacks, Conficker can spread in three ways.

First, it attacks a vulnerability in the Microsoft Server service. Computers without the October patch can be remotely attacked and taken over.

Second, Conficker can attempt to guess or 'brute force' Administrator passwords used by local networks and spread through network shares.

And third, the worm infects removable devices and network shares with an autorun file that executes as soon as a USB drive or other infected device is connected to a victim PC.

Conficker and other worms are typically of most concern to businesses that don't regularly update the desktops and servers in their networks. Once one computer in a network is infected, it often has ready access to other vulnerable computers in that network and can spread rapidly.

Home computers, on the other hand, are usually protected by a firewall and are less at risk. However, a home network can suffer as well. For example, a laptop might pick up the worm from a company network and launch attacks at home.

The most critical and obvious protection is to make sure the Microsoft patch is applied. Network administrators can also use a blocklist provided by F-Secure to try and stop the worm's attempts to connect to Web sites.

And finally, you can disable Autorun so that a PC won't suffer automatic attack from an infected USB drive or other removable media when it's connected. The Internet Storm Center links to one method for doing so at http://nick.brown.free.fr/blog/2007/10/memory-stick-worms.html, but the instructions involve changing the Windows registry and should only be attempted by administrators or tech experts. Comments under those instructions also list other potential methods for disabling Autorun.
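
As an aside, one widely documented alternative -- not the specific method described at the linked page -- is to set the NoDriveTypeAutoRun policy value in the registry. The Python sketch below, using the standard winreg module, shows the idea; it must be run as an administrator on a Windows machine, and as with any registry change you should test it before rolling it out.

    # Illustrative sketch only: disable Autorun for all drive types machine-wide
    # by setting NoDriveTypeAutoRun to 0xFF. Run as administrator on Windows.
    import winreg

    POLICY_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)

    print("Autorun disabled for all drive types; log off or reboot for it to take effect.")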

Erik Larkin, PC World

Microsoft extends Windows 7 beta download deadline

Microsoft Corp. yesterday extended the deadline for downloading the public beta of Windows 7 by more than two weeks, citing continued interest in the preview.

The move suggests that fewer than 2.5 million copies of the beta have been downloaded since Microsoft launched the Windows 7 preview Jan. 10.

Although Microsoft had originally capped the downloads of Windows 7 beta at 2.5 million, after a rocky launch -- the company's servers were overloaded as frustrated users tried to download the preview -- the company lifted the limit and said it would offer the beta through today, Jan. 24.

Late Friday, Microsoft changed its mind again.

"Because enthusiasm continues to be so high for the Windows 7 Beta and we don't want anyone to miss out, we will keep the Beta downloads open through February 10," said company spokesman Brandon LeBlanc in a post to the Windows 7 blog.

LeBlanc didn't say whether the 2.5-million cap had been reached, noting only that, "We are at a point where we have more than enough beta testers ... so we are beginning to plan the end of general availability of Windows 7 Beta."

According to comments made earlier this month by a Microsoft IT evangelist, the decision to keep the beta download open is a clue that download demand has not yet reached the 2.5-million mark. Two weeks ago, Kevin Remde, a company evangelist, said that if the cap was not reached by today, downloads would continue "until the limit is reached."

On Saturday, Microsoft declined to say whether the 2.5-million cap had been reached or surpassed, or to specify how many copies it has provided users to this point. "I can say that interest has greatly exceeded our goals for the feedback we hoped to collect from testers," a company spokeswoman said in an e-mail reply to questions.

Windows 7 beta availability will be shut down in stages, LeBlanc said. While the beta will be pulled from Microsoft's servers at the end of the day Feb. 10, users who have already begun the download by then will have two more days, through Feb. 12, to complete the process.

Users can pause the Windows 7 beta download and resume it later; an interrupted download -- perhaps due to a severed Internet connection -- can also be resumed at the point it was halted.

Activation keys will be available indefinitely for users who finished downloading the disk image file before Feb. 12. Even users unable or unwilling to activate the beta, however, can install and run Windows 7 for up to 120 days without a key by using the same "slmgr -rearm" command that gained notoriety after Windows Vista's debut.

Subscribers to TechNet and the Microsoft Developer Network (MSDN) will be able to download the beta after the February deadlines imposed on the general public, LeBlanc added.

Users can download Windows 7 from the Microsoft site after selecting the 32- or 64-bit version, and the desired language.

By Gregg Keizer
Computerworld

Thursday, January 22, 2009

Slow Vista sales hit Microsoft revenues as netbooks gain ground

Sales of Windows desktop software dropped 8% last quarter compared to a year ago, while Microsoft Corp.'s server division revenues were up 15%, illustrating the rejection of Windows Vista and the acceptance of Windows Server 2008, an analyst said Thursday morning.

"Very disappointing results from the Windows Client unit," said Neil MacDonald, an analyst with Gartner Inc. "But it was very predictable, especially after Intel's earnings report." Last week, Intel announced that its fourth-quarter profit plummeted 90%.

Revenues for the Windows client group totaled $3.98 billion in Microsoft's second fiscal quarter, which ended Dec. 31, 2008, down 8% compared with the same period in 2007 because of what the company called "PC market weakness and a continued shift to lower priced netbooks."

MacDonald echoed those reasons, but added more detail. "When people look at opportunities for cost savings, they first think about not replacing computers," he said. "They try to get another year of life out of their notebooks and desktops. And that goes right to Microsoft's bottom line [in the Windows client unit]. If PCs aren't selling, Microsoft's not making money."

But Windows Vista -- 2007's problem- and perception-plagued operating system -- also played a major part in the revenue drop, said MacDonald. "Compounding the problem is the fact that Vista has been a very disappointing release. That makes a bad situation worse."

MacDonald, along with fellow Gartner analyst Michael Silver, made waves last year by arguing that Windows was "collapsing" because of the operating system's increasingly bloated code base and inability to roll out upgrades in a timely fashion.

In a sideways fashion, Microsoft confirmed that Vista's sales were down significantly in the quarter. "There were double-digit declines in premium SKUs" of both the business and consumer lines of Windows, said Frank Brod, Microsoft's chief accounting officer, referring to the higher-priced editions, including Vista Home Premium, Vista Ultimate and Vista Business.

The bright spot, according to both Microsoft and MacDonald, was the growth in sales of "netbooks," the new category of small, lower-priced laptops. But even that came with a price. "The uptake in netbooks also played a part, because although there was growth in that segment, those machines are running Windows XP," said MacDonald. "Vista doesn't fit."

Microsoft makes less per copy on Windows XP sold to computer makers than it does on Windows Vista.

While the client-side revenues took a dive, sales for the Server and Tools division were $3.74 billion last quarter, a 15% increase over the same period in 2007. "Double-digit growth, that's fantastic results given this economic climate," said MacDonald.

The server group, which released Windows Server 2008 almost a year ago, "has been on a roll, even in this tough economy," said MacDonald, who spelled out several reasons.

"The move to virtualization is a clear cost savings for companies," he said, noting that Microsoft gives away its Hyper-V virtualization software and sells its management tool "at a very aggressive price."

MacDonald pegged Hyper-V and the also-important widespread accolades for Windows Server 2008 as key drivers in Microsoft's impressive gains.

The near future doesn't look much brighter for Windows client, MacDonald said. "There's a glimmer of hope in the second half of the year," he argued. "If Windows 7 comes out in advance of the holiday season, consumers may once again get excited about buying a new computer.

"Consumers and businesses can't continue to run their old machines forever," he said.

Microsoft released a public beta of Windows 7 on Jan. 10, a fact that company executives mentioned several times Thursday morning during the conference call with Wall Street analysts.

The company also announced that it would cut 5,000 jobs, with the first 1,400 layoffs slated for today.


By Gregg Keizer
Computerworld

Seagate releases new firmware for broken hard drives

Seagate Technology LLC has now released new firmware for all models of hard drives affected by a software flaw, the company said today.

Seagate has published detailed instructions for how administrators can identify the model of hard drive in service and whether it needs a firmware upgrade. Models affected are the Barracuda 7200.11, ES.2 SATA and DiamondMax 22.
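
Seagate's published instructions, and its serial-number checker, remain the authoritative way to verify a drive. Purely as an illustration of the kind of inventory check an administrator might script, the sketch below parses smartctl output; it assumes smartmontools is installed, and the device path and model-family strings are placeholders.

    # Illustrative sketch only: flag drives whose reported model matches the
    # families named in Seagate's advisory. Assumes smartmontools is installed;
    # the authoritative check is Seagate's own model/serial-number lookup.
    import re
    import subprocess

    AFFECTED_FAMILIES = ("Barracuda 7200.11", "Barracuda ES.2", "DiamondMax 22")

    def drive_info(device):
        # Parse the model, family and firmware strings reported by smartctl -i.
        out = subprocess.run(["smartctl", "-i", device],
                             capture_output=True, text=True).stdout
        info = {}
        for key in ("Model Family", "Device Model", "Firmware Version"):
            m = re.search(rf"^{key}:\s*(.+)$", out, re.MULTILINE)
            if m:
                info[key] = m.group(1).strip()
        return info

    if __name__ == "__main__":
        info = drive_info("/dev/sda")  # device path is a placeholder
        family = " ".join(info.get(k, "") for k in ("Model Family", "Device Model"))
        if any(f in family for f in AFFECTED_FAMILIES):
            print("Drive may need the firmware update:", info)
        else:
            print("Drive not in the affected families:", info)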

The problem caused some drives to become completely inoperable, while other users found they could not access data on the drives. The new firmware will not fix drives that have become inoperable, the company said.

Seagate is offering data recovery services from its i365 subsidiary to customers whose drives are broken. Data on drives that aren't inoperable is still intact and can be recovered, the company said.

Seagate released new firmware last Friday for the Barracuda 7200.11 drives, but that upgrade was also faulty. Seagate withdrew it on Monday, said company spokesman Ian D. O'Leary.

It was originally thought that drives in the SV35 series, which are designed for surveillance applications, were also affected by the problems, but that now appears not to be the case, O'Leary said.

Seagate said it believes that the vast majority of customers using the drives will not have problems. However, the company has not released figures on how many of the drives have been sold and what percentage may be affected.

"We regret any inconvenience that the firmware issues have caused our customers," Seagate said in a statement released today.


By Jeremy Kirk
IDG News Service

XP Users, Plan Your Windows Upgrade -- Right Now

An XP-to-Vista migration may not be in the cards for everybody, but at this point, the jump to Windows 7 is inevitable for XP users.

A recent report from IT research firm TAC (The Advisory Council) highlights the most efficient ways to roll out upgrades from Windows XP to Windows 7. One key message: if you are not upgrading in 2009, you should at least be planning to do so.

The report, entitled "Cutting Through the Nonsense About Windows Vista, Windows 7", also touches on a subject that Microsoft probably wants kept quiet: the migration to Windows 7 will not be much different from a migration to Vista, concludes the report's author, Peter Schay; the two OSes share the same software compatibility issues, interface features and hardware requirements (though Windows 7 is reportedly much less resource-intensive).

It's no secret that Microsoft is trying to disassociate Windows 7 from Vista, despite the OSes sharing the same code base.

Windows 7 has good timing and marketing on its side, Schay says. The Windows 7 beta has been getting positive reviews, and the release of the OS, likely to happen some time in the second half of this year, will coincide with companies' hardware refresh cycles, which have been stretched to four or five years because of the economic downturn.

The report from TAC goes on to offer advice for XP users facing an upgrade, including lessons from Vista's failure, thoughts on Windows 7's potential and a look at the realities of switching to Linux or Macintosh.

Thin Line Between Vista and Windows 7

Windows Vista failed to get the initial adoption rates that Microsoft hoped it would, especially at enterprises. Vista adoption has improved since Microsoft released Vista Service Pack 1 (SP1) in February 2008, but even two years after its release, Vista has only a 21 percent market share, according to Web metrics company Net Applications.

The TAC report cites the usual suspects for Vista's troubles: software application incompatibility, beefed-up security at the expense of compatibility, intrusive UAC pop-ups and changes to the user interface.

Windows 7 may be a new brand name, but all of the reasons XP users put off Vista apply more or less equally to Windows 7 too, Schay writes.

"If your application software is incompatible with Vista, then it will be incompatible with Windows 7. While there will be incremental performance improvements in Windows 7, the hardware requirements are the same as for Vista," he writes.

XP users will have the benefit of learning from other people's past Vista headaches and should start upgrading to Vista- and/or Windows 7-compatible application software now, even if they won't actually be migrating for a few more years, the TAC report advises.

Don't Think a Linux or Mac Migration Would Be Easier

Some pundits and bloggers believe that Vista's bad reputation and slow acceptance will lead to a wave of converts to Linux or Mac OS. But the TAC report deems this notion "naive at best, and disingenuous at worst."

Again, the report cites a lack of application compatibility as the reason. "Neither Linux nor Macintosh brings anything to this party," writes Schay, adding that a Linux or Mac OS migration would be a much greater effort than a Vista/Windows 7 upgrade and should be undertaken only if motivations other than hardware and software compatibility are at play.

Schay concedes that Linux has Wine, the free Windows compatibility tool, and Macs can run Windows as a virtual machine, but he writes that these options "add complexity for both the user and IT, and still leave the issue of eventual end-of-support for XP on the table."

Whether Upgrading to Windows 7 or Vista, Start Now

The TAC report emphasizes that eventually Microsoft will pull the plug on Windows XP, just as it did with Windows 98. So what should businesses do about upgrades from XP in 2009? Plan, plan, plan.

The report advises that if you have a Vista migration underway you should continue it, because Vista and Windows 7 will co-exist well for a smooth migration later.

If you have not started a Vista migration plan, the report says, Windows 7 will be available soon enough that Vista can be skipped.

But author Schay offers a caveat: Don't expect Windows 7 to be an easier migration because it has a different name.

"Skip Vista with the understanding that it isn't going to save you from the work and cost of upgrading your software applications to be Windows 7 compatible," he writes.

Sectera Edge: A BlackBerry Secure Enough For Obama?

President Barack Obama may be getting a souped-up, super secure version of the BlackBerry called the Sectera Edge, recent speculation suggests. The NSA-certified device could allow Obama to fulfill his wish of staying connected, some suspect, while also addressing the numerous security concerns that come with the territory.

Sectera Edge: Smartphone by General Dynamics

The Sectera Edge is a smartphone made by a defense contractor called General Dynamics. The device, according to General Dynamics, was actually developed for the National Security Agency (NSA) and is considered a "secure mobile environment portable electronic device."

So what's all that mean? Basically, the thing uses an alphabet soup of secure protocols to make sure all communications are encrypted and safe from would-be spies. The Sectera Edge uses SCIP, or Secure Communications Interoperability Protocol, along with HAIPE IS -- High Assurance Internet Protocol Encryptor Interoperability Specification (try saying that three times fast) -- to create connections with classified government networks. Its feature list even includes the specific option to "exchange secure e-mail with government personnel."

The Sectera Edge offers one-touch switching between classified and regular "Joe Dialtone" mode, too -- so if you ever feel like slumming it with the unsecured masses, you're just one click away.

Security Questions

So, given all of that, could the Sectera Edge be secure enough for a U.S. president? It's hard to say. One issue raised by my colleague Matt Hamblen at Computerworld is how the encryption would affect messages sent to non-government employees. (They would have to be given appropriate decryption technology to be able to read them.) A security expert also questions whether enemies could use the device to ascertain the president's location, or even tap into his phone's microphone and transmit his conversations.

The government, not surprisingly, is staying mum on the matter thus far -- and one might imagine it will continue to do so.

Sectera Edge in the Real World

Whether or not it ends up being presidential, maybe an NSA-certified smartphone could be the secret to one-upping your iPhone-toting friends. The Sectera Edge does offer all the standard stuff -- calendar, tasks, desktop synchronization. Oh, and it even meets military standards for drop and shock protection. That could come in handy.

The Sectera operates on a Microsoft Windows platform, if you're cool with that sort of thing. And both AT&T and T-Mobile offer secure voice and data service for the device.

One little problem, though: The Sectera Edge is just a teensy bit more expensive than our old iFriend. Its price tag? $3,350.

Here's hoping for a Wal-Mart distribution deal sometime soon.

Frankly Speaking: For Microsoft, the pain is just beginning

Microsoft cuts 5,000 jobs. That's the big news of the week. Not just because the layoffs will cut one in 20 of Microsoft's 91,000 employees. Not only because it signals just how hard Microsoft has been hurt by the failure of Vista and by shifts in the way big customers license and use software. Not even because of the grim sign it represents for the rest of the IT industry.

No, it's big because it means Microsoft has begun to hit bottom.

And it's about time. For the past couple of decades, we've been referring to Microsoft as the new IBM. But Microsoft has never learned the lessons of the original IBM — not even the ones that Microsoft forced Big Blue to learn.

Consider: Back in early 1993, IBM had never had a round of layoffs, not even the kind of nips and tucks that Microsoft has used to trim about 1,000 employees over the past few years. When you worked at IBM, unless you fouled up badly, you had a job for life.

That all changed 16 years ago. IBM's mainframe-centric business model had failed, largely because of computing shifts to PCs and servers running Microsoft software. IBM's culture was sluggish, and it buried innovation. Result: The company's profits were headed off a cliff.

It took a new chairman and CEO, Lou Gerstner, to take the company apart and reassemble it around IT services. Along the way, there were layoffs -- a lot of layoffs. Many, many good people were hurt. So was IBM's reputation in towns like Poughkeepsie, where so many mainframe plant workers ended up on the street.

Today, IBM is healthy again. And the IT industry is healthier for not being dominated by Big Blue's mainframes.

It's easy to understand why Microsoft hasn't learned that lesson from IBM. This is Microsoft, after all. It has a lock on its markets. It has customers over a barrel. It's the 800-pound gorilla of the IT world, and it has been for as long as anyone at the company can remember.

Of course, all those things were true of IBM, too.

But Microsoft's situation is nowhere near as bad as IBM's was in 1993, is it? After all, Microsoft's profits haven't crashed — they've just dropped 11%, and that's in the middle of a recession.

Microsoft's revenue is still rising — though a closer look shows that it's inching up at less than the rate of inflation.

And Microsoft's stock? On Thursday, as Microsoft was announcing the layoffs, one cable-TV reporter commented that MSFT has "gone nowhere for years." Actually, the stock has lost nearly half its value over the past year.

So now, for the first time, Microsoft — like IBM 16 years ago — is resorting to a major layoff.

It won't be enough, any more than a layoff was enough for IBM.

Microsoft has been coasting for years on Windows and Office. Those have been the cash cows that enabled the company to fumble its way through years of halfhearted "innovation" and watered-down imitation. Microsoft has lost ground (or never gained a footing) in search versus Google, music players versus Apple, Web browsers versus Firefox.

Worse still, Microsoft has forgotten how to improve even those cash-cow products. Office 2007 is a mess for usability. Vista is a disaster in almost every way.

And now, Microsoft has begun to hit bottom financially, too. It's not all the way down yet. There's a lot more pain to come — both in Redmond and across the IT business.

But that has to happen before Microsoft can change its leadership, its culture, its business and, ultimately, its value to customers.

Now that will be big news.

By Frank Hayes
Computerworld

Wednesday, January 21, 2009

Ubuntu Mobile looks at Qt development environment

By Rodney Gedda

The Ubuntu Mobile operating system is undergoing its most radical change with a port to the ARM processor for Internet devices and netbooks, and may use Nokia's LGPL Qt development environment as an alternative to GNOME.

During a presentation at this year's linux.conf.au conference in Hobart, Canonical's David Mandala said Ubuntu Mobile has changed a lot over the past year in that it now includes netbook devices in addition to MIDs and the ARM port.

"I worked on ARM devices for many years so a full Linux distribution on ARM is exciting," Mandala said, adding one of the biggest challenges is reminding developers to write applications for 800 by 600 screen resolutions found in smaller devices.

"The standard [resolution] for GNOME [apps] is 800 by 600, but not all apps are. We do a fair amount of work customising screen sizes. Our apps are optimised to fit 4.5 to 10-inch LCDs -- with and without touchscreens."

For this reason Ubuntu Mobile uses GNOME Mobile (the Hildon framework) instead of a full GNOME desktop, but since Nokia open-sourced Qt under the LGPL, it may consider that as an alternative.

"We will be looking at a better framework than Hildon for screen input," Mandala said. "Hilton is about to change and watch this space. Intel and Nokia are creating a huge amount of change so hang tight for a couple of months."

"The KDE stuff and Qt is getting LGPL which will change the whole space. So watch this space as it is changing dramatically. We will chose the best tool."

Mandala said some of the KDE apps fit on the smaller screens well.

"I can't say anything about KDE at this point but who would have thought Qt would go LGPL," he said.

Ubuntu Mobile for netbooks will also get its own distribution in line with the release of Jaunty Jackalope in April 2009.

"Jaunty will have a full image for netbook devices," Mandala said. " Jaunty netbook edition will have a cut-down set of applications compared with the Ubuntu desktop, but apt-get works and you can install what you like."

The Jaunty netbook release may be GNOME-based or an alternative desktop.

"It's a completely supported Ubuntu distribution and will get every six month updates. We've ported the main repository, and compiled Universe," he said. "About 500 packages are still not built out of some 10,000."

"The interfaces are changing and they are playing with the interface to see what works for consumers," Mandala said. "It differs from Ubuntu desktop as it is not for experts."

The ARM v7 system for netbooks is under development and there are issues as most boards don't have 2D and 3D Linux drivers yet.

"We are also targeting things a little differently and Ubuntu Mobile requires some tweaking," Mandala said. "It is going on proprietary hardware and has proprietary extensions. It is a different marketplace, but we are still out here for the community."

"I'm working very hard with system vendors to get these devices to market at reasonable prices."

As for an Ubuntu Mobile GSM phone, "the big drawback is there is no telephony stack out there for us to use and there are certification issues".

computerworld

Set Up Windows 7 With Help From Hassle-Free PC

Set Up a Dual-Boot Configuration for Windows 7

If you're yearning to try the new Windows 7 beta everyone's talking about but don't have a spare machine lying around, fear not: You can install it in a small corner of your primary PC, without interfering with any of your day-to-day operations.

The secret: creating a new drive partition where the beta can take up residence. Lifehacker has a step-by-step guide that shows you how to do this, so I won't regurgitate the steps here.

I will say that I tried this over the weekend and it worked like a charm. Ironically, it's easiest for Windows Vista users, as that OS has built-in drive-partitioning tools. (And you thought it brought nothing new to the table.) But you can do it in Windows XP as well, provided you leverage a freeware partitioning program.
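
As a rough sketch of the Vista route (the Lifehacker guide remains the authoritative walkthrough), the snippet below uses the built-in diskpart utility to shrink the system volume and free up room for the beta's partition. The drive letter and size are placeholders; run it from an elevated prompt, and back up first.

    # Hedged sketch: using the built-in diskpart utility (present in Vista) to
    # shrink the system volume and free space for a Windows 7 beta partition.
    # The drive letter and size are placeholders; run from an elevated prompt
    # and back up your data before repartitioning.
    import subprocess
    import tempfile

    DISKPART_SCRIPT = "select volume C\nshrink desired=20480\n"  # shrink C: by ~20 GB (MB units)

    def shrink_system_volume():
        # diskpart reads its commands from a script file passed with /s.
        with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
            f.write(DISKPART_SCRIPT)
            script_path = f.name
        subprocess.run(["diskpart", "/s", script_path], check=True)

    if __name__ == "__main__":
        shrink_system_volume()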

After you're done with the install, you'll be able to boot to your original Windows partition or the new one containing Windows 7. Truly, this is an ideal way to take the beta for a spin. And when you're done tinkering (or the license expires next August, whichever comes first), you can easily remove the partition to reclaim the drive space.

Plan Your Migration to a New PC

I just ordered a new PC. That's right, I'm doing my part to help the economy (or so I told the missus). When it arrives, I'll get to enjoy the kind of fun that usually requires a visit to the dentist's chair: migrating all my programs, data, and settings to the new machine.

I've had a bit of experience with this over the years, and feel like I've just about got it down to a science. The secret to a successful migration? Planning, planning, planning.

For starters, I've got a week or so until the UPS driver delivers my new toy. During that time, I'm going to compile a list of all the programs I use regularly, if not daily: Word, Outlook, Firefox, IrfanView, iTunes, and so on.

My plan is to reinstall each program on the new machine, as opposed to using pricey migration software to try to move the apps. In my opinion, that's just asking for hassles. For each program on my list, I make a note: "CD" or "download." If it's something I can't download to the new machine, I'll start searching now for the CDs I'll need.

Also for each program on the list, I make a note of what kind of data goes with it. That way, I can determine the best way to migrate. Firefox, for instance, has bookmarks, passwords, and extensions. The bookmarks and passwords are a snap: After installing Firefox on the new machine, I'll then install the Foxmarks extension and sign into my account. It syncs my bookmarks and passwords to the browser and presto, I'm done.

As for Word, I keep all my documents in a Data folder; it's a simple matter to copy that over to the new machine. (I'll connect both PCs to the network for fast and easy file transfers.) Same goes for my photos, videos, MP3s, and the like.
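
For the curious, the copy step can be as bare-bones as the sketch below; the paths and share name are made up for illustration, and a dedicated tool such as robocopy would do the job just as well.

    # Minimal sketch of the "copy my Data folder over the network" step.
    # The paths and share name are made up; dirs_exist_ok needs Python 3.8+.
    import shutil
    from pathlib import Path

    OLD_DATA = Path(r"C:\Data")                   # documents on the old PC
    NEW_DATA = Path(r"\\NEW-PC\Users\Rick\Data")  # share exposed by the new PC

    def migrate_folder(src, dst):
        # Copy the whole tree; copytree preserves file timestamps by default.
        shutil.copytree(src, dst, dirs_exist_ok=True)

    if __name__ == "__main__":
        migrate_folder(OLD_DATA, NEW_DATA)
        print("Copied", OLD_DATA, "->", NEW_DATA)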

Speaking of MP3s, iTunes is going to be tricky. So is Outlook. I'll cover my migration strategies for those apps in future posts.

In the meantime, I'm planning, planning, planning. In addition to listing apps and data, I'm making notes about drivers I'll need for printers and other accessories. I'm making sure I've got registration codes for programs I've purchased online. And I'm clearing extra home-office space so I can keep the old machine "live" for a few weeks after I've transitioned to the new one, just in case I discover I left something important behind.

Rick Broida
pcworld

Obama inauguration drives record Web usage

The swearing-in of U.S. President Barack Obama and the other presidential inauguration activities generated massive Web traffic Tuesday, leading to site slowdowns but not to a general meltdown of the Internet.

With intense and widespread interest in the ceremonies and festivities, especially President Obama's oath of office and inaugural address, millions of people had been expected to tune in online, especially those without TV access while at work.

Big media, news and U.S. government sites streamed events live and prepared special sections for the inauguration, yet some were still caught off-guard and experienced performance problems, mostly between midmorning and 12:30 p.m. U.S. Eastern Time.

Among those experiencing significant slowdowns were the sites of ABC, CBS, Fox Business, the L.A. Times, NBC, National Public Radio, USA Today and The Wall Street Journal, according to Keynote Systems, an Internet measurement and testing company. Government sites that buckled under the traffic included those of the White House, the U.S. Senate and the National Park Service, according to Keynote. Gomez, another Web performance-tracking company, also noticed a performance problem at the National Public Radio Web site.

"We predicted today would be one of the most, if not the most, significant online streaming event[s]," said Shawn White, Keynote's director of external operations.

"This was an unprecedented online event. I don't think we've ever seen as many viewers go online to watch an event," he added. "It's difficult to prepare for something that's unprecedented."

"On a positive note, I had heard predictions that the Internet would crumble, which didn't happen," White said.

A group of 40 large Web sites that Keynote routinely tracks also saw, on average, a collective slowdown during the swearing-in ceremony and the inaugural address, likely caused by the demand placed on Internet bandwidth by millions of live video streams, White said.

Interest in Tuesday's events was fueled by a combination of factors. President Obama is the country's first African-American president. In addition, he comes into office trailed by widespread hope that he'll fix the country's economic crisis.

This inauguration was the first since online video became a mainstream activity, so it wasn't a surprise that TV networks like CNN and MSNBC, as well as major newspapers like The New York Times and The Washington Post, provided live broadcasts on their sites.

CNN, which began its Web broadcast at 8 a.m., partnered with Facebook to display "status updates" from members of the social-networking site as they reacted to the events. According to Facebook, by 1:15 p.m., 600,000 status updates had been posted on CNN.com Live, with 8,500 hitting at the minute President Obama began his speech.

CNN.com, which will stream live video until the last inaugural ball ends, had, as of 3:30 p.m., generated more than 136 million page views, and its CNN.com Live section had served up more than 21.3 million live video streams globally, setting a new daily streaming record for itself, a spokeswoman said via e-mail. CNN.com Live estimates it served more than 1.3 million concurrent live streams during its peak immediately prior to President Obama’s inaugural address, she said.

Also breaking its own record for live streams was NBC's Hulu.com video site, the company said.

Content delivery specialist Akamai reported delivering record streams and content to its customer sites, such as The New York Times, Viacom and The Wall Street Journal. Akamai delivered a peak of more than 7 million simultaneous streams, most of them live, over its EdgePlatform, at approximately 12:15 p.m., at which time total traffic on its network surpassed more than 2 terabits per second. Akamai’s Net Usage Index for News, a daily Web traffic report of aggregate total visitors per minute to more than 100 news sites, recorded more than 5.4 million visitors per minute at approximately 11:45 a.m.

Other inauguration-day sections could be found on Google's YouTube video-sharing site and Yahoo's Flickr photo-sharing site. The Twitter microblogging service partnered with Al Gore's Internet TV company Current TV to display messages from Twitter members online and on TV during the inauguration.

In addition to video and news articles, sites often provided other features like photo slide shows, interactive maps, opinion polls, reader comment forums and timelines.

Many TV stations and Web sites began tracking President Obama and First Lady Michelle Obama when they emerged at around 8:30 a.m. from Blair House, their temporary Washington, D.C. residence, and headed to a prayer service at a nearby church.

After the one-hour church service, the Obamas were driven to the White House for coffee with now-former President George W. Bush and his wife, Laura Bush. All four arrived at the Capitol for the inauguration ceremony at around 11 a.m.

President Obama was sworn in shortly after noon and wrapped up his speech at around 12:30 p.m. A lunch at the Capitol and a parade to the White House followed, and the festivities will continue into the night.

source: computerworld

Massive Theft of Credit Card Numbers Reported

Erik Larkin

Jan 21, 2009 3:02 am

Heartland Payment Systems, a payment processor responsible for handling about 100 million credit card transactions every month, today disclosed that thieves had used malicious software in its network in 2008 to steal an unknown number of credit card numbers.

The company's information on the incident site, http://2008breach.com/, attempts to downplay the loss of data by asserting that no Social Security numbers, unencrypted PINs or other types of data were stolen. But according to some good reporting from Brian Krebs at the Washington Post, Heartland's CEO says a piece of spyware stole payment card data as it passed through the company network, including the data from the magnetic stripe that can be used to create counterfeit cards.

Heartland says it did not discover the breach until Visa and MasterCard came knocking about suspicious activity involving card numbers processed by Heartland. Disheartening, to say the least.

It's all the more sad that we as consumers really cannot do a darn thing to protect ourselves against this kind of theft. We can be incredibly careful with our own PC and data, but we have no control over how it's handled by the plethora of companies that store and process our information. All you can do is keep an extra close eye on your credit card statements and credit reports for anything suspicious.

You can pick up free credit reports from https://www.annualcreditreport.com (avoid those slimy sites that try to get you to pay for them). Also, as you scan your credit card statements, be on the lookout for even small charges, possibly even less than a dollar. Such charges can be a sign that thieves are testing the account to see if they can pass a fraudulent charge, and may signal a much larger charge to come.

For more info on the Heartland theft, see Krebs' Security Fix post and the Heartland disclosure site. And yes, you have to wonder about disclosing this on a day when most everyone's attention is focused elsewhere.

Pcworld

Tuesday, January 20, 2009

Hackers deface NATO, U.S. Army Web sites

(IDG News Service) Hackers have taken down two high-profile targets as they continue their ongoing Web attacks in support of Palestine, defacing Web sites run by the U.S. Army and the North Atlantic Treaty Organization (NATO).

The attacks on Thursday took down the Web sites for the U.S. Army Military District of Washington and the NATO Parliamentary Assembly, according to Zone-H, a Web site that tracks defacement activity.

The NATO site is now back online, but the U.S. Army site was still offline Friday morning. A version of the Web page cached by Google Inc. reads, "Stop attacks u israel and usa ! you cursed nations ! one day muslims will clean the world from you !" NATO didn't immediately respond to a request for comment.

Most other U.S. Army sites do not appear to have been affected by that attack. The U.S. Army Military District of Washington is an Army command based at Fort Lesley J. McNair in the District of Columbia.

Using what's known as a SQL injection attack, the group also defaced the Web site of the Joint Force Headquarters of the National Capital Region, which handles military incident response for the Washington area, according to Gary Warner, director of research in computer forensics at the University of Alabama at Birmingham. A U.S. Army spokeswoman was unable to immediately comment on reports of the hacks.
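
For readers unfamiliar with the term, the sketch below shows the general shape of a SQL injection flaw -- purely illustrative, and not the specific exploit used against these sites: user input spliced into a query string can rewrite the query, while a parameterized query keeps it as data.

    # Illustrative only: the general shape of a SQL injection flaw, not the
    # specific exploit used in these defacements. Uses an in-memory SQLite DB.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE pages (name TEXT, body TEXT)")
    conn.execute("INSERT INTO pages VALUES ('home', 'Welcome')")

    def lookup_unsafe(page_name):
        # Vulnerable: attacker-controlled input is spliced into the SQL text,
        # so input like "x' OR '1'='1" changes the meaning of the query.
        query = "SELECT body FROM pages WHERE name = '%s'" % page_name
        return conn.execute(query).fetchall()

    def lookup_safe(page_name):
        # Parameterized: the driver treats the value strictly as data.
        return conn.execute("SELECT body FROM pages WHERE name = ?",
                            (page_name,)).fetchall()

    print(lookup_unsafe("x' OR '1'='1"))  # returns every row -- injection worked
    print(lookup_safe("x' OR '1'='1"))    # returns nothing -- input stayed data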

All of these attacks are credited to a Turkish hacking group called Agd_Scorp/Peace Crew.

This group has claimed many Web hacks over the past few months, including Microsoft Corp.'s Web sites in Canada, Ireland and China; Royal Dutch Shell PLC; Harvard University; and National Basketball Association Inc., Warner said in a blog posting.

"Although the group is now calling themselves 'Peace Crew,' the same membership was calling itself 'Terrorist Crew' as recently as December," Warner wrote.

As tensions in Gaza have intensified over the past few weeks, loosely organized hacking groups from countries such as Morocco, Turkey and Iran have defaced thousands of Web pages. This latest wave of attacks has mostly focused on Israeli sites, particularly easy targets belonging to individuals or small businesses. However, some high-profile targets have been hit too, such as news site Ynetnews.com.

computerworld

IBM Could Face Mainframe Antitrust Investigation in Europe

Peter Sayer, IDG News Service

Florida mainframe manufacturer T3 Technologies has filed a formal complaint against IBM with the European Union's antitrust authority, it said Tuesday. In its complaint, T3 accuses IBM of refusing to sell its mainframe operating system to customers wanting to run it on computers made by T3.

The company wants the Commission to investigate the prices IBM charges for its mainframe systems, saying that European mainframe buyers could save US$48 billion over 20 years if there were fair competition in the market.

IBM was ordered by the U.S. Department of Justice to stop its tying of hardware and software sales in a landmark case half a century ago.

Since then, according to T3, IBM has taken a calculated series of actions to stop companies such as Amdahl, Hitachi, Comparex, PSI and T3 from selling IBM-compatible mainframes, giving IBM an exclusive lock on the mainframe market.

T3 warned last August that it was preparing to file the complaint with the European Commission's Directorate General of Competition, shortly after IBM acquired Platform Solutions Inc., putting an end to an antitrust case brought by PSI in which T3 had wanted to participate.

That case began in 2006, when IBM sued PSI in the U.S. District Court for the Southern District of New York, alleging patent infringement. PSI countersued in early 2007, accusing IBM of antitrust violations and unfair competition, and T3 asked the court to be allowed to join the case against IBM. PSI withdrew the case after its acquisition by IBM.

U.S. taxpayers, too, have something to gain from T3's European action, according to the Computer and Communications Industry Association, a lobby group based in Washington, D.C. Many banks and government departments are reliant on mainframes and must pay the price IBM demands, the CCIA said.

The fact that a U.S. company had to go to Europe to seek relief is indicative of the vacuum in U.S. competition law enforcement, the CCIA said. It called on Barack Obama and the incoming U.S. government to direct the U.S. Department of Justice and the U.S. Federal Trade Commission to better protect competition in the mainframe market.

A former IBM partner and mainframe reseller, T3 introduced its own competing IBM-compatible computers, the tServer range, at the low end of the market starting in 2000. It introduced a new family of mainframes, Liberty, in 2006.

T3 now has the backing of Microsoft, which invested an undisclosed sum in the company last November to fund development of new products for mutual customers. Microsoft has had its own share of antitrust conflicts with the Commission, which opened a new investigation last week into the company's practice of bundling its Internet Explorer browser application with its Windows operating system.

European Commission representatives were unavailable for comment on T3's complaint.


www.pcworld.com

Update: IBM intros Lotus cloud suite, partners with Skype, SF.com

By Juan Carlos Perez

IBM has introduced LotusLive, which it describes as a portfolio of integrated Internet-hosted services for social networking and collaboration in workplaces.

LotusLive's Web site is now the portal where all of IBM's Lotus "cloud" offerings are located, including e-mail, collaboration and Web conferencing, IBM announced Monday at its Lotusphere conference in Orlando.

At LotusLive.com, organizations can find a suite of hosted collaboration and communication services designed to be easy to use and adopt, without requiring a hefty IT investment, IBM said.

Built on open standards, LotusLive is designed to allow for simple integration with third-party applications. It features a "click to cloud" functionality to tie existing applications residing on customer servers with LotusLive services.

IBM has also announced partnerships with Skype, LinkedIn and Salesforce.com for LotusLive. LinkedIn, which operates a social network for professional contacts, plans to tie its site with LotusLive, Lotus Notes and Lotus Connections. Salesforce.com intends to integrate its CRM software with LotusLive services. Skype will provide voice and video capabilities within LotusLive.

This move by IBM is in line with the trend among vendors such as Google, Zoho, Microsoft, Salesforce.com, Socialtext, Jive Software, Central Desktop, Telligent, Atlassian and others to offer collaboration and communication applications via the software-as-a-service (SaaS) model.

The SaaS approach, in which software is hosted by vendors in their data centers and provided to customers via the Internet, has touched a wide variety of application types, including office productivity suites, e-mail, CRM and collaboration.

IBM has been one of the world's biggest providers of enterprise communication and collaboration software since it acquired Lotus in 1995. However, in recent years with the rise of Web 2.0 technologies like blogging, wikis and RSS, vendors like Google, Zoho and Yahoo's Zimbra have simplified the adoption and use of communication and collaboration software by offering it via the Internet, making it more affordable to smaller companies along the way.

IBM, with its Notes and Domino platform, and other major vendors of legacy communication and collaboration software, such as Microsoft with Office and Outlook/Exchange, are trying to modify and extend those products to latch onto the popularity of the SaaS model.

One Web 2.0 application that is gaining a lot of attention in the workplace is social networking, popularized in the consumer market in recent years by Facebook and MySpace. IT managers and CIOs are implementing social computing capabilities in their workplaces, having seen how these systems can help employees communicate, collaborate and do their jobs more efficiently. IBM's entry in this space is Lotus Connections.

Specifically, enterprise social networks typically mimic the core functionality of consumer sites, including the creation of lists of "friends" and the easy sharing of messages and content, as well as the automated notifications of contacts' actions. However, enterprise social networks, like other workplace social computing technologies, have special security and IT control features, as well as workplace-specific capabilities.

Other announcements at Lotusphere include:

-- IBM and SAP plan to release in March their first joint software product called Alloy, which links Lotus Notes with SAP Business Suite. Alloy is designed to present data from SAP applications within Lotus Notes. Both companies will sell the product. IBM and SAP announced this collaboration, code-named Atlantic, at last year's Lotusphere.

-- IBM and Research In Motion will deliver new Lotus software capabilities for BlackBerry phones. By the second quarter, BlackBerry users will be able to access Lotus Symphony documents, as well as Lotus Connections activities, blogs and communities. In the second half of the year, BlackBerry users will have access to Lotus Quickr team software, while developers will gain additional functionality thanks to new BlackBerry platform support for IBM Lotus Domino Designer and XPages.

-- The next version of Lotus Connections will add a Facebook-like "wall" to profiles where information can be posted, as well as Twitter-like microblogging capabilities for employees to share short updates of their actions and whereabouts. Connections is also gaining a new wiki service.

source: computerworld